Hiring for Data Engineer / Data Scientist
Two95 International Inc.
Kuala Lumpur
On-site
MYR 80,000 - 120,000
Full time
14 days ago
Job summary
A dynamic technology firm in Kuala Lumpur is seeking a Data Scientist to drive the full lifecycle of Data Science projects and develop data pipelines. Candidates should have 4+ years of relevant experience, strong R/Python skills, and ideally a background in e-commerce or a related field. The role also involves maintaining BI platforms and visualizing data with tools such as Tableau.
Qualifications
- 4+ years of experience in building scalable data pipelines.
- Strong coding proficiency in R or Python.
- Solid experience with GCP.
Responsibilities
- Drive full lifecycle of Data Science projects.
- Develop data pipelines for self-service reports.
- Visualize data and create repeatable tools for end users.
Skills
Data pipeline development
Data visualization using Tableau
R or Python programming
Agile/Scrum project management
Machine learning
Education
Bachelor's, Master's, or PhD in relevant fields
Tools
PostgreSQL
MySQL
MongoDB
BigQuery
Tableau
Power BI
Git
Jira
Jenkins
Airflow
Responsibilities
- Drive the full lifecycle of Data Science projects, from gathering and understanding end-user needs to implementing a fully automated solution.
- Develop and provision data pipelines to enable self-service reports and dashboards.
- Deploy techniques in R or Python to answer the appropriate business problems.
- Visualize data using Tableau and create repeatable visual analyses that end users can use as tools.
- Take ownership of the existing BI platforms and maintain the integrity and accuracy of the numbers and data sources.
- Apply Agile/Scrum project management experience and knowledge: the ability to prioritise, push back, and effectively manage a data product and sprint backlog.
Requirements
- A completed Bachelor's, Master's, or PhD in Computer Science, Social Sciences, Physical Sciences, Statistics, Mathematics, Computer Engineering, or any related field.
- 4+ years of experience in building scalable and reliable ETL/ELT pipelines and processes to ingest data from a variety of data sources, preferably in the e-commerce retail industry.
- Deep understanding of relational database management systems (RDBMS) (e.g. PostgreSQL, MySQL) and NoSQL databases (e.g. MongoDB, Elasticsearch), plus hands-on experience in implementing and performance tuning MPP databases (e.g. BigQuery).
- Experience with Tableau, Power BI, Superset, or other standard data visualization tools.
- Strong proficiency in writing production-quality code, preferably in R/Python, and engineering experience with machine learning projects such as time series forecasting, classification, and optimization problems.
- Experience administering and deploying CI/CD tools (e.g. Git, Jira, Jenkins), industrialization tools (e.g. Ansible, Terraform), and workflow management tools (e.g. Airflow, Jenkins, Luigi) in Linux operating system environments.
- Exhibits sound business judgment, strong analytical skills, and a proven track record of taking ownership, leading data-driven analyses, and influencing results.
- Solid experience with GCP.
- An e-commerce, logistics, or fashion retail background is a bonus.