Hiring for Data Engineer / Data Scientist
Two95 International Inc.
Kuala Lumpur
On-site
MYR 80,000 - 120,000
Full-time
Job summary
A leading tech consulting firm in Kuala Lumpur is seeking a skilled Data Scientist to manage full lifecycle projects, develop data pipelines, and visualize data using Tableau. The ideal candidate will have 4+ years in ETL/ELT processes, strong coding skills in R/Python, and experience with relational databases. This role offers competitive benefits within a dynamic team environment.
Qualifications
- 4+ years of experience in building scalable and reliable ETL/ELT pipelines.
- Strong proficiency in writing production-quality code in R or Python.
- Experience administering and deploying CI/CD tools in Linux environments.
- Solid understanding of GCP.
Responsibilities
- Drive the full lifecycle of Data Science projects.
- Develop Data pipelines for self-service reports and dashboards.
- Visualize data using Tableau.
- Maintain data integrity and accuracy.
Skills
Data Science
R or Python
Tableau
ETL/ELT processes
Relational Database Management
Machine Learning
Education
Bachelor's, Master's or PhD in relevant fields
Tools
PostgreSQL
MySQL
MongoDB
ElasticSearch
BigQuery
CI/CD tools (Git, Jira, Jenkins)
Ansible
Terraform
Airflow
Responsibilities:
- Drive the full lifecycle of Data Science projects: from gathering and understanding end-user needs to implementing a fully automated solution.
- Develop and provision data pipelines to enable self-service reports and dashboards.
- Apply appropriate techniques in R or Python to address business problems.
- Visualize data using Tableau and create repeatable visual analysis for end users to use as tools.
- Take ownership of the existing BI platforms and maintain the integrity and accuracy of their numbers and data sources.
- Agile/Scrum project management experience or knowledge, with the ability to prioritise, push back, and effectively manage a data product and sprint backlog.
Requirements:
- Completed Bachelor's, Master's, or PhD in Computer Science, Social Sciences, Physical Sciences, Statistics, Mathematics, Computer Engineering, or a related field.
- 4+ years of experience building scalable and reliable ETL/ELT pipelines and processes to ingest data from a variety of sources, preferably in the e-commerce retail industry.
- Deep understanding of relational database management systems (RDBMS) (e.g. PostgreSQL, MySQL) and NoSQL databases (e.g. MongoDB, ElasticSearch), plus hands-on experience implementing and performance-tuning MPP databases (e.g. BigQuery).
- Experience with Tableau, Power BI, Superset or any standard data visualization tools.
- Strong proficiency in writing production-quality code, preferably in R or Python, and engineering experience with machine learning projects such as time series forecasting, classification, and optimization problems.
- Experience administering and deploying CI/CD tools (e.g. Git, Jira, Jenkins), infrastructure automation tools (e.g. Ansible, Terraform), and workflow management tools (e.g. Airflow, Jenkins, Luigi) in Linux environments.
- Exhibits sound business judgment, a proven ability to influence others, strong analytical skills, and a track record of taking ownership, leading data-driven analyses, and influencing results.
- Solid experience with GCP.
- E-commerce, logistics, or fashion retail background is a bonus.