A well-established client in Kuala Lumpur seeks a Data Engineer responsible for designing, building, and maintaining robust data pipelines. The candidate will work on ETL processes and database optimization while delivering actionable insights through BI tools. Proficiency in SQL, automation scripting, and strong communication skills are essential, alongside technical expertise in data tools such as Tableau and Power BI.
COMPANY OVERVIEW
A well-established client of ours in Kuala Lumpur is seeking a Data Engineer.
JOB RESPONSIBILITIES
Design, build, and maintain robust and scalable data pipelines to collect, process, and store structured and unstructured data from various sources.
Develop and optimize database schemas, tables, and queries to support data storage, retrieval, and analysis.
Implement ETL processes to integrate data from various sources, ensuring data quality, consistency, and integrity.
Monitor and troubleshoot data pipelines and infrastructure issues and provide timely resolutions.
Develop real-time reports and dashboards using BI tools (e.g., Tableau, Power BI, Looker).
Collaborate with the Business Insights Analyst and other cross-functional team members to understand data requirements, perform data profiling, and implement data transformation and cleaning processes.
JOB REQUIREMENTS
At least 3 years' experience.
Proven experience as a data pipeline engineer in various functional teams such as Business Intelligence, Data Analytics, or a similar role.
Advanced Excel skills (such as VBA scripting, macros, array formulas, pivot tables, power queries, multi-data source integration, data cleansing methods, text and data manipulation functions, charting and dashboarding) and seasoned in large data validation, reconciliation, and management.
Strong relational database and SQL scripting skills, with the ability to extract and manipulate large data sets.
Proficiency in BI tools (Tableau, Power BI, Looker, or similar) is an advantage.
Experience with real-time data processing and streaming technologies (e.g., Apache Flink, Kafka Streams) is not required but is a strong advantage.
Proficiency in automation, scripting, or data manipulation languages such as Python, JavaScript, Bash/Shell, PowerShell, or R (for statistical analysis) is required.
Experience with cloud data platforms (BigQuery, Snowflake, Redshift) is a plus.
Strong problem-solving skills and ability to identify actionable insights.
Excellent communication skills – able to present data-driven recommendations to non-technical stakeholders. Bilingual proficiency in English and Chinese is an advantage, in order to support Chinese-speaking stakeholders.