
Senior Analyst (Data Engineer)

EPF Malaysia

Petaling Jaya

On-site

MYR 60,000 - 100,000

Full time

3 days ago

Job summary

Join a forward-thinking organization as a Data Engineer, where you'll design and optimize data pipelines, ensuring seamless data integration for analytics and reporting. This role emphasizes collaboration with data analysts and IT teams, fostering a data-driven culture within the organization. You'll leverage modern cloud technologies and big data frameworks to enhance data workflows and support business intelligence initiatives. If you're passionate about data engineering and eager to innovate, this position offers a unique opportunity to make a significant impact in a dynamic environment.

Qualifications

  • 7+ years of experience in data engineering and ETL development.
  • Hands-on experience with cloud data platforms and big data frameworks.

Responsibilities

  • Design and maintain ETL/ELT pipelines for data extraction and loading.
  • Collaborate with teams to integrate various data sources into analytics.

Skills

Data Pipeline Development
ETL Processes
SQL
Python
Data Integration
Debugging and Troubleshooting

Education

Bachelor's Degree in Computer Science
Bachelor's Degree in Data Science

Tools

AWS
Azure
Google Cloud
Spark
Hadoop
Airflow
DBT
Git

Job description

JOB SUMMARY

You will be responsible for designing, developing, maintaining, and optimizing data pipelines, ETL processes, and data integration solutions. The role ensures that structured and unstructured data is efficiently processed and made available for business intelligence (BI), analytics, and reporting needs. You will collaborate closely with data analysts, data scientists, and IT teams to ensure the organisation's data infrastructure meets EPF business goals.

You will also serve as a change agent to promote data culture across EPF.

JOB SCOPE

  • Design, develop, and maintain ETL/ELT pipelines for data extraction, transformation, and loading (see the illustrative sketch after this list).
  • Implement and manage change data capture (CDC) pipelines to ingest data from multiple sources (e.g. relational databases, mainframes, and cloud platforms), capturing change logs using log-based or trigger-based CDC.
  • Manage CDC processes through batch scripts (e.g., automated scripts that start and stop CDC jobs), with proficiency in shell scripting (e.g., Bash on Linux/Unix).
  • Implement data ingestion solutions from various sources, including databases, APIs, and flat files.
  • Ensure data integrity, quality, and consistency through validation and error-handling techniques.
  • Assist in building and maintaining cloud-based data lakes, warehouses, and relational databases.
  • Optimize data storage performance and retrieval processes for analytics use cases.
  • Work on partitioning, indexing, and schema optimization to improve query efficiency.
  • Develop and maintain API-based data integrations to support business applications.
  • Work with business teams to integrate ERP, CRM, IoT, and third-party data sources into the analytics ecosystem.
  • Implement real-time and batch data processing using appropriate tools and frameworks.
  • Monitor and optimize data pipelines for performance, scalability, and cost-efficiency.
  • Develop automation scripts and workflows to streamline data processing tasks.
  • Work closely with data analysts, BI teams, and data scientists to understand data requirements.
  • Provide support in data validation, transformation, and preparation for reporting and analytics.
  • Collaborate with IT and cloud engineers to ensure data infrastructure reliability and security.
  • Stay updated on the latest big data technologies, cloud solutions, and engineering best practices.
  • Identify opportunities to enhance data engineering processes, improving efficiency and scalability.
  • Contribute to the development of best practices and standards for data engineering.
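
To illustrate the kind of pipeline work described above, here is a minimal orchestration sketch in Python for Apache Airflow 2.x (one of the tools listed under Job Qualifications). The DAG name, task names, and sample records are hypothetical placeholders, not EPF systems; a production pipeline would add real extract and load logic, validation, and error handling as outlined in the scope.

  from datetime import datetime

  from airflow import DAG
  from airflow.operators.python import PythonOperator


  def extract(**context):
      # Placeholder extract step: in practice this would pull from a source
      # database, API, or flat file and stage the results.
      records = [{"id": 1, "value": 100.0}]
      context["ti"].xcom_push(key="records", value=records)


  def load(**context):
      # Placeholder load step: in practice this would write to a warehouse
      # table after validation and error handling.
      records = context["ti"].xcom_pull(key="records", task_ids="extract")
      print(f"Loading {len(records)} records into the target warehouse")


  with DAG(
      dag_id="example_etl_pipeline",   # hypothetical name
      start_date=datetime(2024, 1, 1),
      schedule="@daily",               # argument name used by Airflow 2.4+
      catchup=False,
  ) as dag:
      extract_task = PythonOperator(task_id="extract", python_callable=extract)
      load_task = PythonOperator(task_id="load", python_callable=load)

      extract_task >> load_task  # run extract before load

Keeping extract and load as separate tasks makes each step independently retryable and observable in the scheduler, which supports the monitoring and optimization responsibilities listed above.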

REQUIRED COMPETENCIES

  • Expertise in designing, building, and maintaining scalable data pipelines and ETL processes.
  • Knowledge of cloud data architectures and best practices in data engineering.
  • Knowledge of data management disciplines such as data governance and quality, business intelligence, data architecture and strategy, data security and privacy, data integration, master data management, and metadata management.
  • Ability to analyze complex data challenges and implement innovative solutions.
  • Strong debugging and troubleshooting skills for optimizing data workflows.
  • Ability to work cross-functionally with business and technical teams to deliver data-driven solutions.
  • Strong communication skills to articulate technical concepts to non-technical stakeholders.
  • Proven track record in delivering high-quality data solutions within deadlines.
  • Focus on continuous improvement and innovation in data engineering practices.
  • Willingness to learn new technologies and adapt to the evolving data landscape.
  • Proactive in staying updated on emerging trends in big data and cloud computing.

JOB QUALIFICATIONS

  • Malaysian citizen.
  • Bachelor’s degree in Computer Science, Data Science, Data Engineering, Information Systems, or equivalent qualification from accredited higher learning institutions.
  • 7 years of experience in data engineering, ETL development, or data pipeline optimization.
  • Hands-on experience with cloud data platforms such as AWS, Azure, or Google Cloud.
  • Proficiency in SQL, Python, or other programming languages relevant to data processing.
  • Experience with big data processing frameworks (Spark, Hadoop) and orchestration tools (Airflow, DBT).
  • Knowledge of data modeling, warehousing principles, and schema design best practices.
  • Familiarity with version control tools (Git) and CI/CD pipelines for data engineering workflows.
  • Strong problem-solving skills and ability to optimize data processes for performance.

JOB STATUS

Permanent

All applications are strictly CONFIDENTIAL and only shortlisted candidates will be called in for an interview. Applications are deemed UNSUCCESSFUL if there is no feedback from the EPF within 2 MONTHS after the closing date of the advertisement.
