Data Engineer

AKKODIS SINGAPORE PTE. LTD.

Singapore

On-site

SGD 80,000 - 120,000

Full time

11 days ago

Job summary

A global leader in digital engineering is seeking a Senior Data Engineer to improve system resiliency and efficiency through CI/CD and SRE practices. The ideal candidate has over 5 years of experience, strong knowledge of cloud platforms, and is proficient in Python and SQL. This role offers a unique opportunity to work on critical data products in a collaborative environment.

Qualifications

  • Minimum of 5 years' experience in data engineering.
  • Strong knowledge of data security and governance.
  • Experience with CI/CD pipeline builds.

Responsibilities

  • Collaborate with users to translate data requirements into specifications.
  • Architect and design data products for engineering teams.
  • Monitor and maintain databases including capacity planning.

Skills

System design
Data structures
Algorithms
Data modeling
Cloud platforms (AWS, Azure, Google Cloud)
Python programming
SQL
Shell scripting

Education

Bachelor’s degree in Computer Science, Software Engineering, IT, or a related field

Tools

Databricks
AWS
Azure
Google Cloud
Airflow
Azure Data Factory
Docker
Git
Terraform

Job description

About Akkodis

Akkodis is a global leader in digital engineering, offering transformative solutions across Talent, Academy, Consulting, and Solutions services. With a team of over 50,000 experts, we drive innovation in various sectors by leveraging cutting-edge technologies and deep industry expertise. Our mission is to engineer a smarter future and help our clients stay ahead in an ever-evolving digital landscape.

About the Role

We are seeking a Senior Data Engineer to partner directly with client agencies, helping them enhance development efficiency and system resiliency through optimized Continuous Integration/Continuous Deployment (CI/CD) and Site Reliability Engineering (SRE) practices. As a forward-deployed engineer, you will work closely with stakeholders to address complex challenges, gather critical insights, and drive continuous improvement. Your expertise will ensure that our Core Engineering Products and solutions evolve to meet the real-world needs of our clients.

Responsibilities
  • Collaborate with business users to translate data requirements into robust technical specifications.
  • Partner with IT teams to ensure alignment on technology stack, infrastructure, and security standards.
  • Architect, design, and implement data products as part of a collaborative data engineering team.
  • Develop scalable ingestion pipelines to collect, clean, and harmonize data from diverse source systems.
  • Monitor and maintain databases and ETL systems, including capacity planning, performance tuning, and issue prevention.
  • Design and maintain reusable, high-quality data models to support business analytics and reporting.
  • Build secure and efficient access mechanisms for end users and systems to interact with the data warehouse.
  • Research, propose, and implement emerging technologies and best practices to strengthen data infrastructure.
  • Partner with data stewards to establish and enforce governance policies, standards, and procedures.
  • Maintain an up-to-date data catalog documenting assets, metadata, and lineage.
  • Implement data validation and quality assurance processes to ensure accuracy, consistency, and reliability.
  • Apply data security measures such as access control, encryption, and masking to safeguard sensitive information.

Requirements
  • Bachelor’s degree in Computer Science, Software Engineering, Information Technology, or a related field.
  • Minimum of 5 years' experience in data engineering.
  • Strong knowledge of system design, data structures, algorithms, data modeling, access, and storage.
  • Hands-on experience with cloud platforms such as AWS, Azure, or Google Cloud.
  • Proficiency in Databricks for big data processing and analytics.
  • Proven track record in building and maintaining batch and real-time data pipelines.
  • Experience with orchestration frameworks (e.g., Airflow, Azure Data Factory).
  • Strong programming skills in Python, SQL, and Shell scripting.

Preferred Qualifications
  • Experience building and maintaining CI/CD pipelines.
  • Familiarity with DevOps tools such as Docker, Git, and Terraform.
  • Background in implementing processes to ensure data security, quality, and governance.
  • Understanding of government systems, policies, and compliance requirements related to data.
  • Domain knowledge in climate and weather data is a strong plus.