Lead Data Engineer

The Great Eastern Life Assurance Company Limited

Singapore

On-site

SGD 100,000 - 130,000

Full time

Job summary

A leading life insurance firm in Singapore is seeking a skilled Data Engineer to develop and maintain robust ETL solutions. The candidate will work with cross-functional teams to ensure data quality and alignment with business and technical requirements. Responsibilities include designing scalable data pipelines, optimizing data models, and driving process improvements. The ideal candidate has at least 10 years of experience in data engineering, strong problem-solving skills, and proficiency in big data technologies such as Hadoop and Spark.

Qualifications

  • 10+ years of experience in data engineering, preferably in life insurance.
  • Proven experience in ETL development and big data technologies.
  • Strong team player, detail-oriented, and capable of working under pressure.

Responsibilities

  • Design, develop, test, and maintain ETL pipelines.
  • Collect, refine, and integrate new datasets.
  • Create optimized data models aligned with architecture standards.
  • Conduct code reviews and ensure delivery quality.

Skills

Data pipeline development
ETL solutions
Big data technologies
Hadoop
Spark
Hive
Cloud services (AWS, Azure, GCP)
Problem-solving
Interpersonal skills

Education

Diploma with at least 10 years’ experience

Job description

We are seeking a skilled and detail-oriented Data Engineer to design, develop, and maintain robust data pipelines and ETL solutions. This role involves working closely with cross-functional teams to ensure data quality, scalability, and alignment with business and technical requirements.

  • Design, develop, test, and maintain scalable ETL pipelines to meet business, technical, and user requirements.
  • Collect, refine, and integrate new datasets. Maintain comprehensive documentation and data mappings across multiple systems.
  • Create optimized and scalable data models that align with organizational data architecture standards and best practices.
  • Conduct code reviews and perform rigorous testing to ensure high-quality deliverables.
  • Drive continuous improvement in data quality through optimization, testing, and solution design reviews.
  • Ensure all solutions conform to big data architecture guidelines and the long-term roadmap.
  • Implement robust monitoring, logging, and alerting systems to ensure pipeline reliability and data accuracy.
  • Apply best practices in data engineering to design and build reliable data marts within the Hadoop ecosystem for planning, reporting, and analytics.
  • Maintain and optimize data pipelines to ensure data accuracy, integrity, and timeliness.
  • Manage code in a centralized repository with clear branching strategies and well-documented commit messages.
  • Coordinate with stakeholders to ensure smooth production deployment and adherence to data governance policies.
  • Proactively identify and implement improvements to data engineering processes and workflows.
  • Architect end-to-end solutions for insurance data modeling in the data warehouse, including data acquisition, contextualization, and integration with business processes.
  • Act as a business process owner for onboarding users and data products onto the data platform and pipelines supporting dashboards and statistical models.
  • Ensure adherence to development standards and perform periodic reviews to maintain pipeline performance and sustainability.
  • Coordinate and conduct testing with stakeholders to ensure effective deployment of data pipelines and dashboards.
  • Monitor data pipelines continuously and collaborate with stakeholders to troubleshoot and optimize performance.
Requirements

  • Diploma with at least 10 years’ working experience, preferably in Life Insurance.
  • Proven experience in data engineering, ETL development, and big data technologies.
  • A strong team player who is meticulous, detail-oriented, and capable of performing under pressure.
  • Proficiency in tools and platforms such as Hadoop, Spark, Hive, and cloud data services (e.g., AWS, Azure, GCP).
  • Strong problem-solving and interpersonal skills.
  • Committed, dependable, and adaptable, with the flexibility to provide support during peak periods and tight deadlines.