Talend Big Data Developer

ELLIOTT MOSS CONSULTING PTE. LTD.

Singapore

On-site

SGD 60,000 - 90,000

Full time

14 days ago

Job summary

A leading company is seeking an ETL Developer to strengthen its data engineering team. The successful candidate will design, build, and maintain ETL pipelines, ensuring data quality and alignment with target schemas. This role involves collaboration across functions and contributions to data quality improvement initiatives.

Qualifications

  • Proven experience with Talend, Python, and Apache Spark.
  • Strong understanding of relational databases and Big Data ecosystems.
  • Solid experience in data warehousing and data modeling techniques.

Responsibilities

  • Design, build, and maintain ETL workflows using the Talend ETL toolset.
  • Collaborate with teams to ensure accurate data mapping.
  • Write transformation logic using ETL tools or scripting languages.

Skills

Talend
Python
Apache Spark
Relational Databases
Data Warehousing
Data Quality Management
Data Modeling Techniques
Data Visualization

Tools

Cloudera
PostgreSQL
SQL Server

Job Description

We are looking for a skilled ETL Developer with hands-on experience in Talend, Python, and Spark to join our data engineering team. The ideal candidate will be responsible for designing, building, and maintaining ETL pipelines that support data extraction, transformation, and loading from various sources into target systems.

Key Responsibilities:

  • Design, build, and maintain ETL workflows using the Talend ETL toolset.
  • Develop ETL solutions for extracting and transforming data from various sources such as Cloudera, PostgreSQL, and SQL Server.
  • Create and manage database schemas, tables, and constraints based on business requirements.
  • Collaborate with cross-functional teams to understand source systems and ensure accurate data mapping and transformation.
  • Write transformation logic using ETL tools or scripting languages like SQL and Python (an illustrative sketch follows this list).
  • Ensure data is clean, validated, and aligned with target schema and data quality standards.
  • Contribute to data quality improvement initiatives and proactively resolve data inconsistencies.
  • Participate in troubleshooting and performance tuning of ETL jobs and workflows.
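
For illustration, the sketch below shows the kind of extract-transform-load step these responsibilities describe, written in PySpark. It is a minimal sketch only: the connection URL, credentials, table name (public.orders), and column names (order_id, customer_id, amount) are hypothetical placeholders, and it assumes the PostgreSQL JDBC driver is available to Spark.

# Minimal PySpark ETL sketch; connection details, table, and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_etl_sketch").getOrCreate()

# Extract: pull a source table from PostgreSQL over JDBC.
orders = (
    spark.read.format("jdbc")
    .option("url", "jdbc:postgresql://source-host:5432/sales")  # hypothetical host/database
    .option("dbtable", "public.orders")                         # hypothetical table
    .option("user", "etl_user")
    .option("password", "***")
    .load()
)

# Transform: basic cleansing and validation against the target schema
# (trim identifiers, drop rows missing the key, flag negative amounts).
cleaned = (
    orders
    .withColumn("customer_id", F.trim(F.col("customer_id")))
    .filter(F.col("order_id").isNotNull())
    .withColumn("is_valid_amount", F.col("amount") >= 0)
)

# Load: write the conformed data to a target location (Parquet here; a Hive
# table or another JDBC target would follow the same pattern).
cleaned.write.mode("overwrite").parquet("/warehouse/curated/orders")

spark.stop()

In a Talend-centred workflow, the same extract, map, and load steps would typically be modelled as Talend job components, with Python or Spark code of this kind used where data volumes or transformation complexity warrant it.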

Required Skills & Qualifications:

  • Proven experience with Talend, Python, and Apache Spark.
  • Strong understanding of relational databases and Big Data ecosystems (Hive, Impala, HDFS).
  • Solid experience in data warehousing and data modeling techniques.
  • Familiarity with data quality management and best practices.
  • Experience with data visualization and analytics tools is a plus.

Nice to Have:

  • Experience with scheduling tools and CI/CD pipelines.
  • Knowledge of data governance frameworks and practices.
  • Exposure to cloud platforms like AWS, Azure, or GCP is a plus.