
Data Engineer

PT Japfa Comfeed Indonesia, Tbk

Surabaya

On-site

IDR 200.000.000 - 300.000.000

Full time

Today

Job summary

A leading Indonesian company is seeking a Data Transformation Engineer to design, build, and maintain scalable data pipelines. The role requires expertise in data modeling, SQL, and cloud platforms; candidates with experience in ETL processes and data governance are preferred, and fresh graduates are also welcome to apply. The position involves close collaboration with business teams to translate their needs into technical solutions.

Qualifications

  • Experience in data engineering, data transformation, or data integration is a plus.
  • Proven experience designing ETL/ELT data pipelines.
  • Hands-on experience with cloud-based data platforms.
  • Strong proficiency in SQL and programming languages.

Responsibilities

  • Design and maintain ETL/ELT pipelines for data.
  • Collaborate to define data requirements.
  • Build and optimize data models for BI tools.
  • Ensure data accuracy and compliance.
  • Automate data workflows.

Skills

Data modeling
SQL
ETL/ELT processes
Cloud platforms
Analytical thinking
Automation with Python
Collaboration

Education

Bachelor’s degree in Information Technology
Master’s degree is a plus

Tools

Snowflake
Informatica
dbt
Apache Airflow
Power BI

Job description
Overview

The Data Transformation Engineer is responsible for designing, building, and maintaining scalable data pipelines and transformation processes that enable reliable data flow across systems. This role bridges business needs and technical implementation by translating functional data requirements into optimized data models and ETL/ELT workflows.

The engineer ensures data is properly cleansed, structured, and enriched for analytical and operational use—supporting reporting, business intelligence (BI), data science, and digital initiatives. The position requires strong skills in data modeling, SQL, and cloud data platforms (e.g., Snowflake, AWS, Azure, GCP), with an understanding of data governance, quality, and performance optimization.

Responsibilities

  • Design, develop, and maintain ETL/ELT pipelines for data ingestion, transformation, and integration.
  • Collaborate with business and analytics teams to define data requirements and translate them into technical solutions.
  • Build and optimize data models to support BI tools, dashboards, and advanced analytics.
  • Ensure data accuracy, consistency, and compliance with data governance policies.
  • Automate data workflows and monitor pipeline performance for reliability and scalability.
  • Work closely with data architects and functional teams to improve overall data ecosystem efficiency.
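To illustrate the kind of pipeline work described above, here is a minimal, self-contained sketch of an extract-transform-load step in Python. The source records, table name, and column layout are all hypothetical; a real pipeline at this scale would run on the tools named in the posting (dbt, Airflow, a cloud warehouse), but the cleanse-structure-load pattern is the same.

```python
import sqlite3

# Hypothetical raw records, standing in for data arriving from an ingestion source.
RAW_ROWS = [
    {"id": 1, "region": " East Java ", "revenue": "1200.50"},
    {"id": 2, "region": "Jakarta", "revenue": "980.00"},
    {"id": 3, "region": None, "revenue": "450.25"},  # incomplete: dropped by transform
]

def transform(rows):
    """Cleanse and structure raw records: drop incomplete rows, trim text, cast types."""
    clean = []
    for r in rows:
        if r["region"] is None:
            continue
        clean.append((r["id"], r["region"].strip(), float(r["revenue"])))
    return clean

def load(rows, conn):
    """Load transformed rows into a reporting table for downstream BI queries."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS sales (id INTEGER, region TEXT, revenue REAL)"
    )
    conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", rows)
    conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")  # in-memory database for the sketch
    load(transform(RAW_ROWS), conn)
    total = conn.execute("SELECT ROUND(SUM(revenue), 2) FROM sales").fetchone()[0]
    print(total)  # 2180.5
```

In practice each stage would also emit data-quality metrics (rows dropped, type-cast failures) so the pipeline's reliability can be monitored, which is the automation-and-monitoring responsibility listed above.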

Qualifications

  • Bachelor’s degree in Information Technology, Information Systems, or a related field; Master’s degree is a plus.
  • Experience in data engineering, data transformation, or data integration roles is a plus; fresh graduates are welcome.
  • Proven experience designing, developing, and maintaining ETL/ELT data pipelines for structured and semi-structured data using tools such as dbt, Informatica, Talend, or Apache Airflow.
  • Hands-on experience working with cloud-based data platforms such as Snowflake, Databricks, Google BigQuery, Amazon Redshift, or Azure Synapse.
  • Strong proficiency in SQL and practical experience with at least one programming language (e.g., Python, Scala, or Java) for automation and data manipulation.
  • Demonstrated experience integrating and preparing data for BI tools, such as Power BI, Qlik Sense, or Tableau, ensuring optimized data structures for visualization and analysis.
  • Experience collaborating closely with business analysts, functional teams, and data consumers to translate business requirements into technical data solutions.
  • Exposure to version control systems (Git) and CI/CD workflows, ideally within an Agile development environment.
  • Technical skills: SQL, Python or Scala; ELT/ETL tools (dbt, Informatica, Talend); Cloud data platforms (Snowflake, Databricks, BigQuery, Redshift); CI/CD and version control (Git, Jenkins).
  • Analytical thinking with the ability to interpret complex data processes, identify root causes, and propose practical solutions.
  • Business acumen with the ability to translate business processes into technical data solutions that add measurable value.
  • Communication, collaboration, problem-solving, attention to detail, and a continuous-learning mindset.