
Data Engineer

PT Japfa Comfeed Indonesia, Tbk

Surabaya

On-site

IDR 200.000.000 - 300.000.000

Full time

Yesterday


Job summary

A leading data solutions company in Surabaya is seeking a Data Transformation Engineer. In this role, you will design, build, and maintain scalable data pipelines that bridge business needs and technical implementation. The ideal candidate has a Bachelor's degree in Information Technology, strong competency in SQL, and experience with cloud data platforms like Snowflake and AWS. This position offers the chance to work closely with cross-functional teams to optimize data models and ensure reliable data flow across systems.

Qualifications

  • Experience in data engineering or data integration roles.
  • Hands-on with cloud-based data platforms like Snowflake or Azure.
  • Proficiency in SQL and at least one programming language.

Responsibilities

  • Design and maintain ETL/ELT pipelines for data integration.
  • Collaborate with analytics teams to define data requirements.
  • Ensure data accuracy and compliance with governance policies.

Skills

SQL
Python
Analytical thinking
Collaboration

Education

Bachelor's degree in Information Technology
Master's degree (a plus)

Tools

Snowflake
Informatica
Apache Airflow
Tableau

Job description

The Data Transformation Engineer is responsible for designing, building, and maintaining scalable data pipelines and transformation processes that enable reliable data flow across systems. This role bridges business needs and technical implementation by translating functional data requirements into optimized data models and ETL/ELT workflows.

The engineer ensures data is properly cleansed, structured, and enriched for analytical and operational use—supporting reporting, business intelligence (BI), data science, and digital initiatives. The position requires strong skills in data modeling, SQL, and cloud data platforms (e.g., Snowflake, AWS, Azure, GCP), with an understanding of data governance, quality, and performance optimization.

Job Responsibilities
  • Design, develop, and maintain ETL/ELT pipelines for data ingestion, transformation, and integration.
  • Collaborate with business and analytics teams to define data requirements and translate them into technical solutions.
  • Build and optimize data models to support BI tools, dashboards, and advanced analytics.
  • Ensure data accuracy, consistency, and compliance with data governance policies.
  • Automate data workflows and monitor pipeline performance for reliability and scalability.
  • Work closely with data architects and functional teams to improve overall data ecosystem efficiency.
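To make the pipeline responsibilities above concrete, here is a minimal, illustrative Python sketch of a single ETL step: extract raw records, cleanse and structure them, then load the result. All function and field names are hypothetical examples, not part of the role description; a real pipeline would read from a source system and write to a warehouse such as Snowflake.

```python
def extract():
    # Hypothetical source data; in practice this would come from a
    # database, API, or file feed. Note the messy values and duplicate.
    return [
        {"id": 1, "amount": " 120.50 ", "region": "east"},
        {"id": 2, "amount": "80", "region": None},
        {"id": 2, "amount": "80", "region": None},  # duplicate record
    ]

def transform(rows):
    # Cleanse and structure: trim whitespace, cast types, fill missing
    # values, and de-duplicate on the primary key.
    seen, out = set(), []
    for r in rows:
        if r["id"] in seen:
            continue
        seen.add(r["id"])
        out.append({
            "id": r["id"],
            "amount": float(str(r["amount"]).strip()),
            "region": r["region"] or "unknown",
        })
    return out

def load(rows, target):
    # Stand-in for writing to a warehouse table.
    target.extend(rows)

warehouse = []
load(transform(extract()), warehouse)
```

After the run, `warehouse` holds two cleansed rows: the duplicate is dropped, amounts are numeric, and the missing region is filled with a default. Orchestration tools named in the posting (dbt, Airflow) schedule and monitor steps of exactly this shape.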
Job Requirements
  • Bachelor’s degree in Information Technology, Information Systems, or a related field; Master’s degree is a plus.
  • Experience in data engineering, data transformation, or data integration roles is a plus; fresh graduates are welcome.
  • Proven experience designing, developing, and maintaining ETL/ELT data pipelines for structured and semi-structured data using tools such as dbt, Informatica, Talend, or Apache Airflow.
  • Hands-on experience working with cloud-based data platforms such as Snowflake, Databricks, Google BigQuery, Amazon Redshift, or Azure Synapse.
  • Strong proficiency in SQL and practical experience with at least one programming language (e.g., Python, Scala, or Java) for automation and data manipulation.
  • Demonstrated experience integrating and preparing data for BI tools, such as Power BI, Qlik Sense, or Tableau, ensuring optimized data structures for visualization and analysis.
  • Experience collaborating closely with business analysts, functional teams, and data consumers to translate business requirements into technical data solutions.
  • Exposure to version control systems (Git) and CI/CD workflows, ideally within an Agile development environment.
Technical Skills
  • SQL, Python, or Scala.
  • ELT tools (e.g., dbt, Informatica, Talend).
  • Cloud data platforms (e.g., Snowflake, Databricks, BigQuery, Redshift).
  • CI/CD and version control (e.g., Git, Jenkins).
Soft Skills
  • Analytical thinking and problem-solving to interpret complex data processes and propose practical solutions.
  • Business acumen to translate processes into technically valuable data solutions.
  • Communication skills to interact with both technical and non-technical stakeholders.
  • Collaboration and teamwork across IT, data, and business units.
  • Continuous learning to stay updated with emerging data technologies and best practices.