Data Engineer

Multi Finance Industry

Daerah Khusus Ibukota Jakarta

On-site

IDR 1.169.981.000 - 1.671.403.000

Full time

Job summary

A leading financial services firm is looking for an experienced Data Engineer to work on financial projects in Jakarta. This role involves designing data models, implementing ETL processes, and optimizing data quality. The ideal candidate should be proficient in SQL, Python, and Spark, with good analytical and communication skills. Experience in financial projects is preferred, alongside a strong understanding of data warehousing. This position offers a chance to contribute to critical business intelligence initiatives.

Qualifications

  • At least 3 years of experience as a Data Engineer.
  • Strong SQL query skills.
  • Experience in Java API development.
  • Experience in SSRS reporting.

Responsibilities

  • Design data models according to business requirements.
  • Implement ETL and data pipelines for financial projects.
  • Optimize ETL to improve code efficiency.
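The ETL responsibilities above follow the standard extract–transform–load pattern; a minimal sketch in Python using the standard library's sqlite3 (the table and column names are illustrative assumptions, not from any listed employer):

```python
import sqlite3

def run_etl(conn: sqlite3.Connection) -> int:
    """Extract raw transactions, transform amounts, load into a reporting table."""
    cur = conn.cursor()
    # Extract: pull raw rows from the source table.
    rows = cur.execute("SELECT id, amount_cents FROM raw_transactions").fetchall()
    # Transform: convert cents to a currency amount, drop invalid rows.
    cleaned = [(tx_id, cents / 100.0)
               for tx_id, cents in rows
               if cents is not None and cents >= 0]
    # Load: write into the reporting table in one transaction.
    cur.executemany("INSERT INTO fact_transactions (id, amount) VALUES (?, ?)", cleaned)
    conn.commit()
    return len(cleaned)

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE raw_transactions (id INTEGER, amount_cents INTEGER);
        CREATE TABLE fact_transactions (id INTEGER, amount REAL);
        INSERT INTO raw_transactions VALUES (1, 2500), (2, NULL), (3, 990);
    """)
    print(run_etl(conn))  # prints 2: the NULL row is filtered out
```

In production this pattern is typically wrapped in an orchestrator such as Airflow or an ETL tool like NiFi, but the extract, transform, and load steps remain the same.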

Skills

  • Proficient in Spark, Python, and Scala
  • Performance tuning
  • Familiar with data warehouse modeling
  • ETL processes
  • Communication skills

Education

Bachelor's degree in Engineering

Tools

  • Tableau
  • ETL tools such as NiFi
  • Apache Hadoop
  • Spark
  • Kafka

Job description

Explore exciting Data Engineer job opportunities in Jakarta. Data Engineers are in high demand, tasked with designing, building, and maintaining data pipelines and systems. These professionals play a crucial role in transforming raw data into usable information for analysis and decision‑making, supporting business intelligence and data science initiatives.

Jakarta's job market offers diverse roles for Data Engineers, from entry‑level positions to senior leadership roles. These positions involve working with big data technologies, cloud platforms, and various programming languages. Companies are seeking skilled individuals who can optimize data flow and ensure data quality, contributing to data‑driven strategies.

Whether you're an experienced Data Engineer or a recent graduate, Jakarta presents a landscape of opportunities to advance your career. Key skills include proficiency in SQL, Python, ETL processes, and data warehousing solutions. Start your search today to find the perfect Data Engineer role that matches your skills and career aspirations in Jakarta.

Job 1 – Data Engineer (Financial Projects)

Job Responsibilities

  • Participate in the construction and development of data warehouses for financial projects, design data models according to business requirements, and implement ETL.
  • Provide data report support for business departments.
  • Participate in optimizing ETL to improve code efficiency and reduce costs.

Job Qualifications

  • Proficient in Spark, Python, and Scala, with good experience in performance tuning.
  • Familiar with various data warehouse modeling theories, proficient in data model design and data layer design.
  • Priority given to those with Tableau development experience.
  • Priority given to those with a background in finance and e‑commerce projects.
  • Diligent, responsible, meticulous, with good team spirit, analytical skills, and communication skills.
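The data warehouse modeling and data layer design asked for above usually mean a star schema: a central fact table joined to dimension tables. A hedged sketch using sqlite3, with fact and dimension names chosen purely for illustration:

```python
import sqlite3

# Minimal star schema: one fact table keyed to two dimension tables.
# Table and column names are illustrative, not from the job posting.
SCHEMA = """
CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE dim_date     (date_key INTEGER PRIMARY KEY, iso_date TEXT);
CREATE TABLE fact_payment (
    payment_id   INTEGER PRIMARY KEY,
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    date_key     INTEGER REFERENCES dim_date(date_key),
    amount       REAL
);
"""

def total_by_customer(conn: sqlite3.Connection) -> dict:
    """A typical star-schema query: join the fact table to a dimension and aggregate."""
    rows = conn.execute("""
        SELECT c.name, SUM(f.amount)
        FROM fact_payment f JOIN dim_customer c USING (customer_key)
        GROUP BY c.name
    """).fetchall()
    return dict(rows)

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.executescript(SCHEMA)
    conn.execute("INSERT INTO dim_customer VALUES (1, 'Acme'), (2, 'Beta')")
    conn.execute("INSERT INTO dim_date VALUES (20240101, '2024-01-01')")
    conn.executemany("INSERT INTO fact_payment VALUES (?, ?, ?, ?)",
                     [(1, 1, 20240101, 100.0),
                      (2, 1, 20240101, 50.0),
                      (3, 2, 20240101, 30.0)])
    print(total_by_customer(conn))  # {'Acme': 150.0, 'Beta': 30.0}
```

The same layered design (raw staging, conformed dimensions, facts, reporting marts) carries over directly to Spark or cloud warehouses such as BigQuery.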

Job 2 – Data Engineer (Devoteam)

Responsibilities

  • Work closely with data architects and other stakeholders to design scalable and robust data architectures.
  • Develop and maintain data pipelines, ensuring data extraction, transformation, and loading into data warehouses or storage systems.
  • Manage data warehouses and lakes, ensuring performance, scalability, and security.
  • Integrate data from different sources (databases, APIs, external systems) to create unified datasets.
  • Perform data transformations, implementing ETL processes.
  • Collaborate with data scientists, analysts, and stakeholders to establish data quality standards and governance.
  • Optimize data processing and storage systems.

Requirements

  • Proficiency in Python, Java, Scala, or SQL.
  • Experience with relational databases and DBMS.
  • Knowledge of Apache Hadoop, Spark, Kafka.
  • Experience with ETL tools such as NiFi, Talend, or Informatica.
  • Familiarity with data warehousing platforms like Redshift, BigQuery, or Snowflake.
  • Experience with cloud platforms (AWS, Azure, GCP).
  • Knowledge of Airflow, Luigi, or Oozie.
  • Strong problem‑solving and analytical skills.
  • Excellent communication and collaboration abilities.
  • Bachelor's degree in Engineering and a minimum of two years' experience.

Job 3 – Data Engineer (PT Intikom Berlian Mustika)

Requirements

  • At least 3 years of experience as a Data Engineer.
  • Strong SQL query, function, and procedure skills.
  • Experience in Java API development.
  • Experience in SSRS reporting.
  • Knowledge of Power BI, Google BigQuery, ETL, and data pipelines.
  • Python data science skills are an advantage.

Job 4 – Data Engineer (Contract)

Scope of Work

  • Explore database DI/DX (data pipelines, ETL, data integration).
  • Develop Datamarts to support DI/DX needs.
  • Handle ad‑hoc data queries, extraction, and preparation.
  • Validate extracted data for quality and reliability.
  • Ensure data consistency and accessibility for stakeholders.

Requirements

  • At least 4 years' experience in building and managing data pipelines, ETL, and integration.
  • Strong SQL skills and experience with data warehouse and Datamart technologies.
  • BS degree preferred.
  • Contract term: 12 months.

Job 5 – Data Engineer (PT Astra Graphia Information Technology)

Responsibilities

  • Build and optimize robust data pipelines for ETL from multiple sources.
  • Integrate data from heterogeneous sources ensuring quality and availability.
  • Monitor data system performance, identify and resolve bottlenecks.

Job 6 – Data Engineer (PT Mandiri Sekuritas)

Responsibilities

  • Design, develop, and manage data pipelines and infrastructure.
  • Implement data workflows, optimize performance, and ensure consistency.
  • Collaborate with cross‑functional teams for data integration and utilization.

Qualifications

  • 3–5 years' experience building large‑scale pipelines in cloud or hybrid environments.
  • Strong skills in SQL, Python, Java, Bash scripting.
  • Experience with GCP, Azure, relational and NoSQL databases, Hadoop.
  • Production experience with Airflow, Spark, Flink.
  • Experience in CI/CD, Docker, Kubernetes.
  • Knowledge of financial services data and security frameworks.
  • Excellent communicator, able to explain pipelines to non‑technical stakeholders.

Job 7 – Data Engineer (Cube)

Responsibilities

  • Build and maintain scalable data pipelines for e‑commerce data.
  • Design and implement a cloud‑based data lakehouse architecture.
  • Explore high‑performance data transformation and analysis tools.
  • Collaborate with data analysts to deliver accessible datasets.
  • Collaborate with architects on data platform design.
  • Enhance data quality and reliability.

Requirements

  • Knowledge of Java and Python.
  • Hands‑on experience with SQL database design.
  • Previous experience as a data engineer or similar role.
  • Experience with data models, mining, and segmentation.
  • Strong numerical and analytical skills.
  • Willingness to learn new tools and technologies.

Job 8 – Data Engineer (Accord Innovations)

Responsibilities

  • Design, develop, deploy, and maintain ETL workflows using SSIS.
  • Build and maintain SSRS reports.
  • Collaborate with business analysts and stakeholders.
  • Optimize ETL packages for performance and scalability.
  • Perform data profiling, validation, and reconciliation.
  • Maintain and improve existing SSIS/SSRS solutions.
  • Document ETL designs and data mappings.

Job 9 – Data Engineer (Digital Finance Platform)

Key Responsibilities

  • Develop and maintain scalable data infrastructure and pipelines.
  • Build and manage data ingestion and transformation workflows.
  • Apply best practices for stability, availability, and performance.
  • Partner with engineering, data science, and product teams.
  • Design and build large‑scale, efficient data pipelines.
  • Translate user needs into tools and platform capabilities.

Qualifications

  • 1–2 years' experience in data engineering or backend development.
  • Strong coding skills in Python, Java, or Scala.
  • Familiar with Git, Maven, Docker, Kubernetes.
  • Experience with Spark, Kafka, Flink, Airflow.
  • Comfortable with MySQL, MongoDB.
  • Experience with cloud‑based data platforms.
  • Strong analytical mindset; proactive communicator.
  • Motivated to stay current with emerging technologies.
  • Working knowledge of automation and DevOps practices.

Equal Opportunity Statement

The Devoteam Group is committed to equal opportunities, promoting its employees on the basis of merit and actively fighting against all forms of discrimination. All positions are open to people with disabilities.
