
Data Engineer

PT Idstar Cipta Teknologi (Idstar)

Special Capital Region of Jakarta

On-site

IDR 200.000.000 - 300.000.000

Full time

Today

Job summary

A leading technology solutions provider in Jakarta is seeking a skilled Data Engineer responsible for designing, developing, and managing data pipelines and infrastructure. Ideal candidates should possess strong knowledge in SQL, Python, and big data technologies. The role requires collaboration with cross-functional teams to ensure optimal data processing and quality standards. Opportunities in this exciting field await you in Jakarta.

Qualifications

  • Minimum 3 years of experience as a Data Engineer.
  • Experience with SQL queries, SQL functions, and SQL stored procedures is mandatory.
  • Diligent, responsible, and meticulous, with good team spirit.

Responsibilities

  • Develop and maintain data pipelines for data extraction, transformation, and loading.
  • Collaborate with data architects to design data architectures.
  • Ensure data quality and consistency.

Skills

Spark
Python
Scala
SQL
ETL processes
Data warehousing

Education

Bachelor's degree in Engineering

Tools

Tableau
Apache Airflow
Google Cloud Platform

Job description

Explore exciting Data Engineer job opportunities in Jakarta. Data Engineers are in high demand, tasked with designing, building, and maintaining data pipelines and systems. These professionals play a crucial role in transforming raw data into usable information for analysis and decision-making, supporting business intelligence and data science initiatives.

Jakarta's job market offers diverse roles for Data Engineers, from entry-level positions to senior leadership roles. These positions involve working with big data technologies, cloud platforms, and various programming languages. Companies are seeking skilled individuals who can optimize data flow and ensure data quality, contributing to data-driven strategies.

Whether you're an experienced Data Engineer or a recent graduate, Jakarta presents a landscape of opportunities to advance your career. Key skills include proficiency in SQL, Python, ETL processes, and data warehousing solutions. Start your search today to find the perfect Data Engineer role that matches your skills and career aspirations in Jakarta.

Showing 186 Data Engineer jobs in Jakarta

Kredit Pintar – Jakarta, Jakarta · IDR 9,000,000 - 12,000,000

Posted 1 day ago

Job Responsibilities
  • Participate in the construction and development of data warehouses for financial projects, design models according to business requirements, and implement ETL.
  • Provide data report support for business departments.
  • Participate in optimizing ETL to improve code efficiency and reduce costs.
Job Qualifications
  • Proficient in Spark, able to develop in Python and Scala, with good experience in performance tuning;
  • Familiar with various data warehouse modeling theories, proficient in data model design and data layer design;
  • Priority will be given to those with Tableau development experience;
  • Priority will be given to those with a background in finance and e-commerce projects;
  • Diligent, responsible, meticulous, with good team spirit, analytical skills, and communication skills.
  • Spark;
  • Python;
  • Scala;
Job Description – Devoteam
  • Devoteam is a leading consulting firm focused on digital strategy, tech platforms, and cybersecurity.

The Data Engineer will be responsible for the following activities:

  • Work closely with data architects and other stakeholders to design scalable and robust data architectures that meet the organization's requirements
  • Develop and maintain data pipelines, which involve the extraction of data from various sources, data transformation to ensure quality and consistency, and loading the processed data into data warehouses or other storage systems
  • Responsible for managing data warehouses and data lakes, ensuring their performance, scalability, and security
  • Integrate data from different sources, such as databases, APIs, and external systems, to create unified and comprehensive datasets.
  • Perform data transformations and implement Extract, Transform, Load (ETL) processes to convert raw data into formats suitable for analysis and reporting
  • Collaborate with data scientists, analysts, and other stakeholders to establish data quality standards and implement data governance practices
  • Optimise data processing and storage systems for performance and scalability

  • Collaborate with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements and deliver solutions

Programming Skills: Proficiency in programming languages such as Python, Java, Scala, or SQL is essential for data engineering roles. Data engineers should have experience in writing efficient and optimized code for data processing, transformation, and integration.

Database Knowledge: Strong knowledge of relational databases (e.g., SQL) and experience with database management systems (DBMS) is crucial. Familiarity with data modeling, schema design, and query optimization is important for building efficient data storage and retrieval systems.
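As a small illustration of the schema design and query-optimization skills described above, here is a sketch using Python's built-in sqlite3 module (table and column names are invented for the example, not taken from any listing's actual stack):

```python
import sqlite3

# In-memory database for illustration; a production warehouse would use a DBMS server.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE orders (
        order_id   INTEGER PRIMARY KEY,
        customer   TEXT NOT NULL,
        amount_idr INTEGER NOT NULL,
        created_at TEXT NOT NULL
    )
""")
conn.executemany(
    "INSERT INTO orders (customer, amount_idr, created_at) VALUES (?, ?, ?)",
    [("andi", 150_000, "2024-01-05"),
     ("budi", 90_000, "2024-01-06"),
     ("andi", 60_000, "2024-01-07")],
)
# An index on the filter column lets the query planner avoid a full table scan.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer)")

total = conn.execute(
    "SELECT SUM(amount_idr) FROM orders WHERE customer = ?", ("andi",)
).fetchone()[0]
print(total)  # 210000
```

The same ideas (typed schema, parameterized queries, indexing the columns you filter on) carry over to server DBMSs such as PostgreSQL or SQL Server.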

Big Data Technologies: Understanding and experience with big data technologies such as Apache Hadoop, Apache Spark, or Apache Kafka is highly beneficial. Knowledge of distributed computing and parallel processing frameworks is valuable for handling large-scale data processing.
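The distributed-computing frameworks named above (Hadoop, Spark) are built on the map-reduce pattern: each worker processes one partition of the data, and partial results are merged. A single-process sketch of that pattern in plain Python, with invented sample data, shows the idea without any cluster:

```python
from collections import Counter
from functools import reduce

def map_chunk(lines):
    """Map step: count words in one partition of the input."""
    counts = Counter()
    for line in lines:
        counts.update(line.split())
    return counts

def reduce_counts(a, b):
    """Reduce step: merge partial counts from two partitions."""
    a.update(b)
    return a

# Partitions as a Spark/Hadoop job would see them, one per worker.
partitions = [
    ["etl etl pipeline", "spark pipeline"],
    ["spark spark kafka"],
]
partials = [map_chunk(p) for p in partitions]  # would run in parallel on a cluster
totals = reduce(reduce_counts, partials, Counter())
print(totals["spark"])  # 3
```

In Spark the map and reduce steps are distributed across executors, but the programming model is the same.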

ETL and Data Integration: Proficiency in Extract, Transform, Load (ETL) processes and experience with data integration tools like Apache NiFi, Talend, or Informatica is desirable. Knowledge of data transformation techniques and data quality principles is important for ensuring accurate and reliable data.
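The Extract, Transform, Load flow described above can be sketched in a few lines of Python. This is a minimal illustration with made-up data and an in-memory "warehouse", not a substitute for tools like NiFi or Talend:

```python
import csv
import io

# Extract: read raw rows from a source (a CSV string standing in for a file or API).
raw = "name,amount\n alice ,100\nbob,\ncarol,250\n"
rows = list(csv.DictReader(io.StringIO(raw)))

# Transform: trim whitespace, cast types, and drop rows that fail a quality rule.
def transform(row):
    name = row["name"].strip()
    amount = row["amount"].strip()
    if not amount:  # data-quality rule: reject rows with a missing amount
        return None
    return {"name": name, "amount": int(amount)}

clean = [t for r in rows if (t := transform(r)) is not None]

# Load: append to the target store (a list standing in for a warehouse table).
warehouse = []
warehouse.extend(clean)
print(len(warehouse))  # 2
```

Real pipelines add logging, retries, and idempotent loads, but the three stages keep this shape.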

Data Warehousing: Familiarity with data warehousing concepts and experience with popular data warehousing platforms like Amazon Redshift, Google BigQuery, or Snowflake is advantageous. Understanding dimensional modeling and experience in designing and optimizing data warehouses is beneficial.

Cloud Platforms: Knowledge of cloud computing platforms such as Amazon Web Services (AWS), Microsoft Azure, or Google Cloud Platform (GCP) is increasingly important. Experience in deploying data engineering solutions in the cloud and utilizing cloud-based data services is valuable.

Data Pipelines and Workflow Tools: Experience with data pipeline and workflow management tools such as Apache Airflow, Luigi, or Apache Oozie is beneficial. Understanding how to design, schedule, and monitor data workflows, typically expressed as directed acyclic graphs (DAGs) of tasks, is essential for efficient data processing.
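Workflow tools like Airflow model a job as a DAG of tasks and run each task only after its upstream dependencies finish. The scheduling idea can be sketched with Python's standard-library graphlib (this is not Airflow's actual API, and the task names are hypothetical):

```python
from graphlib import TopologicalSorter

# Task -> set of upstream tasks that must finish first (hypothetical pipeline).
dag = {
    "extract":   set(),
    "transform": {"extract"},
    "validate":  {"transform"},
    "load":      {"transform", "validate"},
}

# static_order() yields tasks in an order that respects every dependency.
order = list(TopologicalSorter(dag).static_order())
print(order)  # ['extract', 'transform', 'validate', 'load']
```

Airflow layers scheduling, retries, and monitoring on top of exactly this dependency resolution.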

Problem-Solving and Analytical Skills: Data engineers should have strong problem-solving abilities and analytical thinking to identify data-related issues, troubleshoot problems, and optimize data processing workflows.

Communication and Collaboration: Effective communication and collaboration skills are crucial for working with cross-functional teams, including data scientists, analysts, and business stakeholders. Data engineers should be able to translate technical concepts into clear and understandable terms.

Education and Experience

Bachelor's degree in Engineering required.

  • Minimum of two years of related experience is highly preferred.
  • Two GCP certifications (within 3 months after joining).

Status: Full-Time

Duration: -

The Devoteam Group is committed to equal opportunities, promoting its employees on the basis of merit and actively fighting against all forms of discrimination. We believe that diversity contributes to the creativity, dynamism, and excellence of our organization. All our positions are open to people with disabilities.

Job Requirements
  • Minimum 3 years of experience as a Data Engineer
  • Experience with SQL queries, SQL functions, and SQL stored procedures is mandatory
  • Experience in Java API development is an advantage
  • Experience in SSRS reporting
  • Power BI analytics, Google BigQuery, ETL, and data pipeline experience is mandatory
  • Knowledge of and experience in data science with Python is an advantage
Job Description – PT Intikom Berlian Mustika

Scope of Work – Data Engineer

  • Explore the DI/DX database (data pipelines, ETL, data integration).
  • Explore and, where needed, develop Datamarts to support DI/DX-related business needs.
  • Handle ad-hoc requests for DI/DX-related queries, data extraction, and data preparation.
  • Validate extracted data to ensure DI/DX-related quality and reliability.
  • Ensure data consistency and accessibility for DXO stakeholders in relation to DI/DX.

If you are a good fit or can refer a suitable candidate, please contact us via WhatsApp:

#SoftwareDeveloper #Hiring #Career #ITJobs #Java #SQL

Job Type: Contract
Contract length: 12 months

Application Questions:

  • How old are you currently?
  • Do you have work experience building and managing data pipelines, ETL, and data integration?
  • Are you proficient in SQL and the tools/technologies related to Data Warehouses and Datamarts?

Education:

  • Bachelor's degree (S1) (preferred)

Experience:

  • Data Engineer: 4 years (preferred)
Job Description – PT Astra Graphia Information Technology (AGIT)
  • Build and optimize robust data pipelines for extracting, transforming, and loading (ETL) data from multiple sources into a central data warehouse or data lake.
  • Integrate data from multiple heterogeneous sources, ensuring data quality, consistency, and availability.
  • Monitor the performance of data systems, identify bottlenecks, and resolve issues related to data quality or processing failures.
Company Description – PT Mandiri Sekuritas

PT Mandiri Sekuritas (Mandiri Sekuritas/Company) has been awarded as Indonesia's Best Investment Bank and Best Broker by the FinanceAsia Country Awards 2022. These recognitions have established the Company's strong position as the Best Investment Bank in Indonesia for 12 consecutive years and Best Broker for 8 consecutive years. Established in 2000, Mandiri Sekuritas provides customers with comprehensive and value‑added capital market financial solutions. The Company obtained its business license as a securities broker and underwriter from Bapepam‑LK, demonstrating its commitment to excellence in financial services.

Role Description

This is a contract, on‑site role for a Data Engineer located in Jakarta. The Data Engineer will be responsible for designing, developing, and managing data pipelines and infrastructure. Daily tasks include data acquisition, processing, and storage solutions; implementing data workflows; optimizing performance of data‑centric systems; and ensuring data quality and consistency. Additionally, the Data Engineer will collaborate with other teams to integrate and utilize data effectively.

Qualifications
  • 3–5 years of experience building large‑scale data pipelines in cloud or hybrid environments
  • Strong in SQL, Python, and Java; skilled with Bash scripting for automation
  • Hands‑on expertise with GCP, Azure, relational & non‑relational databases, and Hadoop/on‑prem systems
  • Production experience with Airflow DAGs, Spark, and Flink
  • Experienced with CI/CD & containerization (Git, Terraform, Helm, Docker, Kubernetes)
  • Solid grasp of distributed systems (partitioning, replication, fault tolerance)
  • Familiar with financial services data, regulations, and security frameworks
  • Excellent communicator—able to explain complex pipelines to non‑technical stakeholders
  • Motivated to stay current with emerging technologies and continuously enhance technical capabilities
  • Familiarity with automation and DevOps practices is a plus
Job Description – Cube Asia

As a Data Engineer at Cube Asia, you will use various methods to transform raw data into useful data systems. You'll strive for efficiency by aligning data systems with business goals.

To succeed in this position, you should have prior experience in large-scale public data collection from the web using open APIs and other tools, along with a good understanding of the terms, guidelines, and technical considerations governing such data collection.

Data engineer skills also include familiarity with several programming languages and a basic knowledge of machine learning methods. If you are detail‑oriented, with excellent organizational skills and experience in this field, we’d like to hear from you.

Responsibilities
  • Build and maintain scalable data pipelines to process and integrate e‑commerce data from multiple sources
  • Design and implement a modern cloud‑based data lakehouse architecture using AWS services such as S3, Athena, Glue, Fargate, and Iceberg
  • Explore tools and solutions for high‑performance data transformation and analysis, such as Polars, DuckDB, and PySpark
  • Work with data analysts to deliver accessible and well‑structured datasets for reporting and advanced analytics
  • Collaborate with architects on designing data platforms
  • Explore ways to enhance data quality and reliability
What You’ll Love About This Role
  • Build from Scratch: Be part of a team creating foundational data systems and processes, shaping the future of our platform
  • Learn by Doing: Gain hands‑on experience working with modern tools, cloud technologies, and real‑world data challenges
  • Work on Complex Projects: Tackle exciting and challenging problems, from integrating large‑scale e-commerce data to optimizing data pipelines for performance and scalability
Requirements
  • Knowledge of programming languages (e.g. Java and Python)
  • Hands‑on experience with SQL database design
  • Previous experience as a data engineer or in a similar role
  • Technical expertise with data models, data mining, and segmentation techniques
  • Great numerical and analytical skills
  • Willingness to learn and work with new tools and technologies
Job Description – Accord Innovations

Hi #TalentReady, our client is looking for a Data Engineer (ETL) for their project.

Full WFO at Jakarta Area | 12 Month Contract (PKWT) | Banking Industry

Responsibilities
  • Design, develop, deploy, and maintain robust ETL workflows using SSIS to support data integration from multiple sources into the data warehouse.
  • Build and maintain operational and analytical reports using SSRS, delivering insights to business users.
  • Collaborate with business analysts, data architects, and stakeholders to understand data requirements and translate them into technical specifications.
  • Optimize ETL packages for performance, scalability, and error handling.
  • Perform data profiling, validation, and reconciliation to ensure high data quality and integrity.
  • Maintain and improve existing SSIS/SSRS solutions.
  • Document ETL designs, data mappings, and workflow processes.
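The data reconciliation mentioned above typically compares control totals (row counts, column sums) between source and target after a load. A minimal sketch with invented data, not the client's actual SSIS setup:

```python
def control_totals(rows):
    """Row count and amount sum, used as reconciliation checks after a load."""
    return {"rows": len(rows), "amount": sum(r["amount"] for r in rows)}

# Source system rows and the rows that landed in the warehouse (sample data).
source = [{"id": 1, "amount": 100}, {"id": 2, "amount": 250}]
target = [{"id": 1, "amount": 100}, {"id": 2, "amount": 250}]

src, tgt = control_totals(source), control_totals(target)
reconciled = src == tgt  # a mismatch would flag dropped or corrupted rows
print(reconciled)  # True
```

In SSIS the same checks are usually implemented as row-count transformations and audit tables, but the principle is identical.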

Notes: Only shortlisted candidates will be contacted.

Job Description – Digital Finance Platform

On behalf of a fast-growing digital finance platform, we are currently seeking a skilled and motivated Data Engineer to support the development of scalable data infrastructure and analytical capabilities. Based in Jakarta, this role will be instrumental in enabling data-driven decision-making across multiple product lines and markets in Southeast Asia.

You will collaborate closely with cross‑functional teams to design and implement efficient ETL pipelines, develop robust data models, and optimize data processing workflows to ensure high performance and cost‑efficiency in a high‑volume environment.

Key Responsibilities
  • Develop and maintain scalable data infrastructure, databases, and pipelines to support reliable and efficient data operations.
  • Build and manage data ingestion and transformation workflows to enable seamless data integration across multiple platforms.
  • Apply best practices to ensure the stability, availability, and performance of data systems.
  • Partner with engineering, data science, and product teams to enhance data accessibility and usability across the organization.
  • Design and sustain large‑scale, efficient data pipelines to process complex datasets.
  • Translate user needs into well‑crafted tools and platform capabilities that address real‑world data challenges.
Qualifications
  • 1–2 years of hands‑on experience in data engineering or backend development with a strong focus on data systems.
  • Solid coding skills in Python, Java, or Scala.
  • Familiar with source control tools (e.g., Git) and build/dependency management tools like Maven.
  • Knowledge of container tools (Docker) and orchestration frameworks (Kubernetes).
  • Practical experience with real‑time and batch data technologies, such as Spark, Kafka, Flink, Flume, or Airflow.
  • Comfortable working with both relational (e.g., MySQL) and NoSQL databases (e.g., MongoDB).
  • Prior involvement with cloud‑based data platforms and large‑scale data solutions.
  • Strong analytical mindset, with the ability to juggle multiple tasks and projects.
  • A proactive communicator and team player who thrives in a collaborative environment.
  • Motivated to stay current with emerging technologies and continuously enhance technical capabilities.
  • Familiarity with automation and DevOps practices is a plus.