Data Engineer

Pt Karisma Zona Kreatifku

Daerah Khusus Ibukota Jakarta

On-site

IDR 200.000.000 - 300.000.000

Full time

2 days ago

Job summary

A leading tech consulting firm in Jakarta is seeking an experienced Data Engineer. Responsibilities include designing and maintaining data pipelines, optimizing ETL processes, and collaborating with teams to ensure data quality. Ideal candidates should hold a degree in Engineering, have significant experience in SQL and Python, and be familiar with cloud platforms. Competitive salary offered based on skills and experience.

Benefits

Health insurance
Professional development opportunities
Flexible working hours

Qualifications

  • Minimum of 3-5 years of experience in data engineering or a relevant field.
  • Experience with SQL functions and procedures is mandatory.
  • Strong knowledge of data modeling and schema design.

Responsibilities

  • Design and maintain data pipelines for data acquisition and processing.
  • Collaborate with teams to determine data requirements and optimize data flow.
  • Implement ETL processes to ensure high data quality and availability.
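The pipeline and ETL responsibilities above can be sketched in a few lines of Python. This is an illustrative sketch only; the file, table, and column names are invented and not part of the role:

```python
import csv
import sqlite3

def extract(csv_path):
    """Extract: read raw rows from a source CSV file."""
    with open(csv_path, newline="") as f:
        return list(csv.DictReader(f))

def transform(rows):
    """Transform: normalize types and drop rows that fail a quality check."""
    cleaned = []
    for row in rows:
        if not row.get("order_id"):          # basic data-quality rule
            continue
        cleaned.append({
            "order_id": int(row["order_id"]),
            "amount": round(float(row["amount"]), 2),
        })
    return cleaned

def load(rows, conn):
    """Load: insert the cleaned rows into the warehouse table."""
    conn.execute("CREATE TABLE IF NOT EXISTS orders (order_id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (:order_id, :amount)", rows)
    conn.commit()
```

Wiring extract, transform, and load together in that order is the essence of a batch pipeline; in production a scheduler runs each step, retries on failure, and alerts on data-quality violations.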

Skills

Proficient in SQL
Python programming
ETL processes
Data warehousing solutions
Familiar with Spark
Tableau development
Scala programming

Education

Bachelor's degree in Engineering or related field

Tools

Data pipeline tools (e.g., Apache NiFi)
Cloud platforms (e.g., AWS, GCP)
Database management systems
Apache Hadoop

Job description

Explore exciting Data Engineer job opportunities in Jakarta. Data engineers are in high demand, tasked with designing, building, and maintaining data pipelines and systems. These professionals play a crucial role in transforming raw data into usable information for analysis and decision‑making, supporting business intelligence and data science initiatives.

Jakarta’s job market offers diverse roles for data engineers, from entry‑level positions to senior leadership roles. These positions involve working with big data technologies, cloud platforms, and various programming languages. Companies are seeking skilled individuals who can optimize data flow and ensure data quality, contributing to data‑driven strategies.

Whether you are an experienced data engineer or a recent graduate, Jakarta presents a landscape of opportunities to advance your career. Key skills include proficiency in SQL, Python, ETL processes, and data warehousing solutions. Start your search today to find the perfect data engineer role that matches your skills and career aspirations in Jakarta.

Data engineers design and build data pipelines, transform raw data into usable formats, and maintain data systems. They work on data integration, data quality, and optimizing data flow. They collaborate with data scientists and business analysts to support data‑driven decision‑making.

Several prominent companies in Jakarta frequently hire data engineers. Some of these include Gojek, Tokopedia, and Traveloka. These companies are known for their data‑intensive operations and offer opportunities for professional growth.

The average salary for a data engineer in Jakarta ranges from IDR 12,000,000 to IDR 25,000,000 per month. This range depends on factors such as experience, skills, and company size. Senior data engineers with specialized expertise can earn even higher salaries.


Job Responsibilities
  • Participate in the construction and development of data warehouses for financial projects, design models according to business requirements, and implement ETL.
  • Provide data report support for business departments.
  • Participate in optimizing ETL to improve code efficiency and reduce costs.
Job Qualifications
  • Proficient in Spark, able to develop in Python and Scala, with good experience in performance tuning.
  • Familiar with various data warehouse modeling theories, proficient in data model design and data layer design.
  • Priority will be given to those with Tableau development experience.
  • Priority will be given to those with a background in finance and e-commerce projects.
  • Diligent, responsible, meticulous, with good team spirit, analytical skills, and communication skills.
  • Spark.
  • Python.
  • Scala.
Job Description

Devoteam is a leading consulting firm focused on digital strategy, tech platforms, and cybersecurity.

By combining creativity, tech, and data insights, we empower our customers to transform their business and unlock the future.

With 25 years of experience and 8,000 employees across Europe and the Middle East, Devoteam promotes responsible tech for people and works to create better change.

#Creative Tech for Better Change

In January 2021, Devoteam launched its new strategic plan, Infinite 2024, with the ambition of becoming the #1 EMEA partner of the leading Cloud‑based platform companies (AWS, Google Cloud, Microsoft, Salesforce, ServiceNow), further reinforced by deep expertise in digital strategy, cybersecurity, and data.

The Data Engineer will be responsible for the following activities:

  • Work closely with data architects and other stakeholders to design scalable and robust data architectures that meet the organization's requirements.
  • Develop and maintain data pipelines, which involve the extraction of data from various sources, data transformation to ensure quality and consistency, and loading the processed data into data warehouses or other storage systems.
  • Responsible for managing data warehouses and data lakes, ensuring their performance, scalability, and security.
  • Integrate data from different sources, such as databases, APIs, and external systems, to create unified and comprehensive datasets.
  • Perform data transformations and implement Extract, Transform, Load (ETL) processes to convert raw data into formats suitable for analysis and reporting.
  • Collaborate with data scientists, analysts, and other stakeholders to establish data quality standards and implement data governance practices.
  • Optimise data processing and storage systems for performance and scalability.

Collaborate with cross‑functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements and deliver solutions.

Programming Skills: Proficiency in programming languages such as Python, Java, Scala, or SQL is essential for data engineering roles. Data engineers should have experience in writing efficient and optimized code for data processing, transformation, and integration.

Database Knowledge: Strong knowledge of relational databases (e.g., SQL) and experience with database management systems (DBMS) is crucial. Familiarity with data modeling, schema design, and query optimization is important for building efficient data storage and retrieval systems.
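The query-optimization point above can be made concrete: an index on a filtered column changes the access path from a full scan to an index search. A small sketch using SQLite for illustration (the table and column names are made up):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, ts TEXT, payload TEXT)")

# Without an index, a filter on user_id can only do a full table scan.
plan_before = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM events WHERE user_id = ?", (42,)
).fetchall()

# An index on the filtered column lets the optimizer switch to an index search.
conn.execute("CREATE INDEX idx_events_user ON events(user_id)")
plan_after = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM events WHERE user_id = ?", (42,)
).fetchall()
```

Reading the query plan before and after a schema change is a routine habit in this kind of role: the "before" plan reports a scan, the "after" plan a search using the new index.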

Big Data Technologies: Understanding and experience with big data technologies such as Apache Hadoop, Apache Spark, or Apache Kafka is highly beneficial. Knowledge of distributed computing and parallel processing frameworks is valuable for handling large‑scale data processing.

ETL and Data Integration: Proficiency in Extract, Transform, Load (ETL) processes and experience with data integration tools like Apache NiFi, Talend, or Informatica is desirable. Knowledge of data transformation techniques and data quality principles is important for ensuring accurate and reliable data.

Data Warehousing: Familiarity with data warehousing concepts and experience with popular data warehousing platforms like Amazon Redshift, Google BigQuery, or Snowflake is advantageous. Understanding dimensional modeling and experience in designing and optimizing data warehouses is beneficial.
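Dimensional modeling, mentioned above, typically separates numeric measures (facts) from descriptive attributes (dimensions). A tiny star-schema sketch, again using SQLite purely for illustration with invented names:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Dimension table: descriptive attributes, one row per product.
conn.execute(
    "CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT, category TEXT)"
)
# Fact table: measures keyed by foreign keys into the dimensions.
conn.execute("""CREATE TABLE fact_sales (
    sale_id INTEGER PRIMARY KEY,
    product_id INTEGER REFERENCES dim_product(product_id),
    quantity INTEGER,
    revenue REAL)""")

conn.executemany("INSERT INTO dim_product VALUES (?, ?, ?)",
                 [(1, "Widget", "Hardware"), (2, "Gadget", "Hardware")])
conn.executemany("INSERT INTO fact_sales VALUES (?, ?, ?, ?)",
                 [(1, 1, 3, 30.0), (2, 2, 1, 15.0), (3, 1, 2, 20.0)])

# A typical analytical query: join fact to dimension, then aggregate.
rows = conn.execute("""
    SELECT p.name, SUM(f.revenue)
    FROM fact_sales f JOIN dim_product p USING (product_id)
    GROUP BY p.name ORDER BY p.name""").fetchall()
```

Platforms like Redshift, BigQuery, and Snowflake scale this same shape to billions of fact rows; the modeling discipline is what stays constant.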

Cloud Platforms: Knowledge of cloud computing platforms such as Amazon Web Services (AWS), Microsoft Azure, or Google Cloud Platform (GCP) is increasingly important. Experience in deploying data engineering solutions in the cloud and utilizing cloud‑based data services is valuable.

Data Pipelines and Workflow Tools: Experience with data pipeline and workflow management tools such as Apache Airflow, Luigi, or Apache Oozie is beneficial. Understanding how to design, schedule, and monitor data workflows is essential for efficient data processing.
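Workflow tools like Airflow model a pipeline as a directed acyclic graph of tasks and run each task only after its upstream dependencies succeed. The ordering idea can be sketched with the Python standard library alone; the task names are hypothetical and this is not the Airflow API:

```python
from graphlib import TopologicalSorter

# Each task maps to the set of tasks it depends on,
# the same dependency structure an Airflow DAG declares.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "quality_check": {"transform"},
    "load": {"quality_check"},
    "report": {"load"},
}

# A topological sort yields a valid execution order for the tasks.
order = list(TopologicalSorter(dag).static_order())
```

A real scheduler adds what this sketch omits: calendar-based triggering, retries, backfills, and monitoring of each task run.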

Problem‑Solving and Analytical Skills: Data engineers should have strong problem‑solving abilities and analytical thinking to identify data‑related issues, troubleshoot problems, and optimize data processing workflows.

Communication and Collaboration: Effective communication and collaboration skills are crucial for working with cross‑functional teams, including data scientists, analysts, and business stakeholders. Data engineers should be able to translate technical concepts into clear and understandable terms.

Education and Experience

  • Bachelor's degree in Engineering required.
  • A minimum of two years of related experience is highly preferred.
  • Two GCP certifications (to be obtained within 3 months of joining).
Job Description

  • Minimum 3 years of experience as a Data Engineer.
  • Experience writing SQL queries, SQL functions, and SQL procedures is mandatory.
  • Experience in Java API development is an advantage.
  • Experience with SSRS reporting, Power BI analytics, Google BigQuery, ETL, and data pipelines is mandatory.
  • Knowledge and experience in data science with Python is an advantage.

Job Description

Scope of Work – Data Engineer

  • Explore the DI/DX database (data pipelines, ETL, data integration).
  • Explore and, where needed, develop Datamarts to support DI/DX-related business needs.
  • Handle ad‑hoc requests for DI/DX-related queries, data extraction, and data preparation.
  • Validate extracted data to ensure DI/DX-related quality and reliability.
  • Ensure data consistency and accessibility for DXO stakeholders in relation to DI/DX.

Job type: Contract
Contract length: 12 months

Application questions:

  • How old are you?
  • Do you have work experience building and managing data pipelines, ETL, and data integration?
  • Are you proficient in SQL and in tools/technologies related to Data Warehouses and Datamarts?

Education:

  • Bachelor's degree (S1) (preferred)

Experience:

  • Data Engineer: 4 years (preferred)
Job Description

Requirements:

  • Hold a bachelor's degree (S-1) in Information Technology, Computer Engineering, Telecommunications, or Statistics
  • Excellent written & verbal communication skills
  • Min. 3-5 years of experience with ETL tools
  • Deep understanding of SQL, SQL optimization, data pipelines, and job optimization
  • Knowledge of shell scripting and VB script is a plus
  • Good and effective communication and leadership skills
  • Proactive, self-motivated, and eager to learn new technologies
  • Good verbal and written English communication skills
  • Good interpersonal skills; able to relate to personnel from different units of the company
  • Excellent numerical, analytical, and problem-solving skills
  • Ability to perform under pressure and tight time constraints; able to work with limited guidance in line with a broad plan or strategy
Job Description
  • Build and optimize robust data pipelines for extracting, transforming, and loading (ETL) data from multiple sources into a central data warehouse or data lake.
  • Integrate data from multiple heterogeneous sources, ensuring data quality, consistency, and availability.
  • Monitor the performance of data systems, identify bottlenecks, and resolve issues related to data quality or processing failures.
Job Description

Company Description

PT Mandiri Sekuritas (Mandiri Sekuritas/Company) has been awarded as Indonesia's Best Investment Bank and Best Broker by the FinanceAsia Country Awards 2022. These recognitions have established the Company's strong position as the Best Investment Bank in Indonesia for 12 consecutive years and Best Broker for 8 consecutive years. Established in 2000, Mandiri Sekuritas provides customers with comprehensive and value‑added capital market financial solutions. The Company obtained its business license as a securities broker and underwriter from Bapepam‑LK, demonstrating its commitment to excellence in financial services.

Role Description

This is a contract, on‑site role for a Data Engineer located in Jakarta. The Data Engineer will be responsible for designing, developing, and managing data pipelines and infrastructure. Daily tasks include data acquisition, processing, and storage solutions; implementing data workflows; optimizing performance of data‑centric systems; and ensuring data quality and consistency. Additionally, the Data Engineer will collaborate with other teams to integrate and utilize data effectively.

Qualifications

  • 3–5 years of experience building large‑scale data pipelines in cloud or hybrid environments
  • Strong in SQL, Python, and Java; skilled with Bash scripting for automation
  • Hands‑on expertise with GCP, Azure, relational & non‑relational databases, and Hadoop/on‑prem systems
  • Production experience with Airflow DAGs, Spark, and Flink
  • Experienced with CI/CD & containerization (Git, Terraform, Helm, Docker, Kubernetes)
  • Solid grasp of distributed systems (partitioning, replication, fault tolerance)
  • Familiar with financial services data, regulations, and security frameworks
  • Excellent communicator—able to explain complex pipelines to non‑technical stakeholders
Job Description

As a Data Engineer at Cube Asia, you will use various methods to transform raw data into useful data systems. You'll strive for efficiency by aligning data systems with business goals.

To succeed in this position, you should have prior experience in large‑scale public data collection from the web using open APIs and other tools, along with a good understanding of the terms, guidelines, and technical considerations governing such data collection.

Data engineer skills also include familiarity with several programming languages and a basic knowledge of machine learning methods. If you are detail‑oriented, with excellent organizational skills and experience in this field, we'd like to hear from you.

Responsibilities
  • Build and maintain scalable data pipelines to process and integrate e‑commerce data from multiple sources
  • Design and implement a modern cloud‑based data lakehouse architecture using AWS services such as S3, Athena, Glue, Fargate, and Iceberg
  • Explore tools and solutions for high‑performance data transformation and analysis, such as Polars, DuckDB, and PySpark
  • Work with data analysts to deliver accessible and well‑structured datasets for reporting and advanced analytics
  • Collaborate with architects on designing data platforms
  • Explore ways to enhance data quality and reliability
What You'll Love About This Role
  • Build from Scratch: Be part of a team creating foundational data systems and processes, shaping the future of our platform
  • Learn by Doing: Gain hands‑on experience working with modern tools, cloud technologies, and real‑world data challenges
  • Work on Complex Projects: Tackle exciting and challenging problems, from integrating large‑scale e‑commerce data to optimizing data pipelines for performance and scalability
Requirements
  • Knowledge of programming languages (e.g. Java and Python)
  • Hands‑on experience with SQL database design
  • Previous experience as a data engineer or in a similar role
  • Technical expertise with data models, data mining, and segmentation techniques
  • Great numerical and analytical skills
  • Willingness to learn and work with new tools and technologies
Job Description

Hi #TalentReady, our client is looking for a Data Engineer (ETL) for their project.

Full WFO at Jakarta Area | 12 Month Contract (PKWT) | Banking Industry

Responsibilities
  • Design, develop, deploy, and maintain robust ETL workflows using SSIS to support data integration from multiple sources into the data warehouse.
  • Build and maintain operational and analytical reports using SSRS, delivering insights to business users.
  • Collaborate with business analysts, data architects, and stakeholders to understand data requirements and translate them into technical specifications.
  • Optimize ETL packages for performance, scalability, and error handling.
  • Perform data profiling, validation, and reconciliation to ensure high data quality and integrity.
  • Maintain and improve existing SSIS/SSRS solutions.
  • Document ETL designs, data mappings, and workflow processes.
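The profiling, validation, and reconciliation duties above boil down to confirming that what left the source is what landed in the target. A minimal row-count and per-row checksum reconciliation sketch; the key and column names are invented for illustration:

```python
import hashlib

def reconcile(source_rows, target_rows, key="id"):
    """Compare row counts and per-row checksums between source and target."""
    issues = []
    if len(source_rows) != len(target_rows):
        issues.append(f"row count mismatch: {len(source_rows)} vs {len(target_rows)}")

    def digests(rows):
        # Checksum each row's sorted field/value pairs, keyed by its id.
        return {r[key]: hashlib.md5(repr(sorted(r.items())).encode()).hexdigest()
                for r in rows}

    src, tgt = digests(source_rows), digests(target_rows)
    for k in src.keys() & tgt.keys():
        if src[k] != tgt[k]:
            issues.append(f"row {k} differs between source and target")
    return issues
```

In an SSIS context the same checks would typically run as post-load SQL comparing staging and warehouse tables, but the reconciliation logic is identical.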