
Data Engineer

Briohr

Indonesia

On-site

IDR 200,000,000 - 300,000,000

Full time

Posted today
What jobs are available for Data Engineers in Indonesia?

Showing 224 Data Engineer jobs in Indonesia

Kredit Pintar

Jakarta, Jakarta | IDR 9,000,000 - 12,000,000

Posted today

Job Description

Job Responsibilities

  • Participate in the construction and development of data warehouses for financial projects, design models according to business requirements, and implement ETL.
  • Provide data report support for business departments.
  • Participate in optimizing ETL to improve code efficiency and reduce costs.
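The responsibilities above center on ETL for a financial data warehouse. As a rough illustration only (record fields, table name, and the normalization rules are hypothetical), an extract-transform-load step can be sketched in plain Python with the built-in sqlite3 module:

```python
import sqlite3

# Hypothetical source records, standing in for an extract from an upstream system.
raw_rows = [
    {"loan_id": "A1", "amount_idr": "9.000.000", "status": " APPROVED "},
    {"loan_id": "A2", "amount_idr": "12.000.000", "status": "rejected"},
]

def transform(row):
    # Normalize Indonesian-formatted amounts and inconsistent status strings.
    return (
        row["loan_id"],
        int(row["amount_idr"].replace(".", "")),
        row["status"].strip().lower(),
    )

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE loans (loan_id TEXT, amount_idr INTEGER, status TEXT)")
conn.executemany("INSERT INTO loans VALUES (?, ?, ?)", [transform(r) for r in raw_rows])
conn.commit()

print(conn.execute("SELECT loan_id, amount_idr, status FROM loans").fetchall())
# [('A1', 9000000, 'approved'), ('A2', 12000000, 'rejected')]
```

In a production pipeline the same extract/transform/load split would typically run in Spark rather than single-process Python, but the shape of the work is the same.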

Job Qualifications

  • Proficient in Spark, able to develop in Python and Scala, with good experience in performance tuning;
  • Familiar with various data warehouse modeling theories, proficient in data model design and data layer design;
  • Priority will be given to those with Tableau development experience;
  • Priority will be given to those with a background in finance and e-commerce projects;
  • Diligent, responsible, meticulous, with good team spirit, analytical skills, and communication skills.
  • Spark
  • Python
  • Scala
Accenture Southeast Asia

IDR 8,000,000 - 12,000,000

Posted today

Job Description

Develop innovative data solutions that meet the evolving needs of the organization. Collaborate with cross-functional teams to identify and implement best practices in data management.

Monitor and optimize data pipelines for performance and reliability. Conduct regular data quality assessments and implement improvements as necessary.

Stay updated with the latest trends and technologies in data engineering to enhance team capabilities.

Bachelor's degree in a relevant field of study.

Accenture

IDR 4,000,000 - 8,000,000

Posted today

Job Description

Develop innovative data solutions that meet the evolving needs of the organization. Collaborate with cross-functional teams to identify and implement best practices in data management.

Monitor and optimize data pipelines for performance and reliability. Conduct regular data quality assessments and implement improvements as necessary.

Stay updated with the latest trends and technologies in data engineering to enhance team capabilities.

Bachelor's degree in a relevant field of study.

Devoteam

Devoteam is a leading consulting firm focused on digital strategy, tech platforms, and cybersecurity.

By combining creativity, tech, and data insights, we empower our customers to transform their business and unlock the future.

With 25 years of experience and 8,000 employees across Europe and the Middle East, Devoteam promotes responsible tech for people and works to create better change.

#Creative Tech for Better Change

In January 2021, Devoteam launched its new strategic plan, Infinite 2024, with the ambition to become the #1 EMEA partner of the leading cloud-platform companies (AWS, Google Cloud, Microsoft, Salesforce, ServiceNow), further reinforced by deep expertise in digital strategy, cybersecurity, and data.

The Data Engineer will be responsible for the following activities:

  • Work closely with data architects and other stakeholders to design scalable and robust data architectures that meet the organization's requirements
  • Develop and maintain data pipelines, which involve the extraction of data from various sources, data transformation to ensure quality and consistency, and loading the processed data into data warehouses or other storage systems
  • Responsible for managing data warehouses and data lakes, ensuring their performance, scalability, and security
  • Integrate data from different sources, such as databases, APIs, and external systems, to create unified and comprehensive datasets.
  • Perform data transformations and implement Extract, Transform, Load (ETL) processes to convert raw data into formats suitable for analysis and reporting
  • Collaborate with data scientists, analysts, and other stakeholders to establish data quality standards and implement data governance practices
  • Optimise data processing and storage systems for performance and scalability

Collaborate with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements and deliver solutions.

Programming Skills: Proficiency in programming languages such as Python, Java, Scala, or SQL is essential for data engineering roles. Data engineers should have experience in writing efficient and optimized code for data processing, transformation, and integration.

Database Knowledge: Strong knowledge of relational databases (e.g., SQL) and experience with database management systems (DBMS) is crucial. Familiarity with data modeling, schema design, and query optimization is important for building efficient data storage and retrieval systems.

Big Data Technologies: Understanding and experience with big data technologies such as Apache Hadoop, Apache Spark, or Apache Kafka is highly beneficial. Knowledge of distributed computing and parallel processing frameworks is valuable for handling large-scale data processing.

ETL and Data Integration: Proficiency in Extract, Transform, Load (ETL) processes and experience with data integration tools like Apache NiFi, Talend, or Informatica is desirable. Knowledge of data transformation techniques and data quality principles is important for ensuring accurate and reliable data.
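The data quality principles mentioned above often reduce to simple assertions checked against every batch before it is loaded. A minimal sketch (the field names `id` and `amount` and the specific rules are hypothetical, not part of any listed tool):

```python
def check_batch(rows):
    """Return a list of data-quality violations for a batch of records."""
    problems = []
    seen_ids = set()
    for i, row in enumerate(rows):
        # Completeness: every record needs an id.
        if row.get("id") is None:
            problems.append(f"row {i}: missing id")
        # Uniqueness: ids must not repeat within the batch.
        elif row["id"] in seen_ids:
            problems.append(f"row {i}: duplicate id {row['id']}")
        else:
            seen_ids.add(row["id"])
        # Validity: amounts must be non-negative numbers.
        if not isinstance(row.get("amount"), (int, float)) or row["amount"] < 0:
            problems.append(f"row {i}: bad amount {row.get('amount')!r}")
    return problems

batch = [{"id": 1, "amount": 10.0}, {"id": 1, "amount": -5}, {"id": None, "amount": 3}]
print(check_batch(batch))
# ['row 1: duplicate id 1', 'row 1: bad amount -5', 'row 2: missing id']
```

Tools such as Talend or Informatica package checks like these into reusable components; the underlying idea is the same.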

Data Warehousing: Familiarity with data warehousing concepts and experience with popular data warehousing platforms like Amazon Redshift, Google BigQuery, or Snowflake is advantageous. Understanding dimensional modeling and experience in designing and optimizing data warehouses is beneficial.

Cloud Platforms: Knowledge of cloud computing platforms such as Amazon Web Services (AWS), Microsoft Azure, or Google Cloud Platform (GCP) is increasingly important. Experience in deploying data engineering solutions in the cloud and utilizing cloud-based data services is valuable.

Data Pipelines and Workflow Tools: Experience with data pipeline and workflow management tools such as Apache Airflow, Luigi, or Apache Oozie is beneficial. Understanding how to design, schedule, and monitor data workflows is essential for efficient data processing.
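The workflow tools named above all share one core idea: a pipeline is a directed acyclic graph (DAG) of tasks executed in dependency order. As a toy illustration (task names are made up, and this is not Airflow's actual API), the scheduling order can be computed with the standard library:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: each task maps to the set of tasks it depends on.
dag = {"transform": {"extract"}, "load": {"transform"}, "report": {"load"}}

# static_order() yields tasks so that every dependency runs first.
order = list(TopologicalSorter(dag).static_order())
print(order)  # ['extract', 'transform', 'load', 'report']
```

In Airflow the same dependency graph would be declared with operators and `>>` chaining, and the scheduler handles retries, backfills, and monitoring on top of this ordering.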

Problem-Solving and Analytical Skills: Data engineers should have strong problem-solving abilities and analytical thinking to identify data-related issues, troubleshoot problems, and optimize data processing workflows.

Communication and Collaboration: Effective communication and collaboration skills are crucial for working with cross-functional teams, including data scientists, analysts, and business stakeholders. Data engineers should be able to translate technical concepts into clear and understandable terms.

Education and Experience

Bachelor's degree in Engineering required.

  • A minimum of two years of related experience is highly preferred.
  • Two GCP certifications (to be obtained within 3 months of joining).

Status: Full-Time

Duration: -

The Devoteam Group is committed to equal opportunities, promoting its employees on the basis of merit and actively fighting against all forms of discrimination. We believe that diversity contributes to the creativity, dynamism, and excellence of our organization. All our positions are open to people with disabilities.

Entrepreneur Trust Digital

IDR 120,000 - 160,000

Posted today

Job Description

PT Entrepreneur Trust Digital provides clients with professional teams adept at the latest technologies. Our IT outsourcing services are an efficient way to deliver IT solutions that boost business performance. We are looking for talented individuals for the Data Engineer position.

Requirements:

  • Minimum 1-3 years of work experience as a Data Engineer/ETL Developer
  • Minimum Associate's/Bachelor's degree (D3/S1)
  • Experience with SSIS and SQL Server
  • Experience with joins (inner, left, right, cross), DELETE, TRUNCATE, DROP TABLE, and execution plans
  • Familiarity with programming languages: C#, .NET, Java
  • Able to work on-site at a banking client in Bintaro, Tangerang
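The join semantics listed in the requirements above are easy to try out; here is a quick inner-vs-left-join illustration using Python's built-in sqlite3 (the tables and rows are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customers (id INTEGER, name TEXT);
CREATE TABLE orders (customer_id INTEGER, total INTEGER);
INSERT INTO customers VALUES (1, 'Ani'), (2, 'Budi');
INSERT INTO orders VALUES (1, 500);
""")

# INNER JOIN keeps only customers that have at least one matching order.
inner = conn.execute(
    "SELECT c.name, o.total FROM customers c "
    "JOIN orders o ON o.customer_id = c.id ORDER BY c.id"
).fetchall()

# LEFT JOIN keeps every customer; unmatched orders come back as NULL.
left = conn.execute(
    "SELECT c.name, o.total FROM customers c "
    "LEFT JOIN orders o ON o.customer_id = c.id ORDER BY c.id"
).fetchall()

print(inner)  # [('Ani', 500)]
print(left)   # [('Ani', 500), ('Budi', None)]
```

Note that SQLite has no TRUNCATE statement; on SQL Server, as mentioned in the requirements, you would write `TRUNCATE TABLE orders`.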
PT Intikom Berlian Mustika

Jakarta, Jakarta | IDR 8,000,000 - 12,000,000

Posted today

Job Description

Scope of Work – Data Engineer

  • Explore the DI/DX database (data pipelines, ETL, data integration).
  • Explore and, where needed, develop Datamarts to support DI/DX-related business needs.
  • Handle ad-hoc requests for queries, data extraction, and data preparation related to DI/DX.
  • Validate extracted data to ensure DI/DX-related quality and reliability.
  • Ensure data consistency and accessibility for DXO stakeholders in relation to DI/DX.

If this sounds like a fit, or you can refer a suitable candidate, please contact us via WhatsApp:


Job type: Contract
Contract length: 12 months

Application Questions:

  • What is your current age?
  • Do you have work experience building and managing data pipelines, ETL, and data integration?
  • Are you proficient in SQL and in tools/technologies related to Data Warehouses and Datamarts?

Education:

  • Bachelor's degree (S1) (preferred)

Experience:

  • Data Engineer: 4 years (preferred)
Developer Follow-Up Job

Requirements:

Hold a bachelor's degree (S1) in Information Technology, Computer Engineering, Telecommunications, or Statistics

Excellent written & verbal communication skills

Min. 3-5 years of experience with ETL tools

Deep understanding of SQL, SQL optimization, data pipelines, and job optimization

Knowledge of shell scripting and VBScript is a plus

Good and effective communication and leadership skills

Proactive, self-motivated learner, eager to pick up new technologies

Good verbal and written English communication skills

Good interpersonal skills to be able to relate with people or personnel from different units of the company

Excellent numerical, analytical and problem-solving skills

Ability to perform under pressure and tight time constraints; able to work under limited guidance in line with a broad plan or strategy

PT Astra Graphia Information Technology (AGIT)

Jakarta, Jakarta | IDR 8,000,000 - 12,000,000

Posted today

Job Description
  • Build and optimize robust data pipelines for extracting, transforming, and loading (ETL) data from multiple sources into a central data warehouse or data lake.
  • Integrate data from multiple heterogeneous sources, ensuring data quality, consistency, and availability.
  • Monitor the performance of data systems, identify bottlenecks, and resolve issues related to data quality or processing failures.
PT Mandiri Sekuritas (Mandiri Sekuritas/Company)

Company Description

PT Mandiri Sekuritas (Mandiri Sekuritas/Company) has been awarded as Indonesia's Best Investment Bank and Best Broker by the FinanceAsia Country Awards 2022. These recognitions have established the Company's strong position as the Best Investment Bank in Indonesia for 12 consecutive years and Best Broker for 8 consecutive years. Established in 2000, Mandiri Sekuritas provides customers with comprehensive and value-added capital market financial solutions. The Company obtained its business license as a securities broker and underwriter from Bapepam-LK, demonstrating its commitment to excellence in financial services.

Role Description

This is a contract, on-site role for a Data Engineer located in Jakarta. The Data Engineer will be responsible for designing, developing, and managing data pipelines and infrastructure. Daily tasks include data acquisition, processing, and storage solutions; implementing data workflows; optimizing performance of data-centric systems; and ensuring data quality and consistency. Additionally, the Data Engineer will collaborate with other teams to integrate and utilize data effectively.

Qualifications

  • 3–5 years of experience building large-scale data pipelines in cloud or hybrid environments
  • Strong in SQL, Python, and Java; skilled with Bash scripting for automation
  • Hands‑on expertise with GCP, Azure, relational & non‑relational databases, and Hadoop/on‑prem systems
  • Production experience with Airflow DAGs, Spark, and Flink
  • Experienced with CI/CD & containerization (Git, Terraform, Helm, Docker, Kubernetes)
  • Solid grasp of distributed systems (partitioning, replication, fault tolerance)
  • Familiar with financial services data, regulations, and security frameworks
  • Excellent communicator—able to explain complex pipelines to non‑technical stakeholders
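Of the distributed-systems concepts in the qualifications above, partitioning is the easiest to illustrate: records are routed to shards by a stable hash of their key. A toy sketch (the key names and partition count are arbitrary):

```python
import hashlib

NUM_PARTITIONS = 4

def partition_for(key: str) -> int:
    # Use a cryptographic digest so the same key always lands on the same
    # partition across processes and runs (unlike Python's salted hash()).
    digest = hashlib.sha256(key.encode()).digest()
    return int.from_bytes(digest[:8], "big") % NUM_PARTITIONS

keys = ["order-1", "order-2", "order-1"]
print([partition_for(k) for k in keys])
```

Systems like Kafka and Spark apply the same idea (with their own hash functions) to spread load while keeping all records for one key on one partition; replication and fault tolerance then build on top of this placement.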