
Data Engineer

Radya Labs

Daerah Khusus Ibukota Jakarta

On-site

IDR 200.000.000 - 300.000.000

Full time

2 days ago

Job summary

A leading technology firm in Indonesia is seeking a Data Engineer for various projects across financial, e-commerce, and consulting domains. The role involves designing and maintaining data warehouses and pipelines, optimizing ETL processes, and collaborating with cross-functional teams. Candidates should have at least 2 years of experience, proficiency in Python, Java, and Spark, and a Bachelor's degree in a related field.

Qualifications

  • Minimum 2 years of experience as a data engineer or related role.
  • Proficient in Spark, Python, Scala, Java, and SQL.
  • Hands-on experience with big data platforms such as Hadoop and cloud services.
  • Experience with ETL tools and data warehousing solutions.

Responsibilities

  • Design and maintain robust data warehouses and pipelines.
  • Build and optimize ETL processes for data projects.
  • Collaborate with teams to understand data requirements.
  • Implement data governance and quality standards.

Skills

Python
Java
Spark
SQL
Hadoop
Kafka
ETL tools
Data Visualization

Education

Bachelor’s degree in Engineering, Computer Science, IT, or Statistics

Tools

Airflow
Talend
Tableau
Power BI
AWS
Azure
GCP

Job description

Data Engineer – Multiple Opportunities in Indonesia

We have a portfolio of Data Engineer roles across Indonesia, covering financial, e‑commerce, and consulting domains. Positions offer competitive salaries, full‑time or contract employment, and opportunities to work on cutting‑edge data pipelines, ETL, and data warehousing solutions.

Key Responsibilities
  • Design, develop, and maintain robust data warehouses and data pipelines for financial and e‑commerce projects.
  • Build and optimize extract, transform, load (ETL) processes, ensuring performance, reliability, and cost efficiency (an illustrative orchestration sketch follows this list).
  • Collaborate with data architects, data scientists, analysts, and business stakeholders to understand data requirements and deliver actionable solutions.
  • Integrate data from multiple heterogeneous sources, including databases, APIs, and external systems, ensuring data quality and consistency.
  • Implement data governance, quality standards, and security controls across data lakes and warehouses.
  • Investigate and resolve data processing issues, monitor system performance, and propose improvements.
  • Prepare and deliver data reports and dashboards to support business decision-making.
  • Stay current with emerging data technologies, tools, and best practices.
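
For illustration, the pipeline and ETL work described above often takes the form of a small orchestrated workflow. The sketch below is a minimal Apache Airflow example under assumed defaults (Airflow 2.4+, with a placeholder DAG id, schedule, and task logic); it is not drawn from any specific project in this portfolio.

```python
# A minimal sketch, assuming Apache Airflow 2.4+ is installed.
# The DAG id, schedule, and all task logic are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Pull raw records from a source system (placeholder data).
    return [{"order_id": 1, "amount": 125000}, {"order_id": 2, "amount": 80000}]


def transform(ti):
    # Reshape the extracted records (placeholder logic, pulled via XCom).
    rows = ti.xcom_pull(task_ids="extract")
    return [{"order_id": r["order_id"], "amount_idr": r["amount"]} for r in rows]


def load(ti):
    # Write the transformed rows to the warehouse (placeholder: print them).
    print(ti.xcom_pull(task_ids="transform"))


with DAG(
    dag_id="example_daily_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Run the three stages in sequence: extract -> transform -> load.
    extract_task >> transform_task >> load_task
```
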
Required Qualifications
  • Minimum 2 years of professional experience as a data engineer, data architect, or related role.
  • Proficient in Spark, Python, Scala, Java, and SQL, with a strong focus on performance tuning (a brief Spark illustration follows this list).
  • Experience with data modeling, dimensional modeling, and data layer design.
  • Hands‑on experience with big data platforms such as Hadoop, Spark, Kafka, and cloud services (AWS, Azure, GCP).
  • Knowledge of ETL tools (Airflow, Luigi, NiFi, Talend, Informatica) and data integration techniques.
  • Familiarity with data warehousing solutions (Redshift, BigQuery, Snowflake, etc.) and experience building and optimizing warehouses.
  • Experience with data visualization tools such as Tableau, Power BI, or similar.
  • Strong problem‑solving, analytical, and communication skills; ability to collaborate across teams.
  • Hold a Bachelor’s degree in Engineering, Computer Science, Information Technology, Statistics, or a related field (Associate degree acceptable for some positions).
  • Preferred: certifications in GCP, Azure, or AWS; experience in finance or e‑commerce domains.
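
As a rough picture of the Spark and SQL proficiency listed above, the PySpark sketch below aggregates a hypothetical orders dataset; the input path and column names (order_date, customer_id, amount) are illustrative only, not taken from the posting.

```python
# A minimal sketch, assuming PySpark is installed and run locally.
# The file path and column names are hypothetical examples.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_revenue_example").getOrCreate()

# Read a raw orders file and aggregate revenue per day.
orders = spark.read.csv("orders.csv", header=True, inferSchema=True)

daily_revenue = (
    orders
    .select("order_date", "customer_id", "amount")  # prune columns early
    .groupBy("order_date")
    .agg(
        F.sum("amount").alias("total_amount"),
        F.countDistinct("customer_id").alias("unique_customers"),
    )
    .orderBy("order_date")
)

daily_revenue.show()
spark.stop()
```
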
Skills & Technologies
  • Programming: Python, Java, Scala, SQL.
  • Big Data: Hadoop, Spark, Kafka.
  • Workflow & Scheduling: Airflow, Luigi, Oozie.
  • Cloud Platforms: AWS, Azure, GCP.
  • Data Modeling & Warehousing: dimensional modeling, Snowflake, BigQuery, Redshift.
  • ETL / Data Integration Tools: NiFi, Talend, Informatica.
  • Data Visualization: Tableau, Power BI.
  • Database Systems: SQL Server, MySQL, PostgreSQL.
  • Additional Tools: SSIS, SSRS, PowerShell, shell scripting.
Education & Experience
  • Bachelor’s degree in Engineering or related field (Associate degree acceptable for some roles).
  • Minimum 2 years of experience in data engineering, ETL, or data architecture.
  • For certain positions, cloud platform certifications are required (e.g., GCP certification within the first 3 months of employment).
Employment Details

Locations: North Jakarta (on‑site), Bintaro, Tangerang, and other Indonesian cities as specified per role.

Employment Type: Full‑time (contract positions also available).

Salary Ranges: IDR 8,000,000 – 12,000,000 per month; IDR 9,000,000 – 12,000,000 per month; IDR 12,000,000 – 16,000,000 per month (varies by company and role).

Posting Status: Actively hiring.

Equal Opportunity Statement

The Devoteam Group is committed to equal opportunities, promoting its employees on the basis of merit and actively fighting against all forms of discrimination. We believe that diversity contributes to the creativity, dynamism, and excellence of our organization. All our positions are open to people with disabilities.
