Senior Data Engineer (Jakarta, IDR 9,000,000 – IDR 12,000,000) – Kredit Pintar
Location: Jakarta
Job Type: Full‑time
Responsibilities
- Design and build data warehouses for financial projects, creating data models per business requirements and implementing ETL processes.
- Provide data report support for business departments.
- Optimize ETL to improve code efficiency and reduce costs.
Qualifications
- Proficient in Spark, with experience developing in Python and Scala and tuning performance (a minimal PySpark sketch follows this list).
- Familiar with data warehouse modeling theories and data layer design.
- Experience with Tableau development is a plus.
- Background in finance and e‑commerce projects is preferred.
- Strong teamwork, analytical, and communication skills.
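As a rough illustration of the Spark-in-Python skills this posting asks for, below is a minimal PySpark sketch of a warehouse-style aggregation with a partitioned write; the paths and column names (transactions, merchant_id, amount, created_at) are hypothetical placeholders, not Kredit Pintar's actual schema.

```python
from pyspark.sql import SparkSession, functions as F

# Start a Spark session; a real job would carry cluster-specific configuration.
spark = SparkSession.builder.appName("warehouse_etl_sketch").getOrCreate()

# Hypothetical source: raw transaction records landed as Parquet.
transactions = spark.read.parquet("s3://raw-zone/transactions/")

# Transform: daily spend per merchant, a typical fact-table rollup.
daily_spend = (
    transactions
    .withColumn("txn_date", F.to_date("created_at"))
    .groupBy("merchant_id", "txn_date")
    .agg(
        F.sum("amount").alias("total_amount"),
        F.count("*").alias("txn_count"),
    )
)

# Load: write the rollup back to the curated layer, partitioned by date.
daily_spend.write.mode("overwrite").partitionBy("txn_date").parquet(
    "s3://curated-zone/daily_merchant_spend/"
)

spark.stop()
```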
Data Engineer – Devoteam
A role within Devoteam, a leading consulting firm focused on digital strategy, technology platforms, and cybersecurity.
Responsibilities
- Design scalable and robust data architectures with data architects and stakeholders.
- Develop and maintain data pipelines including extraction, transformation, and loading to warehouses or lakes.
- Manage data warehouses and lakes for performance, scalability, and security.
- Integrate data from databases, APIs, and external systems.
- Implement ETL processes, ensuring data quality and governance.
- Collaborate with data scientists, analysts, and stakeholders to set quality standards.
- Optimize processing and storage for performance and scalability.
Qualifications
- Proficient in Python, Java, Scala, or SQL for data engineering.
- Strong relational database knowledge and experience with schema design and query optimization.
- Experience with Apache Hadoop, Spark, Kafka, and distributed computing.
- Skilled in ETL processes and tools such as Apache NiFi, Talend, or Informatica.
- Familiarity with data warehousing platforms like Redshift, BigQuery, or Snowflake.
- Knowledge of AWS, Azure, or GCP and deploying data engineering solutions in the cloud.
- Experience with workflow orchestration tools such as Airflow, Luigi, or Oozie (a minimal Airflow sketch follows this list).
- Strong problem-solving and analytical skills, with effective communication.
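As a small illustration of the workflow-orchestration experience listed above, here is a minimal Airflow DAG sketch (assuming Airflow 2.4 or newer) wiring an extract-transform-load sequence; the task bodies and the DAG name are placeholders, not Devoteam's actual pipelines.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    # Placeholder: pull a batch of rows from a source database or API.
    pass

def transform():
    # Placeholder: clean and reshape the extracted batch.
    pass

def load():
    # Placeholder: write the transformed batch to the warehouse or lake.
    pass

with DAG(
    dag_id="etl_sketch",           # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Run the three steps in order.
    extract_task >> transform_task >> load_task
```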
Data Engineer – PT Intikom Berlian Mustika
Scope of Work – Data Engineer.
Responsibilities
- Explore and develop Datamart to support DI/DX needs.
- Handle ad‑hoc requests for querying, data extraction, and preparation for DI/DX.
- Validate extracted data for quality and reliability.
- Ensure data consistency and accessibility for DXO stakeholders.
Qualifications
- Bachelor’s degree (S‑1).
- Experience with data pipelines, ETL, and data integration.
- Strong SQL skills and experience with data warehouse and Datamart tools.
- Knowledge of shell scripting or VBScript is a plus.
Data Engineer – Unspecified Company
Required experience: 3+ years as a Data Engineer.
- Writing SQL queries, functions, and stored procedures is mandatory.
- Experience with Java API development is an advantage.
- SSRS reporting, Power BI, Google BigQuery, and ETL/data pipeline experience are mandatory (a minimal BigQuery client sketch follows this list).
- Python data science experience is advantageous.
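Since the listing treats Google BigQuery and SQL query writing as mandatory, here is a minimal sketch of running a query through the google-cloud-bigquery Python client; the project, dataset, table, and column names are hypothetical, not taken from the posting.

```python
from google.cloud import bigquery

# The client reads credentials from the environment (e.g. GOOGLE_APPLICATION_CREDENTIALS).
client = bigquery.Client(project="example-project")  # hypothetical project id

# Hypothetical query: daily order totals from a warehouse table.
query = """
    SELECT DATE(created_at) AS order_date, SUM(amount) AS total_amount
    FROM `example-project.sales.orders`
    GROUP BY order_date
    ORDER BY order_date
"""

for row in client.query(query).result():
    print(row.order_date, row.total_amount)
```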
Data Engineer – PT Astra Graphia Information Technology (AGIT)
- Build and optimize robust ETL pipelines for multiple data sources into a centralized warehouse or lake.
- Integrate heterogeneous sources, ensuring data quality, consistency, and availability.
- Monitor performance, identify bottlenecks, and resolve data quality or processing issues.
Data Engineer – PT Mandiri Sekuritas
Role Description
A contract, on-site role in Jakarta, responsible for designing, developing, and managing data pipelines and infrastructure while ensuring data quality and consistency.
Qualifications
- 3–5 years building large‑scale pipelines in cloud or hybrid environments.
- Strong SQL, Python, and Java; scripting with Bash.
- Hands-on experience with GCP, Azure, relational and non-relational databases, and on-premises Hadoop.
- Experience with Airflow DAGs, Spark, and Flink.
- Knowledge of CI/CD and infrastructure/containerization tooling (Git, Terraform, Helm, Docker, Kubernetes).
- Understanding of distributed systems and financial data regulations.
- Excellent at communicating with non-technical stakeholders.
Data Engineer – Cube Asia
As a Data Engineer at Cube Asia, you will turn raw data into useful data systems that align with business goals.
Responsibilities
- Build and maintain scalable pipelines for e‑commerce data integration.
- Design cloud‑based lakehouse architecture using AWS services.
- Explore high-performance transformation tools such as Polars, DuckDB, and PySpark (a minimal DuckDB sketch follows this list).
- Collaborate with analysts to make datasets accessible.
- Work with architects on platform design.
- Enhance data quality and reliability.
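As a small taste of the high-performance transformation tools named above, here is a minimal DuckDB sketch that aggregates a raw e-commerce extract straight from a CSV file; the file name and columns (marketplace, gmv) are hypothetical.

```python
import duckdb

# Query a hypothetical raw e-commerce extract in place, without loading it into a database first.
result = duckdb.sql("""
    SELECT marketplace, SUM(gmv) AS total_gmv, COUNT(*) AS order_count
    FROM read_csv_auto('orders.csv')
    GROUP BY marketplace
    ORDER BY total_gmv DESC
""")

# Hand the result to analysts as a pandas DataFrame.
df = result.df()
print(df.head())
```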
What You'll Love
- Build from scratch and shape the future of our platform.
- Hands‑on experience with modern tools, cloud technologies, and real‑world challenges.
- Work on complex projects from integration to optimization.
Requirements
- Programming languages: Java and Python.
- SQL database design expertise.
- Prior data engineer or similar role experience.
- Data modeling, mining, and segmentation skills.
- Strong analytical skills.
- Willingness to learn new tools and technologies.
Data Engineer (ETL) – Accord Innovations
Full WFO (work from office), 12-month contract in Jakarta, banking industry.
Responsibilities
- Design, deploy, and maintain robust SSIS ETL workflows.
- Build and maintain operational and analytical SSRS reports.
- Collaborate with analysts and stakeholders on requirements.
- Optimize SSIS packages for performance and error handling.
- Perform data profiling, validation, and reconciliation (see the sketch after this list).
- Maintain and improve existing SSIS/SSRS solutions.
- Document ETL designs and workflow processes.
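For the profiling and reconciliation responsibility above, here is a minimal pandas sketch of a source-to-target check; the file names, key column, and amount column are hypothetical, and in a real SSIS environment the comparison would typically run against the source and target databases directly.

```python
import pandas as pd

# Hypothetical extracts: rows pulled from the source system and from the loaded target table.
source = pd.read_csv("source_extract.csv")
target = pd.read_csv("target_extract.csv")

# Basic reconciliation: row counts and control totals should match after the ETL run.
checks = {
    "row_count_match": len(source) == len(target),
    "amount_total_match": source["amount"].sum() == target["amount"].sum(),
}

# Detail check: business keys present in the source but missing from the target.
missing_keys = set(source["transaction_id"]) - set(target["transaction_id"])

print(checks)
print(f"{len(missing_keys)} source keys missing from target")
```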
Data Engineer – Fast‑growing Digital Finance Platform
Supporting scalable data infrastructure and analytical capabilities across Southeast Asia.
Key Responsibilities
- Develop and maintain scalable data infrastructure, databases, and pipelines.
- Build and manage ingestion and transformation workflows.
- Apply best practices for stability, availability, and performance.
- Collaborate with engineering, data science, and product teams.
- Design large‑scale efficient pipelines.
- Translate user needs into practical tools and capabilities.
Qualifications
- 1–2 years of data engineering or backend development experience.
- Solid coding skills in Python, Java, or Scala.
- Familiar with Git, Maven, Docker, and Kubernetes.
- Experience with Spark, Kafka, Flink, Flume, and Airflow.
- Comfortable with both relational and NoSQL databases.
- Prior cloud‑based data platform experience.
- Strong analytical mindset and communication skills.
- Motivation to stay current with emerging technologies.
- Experience with automation and DevOps is a plus.
Equal Opportunity Statement
The Devoteam Group is committed to equal opportunities, promoting its employees on the basis of merit and actively fighting against all forms of discrimination. We believe that diversity contributes to the creativity, dynamism, and excellence of our organization. All our positions are open to people with disabilities.