Senior Data Engineer

Devoteam | Google Cloud Partner

Jawa Barat

Hybrid

IDR 200.000.000 - 300.000.000

Full time

Job summary

A technology consulting company is seeking a Data Engineer in Indonesia to work directly with clients on data pipeline design and optimization. Ideal candidates will have over a year of experience with ETL processes, strong SQL skills, and a solid understanding of data warehousing practices. The role requires excellent communication for effective stakeholder collaboration.

Qualifications

  • 1+ years of experience as a Data Engineer or in a similar role.
  • Strong experience with ETL processes and data pipelines.
  • Solid understanding of data warehouse design, particularly Star Schema modeling.

Responsibilities

  • Engage directly with customers to understand business requirements.
  • Design and build data ingestion and ETL pipelines.
  • Implement data warehouse solutions using best practices.

Skills

Data pipeline design
ETL processes
SQL
Data warehousing
Azure Data Factory
SSIS
Snowflake
Communication

Education

Bachelor’s degree in Computer Science

Tools

Azure Data Factory
SSIS
Snowflake

Job description

Senior Data Engineer – Consulting Team (Jakarta Selatan)

We’re looking for a Data Engineer to join our consulting team and work directly with clients on the design, development, and optimization of data pipelines, ETL workflows, and data warehouses. This role is ideal for someone who enjoys solving complex data challenges while collaborating closely with stakeholders.

Responsibilities
  • Engage directly with customers to understand business requirements and translate them into technical data solutions.
  • Design, build, and maintain data ingestion and ETL pipelines to support analytics and reporting.
  • Implement data warehouse solutions following Star Schema best practices (see the sketch after this list).
  • Develop and orchestrate workflows using Azure Data Factory and/or Microsoft Fabric.
  • Leverage SSIS (SQL Server Integration Services) for ETL and SSAS (SQL Server Analysis Services) for analytical modeling.
  • Build and manage data solutions in Snowflake.
  • Monitor, troubleshoot, and optimize pipelines for performance and reliability.
  • Provide technical guidance and best practices to clients and internal teams.
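
The Star Schema and Snowflake work above can be illustrated with a minimal sketch. Everything here is hypothetical: the sales-mart tables, columns, and connection parameters are invented for illustration, not taken from the posting.

    import snowflake.connector

    # Star Schema: a central fact table keyed to slim dimension tables.
    DDL = """
    CREATE TABLE IF NOT EXISTS dim_customer (
        customer_key INTEGER PRIMARY KEY,
        customer_name VARCHAR,
        region VARCHAR
    );
    CREATE TABLE IF NOT EXISTS dim_date (
        date_key INTEGER PRIMARY KEY,
        full_date DATE,
        month INTEGER,
        year INTEGER
    );
    CREATE TABLE IF NOT EXISTS fact_sales (
        sale_id INTEGER,
        customer_key INTEGER REFERENCES dim_customer (customer_key),
        date_key INTEGER REFERENCES dim_date (date_key),
        quantity INTEGER,
        amount NUMBER(12, 2)
    );
    """

    # Placeholder credentials; a real pipeline would read these from config.
    conn = snowflake.connector.connect(
        account="my_account", user="my_user", password="...",
        warehouse="ETL_WH", database="ANALYTICS", schema="MART",
    )
    cur = conn.cursor()
    for stmt in DDL.split(";"):
        if stmt.strip():
            cur.execute(stmt)
    cur.close()
    conn.close()

Reports then join the fact table to whichever dimensions they need, which is what makes the star shape convenient for BI tools.
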
Qualifications
  • 1+ years of experience as a Data Engineer or in a similar data-focused role.
  • Strong experience with ETL processes, data pipelines, and data ingestion.
  • Solid understanding of data warehouse design, particularly Star Schema modeling.
  • Hands‑on experience with Azure Data Factory and/or Microsoft Fabric.
  • Proficiency with SSIS and SSAS.
  • Experience working with Snowflake.
  • Strong SQL skills and understanding of relational database concepts.
  • Excellent communication skills for engaging with customers and translating requirements into solutions.
Nice to Have
  • Consulting or client‑facing project experience.
  • Exposure to BI tools (Power BI, Tableau) and data governance practices.
  • Knowledge of cloud platforms beyond Azure (AWS, GCP).
Data Science Engineer

Apply data science techniques and machine learning algorithms to solve business problems, improve decision‑making, and ensure the efficient deployment of models in production.

What You’ll Do
  • Understand business objectives and develop models that help achieve them, along with metrics to track progress.
  • Analyze ML algorithms suitable for a given problem.
  • Explore and visualize data to gain an understanding of it.
  • Identify data distribution differences that could affect performance when deploying the model in the real world.
  • Verify data quality, and ensure it through data cleaning where needed.
  • Supervise the data acquisition process if more data is needed.
  • Define preprocessing or feature engineering for a given dataset.
  • Train models and tune hyperparameters (see the sketch after this list).
  • Analyze model errors and design strategies to overcome them.
  • Deploy models to production.
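
As a sketch of the train-and-tune step above, the following uses scikit-learn with a bundled toy dataset standing in for real business data; the pipeline and parameter grid are illustrative assumptions, not part of the role description.

    from sklearn.datasets import load_breast_cancer
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import GridSearchCV, train_test_split
    from sklearn.pipeline import Pipeline
    from sklearn.preprocessing import StandardScaler

    # Toy dataset standing in for real business data.
    X, y = load_breast_cancer(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=42
    )

    # Preprocessing and model live in one pipeline so tuning covers both.
    pipe = Pipeline([
        ("scale", StandardScaler()),
        ("clf", LogisticRegression(max_iter=1000)),
    ])

    # Cross-validated grid search over the regularization strength.
    grid = GridSearchCV(pipe, {"clf__C": [0.01, 0.1, 1.0, 10.0]},
                        cv=5, scoring="roc_auc")
    grid.fit(X_train, y_train)

    print("best C:", grid.best_params_["clf__C"])
    print("held-out AUC:", grid.score(X_test, y_test))

The error-analysis bullet above would then start from the folds and examples this search scored worst on.
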
Requirements
  • Bachelor’s degree in Computer Science, Data Science, Mathematics, or a related field.
  • 4+ years of experience in data science, machine learning, or related fields.
  • Data Science or Machine Learning certifications (e.g., Google Professional Data Engineer, Microsoft Certified: Azure Data Scientist).
  • Experience with specific data science platforms (e.g., AWS Sagemaker, Google AI Platform) is a plus.
Soft Skill Requirements
  • Strong problem‑solving and analytical skills.
  • Effective communication skills for presenting findings to stakeholders.
  • Ability to work collaboratively in a team environment.
  • Adaptability and a proactive approach to problem‑solving.
Technical Skill Requirements
  • Proficiency in data science tools and languages (Python, R, SQL).
  • Expertise in machine learning algorithms and frameworks (e.g., TensorFlow, PyTorch, Scikit‑learn).
  • Strong knowledge of data processing, feature engineering, and model validation techniques.
  • Experience with cloud platforms (e.g., AWS, GCP) and deployment of models to production.
Data Engineer – MST

PT Mitra Solusi Telematika (MST) is seeking a passionate Data Engineer to join our dynamic technology team at our Jakarta office. You will play a crucial role in the design, development, and maintenance of our data infrastructure.

Responsibilities
  • Collaborate with cross‑functional teams to gather data requirements and develop ETL processes for seamless data integration (see the sketch after this list).
  • Continuously monitor and enhance the performance of data warehousing solutions, ensuring scalability and efficiency.
  • Design and implement data models that meet business objectives and adhere to best practices.
  • Implement data governance and security measures to ensure data quality and compliance with regulatory standards.
  • Maintain clear and concise documentation of data pipelines, processes, and configurations.
  • Utilize Snowflake expertise to architect, build, and optimize data pipelines and warehouse solutions for our clients.
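
A minimal sketch of the ETL responsibility referenced above, using pandas and SQLAlchemy; the source file, column names, and warehouse DSN are all placeholders invented for illustration.

    import pandas as pd
    from sqlalchemy import create_engine

    # Extract: a CSV export stands in for the real source system.
    raw = pd.read_csv("orders_export.csv")

    # Transform: type, de-duplicate, and drop unusable rows before loading.
    raw["order_date"] = pd.to_datetime(raw["order_date"], errors="coerce")
    clean = (raw.dropna(subset=["order_id", "order_date"])
                .drop_duplicates(subset="order_id"))
    clean["amount"] = clean["amount"].astype(float)

    # Load: append into a warehouse staging table (placeholder DSN).
    engine = create_engine("snowflake://user:pass@account/ANALYTICS/STAGING")
    clean.to_sql("stg_orders", engine, if_exists="append", index=False)

In practice each step would also log row counts and failures, which feeds the monitoring and documentation duties listed above.
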
Qualifications
  • Bachelor’s degree in computer science, data science, software engineering, information systems, or related field.
  • Minimum of 3 years of experience as a Data Engineer.
  • Proficient in programming languages such as R, SQL, Python, and C++, with experience building ETL processes.
  • Knowledge of visualization tools such as Power BI, Tableau, etc.
  • Relevant certifications or willingness to pursue them.
  • AWS Data Engineer/AWS Cloud Engineer certification highly preferred.
  • Strong communication skills, proactive attitude, and great teamwork traits.
  • Willing to be placed in Jakarta on a 6-month contract with hybrid terms.
Mid‑Level Data Engineer – Insignia

At Insignia, we’re looking for a Mid‑Level Data Engineer who has hands‑on experience with Databricks and solid experience across AWS, GCP, or Azure.

What You’ll Do
  • Design, build, and maintain scalable data pipelines using Databricks (Lakehouse, Delta Lake, Spark); see the sketch after this list.
  • Work across cloud platforms (AWS preferred, also GCP/Azure) – S3, BigQuery, Blob Storage, etc.
  • Transform raw data into structured, reliable datasets for analytics and ML teams.
  • Optimize performance, cost, and governance across data workflows.
  • Collaborate with analysts, MLEs, and software engineers to ensure data readiness.
  • Implement CI/CD, monitoring, and documentation practices for data systems.
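
As a sketch of a Databricks-style pipeline, the following PySpark snippet moves data from a raw ("bronze") landing zone into a cleaned ("silver") Delta table; the bucket path, columns, and table name are assumptions for illustration.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("bronze_to_silver").getOrCreate()

    # Bronze: raw events landed as-is from the source (path is illustrative).
    bronze = spark.read.json("s3://my-bucket/raw/events/")

    # Silver: de-duplicated, typed records in the medallion layout.
    silver = (
        bronze
        .dropDuplicates(["event_id"])
        .withColumn("event_ts", F.to_timestamp("event_ts"))
        .withColumn("event_date", F.to_date("event_ts"))
        .filter(F.col("event_ts").isNotNull())
    )

    # Delta gives downstream readers ACID tables and time travel.
    (silver.write
        .format("delta")
        .mode("overwrite")
        .partitionBy("event_date")
        .saveAsTable("silver.events"))

A "gold" layer of business-level aggregates would then be built from silver.events in the same fashion.
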
Who You Are
  • 2–4 years of experience in data engineering, ideally within tech‑driven or digital service environments.
  • Hands‑on experience with Databricks – including PySpark, SQL, and workflow automation.
  • Proven track record working with at least one major cloud provider: AWS (S3, Glue, Redshift), GCP (BigQuery, Pub/Sub), or Azure (Data Lake, Synapse).
  • Proficient in Python, SQL, and data modeling (medallion architecture, star schema, etc.).
  • Experience with orchestration tools like Airflow, Prefect, or Step Functions.
  • Bonus: Familiarity with Unity Catalog, MLflow, or real‑time streaming (Kafka, Kinesis).
  • Fluent in English – written and spoken.
  • Collaborative, proactive, and passionate about building clean, maintainable data infrastructure.
Why Join Us?

Because great data systems aren’t just fast – they’re trusted, reusable, and built to evolve. If you’re ready to work on high‑impact projects where your pipelines power AI and insight, let’s talk.

Data Engineer – Accord Innovations

Accord Innovations is hiring a Data Engineer for a 3‑month contract. The role focuses on building reliable, scalable data pipelines in a multi‑cloud environment.

Requirements
  • 1+ year of experience as a Data Engineer or similar role.
  • Proficient in Python (primary), Java, or Scala.
  • Hands‑on experience with Apache Kafka & stream processing (see the sketch below).
  • Familiar with RisingWave, ClickHouse, or similar analytical databases.
  • Strong in SQL/NoSQL, data modeling, and ETL/ELT pipelines.
Nice to Have
  • Familiarity with AWS/GCP/Azure (S3, BigQuery, Redshift, etc.).
  • Experience using Grafana, Prometheus, or Kibana for monitoring.
  • Strong performance tuning & data governance awareness.
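
A minimal sketch of Kafka stream processing in Python using the kafka-python client; the topic, broker address, and message fields are invented for illustration.

    import json
    from kafka import KafkaConsumer

    # Consume order events and keep a running total as they arrive.
    consumer = KafkaConsumer(
        "orders",                          # illustrative topic name
        bootstrap_servers="localhost:9092",
        group_id="order-totals",
        auto_offset_reset="earliest",
        value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    )

    running_total = 0.0
    for message in consumer:
        event = message.value
        running_total += float(event.get("amount", 0))
        print(f"order {event.get('order_id')}: total so far {running_total:.2f}")

A production pipeline would replace the print with writes to an analytical store and expose lag and throughput to monitoring (the Grafana/Prometheus point above).
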
Location & Contract

Hybrid – Depok. Contract duration: 3 months (extendable).

Additional Technical Skill Focus (from Insignia and Accord)
  • Proven experience as a Data Engineer with strong data solution design.
  • Programming in Python, Java, or Scala.
  • Stream processing and CDC pipelines, particularly using Apache Kafka.
  • Analytical databases and real‑time data warehouses, especially RisingWave and ClickHouse (see the sketch after this list).
  • Database concepts (SQL & NoSQL) and query optimization for high‑performance analytics.
  • Data ingestion pipelines, ETL/ELT processes, and system integrations.
  • Data modeling, partitioning, and schema evolution in streaming/analytical environments.
  • Cloud platforms (AWS, GCP, or Azure) and services such as S3, BigQuery, Redshift, EMR, or EC2.
  • Observability and monitoring tools (Grafana, Prometheus, Kibana) for data pipeline monitoring.
  • Performance tuning for highly concurrent workloads.
  • Basic understanding of data governance, security, and compliance practices.
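
To make the partitioning and analytical-database points above concrete, here is a small ClickHouse sketch via the clickhouse-driver Python client; the table, columns, and host are illustrative assumptions.

    from clickhouse_driver import Client

    client = Client(host="localhost")  # connection details are placeholders

    # Monthly partitions plus a sort key suited to time-range analytics.
    client.execute("""
        CREATE TABLE IF NOT EXISTS events (
            event_id   UInt64,
            user_id    UInt64,
            event_type LowCardinality(String),
            event_ts   DateTime
        )
        ENGINE = MergeTree
        PARTITION BY toYYYYMM(event_ts)
        ORDER BY (event_type, event_ts)
    """)

    # A typical query: the partition key lets ClickHouse prune whole months.
    rows = client.execute("""
        SELECT event_type, count() AS n
        FROM events
        WHERE event_ts >= now() - INTERVAL 7 DAY
        GROUP BY event_type
        ORDER BY n DESC
    """)
    print(rows)

The same modeling questions (partition key, sort key, schema evolution) apply to RisingWave and the other real-time warehouses named above.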