Data Engineer Specialist

Bank Mega

Daerah Khusus Ibukota Jakarta

On-site

IDR 200.000.000 - 300.000.000

Full time

Job summary

A financial institution in Jakarta is seeking a Data Engineer Specialist to design and build data pipelines, ensuring data quality and compliance. The ideal candidate has a Bachelor's or Master's degree in Computer Science and at least 3 years of experience in data engineering, with proficiency in SQL and programming languages like Python or Java. This role offers opportunities for professional growth in a dynamic environment.

Qualifications

  • Minimum 3 years of experience in data engineering.
  • Strong experience with ETL/ELT tools.
  • Deep understanding of data warehouse and data lake architectures.

Responsibilities

  • Design and build data pipelines and ETL processes for data warehouse/data lake.
  • Monitor, troubleshoot, and optimize data pipelines.
  • Implement data governance, security, and compliance best practices.

Skills

SQL
Python
Java
Data governance
Data modeling
ETL processes
Data quality assurance
Communication skills

Education

Bachelor's or Master's degree in Computer Science, Information Systems, or related field

Tools

Pentaho
Apache Airflow
Talend
Informatica
NiFi
Apache Kafka
Spark Streaming
Flink
SQL Server
PostgreSQL
Oracle
MySQL
DB2

Job description

We're looking for a talented and passionate individual to fill the role of Data Engineer Specialist, with the following details:

Responsibilities

  • Design and build data pipelines and ETL processes for data warehouse/data lake.
  • Design data models and schemas based on business and analytics requirements.
  • Ensure data quality, consistency, and timely delivery across systems.
  • Monitor, troubleshoot, and optimize data pipelines.
  • Implement data governance, security, and compliance best practices in data pipelines.
  • Stay current with emerging data technologies and recommend improvements.

Qualifications

  • Bachelor's or Master's degree in Computer Science, Information Systems, or related field.
  • Minimum 3 years of experience in data engineering.
  • Proficiency in SQL and one or more programming languages such as Python or Java.
  • Strong experience with ETL/ELT tools (e.g., Pentaho, Apache Airflow, Talend, Informatica, or NiFi).
  • Deep understanding of data warehouse and data lake architectures (experience with Snowflake or BigQuery preferred).
  • Experience with streaming and real-time data processing frameworks (e.g., Apache Kafka, Spark Streaming, Flink).
  • Experience with RDBMS (SQL Server, PostgreSQL, Oracle, MySQL, DB2).
  • Strong communication skills with technical and non-technical teams.
  • Experience with systems operations (SysOps) of data tools is preferred.