Senior Data Engineer

Devoteam

Jakarta Selatan

On-site

IDR 200.000.000 - 300.000.000

Full time

Job summary

A leading consulting firm is seeking a Technical Lead for Data Engineering in Jakarta Selatan. This role involves designing scalable data platforms using Google Cloud, mentoring junior engineers, and engaging with clients to translate their needs into technical specifications. Ideal candidates have over 5 years in data engineering, proficiency with GCP services, and strong consulting skills.

Qualifications

  • 5+ years of hands-on experience in Data Engineering or Software Development.
  • 2+ years in a technical lead role.
  • Expert proficiency in Google Cloud Platform Data Services.

Responsibilities

  • Lead design and implementation of scalable data platforms using GCP.
  • Act as a trusted advisor to clients on data strategy.
  • Mentor junior data engineers.

Skills

Data Engineering
Google Cloud Platform
Python
SQL
Infrastructure as Code
CI/CD pipelines
Consulting
Problem-solving

Tools

BigQuery
Dataflow
Terraform
Cloud Composer
Pub/Sub

Job description

Devoteam is a leading consulting firm focused on digital strategy, tech platforms, and cybersecurity.

By combining creativity, tech, and data insights, we empower our customers to transform their business and unlock the future.

With 25 years of experience and 8,000 employees across Europe and the Middle East, Devoteam promotes responsible tech for people and works to create better change.

#Creative Tech for Better Change

In January 2021, Devoteam launched its new strategic plan, Infinite 2024, with the ambition to become the #1 EMEA partner of the leading cloud-based platform companies (AWS, Google Cloud, Microsoft, Salesforce, ServiceNow), further reinforced by deep expertise in digital strategy, cybersecurity, and data.

Job Description

Technical Leadership & Architecture

Lead the end-to-end design and implementation of highly scalable, reliable, and secure data platforms and pipelines using the Google Cloud Data & Analytics suite.

Define technical standards, coding best practices, and architectural patterns (e.g., Kimball, Data Vault) for all data engineering solutions delivered to clients.

Serve as the primary technical expert on key GCP services including BigQuery, Dataflow (Apache Beam), Dataproc, Cloud Composer (Apache Airflow), Pub/Sub, and Cloud Storage.
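
For illustration only, here is a minimal Apache Beam (Dataflow) sketch of the kind of streaming pipeline this covers, reading from Pub/Sub and appending to BigQuery. The project, subscription, and table names are hypothetical placeholders, not part of any actual client stack.

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # Hypothetical resource names, for illustration only.
    SUBSCRIPTION = "projects/my-project/subscriptions/events-sub"
    TABLE = "my-project:analytics.raw_events"

    def run():
        options = PipelineOptions(streaming=True)
        with beam.Pipeline(options=options) as pipeline:
            (
                pipeline
                # Read raw message bytes from the Pub/Sub subscription.
                | "ReadFromPubSub" >> beam.io.ReadFromPubSub(subscription=SUBSCRIPTION)
                # Decode each message into a row dict matching the table schema.
                | "Decode" >> beam.Map(lambda message: {"payload": message.decode("utf-8")})
                # Append rows to the BigQuery table.
                | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                    TABLE,
                    schema="payload:STRING",
                    write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                )
            )

    if __name__ == "__main__":
        run()

The same pipeline shape runs locally on the DirectRunner or on Dataflow when launched with --runner=DataflowRunner.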

Drive the adoption of Infrastructure as Code (IaC) principles, primarily using Terraform, for provisioning and managing data infrastructure.

Oversee the implementation of CI/CD pipelines (e.g., using Cloud Build, Jenkins, or GitHub Actions) to automate deployment and testing processes.

Ensure rigorous data quality, governance, and security standards (IAM, encryption) are embedded into every data solution.

Consulting & Client Engagement

Translate ambiguous or complex client business requirements into precise technical specifications and architectural blueprints.

Act as a trusted advisor to senior client stakeholders (e.g., data leads and Data Directors) on data strategy, cloud migration pathways, and modern data stack solutions.

Own project delivery success from a technical perspective, including estimating effort, managing technical risks, and ensuring timely, high-quality solution deployment.

Mentor and coach junior and mid-level data engineers, fostering their growth in GCP expertise, consulting skills, and engineering excellence.

Conduct code reviews, provide constructive feedback, and maintain accountability for the technical output of the project team.

Participate in the recruitment process, helping to build and scale Devoteam’s data engineering capability.

Qualifications

Technical Expertise (Must-Haves)

5+ years of hands-on experience in Data Engineering, Software Development, or a related field.

2+ years in a technical lead role, guiding teams and owning solution architecture.

Expert Proficiency in Google Cloud Platform (GCP) Data Services: Deep, practical experience with BigQuery, Dataflow, Cloud Composer (Airflow), and Pub/Sub.
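
As a hedged sketch of the Cloud Composer (Airflow) side of that stack, the skeleton below shows a two-task DAG; the DAG id, schedule, and bash commands are placeholders invented for the example.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    # Placeholder DAG: orchestrates a toy extract -> load sequence once a day.
    with DAG(
        dag_id="daily_ingest",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        extract = BashOperator(task_id="extract", bash_command="echo extracting")
        load = BashOperator(task_id="load", bash_command="echo loading")
        # Downstream dependency: load runs only after extract succeeds.
        extract >> load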

Programming Languages: Expert-level proficiency in Python (for data processing, API integration, and automation) and SQL (advanced query optimization and data modeling).
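
By way of example, the sort of Python-plus-SQL fluency meant here might look like the following parameterized BigQuery query; the project, dataset, and column names are invented for the sketch.

    from google.cloud import bigquery

    client = bigquery.Client()

    # Hypothetical table and columns, for illustration only.
    query = """
        SELECT user_id, COUNT(*) AS order_count
        FROM `my-project.analytics.orders`
        WHERE order_date >= @since
        GROUP BY user_id
        ORDER BY order_count DESC
    """
    # Bind the @since parameter instead of interpolating it into the SQL.
    job_config = bigquery.QueryJobConfig(
        query_parameters=[
            bigquery.ScalarQueryParameter("since", "DATE", "2024-01-01"),
        ]
    )
    for row in client.query(query, job_config=job_config):
        print(row.user_id, row.order_count)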

Data Modeling & Warehousing: Extensive experience designing and implementing dimensional models (Star/Snowflake schema) and managing Data Lakes/Data Warehouses.

DevOps & Automation: Proven experience with Infrastructure as Code (Terraform) and building robust CI/CD pipelines.

Consulting & Soft Skills

Excellent written and verbal communication skills, with the ability to articulate complex technical concepts to both technical and non-technical audiences.

Demonstrated ability to thrive in a consulting or client-facing project delivery environment.

Strong organizational, analytical, and problem-solving skills.

Fluency in English (additional local language proficiency is a plus).

Preferred (Nice-to-Have) Skills

Certification: Active GCP Professional Data Engineer certification is highly desirable. Additional certifications (Professional Cloud Architect or Machine Learning Engineer) are a bonus.

Advanced Data Tools: Experience with modern transformation tools like Dataform or dbt.

Streaming & Real-time: Experience with real-time data processing and stream analytics (e.g., Kafka, Flink, or highly optimized Pub/Sub and Dataflow).

Big Data Ecosystem: Familiarity with the broader Apache ecosystem (Spark, Hadoop).

AI/ML Integration: Exposure to Vertex AI and enabling data readiness for ML models.

Domain Expertise: Prior project experience in Financial Services, Telecommunications, or Retail.
