
Data Engineer (GCP)

U3 INFOTECH PTE. LTD.

Singapore

On-site

SGD 70,000 - 100,000

Full time

Today

Job summary

A tech consultancy in Singapore is seeking a skilled Data Engineer for a 12-month contract. You will design and maintain data pipelines using Google Cloud Platform (GCP) to support data science initiatives. The ideal candidate has over 4 years of experience, strong GCP knowledge, and proficiency in Python and SQL. This role involves collaboration with cross-functional teams to enhance data accessibility and quality.

Qualifications

  • 4+ years of experience as a Data Engineer or in a similar role, preferably with GCP expertise.
  • Strong proficiency in SQL and experience with NoSQL databases.
  • Expertise in data modeling, ETL processes, and data warehousing concepts.
  • Significant experience with GCP services such as BigQuery, Dataflow, and Cloud Storage.
  • Proficiency in at least one programming language for data pipeline development.
  • GCP certifications are highly advantageous.

Responsibilities

  • Design, build, and maintain scalable data pipelines and ETL processes using GCP services.
  • Implement and optimize data storage solutions using GCP technologies.
  • Collaborate with data scientists to understand data requirements.

Skills

BigQuery
ETL
Data Management
Python
SQL
Kubernetes
Cloud computing (GCP)

Education

BSc/MSc in Computer Science, Information Systems, or related field

Tools

Cloud Storage
Hadoop
Kafka

Job description

Overview

Role: Data Engineer

Location: Beach Road

Duration: 12-month extendable contract

This role sits within the client's Data Science team, which has earned strong recognition. The Data Engineer will engage with external clients and internal customers, understand their needs, and design, build, and maintain data pipelines and infrastructure using Google Cloud Platform (GCP).

This will involve the design and implementation of scalable data architectures, ETL processes, and data warehousing solutions on GCP. The role requires expertise in big data technologies, cloud computing, and data integration, as well as the ability to optimize data systems for performance and reliability.

Success in the role calls for a blend of skills spanning programming, database management, cloud infrastructure, and data pipeline development. Problem-solving skills, attention to detail, and the ability to work in a fast-paced environment are also valued.

You will frequently work as part of a scrum team, together with data scientists, ML engineers, and analyst developers, to design and implement robust data infrastructure that supports analytics and machine learning initiatives.

Responsibilities
  • Design, build, and maintain scalable data pipelines and ETL processes using GCP services such as Cloud Dataflow, Cloud Dataproc, and BigQuery (an illustrative sketch follows this list).
  • Implement and optimize data storage solutions using GCP technologies like Cloud Storage, Cloud SQL, and Cloud Spanner.
  • Develop and maintain data warehouses and data lakes on GCP, ensuring data quality, accessibility, and security.
  • Collaborate with data scientists and analysts to understand data requirements and provide efficient data access solutions.
  • Implement data governance and security measures to ensure compliance with regulations and best practices.
  • Automate data workflows and implement monitoring and alerting systems for data pipelines.
  • Share data engineering knowledge across the wider function and develop reusable data integration patterns and best practices.
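As a rough illustration of the pipeline work described in the first bullet above, the sketch below shows a minimal Apache Beam job that reads CSV files from Cloud Storage and loads them into BigQuery via the Dataflow runner. It is not the client's actual pipeline; the project, bucket, table, and column names are placeholders.

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    def parse_line(line):
        # Hypothetical "event_id,event_ts,amount" CSV layout.
        event_id, event_ts, amount = line.split(",")
        return {"event_id": event_id, "event_ts": event_ts, "amount": float(amount)}

    options = PipelineOptions(
        runner="DataflowRunner",              # use "DirectRunner" for local testing
        project="my-gcp-project",             # placeholder project id
        region="asia-southeast1",
        temp_location="gs://my-bucket/tmp",   # placeholder bucket
    )

    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadFromGCS" >> beam.io.ReadFromText("gs://my-bucket/events/*.csv")
            | "ParseCSV" >> beam.Map(parse_line)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-gcp-project:analytics.events",
                schema="event_id:STRING,event_ts:TIMESTAMP,amount:FLOAT",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )
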
Technical Skills

Mandatory Skills:

  • BigQuery, ETL, Data Management, Python, SQL, Kubernetes, Cloud computing (GCP)

Optional Skills:

  • Cloud Storage, Hadoop, Kafka

Qualifications
  • BSc/MSc in Computer Science, Information Systems, or related field, or equivalent work experience.
  • Proven experience (4+ years) as a Data Engineer or in a similar role, preferably with GCP expertise.
  • Strong proficiency in SQL and experience with NoSQL databases.
  • Expertise in data modeling, ETL processes, and data warehousing concepts.
  • Significant experience with GCP services such as BigQuery, Dataflow, Dataproc, Cloud Storage, and Pub/Sub.
  • Proficiency in at least one programming language (e.g., Python, Java, or Scala) for data pipeline development (a brief illustration follows this list).
  • Experience with big data technologies such as Hadoop, Spark, and Kafka.
  • Knowledge of data governance, security, and compliance best practices.
  • GCP certifications (e.g., Professional Data Engineer) are highly advantageous.
  • Effective communication skills to collaborate with cross-functional teams and explain technical concepts to non-technical stakeholders.
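As a brief, hypothetical illustration of the Python, SQL, and BigQuery proficiency listed above, the snippet below runs an aggregation query with the google-cloud-bigquery client library; the project, dataset, and table names are placeholders.

    from google.cloud import bigquery

    client = bigquery.Client(project="my-gcp-project")  # placeholder project id

    sql = """
        SELECT DATE(event_ts) AS event_date,
               COUNT(*)       AS events,
               SUM(amount)    AS total_amount
        FROM `my-gcp-project.analytics.events`
        WHERE event_ts >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
        GROUP BY event_date
        ORDER BY event_date
    """

    # query() submits the job; result() waits for completion and iterates rows.
    for row in client.query(sql).result():
        print(row.event_date, row.events, row.total_amount)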