
Data Engineer - GCP

TN Portugal

Lisboa

On-site

EUR 40 000 - 60 000

Full-time

Posted 2 days ago

Job summary

A leading company in technological advancement is seeking a Data Engineer specializing in GCP. You will design and optimize data infrastructure, collaborate with teams, and ensure data quality. This role offers an exciting opportunity to leverage your expertise in a dynamic environment focused on societal improvement.

Qualifications

  • Expertise in GCP, particularly BigQuery, Dataflow, and Dataproc.
  • Proficiency in SQL and knowledge of Python.
  • Experience with DevOps, Terraform, CI/CD.

Responsibilities

  • Design, develop, and maintain scalable data pipelines on GCP.
  • Collaborate with teams for data integration and transformation.
  • Implement data quality checks and validation routines.

Skills

GCP
SQL
Python
Database Design
DevOps
Agile Methodologies

Job description

Job Title: Data Engineer - GCP

As a Data Engineer specializing in GCP, you will play a crucial role in designing, implementing, and optimizing our data infrastructure. You will work closely with cross-functional teams to drive impactful projects. This is an exciting opportunity to leverage your expertise in GCP while contributing to a company that values technological advancement and societal improvement.

Responsibilities
  1. Design, develop, and maintain scalable data pipelines on Google Cloud Platform (GCP) to support business analytics and reporting needs.
  2. Collaborate with cross-functional teams to gather and analyze requirements for data integration and transformation processes.
  3. Implement data quality checks and validation routines to ensure the accuracy and reliability of data across all systems.
  4. Optimize and monitor data workflows for performance and efficiency, ensuring minimal downtime and maximum throughput.
  5. Stay updated with the latest GCP tools and technologies to continuously improve data infrastructure and processes.
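To illustrate the kind of data quality checks and validation routines mentioned in point 3, a minimal sketch in plain Python (the field names and rules are hypothetical, not from this posting; a production pipeline would run such checks inside Dataflow or before a BigQuery load):

```python
# Minimal sketch of a data-quality validation routine of the kind a
# pipeline stage might run before loading records into a warehouse.
# The schema ("id", "event_ts") and the rule (non-empty required
# fields) are illustrative assumptions.

def validate_records(records, required_fields=("id", "event_ts")):
    """Split records into valid rows and rejects with a reason string."""
    valid, rejects = [], []
    for row in records:
        missing = [f for f in required_fields if row.get(f) in (None, "")]
        if missing:
            rejects.append((row, f"missing fields: {missing}"))
        else:
            valid.append(row)
    return valid, rejects

# Example: one complete row, one row missing its timestamp.
good, bad = validate_records([
    {"id": 1, "event_ts": "2024-01-01T00:00:00Z"},
    {"id": 2, "event_ts": None},
])
# good holds the first row; bad holds the second with its reject reason.
```

Rejected rows would typically be routed to a dead-letter table rather than dropped, so the accuracy of loaded data can be audited later.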
Minimum Requirements
  • Expertise in GCP, particularly data-related services such as BigQuery, Dataflow, and Dataproc
  • Proficiency in SQL
  • In-depth knowledge of one or more programming languages, preferably Python
  • Expertise in database design and data modeling
  • Experience with DevOps, Terraform, CI/CD
  • Experience with Agile methodologies
  • Familiarity with Kafka, Azure Service Bus, Airflow, and dbt is a big plus
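As a rough indication of the SQL proficiency the requirements describe, a self-contained aggregation example; it uses Python's built-in sqlite3 as a stand-in (a real BigQuery job would use the google-cloud-bigquery client, and the `events` table here is an invented example):

```python
import sqlite3

# Stand-in for a warehouse aggregation: count events per day.
# sqlite3 is used only so the example runs anywhere; the table and
# rows are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER, day TEXT)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [(1, "2024-01-01"), (2, "2024-01-01"), (3, "2024-01-02")],
)
rows = conn.execute(
    "SELECT day, COUNT(*) FROM events GROUP BY day ORDER BY day"
).fetchall()
# rows is a list of (day, count) tuples, one per distinct day.
```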