
Transform Your Career As A Data Leader

Bebeedataengineer

Porto Alegre

On-site

BRL 160,000 - 200,000

Full-time

Today

Job summary

A leading data solutions provider in Brazil is seeking a seasoned Senior Data Engineer to lead the design and implementation of cloud-based data architectures on Google Cloud Platform. The candidate will be responsible for developing and optimizing high-performance ETL/ELT pipelines, ensuring data quality, and collaborating with data scientists and analysts. This role offers a competitive compensation package within a collaborative environment focused on continuous learning and cutting-edge data projects.

Benefits

Opportunity to work on cutting-edge data projects
Collaborative environment with experienced professionals
Continuous learning and development opportunities
Competitive compensation and benefits package

Qualifications

  • Strong experience designing, developing, and deploying ETL/ELT pipelines.
  • Familiarity with CI/CD for data pipelines and related tools.
  • Understanding of data governance best practices.

Responsibilities

  • Design and implement data architectures using GCP services.
  • Develop and optimize scalable ETL/ELT pipelines.
  • Ensure data quality and integrity.
  • Collaborate with data scientists and analysts.

Skills

BigQuery
Dataflow (Apache Beam)
Cloud Storage
Pub/Sub
SQL
Oracle Database
PostgreSQL
Cloud Composer (Airflow)
Git
Terraform
Kubernetes (GKE)
Machine Learning (Vertex AI, AI Platform)

Education

GCP certifications

Job description

Senior Data Engineer

We are seeking a seasoned Data Engineer to lead the design and implementation of our cloud-based data architectures on Google Cloud Platform (GCP).

This is an exciting opportunity for a highly skilled professional to join our team and contribute to the development of cutting-edge data solutions. The ideal candidate will have expertise in designing, developing, and deploying scalable, high-performance ETL/ELT pipelines using GCP services such as BigQuery, Dataflow, Dataproc, and Pub/Sub.
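
For illustration only, here is a minimal sketch of the kind of streaming ELT pipeline described above: reading JSON events from Pub/Sub and appending them to a BigQuery table with Apache Beam. The project, topic, table, and schema names are hypothetical placeholders, not details from this posting.

    import json

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # Streaming mode; on GCP this would run with --runner=DataflowRunner.
    options = PipelineOptions(streaming=True)

    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            # Hypothetical topic; Pub/Sub delivers messages as bytes.
            | "ReadEvents" >> beam.io.ReadFromPubSub(
                topic="projects/my-project/topics/events")
            | "ParseJson" >> beam.Map(json.loads)
            # Hypothetical table and schema; append each parsed event as a row.
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events",
                schema="user_id:STRING,event:STRING,ts:TIMESTAMP",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )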

Responsibilities

  • Design and implement data architectures on GCP using services like BigQuery, Dataflow, Dataproc, Pub/Sub, and Cloud Storage.
  • Develop and optimize scalable, high-performance ETL/ELT pipelines.
  • Ensure data quality, integrity, and security end-to-end (see the quality-gate sketch after this list).
  • Create and maintain data models aligned with business needs.
  • Collaborate with data scientists, analysts, and software engineers to support advanced analytics and machine learning use cases.
  • Automate ingestion, transformation, and data delivery processes.
  • Monitor and optimize cost and performance of GCP resources.
  • Implement best practices for DataOps and Data Governance.
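
As a purely illustrative example of the quality-gate responsibility above, the following sketch uses the google-cloud-bigquery client to fail a run when a load produces NULL keys; the project, dataset, and column names are hypothetical.

    from google.cloud import bigquery

    client = bigquery.Client(project="my-project")  # hypothetical project

    # Count rows that violate a simple integrity rule: user_id must not be NULL.
    query = """
        SELECT COUNT(*) AS bad_rows
        FROM `my-project.analytics.events`
        WHERE user_id IS NULL
    """
    bad_rows = next(iter(client.query(query).result())).bad_rows

    # Raising here lets an orchestrator (e.g. Airflow) mark the run as failed.
    if bad_rows:
        raise ValueError(f"Quality check failed: {bad_rows} rows with NULL user_id")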

Requirements

  • Proficiency in BigQuery, Dataflow (Apache Beam), Cloud Storage, and Pub/Sub.
  • Experience with SQL, Oracle Database, and PostgreSQL.
  • Knowledge of orchestration using Cloud Composer (Airflow); a minimal DAG sketch follows this list.
  • Hands-on experience with CI/CD applied to data pipelines (Git, Terraform).
  • Experience with cloud cost and performance optimization.
  • GCP certifications.
  • Knowledge of Kubernetes (GKE) and APIs on GCP.
  • Experience with Machine Learning pipelines (Vertex AI, AI Platform).
  • Previous involvement with Data Mesh and distributed architectures.
  • Understanding of Data Lake layers.
  • Knowledge of batch and streaming processing.
  • Experience with data modeling (relational, dimensional, and NoSQL).
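
As a rough sketch of what Cloud Composer orchestration can look like (assuming Airflow 2.x with the Google provider installed; every name below is a hypothetical placeholder), a daily DAG might run an in-warehouse ELT step in BigQuery:

    import pendulum
    from airflow import DAG
    from airflow.providers.google.cloud.operators.bigquery import (
        BigQueryInsertJobOperator,
    )

    with DAG(
        dag_id="daily_elt",  # hypothetical DAG name
        schedule="@daily",
        start_date=pendulum.datetime(2024, 1, 1, tz="UTC"),
        catchup=False,
    ) as dag:
        # ELT step: aggregate raw events into a reporting table inside BigQuery.
        transform_events = BigQueryInsertJobOperator(
            task_id="transform_events",
            configuration={
                "query": {
                    "query": (
                        "SELECT user_id, COUNT(*) AS events "
                        "FROM `my-project.raw.events` GROUP BY user_id"
                    ),
                    "destinationTable": {
                        "projectId": "my-project",
                        "datasetId": "analytics",
                        "tableId": "user_event_counts",
                    },
                    "writeDisposition": "WRITE_TRUNCATE",
                    "useLegacySql": False,
                },
            },
        )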

Benefits

  • Opportunity to work on cutting-edge data projects.
  • Collaborative environment with experienced professionals.
  • Continuous learning and development opportunities.
  • Competitive compensation and benefits package.
