

High-Performance Data Architect

Bebeedata

Manaus

On-site

BRL 120,000 - 160,000

Full-time

Posted yesterday

Job summary

A leading technology company in Manaus is seeking a Senior Data Engineer adept in Google Cloud Platform services. This role involves designing data architectures and developing ETL/ELT pipelines to support analytics and machine learning initiatives. The ideal candidate will have experience with SQL, CI/CD, and orchestration tools. Join a passionate team that values innovation and diversity while enjoying a competitive salary and benefits package.

Benefits

Competitive salary
Comprehensive benefits package
Opportunities for professional growth

Qualifications

  • Proficiency in GCP services is a must for the role.
  • Experience with SQL, Oracle, and PostgreSQL is required.
  • Knowledge of orchestration using Airflow is necessary.

Responsibilities

  • Design and implement data architectures on GCP.
  • Develop scalable ETL/ELT pipelines ensuring data quality.
  • Collaborate with cross-functional teams for data analytics.

Skills

Strong proficiency in GCP services
Experience with SQL
Knowledge of orchestration using Cloud Composer
Hands-on experience with CI/CD
Experience with cloud cost optimization
GCP certifications
Knowledge of Kubernetes
Experience with Machine Learning pipelines
Involvement with Data Mesh architectures

Tools

BigQuery
Dataflow
Dataproc
Pub/Sub
Cloud Storage
Composer
Git
Terraform

Job Description

We are seeking an experienced Senior Data Engineer to join our team.

The ideal candidate will design and implement data architectures on Google Cloud Platform (GCP) using services such as BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Storage, and Composer, and will collaborate closely with data scientists, analysts, and software engineers to support advanced analytics and machine learning use cases. Day to day, you will develop scalable, high-performance ETL/ELT pipelines, ensure data quality, integrity, and security end to end, and work with cross-functional teams to design architectures that meet business needs.
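To make the ETL/ELT and data-quality expectations above concrete, here is a minimal, illustrative-only sketch of the extract → transform → load shape with a quality gate before loading. All record fields and function names are hypothetical; a real pipeline at this company would presumably read from Pub/Sub or Cloud Storage and write to BigQuery rather than work on in-memory lists.

```python
def extract():
    # Stand-in for reading raw records from a source system.
    return [
        {"order_id": 1, "amount": "19.90"},
        {"order_id": 2, "amount": "5.00"},
        {"order_id": None, "amount": "3.10"},  # bad record: missing key
    ]

def transform(rows):
    # Normalize types and drop records that fail basic quality checks.
    clean = []
    for row in rows:
        if row["order_id"] is None:
            continue  # a real pipeline would quarantine and log this record
        clean.append({"order_id": row["order_id"], "amount": float(row["amount"])})
    return clean

def load(rows):
    # Stand-in for a BigQuery load job; here we just return what would be written.
    return rows

def run_pipeline():
    rows = transform(extract())
    # Data-quality gate: refuse to load an empty or duplicated batch.
    ids = [r["order_id"] for r in rows]
    assert rows and len(ids) == len(set(ids)), "quality check failed"
    return load(rows)

print(run_pipeline())
```

The gate-before-load pattern is the key point: bad records are filtered or quarantined in `transform`, and the batch as a whole is validated before anything reaches the warehouse.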

To succeed in this role, you should have proficiency in GCP services, experience with SQL, Oracle Database, and PostgreSQL, knowledge of orchestration using Cloud Composer (Airflow), and hands‑on experience with CI/CD applied to data pipelines (Git, Terraform). You should also have experience with cloud cost and performance optimization, GCP certifications, knowledge of Kubernetes (GKE) and APIs on GCP, experience with Machine Learning pipelines (Vertex AI, AI Platform), and involvement with Data Mesh and distributed architectures.

Requirements
  • Strong proficiency in GCP services, including BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Storage, and Composer
  • Experience with SQL, Oracle Database, and PostgreSQL
  • Knowledge of orchestration using Cloud Composer (Airflow)
  • Hands‑on experience with CI/CD applied to data pipelines (Git, Terraform)
  • Experience with cloud cost and performance optimization
  • GCP certifications
  • Knowledge of Kubernetes (GKE) and APIs on GCP
  • Experience with Machine Learning pipelines (Vertex AI, AI Platform)
  • Involvement with Data Mesh and distributed architectures
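The orchestration requirement above (Cloud Composer, i.e. managed Airflow) amounts to executing pipeline tasks in dependency order. A stdlib-only sketch of that core idea, with a hypothetical task graph standing in for an Airflow DAG:

```python
from graphlib import TopologicalSorter

# Each key depends on the tasks in its value set (its upstream tasks).
# Task names are hypothetical, mirroring the ETL stages in this posting.
DAG = {
    "extract": set(),
    "transform": {"extract"},
    "quality_check": {"transform"},
    "load": {"quality_check"},
}

def run_dag(dag):
    # TopologicalSorter yields each task only after its dependencies.
    order = list(TopologicalSorter(dag).static_order())
    for task in order:
        pass  # a real orchestrator would execute the task's operator here
    return order

print(run_dag(DAG))  # -> ['extract', 'transform', 'quality_check', 'load']
```

In Composer the same graph would be declared with Airflow operators and `>>` dependencies; the scheduling, retries, and backfills that Airflow adds on top are what make it worth running as a managed service.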

Benefits

The company offers a competitive salary, comprehensive benefits package, and opportunities for professional growth and development.

Why Join Our Team

Our team is passionate about innovation and collaboration.

We believe in creating a workplace culture that values diversity, equity, and inclusion.
