
Data Warehouse Architect

Avm Consulting Inc

Remote

BRL 160,000 - 200,000

Full-time

Posted 9 days ago

Job summary

A leading consulting firm is looking for a hands-on Data Warehouse Architect to lead the design and implementation of a Databricks Lakehouse. This role requires a solid background in data architecture with expertise in Databricks and AWS services. You will create data ingestion pipelines, optimize data workflows, and collaborate cross-functionally to ensure high-performance outcomes. A strong familiarity with data warehousing principles and modern data integration tools is essential for success in this role.

Qualifications

  • Hands-on experience with Databricks and AWS services.
  • Advanced PySpark and SQL skills.
  • Strong understanding of data warehousing principles.

Responsibilities

  • Design and implement a new Databricks Lakehouse instance.
  • Build robust data ingestion pipelines using Spark.
  • Integrate AWS-native services with Databricks.

Skills

Databricks
AWS
PySpark
SQL
Delta Lake
Data Warehousing
CI/CD
CloudWatch

Tools

AWS Glue
Terraform
GitHub Actions
Azure DevOps

Job description
Databricks Data Warehouse Architect

About the role: One of the largest companies in the world in the gaming industry is seeking a hands-on Data Architect with data warehouse engineering expertise in Databricks (DBX) and AWS-native data services to spearhead the design and implementation of a new data warehouse instance for a major product line. This role involves building from the ground up: architecting scalable pipelines, optimizing lakehouse performance, and integrating seamlessly with diverse real-time and batch data sources across AWS.

The ideal candidate is passionate about data architecture, thrives in fast-moving environments, and has a proven track record of setting up high-performance lakehouse platforms on Databricks with a strong foundation in data warehousing principles.

Key Responsibilities:

  • Design and deploy a new Databricks Lakehouse instance tailored to the client's product-level data needs.
  • Architect and implement robust data ingestion pipelines using Spark (PySpark / Scala) and Delta Lake (see the sketch after this list).
  • Integrate AWS-native services (S3, Glue, Athena, Redshift, Lambda) with Databricks for optimized performance and scalability.
  • Define data models, optimize query performance, and establish warehouse governance best practices.
  • Collaborate cross-functionally with product teams, data scientists, and DevOps to streamline data workflows.
  • Maintain CI/CD for data pipelines, preferably with DBX, using GitOps and Infrastructure-as-Code.
  • Monitor data jobs and resolve performance bottlenecks or failures across environments.
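
A minimal PySpark sketch of what one such ingestion pipeline might look like: a batch load from S3 into a Delta table. This is an illustration under assumptions, not part of the posting; it presumes a Databricks runtime (where Delta Lake is available), and the bucket paths and the event_id / event_date columns are hypothetical placeholders.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("s3-to-delta-ingest").getOrCreate()

    # Read raw CSV landed in S3 (hypothetical bucket and layout).
    raw = (
        spark.read
        .option("header", "true")
        .csv("s3://example-raw-bucket/events/")
    )

    # Light cleanup: stamp ingestion time and drop duplicate events.
    cleaned = (
        raw.withColumn("ingested_at", F.current_timestamp())
           .dropDuplicates(["event_id"])  # assumes an event_id column exists
    )

    # Append into a partitioned Delta table (hypothetical lake location).
    (
        cleaned.write
        .format("delta")
        .mode("append")
        .partitionBy("event_date")  # assumes an event_date column exists
        .save("s3://example-lake-bucket/bronze/events")
    )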

Required Skills & Experience:

Databricks / Lakehouse Architecture

  • End-to-end setup of Databricks workspaces and Unity Catalog.
  • Expertise in Delta Lake internals, file compaction, and schema enforcement (see the first sketch after this list).
  • Advanced PySpark / SQL skills for ETL and transformations.

AWS Native Integration

  • Deep experience with AWS Glue, S3, Redshift Spectrum, Lambda, and Athena.
  • IAM and VPC configuration knowledge for secure cloud integrations.

Data Warehousing & Modeling

  • Strong grasp of modern dimensional modeling with star / snowflake schemas (see the second sketch after this list).
  • Experience setting up lakehouse design patterns for mixed workloads.

Automation & DevOps

  • Familiarity with CI/CD for data engineering using tools like DBX, Terraform, GitHub Actions, or Azure DevOps.
  • Proficiency with monitoring tools like CloudWatch, Datadog, or New Relic for data pipelines.
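
As a hedged illustration of the Delta Lake internals bullet, this first sketch shows schema enforcement and opt-in schema evolution on write, followed by OPTIMIZE-based file compaction. It assumes a Databricks runtime; the bronze.events table and its columns are invented for the example.

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("delta-maintenance").getOrCreate()

    # Delta enforces the table schema on write: a conflicting batch is rejected.
    # mergeSchema opts in to additive evolution (new columns) instead of failing.
    new_batch = spark.createDataFrame(
        [("e-1001", "purchase", 19.99)],
        ["event_id", "event_type", "amount"],  # hypothetical columns
    )

    (
        new_batch.write
        .format("delta")
        .mode("append")
        .option("mergeSchema", "true")
        .saveAsTable("bronze.events")  # hypothetical table
    )

    # File compaction: OPTIMIZE rewrites many small files into fewer large ones,
    # and ZORDER co-locates rows sharing event_id values for faster point lookups.
    spark.sql("OPTIMIZE bronze.events ZORDER BY (event_id)")

    # Clean up data files no longer referenced by the Delta log
    # (the default retention window applies).
    spark.sql("VACUUM bronze.events")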
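
For the dimensional-modeling bullet, a second sketch: a minimal star schema built with Spark SQL from Python, one fact table keyed to one dimension, plus the kind of join-and-aggregate query it serves. The gold.dim_player / gold.fact_session tables and their columns are invented for illustration.

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("star-schema-sketch").getOrCreate()

    # Dimension table: small, descriptive attributes keyed by a surrogate key.
    spark.sql("""
        CREATE TABLE IF NOT EXISTS gold.dim_player (
            player_key BIGINT,
            player_id  STRING,
            region     STRING
        ) USING DELTA
    """)

    # Fact table: narrow keys plus additive measures, one row per session.
    spark.sql("""
        CREATE TABLE IF NOT EXISTS gold.fact_session (
            player_key   BIGINT,  -- foreign key into gold.dim_player
            session_date DATE,
            duration_sec BIGINT,
            revenue      DECIMAL(10, 2)
        ) USING DELTA
    """)

    # Typical analytical query: join the fact to its dimension and aggregate.
    spark.sql("""
        SELECT d.region, SUM(f.revenue) AS revenue
        FROM gold.fact_session f
        JOIN gold.dim_player d ON f.player_key = d.player_key
        GROUP BY d.region
    """).show()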

Bonus / Nice to Have:

  • Experience supporting gaming or real-time analytics workloads.
  • Familiarity with Airflow, Kafka, or EventBridge.
  • Exposure to data privacy and compliance practices (GDPR, CCPA).

Other Details:

  • Location: Latin America (LATAM) region - Remote; USA - Remote.
  • Length: 1+ year.
  • Client: Gaming giant.
