Job Search and Career Advice Platform

Data Warehouse Architect

AVM Consulting Inc

Remote

BRL 160,000 - 200,000

Full-time

Today

Job summary

A leading consulting firm is looking for a Data Warehouse Architect skilled in Databricks and AWS-native data services. The role involves architecting and deploying a new data warehouse instance and integrating a range of AWS services. Candidates should bring advanced PySpark and SQL skills, a strong grounding in data warehousing principles, and experience with data privacy compliance. This highly technical role offers the opportunity to work in dynamic environments and collaborate with cross-functional teams to optimize data workflows.

Qualifications

  • End-to-end setup of Databricks workspaces and Unity Catalog.
  • Strong grasp of dimensional modeling and lakehouse design patterns.
  • Familiarity with data privacy practices (GDPR, CCPA).

Responsibilities

  • Design and deploy a new Databricks Lakehouse instance.
  • Integrate AWS-native services for optimized performance.
  • Collaborate with teams to streamline data workflows.

Skills

Databricks / Lakehouse Architecture
Advanced PySpark / SQL skills
AWS Native Integration
Data Warehousing & Modeling
Automation & DevOps
Familiarity with CI / CD tools

Tools

GitHub Actions
Terraform
DBX

Job description
Databricks Data Warehouse Architect

About the role: One of the largest companies in the world in the Gaming industry is seeking a hands‑on Data Architect with data warehouse engineering expertise in Databricks (DBX) and AWS‑native data services to spearhead the design and implementation of a new data warehouse instance for a major product line.

This role will involve building from the ground up—architecting scalable pipelines, optimizing lakehouse performance, and integrating seamlessly with diverse real‑time and batch data sources across AWS. The ideal candidate is passionate about data architecture, thrives in fast‑moving environments, and has a proven track record of setting up high-performance lakehouse platforms on Databricks with a strong foundation in data warehousing principles.

Key Responsibilities
  • Design and deploy a new Databricks Lakehouse instance tailored to the client's product‑level data needs.
  • Architect and implement robust data ingestion pipelines using Spark (PySpark / Scala) and Delta Lake.
  • Integrate AWS‑native services (S3, Glue, Athena, Redshift, Lambda) with Databricks for optimized performance and scalability.
  • Define data models, optimize query performance, and establish warehouse governance best practices.
  • Collaborate cross‑functionally with product teams, data scientists, and DevOps to streamline data workflows.
  • Maintain CI / CD for data pipelines (preferably with DBX), using GitOps and Infrastructure‑as‑Code.
  • Monitor data jobs and resolve performance bottlenecks or failures across environments.
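The schema-governance side of the responsibilities above can be sketched in a few lines. This is a plain-Python stand-in for the write-time schema enforcement that Delta Lake applies when appending to a table (the actual role would use PySpark DataFrames and Delta writers); the event schema and records are invented for illustration and are not part of the posting.

```python
# Illustrative stand-in for Delta Lake's write-time schema enforcement.
# Pure stdlib; schema and records are invented for this sketch.

EVENT_SCHEMA = {"player_id": int, "event": str, "score": float}

def enforce_schema(record: dict) -> dict:
    """Reject records whose fields or types don't match the declared schema."""
    if set(record) != set(EVENT_SCHEMA):
        raise ValueError(f"schema mismatch: {sorted(record)}")
    for field, expected in EVENT_SCHEMA.items():
        if not isinstance(record[field], expected):
            raise TypeError(f"{field}: expected {expected.__name__}")
    return record

def ingest(batch: list[dict]) -> tuple[list[dict], list[dict]]:
    """Split a batch into accepted rows and a dead-letter list of rejects."""
    accepted, rejected = [], []
    for rec in batch:
        try:
            accepted.append(enforce_schema(rec))
        except (ValueError, TypeError):
            rejected.append(rec)
    return accepted, rejected

batch = [
    {"player_id": 1, "event": "level_up", "score": 120.0},
    {"player_id": "2", "event": "login", "score": 0.0},  # wrong type -> rejected
    {"player_id": 3, "event": "purchase"},               # missing field -> rejected
]
good, bad = ingest(batch)
```

In a real lakehouse this check lives in the table format itself (Delta rejects non-conforming writes), with rejected rows typically routed to a quarantine table rather than dropped.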
Required Skills & Experience
  • Databricks / Lakehouse Architecture: end‑to‑end setup of Databricks workspaces and Unity Catalog.
  • Expertise in Delta Lake internals, file compaction, and schema enforcement.
  • Advanced PySpark / SQL skills for ETL and transformations.
  • AWS Native Integration: deep experience with AWS Glue, S3, Redshift Spectrum, Lambda, and Athena; IAM and VPC configuration knowledge for secure cloud integrations.
  • Data Warehousing & Modeling: strong grasp of modern dimensional modeling (star / snowflake schemas) and experience setting up lakehouse design patterns for mixed workloads.
  • Automation & DevOps: familiarity with CI / CD for data engineering using tools like DBX, Terraform, GitHub Actions, or Azure DevOps; proficient in monitoring tools like CloudWatch, Datadog, or New Relic for data pipelines.
  • Bonus / Nice to Have: experience supporting gaming or real‑time analytics workloads.
  • Familiarity with Airflow, Kafka, or EventBridge.
  • Exposure to data privacy and compliance practices (GDPR, CCPA).
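As a rough illustration of the dimensional-modeling requirement above, here is a minimal star schema (one fact table joined to dimension tables) built in SQLite. SQLite stands in for Spark SQL / Delta tables purely for the sketch; all table and column names are invented, and a Databricks lakehouse would express the same pattern over Delta tables.

```python
import sqlite3

# A minimal star schema: one fact table referencing two dimension tables.
# SQLite stands in for Spark SQL / Delta here; all names are illustrative.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_player (player_key INTEGER PRIMARY KEY, region TEXT);
    CREATE TABLE dim_date   (date_key INTEGER PRIMARY KEY, day TEXT);
    CREATE TABLE fact_session (
        player_key INTEGER REFERENCES dim_player(player_key),
        date_key   INTEGER REFERENCES dim_date(date_key),
        minutes_played REAL
    );
    INSERT INTO dim_player VALUES (1, 'LATAM'), (2, 'NA');
    INSERT INTO dim_date VALUES (10, '2024-01-01');
    INSERT INTO fact_session VALUES (1, 10, 42.0), (2, 10, 13.5), (1, 10, 8.0);
""")

# Typical warehouse query: aggregate the fact, sliced by a dimension attribute.
rows = conn.execute("""
    SELECT p.region, SUM(f.minutes_played)
    FROM fact_session f
    JOIN dim_player p ON p.player_key = f.player_key
    GROUP BY p.region
    ORDER BY p.region
""").fetchall()
```

The same shape scales to the snowflake variant by normalizing dimension attributes (e.g. splitting `region` into its own table) when dimensions grow large.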
Other Details
  • Location: Remote, Latin America (LATAM) region or USA.
  • Length: 1+ Year.
  • Client: Gaming giant.