
Data Architect

CriticalRiver Inc.

Remote

BRL 376,000 - 539,000

Full-time

4 days ago

Job summary

A technology consulting firm is seeking an experienced Data Architect to develop a comprehensive data architecture remotely. The ideal candidate will have expertise in data modeling, Snowflake architecture, and skills in SQL and Python. Responsibilities include collaborating with stakeholders to create scalable data solutions and maintaining data governance standards. This is an exciting opportunity for those looking to lead data strategy in a dynamic environment.

Qualifications

  • Strong knowledge of dbt modelling and Snowflake-optimized schemas.
  • Expertise in designing complex Airflow DAGs.
  • Experience with AWS S3, IAM, and Private Link.

Responsibilities

  • Develop and maintain the overall data architecture aligned with business strategies.
  • Create conceptual and physical data models.
  • Partner with stakeholders for scalable data solutions.
  • Define and enforce data governance standards.

Skills

Modern Data Modelling
Snowflake Architecture
Airflow Orchestration
Advanced SQL
Python
AWS Infrastructure
ELT Strategy

Job description

Job Title: Data Architect
Duration: 6 months

Location: LATAM [Remote]

Must be able to support PST (Pacific Time) working hours.

What You’ll Do
  • Design & Strategy: Develop and maintain the overall data architecture aligned with organizational strategy.
  • Blueprint Creation: Create conceptual, logical, and physical data models to support data storage, processing, and consumption needs.
  • Cross‑Functional Collaboration: Partner with business and technical stakeholders to translate requirements into scalable data solutions.
  • Governance & Security: Define and enforce data governance, data quality, and security standards across all data systems.
  • Technology Selection: Evaluate and select cloud platforms, storage solutions, and data processing tools appropriate for enterprise needs.
  • Implementation Leadership: Oversee the delivery of data pipelines, data warehouses, data lakes, and APIs.
  • Performance Optimization: Ensure high performance, scalability, and reliability across all data systems.
Key Skills
  • Modern Data Modelling: Expertise with dbt modelling approaches and Snowflake‑optimized schemas (e.g., Star Schema, Data Vault).
  • Snowflake Architecture: Strong knowledge of virtual warehouses, scaling, clustering, and storage optimization.
  • Airflow Orchestration: Ability to design and manage complex Airflow DAGs.
  • Advanced SQL & Python: Expertise in transforming, processing, and orchestrating data.
  • AWS Infrastructure: Hands‑on experience with S3, IAM, and Private Link for secure and efficient data access.
  • ELT Strategy: Skilled in building ELT pipelines using tools like Fivetran and dbt.
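For illustration only (not part of the posting), the sketch below shows the kind of Airflow DAG and dbt-driven ELT step these skills describe: a daily pipeline that triggers a raw-data load and then runs dbt transformations. The DAG id, task names, dbt selector, and the Fivetran placeholder command are hypothetical, and Airflow 2.x with the bash operator is assumed.

```python
# Minimal sketch of a daily ELT pipeline; all names are illustrative,
# not taken from the job posting. Assumes Airflow 2.x.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="elt_orders_daily",        # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Load step: trigger the managed extract/load (e.g. a Fivetran sync),
    # shown here only as a placeholder shell command.
    load_raw = BashOperator(
        task_id="load_raw",
        bash_command="echo 'trigger Fivetran sync for raw sources'",
    )

    # Transform step: run dbt models against the warehouse (e.g. Snowflake).
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --select marts",
    )

    # Load before transform: the core ELT ordering.
    load_raw >> dbt_run
```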
Preferred Skills
  • Active Metadata & Governance (Atlan)
  • BI Semantic Layering
  • Data Mesh & Data Contracts
  • FinOps & Cost Governance
  • CI/CD for Data Pipelines