Data Architect

CriticalRiver Inc.

Remote

BRL 430,000 - 646,000

Part-time

Posted yesterday

Job summary

A global tech company is seeking a Data Architect to design and maintain its data architecture, collaborate with stakeholders, and oversee data pipeline deliveries. The role requires expertise in modern data modelling, Snowflake architecture, and AWS infrastructure, along with strong SQL and Python skills. The position is fully remote from LATAM and offers the opportunity to drive strategic data solutions across the organization.

Qualifications

  • Expertise with dbt modelling approaches and Snowflake-optimized schemas.
  • Strong knowledge of virtual warehouses, scaling, clustering, and storage optimization.
  • Ability to design and manage complex Airflow DAGs.
  • Expertise in transforming, processing, and orchestrating data.
  • Hands-on experience with AWS S3, IAM, and PrivateLink.

Responsibilities

  • Develop and maintain the overall data architecture aligned with organizational strategy.
  • Create conceptual, logical, and physical data models to support data needs.
  • Partner with business and technical stakeholders to translate requirements into scalable data solutions.
  • Define and enforce data governance, data quality, and security standards.
  • Evaluate and select cloud platforms and data processing tools.

Skills

Modern Data Modelling
Snowflake Architecture
Airflow Orchestration
Advanced SQL & Python
AWS Infrastructure
ELT Strategy

Job description
Job Title: Data Architect
Duration: 6 months
Location: LATAM (Remote)

The role must support PST (Pacific Time) working hours.

Description
What You’ll Do

Design & Strategy: Develop and maintain the overall data architecture aligned with organizational strategy.

Blueprint Creation: Create conceptual, logical, and physical data models to support data storage, processing, and consumption needs.

Cross‑Functional Collaboration: Partner with business and technical stakeholders to translate requirements into scalable data solutions.

Governance & Security: Define and enforce data governance, data quality, and security standards across all data systems.

Technology Selection: Evaluate and select cloud platforms, storage solutions, and data processing tools appropriate for enterprise needs.

Implementation Leadership: Oversee the delivery of data pipelines, data warehouses, data lakes, and APIs.

Performance Optimization: Ensure high performance, scalability, and reliability across all data systems.

Key Skills
  • Modern Data Modelling: Expertise with dbt modelling approaches and Snowflake‑optimized schemas (e.g., Star Schema, Data Vault).
  • Snowflake Architecture: Strong knowledge of virtual warehouses, scaling, clustering, and storage optimization.
  • Airflow Orchestration: Ability to design and manage complex Airflow DAGs.
  • Advanced SQL & Python: Expertise in transforming, processing, and orchestrating data.
  • AWS Infrastructure: Hands‑on experience with S3, IAM, and PrivateLink for secure and efficient data access.
  • ELT Strategy: Skilled in building ELT pipelines using tools like Fivetran and dbt (see the sketch after this list).
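
In practice, the Airflow and ELT skills listed above often come together as a single DAG that schedules dbt transformations over raw data landed by a loader such as Fivetran. The sketch below is illustrative only and assumes Airflow 2.4+ with the dbt CLI available on the worker; the DAG id, schedule, project path, and profile target are hypothetical placeholders.

```python
# Minimal ELT orchestration sketch (assumes Airflow 2.4+ and dbt installed
# on the worker). DAG id, schedule, paths, and target are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="elt_daily_refresh",        # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="0 6 * * *",              # daily, after upstream loads land
    catchup=False,
    tags=["elt", "dbt", "snowflake"],
) as dag:
    # Build the dbt models on top of the raw tables loaded by Fivetran.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/analytics --target prod",
    )

    # Run dbt tests so bad data is caught before downstream consumption.
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/analytics --target prod",
    )

    dbt_run >> dbt_test
```

Real pipelines usually split the dbt step per layer (staging, marts) and add sensors on the load jobs, but the run-then-test ordering shown here is the core pattern.
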
Preferred Skills
  • Active Metadata & Governance (Atlan)
  • BI Semantic Layering
  • Data Mesh & Data Contracts
  • FinOps & Cost Governance
  • CI/CD for Data Pipelines