Data Architect

CriticalRiver Inc.

Remote

BRL 484,000 - 646,000

Full-time

Posted 2 days ago

Job summary

A data solutions firm is seeking a Data Architect to develop and maintain a data architecture aligned with organizational goals. Responsibilities include creating data models, partnering with stakeholders on scalable solutions, and overseeing data governance and security standards. Ideal candidates have expertise in dbt, Snowflake architecture, and advanced SQL and Python. This remote role requires supporting PST hours, with a focus on high performance and reliability across data systems.

Qualifications

  • Expertise with dbt modelling approaches and Snowflake‑optimized schemas.
  • Strong knowledge of virtual warehouses, scaling, and storage optimization.
  • Ability to design and manage complex Airflow DAGs.

Responsibilities

  • Develop and maintain the overall data architecture aligned with organizational strategy.
  • Create conceptual, logical, and physical data models.
  • Partner with stakeholders to translate requirements into data solutions.

Skills

Modern Data Modelling
Snowflake Architecture
Airflow Orchestration
Advanced SQL & Python
AWS Infrastructure
ELT Strategy

Job description

Job Title: Data Architect

Duration: 6 months

Location: LATAM [Remote]

Must support PST working hours

Description:
What You’ll Do
  • Design & Strategy: Develop and maintain the overall data architecture aligned with organizational strategy.
  • Blueprint Creation: Create conceptual, logical, and physical data models to support data storage, processing, and consumption needs.
  • Cross‑Functional Collaboration: Partner with business and technical stakeholders to translate requirements into scalable data solutions.
  • Governance & Security: Define and enforce data governance, data quality, and security standards across all data systems.
  • Technology Selection: Evaluate and select cloud platforms, storage solutions, and data processing tools appropriate for enterprise needs.
  • Implementation Leadership: Oversee the delivery of data pipelines, data warehouses, data lakes, and APIs.
  • Performance Optimization: Ensure high performance, scalability, and reliability across all data systems.
Key Skills
  • Modern Data Modelling: Expertise with dbt modelling approaches and Snowflake‑optimized schemas (e.g., Star Schema, Data Vault).
  • Snowflake Architecture: Strong knowledge of virtual warehouses, scaling, clustering, and storage optimization.
  • Airflow Orchestration: Ability to design and manage complex Airflow DAGs.
  • Advanced SQL & Python: Expertise in transforming, processing, and orchestrating data.
  • AWS Infrastructure: Hands‑on experience with S3, IAM, and Private Link for secure and efficient data access.
  • ELT Strategy: Skilled in building ELT pipelines using tools like Fivetran and dbt.
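To illustrate the kind of orchestration work described above, here is a minimal sketch of an Airflow DAG that runs a dbt transformation step after a raw ELT load. The DAG name, schedule, and dbt project path are illustrative assumptions, not details taken from this posting.

  # Minimal, assumed sketch: an Airflow 2.x DAG chaining a raw load step
  # and a dbt run against the warehouse. Names and paths are hypothetical.
  from datetime import datetime

  from airflow import DAG
  from airflow.operators.bash import BashOperator

  with DAG(
      dag_id="daily_elt_and_dbt",        # hypothetical DAG name
      start_date=datetime(2024, 1, 1),
      schedule="@daily",                 # Airflow 2.4+ 'schedule' argument
      catchup=False,
      tags=["elt", "dbt", "snowflake"],
  ) as dag:
      # Stand-in for the raw load step (e.g. a triggered Fivetran sync).
      load_raw = BashOperator(
          task_id="load_raw_data",
          bash_command="echo 'raw load complete'",
      )

      # Run dbt models once loading has finished.
      run_dbt = BashOperator(
          task_id="run_dbt_models",
          bash_command="dbt run --project-dir /opt/dbt/analytics",  # assumed path
      )

      load_raw >> run_dbt
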
Preferred Skills
  • Active Metadata & Governance (Atlan)
  • BI Semantic Layering
  • Data Mesh & Data ContractsFinOps & Cost Governance
  • CI/CD for Data Pipelines