
Data Architect

CriticalRiver Inc.

Remote

BRL 370,000 - 529,000

Full-time

Posted 5 days ago

Job Summary

A leading data solutions provider is seeking a Data Architect for a remote role supporting PST hours. The successful candidate will develop and maintain a comprehensive data architecture, create data models, enforce governance standards, and lead the implementation of scalable data solutions. Strong expertise in modern data modeling and cloud technologies such as Snowflake and AWS is essential, coupled with solid skills in SQL and Python. This role is ideal for a proactive leader in data architecture.

Qualifications

  • Expertise with dbt modelling approaches and Snowflake-optimized schemas.
  • Strong knowledge of virtual warehouses, scaling, clustering, and storage optimization.
  • Ability to design and manage complex Airflow DAGs.
  • Hands-on experience with S3 and IAM for secure and efficient data access (a minimal sketch follows this list).
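
To make the S3/IAM item above concrete, here is a minimal sketch, assuming the boto3 package and IAM credentials that already grant read access to the bucket; the bucket and object key are hypothetical examples, not details from this posting.

```python
# Minimal S3 read sketch. Assumes boto3 is installed and that the ambient IAM
# role or credentials allow s3:GetObject on the bucket; names are hypothetical.
import boto3

s3 = boto3.client("s3")  # credentials are resolved from the IAM role/environment
response = s3.get_object(
    Bucket="example-data-lake",           # hypothetical bucket name
    Key="raw/orders/2024-01-01.parquet",  # hypothetical object key
)
payload = response["Body"].read()         # raw bytes of the object
print(f"Fetched {len(payload)} bytes from S3")
```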

Responsibilities

  • Develop and maintain the overall data architecture aligned with organizational strategy.
  • Create conceptual, logical, and physical data models.
  • Partner with stakeholders to translate requirements into scalable data solutions.
  • Define and enforce data governance and security standards.
  • Oversee the delivery of data pipelines and extensive data systems.

Skills

  • Modern Data Modelling
  • Snowflake Architecture
  • Airflow Orchestration
  • Advanced SQL & Python
  • AWS Infrastructure
  • ELT Strategy

Job Description

Job Title: Data Architect
Duration: 6 months

Location: LATAM [Remote]

Must be available to work PST hours.

Description

What You'll Do

Design & Strategy: Develop and maintain the overall data architecture aligned with organizational strategy.

Blueprint Creation: Create conceptual, logical, and physical data models to support data storage, processing, and consumption needs.
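
As a hedged illustration of what a physical star-schema model might look like on Snowflake, the sketch below creates one fact table and two dimensions through snowflake-connector-python; the account settings, warehouse, database, schema, and table names are all illustrative assumptions rather than details from this posting.

```python
# Sketch of a physical star schema in Snowflake. Assumes the
# snowflake-connector-python package and credentials supplied via environment
# variables; all object names (ANALYTICS_WH, ANALYTICS, MARTS, DIM_*, FCT_*)
# are hypothetical.
import os

import snowflake.connector

DDL_STATEMENTS = [
    """CREATE TABLE IF NOT EXISTS DIM_CUSTOMER (
           CUSTOMER_KEY NUMBER AUTOINCREMENT PRIMARY KEY,
           CUSTOMER_ID  STRING,
           REGION       STRING
       )""",
    """CREATE TABLE IF NOT EXISTS DIM_DATE (
           DATE_KEY  NUMBER PRIMARY KEY,
           FULL_DATE DATE
       )""",
    """CREATE TABLE IF NOT EXISTS FCT_ORDERS (
           ORDER_KEY    NUMBER AUTOINCREMENT PRIMARY KEY,
           CUSTOMER_KEY NUMBER REFERENCES DIM_CUSTOMER (CUSTOMER_KEY),
           DATE_KEY     NUMBER REFERENCES DIM_DATE (DATE_KEY),
           ORDER_TOTAL  NUMBER(12, 2)
       )""",
]

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ANALYTICS_WH",  # hypothetical virtual warehouse
    database="ANALYTICS",      # hypothetical database
    schema="MARTS",            # hypothetical schema
)
try:
    cur = conn.cursor()
    for statement in DDL_STATEMENTS:
        cur.execute(statement)  # create the dimension and fact tables
finally:
    conn.close()
```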

Cross-Functional Collaboration: Partner with business and technical stakeholders to translate requirements into scalable data solutions.

Governance & Security: Define and enforce data governance, data quality, and security standards across all data systems.

Technology Selection: Evaluate and select cloud platforms, storage solutions, and data processing tools appropriate for enterprise needs.

Implementation Leadership: Oversee the delivery of data pipelines, data warehouses, data lakes, and APIs.

Performance Optimization: Ensure high performance, scalability, and reliability across all data systems.

Key Skills
  • Modern Data Modelling: Expertise with dbt modelling approaches and Snowflake-optimized schemas (e.g., Star Schema, Data Vault).
  • Snowflake Architecture: Strong knowledge of virtual warehouses, scaling, clustering, and storage optimization.
  • Airflow Orchestration: Ability to design and manage complex Airflow DAGs (see the sketch after this list).
  • Advanced SQL & Python: Expertise in transforming, processing, and orchestrating data.
  • AWS Infrastructure: Hands-on experience with S3, IAM, and Private Link for secure and efficient data access.
  • ELT Strategy: Skilled in building ELT pipelines using tools like Fivetran and dbt.
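
As a hedged illustration of the Airflow and ELT items above, here is a minimal DAG sketch that builds and then tests a dbt project on a daily schedule; the DAG id, schedule, and project path are assumptions for illustration only, and extraction/loading (e.g., a Fivetran sync) is assumed to happen upstream.

```python
# Minimal Airflow DAG sketch. Assumes Airflow 2.x with the bundled BashOperator
# and a dbt project already deployed on the worker; the DAG id, schedule, and
# /opt/dbt/analytics path are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="elt_daily_refresh",      # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Transform step: rebuild the dbt models.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/analytics",
    )

    # Validation step: run dbt tests against the freshly built models.
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/analytics",
    )

    dbt_run >> dbt_test  # only test after the models have been rebuilt
```
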
Preferred Skills
  • Active Metadata & Governance (Atlan)
  • BI Semantic Layering
  • Data Mesh & Data Contracts
  • FinOps & Cost Governance
  • CI/CD for Data Pipelines