
Data Architect with AWS & Snowflake Experience

Ampstek

Remote

BRL 160,000 - 200,000

Full-time

Posted today

Offer summary

A technology solutions provider is seeking an experienced Data Architect to lead technical direction and mentorship for data engineers. You will architect and optimize data pipelines, design scalable workflows, and implement solutions with technologies like Snowflake and Apache Airflow. The ideal candidate has over 8 years of experience in data engineering, robust programming skills in Python, and strong familiarity with cloud platforms. This 100% remote position requires leadership ability and expertise in data governance.

Qualifications

  • 8+ years of experience in data engineering.
  • Strong programming skills in Python.
  • Extensive experience with Snowflake or equivalent.
  • Expertise in building workflows with Apache Airflow.
  • Deep understanding of messaging queues.
  • Ability to architect scalable data solutions handling terabytes of data.
  • Hands-on experience with Apache Iceberg.
  • Proficiency in Docker and Kubernetes.
  • Strong understanding of CI/CD pipelines.
  • Experience in Agile development.
  • Background in data modeling and governance.

Responsibilities

  • Provide technical direction and mentorship to data engineers.
  • Architect and optimize end-to-end data pipelines.
  • Design scalable workflows using orchestration tools.
  • Lead implementation and optimization of Snowflake.
  • Build systems leveraging Kafka for real-time processing.
  • Work closely with cross-functional teams for data requirements.
  • Establish data governance frameworks.
  • Develop strategies to optimize performance for large data sets.
  • Foster a collaborative team environment.
  • Stay ahead of industry trends to evaluate and incorporate new tools and technologies.

Skills

Data engineering
Python
Snowflake
Apache Airflow
Kafka
Data governance
AWS
Docker
Kubernetes
CI/CD pipelines

Tools

Apache Iceberg
ETL pipelines
Agile

Job description
Data Architect with AWS & Snowflake experience

Location: 100% Remote

Job Type: Contract

Responsibilities
  • Technical Leadership: Provide technical direction and mentorship to a team of data engineers, ensuring best practices in coding, architecture, and data operations.
  • End-to-End Ownership: Architect, implement, and optimize end-to-end data pipelines that process and transform large-scale datasets efficiently and reliably.
  • Orchestration and Automation: Design scalable workflows using orchestration tools such as Apache Airflow, ensuring high availability and fault tolerance.
  • Data Warehouse and Lake Optimization: Lead the implementation and optimization of Snowflake and data lake technologies like Apache Iceberg for storage, query performance, and scalability.
  • Real-Time and Batch Processing: Build robust systems leveraging Kafka, SQS, or similar messaging technologies for real-time and batch data processing.
  • Cross-Functional Collaboration: Work closely with Data Science, Product, and Engineering teams to define data requirements and deliver actionable insights.
  • Data Governance and Security: Establish and enforce data governance frameworks, ensuring compliance with regulatory standards and maintaining data integrity.
  • Scalability and Performance: Develop strategies to optimize performance for systems processing terabytes of data daily while ensuring scalability.
  • Team Building: Foster a collaborative team environment, driving skill development, career growth, and continuous learning within the team.
  • Innovation and Continuous Improvement: Stay ahead of industry trends to evaluate and incorporate new tools, technologies, and methodologies into the organization.

Qualifications

Required Skills:

  • 8+ years of experience in data engineering with a proven track record of leading data projects or teams.
  • Strong programming skills in Python, with expertise in building and optimizing ETL pipelines.
  • Extensive experience with Snowflake or equivalent data warehouses for designing schemas, optimizing queries, and managing large datasets.
  • Expertise in orchestration tools like Apache Airflow, with experience in building and managing complex workflows.
  • Deep understanding of messaging queues such as Kafka, AWS SQS, or similar technologies for real-time data ingestion and processing.
  • Demonstrated ability to architect and implement scalable data solutions handling terabytes of data.
  • Hands-on experience with Apache Iceberg for managing and optimizing data lakes.
  • Proficiency in containerization and orchestration tools like Docker and Kubernetes for deploying and managing distributed systems.
  • Strong understanding of CI/CD pipelines, including version control, deployment strategies, and automated testing.
  • Proven experience working in an Agile development environment and managing cross-functional team interactions.
  • Strong background in data modeling, data governance, and ensuring compliance with data security standards.
  • Experience working with cloud platforms like AWS, Azure, or GCP.

Preferred Skills:

  • Proficiency in stream processing frameworks such as Apache Flink for real-time analytics.
  • Familiarity with programming languages like Scala or Java for additional engineering tasks.
  • Exposure to integrating data pipelines with machine learning workflows.
  • Strong analytical skills to evaluate new technologies and tools for scalability and performance.

Leadership Skills:

  • Proven ability to lead and mentor data engineering teams, promoting collaboration and a culture of excellence.
  • Exceptional communication and interpersonal skills to articulate complex technical concepts to stakeholders.
  • Strategic thinking to align data engineering efforts with business goals and objectives.

Thanks

Aatmesh

aatmesh.singh@ampstek.com
