
Data Architect with AWS & Snowflake experience

Ampstek

Remote

BRL 538.000 - 700.000

Full-time

2 days ago

Job summary

A leading data solutions company is seeking an experienced Data Architect with AWS and Snowflake expertise. This remote role involves providing technical leadership and mentoring data engineers, designing end-to-end data pipelines, and optimizing performance for large-scale systems. Ideal candidates should have 8+ years in data engineering, strong Python skills, and experience with orchestration tools like Apache Airflow. Additional experience with Kafka and data lakes using Apache Iceberg is preferred. Join a collaborative team focused on innovative data solutions.

Qualifications

  • 8+ years of experience in data engineering with proven leadership.
  • Strong programming skills in Python for ETL pipeline optimization.
  • Experience with Snowflake for schema design and query optimization.
  • Expertise in orchestration with Apache Airflow for complex workflows.
  • Deep understanding of messaging queues for real-time ingestion.
  • Hands-on experience with Apache Iceberg for data lake optimization.
  • Experience with Docker and Kubernetes for distributed systems.
  • Strong understanding of CI/CD pipelines and version control.
  • Proven experience in Agile development environments.

Responsibilities

  • Provide technical leadership and mentorship to data engineers.
  • Architect and optimize end-to-end data pipelines.
  • Design scalable workflows using orchestration tools like Apache Airflow.
  • Lead the optimization of Snowflake and data lake technologies.
  • Build robust systems for real-time and batch processing.
  • Collaborate with Data Science, Product, and Engineering teams.
  • Establish data governance frameworks for compliance.
  • Develop strategies to optimize performance for large data processing.
  • Foster a collaborative team environment for continuous learning.
  • Stay ahead of industry trends in tools and technologies.

Skills

Data Engineering
Python
Snowflake
Apache Airflow
Kafka
Apache Iceberg
Docker
Kubernetes
CI/CD
Cloud Platforms (AWS, Azure, GCP)

Job description

Title: Data Architect with AWS & Snowflake experience

Location: 100% Remote

Job Type: Contract

Responsibilities
  • Technical Leadership: Provide technical direction and mentorship to a team of data engineers, ensuring best practices in coding, architecture, and data operations.
  • End-to-End Ownership: Architect, implement, and optimize end-to-end data pipelines that process and transform large-scale datasets efficiently and reliably.
  • Orchestration and Automation: Design scalable workflows using orchestration tools such as Apache Airflow, ensuring high availability and fault tolerance.
  • Data Warehouse and Lake Optimization: Lead the implementation and optimization of Snowflake and data lake technologies like Apache Iceberg for storage, query performance, and scalability.
  • Real-Time and Batch Processing: Build robust systems leveraging Kafka, SQS, or similar messaging technologies for real-time and batch data processing.
  • Cross-Functional Collaboration: Work closely with Data Science, Product, and Engineering teams to define data requirements and deliver actionable insights.
  • Data Governance and Security: Establish and enforce data governance frameworks, ensuring compliance with regulatory standards and maintaining data integrity.
  • Scalability and Performance: Develop strategies to optimize performance for systems processing terabytes of data daily while ensuring scalability.
  • Team Building: Foster a collaborative team environment, driving skill development, career growth, and continuous learning within the team.
  • Innovation and Continuous Improvement: Stay ahead of industry trends to evaluate and incorporate new tools, technologies, and methodologies into the organization.
Qualifications
Required Skills:
  • 8+ years of experience in data engineering with a proven track record of leading data projects or teams.
  • Strong programming skills in Python, with expertise in building and optimizing ETL pipelines.
  • Extensive experience with Snowflake or equivalent data warehouses for designing schemas, optimizing queries, and managing large datasets.
  • Expertise in orchestration tools like Apache Airflow, with experience in building and managing complex workflows.
  • Deep understanding of messaging queues such as Kafka, AWS SQS, or similar technologies for real-time data ingestion and processing.
  • Demonstrated ability to architect and implement scalable data solutions handling terabytes of data.
  • Hands‑on experience with Apache Iceberg for managing and optimizing data lakes.
  • Proficiency in containerization and orchestration tools like Docker and Kubernetes for deploying and managing distributed systems.
  • Strong understanding of CI/CD pipelines, including version control, deployment strategies, and automated testing.
  • Proven experience working in an Agile development environment and managing cross‑functional team interactions.
  • Strong background in data modeling, data governance, and ensuring compliance with data security standards.
  • Experience working with cloud platforms like AWS, Azure, or GCP.
Preferred Skills:
  • Proficiency in stream processing frameworks such as Apache Flink for real-time analytics.
  • Familiarity with programming languages like Scala or Java for additional engineering tasks.
  • Exposure to integrating data pipelines with machine learning workflows.
  • Strong analytical skills to evaluate new technologies and tools for scalability and performance.
Leadership Skills:
  • Proven ability to lead and mentor data engineering teams, promoting collaboration and a culture of excellence.
  • Exceptional communication and interpersonal skills to articulate complex technical concepts to stakeholders.
  • Strategic thinking to align data engineering efforts with business goals and objectives.

Thanks

Aatmesh

aatmesh.singh@ampstek.com
