Senior Data Engineer

Workana

São Paulo

Remote

BRL 80,000 - 120,000

Full-time

16 days ago

Job Summary

A leading remote work platform is seeking a Senior Data Engineer to design and maintain data pipelines. The ideal candidate has over 5 years of experience, strong skills in Python, Airflow, and AWS, and excels in collaboration. This fully remote position offers competitive compensation and growth opportunities with international clients in a dynamic environment.

Benefits

  • Competitive compensation
  • Fully remote work
  • Career growth opportunities

Qualifications

  • 5+ years of experience in Data Engineering.
  • Strong hands-on experience with Airflow.
  • Solid expertise with AWS services.

Responsibilities

  • Design, build, and maintain ETL pipelines using Airflow and AWS-native services.
  • Collaborate with data scientists, analysts, and business teams.
  • Implement best practices for data governance and quality.

Skills

Python
Airflow
AWS
SQL
Data Engineering
Collaboration

Job Description

Workana is the largest remote work platform for talent in Latin America. Our new segment, Workana Premium, focuses on matching the most exceptional professionals with leading and innovative companies around the globe. Enjoy competitive compensation, dedicated support, and the flexibility of remote work within a dynamic environment that fosters collaboration and professional advancement.

We are proud to present you with the following opportunity.

About Our Client

Our client is a fast-growing technology-driven company working on advanced data solutions to support analytics and digital transformation initiatives. They are building scalable, secure data platforms and pipelines that power decision-making across the organization.

Role Overview

As a Senior Data Engineer, you will be responsible for designing, developing, and maintaining robust data pipelines and infrastructure. You will work closely with cross-functional teams to ensure high-quality, reliable, and scalable data solutions. This role requires strong expertise in Python, Airflow, and AWS, with an opportunity to contribute to DevOps practices.
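
To make the day-to-day work concrete, here is a minimal sketch of the kind of daily ETL pipeline this role centers on, using Airflow's TaskFlow API (Airflow 2.x). The DAG name, S3 paths, and task bodies are hypothetical placeholders, not the client's actual pipeline:

    # Minimal sketch of a daily extract-transform-load DAG.
    # All names and paths below are illustrative placeholders.
    from datetime import datetime

    from airflow.decorators import dag, task

    @dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False, tags=["etl"])
    def daily_sales_etl():
        @task
        def extract() -> str:
            # A real task might pull raw files from S3 with boto3 or an Airflow S3 hook.
            return "s3://example-bucket/raw/sales/"  # hypothetical location

        @task
        def transform(raw_path: str) -> str:
            # Clean and reshape the raw data, e.g. with pandas or a Spark job on EMR.
            return raw_path.replace("/raw/", "/curated/")

        @task
        def load(curated_path: str) -> None:
            # Load curated data into the warehouse, e.g. via a Redshift COPY command.
            print(f"Loading {curated_path} into the warehouse")

        load(transform(extract()))

    daily_sales_etl()

In production, each task would be idempotent and parameterized by the run date so that retries and backfills are safe.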

Responsibilities
  • Design, build, and maintain ETL pipelines using Airflow and AWS-native services.
  • Develop scalable data solutions and integrate data from diverse sources into the enterprise data platform.
  • Ensure data availability, reliability, and performance across production systems.
  • Collaborate with data scientists, analysts, and business teams to enable advanced analytics and insights.
  • Implement best practices for data governance, quality, and security.
  • Contribute to process automation and infrastructure improvements.
Requirements
  • 5+ years of experience in Data Engineering.
  • Strong hands-on experience with Airflow for orchestration and workflow management.
  • Solid expertise with AWS services (S3, Redshift, Glue, Lambda, EMR, etc.).
  • Proficiency in Python for data processing and automation.
  • Strong SQL skills and familiarity with relational and NoSQL databases.
  • Experience building and optimizing large-scale data pipelines.
  • Ability to analyze complex datasets and troubleshoot data issues.
  • Excellent communication and collaboration skills.
Nice to Have
  • Experience with DevOps practices (CI/CD, Terraform, Docker, Kubernetes).
  • Exposure to data streaming technologies (Kafka, Kinesis).
  • Familiarity with data science workflows and machine learning integration.
Benefits
  • Compensation in USD
  • Fully remote work in Latin America
  • Career growth opportunities with international clients and dynamic projects