
AWS Developer (Data Lake)

Meta

Teresina

Remote

BRL 323,000 - 486,000

Full-time

Posted yesterday

Job Summary

A leading tech company is seeking a Senior AWS Developer to design and implement data solutions for a financial institution. Candidates should have strong AWS and data expertise, along with proficiency in Python or Spark scripting. The role involves building data pipelines and implementing Infrastructure as Code while collaborating in a high-performing team. This position is remote from Brazil and is best suited to candidates with banking data experience and fluent English.

Qualifications

  • Strong expertise in AWS ecosystem, specifically data services.
  • Experience with data engineering tasks using Python or Spark.
  • Knowledge of Infrastructure as Code using Terraform.

Responsibilities

  • Design and maintain data pipelines within the AWS ecosystem.
  • Build data lake components using Airflow and other tools.
  • Implement Infrastructure as Code for AWS service deployments.
  • Collaborate in a high-performing engineering team.

Skills

  • AWS expertise in data and analytics services
  • Python scripting
  • Spark scripting
  • Terraform
  • SQL
  • Performance focus
  • Fluent English

Job Description

About the Role

We are seeking a Senior AWS Developer to support the design and implementation of a large-scale data lake and analytics platform within a leading financial institution. The ideal candidate will demonstrate strong technical expertise across the AWS ecosystem and the ability to deliver complex solutions independently while collaborating effectively within a high-performing engineering team.

Key Responsibilities
  • Design, develop, and maintain data pipelines and workflows within the AWS ecosystem.
  • Build and optimize components for the data lake using Airflow, Glue, Redshift, S3, Athena, and Iceberg.
  • Write efficient and maintainable Python or Spark scripts for data ingestion, transformation, and automation.
  • Implement Infrastructure as Code using Terraform to deploy and manage AWS services.
  • Collaborate with cross-functional team members to ensure high-quality, performant, and reliable data delivery.
  • Contribute to code reviews, testing improvements, and performance optimization.
  • Participate in solution design discussions, bringing forward ideas and accepting feedback in a collaborative, team-first manner.
Required Qualifications
  • Strong AWS expertise, especially in data and analytics services (Airflow, Glue, Redshift, S3, Athena, Iceberg).
  • Proficient in Python or Spark scripting for automation and data engineering tasks.
  • Hands-on experience with Terraform and Infrastructure as Code principles.
  • Solid understanding of data modeling, analytics platforms, and SQL.
  • Experience working in financial services and familiarity with banking data terminology.
  • Demonstrated ability to work both independently and collaboratively in a team of senior developers.
  • Strong focus on performance, code quality, and continuous improvement.
  • Takes initiative to identify opportunities for improvement in testing, performance, and architecture.
  • Communicates constructively and is open to giving and receiving feedback.
  • Balances ownership and teamwork, understanding that success is a collective effort.
  • Fluent English.
Additional Information
  • Work hours aligned to U.S. Eastern Time (EST).
  • Remote from Brazil.