AWS Developer (Data Lake)

Meta

Londrina

Remote

BRL 431,000 - 594,000

Full-time

Today

Job Summary

A leading technology company is seeking a Senior AWS Developer to support the design and implementation of a large-scale data lake for a financial institution. The ideal candidate will have strong AWS expertise, proficiency in Python or Spark scripting, and hands-on experience with Terraform. This remote position offers collaboration with a high-performing engineering team, with working hours aligned to U.S. Eastern Time (EST).

Qualifications

  • Strong AWS expertise, especially in data and analytics services.
  • Proficient in Python or Spark scripting for automation.
  • Hands-on experience with Terraform and Infrastructure as Code.
  • Solid understanding of data modeling and SQL.
  • Experience in financial services is a plus.

Responsibilities

  • Design, develop, and maintain data pipelines within the AWS ecosystem.
  • Build and optimize components for the data lake using various AWS services.
  • Write efficient Python or Spark scripts for data ingestion.
  • Implement Infrastructure as Code using Terraform.
  • Collaborate with cross-functional team members.

Skills

  • AWS expertise
  • Python scripting
  • Terraform
  • Data modeling
  • SQL
  • Spark scripting
  • Collaboration
  • Fluent English

Job Description
About the Role

We are seeking a Senior AWS Developer to support the design and implementation of a large-scale data lake and analytics platform within a leading financial institution. The ideal candidate will demonstrate strong technical expertise across the AWS ecosystem and the ability to deliver complex solutions independently while collaborating effectively within a high-performing engineering team.

Key Responsibilities
  • Design, develop, and maintain data pipelines and workflows within the AWS ecosystem.
  • Build and optimize components for the data lake using Airflow, Glue, Redshift, S3, Athena, and Iceberg.
  • Write efficient and maintainable Python or Spark scripts for data ingestion, transformation, and automation.
  • Implement Infrastructure as Code using Terraform to deploy and manage AWS services.
  • Collaborate with cross-functional team members to ensure high-quality, performant, and reliable data delivery.
  • Contribute to code reviews, testing improvements, and performance optimization.
  • Participate in solution design discussions, bringing forward ideas and accepting feedback in a collaborative, team-first manner.
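To illustrate the kind of Python scripting these responsibilities describe, here is a minimal, self-contained sketch of a record-normalization step for data ingestion. All function and field names here are hypothetical examples, not details from this posting; a production pipeline for this role would typically run such logic as PySpark or Glue jobs against S3 rather than over in-memory lists.

```python
from datetime import datetime, timezone

def normalize_records(raw_records):
    """Drop malformed rows and standardize field names and types
    before the records are written to the data lake.
    (Illustrative sketch; field names are assumptions.)"""
    cleaned = []
    for rec in raw_records:
        # Skip rows missing required fields instead of failing the batch.
        if "id" not in rec or "amount" not in rec:
            continue
        cleaned.append({
            "id": str(rec["id"]),
            "amount": float(rec["amount"]),
            "currency": rec.get("currency", "USD").upper(),
            "ingested_at": datetime.now(timezone.utc).isoformat(),
        })
    return cleaned

rows = normalize_records([
    {"id": 1, "amount": "19.90", "currency": "brl"},
    {"amount": 5.0},  # malformed: no id, so it is dropped
])
```

The same drop-or-standardize pattern carries over directly to a Spark `DataFrame` transformation in a Glue job.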
Required Qualifications
  • Strong AWS expertise, especially in data and analytics services (Airflow, Glue, Redshift, S3, Athena, Iceberg).
  • Proficient in Python or Spark scripting for automation and data engineering tasks.
  • Hands-on experience with Terraform and Infrastructure as Code principles.
  • Solid understanding of data modeling, analytics platforms, and SQL.
  • Experience working in financial services and familiarity with banking data terminology.
  • Demonstrated ability to work both independently and collaboratively in a team of senior developers.
  • Strong focus on performance, code quality, and continuous improvement.
  • Takes initiative to identify opportunities for improvement in testing, performance, and architecture.
  • Communicates constructively and is open to giving and receiving feedback.
  • Balances ownership and teamwork, understanding that success is a collective effort.
  • Fluent English.
Additional Information
  • Work hours aligned to U.S. Eastern Time (EST).
  • Remote from Brazil.