AWS Developer

Meta

Manaus

Remote

BRL 431,000 - 594,000

Full-time

Posted yesterday

Job summary

A leading technology company in Brazil seeks a Senior AWS Developer to design and implement a data lake and analytics platform. The ideal candidate has strong expertise in AWS, Python or Spark scripting, and a solid understanding of data modeling. Responsibilities include developing data pipelines, collaborating with teams, and optimizing data delivery. The role is remote from Brazil, with working hours aligned to U.S. Eastern Time, and focuses on delivering high-quality, reliable data solutions.

Qualifications

  • Strong AWS expertise, particularly in data and analytics services.
  • Proficient in Python or Spark scripting for automation.
  • Hands-on experience with Terraform and Infrastructure as Code principles.
  • Solid understanding of data modeling and SQL.
  • Experience in financial services and banking data terminology.
  • Ability to work independently and collaboratively.

Responsibilities

  • Design, develop, and maintain data pipelines within AWS.
  • Build and optimize components for the data lake using various AWS services.
  • Write efficient Python or Spark scripts for data ingestion (a minimal sketch follows this list).
  • Implement Infrastructure as Code using Terraform.
  • Collaborate with team members to ensure reliable data delivery.
  • Contribute to code reviews and performance optimization.
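
A minimal sketch of the kind of Python/Spark ingestion script referenced above, assuming Spark runs with S3 access already configured (for example on AWS Glue or EMR). The bucket names, paths, and column names are hypothetical placeholders, not part of the posting:

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("example-ingest").getOrCreate()

# Read raw CSV files landed in a hypothetical ingestion bucket.
raw = (
    spark.read
    .option("header", "true")
    .csv("s3://example-raw-bucket/transactions/")
)

# Light transformation: cast types and derive a partition column.
curated = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .withColumn("ingest_date", F.current_date())
)

# Write partitioned Parquet to a hypothetical curated zone so that
# Athena or Redshift Spectrum can query it.
(
    curated.write
    .mode("append")
    .partitionBy("ingest_date")
    .parquet("s3://example-curated-bucket/transactions/")
)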

Skills

AWS expertise
Python or Spark scripting
Terraform
Data modeling
Analytics platforms
SQL
Fluency in English

Job description

About the Role

We are seeking a Senior AWS Developer to support the design and implementation of a large-scale data lake and analytics platform within a leading financial institution.

Key Responsibilities
  • Design, develop, and maintain data pipelines and workflows within the AWS ecosystem.
  • Build and optimize components for the data lake using Airflow, Glue, Redshift, S3, Athena, and Iceberg (see the orchestration sketch after this list).
  • Write efficient and maintainable Python or Spark scripts for data ingestion, transformation, and automation.
  • Implement Infrastructure as Code using Terraform to deploy and manage AWS services.
  • Collaborate with cross-functional team members to ensure high-quality, performant, and reliable data delivery.
  • Contribute to code reviews, testing improvements, and performance optimization.
  • Participate in solution design discussions, bringing forward ideas and accepting feedback in a collaborative, team-first manner.
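
A minimal orchestration sketch for the Airflow-based pipeline work described above, assuming a recent Airflow 2.x installation with the Amazon provider package and an existing, pre-configured Glue job. The DAG id, Glue job name, database, table, and S3 locations are hypothetical placeholders:

from datetime import datetime

from airflow import DAG
from airflow.providers.amazon.aws.operators.athena import AthenaOperator
from airflow.providers.amazon.aws.operators.glue import GlueJobOperator

with DAG(
    dag_id="example_daily_ingest",      # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Run an existing Glue job that moves raw files into the curated zone.
    ingest = GlueJobOperator(
        task_id="run_glue_ingest",
        job_name="example-raw-to-curated",   # hypothetical Glue job
    )

    # Once ingestion finishes, refresh partitions so Athena sees the new data.
    refresh = AthenaOperator(
        task_id="refresh_partitions",
        query="MSCK REPAIR TABLE example_db.example_table",
        database="example_db",
        output_location="s3://example-athena-results/",
    )

    ingest >> refresh

In practice, the Glue job triggered here would contain the Python/Spark ingestion logic sketched in the summary above.
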
Required Qualifications
  • Strong AWS expertise, especially in data and analytics services (Airflow, Glue, Redshift, S3, Athena, Iceberg).
  • Proficient in Python or Spark scripting for automation and data engineering tasks.
  • Hands-on experience with Terraform and Infrastructure as Code principles.
  • Solid understanding of data modeling, analytics platforms, and SQL.
  • Experience working in financial services and familiarity with banking data terminology.
  • Demonstrated ability to work both independently and collaboratively in a team of senior developers.
  • Strong focus on performance, code quality, and continuous improvement.
  • Takes initiative to identify opportunities for improvement in testing, performance, and architecture.
  • Communicates constructively and is open to giving and receiving feedback.
  • Balances ownership and teamwork, understanding that success is a collective effort.
  • Fluent English.

Additional Information
  • Work hours aligned to U.S. Eastern Time (EST).
  • Remote from Brazil.