
Senior Data Engineer

Stefanini Group

Remote

BRL 120,000 - 160,000

Full-time

Posted 4 days ago

Job summary

A prominent data solutions company in São Paulo is seeking a skilled professional responsible for building and optimizing data pipelines. Key responsibilities include designing data structures, ensuring compliance with data governance, and collaborating with stakeholders on data topics. The ideal candidate should have strong expertise in AWS Data Management architectures, data modeling, and excellent communication skills. This role offers remote work flexibility and the opportunity to work on innovative data solutions.

Job description
Mission
  • Build, manage, and optimize data pipelines;
  • Support the definition of data flows and access to data sources;
  • Ensure compliance with data governance, data architecture, and data security requirements.

Responsibilities
  • Create, maintain, and optimize data pipelines for data structures, encompassing data modeling, data transformation, schemas, metadata, and workload management;
  • Drive automation through effective metadata management;
  • Help ensure compliance and governance during data use, including maintenance of the data catalog;
  • Collaborate with and support data owners and other stakeholders on data topics;
  • Ensure proper use of the data architecture defined by the Data Factory team;
  • Respect and follow security principles, ensuring compliant data delivery.
Competencies & Skills
  • Strong experience with various data management architectures, mainly in the AWS environment;
  • Strong experience with data modeling;
  • Ability to design, build, and manage data pipelines;
  • Experience working with commercial and open-source message-queuing technologies;
  • Highly creative and collaborative;
  • Excellent communication skills and the ability to collaborate effectively with cross-functional teams.
Specific Technical Knowledge
  • Solid knowledge of Databricks, Spark, Python, Terraform, SQL, Scala, S3, AWS Glue, AWS Athena, AWS Lambda, AWS Step Functions, big data, and relational databases;
  • Experience with EMR, Gitflow, Agile methodology, and DevOps.
Languages

English: Advanced or Fluent

Additional information

Remote work
