
Sr. Data Architect

buscojobs Brasil

Santa Catarina

Remote work

BRL 120,000 - 150,000

Full-time

Yesterday
Be among the first applicants

Job summary

A recruitment platform in Brazil seeks a Senior Data Engineer to design and maintain scalable data pipelines on AWS. The role requires collaboration with data scientists, strong programming skills in Python, and a deep understanding of data engineering principles. Candidates should have over 5 years of experience and be proficient in advanced SQL. This position offers a remote-friendly environment and opportunities for career growth.

Benefits

Career growth opportunities
Remote-friendly environment
Ongoing training

Qualifications

  • 5+ years of experience in building scalable data pipelines in a cloud environment (preferably AWS).
  • Strong programming skills in Python or a similar scripting language.
  • Intermediate to advanced experience in relational database design.

Responsibilities

  • Design and maintain scalable data pipelines on AWS.
  • Ensure data governance, documentation, and best practices.
  • Collaborate with data scientists and analysts to integrate data from multiple sources.

Skills

Data engineering with AWS
ETL and data modeling
Python programming
Advanced SQL
Problem-solving
Advanced English

Education

Bachelor’s degree in Computer Science

Tools

AWS Glue
AWS S3
AWS SageMaker

Job description

We are looking for a Senior Data Engineer to design and maintain scalable data pipelines on AWS, ensuring performance, quality, and security. You will collaborate with data scientists and analysts to integrate data from multiple sources and support AI/ML initiatives.

Overview

Location: Guaramirim, Santa Catarina

Key Responsibilities

  • Build and optimize ETL pipelines with AWS Glue (see the sketch after this list).
  • Work with AWS S3, Glue, and SageMaker for data and AI workflows.
  • Develop solutions in Python and SQL.
  • Integrate data from Salesforce and APIs.
  • Ensure data governance, documentation, and best practices.
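Purely as an illustration of the kind of Glue work listed above, and not as part of the role's requirements, a minimal PySpark-based Glue job might look like the sketch below. The catalog database, table name, column mappings, and S3 path are hypothetical placeholders, not details from this posting.

```python
# Minimal AWS Glue ETL sketch (illustrative only).
# The catalog database "sales_db", table "raw_orders", the column mappings,
# and the S3 output path are hypothetical placeholders, not from this posting.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.transforms import ApplyMapping
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read raw records registered in the Glue Data Catalog.
raw = glue_context.create_dynamic_frame.from_catalog(
    database="sales_db", table_name="raw_orders"
)

# Keep, rename, and cast only the columns downstream consumers need.
cleaned = ApplyMapping.apply(
    frame=raw,
    mappings=[
        ("order_id", "string", "order_id", "string"),
        ("order_ts", "string", "order_timestamp", "timestamp"),
        ("amount", "double", "amount_brl", "double"),
    ],
)

# Write the curated dataset back to S3 as Parquet.
glue_context.write_dynamic_frame.from_options(
    frame=cleaned,
    connection_type="s3",
    connection_options={"path": "s3://example-curated-bucket/orders/"},
    format="parquet",
)

job.commit()
```

When run inside Glue, the service supplies the JOB_NAME argument; the script would not run unmodified elsewhere, since the awsglue libraries are provided by the Glue runtime.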

Requirements

  • Proven experience in data engineering with AWS.
  • Experience with ETL, data modeling, and pipeline optimization.
  • Advanced English (international collaboration).

Avenue Code reinforces its commitment to privacy and to all the principles guaranteed by the most rigorous global data protection laws, such as the GDPR, LGPD, CCPA and CPRA. Candidate data shared with Avenue Code will be kept confidential, will not be transmitted to unrelated third parties, and will not be used for purposes other than applying for open positions. As a consultancy, Avenue Code may share your information with its clients and with other companies in the CompassUol Group to which Avenue Code's consultants are allocated in order to perform its services.

What We Offer (Company Context)


The company and our mission: Zartis is a digital solutions provider working across technology strategy, software engineering and product development. We partner with firms across financial services, MedTech, media, logistics technology, renewable energy, EdTech, e-commerce, and more. Our engineering hubs in EMEA and LATAM are full of talented professionals delivering business success and digital improvement across application development, software architecture, CI/CD, business intelligence, QA automation, and new technology integrations.

What you will do:

  • Designing performant data pipelines for the ingestion and transformation of complex datasets into usable data products.
  • Building scalable infrastructure to support hourly, daily, and weekly update cycles.
  • Implementing automated QA checks and monitoring systems to catch data anomalies before they reach clients.
  • Re-architecting system components to improve performance or reduce costs.
  • Supporting team members through code reviews and collaborative development.
  • Building enterprise-grade batch and real-time data processing pipelines on AWS, with a focus on serverless architectures.
  • Designing and implementing automated ELT processes to integrate disparate datasets.
  • Collaborating across multiple teams to ingest, extract, and process data using Python, R, Zsh, SQL, REST, and GraphQL APIs.
  • Transforming clickstream and CRM data into meaningful metrics and segments for visualization.
  • Creating automated acceptance, QA, and reliability checks to ensure business logic and data integrity (see the sketch after this list).
  • Designing appropriately normalized schemas and making informed decisions between SQL and NoSQL solutions.
  • Optimizing infrastructure and schema design for performance, scalability, and cost efficiency.
  • Defining and maintaining CI/CD and deployment pipelines for data infrastructure.
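To make the QA bullet above concrete, here is a minimal, self-contained sketch of an automated batch check in plain Python; the field names, rules, and thresholds are invented for illustration and are not taken from this posting.

```python
# Illustrative data-quality check for a batch of ingested records.
# Field names ("order_id", "amount_brl") and rules are hypothetical.
from dataclasses import dataclass, field


@dataclass
class QAReport:
    row_count: int
    errors: list[str] = field(default_factory=list)

    @property
    def passed(self) -> bool:
        return not self.errors


def check_batch(rows: list[dict]) -> QAReport:
    report = QAReport(row_count=len(rows))

    # Reliability check: an empty batch usually means an upstream failure.
    if not rows:
        report.errors.append("batch is empty")
        return report

    # Integrity check: primary keys must be present and unique.
    ids = [r.get("order_id") for r in rows]
    if any(i is None for i in ids):
        report.errors.append("null order_id found")
    if len(set(ids)) != len(ids):
        report.errors.append("duplicate order_id found")

    # Business-logic check: monetary amounts must be non-negative.
    if any((r.get("amount_brl") or 0) < 0 for r in rows):
        report.errors.append("negative amount_brl found")

    return report


if __name__ == "__main__":
    sample = [
        {"order_id": "A1", "amount_brl": 120.0},
        {"order_id": "A2", "amount_brl": -5.0},
    ]
    print(check_batch(sample))  # flags the negative amount
```

A check like this would typically run as a pipeline step right after ingestion, failing the run or raising an alert whenever report.passed is false, so that anomalies are caught before data reaches clients.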

What you will bring:

  • Bachelor’s degree in Computer Science, Software Engineering, or a related field.
  • 5+ years of experience building scalable and reliable data pipelines and data products in a cloud environment (AWS preferred).
  • Deep understanding of ELT processes and data modeling best practices.
  • Strong programming skills in Python or a similar scripting language.
  • Advanced SQL skills, with intermediate to advanced experience in relational database design.
  • Familiarity with joining and analyzing large behavioral datasets, such as Adobe and GA4 clickstream data.
  • Excellent problem-solving abilities and strong attention to data accuracy and detail.
  • Proven ability to manage and prioritize multiple initiatives with minimal supervision.

Why Join

We offer a remote-friendly environment and opportunities for career growth, mentorship, and ongoing training.

