

Data Engineer

Calyptus

Brasília

On-site

BRL 120,000 - 160,000

Full-time

Yesterday

Job summary

A leading data solutions provider in Brasília is seeking a Data Engineer to design and build data pipelines, integrate enterprise data sources, and implement data quality frameworks. The ideal candidate will have over 3 years of experience in data engineering, strong proficiency in Python and SQL, and a Bachelor's degree in a related field. The role emphasizes collaboration with global teams to optimize data processing workflows and to ensure compliance with data governance requirements.

Qualifications

  • Minimum 3 years of experience with data engineering in enterprise environments.
  • Strong proficiency in Python and SQL for data processing.
  • Experience with data integration patterns and cloud platforms.

Responsibilities

  • Design data pipelines that integrate diverse enterprise data sources.
  • Implement data quality frameworks for AI model inputs.
  • Optimize data processing workflows for scale and performance.

Skills

  • Data engineering experience
  • Python proficiency
  • SQL proficiency
  • ETL/ELT tools knowledge
  • APIs (REST, GraphQL, SOAP)
  • Cloud platforms (AWS, Azure, GCP)
  • Data privacy knowledge
  • Containerization familiarity

Education

Bachelor's degree in Computer Science, Engineering, Data Science, or related field

Job description

You’re a perfect match if you have:
  • 3+ years experience with data engineering in enterprise environments
  • Strong proficiency in Python and SQL for data processing and transformation
  • Experience with ETL/ELT tools and data integration patterns
  • Knowledge of enterprise data standards and APIs (REST, GraphQL, SOAP)
  • Understanding of data privacy, security, and access control best practices
  • Experience with cloud platforms (AWS, Azure, GCP) and data infrastructure
  • Familiarity with containerization and orchestration tools
  • Bachelor's degree in Computer Science, Engineering, Data Science, or related field
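As a rough illustration of the ETL skills listed above, the sketch below extracts raw records, normalizes them, and loads them into SQLite. The table name, field names, and cleaning rules are hypothetical assumptions for the example, not part of any specific stack used by the employer:

```python
# Minimal ETL sketch (illustrative only): extract -> transform -> load.
# "orders", the field names, and the cleaning rules are assumptions.
import sqlite3

# Extract step, simulated here with in-memory raw records.
RAW_RECORDS = [
    {"id": 1, "amount": "120.50", "region": " south "},
    {"id": 2, "amount": "80.00", "region": "NORTH"},
]

def clean_record(rec):
    """Transform step: cast types and normalize text fields."""
    return (rec["id"], float(rec["amount"]), rec["region"].strip().lower())

def load(records, conn):
    """Load step: idempotent upsert into the target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders "
        "(id INTEGER PRIMARY KEY, amount REAL, region TEXT)"
    )
    conn.executemany("INSERT OR REPLACE INTO orders VALUES (?, ?, ?)", records)
    return conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]

conn = sqlite3.connect(":memory:")
loaded = load([clean_record(r) for r in RAW_RECORDS], conn)
```

In a production pipeline the extract step would typically pull from the enterprise APIs named above (REST, GraphQL, SOAP) and the load target would be a warehouse rather than SQLite.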
Your day-to-day activities:
  • Design and build data pipelines that integrate diverse enterprise data sources for AI processing
  • Develop connectors and adapters for enterprise systems (CRM, ERP, document management, databases)
  • Implement data quality frameworks and validation systems for AI model inputs
  • Create secure data handling workflows that support global development teams
  • Build data transformation pipelines that prepare unstructured and structured data for AI models
  • Establish data governance and lineage tracking for compliance and audit requirements
  • Optimize data processing workflows for scale and performance
  • Collaborate with data scientists and AI engineers to ensure data meets model requirements
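The data quality and validation responsibilities above can be sketched as a small rule-based check that partitions a batch before it reaches downstream AI models. The required fields and rules here are assumptions for illustration:

```python
# Illustrative data quality gate (a sketch, not the employer's framework):
# validate each record against simple rules and split the batch into
# records that pass and records flagged for review.
REQUIRED_FIELDS = {"doc_id", "text"}  # hypothetical schema

def validate(record):
    """Return a list of quality issues; an empty list means the record passes."""
    issues = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        issues.append(f"missing fields: {sorted(missing)}")
    text = record.get("text")
    if isinstance(text, str) and not text.strip():
        issues.append("empty text")
    return issues

def partition(records):
    """Split a batch into passing records and records needing review."""
    good, bad = [], []
    for rec in records:
        (good if not validate(rec) else bad).append(rec)
    return good, bad

batch = [
    {"doc_id": "a1", "text": "quarterly report"},
    {"doc_id": "a2", "text": "   "},
    {"text": "orphan record"},
]
good, bad = partition(batch)
```

Lineage tracking and audit requirements would extend a gate like this with per-record provenance metadata rather than a simple pass/fail split.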