

Data Engineer

Calyptus

Rio de Janeiro

On-site

BRL 80.000 - 120.000

Full-time

Posted yesterday

Job summary

A leading data solutions firm in Rio de Janeiro seeks an experienced Data Engineer to design and build data pipelines that integrate diverse sources for AI processing. The ideal candidate has over three years of data engineering experience, is proficient in Python and SQL, and is familiar with ETL tools and cloud platforms such as AWS and GCP. The role includes implementing data quality frameworks and collaborating with AI engineers to ensure data meets model requirements.

Qualifications

  • 3+ years experience with data engineering in enterprise environments.
  • Strong proficiency in Python and SQL for data processing and transformation.
  • Experience with cloud platforms (AWS, Azure, GCP) and data infrastructure.

Responsibilities

  • Design and build data pipelines integrating diverse enterprise data sources.
  • Develop connectors and adapters for enterprise systems (CRM, ERP, databases).
  • Implement data quality frameworks for AI model inputs.
  • Create secure data handling workflows for global development teams.
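The connector-and-adapter duty above is a standard pattern: each enterprise system gets its own adapter that normalizes records into one shared shape. A minimal sketch (class names, field names, and record shape are illustrative assumptions, not from the posting):

```python
# Hypothetical sketch of the connector/adapter pattern for enterprise sources.
from abc import ABC, abstractmethod


class SourceConnector(ABC):
    """Common interface every enterprise-system connector implements."""

    @abstractmethod
    def fetch(self) -> list[dict]:
        """Return records in a normalized {field: value} shape."""


class CrmConnector(SourceConnector):
    """Adapts CRM-specific rows to the shared record shape."""

    def __init__(self, raw_rows: list[tuple]):
        self.raw_rows = raw_rows  # e.g. rows pulled from a CRM export

    def fetch(self) -> list[dict]:
        # Map positional CRM tuples onto named, normalized fields.
        return [{"id": r[0], "name": r[1], "source": "crm"} for r in self.raw_rows]


def ingest(connectors: list[SourceConnector]) -> list[dict]:
    """Merge records from all configured sources into one stream."""
    records = []
    for connector in connectors:
        records.extend(connector.fetch())
    return records
```

New systems (ERP, document stores) would plug in as additional `SourceConnector` subclasses without touching the ingest logic.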

Skills

Python
SQL
ETL tools
Data integration patterns
Data privacy best practices
AWS
Azure
GCP
Containerization
Orchestration tools

Education

Bachelor's degree in Computer Science, Engineering, or Data Science

Job description

You're a perfect match if you have:
  • 3+ years experience with data engineering in enterprise environments
  • Strong proficiency in Python and SQL for data processing and transformation
  • Experience with ETL / ELT tools and data integration patterns
  • Knowledge of enterprise data standards and APIs (REST, GraphQL, SOAP)
  • Understanding of data privacy, security, and access control best practices
  • Experience with cloud platforms (AWS, Azure, GCP) and data infrastructure
  • Familiarity with containerization and orchestration tools
  • Bachelor's degree in Computer Science, Engineering, Data Science, or related field
Your day-to-day activities:
  • Design and build data pipelines that integrate diverse enterprise data sources for AI processing
  • Develop connectors and adapters for enterprise systems (CRM, ERP, document management, databases)
  • Implement data quality frameworks and validation systems for AI model inputs
  • Create secure data handling workflows that support global development teams
  • Build data transformation pipelines that prepare unstructured and structured data for AI models
  • Establish data governance and lineage tracking for compliance and audit requirements
  • Optimize data processing workflows for scale and performance
  • Collaborate with data scientists and AI engineers to ensure data meets model requirements
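The "data quality frameworks and validation systems for AI model inputs" duty could look like this minimal sketch, where each record is checked before it reaches a model. The field names and rules here are illustrative assumptions, not from the posting:

```python
# Hypothetical data quality gate for AI model inputs.
# Required fields and rules are example assumptions.

def validate_record(record: dict, required_fields=("text", "source", "timestamp")) -> list[str]:
    """Return a list of quality issues found in one input record."""
    issues = []
    for field in required_fields:
        value = record.get(field)
        if value is None or value == "":
            issues.append(f"missing or empty field: {field}")
    # Reject non-string text payloads so downstream models get clean input.
    if "text" in record and not isinstance(record["text"], str):
        issues.append("field 'text' is not a string")
    return issues


def filter_valid(records: list[dict]) -> tuple[list[dict], list[tuple[dict, list[str]]]]:
    """Split records into (valid, rejected-with-issues) using the checks above."""
    valid, rejected = [], []
    for rec in records:
        issues = validate_record(rec)
        if issues:
            rejected.append((rec, issues))
        else:
            valid.append(rec)
    return valid, rejected
```

In a production pipeline the rejected records and their issue lists would typically be routed to a quarantine table for review, which also supports the lineage and audit requirements mentioned above.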