Data Engineer

Calyptus

São Paulo

On-site

BRL 80.000 - 120.000

Full-time

Posted yesterday

Job summary

A tech company in São Paulo is looking for a Data Engineer with over 3 years of experience to design and build data pipelines for AI processing. The ideal candidate is proficient in Python and SQL, has experience with data integration tools, and has a strong understanding of data privacy practices. This role offers the opportunity to collaborate with data scientists and engineers to optimize data workflows and ensure compliance with governance requirements.

Qualifications

  • Minimum 3 years of experience with data engineering in enterprise environments.
  • Strong proficiency in Python and SQL for data processing.
  • Experience with ETL/ELT tools and data integration patterns.

Responsibilities

  • Design and build data pipelines for AI processing.
  • Develop connectors for enterprise systems.
  • Implement data quality frameworks for AI models.

Skills

  • Data engineering experience
  • Proficiency in Python
  • Proficiency in SQL
  • ETL/ELT tools experience
  • Knowledge of APIs
  • Understanding of data privacy best practices
  • Cloud platforms experience
  • Containerization tools familiarity

Education

Bachelor's degree in Computer Science

Job description

You're a perfect match if you have:
  • 3+ years experience with data engineering in enterprise environments
  • Strong proficiency in Python and SQL for data processing and transformation
  • Experience with ETL / ELT tools and data integration patterns
  • Knowledge of enterprise data standards and APIs (REST, GraphQL, SOAP)
  • Understanding of data privacy, security, and access control best practices
  • Experience with cloud platforms (AWS, Azure, GCP) and data infrastructure
  • Familiarity with containerization and orchestration tools
  • Bachelor's degree in Computer Science, Engineering, Data Science, or related field
Your day-to-day activities:
  • Design and build data pipelines that integrate diverse enterprise data sources for AI processing
  • Develop connectors and adapters for enterprise systems (CRM, ERP, document management, databases)
  • Implement data quality frameworks and validation systems for AI model inputs
  • Create secure data handling workflows that support global development teams
  • Build data transformation pipelines that prepare unstructured and structured data for AI models
  • Establish data governance and lineage tracking for compliance and audit requirements
  • Optimize data processing workflows for scale and performance
  • Collaborate with data scientists and AI engineers to ensure data meets model requirements
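To illustrate the kind of work described above, here is a minimal sketch of a data quality validation step such as a pipeline might apply before feeding records to an AI model. The field names and rules are hypothetical, not part of this posting; real frameworks in this role would be far more extensive.

```python
# Hypothetical data-quality gate for pipeline inputs (illustrative only).
# Records missing required fields are rejected with a reason, so that
# downstream AI model inputs stay clean and rejections stay auditable.

def validate_records(records, required_fields=("id", "text")):
    """Split records into valid and rejected based on simple quality rules."""
    valid, rejected = [], []
    for rec in records:
        # A field is "missing" if absent or empty/falsy.
        missing = [f for f in required_fields if not rec.get(f)]
        if missing:
            rejected.append({"record": rec, "reason": f"missing: {missing}"})
        else:
            valid.append(rec)
    return valid, rejected

batch = [
    {"id": 1, "text": "quarterly report"},
    {"id": 2, "text": ""},  # empty text fails the quality check
]
valid, rejected = validate_records(batch)
```

In practice, checks like this would be one rule in a larger validation suite, with rejections logged for lineage and audit purposes.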