Data Engineer

Calyptus

Osasco

On-site

BRL 100,000 - 130,000

Full-time

Posted yesterday

Job summary

A leading technology firm in Brazil is seeking a skilled Data Engineer to design and build data pipelines, integrating diverse enterprise data sources for AI processing. The ideal candidate will have over 3 years' experience in data engineering, strong proficiency in Python and SQL, and familiarity with cloud platforms like AWS or Azure. This role involves developing connectors for enterprise systems and ensuring data quality and compliance. A Bachelor's degree in a relevant field is required.

Qualifications

  • 3+ years experience with data engineering in enterprise environments.
  • Strong proficiency in Python and SQL for data processing and transformation.
  • Experience with cloud platforms and data architecture.

Responsibilities

  • Design and build data pipelines that integrate diverse enterprise data sources.
  • Develop connectors and adapters for enterprise systems.
  • Implement data quality frameworks and validation systems.

Skills

Data engineering
Python
SQL
ETL tools
Data integration patterns
Cloud platforms (AWS, Azure, GCP)
Containerization

Education

Bachelor's degree in Computer Science, Engineering, Data Science, or related field

Job description

Qualifications
  • 3+ years experience with data engineering in enterprise environments
  • Strong proficiency in Python and SQL for data processing and transformation
  • Experience with ETL / ELT tools and data integration patterns
  • Knowledge of enterprise data standards and APIs (REST, GraphQL, SOAP)
  • Understanding of data privacy, security, and access control best practices
  • Experience with cloud platforms (AWS, Azure, GCP) and data infrastructure
  • Familiarity with containerization and orchestration tools
  • Bachelor's degree in Computer Science, Engineering, Data Science, or related field
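
For illustration only: the API-integration skills listed above usually come down to writing small connector functions like the Python sketch below. The endpoint path, pagination parameters, and response shape are assumptions made for this example, not details of Calyptus's actual systems.

```python
import requests


def fetch_all(base_url: str, token: str, page_size: int = 100) -> list[dict]:
    """Pull every record from a paginated REST endpoint (hypothetical API shape)."""
    records, page = [], 1
    while True:
        response = requests.get(
            f"{base_url}/records",                         # assumed endpoint path
            headers={"Authorization": f"Bearer {token}"},
            params={"page": page, "per_page": page_size},  # assumed pagination params
            timeout=30,
        )
        response.raise_for_status()
        batch = response.json()  # assumed: each page returns a JSON list of records
        if not batch:
            break
        records.extend(batch)
        page += 1
    return records
```

A production connector would also handle rate limits, retries, and incremental sync, which this sketch omits for brevity.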
Responsibilities
  • Design and build data pipelines that integrate diverse enterprise data sources for AI processing
  • Develop connectors and adapters for enterprise systems (CRM, ERP, document management, databases)
  • Implement data quality frameworks and validation systems for AI model inputs
  • Create secure data handling workflows that support global development teams
  • Build data transformation pipelines that prepare unstructured and structured data for AI models
  • Establish data governance and lineage tracking for compliance and audit requirements
  • Optimize data processing workflows for scale and performance
  • Collaborate with data scientists and AI engineers to ensure data meets model requirements
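
To make the pipeline and data-quality responsibilities above concrete, here is a minimal extract-validate-load sketch in Python. The file name, required fields, and SQLite staging table are assumptions chosen for illustration; a real pipeline would target the company's actual sources and warehouse.

```python
import json
import sqlite3
from typing import Dict, Iterable


def extract_records(path: str) -> Iterable[Dict]:
    """Read raw records from a JSON-lines export (stand-in for a CRM/ERP source)."""
    with open(path, encoding="utf-8") as handle:
        for line in handle:
            yield json.loads(line)


def validate_record(record: Dict) -> bool:
    """Minimal data-quality gate: required fields present and non-empty."""
    return bool(record.get("id")) and bool(record.get("text"))


def load_records(records: Iterable[Dict], db_path: str) -> int:
    """Write validated records into a staging table for downstream AI processing."""
    conn = sqlite3.connect(db_path)
    conn.execute("CREATE TABLE IF NOT EXISTS staging (id TEXT PRIMARY KEY, text TEXT)")
    loaded = 0
    for record in records:
        if validate_record(record):
            conn.execute(
                "INSERT OR REPLACE INTO staging (id, text) VALUES (?, ?)",
                (record["id"], record["text"]),
            )
            loaded += 1
    conn.commit()
    conn.close()
    return loaded


if __name__ == "__main__":
    total = load_records(extract_records("crm_export.jsonl"), "staging.db")
    print(f"Loaded {total} validated records")
```

Real pipelines add schema enforcement, lineage metadata, and orchestration on top, but the extract-validate-load shape stays the same.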