Data Engineer

Calyptus

Porto Alegre

On-site

BRL 80,000 - 120,000

Full-time

Yesterday

Job summary

A leading tech company in Porto Alegre is looking for an experienced Data Engineer with over 3 years of experience in enterprise data engineering environments. You'll design and build data pipelines, develop connectors for enterprise systems, and implement data quality frameworks. Proficiency in Python and SQL is essential, along with a Bachelor's degree in a related field. The role involves optimizing data processing workflows and collaborating with data scientists to ensure data meets AI model requirements.

Qualifications

  • 3+ years experience with data engineering in enterprise environments.
  • Strong proficiency in Python and SQL for data processing and transformation.
  • Experience with ETL/ELT tools and data integration patterns.

Responsibilities

  • Design and build data pipelines that integrate diverse enterprise data sources.
  • Develop connectors and adapters for enterprise systems (CRM, ERP, etc.).
  • Implement data quality frameworks and validation systems.

Skills

Data engineering in enterprise environments
Python
SQL
ETL/ELT tools
Data integration patterns
Data privacy
Security best practices
Cloud platforms (AWS, Azure, GCP)
Containerization and orchestration

Education

Bachelor's degree in Computer Science, Engineering, Data Science, or related field

Job description
You’re a perfect match if you have:
  • 3+ years experience with data engineering in enterprise environments
  • Strong proficiency in Python and SQL for data processing and transformation
  • Experience with ETL/ELT tools and data integration patterns
  • Knowledge of enterprise data standards and APIs (REST, GraphQL, SOAP); a minimal connector sketch follows this list
  • Understanding of data privacy, security, and access control best practices
  • Experience with cloud platforms (AWS, Azure, GCP) and data infrastructure
  • Familiarity with containerization and orchestration tools
  • Bachelor's degree in Computer Science, Engineering, Data Science, or related field
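To make the connector and ETL/ELT work these qualifications point to more concrete, here is a minimal extract-and-load sketch in Python. It is only a sketch under assumptions of its own: the CRM endpoint https://crm.example.com/api/contacts, the bearer-token handling, and the field names id, email, and updated_at are hypothetical, and a production pipeline would land data in a proper warehouse rather than a local SQLite file.

import sqlite3

import requests


def extract_contacts(base_url: str, token: str) -> list[dict]:
    """Fetch one page of contact records from the (hypothetical) CRM API."""
    resp = requests.get(
        f"{base_url}/api/contacts",
        headers={"Authorization": f"Bearer {token}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["results"]


def load_contacts(rows: list[dict], db_path: str = "staging.db") -> None:
    """Upsert the extracted records into a local staging table."""
    with sqlite3.connect(db_path) as conn:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS contacts ("
            "id TEXT PRIMARY KEY, email TEXT, updated_at TEXT)"
        )
        conn.executemany(
            "INSERT OR REPLACE INTO contacts (id, email, updated_at) "
            "VALUES (:id, :email, :updated_at)",
            rows,
        )


if __name__ == "__main__":
    records = extract_contacts("https://crm.example.com", token="demo-token")
    load_contacts(records)
    print(f"Loaded {len(records)} contact records")

The same extract/load split carries over to whichever ETL/ELT tool the team actually uses; the sketch only shows the shape of the work, not a specific stack.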
Your day-to-day activities:
  • Design and build data pipelines that integrate diverse enterprise data sources for AI processing
  • Develop connectors and adapters for enterprise systems (CRM, ERP, document management, databases)
  • Implement data quality frameworks and validation systems for AI model inputs (a minimal validation sketch follows this list)
  • Create secure data handling workflows that support global development teams
  • Build data transformation pipelines that prepare unstructured and structured data for AI models
  • Establish data governance and lineage tracking for compliance and audit requirements
  • Optimize data processing workflows for scale and performance
  • Collaborate with data scientists and AI engineers to ensure data meets model requirements
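The data quality item above is illustrated by the following minimal Python sketch of the kind of checks such a framework runs before records reach an AI model. The rules and field names (id, email, amount) are illustrative assumptions rather than the company's schema; a real framework would also cover schema, freshness, and lineage checks.

from dataclasses import dataclass, field


@dataclass
class QualityReport:
    total: int = 0
    failures: list[str] = field(default_factory=list)

    @property
    def passed(self) -> bool:
        # A batch passes only when no rule recorded a failure.
        return not self.failures


def validate_batch(records: list[dict]) -> QualityReport:
    """Run simple completeness and range checks over a batch of records."""
    report = QualityReport(total=len(records))
    for idx, record in enumerate(records):
        if not record.get("id"):
            report.failures.append(f"row {idx}: missing id")
        if "@" not in str(record.get("email", "")):
            report.failures.append(f"row {idx}: malformed email")
        amount = record.get("amount")
        if not isinstance(amount, (int, float)) or amount < 0:
            report.failures.append(f"row {idx}: amount must be a non-negative number")
    return report


if __name__ == "__main__":
    batch = [
        {"id": "a1", "email": "user@example.com", "amount": 10.5},
        {"id": "", "email": "broken", "amount": -3},
    ]
    report = validate_batch(batch)
    print(f"{report.total} rows checked, passed={report.passed}")
    for failure in report.failures:
        print(failure)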