
Data Engineer (Collibra Data Quality)

buscojobs Brasil

Vitória da Conquista

Remote work

BRL 318,000 - 478,000

Full-time

Today

Offer summary

A leading recruitment firm in Brazil seeks experienced Data Engineers to develop and maintain cloud-based data pipelines. The role involves building data warehouses, implementing ETL/ELT processes, and ensuring data quality for analytics. Ideal candidates have strong skills in Python, SQL, and cloud environments like Azure and AWS. This position offers professional development and flexible, fully remote work opportunities.

Benefits

Competitive compensation
Health and life insurance
Professional development opportunities

Qualifications

  • Minimum 5–7 years of data engineering experience with cloud platforms.
  • Strong Python and SQL proficiency; experience with orchestration tools.
  • Solid understanding of data modeling and warehousing.

Responsibilities

  • Develop and maintain scalable data pipelines in cloud environments.
  • Build and manage data warehouses to support analytics and reporting.
  • Collaborate with teams to ensure data governance and security best practices.

Skills

Data engineering
Cloud environments
SQL
Python
Data governance
Agile methodologies

Tools

Azure
AWS
Snowflake
BigQuery
ETL/ELT tools

Job description

Overview

We are seeking experienced Data Engineers to design, develop, and maintain data pipelines and data products in cloud environments to support analytics, reporting, and decision-making across multiple brands and clients. The roles involve building and modernizing data warehouses and data marts, implementing ETL/ELT processes, ensuring data quality and governance, and enabling downstream analytics and dashboards. Depending on the project, responsibilities may center on Azure, Snowflake, AWS, BigQuery, Looker/Power BI, and related data tooling. This description consolidates several postings to reflect responsibilities, qualifications, and benefits across engagements.

Responsibilities (selected highlights across roles)

  • Design, develop, and maintain data pipelines and ETL/ELT processes using cloud-native tools (e.g., Azure Data Factory, Snowflake, AWS Glue, BigQuery, dbt); a minimal orchestration sketch follows this list.
  • Build and optimize data models, schemas (star/snowflake, Data Vault, Data Mesh), and data warehousing solutions to support reporting and analytics.
  • Ingest and integrate data from multiple sources (APIs, files, databases, vendor systems) and ensure cleansed, reliable, and query-ready data.
  • Collaborate with analysts, data scientists, and stakeholders to document sources, transformations, governance, and dependencies.
  • Implement data governance, quality checks, metadata, and lineage; monitor pipelines and troubleshoot issues with minimal downtime.
  • Support BI/reporting: provide clean datasets for dashboards; work with Power BI, Looker, or similar BI tools; develop LookML or equivalent data models as needed.
  • Optimize performance, cost, and scalability; implement automation and CI/CD for data infrastructure; containerize and deploy where applicable (Docker, Kubernetes, AWS ECS).
  • Provide knowledge transfer and mentorship; participate in incident response and continuous improvement initiatives.
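
As a concrete illustration of the orchestration work these bullets describe, the following is a minimal Airflow DAG sketch. The DAG id, task bodies, and schedule are hypothetical placeholders (assuming an Airflow 2.4+ deployment), not details from any actual engagement.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        # Hypothetical: pull raw records from a source API or file drop.
        print("extracting source data")

    def transform():
        # Hypothetical: cleanse and reshape records into warehouse-ready form.
        print("transforming records")

    def load():
        # Hypothetical: write the transformed records to the warehouse.
        print("loading into warehouse")

    with DAG(
        dag_id="example_daily_etl",   # hypothetical name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",            # Airflow 2.4+; earlier versions use schedule_interval
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        transform_task = PythonOperator(task_id="transform", python_callable=transform)
        load_task = PythonOperator(task_id="load", python_callable=load)

        # Run the three steps in sequence.
        extract_task >> transform_task >> load_task

In production the placeholder callables would be replaced with real extract/load logic or with provider operators (e.g., for Azure Data Factory or AWS Glue), but the dependency wiring stays the same.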

Required Skills

  • Strong experience in data engineering with cloud environments (Azure, AWS, Snowflake, BigQuery) and modern data tooling (ETL/ELT, data modeling, governance).
  • Advanced SQL skills; proficient with stored procedures, views, indexing, and performance tuning.
  • Hands-on experience building data pipelines and working with orchestration tools (Airflow, NiFi, or equivalents).
  • Solid understanding of data governance, lineage, quality frameworks, and metadata management (see the quality-check sketch after this list).
  • Experience with Python (and related libraries) for data ingestion, transformation, and automation.
  • Experience with data visualization/BI data models; ability to deliver datasets for dashboards (not necessarily build visualizations).
  • Excellent communication, collaboration, and problem-solving skills; able to work in Agile environments.
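
The governance and quality bullet above can be made concrete with a small, self-contained sketch of rule-based data quality checks. The table, columns, and rules here are hypothetical, and SQLite stands in for a real warehouse such as Snowflake or BigQuery.

    import sqlite3

    # Hypothetical in-memory table standing in for a warehouse table.
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE orders (order_id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL);
        INSERT INTO orders VALUES (1, 10, 99.50), (2, NULL, 20.00), (3, 11, -5.00);
    """)

    # Each check counts the rows that violate a rule; zero violations means PASS.
    checks = {
        "customer_id_not_null": "SELECT COUNT(*) FROM orders WHERE customer_id IS NULL",
        "amount_non_negative":  "SELECT COUNT(*) FROM orders WHERE amount < 0",
    }

    for name, sql in checks.items():
        violations = conn.execute(sql).fetchone()[0]
        status = "PASS" if violations == 0 else f"FAIL ({violations} rows)"
        print(f"{name}: {status}")

Tools such as Collibra Data Quality, dbt tests, or Great Expectations generalize this pattern with scheduling, alerting, and lineage integration.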

Nice-to-Have

  • Experience with healthcare IT data; Alteryx, Data Build Tool (dbt), Elasticsearch, Docker/Kubernetes; Terraform or IaC; cloud certifications (Azure, AWS, GCP).
  • Experience with machine learning data flows (e.g., Snowflake, Looker, SageMaker, or equivalent).
  • Experience with real-time or batch data processing, data QA frameworks, and observability tools.

What You’ll Do

  • Develop and maintain scalable data pipelines in AWS, Azure, or GCP environments; ingest data from diverse sources and ensure data quality for downstream applications.
  • Build, optimize, and maintain data warehouses and marts; manage data assets to enable analytics and reporting (a star-schema sketch follows this list).
  • Collaborate across teams to support data requirements, governance, and security best practices.
  • Contribute to documentation, testing, and automation; participate in code reviews and mentoring.
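
As a sketch of the dimensional modeling referenced above, the following builds a minimal star schema through Python's built-in sqlite3 module. Table and column names are hypothetical; a real engagement would target Snowflake, BigQuery, Redshift, or Synapse.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        -- Dimension tables hold descriptive attributes.
        CREATE TABLE dim_date (
            date_key INTEGER PRIMARY KEY,   -- e.g., 20240115
            full_date TEXT,
            month INTEGER,
            year INTEGER
        );
        CREATE TABLE dim_customer (
            customer_key INTEGER PRIMARY KEY,
            customer_name TEXT,
            region TEXT
        );

        -- The fact table stores measures keyed to the dimensions.
        CREATE TABLE fact_sales (
            sale_id INTEGER PRIMARY KEY,
            date_key INTEGER REFERENCES dim_date(date_key),
            customer_key INTEGER REFERENCES dim_customer(customer_key),
            amount REAL
        );
    """)

    # A typical reporting query joins the fact table to its dimensions.
    rows = conn.execute("""
        SELECT d.year, d.month, SUM(f.amount) AS total_sales
        FROM fact_sales f JOIN dim_date d ON d.date_key = f.date_key
        GROUP BY d.year, d.month
    """).fetchall()
    print(rows)   # empty here; populated tables would yield monthly totals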

What We Offer

  • Fully remote opportunities with flexibility for global teams; competitive compensation.
  • Professional development, training budgets, mentoring, and career growth programs.
  • Benefits such as health and life insurance, wellbeing resources, and learning portals; paid certifications and international experience opportunities.

Project/Company Contexts

  • Projects range from modernizing on-prem SQL warehouses to building cloud-native architectures, spanning eCommerce, healthcare, and enterprise data environments.
  • Engagement models include short-term contracts with extensions and multi-year pipelines; teams collaborate with product and data science groups.

Required Qualifications (sample from postings)

  • Minimum 5–7+ years of data engineering experience with cloud data platforms; strong Python and SQL proficiency; experience with Airflow or similar orchestration; data modeling and data warehousing fundamentals.
  • Experience with at least one major cloud provider (AWS, Azure, GCP) and relevant data tools (Snowflake, BigQuery, Redshift, Synapse).
  • Strong communication, collaboration, and problem-solving skills; willingness to learn undocumented systems and design future-ready solutions.

Note

This consolidated description includes content from multiple postings and emphasizes responsibilities, qualifications, and benefits across engagements. It is not a guarantee of official role boundaries and should be adapted to a single posting if used for candidate outreach.
