Data Engineer (Relocation To Portugal)

buscojobs Brasil

Novo Hamburgo

Remote work

BRL 20,000 - 30,000

Full-time

Today
Be among the first applicants

Job summary

A technology consulting company in Brazil is searching for a Data Engineer to design ETL processes and maintain data pipelines. The role is fully remote and requires strong skills in Azure, SQL, and Python. Candidates should also have experience with cloud environments, particularly AWS. Join this innovative team and contribute to impactful projects within healthcare IT.

Benefits

  • 100% Remote Work
  • WFH allowance
  • Career growth and training programs
  • Mentoring and wellbeing programs

Qualifications

  • 5+ years of experience building scalable data pipelines in cloud environments.
  • Strong Python, SQL, and data modeling skills required.
  • Healthcare IT data exposure is a plus.

Responsibilities

  • Build and maintain ETL pipelines in Azure Data Factory.
  • Develop and optimize SQL stored procedures and jobs.
  • Apply data governance and quality checks.

Skills

Data Engineering
Python
SQL
ETL processes
Azure

Tools

Azure Data Factory
Docker
Airflow

Job description

Overview

Our mission is to be a meaningful part of our people’s careers. We are a Portuguese technology consulting company with offices in Lisbon, Porto and Óbidos, and representations in Brazil and Tunisia. We have over 12 years of market expertise, and today we are a universe of around 400 people working on-site, remotely or in hybrid mode in projects across 20+ countries. We believe that great people make successful companies, and we stand for the appreciation, recognition and growth of our professionals. We don’t want to be just another line on your CV — we want you to live a Lifetime Experience with us. We invest in training and certifications, promote a healthy work-life balance, and offer benefits that impact your personal life and career.

#welcometoyourfuture

What we’re looking for:

  • 3+ years in Data Engineering or similar roles.
  • Strong Python skills for ETL (CI/CD, API deployment, data workflows).
  • Familiarity with Big Data tools (e.g. Spark, Hadoop, BigQuery).
  • Hands-on experience in cloud or hybrid environments.
  • Deep understanding of data lifecycle, governance, and operational challenges.
  • Clear communication and ownership mindset. Comfortable with cross-functional projects.
  • English: Professional working proficiency required (B2/C1).

Nice to have:

  • Coding: Bash / YAML / SQL
  • Experience with security teams or IAM and data protection policies
  • Experience with message brokers (e.g., Kafka, RabbitMQ, ActiveMQ) and real-time data pipelines
  • French B2

#Affinity – Creating close and empathetic relationships with colleagues, clients and candidates (and not just with technologies, languages and platforms). Team spirit and good vibes;

#Ambition – Drive to exceed expectations and evolve personally and professionally;

#Action – Energy to make things happen with proactiveness and initiative;

#Learning – Willingness to grow individually and collectively and become an expert in the tech market;

#Assertiveness – Transparent and honest communication with constructive feedback.

Your Lifetime Experience

  • Taking part in national and international projects in a company based on personal relationships, simplicity and efficiency, with a disruptive approach to the tech market;
  • Accessing a career and training plan tailored to your performance and interests;
  • Being part of a welcoming, trustworthy, respectful and informal environment;
  • Joining our Affinity Communities (sports, tech, hobbies, etc.) and participating in social and environmental responsibility projects;
  • Accessing a range of benefits, partnerships, discounts, events and internal dynamics.

Beyond professional satisfaction, we aim to offer memorable moments of leisure and connection, worthy of an Affinity experience.

Join the Experience!

Send us your application and follow along!

Required Skills & Experience

  • 4+ years of data engineering experience — building data warehouses and ETL pipelines
  • 3+ years of experience with the Snowflake data warehouse
  • Well-versed in data marts
  • Strong understanding of SQL and the SQL Server stack (SSIS, SSAS, and SSRS)
  • Ability to work MST hours

Nice to Have Skills & Experience

  • Experience in the healthcare industry; provider side highly preferred
  • Experience working in an AWS environment
  • Experience with big data tools: Hadoop, Spark, Kafka

Job Description

A healthcare client is looking for multiple Data Engineers to join their growing team. The team is modernizing its on-prem SQL warehouses to Snowflake. The data warehouse is already built; you will support the data marts and data pipelines in an AWS environment with DevOps support. This is a 6-month contract with a high likelihood of extension.

Overview

We are seeking a Data Engineer (short-term contractor) to design and implement data acquisition and ETL processes in Microsoft Azure to support enterprise reporting needs, integrating multiple datasets into a central Azure SQL Database.

Core Responsibilities

  • Build and maintain ETL pipelines in Azure Data Factory (or equivalent).
  • Develop and optimize SQL stored procedures, jobs, and views in Azure SQL Database.
  • Apply data governance and quality checks.
  • Document sources, transformations, and dependencies; support Power BI data models (not visualizations).
  • Optionally, use Alteryx or similar tools for advanced data transformations.
  • Provide knowledge transfer for sustainability post-engagement.

Required Skills

  • Strong experience with Microsoft Azure (Data Factory, SQL Database, Blob/Data Lake).
  • Advanced SQL (stored procedures, jobs, indexing, views).
  • Proven ETL development experience (batch + scheduled pipelines).
  • Familiarity with data governance, lineage and quality frameworks.
  • Familiarity with Power BI data models (not visualizations).
  • Excellent communication skills for collaboration and mentoring.

Nice-to-Have

  • Alteryx for data preparation
  • Healthcare IT data exposure

Why Join Us?

  • Opportunity to work on enterprise-scale reporting systems
  • Fully remote role based out of Brazil (supporting EST/CST hours)
  • Short-term but impactful 4–5 month contract
  • Work with skilled professionals on meaningful projects

Interested? Apply today and help deliver trusted reporting data for enterprise decision-making.

We are part of large technology services groups and operate with inclusive cultures and opportunities for growth across multiple brands. We welcome applicants who want to learn, grow and contribute to innovative data solutions.

The project and role specifics

  • The primary goal is modernization, maintenance and development of data pipelines and analytics for a large client with several product domains.
  • Responsibilities include designing data solutions for big data volumes, building scalable pipelines, and collaborating with multiple teams to ingest and process data.

What you will bring

  • 5+ years of experience building scalable data pipelines in cloud environments (AWS preferred)
  • Strong Python, SQL, and data modeling skills
  • Experience with Airflow, dbt, and ETL/ELT processes
  • Containerization (Docker, Kubernetes) and CI/CD experience
  • Excellent communication and ability to work with cross-functional teams

What we offer

  • 100% Remote Work
  • WFH allowance
  • Career growth and training programs
  • Mentoring and wellbeing programs
  • Multicultural working environment with tech events and activities

About the Companies

We partner with firms across industries to deliver technology strategy, software engineering and product development. We foster an inclusive culture and diverse teams across EMEA and LATAM.
