Data Engineer

reeeliance IM GmbH

Porto

Hybrid

EUR 45 000 - 60 000

Full-time

23 days ago

Job summary

A data integration company in Porto seeks a Data Engineer to design and maintain data pipelines that optimize data infrastructure and ensure data integrity. The role involves working with cloud platforms like AWS and Azure, requiring strong skills in Python and SQL. Join a multinational team and enjoy flexible working hours, continuous training, and modern technology while contributing to meaningful international projects.

Benefits

Flexible working hours
Modern equipment
German language classes
Childcare subsidy
Company pension scheme
Job bike

Qualifications

  • Proficiency in classic and recent data modeling concepts.
  • Experience with data warehousing architectures and best practices.
  • Hands-on experience in orchestrating data pipelines within cloud platforms like AWS or Azure.

Responsibilities

  • Design, build, and maintain data pipelines for clients.
  • Optimize data infrastructure for extraction, transformation, and loading.
  • Ensure data accuracy, integrity, privacy, and compliance.

Skills

Data modeling
ETL tools
Python
SQL
Data pipelines on cloud platforms
Problem-solving
Agile teamwork
Communication skills

Education

University degree in computer science or related discipline

Tools

AWS
Azure
Snowflake
Azure Synapse
SQL
Matillion
VaultSpeed

Job description
Your responsibilities
  • In this role you have the chance to design, build and maintain data pipelines according to the client's business needs and embed them end-to-end into the given data stack
  • As a Data Engineer you will maintain and optimize the data infrastructure required for data extraction, transformation, and loading from a wide variety of data sources
  • You build, maintain, and deploy data artifacts on cloud platforms
  • You will ensure data accuracy, integrity, privacy, security, and compliance by applying data quality assurance methods
  • Your responsibility will include monitoring of data pipeline performance as well as the implementation of optimization strategies
  • You will work together with both reeeliance and client staff as part of an agile team
What we offer
  • Responsible roles in international projects
  • Space to grow and to take over responsibility
  • Dedicated onboarding program
  • Work with cutting-edge technology
  • Permanent employment in a family-friendly company
  • Working in a multinational team (more than 15 nationalities)
  • Flexible working hours
  • Participation in regular team events in Berlin and Hamburg
  • Further training opportunities
  • Modern equipment
  • German language classes
  • And many other job perks (childcare subsidy, company pension scheme, job bike, etc.)
Our requirements
  • Proficiency in both classic and recent data modeling concepts
  • Applicable knowledge of data warehousing architectures and best-practice design
  • Experience and confidence in using at least one ETL tool and proficiency in Python
  • Ability to read and understand complex data flows paired with strong SQL skills
  • Hands-on experience in orchestrating data pipelines within cloud platforms like AWS or Azure
  • Problem-solving attitude
  • Independent working style
  • Experience with working as part of a team in agile project setups
  • A university degree in computer science, information technology, or related discipline
  • Articulate, with impeccable verbal and written communication skills in English; a strong and discerning listener. Additional language skills are a plus
About the role

Data is undeniably one of the world's most valuable resources. However, it is of little use if it is not in the right place or the right setup. We strive to place data where it is needed most and can be used to its fullest potential.

Become a valued member of our data integration team and help us build highly scalable data architectures and fast data pipelines for our multinational clients. Our use cases range from data warehouse and lakehouse architecture implementations to IoT and data streaming setups and sophisticated AI scenarios, in industries ranging from manufacturing, consumer products, and healthcare to banking and insurance, to name a few.

Our Tech Stack at a glance: Coding: SQL, Python; Data Pipeline: Matillion, VaultSpeed; Data Storage: Snowflake, Azure Synapse; DevOps: Azure DevOps, GitLab.
