
Data Engineer

MIGx AG

Torrejón de Ardoz

On-site

EUR 40,000 - 80,000

Full-time

18 days ago


Vacancy description

A global consulting company is looking for a data engineer passionate about working with structured and unstructured data. In this exciting position you will join a team that builds data products for large enterprises, implementing Data Mesh architectures and contributing to innovative projects. We offer a hybrid work environment, flexibility and professional development opportunities, all within an employee-centric culture. If the world of data excites you and you want to make a difference, this is your opportunity.

Benefits

Hybrid work model
25 holiday days per year
Free English classes
Professional development
Employee-centric culture
Training programs

Experience

  • 3+ years of experience in similar roles, with advanced Python and SQL skills.
  • Experience building ETL pipelines and using Azure Data Factory.

Responsibilities

  • Develop ETL pipelines in Python and Azure Data Factory.
  • Work on systems integration and process automation.

Skills

Python
ETL products
Azure Data Factory
Databricks
Snowflake
SQL
DevOps
Data Modeling
Agile development

Education

BSc in Computer Science
MSc in Computer Science

Tools

Azure DevOps
ARM templates
Terraform

Job description

MIGx is a global consulting company focused exclusively on the healthcare and life science industries, which have particularly demanding quality and regulatory requirements. We manage challenges and solve problems for our clients in areas such as compliance and business processes, among many others.

MIGx's interdisciplinary teams in Switzerland, Spain and Georgia handle projects in the fields of M&A, integration, applications, data platforms, processes, IT management, digital transformation, managed services and compliance.

About the profile

We are looking for a data enthusiast who enjoys transforming and organizing structured and unstructured data, ready to work on state-of-the-art data fabric and data mesh projects.

Project Description

In this role you will work as a Data Engineer on complex projects involving multiple data sources and formats. You will be part of a larger team at MIGx responsible for Data Services and for building Data Products for our customers (mid-size to large enterprises), with the opportunity to keep growing in everything data-related. You will help build the customer's overall Data Mesh architecture while focusing initially on a specific visualization project, with more to follow.

Responsibilities:

  • Develop ETL pipelines in Python and Azure Data Factory, as well as their DevOps CI/CD pipelines.
  • Perform software engineering and systems integration via REST APIs and other standard interfaces.
  • Work with a team of professional engineers to develop data pipelines, automate processes, build and deploy infrastructure as code, and manage the designed solutions across multi-cloud systems.
  • Participate in agile ceremonies, weekly demos and similar activities.
  • Communicate your daily commitments.
  • Configure and connect different data sources, especially SQL databases.
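
As a rough illustration of the first and last responsibilities above, a minimal ETL pipeline in Python might look like the sketch below. This is a hedged example, not MIGx's actual stack: SQLite stands in for the SQL sources named in the posting, and the table schema, function names and sample data are all hypothetical.

```python
# Minimal ETL sketch: extract rows from a SQL source, cleanse them,
# and load them into a curated table. SQLite stands in for the SQL
# databases mentioned in the posting; the schema is hypothetical.
import sqlite3

def extract(conn):
    """Pull raw rows from the landing table."""
    return conn.execute("SELECT id, name, amount FROM raw_orders").fetchall()

def transform(rows):
    """Cleanse: drop rows with missing names, normalize casing and whitespace."""
    return [(rid, name.strip().title(), amount)
            for rid, name, amount in rows
            if name and name.strip()]

def load(conn, rows):
    """Write cleansed rows to the curated table."""
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
    conn.commit()

def run_pipeline(conn):
    load(conn, transform(extract(conn)))

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE raw_orders (id INTEGER, name TEXT, amount REAL)")
    conn.execute("CREATE TABLE orders (id INTEGER, name TEXT, amount REAL)")
    conn.executemany("INSERT INTO raw_orders VALUES (?, ?, ?)",
                     [(1, "  alice ", 10.0), (2, "", 5.0), (3, "BOB", 7.5)])
    run_pipeline(conn)
    # The empty-name row is dropped, leaving two cleansed rows.
    print(conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0])
```

In production this extract/transform/load split would typically map onto Azure Data Factory activities or Databricks notebooks rather than plain functions.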

Requirements:

Must have:

  • Studies in Computer Science (BSc and/or MSc desired).
  • 3+ years of practical experience in similar roles.
  • Proficient with ETL products (Spark, Databricks, Snowflake, Azure Data Factory, etc.).
  • Proficient with Azure Data Factory.
  • Proficient with Databricks / Snowflake and PySpark.
  • Proficient developing DevOps CI/CD pipelines.
  • Proficient with Azure DevOps Classic / YAML pipelines.
  • Proficient with Azure cloud services: ARM templates, API Management, App Service, VMs, AKS, Gateways.
  • Advanced SQL knowledge and background in relational databases such as MS SQL Server, Oracle, MySQL and PostgreSQL.
  • Understanding of landing and staging areas, data cleansing, data profiling, data security and data architecture concepts (DWH, Data Lake, Delta Lake / Lakehouse, Data Mart).
  • Data modeling skills and knowledge of modeling tools.
  • Advanced programming skills in Python.
  • Ability to work in an agile development environment (Scrum, Kanban).
  • Understanding of CI/CD principles and best practices.

Nice to have:

  • Proficient with .NET C#.
  • Terraform.
  • Bash / PowerShell.
  • Data Vault modeling.
  • Familiarity with GxP.
  • Programming skills in other languages.
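
To make the data-profiling and cleansing concepts in the list above concrete, here is a small stdlib-only sketch of the kind of sanity check run on a staging area before cleansing. The column names, sample records and report shape are illustrative assumptions, not a description of MIGx's tooling.

```python
# Tiny data-profiling sketch: count nulls and distinct values per column,
# the kind of check run on staged data before cleansing rules are applied.
# The column names and sample records below are hypothetical.

def profile(records, columns):
    """Return per-column null counts and distinct-value counts."""
    report = {}
    for col in columns:
        values = [r.get(col) for r in records]
        report[col] = {
            "nulls": sum(v is None for v in values),
            "distinct": len({v for v in values if v is not None}),
        }
    return report

if __name__ == "__main__":
    staged = [
        {"id": 1, "country": "CH", "amount": 10.0},
        {"id": 2, "country": None, "amount": 5.0},
        {"id": 3, "country": "ES", "amount": 10.0},
    ]
    print(profile(staged, ["country", "amount"]))
```

A real profiling pass would run as a query against the staging tables (or via PySpark on Databricks), but the idea is the same: quantify nulls and cardinality per column so cleansing rules can be justified.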

What we offer:

  • Hybrid work model and flexible working schedule that would suit night owls and early birds.
  • 25 holiday days per year.
  • Free English classes.
  • Opportunities for career development and the chance to shape the company's future.
  • An employee-centric culture directly inspired by employee feedback: your voice is heard and your perspective valued.
  • Different training programs to support your personal and professional development.
  • Work in a fast-growing, international company.
  • Friendly atmosphere and supportive Management team.