
Data Engineer Associate

Metyis AG

Barcelona

On-site

EUR 45,000 - 65,000

Full-time

Posted 19 days ago


Job summary

A leading international firm is expanding its data team in Barcelona and is seeking experienced data engineers. The role involves building data pipelines, collaborating with stakeholders, and leading projects that drive impactful business change. Candidates should have a strong background in data solutions and relevant technologies.

Requirements

  • 4+ years of hands-on experience building data pipelines.
  • Advanced experience with Azure services and related tools.
  • Strong knowledge of SQL and Python for complex data tasks.

Responsibilities

  • Implement the technical roadmap for clients and maintain data platforms.
  • Collaborate on digital solutions and ensure effective integration.
  • Assist in building data insights and participate in project activities.

Skills

Data Governance
Data Quality Implementation
Stakeholder Management
Python
SQL
Data Pipelines
CI/CD Workflows
Azure Services
Data Visualization

Education

Bachelor's degree in Engineering
Master's degree (a plus)

Tools

Azure Data Factory
Databricks
Airflow
Kubernetes

Job description

Metyis is growing! We are looking for data engineers with 4+ years of experience to join our international data team based in Barcelona.


What we offer:
  • Interact with senior stakeholders at our clients on a regular basis to drive their business towards impactful change.

  • Become the go-to person for end-to-end data handling, management, and analytics processes.

  • Lead your team in creating pipelines for data management, data visualization, and analytics products, including automated services and APIs.

  • Work with Data Scientists throughout the data lifecycle — acquisition, exploration, cleaning, integration, analysis, interpretation, and visualization.

  • Become part of a fast-growing, international, and diverse team.

Who we are:

Metyis is a global, forward-thinking firm operating across various industries, delivering Big Data, Digital Commerce, Marketing & Design solutions, and Advisory services. Our partnership model aims for long-lasting impact and growth for our business partners and clients through extensive execution capabilities.

Our collaborative environment includes highly skilled multidisciplinary experts, encouraging innovation and creativity. At Metyis, you can speak your mind and develop your ideas. Join us to achieve great things with a team that supports your growth.

We are Metyis. Partners for Impact.

What you will do
  • Contribute to implementing the technical roadmap for our clients, aligning with overall technology and architecture standards.

  • Support the development and maintenance of data platforms and pipelines, ensuring effective integration of technical and non-technical components to meet business needs.

  • Collaborate on digital solutions using Python, Spark, RESTful APIs, and Microsoft Azure Cloud services.

  • Follow best practices and established frameworks and coding standards to ensure consistency.

  • Learn from senior engineers through code reviews and technical discussions to enhance your skills.

  • Assist in building data insights and analytics components, such as dashboards and data transformations, often collaborating with data science and business teams.

  • Participate in project activities, helping evaluate business requirements and translating them into technical tasks with experienced team members.
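To give candidates a flavor of the day-to-day work described above, here is a minimal, stdlib-only sketch of a batch pipeline step: cleaning raw records and applying a simple data-quality gate. It is illustrative only; real client work would run on Spark/Databricks orchestrated by Azure Data Factory or Airflow, and every name below is hypothetical.

```python
# Illustrative sketch of a pipeline step: normalize raw records,
# then drop any record that fails a basic quality rule.

def clean(record: dict) -> dict:
    """Normalize one raw record (coerce id, trim/uppercase country, cast amount)."""
    return {
        "id": int(record["id"]),
        "country": record["country"].strip().upper(),
        "amount": float(record["amount"]),
    }

def run_pipeline(raw_records: list[dict]) -> list[dict]:
    """Clean all records and keep only those passing the quality gate."""
    cleaned = [clean(r) for r in raw_records]
    return [r for r in cleaned if r["amount"] >= 0]  # quality gate: no negatives

raw = [
    {"id": "1", "country": " es ", "amount": "10.5"},
    {"id": "2", "country": "DE", "amount": "-3"},  # rejected by the gate
]
result = run_pipeline(raw)
print(result)  # [{'id': 1, 'country': 'ES', 'amount': 10.5}]
```

The same shape (clean, validate, filter) scales up in PySpark, where the quality gate would typically be expressed with a framework such as Great Expectations.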

What you’ll bring
  • Bachelor’s degree in Engineering, Computer Science, Mathematics, Economics, or related fields.

  • Master’s degree is a plus.

  • 4+ years of hands-on experience building data pipelines and solutions at production scale.

  • Advanced experience with Azure services including Data Factory, Databricks, Synapse, Azure Functions, and Logic Apps.

  • Strong knowledge of relational databases, data lakes, data vault modeling, and warehousing best practices.

  • Proficient in SQL for complex data transformation and profiling.

  • Strong experience with Python (including PySpark) and scripting languages like Bash.

  • Experience implementing CI/CD workflows and automated testing frameworks (unit, integration).

  • Hands-on experience with orchestration and infrastructure tools like Airflow, Kubernetes, and Great Expectations.

  • Familiarity with modern data architectures such as Lambda and Kappa.

  • Solid experience in data governance and data quality implementation.

  • Customer-centric mindset and passion for delivering high-quality digital products.

  • Ability to work independently in ambiguous environments.

  • Strong stakeholder management and reporting skills.
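As a concrete illustration of the "SQL for complex data transformation and profiling" requirement, the sketch below profiles a small table with a grouped aggregation. It uses Python's standard-library sqlite3 purely as a stand-in for a warehouse engine such as Synapse; the table and column names are hypothetical.

```python
# Illustrative only: SQL profiling/transformation against an in-memory
# sqlite3 database standing in for a data warehouse.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, amount REAL);
    INSERT INTO sales VALUES ('north', 10), ('north', 20), ('south', 5);
""")

# Per-region row count and total: the kind of aggregation a
# transformation or profiling step in a pipeline would run.
rows = conn.execute("""
    SELECT region, COUNT(*) AS n, SUM(amount) AS total
    FROM sales
    GROUP BY region
    ORDER BY region
""").fetchall()
print(rows)  # [('north', 2, 30.0), ('south', 1, 5.0)]
```

In production the same query pattern would run against lake or warehouse tables, with the results feeding dashboards or downstream data-quality checks.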
