
Data Integration Specialist

Next-Link

Barcelona

On-site

EUR 50.000 - 70.000

Full-time

Today

Job description

A leading data analytics firm in Barcelona is seeking a Global Data Factory Specialist to design and manage data integrations from multiple sources. You will ensure high-quality data by implementing and managing APIs and transformations. The ideal candidate has a bachelor's degree in Computer Science, strong experience in data integration, and proficiency in tools like AWS and PySpark. This is a full-time position offering a collaborative work environment.

Requirements

  • 5 years of experience in data management, data modeling, or data integration.
  • Fluent in English.
  • Proven experience working with pharma datasets and applications.

Responsibilities

  • Design and maintain robust data pipelines integrating data from diverse sources.
  • Implement API and FT integrations ensuring seamless data flow.
  • Communicate proactively with data consumers about pipeline incidents.

Skills

APIs
Data Integration Tools
Data Management
PySpark
SQL
AWS
Cloud Platforms
Data Transformation Techniques
Communication Skills

Education

Bachelor's degree in Computer Science or related field

Tools

AWS Glue
AWS Lambda
Amazon MWAA (managed Apache Airflow)
AWS Cloud9
AWS Step Functions

Job description

We are looking for a Global Data Factory Specialist (Data Integration Specialist) to join the client's Data & Analytics organization.

This role will play a key part in enabling data excellence across the company, contributing to the standardization and centralization of data processes worldwide.

As a Global Data Factory Specialist, you will be responsible for designing and managing end-to-end data integrations from multiple sources into clean, structured data layers. You will collaborate closely with global teams across the Data & Analytics organization to support the company's data-driven strategy, ensuring data quality, performance, and reliability.

MAIN TASKS
  • Design, develop, and maintain robust data pipelines integrating data from diverse sources into curated layers.
  • Implement and manage API and FT integrations, ensuring seamless data flow between systems.
  • Perform data cleansing, transformation, and enrichment to deliver high-quality, business-ready data.
  • Collaborate with data analysts, data engineers, and stakeholders to translate business requirements into technical solutions.
  • Monitor and optimize data pipelines for performance, cost, and scalability.
  • Communicate proactively with data consumers about pipeline execution, incidents, and remediation plans.
  • Maintain comprehensive documentation for all data integration processes and workflows.
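The cleansing and enrichment tasks above would typically be implemented in PySpark against the curated layer; as a minimal illustration, the same logic is sketched here in plain Python (record structure and field names such as `account_id` are hypothetical, not taken from the posting):

```python
# Sketch of a cleansing + enrichment step, plain-Python stand-in for the
# PySpark job described above. Field names are hypothetical.

def cleanse(records):
    """Trim string fields, drop rows missing the key, deduplicate by key."""
    seen, out = set(), []
    for row in records:
        row = {k: v.strip() if isinstance(v, str) else v for k, v in row.items()}
        key = row.get("account_id")
        if not key or key in seen:
            continue  # skip incomplete or duplicate rows
        seen.add(key)
        out.append(row)
    return out

def enrich(records, region_lookup):
    """Add a 'region' column from a reference lookup (enrichment)."""
    return [{**row, "region": region_lookup.get(row.get("country"), "UNKNOWN")}
            for row in records]

raw = [
    {"account_id": "A1 ", "country": "ES"},
    {"account_id": "A1",  "country": "ES"},  # duplicate after trimming
    {"account_id": "",    "country": "DE"},  # missing key
    {"account_id": "A2",  "country": "FR"},
]
clean = enrich(cleanse(raw), {"ES": "EMEA", "FR": "EMEA"})
print(clean)  # two rows survive, each with a 'region' column
```

In a real pipeline the output of a step like this would be written as Parquet to the curated layer and orchestrated via Glue or Airflow, per the tools listed above.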

Requirements
  • Bachelor's degree in Computer Science, Information Technology, or a related field.
  • 5 years of experience in data management, data modeling, or data integration.
  • Strong experience with APIs, FT integrations, and data integration tools.
  • Proficiency in PySpark, SQL, and cloud platforms such as AWS or Snowflake.
  • Hands-on experience with AWS services (Glue, Lambda, Airflow, Cloud9, Step Functions).
  • Solid knowledge of data transformation techniques and formats (e.g., Parquet).
  • Proven experience working with pharma datasets and applications (Veeva CRM, Adobe Campaign, Google Suite, Veeva Vault, MDG, etc.).
  • Excellent problem-solving skills, attention to detail, and technical judgment.
  • Fluent in English (spoken and written).
  • Strong collaboration, communication, and stakeholder management skills.
  • Results-oriented, organized, and capable of managing multiple priorities.

Employment Type: Full Time

Experience: 5 years

Vacancy: 1
