
Data Engineer, Parameta Solutions (Madrid)

TP ICAP

Madrid

On-site

EUR 45,000 - 65,000

Full-time

Posted 13 days ago


Vacancy description

A leading data solutions provider is seeking a skilled Data Engineer in Madrid. The ideal candidate will design and maintain scalable systems for processing and analyzing large datasets, collaborating closely with data scientists and product managers. Required skills include proficiency in Python and SQL, hands-on experience with AWS, and knowledge of data pipelines. Candidates should have a Bachelor's degree in a relevant field and strong communication skills. This role offers the opportunity to contribute to innovative data products in a dynamic environment.

Requirements

  • Professional experience in data engineering or closely related role.
  • Excellent communication skills for collaboration across teams.
  • Comfortable with Linux and command-line operations.

Responsibilities

  • Design, build, and maintain data pipelines for real-time market data.
  • Develop data warehousing solutions using Snowflake.
  • Manage cloud-based infrastructure (AWS or GCP).
  • Enhance CI/CD pipelines for streamlined development.
  • Monitor health and performance of data applications and pipelines.
  • Collaborate with stakeholders to deliver reliable data services.

Skills

Python (Pandas, Dask, PySpark)
SQL
API-driven data platforms (FastAPI)
AWS
Snowflake
Kubernetes
Airflow
ETL processes
Event streaming (Kafka, Flink)
Monitoring systems (Prometheus, Grafana, CloudWatch)

Education

Bachelor’s degree in Computer Science, Engineering, Mathematics or related field

Job description

Role Overview

Parameta Solutions is seeking a skilled Data Engineer to join our growing global team. Based in Madrid, you will be a foundational member of our new hub, helping to shape a high-performing engineering function. In this role, you will design, build, and maintain scalable systems and infrastructure to process and analyse complex, high-volume datasets. Working closely with data scientists, analysts, product managers, and other stakeholders, you will translate business needs into robust technical solutions that power innovative data products and services.

Key Responsibilities
  • Design, build, and maintain performant batch and streaming data pipelines to support both real-time market data and large-scale batch processing (Apache Airflow, Apache Kafka, Apache Flink); a minimal example of the batch side is sketched after this list.
  • Develop scalable data warehousing solutions using Snowflake and other modern platforms.
  • Architect and manage cloud‑based infrastructure (AWS or GCP) to ensure resilience, scalability, and efficiency.
  • Enhance and optimise CI/CD pipelines (Jenkins, GitLab) to streamline development, testing, and deployment.
  • Monitor and maintain the health and performance of data applications, pipelines, and databases using observability tools (Prometheus, Grafana, CloudWatch).
  • Partner with stakeholders across functions to deliver reliable data services and enable analytical product development.
  • Contribute actively in agile ceremonies (stand‑ups, sprint planning, retrospectives) to align on priorities and delivery.
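
To give a concrete flavour of the orchestration work described in the list above, the following is a minimal sketch of a daily batch pipeline, assuming Apache Airflow 2.x. The DAG id, schedule, and task bodies are hypothetical placeholders rather than details taken from the posting.

# Minimal sketch of a daily batch pipeline (assumes Apache Airflow 2.x).
# The DAG id, schedule, and task bodies are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_market_data(**context):
    # Placeholder: pull the previous day's market data from an upstream source.
    print(f"extracting market data for {context['ds']}")


def load_to_warehouse(**context):
    # Placeholder: stage the extracted files and load them into the warehouse.
    print(f"loading partition {context['ds']} into the warehouse")


with DAG(
    dag_id="market_data_daily",          # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_market_data)
    load = PythonOperator(task_id="load", python_callable=load_to_warehouse)

    extract >> load                      # load runs only after extraction succeeds

Streaming workloads in this stack would more typically run as Kafka or Flink jobs; the sketch only illustrates the batch orchestration side.
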
Experience & Competencies
Essential
  • Demonstrated, professional experience in data engineering or a closely related role.
  • Proficiency in Python (with frameworks such as Pandas, Dask, PySpark) and strong SQL skills.
  • Experience building and scaling API-driven data platforms (e.g., FastAPI); a minimal example is sketched after these lists.
  • Hands‑on experience with AWS, Snowflake, Kubernetes, and Airflow.
  • Knowledge of monitoring and alerting systems (Prometheus, Grafana, CloudWatch).
  • Strong understanding of ETL processes and event streaming (Kafka, Flink, etc.).
  • Comfortable with Linux and command‑line operations.
  • Excellent communication skills, with the ability to collaborate across technical and business teams.
  • Bachelor’s degree in Computer Science, Engineering, Mathematics, or related technical field.
Desired
  • Exposure to financial market data or capital markets environments.
  • Knowledge of additional programming languages such as Java, C#, or C++.
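
As an illustration of the API-driven data platform experience listed under Essential, below is a minimal sketch of a read-only quote endpoint using FastAPI. The route, model, and in-memory store are hypothetical stand-ins for a real backing service such as a warehouse table or cache.

# Minimal sketch of an API-driven data endpoint (FastAPI), as referenced in
# the Essential list. Route, model, and sample data are hypothetical.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="market-data-api")   # hypothetical service name


class Quote(BaseModel):
    symbol: str
    price: float


# Stand-in for a real backing store (e.g. a warehouse table or a cache).
_QUOTES = {"EURUSD": Quote(symbol="EURUSD", price=1.0842)}


@app.get("/quotes/{symbol}", response_model=Quote)
def get_quote(symbol: str) -> Quote:
    # Return the latest quote for a symbol, or 404 if it is unknown.
    quote = _QUOTES.get(symbol.upper())
    if quote is None:
        raise HTTPException(status_code=404, detail="unknown symbol")
    return quote

Assuming the module is saved as quotes.py, it can be served locally with uvicorn quotes:app --reload and queried via GET /quotes/EURUSD.
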
Band & Level

Professional / 5
