
Data Engineer, Parameta Solutions (Madrid)

TP ICAP

Madrid

On-site

COP 206.944.000 - 321.914.000

Full-time

Posted today

Vacancy summary

A leading global data solutions provider is seeking a Data Engineer in Madrid to build and maintain scalable data systems. This role involves designing data pipelines, managing cloud infrastructure, and optimizing CI/CD processes. Ideal candidates should have strong Python and SQL proficiency, hands-on experience with AWS, and excellent collaboration skills. Join a dynamic team to contribute to innovative data products and services.

Qualifications

  • Demonstrated experience in data engineering or a closely related role.
  • Proficiency in Python and strong SQL skills.
  • Experience building API-driven data platforms.
  • Hands-on experience with cloud platforms like AWS.
  • Strong understanding of ETL processes and event streaming.

Responsibilities

  • Design and maintain batch and streaming data pipelines.
  • Develop scalable data warehousing solutions using Snowflake.
  • Architect and manage cloud-based infrastructure for efficiency.
  • Enhance CI/CD pipelines to streamline development.
  • Monitor health and performance of data applications.

Skills

Data engineering
Python
SQL
API development
AWS
Snowflake
Kubernetes
Airflow
ETL processes
Linux

Education

Bachelor’s degree in Computer Science, Engineering, Mathematics, or related field

Tools

Apache Airflow
Apache Kafka
Apache Flink
Prometheus
Grafana
CloudWatch

Job description

Role Overview

Parameta Solutions is seeking a skilled Data Engineer to join our growing global team. Based in Madrid, you will be a foundational member of our new hub, helping to shape a high-performing engineering function. In this role, you will design, build, and maintain scalable systems and infrastructure to process and analyse complex, high-volume datasets. Working closely with data scientists, analysts, product managers, and other stakeholders, you will translate business needs into robust technical solutions that power innovative data products and services.

Key Responsibilities

Design, build, and maintain performant batch and streaming data pipelines to support both real-time market data and large-scale batch processing (Apache Airflow, Apache Kafka, Apache Flink).

Develop scalable data warehousing solutions using Snowflake and other modern platforms.

Architect and manage cloud-based infrastructure (AWS or GCP) to ensure resilience, scalability, and efficiency.

Enhance and optimise CI/CD pipelines (Jenkins, GitLab) to streamline development, testing, and deployment.

Monitor and maintain the health and performance of data applications, pipelines, and databases using observability tools (Prometheus, Grafana, CloudWatch).

Partner with stakeholders across functions to deliver reliable data services and enable analytical product development.

Contribute actively in agile ceremonies (stand-ups, sprint planning, retrospectives) to align on priorities and delivery.

Experience & Competencies

Essential

Demonstrated professional experience in data engineering or a closely related role.

Proficiency in Python (with frameworks such as Pandas, Dask, PySpark) and strong SQL skills.

Experience building and scaling API-driven data platforms (e.g., FastAPI).

Hands-on experience with AWS, Snowflake, Kubernetes, and Airflow.

Knowledge of monitoring and alerting systems (Prometheus, Grafana, CloudWatch).

Strong understanding of ETL processes and event streaming (Kafka, Flink, etc.).

Comfortable with Linux and command-line operations.

Excellent communication skills, with the ability to collaborate across technical and business teams.

Bachelor’s degree in Computer Science, Engineering, Mathematics, or related technical field.

Desired

Exposure to financial market data or capital markets environments.

Knowledge of additional programming languages such as Java, C#, or C++.

Band & Level

Professional / 5

