
Data Developer (REMOTE)

Infinity Quest

Madrid

Remote

EUR 40.000 - 55.000

Full-time

Today


Job description

A leading consulting firm in Madrid is seeking a Data Developer to enhance data pipelines and support marketing systems. The ideal candidate has substantial ETL development experience, SQL proficiency, and strong analytical skills. The role involves collaborating on data architecture decisions and ensuring data integrity and performance. A competitive salary and growth opportunities are offered.

Requirements

  • 3+ years of experience in ETL development.
  • Strong SQL and database design skills.
  • Proven experience in optimizing ETL performance.

Responsibilities

  • Maintain and enhance data pipelines for marketing systems.
  • Collaborate on data architecture decisions.
  • Ensure data integrity and performance across multiple environments.

Skills

ETL development
SQL
Data integration
Analytical skills

Tools

Informatica PowerCenter
Teradata
Oracle
About the Role

We are seeking a Data Developer to join our consulting team supporting enterprise‑level marketing and data‑driven solutions. The ideal candidate will have a strong background in ETL development, data integration, and performance optimization, with the ability to collaborate on data architecture decisions. This role focuses on maintaining and enhancing data pipelines for marketing systems (e.g., Pega Marketing) and ensuring data integrity and performance across multiple environments.

Qualifications
  • 3+ years of experience in ETL development (Informatica PowerCenter, ODI or similar tools).
  • Strong SQL and database design skills (preferably in Teradata, Oracle, or similar enterprise databases).
  • Proven experience in optimizing ETL performance and troubleshooting data issues.
  • Understanding of data modeling, data warehousing, and data integration concepts.
  • Strong analytical and problem‑solving skills.

Preferred
  • Experience with Kafka or other streaming platforms.
  • Familiarity with cloud environments (AWS, Azure).
  • Experience with Databricks.
  • Knowledge of version control tools (GitHub) and CI/CD processes.