
Senior Data Engineer

Glow Services Corp

Madrid

On-site

EUR 50.000 - 70.000

Full-time

Posted today

Job description

A financial services company in Madrid is seeking a highly skilled Senior Data Engineer with at least 5 years of experience in Databricks and Lakehouse architecture. The ideal candidate will develop and optimize data architectures, implement CI/CD pipelines, and collaborate with cross-functional teams. Strong knowledge of ETL/ELT processes and cloud platforms is essential. This is a full-time position offering competitive compensation.

Requirements

  • Minimum of 5 years of experience as a Data Engineer or similar role.
  • Experience with consumer finance data models is essential.
  • Strong knowledge of data warehousing technologies.

Responsibilities

  • Design and optimize data architectures using Databricks.
  • Implement and maintain CI/CD pipelines for data solutions.
  • Collaborate with teams to understand and map data requirements.

Skills

Databricks
Lakehouse architecture
Python
CI/CD pipelines
Cloud platforms (AWS, Azure, GCP)
ETL/ELT processes
Power BI
Big data technologies

Education

Bachelor's degree in Computer Science

Tools

AWS
Azure
GCP
Snowflake
Redshift
Spark

Overview

We are seeking a highly skilled and experienced Senior Data Engineer with a minimum of 5 years working with Databricks and Lakehouse architecture to join our team in the consumer finance industry. The successful candidate will play a critical role in mapping requirements, designing, implementing, and maintaining scalable data solutions, and will ensure seamless integration and operation of CI/CD pipelines.

Key Responsibilities
  • Design, develop, and optimize robust data architectures using Databricks and Lakehouse principles to support large-scale and complex data analytics needs.
  • Implement and maintain CI/CD pipelines to ensure continuous integration and delivery of data solutions, ensuring data quality and operational efficiency.
  • Collaborate with cross-functional teams (ideally Finance) to understand data requirements, map data through distributed systems, and translate these into technical solutions that align with business objectives.
  • Manage and optimize data storage and retrieval systems to ensure performance and cost-effectiveness.
  • Develop, maintain, and document ETL/ELT processes for data ingestion, transformation, and loading using industry best practices.
  • Ensure data security and compliance, particularly within the context of financial data, adhering to relevant regulations and standards.
  • Troubleshoot and resolve any data-related issues, ensuring high availability and reliability of data systems.
  • Evaluate and incorporate new technologies and tools to improve data engineering practices and productivity.
  • Mentor junior data engineers and provide technical guidance to the wider team.
  • Contribute to the strategic planning of data architecture and infrastructure.
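To give candidates a concrete sense of the ETL/ELT work described above, the sketch below shows a typical validate-and-normalise transform step in plain Python. In a Databricks Lakehouse this would be written with PySpark DataFrames; all field names and business rules here are hypothetical examples, not taken from the role itself.

```python
from datetime import datetime

def transform(raw_records):
    """Illustrative ETL transform: validate and normalise raw loan
    records before loading them into a curated table.
    Field names and rules are hypothetical examples."""
    curated = []
    for rec in raw_records:
        try:
            amount = float(rec["amount"])                        # reject non-numeric amounts
            booked = datetime.strptime(rec["booked"], "%d/%m/%Y")
        except (KeyError, ValueError):
            continue                                             # quarantine malformed rows
        if amount <= 0:
            continue                                             # example rule: positive amounts only
        curated.append({
            "customer_id": rec["customer_id"].strip().upper(),   # canonical ID format
            "amount_eur": round(amount, 2),
            "booked_date": booked.date().isoformat(),            # ISO 8601 for downstream tools
        })
    return curated

raw = [
    {"customer_id": " c001 ", "amount": "1250.50", "booked": "03/02/2024"},
    {"customer_id": "c002", "amount": "oops", "booked": "04/02/2024"},   # malformed amount
    {"customer_id": "c003", "amount": "-10", "booked": "05/02/2024"},    # violates rule
]
print(transform(raw))
# → [{'customer_id': 'C001', 'amount_eur': 1250.5, 'booked_date': '2024-02-03'}]
```

The same shape (filter, standardise, load into a curated layer) carries over directly to a bronze-to-silver step in a medallion architecture.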
Required Qualifications and Experience
  • Bachelor’s degree in Computer Science, Information Technology, or a related field. A Master’s degree is a plus.
  • Minimum of 5 years of professional experience as a Data Engineer or in a similar role within the finance industry, demonstrating experience working with consumer finance data models.
  • Proficient in using Databricks for data engineering and analytics.
  • Strong experience with Lakehouse architecture and its optimization.
  • Highly proficient in programming languages such as Python.
  • Demonstrable expertise in implementing and managing CI/CD pipelines for data solutions.
  • Solid experience with cloud platforms (e.g., AWS, Azure, or GCP) and their data services.
  • Deep understanding of data warehousing concepts and technologies (e.g., Snowflake, Redshift).
  • Strong knowledge of ETL/ELT processes and tools.
  • Solid experience using Power BI or similar visualisation tools.
  • Experience working with big data technologies and frameworks (e.g., Spark).
  • Excellent problem-solving skills and a proactive approach to data engineering challenges.
  • Strong communication skills with the ability to articulate complex technical concepts to non-technical stakeholders.
Desirable Skills
  • Certifications in Databricks or cloud technologies.
  • Experience with machine learning pipelines and model deployment.
  • Knowledge of regulatory requirements in the finance industry, such as GDPR or PCI-DSS.
  • Experience with agile development methodologies, such as Scrum or Kanban.