Senior Data Engineer (Python)

Parser

Vitoria

On-site

EUR 40,000 - 70,000

Full-time

Posted 5 days ago

Vacancy overview

Join Parser as a Senior Data Engineer and play a pivotal role in shaping data flows and ensuring data governance within a dynamic cloud environment. Leverage your expertise in building ETL pipelines, collaborating with teams, and implementing best practices to optimize data solutions. A competitive compensation package awaits you in a multicultural tech community.

Benefits

Flexible and remote working environment
Medical insurance
Competitive compensation package

Requirements

  • Minimum 4 years of experience in data engineering or a related role.
  • Proficiency in SQL and Python.
  • Experience with data pipeline orchestration tools.

Responsibilities

  • Build, maintain, and optimize scalable ETL/ELT pipelines.
  • Monitor data availability and perform regular consistency checks.
  • Collaborate with cross-functional teams for data alignment.

Skills

SQL
Python
Data Engineering
Data Integration
Problem Solving
Communication

Education

Bachelor's degree in Computer Science

Tools

AWS
Dagster
PostgreSQL

Job description

Join to apply for the Senior Data Engineer (Python) role at Parser

We are seeking a highly skilled Data Engineer to focus on maintaining data streams and ETL pipelines within a cloud-based environment. The ideal candidate will have experience in building, monitoring, and optimizing data pipelines, ensuring data consistency, and proactively collaborating with upstream and downstream teams to enable seamless data flow across the organization.

In this role, you will not only troubleshoot and resolve pipeline issues but also contribute to enhancing data architecture, implementing best practices in data governance and security, and ensuring the scalability and performance of data solutions. You will play a critical role in understanding the business context of data, supporting analytics and decision-making by collaborating with data scientists, analysts, and other key stakeholders.

This position requires presence at the client's office in London between 25% and 50% of the time each month.

Key Responsibilities:

  • Build, maintain, and optimize scalable ETL/ELT pipelines using orchestration tools such as Dagster.
  • Ensure high data availability, reliability, and consistency through rigorous data validation and monitoring practices.
  • Collaborate with cross-functional teams to align data pipeline requirements with business objectives and technical feasibility.
  • Automate data workflows to improve operational efficiency and reduce manual intervention.
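
As a purely illustrative sketch (not Parser's or the client's actual code; the table shape and validity rule below are hypothetical), the extract-transform-load pattern these responsibilities describe can look like this in plain Python:

```python
# Hypothetical minimal ETL sketch. A production pipeline would read from
# cloud storage (e.g., S3), write to a warehouse (e.g., PostgreSQL), and
# be scheduled by an orchestrator such as Dagster.

def extract():
    # Stand-in for reading raw records from a source system.
    return [{"order_id": 1, "amount": "10.50"},
            {"order_id": 2, "amount": "-3.00"},
            {"order_id": 3, "amount": "7.25"}]

def transform(rows):
    # Cast types and drop rows that fail a basic validity rule.
    cleaned = []
    for row in rows:
        amount = float(row["amount"])
        if amount >= 0:
            cleaned.append({"order_id": row["order_id"], "amount": amount})
    return cleaned

def load(rows):
    # Stand-in for writing to a target table; returns rows written.
    return len(rows)

loaded = load(transform(extract()))
print(loaded)  # prints 2 (the negative-amount row is rejected)
```

The orchestrator's job is then to schedule these steps, track their dependencies, and retry or alert on failure.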

Data Integrity & Monitoring

  • Perform regular data consistency checks, identifying and resolving anomalies or discrepancies.
  • Implement robust monitoring frameworks to proactively detect and address pipeline failures or performance issues.
  • Work closely with upstream teams to align data ingestion strategies and optimize data handoffs.
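
One common form of such a consistency check is reconciling row counts between a source and a target table. A minimal, hypothetical sketch (the function name and tolerance are assumptions, not part of the posting):

```python
# Hypothetical row-count reconciliation -- one simple example of the
# data consistency checks described above.

def counts_match(source_rows: int, target_rows: int, tolerance: int = 0) -> bool:
    # A discrepancy beyond the tolerance signals a failed handoff;
    # a production check would also raise an alert (e.g., to on-call).
    return abs(source_rows - target_rows) <= tolerance

print(counts_match(1_000, 1_000))  # prints True  -> counts reconcile
print(counts_match(1_000, 990))    # prints False -> investigate the gap
```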

Collaboration & Stakeholder Management

  • Partner with data scientists, analysts, and business teams to provide trusted, accurate, and well-structured data for analytics and reporting.
  • Communicate complex data concepts in a clear and actionable manner to non-technical stakeholders.
  • Develop and maintain documentation to ensure knowledge sharing and continuity.

Infrastructure & Security Management

  • Maintain and support cloud-based data platforms such as AWS, ensuring cost-efficient and scalable solutions.
  • Implement best practices in data governance, compliance, and security, adhering to industry standards.
  • Continuously improve data processing frameworks for enhanced performance and resilience.

Continuous Improvement & Business Context Mastery

  • Gain a deep understanding of the business meaning behind data to drive insights and strategic decisions.
  • Identify opportunities to enhance data models and workflows, ensuring they align with evolving business needs.
  • Stay updated with emerging data technologies and advocate for their adoption when relevant.

Qualifications:

Education & Experience:

  • Bachelor's degree in Computer Science, Data Science, or a related field.
  • Minimum 4 years of experience in data engineering, data integration, or a related role.

Technical Skills:

  • Proficiency with SQL (e.g., PostgreSQL, MySQL) and NoSQL databases (e.g., MongoDB), with hands-on experience in query optimization and data modelling.
  • Strong programming skills in Python (preferred), with a focus on building scalable data solutions.
  • Experience with data pipeline orchestration tools such as Dagster or similar.
  • Familiarity with cloud platforms (e.g. AWS) and their data services (e.g., S3, Redshift, Snowflake).
  • Understanding of data warehousing concepts and experience with modern warehousing solutions.
  • Experience with GitHub Actions (or similar) and implementing CI/CD pipelines for data workflows and version-controlled deployments.

Soft Skills:

  • Strong problem-solving skills with keen attention to detail and a proactive mindset.
  • Ability to work in a collaborative, fast-paced environment, handling multiple stakeholders effectively.
  • Excellent communication skills with the ability to translate technical findings into business insights.

Nice-to-Have Qualifications:

  • Experience with streaming technologies such as Kafka or similar.
  • Familiarity with containerization and orchestration (Docker and ECS) for data workflows.
  • Exposure to BI tools such as Tableau or Power BI for data visualization.
  • Understanding of machine learning pipelines and how they integrate with data engineering processes.

What We'll Offer You in Return:

  • The chance to join an organisation with triple-digit growth that is changing the paradigm on how digital solutions are built.
  • The opportunity to form part of an amazing, multicultural community of tech experts.
  • A highly competitive compensation package.
  • A flexible and remote working environment.
  • Medical insurance.

Come and join our #ParserCommunity.

Seniority level

  • Mid-Senior level

Employment type

  • Full-time

Job function

  • Information Technology

Industries

  • IT Services and IT Consulting
