
Senior Data Engineer (Python)

Parser Limited

Spain

Hybrid

EUR 40,000 - 70,000

Full-time

4 days ago

Vacancy description

Join a rapidly growing company as a Senior Data Engineer where you will be responsible for developing and maintaining scalable data pipelines. You will work in a cloud environment, collaborating with data scientists and analysts to enhance data accessibility and governance. This position demands strong technical skills in SQL and Python, with an opportunity for a flexible, remote working environment while engaging with a multicultural tech community.

Benefits

Flexible working environment
Medical insurance
Competitive compensation package

Requirements

  • Minimum 4 years of experience in data engineering or a related field.
  • Proficiency with SQL and NoSQL databases.
  • Experience with data orchestration tools and cloud platforms.

Responsibilities

  • Build, maintain, and optimize ETL/ELT pipelines.
  • Automate workflows, ensuring data integrity and security.
  • Collaborate with teams for effective data utilization.

Skills

SQL
Python
Data modelling
Problem-solving

Education

Bachelor's degree in Computer Science, Data Science, or a related field

Tools

AWS
Dagster
GitHub Actions

Job description

Senior Data Engineer

We are seeking a highly skilled Data Engineer to focus on maintaining data streams and ETL pipelines within a cloud-based environment. The ideal candidate will have experience in building, monitoring, and optimizing data pipelines, ensuring data consistency, and proactively collaborating with upstream and downstream teams to enable seamless data flow across the organization.

In this role, you will not only troubleshoot and resolve pipeline issues but also contribute to enhancing data architecture, implementing best practices in data governance and security, and ensuring the scalability and performance of data solutions. You will play a critical role in understanding the business context of data, supporting analytics and decision-making by collaborating with data scientists, analysts, and other key stakeholders.

This position requires on-site presence at the client's office in London between 25% and 50% of the time each month.

Key Responsibilities:

Data Pipeline Development & Maintenance
Build, maintain, and optimize scalable ETL/ELT pipelines using tools such as Dagster or similar.
Ensure high data availability, reliability, and consistency through rigorous data validation and monitoring practices.
Collaborate with cross-functional teams to align data pipeline requirements with business objectives and technical feasibility.
Automate data workflows to improve operational efficiency and reduce manual intervention.
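The pipeline work described above can be sketched as a minimal ETL step in plain Python (standard library only; the table and column names are illustrative, not taken from the posting):

```python
import csv
import io
import sqlite3

# Illustrative source data; in practice this would come from an upstream system.
RAW_CSV = """order_id,amount
1,19.99
2,5.00
3,42.50
"""

def extract(raw: str) -> list[dict]:
    """Extract: parse raw CSV into rows."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[tuple]:
    """Transform: cast types and drop rows with non-positive amounts."""
    out = []
    for row in rows:
        amount = float(row["amount"])
        if amount > 0:
            out.append((int(row["order_id"]), amount))
    return out

def load(rows: list[tuple], conn: sqlite3.Connection) -> int:
    """Load: upsert into the target table and return the resulting row count."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (order_id INTEGER PRIMARY KEY, amount REAL)"
    )
    conn.executemany("INSERT OR REPLACE INTO orders VALUES (?, ?)", rows)
    return conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]

conn = sqlite3.connect(":memory:")
loaded = load(transform(extract(RAW_CSV)), conn)
print(loaded)  # 3
```

In an orchestrator such as Dagster, each of these functions would typically become a separate asset or op so runs can be scheduled, retried, and observed individually.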

Data Integrity & Monitoring
Perform regular data consistency checks, identifying and resolving anomalies or discrepancies.
Implement robust monitoring frameworks to proactively detect and address pipeline failures or performance issues.
Work closely with upstream teams to align data ingestion strategies and optimize data handoffs.
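A data consistency check of the kind described above can be as simple as reconciling row counts and an order-insensitive checksum between two sides of a handoff. This is a minimal sketch; the source and target snapshots are hypothetical:

```python
import hashlib

# Hypothetical snapshots of the same dataset on two sides of a handoff
# (e.g. rows as ingested vs. rows as landed in the warehouse).
source_rows = [(1, "alice"), (2, "bob"), (3, "carol")]
target_rows = [(1, "alice"), (2, "bob"), (3, "carol")]

def checksum(rows) -> str:
    """Order-insensitive checksum of a row set, for cheap consistency checks."""
    digest = hashlib.sha256()
    for row in sorted(rows):
        digest.update(repr(row).encode())
    return digest.hexdigest()

def check_consistency(source, target) -> list[str]:
    """Return a list of detected discrepancies; an empty list means consistent."""
    issues = []
    if len(source) != len(target):
        issues.append(f"row count mismatch: {len(source)} vs {len(target)}")
    if checksum(source) != checksum(target):
        issues.append("checksum mismatch: row contents differ")
    return issues

issues = check_consistency(source_rows, target_rows)
print(issues)  # [] when the two sides agree
```

In production such checks would usually run on a schedule and feed an alerting system rather than print to stdout.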

Collaboration & Stakeholder Management
Partner with data scientists, analysts, and business teams to provide trusted, accurate, and well-structured data for analytics and reporting.
Communicate complex data concepts in a clear and actionable manner to non-technical stakeholders.
Develop and maintain documentation to ensure knowledge sharing and continuity.

Infrastructure & Security Management
Maintain and support cloud-based data platforms such as AWS, ensuring cost-efficient and scalable solutions.
Implement best practices in data governance, compliance, and security, adhering to industry standards.
Continuously improve data processing frameworks for enhanced performance and resilience.

Continuous Improvement & Business Context Mastery
Gain a deep understanding of the business meaning behind data to drive insights and strategic decisions.
Identify opportunities to enhance data models and workflows, ensuring they align with evolving business needs.
Stay updated with emerging data technologies and advocate for their adoption when relevant.

Qualifications:

Education & Experience:
Bachelor's degree in Computer Science, Data Science, or a related field.
Minimum 4 years of experience in data engineering, data integration, or a related role.

Technical Skills:
Proficiency with SQL (e.g., PostgreSQL, MySQL) and NoSQL databases (e.g., MongoDB), with hands-on experience in query optimization and data modelling.
Strong programming skills in Python (preferred), with a focus on building scalable data solutions.
Experience with data pipeline orchestration tools such as Dagster or similar.
Familiarity with cloud platforms (e.g. AWS) and their data services (e.g., S3, Redshift, Snowflake).
Understanding of data warehousing concepts and experience with modern warehousing solutions.
Experience with GitHub Actions (or similar) and implementing CI/CD pipelines for data workflows and version-controlled deployments.
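A CI/CD check for a data workflow often amounts to a small validation script that the pipeline (e.g. a GitHub Actions job) runs on every push, failing the build if the data contract is violated. The schema and rows below are purely illustrative:

```python
# A minimal data-quality gate that a CI workflow could run on every push;
# the expected schema and the sample batch are illustrative assumptions.
EXPECTED_COLUMNS = {"order_id", "amount", "created_at"}

sample_batch = [
    {"order_id": 1, "amount": 19.99, "created_at": "2024-01-01"},
    {"order_id": 2, "amount": 5.00, "created_at": "2024-01-02"},
]

def validate_batch(rows: list[dict]) -> list[str]:
    """Return schema/value violations; an empty list means the gate passes."""
    errors = []
    for i, row in enumerate(rows):
        missing = EXPECTED_COLUMNS - row.keys()
        if missing:
            errors.append(f"row {i}: missing columns {sorted(missing)}")
        elif row["amount"] <= 0:
            errors.append(f"row {i}: non-positive amount {row['amount']}")
    return errors

errors = validate_batch(sample_batch)
print(errors)  # [] -> the CI job exits successfully
```

Wiring this into GitHub Actions is then just a workflow step that runs the script and relies on a non-zero exit code (e.g. via an assertion or `sys.exit`) to fail the build.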

Soft Skills:
Strong problem-solving skills with keen attention to detail and a proactive mindset.
Ability to work in a collaborative, fast-paced environment, handling multiple stakeholders effectively.
Excellent communication skills with the ability to translate technical findings into business insights.

Nice-to-Have Qualifications:
Experience with streaming technologies such as Kafka or similar.
Familiarity with containerization and orchestration (Docker and ECS) for data workflows.
Exposure to BI tools such as Tableau or Power BI for data visualization.
Understanding of machine learning pipelines and how they integrate with data engineering processes.
Certification in cloud data engineering (e.g., AWS Certified Data Analytics).

What We'll Offer You In Return:

  • The chance to join an organisation with triple-digit growth that is changing the paradigm on how digital solutions are built.
  • The opportunity to form part of an amazing, multicultural community of tech experts.
  • A highly competitive compensation package.
  • A flexible and remote working environment.
  • Medical insurance.

Come and join our #ParserCommunity.
