Senior Data Engineer (Python)

Parser Limited

Lugo

Remote

EUR 40,000 - 60,000

Full-time

Posted 3 days ago

Job overview

Join a fast-growing company as a Data Engineer, focusing on maintaining and optimizing ETL pipelines in a cloud environment. You'll work collaboratively with teams to ensure data integrity, contribute to data architecture, and drive strategic data insights while enjoying a flexible remote working arrangement.

Benefits

Medical insurance
Highly competitive compensation package
Flexible, remote working environment
Multicultural community of tech experts
Growth opportunities

Requirements

  • 4+ years of experience in data engineering or related roles.
  • Hands-on experience with SQL and NoSQL databases.
  • Strong programming skills in Python.

Responsibilities

  • Build and optimize ETL pipelines for data management.
  • Ensure high data availability and perform regular data validation.
  • Collaborate with cross-functional teams to meet data needs.

Skills

SQL
Python
Data modeling
Problem-solving
Collaboration
Communication

Education

Bachelor's degree in Computer Science, Data Science, or a related field

Tools

AWS
GitHub Actions
Dagster

Job description

We are seeking a highly skilled Data Engineer to focus on maintaining data streams and ETL pipelines within a cloud-based environment. The ideal candidate will have experience in building, monitoring, and optimizing data pipelines, ensuring data consistency, and proactively collaborating with upstream and downstream teams to enable seamless data flow across the organization.

In this role, you will troubleshoot and resolve pipeline issues, contribute to enhancing data architecture, implement best practices in data governance and security, and ensure the scalability and performance of data solutions. You will play a critical role in understanding the business context of data, supporting analytics and decision-making by collaborating with data scientists, analysts, and other key stakeholders.

This position requires on-site presence at the client's office in London 25-50% of the time each month.

Key Responsibilities:

  • Build, maintain, and optimize scalable ETL/ELT pipelines using tools such as Dagster or similar (a minimal sketch follows this list).
  • Ensure high data availability, reliability, and consistency through rigorous data validation and monitoring practices.
  • Collaborate with cross-functional teams to align data pipeline requirements with business objectives and technical feasibility.
  • Automate data workflows to improve operational efficiency and reduce manual intervention.

Data Integrity & Monitoring:

  • Perform regular data consistency checks, identifying and resolving anomalies or discrepancies.
  • Implement robust monitoring frameworks to proactively detect and address pipeline failures or performance issues.
  • Work closely with upstream teams to align data ingestion strategies and optimize data handoffs.

Collaboration & Stakeholder Management:

  • Partner with data scientists, analysts, and business teams to provide trusted, accurate, and well-structured data for analytics and reporting.
  • Communicate complex data concepts clearly to non-technical stakeholders.
  • Develop and maintain documentation for knowledge sharing and continuity.

Infrastructure & Security Management:

  • Maintain and support cloud-based data platforms such as AWS, ensuring cost-efficient and scalable solutions.
  • Implement best practices in data governance, compliance, and security, adhering to industry standards.
  • Continuously improve data processing frameworks for better performance and resilience.

Continuous Improvement & Business Context Mastery:

  • Gain a deep understanding of the business meaning behind data to drive insights and strategic decisions.
  • Identify opportunities to enhance data models and workflows, ensuring alignment with evolving business needs.
  • Stay updated with emerging data technologies and advocate for their adoption when relevant.
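
To give candidates a concrete flavor of this work, here is a minimal sketch of a Dagster asset with an attached data-quality check. The in-memory source data, column names, and checks are illustrative assumptions, not the client's actual pipeline.

```python
# Minimal Dagster sketch: one ETL asset plus an attached data-quality check.
# The in-memory source data, column names, and checks below are illustrative
# assumptions; a real pipeline would extract from S3, Redshift, etc.
import pandas as pd
from dagster import AssetCheckResult, Definitions, asset, asset_check


@asset
def daily_orders() -> pd.DataFrame:
    # Extract + transform: stand-in for a real extract from a warehouse or S3.
    raw = pd.DataFrame({"order_id": [1, 2, 3], "amount": ["10.5", "22.0", "7.25"]})
    raw["amount"] = raw["amount"].astype(float)  # normalize types
    return raw


@asset_check(asset=daily_orders)
def daily_orders_is_valid(daily_orders: pd.DataFrame) -> AssetCheckResult:
    # Consistency check: table is non-empty and amounts are non-negative.
    passed = len(daily_orders) > 0 and bool((daily_orders["amount"] >= 0).all())
    return AssetCheckResult(passed=passed, metadata={"rows": len(daily_orders)})


defs = Definitions(assets=[daily_orders], asset_checks=[daily_orders_is_valid])
```
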
Qualifications:

Education & Experience:

  • Bachelor's degree in Computer Science, Data Science, or a related field.
  • Minimum 4 years of experience in data engineering, data integration, or related roles.

Technical Skills:

  • Proficiency with SQL databases (e.g., PostgreSQL, MySQL) and NoSQL databases (e.g., MongoDB), with hands-on experience in query optimization and data modeling.
  • Strong programming skills in Python, with a focus on scalable data solutions.
  • Experience with data pipeline orchestration tools such as Dagster or similar.
  • Familiarity with cloud platforms (AWS) and data services (S3, Redshift, Snowflake).
  • Understanding of data warehousing concepts and modern warehousing solutions.
  • Experience with GitHub Actions or similar CI/CD pipelines for data workflows and version control (a minimal test sketch follows this list).
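
As an illustration of the CI/CD point above, here is a small pytest-style check that a GitHub Actions job could run on every push; the fixture path and expected schema are hypothetical.

```python
# test_orders_contract.py -- hypothetical schema/quality tests that a
# GitHub Actions workflow could run (e.g., via `pytest`) on every push.
import pandas as pd

SAMPLE_PATH = "data/orders_sample.csv"  # assumed fixture committed to the repo
EXPECTED_COLUMNS = {"order_id", "customer_id", "amount", "created_at"}  # assumed schema


def test_schema_has_expected_columns():
    df = pd.read_csv(SAMPLE_PATH)
    missing = EXPECTED_COLUMNS - set(df.columns)
    assert not missing, f"missing columns: {missing}"


def test_amounts_are_non_negative():
    df = pd.read_csv(SAMPLE_PATH)
    assert (df["amount"] >= 0).all(), "found negative order amounts"
```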

Soft Skills:

  • Strong problem-solving skills with attention to detail and a proactive mindset.
  • Ability to work collaboratively in fast-paced environments, managing multiple stakeholders.
  • Excellent communication skills, translating technical findings into business insights.

Nice-to-Have Qualifications:

  • Experience with streaming technologies like Kafka (a minimal consumer sketch follows this list).
  • Familiarity with containerization and orchestration (Docker, ECS).
  • Exposure to BI tools such as Tableau or Power BI.
  • Understanding of machine learning pipelines and their integration with data engineering.
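
For the streaming nice-to-have, a minimal Kafka consumer in Python might look like the following; the topic name, broker address, and JSON message format are assumptions for illustration.

```python
# Minimal Kafka consumer sketch using kafka-python; the topic, broker
# address, and JSON message format are assumptions for illustration.
import json

from kafka import KafkaConsumer  # pip install kafka-python

consumer = KafkaConsumer(
    "orders-events",                     # hypothetical topic
    bootstrap_servers="localhost:9092",  # assumed local broker
    value_deserializer=lambda m: json.loads(m.decode("utf-8")),
    auto_offset_reset="earliest",
)

for message in consumer:
    event = message.value
    print(f"partition={message.partition} offset={message.offset} event={event}")
```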

What We'll Offer You in Return:

  • The chance to join an organization experiencing triple-digit growth, transforming digital solutions.
  • The opportunity to be part of a multicultural community of tech experts.
  • A highly competitive compensation package.
  • A flexible, remote working environment.
  • Medical insurance.

Come and join our #ParserCommunity.
