Senior Data Engineer

Jobgether

Spain

Remote

EUR 30,000 - 50,000

Full-time

Posted yesterday

Job overview

A recruitment agency is seeking a Senior Data Engineer to design and implement distributed data pipelines in a fully remote role based anywhere in the European Union. The position requires proficiency with technologies such as Spark and Hadoop, along with expertise in Scala and Java. Candidates should hold a Bachelor's degree in Computer Science and have at least 5 years of relevant experience. The role offers a competitive salary and benefits within a collaborative, innovation-driven engineering culture.

Benefits

Competitive salary package with equity options
100% remote position
Long-term career growth opportunities
Home office stipend and equipment
Generous vacation and family care packages

Skills

Distributed data processing technologies
Spark
Hadoop
Scala
Java
Python
SQL
NoSQL databases
Kafka
RabbitMQ
Cloud computing

Education

Bachelor’s degree in Computer Science or equivalent

Tools

AWS
GCP
Airflow
Luigi
ELK stack

Job description

This position is posted by Jobgether on behalf of a partner company. We are currently looking for a Senior Data Engineer in the European Union.

We are seeking an experienced data engineer to help shape and optimize complex data systems at scale. This role offers the opportunity to work across the full data stack, from designing distributed ETL pipelines to deploying machine learning models in production. You will collaborate closely with data science, web, and engineering teams to deliver high-quality, user-centric solutions while leveraging modern tools, including AI-assisted development. The ideal candidate thrives in a flexible, remote-first environment, enjoys solving challenging technical problems, and is motivated to drive engineering excellence across large-scale data workflows.

Accountabilities
  • Design, implement, and maintain distributed data pipelines and Spark jobs for large-scale data processing.
  • Build robust asynchronous, parallel, and low-latency APIs and services.
  • Collaborate with Data Science teams to deploy machine learning models into production environments.
  • Apply best practices in TDD, SOLID principles, and scalable software architecture.
  • Design and manage relational and non-relational databases, data models, and warehouses.
  • Optimize data workflows and pipelines using cloud platforms (AWS, GCP) and orchestration tools (Airflow, Luigi).
  • Leverage AI tools to improve coding productivity, debugging efficiency, and system design.
  • Monitor, maintain, and enhance data collection and processing systems to ensure reliability and scalability.
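
As a rough illustration of the kind of work described in the first accountability above, the sketch below shows a minimal batch Spark job in Scala that aggregates raw events into a daily per-user summary. The dataset, column names, and storage paths are hypothetical and are not part of the posting.

```scala
// Minimal Spark batch job sketch (illustrative only; paths and columns are assumptions).
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object DailyEventAggregation {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("daily-event-aggregation")
      .getOrCreate()

    // Read raw events from a distributed store (hypothetical location).
    val events = spark.read.parquet("s3a://example-bucket/raw/events/")

    // Aggregate event counts per user and day.
    val daily = events
      .withColumn("event_date", to_date(col("event_ts")))
      .groupBy(col("user_id"), col("event_date"))
      .agg(count(lit(1)).as("event_count"))

    // Write partitioned output for downstream consumers.
    daily.write
      .mode("overwrite")
      .partitionBy("event_date")
      .parquet("s3a://example-bucket/curated/daily_event_counts/")

    spark.stop()
  }
}
```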

Requirements
  • Bachelor’s degree in Computer Science or equivalent practical experience.
  • Minimum 5 years of professional experience in data engineering roles.
  • Strong experience with distributed data processing technologies, including Spark and Hadoop.
  • Proficiency in Scala and Java; Python experience is a plus.
  • Solid understanding of data structures, algorithms, and scalable system design.
  • Experience with SQL and NoSQL databases (PostgreSQL, Cassandra) and message queues (Kafka, RabbitMQ).
  • Knowledge of cloud computing, orchestration tools, and monitoring solutions (AWS, GCP, ELK stack).
  • Proven ability to build complex ETL workflows using modern orchestration frameworks.
  • Strong problem-solving skills, attention to detail, and ability to adapt solutions to complex challenges.
  • Excellent communication and collaboration skills, with a proactive, team-oriented mindset.

Benefits
  • Competitive salary package with equity options.
  • 100% remote position with flexible work arrangements across EU-approved locations.
  • Long-term career growth and development opportunities with access to mentorship and training.
  • Supportive, collaborative engineering culture emphasizing knowledge sharing and innovation.
  • Generous vacation and family care packages, including maternity/paternity leave.
  • Home office stipend and equipment to ensure optimal productivity.
  • Participation in hackdays, training days, and company-wide social events.

By submitting an application to this posting, the applicant acknowledges that Jobgether will process their personal data as necessary to evaluate their candidacy, provide feedback, and, when appropriate, share relevant information with potential employers. Such processing is carried out on the basis of legitimate interest and pre-contractual measures in accordance with applicable data protection laws. The applicant may exercise their rights of access, rectification, erasure, and objection at any time as provided under the GDPR.
