
Senior Data Engineer

beBeeSoftware

Barcelona

On-site

EUR 45,000 - 65,000

Full-time

Today

Vacancy description

A technology company in Barcelona is seeking an experienced Backend Developer to build scalable backend infrastructure for a data-intensive web app. The role involves optimizing storage workloads, developing data pipelines, and managing cloud infrastructure. Candidates should have a Bachelor's or Master's in Computer Science and experience with TypeScript, NodeJS, and Python. Strong problem-solving skills and knowledge of distributed systems are essential.

Qualifications

  • Solid CS fundamentals and distributed systems knowledge are a must-have.
  • Experience building scalable data pipelines and optimizing workflows.
  • Very comfortable working across both data and backend systems.

Responsibilities

  • Build and own scalable backend services for our data-intensive web app.
  • Make database and storage trade-offs for optimal performance.
  • Build robust batch and real-time data pipelines.

Skills

TypeScript
NodeJS
Python
Distributed systems
Data warehouse technologies
Workflow orchestration

Education

Bachelor's or Master's Degree in Computer Science

Job description

Backend Developer Opportunity

We are seeking an experienced software engineer to build scalable backend infrastructure for our data-intensive web app, optimize storage / database workloads, ensure high availability for AI / ML services, and develop robust data pipelines.

As a founding engineer on the team, you will partner closely with full-stack engineers and AI / ML scientists to ship production systems powered by LLMs and ML models. Your focus will be on developing applications that integrate complex data sources, optimize database performance, and ensure seamless scalability.

Key Responsibilities
  • Build and own scalable backend services for our data-intensive web app and AI / ML model serving
  • Make database and storage trade-offs to achieve the best performance when managing large volumes and varieties of data
  • Build robust batch and real-time data pipelines that can scale with increased load and data
  • Design and manage the evolution of flexible data models and database infrastructure to handle different types of data sources and data points
  • Implement and manage robust workflow orchestration for multiple data jobs
  • Set up robust testing practices, monitoring, and observability to ensure the reliability of all services and data pipelines
Requirements
  • Bachelor's or Master's Degree in Computer Science or a related field
  • Solid CS fundamentals and distributed systems knowledge are a must-have
  • Experience building scalable data pipelines, orchestrating workflows, and optimizing pipeline performance
  • Experience working with data warehouse technologies
  • Experience working with TypeScript / NodeJS / Python and a strongly typed language
  • Very comfortable working across both data and backend systems
  • Comfortable managing cloud infrastructure for backend, ML, and data pipelines at scale (DevOps)