
Data Engineer - AWS, Spark & Snowflake - Pharmaceutical Project

ERNI Spain

Madrid

On-site

EUR 30.000 - 50.000

Full-time

Today

Vacancy description

A leading software engineering company in Madrid seeks a Data Engineer to design and optimize large-scale data infrastructures. You'll use AWS Glue, Snowflake, and Spark to connect raw data with meaningful insights. The ideal candidate has strong knowledge of the Apache ecosystem and experience with big-data architectures. Benefits include private health insurance, flexible benefits, and relocation bonuses.

Benefits

Private health and travel insurance
Full coverage for sick leave + extra off days
Free snacks and beverages
Monthly team lunches
Gym discounts
Relocation bonus (up to €2,000)
Referral bonuses: up to €6,000 per candidate
Remote work compensation: hardware + home office expenses
23 working days of vacation
Free language courses: English, Spanish, and German

Requirements

  • Strong knowledge of data processing frameworks.
  • Ability to manage and transform large datasets.
  • Experience implementing data pipelines.
  • Solid SQL and NoSQL experience (Snowflake and/or Databricks preferred).
  • Proficiency in Python or Scala.
  • Excellent communication skills in English and strong collaboration skills.

Responsibilities

  • Design and maintain scalable data pipelines.
  • Optimize data processing using AWS Glue and Spark.
  • Integrate emerging data and software-engineering technologies into existing ecosystems.
  • Use Python and SQL to develop, automate, and optimize pipelines.

Skills

Apache ecosystem knowledge (Hadoop, Spark, Kafka, Airflow)
Hands-on expertise with AWS data services (Glue, Redshift, Kinesis, S3)
Big-data architecture understanding
SQL and NoSQL experience (Snowflake or Databricks)
Proficiency in Python or Scala
Excellent communication skills in English

Tools

AWS Glue
Snowflake
Spark
Airflow

Job description

ERNI is a Swiss software engineering company and a leader in building complex, customized software solutions. For more than 25 years, we have used technology to have a positive impact on people's lives.

We build digital solutions that connect the physical world (devices and connectivity) with the digital one (software solutions that are connected to devices or consume their data). Our teams prioritize the software lifecycle and ensure that our code is both clean and secure. Our maturity in building high-impact software solutions brings us to any industry looking for quality and sophistication (e.g. gaming, FMCG, or validation & inspection).

Our leadership in health tech (diagnostic medical devices, pharma, health care, and more) and across all industries built around smart devices (robots, cars, 3D printers, machinery), combined with the way we grow people, lets us create a growing, learning, and challenging setup for ERNIans.

Can you imagine shaping the future of scalable data platforms that power decision-making in the Pharma industry? 💡📊

This could be the opportunity you’ve been waiting for.

At ERNI, we’re looking for a Data Engineer to design and optimize large-scale data infrastructures that ensure reliability, scalability, and efficiency across global initiatives. You’ll work with cutting-edge technologies such as AWS Glue, Snowflake, Spark, Airflow, and Big Data frameworks, helping connect raw data with business insights through well‑orchestrated, high‑performing data pipelines. 🚀

At ERNI, you're part of a team, not just a project.

Even when working on‑site with our clients, we make sure you stay connected and supported.

How do we ensure that? 🤔

From day one, you will have a mentor who will guide you through your entire onboarding and career at ERNI. You'll have regular 1:1 meetings with them, and you'll regularly work on your development plan to define your short-, medium-, and long-term goals.

What will you do?

  • Designing, building, testing, and maintaining scalable data pipelines and architectures.
  • Implementing and optimizing data processing using AWS Glue, Spark, and Airflow.
  • Managing and transforming large datasets using Snowflake and other relational or NoSQL systems.
  • Integrating emerging data and software-engineering technologies into existing ecosystems.
  • Using Python and SQL to develop, automate, and optimize pipelines.
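To give candidates a feel for the pipeline work described above, here is a minimal sketch of an extract-transform-load step in Python and SQL. It uses the standard-library `sqlite3` module purely as a stand-in for a warehouse such as Snowflake, and the table and column names (`raw_events`, `daily_counts`) are hypothetical examples, not part of the actual project.

```python
import sqlite3


def run_pipeline(conn: sqlite3.Connection) -> list:
    """Load raw rows, aggregate them with SQL, and return the summary."""
    cur = conn.cursor()
    # Extract/load: a raw events table as it might land from an ingest job.
    cur.execute(
        "CREATE TABLE raw_events (event_date TEXT, product TEXT, qty INTEGER)"
    )
    cur.executemany(
        "INSERT INTO raw_events VALUES (?, ?, ?)",
        [
            ("2024-01-01", "aspirin", 10),
            ("2024-01-01", "aspirin", 5),
            ("2024-01-02", "ibuprofen", 7),
        ],
    )
    # Transform: aggregate raw rows into a daily summary table with SQL.
    cur.execute(
        """CREATE TABLE daily_counts AS
           SELECT event_date, product, SUM(qty) AS total_qty
           FROM raw_events
           GROUP BY event_date, product"""
    )
    cur.execute("SELECT * FROM daily_counts ORDER BY event_date, product")
    return cur.fetchall()


if __name__ == "__main__":
    rows = run_pipeline(sqlite3.connect(":memory:"))
    print(rows)  # [('2024-01-01', 'aspirin', 15), ('2024-01-02', 'ibuprofen', 7)]
```

In production this same shape would run against Snowflake via its Python connector and be scheduled and orchestrated with Airflow, with Spark or AWS Glue handling datasets too large for a single node.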

What are we looking for?

✅ Strong knowledge of the Apache ecosystem (Hadoop, Spark, Kafka, Airflow).

✅ Hands‑on expertise with AWS data services (Glue, Redshift, Kinesis, S3).

✅ Deep understanding of big‑data architectures and data pipeline optimization.

✅ Solid SQL and NoSQL experience (Snowflake and/or Databricks preferred).

✅ Proficiency in Python or Scala.

✅ Excellent communication skills in English and strong collaboration skills.

What do we offer?

🏥 Private health and travel insurance.

💯 Full coverage for sick leave + 1 extra day off per month without medical leave.

🧘♀️ Free emotional, legal, and family support.

☕ Snacks, fruit, coffee, and tea at the office.

🍽️ Monthly team lunches paid by the company.

🎉 Special days like "churro therapy" or "calamari sandwich day."

🏋️♀️ Gym discounts + sports compensation.

🧳 Relocation bonus (up to €2,000).

💳 Flexible benefits: meals, transport, childcare, etc.

🤝 Referral bonuses: up to €6,000 per candidate and €5,000 per client.

🖥️ Remote work compensation: hardware + home office expenses.

🌴 23 working days of vacation.

🗣️ Free language courses: English, Spanish, and German.

And the salary...

We will discuss it during the first call. If it’s important to you, feel free to ask! 💬

Also, if you want to know what makes ERNI unique through the eyes of those who know it best 😎, take a look at this video:

ERNI in One Word - YouTube

WOULD YOU LIKE TO BECOME AN ERNIan? APPLY NOW!
