
Senior Data Engineer (Remote: Italy, Spain, Romania; m/f/d)

Factor Eleven

Milano

Remote

EUR 60,000 - 90,000

Full-time

Posted today

Job description

A digital advertising firm based in Italy is seeking a Senior Data Engineer to design and maintain scalable data pipelines. The role involves working with Snowflake and Apache Iceberg to enhance the firm's SaaS offerings. Ideal candidates will have over 5 years of data engineering experience and excellent communication skills. The company offers transparent salaries, remote work, and opportunities for personal development.

Benefits

Transparent, above-market salaries
100% remote within Europe
Flexible work-hours
Professional development courses
Home office and co-working allowances

Skills

  • 5+ years in data engineering with hands-on experience building open lakehouses.
  • Strong proficiency with dbt for complex models and CI/CD.
  • Advanced SQL and Python for building reliable data pipelines.

Responsibilities

  • Design and maintain scalable data pipelines focusing on reliability.
  • Implement ETL processes ensuring data accuracy and consistency.
  • Mentor data engineers and foster a culture of ownership.

Knowledge

Deep expertise in open lakehouse architectures
Production experience with dbt
Strong SQL and Python skills
Excellent communication
Collaboration and problem-solving

Tools

Snowflake
Apache Iceberg
AWS Cloud Platform
Apache Kafka
Terraform
YOUR MISSION

We are looking for a Senior Data Engineer to shape and scale our open lakehouse data platform built on Snowflake (compute) and Apache Iceberg tables on AWS S3. You’ll own end-to-end design and evolution of robust, scalable pipelines that power analytics, ML, and customer-facing features across our SaaS digital advertising products.

You’ll collaborate with Product and Engineering to keep systems performant, reliable, and business-aligned. You’ll ship hands-on, lead technical design, run code reviews, and mentor others in modern data engineering practices.

To thrive here, you need
  • Deep expertise in open lakehouse architectures – Snowflake as compute + Apache Iceberg on cloud object storage
  • Hands‑on production experience with:
    • Iceberg catalog management (Apache Polaris, Glue, or Hive)
    • Time‑travel / snapshot queries
    • Partition evolution & schema evolution safety
    • Snowflake Iceberg external tables, query tuning, clustering, cost control, RBAC / masking
  • Strong production proficiency in dbt – authoring complex models, incremental logic, snapshots, exposures, custom tests, and CI / CD integration using dbt Core + Snowflake / Iceberg adapters
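The incremental-model logic mentioned above is, at its core, a high-water-mark filter plus a keyed merge. A minimal Python sketch of that pattern follows (dbt expresses the same idea in SQL/Jinja; all records, field names, and function names here are hypothetical, for illustration only):

```python
def incremental_rows(source, last_loaded_at):
    """Return only source rows newer than the high-water mark,
    mirroring the filter a dbt incremental model applies."""
    return [r for r in source if r["updated_at"] > last_loaded_at]

def merge_by_key(target, new_rows, key="id"):
    """Upsert new_rows into target on `key`, so re-running the same
    window does not create duplicates (an idempotent merge)."""
    index = {r[key]: r for r in target}
    for r in new_rows:
        index[r[key]] = r  # insert new key or overwrite stale row
    return list(index.values())

# Illustrative run with hypothetical records
target = [{"id": 1, "updated_at": "2024-01-01", "v": "old"}]
source = [
    {"id": 1, "updated_at": "2024-01-02", "v": "new"},
    {"id": 2, "updated_at": "2024-01-02", "v": "first"},
]
rows = incremental_rows(source, "2024-01-01")
target = merge_by_key(target, rows)
```

Because the merge keys on `id`, replaying the same batch leaves the target unchanged, which is what makes incremental loads safe to retry.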
Highly valuable
  • Experience working with AWS Cloud Platform
  • DataOps / IaC (Terraform, dbt Cloud)
  • Real-time streaming (Apache Kafka / Flink, AWS Kinesis)

You’re passionate about clean, efficient architecture and have a proven track record building and operating production‑grade open lakehouses. You raise the bar through high‑quality delivery, team collaboration, and lasting improvements in reliability and engineering culture.

YOUR RESPONSIBILITIES
  • Design, build, and maintain scalable data pipelines, architectures, and platforms with a focus on reliability and efficiency
  • Implement ETL / ELT processes with rigorous quality checks and governance to ensure data accuracy and consistency
  • Mentor data engineers, share best practices, and foster a culture of learning and ownership
  • Partner with Engineering, Product, and Business to translate requirements into high‑impact data solutions
  • Own project execution end‑to‑end—scoping, estimation, delivery, and communication
  • Champion testing, documentation, and observability through design reviews and technical leadership
  • Stay ahead of industry trends in cloud data, big data processing, and real‑time analytics

Requirements
YOUR PROFILE
  • 5+ years in data engineering, with hands‑on production experience building open lakehouses using Snowflake + Apache Iceberg
  • Strong production track record with dbt – complex models, dependencies, incremental logic, custom tests, CI / CD
  • Advanced SQL + Python; you build idempotent, observable, schema‑safe pipelines
  • Deep knowledge of data modelling trade‑offs, distributed systems, and big data frameworks
  • Excellent communicator – you distil complex topics for technical and non‑technical audiences with empathy
  • Proven collaborator with strong problem‑solving, mentoring, and project management skills
  • (Bonus) Built and maintained a production‑grade open lakehouse from scratch (Iceberg + catalog + compute)
  • (Bonus) Familiar with DataOps, IaC, or real‑time streaming pipelines.
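"Schema-safe" in practice means validating records against a contract at the pipeline boundary rather than letting drift surface downstream. A minimal sketch, assuming a hypothetical three-field contract (a real pipeline would lean on dbt tests or a schema registry instead):

```python
# Hypothetical contract: required fields and their expected types
EXPECTED_SCHEMA = {"id": int, "amount": float, "country": str}

def validate(record, schema=EXPECTED_SCHEMA):
    """Return (ok, errors): check required fields and types before
    load, so schema drift is caught at the pipeline boundary."""
    errors = []
    for field, ftype in schema.items():
        if field not in record:
            errors.append(f"missing field: {field}")
        elif not isinstance(record[field], ftype):
            errors.append(f"bad type for {field}: {type(record[field]).__name__}")
    return (not errors, errors)

good = {"id": 1, "amount": 9.5, "country": "IT"}
bad = {"id": "x", "amount": 9.5}
ok, _ = validate(good)
ok2, errs = validate(bad)
```

Rejected records can then be routed to a dead-letter table for inspection instead of silently corrupting downstream models.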

Benefits

Factor Eleven's tech department is the in-house tech scale-up responsible for our SaaS product suite, which offers localized digital advertising to enterprises of all sizes and shapes. We power the engine that Factor Eleven is built upon and expand the possibilities of our product daily. Together we pursue our ambition to be a top ad-tech provider by continuously raising the quality and expanding the capabilities of the entire platform, as well as of our engineering and product organization.

Join our amazing team in our mission to move localized digital advertising forward, and enjoy the freedom, camaraderie, and perks of our fully remote operations.

OUR PERKS & BENEFITS
  • Transparent, above-market salaries
  • 100% remote within Europe
  • Flexible work-hours and part‑time models
  • Be part of a fast‑growing, highly-skilled team
  • In person department and company events
  • Home office, co‑working space and work‑together allowance
  • Personal and professional development courses from Udemy

Please note that you need to reside in, and hold a work permit for, a country in the European Union to be considered for this role.
