A data-driven forecasting company in Rome seeks a Data Engineer to own and optimize data pipelines. You will lead a small team, collaborate across functions, and ensure reliable data flows. Candidates should have 7–10+ years of experience in data engineering, with a strong emphasis on performance and leadership. The role offers competitive compensation, flexible hybrid work, and opportunities for growth.
Own the data foundation that powers our forecasts. Design, orchestrate, and optimize reliable pipelines with a strong focus on performance, cost efficiency, observability, and reproducibility. You'll mentor/manage a small team and collaborate closely with Data Science, Meteorology, MLOps, Client Data Ingestion, and DevOps.
Own the data foundation: from client weather/production data to curated, production-ready features.
Orchestrate workflows: Airflow (v3 preferred), Dagster, Prefect.
Batch & streaming: batch ELT across multiple data sources and schedules, with real-time ingestion when needed.
Tensor-first pipelines: alongside classic pipelines, set up and maintain array workflows with xarray, Zarr, and Dask.
Observability & performance: metrics, logs, p50/p95 latency; cost/latency trade-offs.
Resilience: restartable and idempotent workflows, safe reruns/backfills, schema evolution tracking, lineage.
Mentor & manage: delegate well, coach where needed, stay hands-on when it matters.
Collaborate across functions to align features, architecture, SLAs.
7–10+ years in Data Engineering with leadership/mentorship experience.
Strong orchestration experience with a preference for asset-first, event-driven workflows as supported in modern orchestrators.
Strong track record in building and optimising scalable batch ELT/ETL workflows with state-of-the-art frameworks and distributed processing engines.
Experience with observability & performance optimisation (metrics, alerts, cost awareness).
Building resilient pipelines (failure handling, safe reruns, tracked schema evolution, lineage tools).
Tensorial stack (xarray, Zarr, Dask) or scientific formats (NetCDF, GRIB).
Familiarity with modern product-oriented delivery methods (e.g., Shape Up).
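The p50/p95 latency tracking named in the requirements can be sketched with the standard library alone; the function name and metric keys below are illustrative, and in practice these values would be exported to a metrics backend and alerted on:

```python
import statistics

def latency_summary(samples_ms: list[float]) -> dict[str, float]:
    """Summarize task latencies into p50/p95 percentiles."""
    # quantiles with n=100 yields the 1st..99th percentiles; the "inclusive"
    # method interpolates linearly between observed samples.
    pct = statistics.quantiles(samples_ms, n=100, method="inclusive")
    return {"p50": pct[49], "p95": pct[94]}
```

For example, over latencies of 1–100 ms this reports a p50 of 50.5 ms and a p95 of 95.05 ms; watching p95 rather than the mean surfaces the slow tail that batch SLAs actually break on.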
We prize autonomy, ownership, and crisp communication. Priorities and decisions are transparent across DS, Engineering, and Commercial so teams move together, not in silos.
Clear, reliable data flows that are easier to reason about and improve.
Faster, more cost-efficient pipelines with robust monitoring/alerting.
A resilient orchestration layer and reproducible environments.
Juniors growing under your mentorship and delivering confidently.
Strategic ownership with direct visibility to leadership.
Competitive compensation + ESOP.
Flexible hybrid work; offices in Tallinn and Rome.
Growth into discipline lead for Data Engineering.