
Senior Data Engineer

Cast AI

Italia

Remoto

EUR 30,000 - 50,000

Full-time

5 days ago
Be among the first applicants

Job description

A leading technology firm is seeking a Data Engineer to design and optimize data pipelines, work with streaming technologies, and partner with ML teams. The ideal candidate has strong software engineering skills and experience in data pipeline development. The position offers a competitive gross salary of €6,500 - €9,000, a flexible remote-first environment, and opportunities for personal growth.

Overview

Cast AI is the leading Application Performance Automation (APA) platform, enabling customers to cut cloud costs, improve performance, and boost productivity – automatically. Built originally for Kubernetes, Cast AI delivers real-time, autonomous optimization across cloud environments: it continuously analyzes workloads, rightsizes resources, and rebalances clusters to improve speed, reliability, and efficiency.

Headquartered in Miami, Florida, Cast AI has employees in more than 32 countries and supports customers across major cloud, hybrid, and on‑premises environments.

What’s next? Backed by a $108M Series C, Cast AI is expanding to make APA the standard for DevOps and MLOps.

About the role

As a Data Engineer, you’ll design and optimize large-scale data pipelines, orchestrate big data workflows, and work with streaming technologies that power intelligent automation. This position is suited for someone with strong data engineering skills and an interest in machine learning and AI who wants to advance their expertise.

Responsibilities
  • Design, build, and scale cloud-native data pipelines leveraging orchestration frameworks and infrastructure-as-code for reproducibility and automation.
  • Operate across batch and streaming ecosystems, integrating real-time data platforms with large-scale data storage and processing architectures.
  • Engineer high-quality, discoverable datasets through modern data modeling, ensuring reliability, lineage tracking, and governance.
  • Partner with ML / AI teams to productionize data for training and inference pipelines, enabling feature stores, online / offline parity, and model monitoring.
  • Optimize cost, scalability, and performance in multi-cloud / hybrid environments, using modern query engines.
  • Champion observability and reliability by implementing data SLAs, automated quality checks, anomaly detection, and self-healing pipelines.
  • Drive innovation by adopting new technologies to support advanced analytics and AI.
Qualifications
  • Strong software engineering and problem-solving skills.
  • Experience with data warehouse technologies such as ClickHouse, Snowflake, or BigQuery.
  • Proficiency with big data technologies and streaming platforms.
  • Knowledge of DBT and feature store concepts in data engineering.
  • Proven experience in data pipeline development for machine learning workflows.
  • Knowledge of DevOps practices, monitoring, and logging.
  • Experience in data security, compliance, and governance.
  • Excellent communication skills and a proactive, collaborative mindset.
  • Based in the European Union, within time zones GMT+0 to GMT+3.
  • Strong English skills (written and spoken).
What’s in it for you?
  • Competitive salary (€6,500 - €9,000 gross, depending on experience)
  • Flexible, remote-first global environment.
  • Collaborate with a global team of cloud experts and innovators.
  • Equity options.
  • Private health insurance.
  • Fast-paced workflow with most feature projects completed in 1 to 4 weeks.
  • Spend 10% of your time on personal projects or self-improvement.
  • Learning budget for professional development and conferences.
  • Annual hackathon and team-building events.
  • Equipment and budget for remote work setup.
  • Extra days off for work-life balance.
Job details
  • Seniority level: Mid-Senior level
  • Employment type: Full-time
  • Job function: Information Technology
  • Industries: Software Development