
Data DevOps

Confidential company, Italy

Lazio

Hybrid

EUR 50,000 - 70,000

Full time

Today

Job description

A leading international services group in Italy is seeking a Data Ops Engineer to design, deploy, and manage data infrastructure. The ideal candidate has strong experience with AWS, CI/CD pipelines, and automation tools. This position offers flexibility with on-site work in Milan twice a month and focuses on building scalable, reliable data solutions.

Skills

  • Strong experience with AWS services such as S3, Glue, ECS, EKS.
  • Hands-on experience with Infrastructure as Code tools like Terraform.
  • Proficient in Python or other scripting languages.

Responsibilities

  • Collaborate with teams to design and operate data infrastructures.
  • Build and manage data platform environments leveraging AWS.
  • Implement and maintain CI/CD pipelines for data workflows.

Knowledge

  • AWS services
  • CI/CD pipelines
  • Infrastructure as Code
  • Data orchestration tools
  • Python
  • Containerization
  • Monitoring tools

Data Ops Engineer

The group operates internationally in the services sector and is looking for a Data Ops Engineer.

Key Responsibilities
  • Collaborate with Data Engineering, DevOps, and Architecture teams to design, deploy and operate scalable and reliable data infrastructures supporting data ingestion, analytics and AI projects;
  • Build, automate and manage data platform environments (data lakes, data warehouses, streaming systems) leveraging AWS services and Infrastructure as Code practices (a minimal sketch follows this list);
  • Implement and maintain CI/CD pipelines for data workflows, ensuring high availability, observability, and security across all environments;
  • Develop monitoring, logging, and alerting systems to ensure performance, reliability, and cost optimization of data workloads;
  • Contribute to the evolution of a data-centric culture by enabling fast, safe, and repeatable deployment of data solutions;
  • Work within an Agile team with a collaborative mindset, contributing to continuous improvement of processes, automation, and platform reliability.
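
As a rough illustration of the Infrastructure as Code practices mentioned in the responsibilities above, the sketch below uses the AWS CDK for Python to declare one small slice of a data platform: a versioned, encrypted S3 bucket serving as a raw data zone and a Glue database registered on top of it. The stack and resource names (DataPlatformStack, RawZone, raw_db) are hypothetical placeholders rather than anything specified by the role, and a real deployment would also cover IAM policies, lifecycle rules, and per-environment configuration.

    from aws_cdk import App, RemovalPolicy, Stack
    from aws_cdk import aws_glue as glue
    from aws_cdk import aws_s3 as s3
    from constructs import Construct


    class DataPlatformStack(Stack):
        """Hypothetical slice of a data platform defined as code."""

        def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
            super().__init__(scope, construct_id, **kwargs)

            # Versioned, encrypted bucket acting as the raw zone of a data lake.
            raw_bucket = s3.Bucket(
                self,
                "RawZone",
                versioned=True,
                encryption=s3.BucketEncryption.S3_MANAGED,
                removal_policy=RemovalPolicy.RETAIN,
            )

            # Glue database where crawlers and jobs can register raw tables.
            glue.CfnDatabase(
                self,
                "RawDatabase",
                catalog_id=self.account,
                database_input=glue.CfnDatabase.DatabaseInputProperty(
                    name="raw_db",
                    location_uri=raw_bucket.s3_url_for_object(),
                ),
            )


    app = App()
    DataPlatformStack(app, "DataPlatformStack")
    app.synth()

A stack like this would be deployed with cdk deploy from a CI/CD pipeline rather than by hand, which is what makes deployments of data solutions fast, safe, and repeatable.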

Required Skills
  • Strong experience with AWS services (e.g. S3, Glue, ECS, EKS, Lambda, CloudFormation, IAM, CloudWatch);
  • Solid understanding of CI/CD pipelines and tools (e.g. GitHub Actions, Jenkins, CodePipeline, dbt Cloud);
  • Hands-on experience with Infrastructure as Code (Terraform, AWS CDK, or CloudFormation);
  • Familiarity with data orchestration tools (Airflow, Prefect, Dagster) and ETL/ELT frameworks (a minimal orchestration sketch follows this list);
  • Proficiency in Python or other scripting languages for automation and operational tasks;
  • Experience with containerization and orchestration (Docker, Kubernetes);
  • Good knowledge of monitoring and observability tools (Prometheus, Grafana, ELK, Datadog);
  • Strong focus on reliability, automation, and scalability of data systems.
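
As a minimal sketch of the orchestration tooling named in the list above, the following Airflow DAG uses the TaskFlow API to chain a daily extract step into a load step. The DAG id, schedule, and task bodies (daily_raw_ingest, extract, load) are hypothetical examples, not requirements of the role, and assume a recent Airflow 2.x release.

    from datetime import datetime

    from airflow.decorators import dag, task


    @dag(
        dag_id="daily_raw_ingest",  # hypothetical pipeline name
        schedule="@daily",
        start_date=datetime(2024, 1, 1),
        catchup=False,
        tags=["dataops", "example"],
    )
    def daily_raw_ingest():
        @task
        def extract() -> list[dict]:
            # Placeholder: pull records from a source system or API.
            return [{"order_id": 1, "amount": 42.0}]

        @task
        def load(rows: list[dict]) -> None:
            # Placeholder: write the records to the raw zone and log the count.
            print(f"loaded {len(rows)} rows")

        # TaskFlow wiring: extract runs first, its output feeds load.
        load(extract())


    daily_raw_ingest()

In a production setup the load task would write to the platform's storage (for example the raw S3 zone sketched earlier) and the DAG file itself would ship through the same CI/CD pipeline as the rest of the code.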

Smart working: 2 days per month on‑site in Milan, with great flexibility.
