
DataOps Engineer

Confidential company, Italy

Alba Adriatica

Hybrid

EUR 45,000 - 70,000

Full-time

Today


Job Description

A leading international services firm is seeking a DataOps Engineer in Alba Adriatica, Italy. You will design and operate scalable data infrastructures, automate data platform environments, and implement CI/CD pipelines. The ideal candidate has strong experience with AWS services, a solid understanding of CI/CD practices, and a focus on automation and reliability. Smart working options are included, with on-site presence in Milan required only twice per month.

Skills

  • Strong experience with AWS services used for data infrastructures.
  • Solid understanding of CI/CD pipelines and tools.
  • Hands-on experience with Infrastructure as Code.

Responsibilities

  • Collaborate with cross-functional teams to design and operate data infrastructures.
  • Automate and manage data platform environments.
  • Implement and maintain CI/CD pipelines for data workflows.
  • Develop monitoring and alerting systems for data workloads.
  • Contribute to a data-centric culture through fast deployment.

Knowledge

  • AWS services (e.g. S3, Glue, ECS, EKS, Lambda)
  • CI/CD pipelines and tools (e.g. GitHub Actions, Jenkins)
  • Infrastructure as Code (Terraform, AWS CDK)
  • Data orchestration tools (Airflow, Prefect)
  • Python or other scripting languages
  • Containerization and orchestration (Docker, Kubernetes)
  • Monitoring and observability tools (Prometheus, Grafana)
  • Reliability, automation, and scalability focus
DataOps Engineer

An international group active in the services sector is looking for a DataOps Engineer.

Key Responsibilities
  • Collaborate with the Data Engineering, DevOps, and Architecture teams to design, deploy, and operate scalable and reliable data infrastructures supporting data ingestion, analytics, and AI projects;
  • Build, automate and manage data platform environments (data lakes, data warehouses, streaming systems) leveraging AWS services and Infrastructure as Code practices;
  • Implement and maintain CI/CD pipelines for data workflows, ensuring high availability, observability, and security across all environments;
  • Develop monitoring, logging, and alerting systems to ensure performance, reliability, and cost optimization of data workloads;
  • Contribute to the evolution of a data-centric culture by enabling fast, safe, and repeatable deployment of data solutions;
  • Work within an Agile team with a collaborative mindset, contributing to continuous improvement of processes, automation, and platform reliability.
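As a flavour of the monitoring and alerting work described above, the core of a data-workload freshness check can be sketched in plain Python. This is a simplified, standard-library-only illustration; the dataset names and the SLA window are hypothetical, not taken from the posting:

```python
import datetime as dt

# Hypothetical SLA: a dataset is considered stale if its last load
# is more than 6 hours old.
FRESHNESS_SLA = dt.timedelta(hours=6)

def is_fresh(last_loaded_at: dt.datetime, now: dt.datetime) -> bool:
    """True if the dataset was loaded within the SLA window."""
    return (now - last_loaded_at) <= FRESHNESS_SLA

def build_alerts(datasets: dict[str, dt.datetime], now: dt.datetime) -> list[str]:
    """Collect one alert message per dataset that violates the SLA."""
    return [
        f"ALERT: {name} is stale (last load {ts.isoformat()})"
        for name, ts in datasets.items()
        if not is_fresh(ts, now)
    ]

if __name__ == "__main__":
    now = dt.datetime(2024, 1, 1, 12, 0)
    datasets = {
        "orders": dt.datetime(2024, 1, 1, 9, 0),    # 3 h old: fresh
        "clicks": dt.datetime(2023, 12, 31, 18, 0),  # 18 h old: stale
    }
    for message in build_alerts(datasets, now):
        print(message)
```

In production this kind of check would typically run on a schedule and push its alerts to a tool such as Prometheus Alertmanager or Grafana rather than printing them.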
Required Skills
  • Strong experience with AWS services (e.g. S3, Glue, ECS, EKS, Lambda, CloudFormation, IAM, CloudWatch);
  • Solid understanding of CI/CD pipelines and tools (e.g. GitHub Actions, Jenkins, CodePipeline, dbt Cloud);
  • Hands-on experience with Infrastructure as Code (Terraform, AWS CDK, or CloudFormation);
  • Familiarity with data orchestration tools (Airflow, Prefect, Dagster) and ETL/ELT frameworks;
  • Proficient in Python or other scripting languages for automation and operational tasks;
  • Experience with containerization and orchestration (Docker, Kubernetes);
  • Good knowledge of monitoring and observability tools (Prometheus, Grafana, ELK, Datadog);
  • Strong focus on reliability, automation, and scalability of data systems.
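To illustrate the "Python for automation and operational tasks" requirement, a common building block in operational scripts is a retry-with-backoff helper for transient infrastructure failures. A minimal sketch, with illustrative names and parameters:

```python
import time

def retry(fn, attempts: int = 3, base_delay: float = 0.01):
    """Call fn(); on failure, back off exponentially and try again.

    Re-raises the last exception once all attempts are exhausted.
    """
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))

if __name__ == "__main__":
    calls = {"count": 0}

    def flaky_task():
        # Fails twice, then succeeds -- stands in for a transient
        # error from a hypothetical external service.
        calls["count"] += 1
        if calls["count"] < 3:
            raise RuntimeError("transient failure")
        return "ok"

    print(retry(flaky_task))  # succeeds on the third attempt
```

Real operational code would usually narrow the caught exception types and add logging, but the retry loop itself is the recurring pattern.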

Smart working: 2 days per month on-site in Milan, with great flexibility.
