Job Search and Career Advice Platform


Data Engineer

MSC CRUISES

Campania

On-site

EUR 45,000 - 70,000

Full-time

13 days ago


Job Description

A leading cruise operator based in Naples is seeking a Data Engineer to design, develop, and support high-impact data pipelines. Candidates should have over 5 years of experience in data engineering, a strong grasp of data warehousing concepts, and familiarity with advanced analytics tools. The role emphasizes collaboration with cross-functional teams and problem-solving in an agile working environment. Proficiency in programming languages including Python and SQL is essential. This full-time position requires the right to work in Italy.

Skills

  • 5+ years of hands-on experience in the architecture and development of Data Engineering solutions.
  • Strong understanding of data warehousing and data science concepts.
  • Experience with Agile software development methodologies is preferred.

Responsibilities

  • Design and maintain scalable data pipelines for data ingestion and transformation.
  • Implement systems to monitor and ensure data quality.
  • Collaborate with data scientists and architects.

Knowledge

Data analysis expertise
Problem-solving and troubleshooting skills
Excellent oral and written English communication skills

Education

Degree in Computer Science or equivalent

Tools

Python
SQL
Azure Synapse Analytics
ETL Tools
Big Data Technologies
BI Tools
JOB PURPOSE

MSC Cruises, the third-largest cruise operator globally, is seeking a highly skilled and experienced Data Engineer to join our dynamic, international team in Naples. The candidate should be able to work independently in an agile way, prioritize their workload, and meet tight deadlines within a collaborative team environment. They will be responsible for the design, development, implementation, and support of business-critical data pipelines that enable the enterprise Business Intelligence, Data Visualization, and Advanced Analytics solutions. The candidate will have a deep understanding of data architecture and data warehousing concepts, as well as of the data analytics and visualization tools that harness large amounts of data from various sources, both on premise and in the cloud.

KEY ACCOUNTABILITIES
  • Design, develop, and maintain scalable data pipelines for ingesting and transforming data from diverse and complex sources, both on-prem and in the cloud
  • Design data integration and data quality (reconciliation) frameworks
  • Implement processes and systems to monitor data quality, ensuring production data is always accurate and available for business processes
  • Perform the data analysis required to troubleshoot data-related issues
  • Assist in selecting the appropriate technology for the company's needs
  • Create custom software applications
  • Create, run, and document unit and integration tests
  • Collaborate with data scientists and data architects
  • Work closely with teams of frontend and backend developers
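To make the pipeline and data-quality accountabilities above concrete, here is a minimal illustrative sketch (not part of the posting; the booking schema, column names, and quality rule are invented for the example) of an ingest/transform step with a simple reconciliation report:

```python
import csv
import io

# Invented sample source data for illustration only.
RAW_CSV = """booking_id,cabin,price_eur
1001,B12,450.00
1002,A03,720.50
1003,,310.00
"""

def extract(source: str) -> list[dict]:
    """Ingest raw rows from a CSV source."""
    return list(csv.DictReader(io.StringIO(source)))

def transform(rows: list[dict]) -> list[dict]:
    """Apply a data-quality rule (cabin required) and cast prices to float."""
    clean = []
    for row in rows:
        if not row["cabin"]:  # reject rows failing the quality rule
            continue
        clean.append({**row, "price_eur": float(row["price_eur"])})
    return clean

def reconcile(raw: list[dict], loaded: list[dict]) -> dict:
    """Reconciliation report: how many rows survived the pipeline."""
    return {"ingested": len(raw), "loaded": len(loaded),
            "rejected": len(raw) - len(loaded)}

raw = extract(RAW_CSV)
clean = transform(raw)
print(reconcile(raw, clean))  # {'ingested': 3, 'loaded': 2, 'rejected': 1}
```

In production this kind of logic would live in the tools the posting lists (e.g. Azure Data Factory or Spark rather than the standard library), but the ingest → validate → reconcile shape is the same.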
QUALIFICATIONS (skills, competencies, experience)
  • Degree in Computer Science, IT / IT management, systems analysis, or a comparable qualification in a technology or software engineering discipline
  • 5+ years of hands-on experience in the architecture, design, and development of enterprise Data Engineering solutions and integrations
  • Strong understanding of data warehousing and data science concepts
  • Data analysis expertise
  • Problem-solving and troubleshooting skills
  • Experience with or knowledge of Agile software development methodologies
  • Excellent oral and written English communication skills
Tech Skills
  • Python, Scala, Java, R, SQL
  • SQL and NoSQL DBMSs (MS SQL Server, Oracle, MySQL, Azure SQL, MongoDB, CosmosDB, cloud warehouse technologies)
  • Azure Synapse Analytics, BigQuery, Snowflake
  • ETL/ELT tools (MS SSIS, Informatica PowerCenter, Azure Data Factory, Power Query)
  • Big Data Technologies (Hadoop, Kafka, Apache Spark, Apache Spark Streaming, Azure Databricks, Delta Tables, ADLS Storage Gen2)
  • ML frameworks and libraries (Azure ML, Dataiku, TensorFlow, PyTorch)
  • BI Tools (PowerBI, QlikSense, SAP BO)
VISA REQUIREMENTS (if any)
  • Right to work in Italy.