
GCP Data Engineer/Data Architect

Experteer Italy

Roma

Hybrid

EUR 45,000 - 80,000

Full-time

2 days ago

Job description

An established industry player is seeking a skilled Data Architect to join their innovative team. This role involves designing and operationalizing data processing systems that drive data-driven decision-making. You will build large-scale data pipelines using advanced technologies on the Google Cloud Platform, ensuring compliance and efficiency. If you are passionate about transforming raw data into valuable insights and thrive in a collaborative environment, this opportunity is perfect for you. Join a company that values community and personal development, and be part of the digital transformation journey.

Skills

  • 5-10 years of experience in data engineering roles.
  • Expertise in building cloud-native data pipelines.

Duties

  • Design and build data processing systems with a focus on security and compliance.
  • Create large-scale batch and real-time data pipelines.

Knowledge

Scala
Java
Python
Advanced SQL
Google Cloud Platform
Data Modeling
Data Migration
CI/CD

Education

Bachelor's degree in Computer Science

Tools

BigQuery
Google Cloud Storage
Dataflow
Dataproc
Data Fusion
Hadoop
Hive

Offer description

NTT DATA, Trusted Global Innovator, is one of the world's leading players in IT services. With more than 140,000 professionals in over 50 countries worldwide, we drive and accelerate digital transformation, offering our clients innovative, tailor-made technology solutions. People are the engine of NTT DATA, each with their own uniqueness, talent, and attitude. We have built a Smile Working Company in which caring for people, listening to them, and supporting their well-being and skills development are our priority. We have created workspaces that foster a sense of community and a constructive exchange of experiences.

We look to our tomorrow with the same passion as yesterday, and we need your talent too!

www.nttdata.com/it

Responsibilities

A Data Architect is an IT expert who enables data-driven decision making by collecting, transforming, and publishing data. At NTT DATA, a Data Architect should be able to design, build, operationalize, secure, and monitor data processing systems with a particular emphasis on security and compliance, scalability and efficiency, reliability and fidelity, and flexibility and portability. The main mission of a Data Architect is to turn raw data into information, creating insight and business value.

  • Build large-scale batch and real-time data pipelines with data processing frameworks on the GCP cloud platform (see the pipeline sketch after this list).
  • Use an analytical, data-driven approach to develop a deep understanding of a fast-changing business.
  • Work with the team to evaluate business needs and priorities, liaise with key business partners, and address team needs related to data systems and management.
  • Participate in project planning: identify milestones, deliverables, and resource requirements; track activities and task execution.
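
One possible shape for such a batch pipeline is sketched below using Apache Beam, the SDK that Dataflow executes: it reads JSON events from Cloud Storage and writes them to BigQuery. This is only an illustrative sketch; the project, bucket, dataset, and table names are hypothetical placeholders.

    # A minimal batch-pipeline sketch with Apache Beam (the SDK that Dataflow runs).
    # Project, bucket, dataset, and table names below are hypothetical.
    import json

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions


    def parse_event(line: str) -> dict:
        """Turn one JSON line read from Cloud Storage into a BigQuery row."""
        record = json.loads(line)
        return {"user_id": record["user_id"], "amount": float(record["amount"])}


    options = PipelineOptions(
        runner="DataflowRunner",        # use "DirectRunner" to test locally
        project="example-project",      # hypothetical GCP project id
        region="europe-west1",
        temp_location="gs://example-bucket/tmp",
    )

    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadFromGCS" >> beam.io.ReadFromText("gs://example-bucket/raw/events-*.json")
            | "ParseJSON" >> beam.Map(parse_event)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "example-project:analytics.events",
                schema="user_id:STRING, amount:FLOAT",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )

The same pipeline becomes a real-time job by swapping the text source for a Pub/Sub subscription and running with the streaming option enabled.
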
Requirements

Required Skills

  • Bachelor's degree in Computer Science, Computer Engineering, or a relevant field
  • At least 5-10 years' experience in a data engineering role
  • Expertise in software engineering using Scala/Java/Python
  • Advanced SQL skills, with a preference for BigQuery
  • Good knowledge of Google managed services such as Cloud Storage, BigQuery, Dataflow, Dataproc, and Data Fusion
  • Experience using workflow management tools
  • Good understanding of GCP batch and streaming architectures
  • Strong knowledge of data technologies and data modeling
  • Expertise in building modern, cloud-native data pipelines and operations with an ELT philosophy (see the ELT sketch after this list)
  • Experience with data migration and data warehousing
  • An intuitive sense of how to organize, normalize, and store complex data, serving both ETL processes and end users
  • Passion for mapping and designing the ingestion and transformation of data from multiple sources into a cohesive data asset
  • Good understanding of developer tools, CI/CD, etc.
  • Excellent communication skills and empathy with end users and internal customers
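
To illustrate the ELT philosophy mentioned above, one possible sketch using the BigQuery Python client is shown below: raw files are loaded into a staging table first, and the transformation then runs as SQL inside the warehouse rather than in an external ETL tool. The project, bucket, dataset, table, and column names are hypothetical placeholders.

    # An ELT sketch with the BigQuery Python client: load raw data first,
    # then transform it with SQL inside the warehouse. Names are hypothetical.
    from google.cloud import bigquery

    client = bigquery.Client(project="example-project")

    # Extract + Load: copy newline-delimited JSON from Cloud Storage into a staging table.
    load_job = client.load_table_from_uri(
        "gs://example-bucket/raw/events-*.json",
        "example-project.staging.events_raw",
        job_config=bigquery.LoadJobConfig(
            source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
            autodetect=True,
            write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
        ),
    )
    load_job.result()  # wait for the load job to complete

    # Transform: aggregate the raw events into a reporting table with SQL.
    transform_sql = """
        CREATE OR REPLACE TABLE `example-project.analytics.daily_revenue` AS
        SELECT DATE(event_ts) AS day, SUM(amount) AS revenue
        FROM `example-project.staging.events_raw`
        GROUP BY day
    """
    client.query(transform_sql).result()

Keeping the heavy transformation in SQL inside BigQuery is what distinguishes ELT from classic ETL, where data is reshaped before it reaches the warehouse.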

Nice-to-have:

  • Google Cloud Data Engineer Certification
  • Experience with the Big Data ecosystem: Hadoop, Hive, HDFS, HBase
  • Experience with Agile methodologies and DevOps principles

Location: Milano, Bari, Bologna, Cosenza, Napoli, Roma, Salerno, Torino, Treviso, Pisa

Type
Hybrid/Flexible