Google Cloud Data Engineer Certification

NTT DATA Corporation

Cagliari

On-site

EUR 45.000 - 75.000

Full-time

2 days ago

Job Description

A leading IT company is looking for an experienced Data Architect to turn data into strategic insights. The role involves designing and building data processing systems on Google Cloud, fostering an analytical approach, and collaborating with internal teams and business partners. Ideal candidates hold a degree in Computer Science and have significant experience in data engineering, with advanced knowledge of SQL and GCP architectures.

Skills

  • 5 to 10 years of experience in a data engineering role.
  • Ability to build modern, cloud-based data pipelines.
  • Excellent communication skills and empathy towards end users.

Duties

  • Build large-scale data pipelines using data processing frameworks on GCP.
  • Collaborate with the team to assess business needs and manage data system requirements.
  • Participate in project planning by identifying milestones and activities.

Knowledge

SQL
Scala
Java
Python
GCP architecture
Data modeling
Data migration

Education

Bachelor's degree in Computer Science or Computer Engineering

Tools

BigQuery
Google Cloud Storage
Dataflow
Hadoop
Hive
HDFS
HBase

Responsibilities

A Data Architect is an IT expert who enables data-driven decision making by collecting, transforming, and publishing data. At NTT DATA, a Data Architect should be able to design, build, operationalize, secure, and monitor data processing systems with a focus on security, compliance, scalability, efficiency, reliability, fidelity, flexibility, and portability. The main mission of a Data Architect is to turn raw data into information, creating insights and business value.

  • Build large-scale batch and real-time data pipelines using data processing frameworks on the GCP cloud platform.
  • Apply an analytical, data-driven approach to understand rapidly changing business needs.
  • Collaborate with the team to evaluate business needs and priorities, liaise with key business partners, and address team requirements related to data systems and management.
  • Participate in project planning by identifying milestones, deliverables, resource requirements, and tracking activities and task execution.

Required Skills

  • Bachelor’s degree in Computer Science, Computer Engineering, or a relevant field.
  • 5 to 10 years of experience in a data engineering role.
  • Expertise as a software engineer using Scala, Java, or Python.
  • Advanced SQL skills, preferably with BigQuery.
  • Good knowledge of Google Managed Services such as Cloud Storage, BigQuery, Dataflow, Dataproc, and Data Fusion.
  • Experience with workflow management tools.
  • Strong understanding of GCP architecture for batch and streaming data processing.
  • Extensive knowledge of data technologies and data modeling.
  • Experience in building modern, cloud-native data pipelines and operations following an ELT philosophy.
  • Experience with data migration and data warehouse solutions.
  • Ability to organize, normalize, and store complex data to enable both ETL processes and end-user access.
  • Passion for designing ingestion and transformation processes from multiple data sources to create cohesive data assets.
  • Good understanding of developer tools, CI/CD pipelines, etc.
  • Excellent communication skills, empathetic towards end users and internal customers.

Nice-to-have:

  • Experience with Big Data ecosystem tools such as Hadoop, Hive, HDFS, HBase.
  • Experience with Agile methodologies and DevOps principles.
