
Google Cloud Data Engineer Certification

NTT DATA Corporation

Cremona

On-site

EUR 45,000 - 70,000

Full-time

2 days ago


Job Description

A leading company seeks a Data Architect to drive data-driven decision-making. This role focuses on designing and building secure, scalable data pipelines while ensuring compliance and efficiency. Ideal candidates will bring at least 5 years of experience in data engineering and a strong technical background in programming languages and cloud services.

Skills

  • 5-10 years’ experience in a data engineering role.
  • Expertise in Scala, Java, Python, and Advanced SQL.
  • Experience with Google Managed Services and Data Migration.

Duties

  • Build large-scale batch and real-time data pipelines on GCP.
  • Work with business partners to evaluate needs related to data systems.
  • Participate in project planning and track task execution.

Knowledge

Data Technologies
Data Modeling
ETL
Scala
Java
Python
SQL
GCP Architecture
Agile Methodologies
DevOps Principles

Education

Bachelor’s degree in Computer Science or relevant field

Tools

BigQuery
Cloud Storage
Dataflow
Dataproc
Data Fusion
Hadoop
Hive
HDFS
HBase


Responsibilities

A Data Architect is an IT expert who enables data-driven decision-making by collecting, transforming, and publishing data. At NTT DATA, a Data Architect should be able to design, build, operationalize, secure, and monitor data processing systems, with particular emphasis on security and compliance, scalability and efficiency, reliability and fidelity, and flexibility and portability. The main mission of a Data Architect is to turn raw data into information, creating insight and business value.

  • Build large-scale batch and real-time data pipelines with data processing frameworks on the GCP cloud platform (see the sketch after this list).
  • Use an analytical, data-driven approach to drive a deep understanding of a fast-changing business.
  • Work with the team to evaluate business needs and priorities, liaise with key business partners, and address team needs related to data systems and management.
  • Participate in project planning; identify milestones, deliverables, and resource requirements; track activities and task execution.
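
For illustration only: a minimal sketch of the kind of batch pipeline described in the first bullet, assuming the Apache Beam Python SDK running on Dataflow. The project, bucket, table, and schema names are hypothetical placeholders, not details from the posting.

# Batch pipeline sketch: read CSV rows from Cloud Storage, parse them,
# and append them to a BigQuery table. All resource names are hypothetical.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_row(line: str) -> dict:
    """Turn one CSV line 'name,amount' into a BigQuery-ready dict."""
    name, amount = line.split(",")
    return {"name": name.strip(), "amount": float(amount)}


def run() -> None:
    options = PipelineOptions(
        runner="DataflowRunner",       # use "DirectRunner" for local testing
        project="my-gcp-project",      # hypothetical project id
        region="europe-west1",
        temp_location="gs://my-bucket/tmp",
    )
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadFromGCS" >> beam.io.ReadFromText(
                "gs://my-bucket/raw/orders-*.csv", skip_header_lines=1)
            | "ParseRows" >> beam.Map(parse_row)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-gcp-project:analytics.orders",
                schema="name:STRING,amount:FLOAT",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )


if __name__ == "__main__":
    run()

A streaming variant would typically swap the text source for a Pub/Sub subscription and run the same pipeline in streaming mode.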

Required Skills

  • Bachelor’s degree in Computer Science, Computer Engineering or relevant field.
  • 5-10 years’ experience in a data engineering role.
  • Expertise as a software engineer using Scala, Java, or Python.
  • Advanced SQL skills, preferably with BigQuery.
  • Good knowledge of Google Managed Services such as Cloud Storage, BigQuery, Dataflow, Dataproc, and Data Fusion.
  • Experience using workflow management tools.
  • Good understanding of GCP Architecture for batch and streaming.
  • Strong knowledge of data technologies and data modeling.
  • Expertise in building modern, cloud-native data pipelines and operations with an ELT philosophy (see the ELT sketch after this list).
  • Experience with Data Migration / Data Warehouse.
  • An intuitive sense of how to organize, normalize, and store complex data, serving both ETL processes and end users.
  • Passion for mapping and designing ingestion and transformation of data from multiple sources, creating a cohesive data asset.
  • Good understanding of developer tooling, CI/CD, and related practices.
  • Excellent communication skills and empathy with end users and internal customers.
  • Nice-to-have:
  • Experience with the Big Data ecosystem: Hadoop, Hive, HDFS, HBase.
  • Experience with Agile methodologies and DevOps principles.
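
As a rough illustration of the ELT philosophy mentioned above (land raw data first, transform inside the warehouse), a sketch using the google-cloud-bigquery Python client; the project, dataset, and table names are hypothetical and not taken from the posting.

# ELT sketch: load raw CSV files into a staging table unchanged, then
# transform with SQL inside BigQuery. All resource names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")  # hypothetical project id

# 1) Load: land the raw files in a staging table without reshaping them.
load_job = client.load_table_from_uri(
    "gs://my-bucket/raw/orders-*.csv",
    "my-gcp-project.staging.orders_raw",
    job_config=bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,
        autodetect=True,
        write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
    ),
)
load_job.result()  # wait for the load job to finish

# 2) Transform: reshape and aggregate inside the warehouse itself.
transform_sql = """
CREATE OR REPLACE TABLE analytics.orders_daily AS
SELECT DATE(order_ts) AS order_date,
       customer_id,
       SUM(amount)    AS total_amount
FROM staging.orders_raw
GROUP BY order_date, customer_id
"""
client.query(transform_sql).result()

The point of the pattern is that the heavy transformation runs as SQL inside BigQuery rather than in the ingestion layer.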
