
Google Cloud Data Engineer Certification

NTT DATA Corporation

Venice

On-site

EUR 40,000 - 70,000

Full-time

16 days ago

Job Description

A leading technology company is seeking a Data Architect to facilitate data-driven decision-making. The role involves designing and monitoring data processing systems, requiring extensive experience in data engineering and a strong proficiency in SQL and cloud technologies. The ideal candidate will effectively turn raw data into actionable insights, ensuring compliance and security while creating significant business value.

Skills

  • 5-10 years of experience in a data engineering role.
  • Expertise with data processing frameworks on GCP.
  • Strong understanding of ETL processes and data normalization.

Responsibilities

  • Build large-scale batch and real-time data pipelines in GCP.
  • Collaborate with teams to address data system requirements.
  • Participate in project planning, including milestones and deliverables.

Knowledge

SQL
Scala
Java
Python
Data modeling
Data technologies
GCP architecture

Education

Bachelor’s degree in Computer Science

Tools

BigQuery
Cloud Storage
Dataflow
Dataproc
Data Fusion


Responsibilities

A Data Architect is an IT expert who enables data-driven decision making by collecting, transforming, and publishing data. At NTT Data, a Data Architect should be able to design, build, operationalize, secure, and monitor data processing systems with a focus on security, compliance, scalability, efficiency, reliability, fidelity, flexibility, and portability. The main mission of a Data Architect is to turn raw data into information, creating insights and business value.

  • Build large-scale batch and real-time data pipelines using data processing frameworks on the GCP cloud platform.
  • Apply an analytical, data-driven approach to understand rapidly changing business needs.
  • Collaborate with the team to evaluate business needs and priorities, liaise with key business partners, and address team requirements related to data systems and management.
  • Participate in project planning by identifying milestones, deliverables, and resource requirements; track activities and task execution.

Required Skills

  • Bachelor’s degree in Computer Science, Computer Engineering, or a relevant field.
  • 5-10 years of experience in a data engineering role.
  • Expertise as a software engineer using Scala, Java, or Python.
  • Advanced SQL skills, preferably with BigQuery.
  • Good knowledge of Google Managed Services such as Cloud Storage, BigQuery, Dataflow, Dataproc, and Data Fusion.
  • Experience with workflow management tools.
  • Strong understanding of GCP architecture for batch and streaming data processing.
  • Deep knowledge of data technologies and data modeling.
  • Expertise in building modern, cloud-native data pipelines and operations following an ELT philosophy.
  • Experience with data migration and data warehouse solutions.
  • Ability to organize, normalize, and store complex data effectively for ETL processes and end-user access.
  • Passion for designing data ingestion and transformation processes from multiple sources to create cohesive data assets.
  • Good understanding of developer tools, CI/CD practices, etc.
  • Excellent communication skills, with empathy for end users and internal customers.

Nice-to-have:

  • Experience with Big Data ecosystem tools such as Hadoop, Hive, HDFS, HBase.
  • Experience with Agile methodologies and DevOps principles.

