Google Cloud Data Engineer Certification

Udine
EUR 50,000 - 80,000
Job Description

Responsibilities

A Data Architect is an IT expert who enables data-driven decision making by collecting, transforming, and publishing data. At NTT Data, a Data Architect should be able to design, build, operationalize, secure, and monitor data processing systems with a focus on security, compliance, scalability, efficiency, reliability, fidelity, flexibility, and portability. The main mission of a Data Architect is to turn raw data into information, creating insights and business value.

  • Build large-scale batch and real-time data pipelines using data processing frameworks on Google Cloud Platform (GCP).
  • Apply an analytical, data-driven approach to gain a deep understanding of rapidly changing business needs.
  • Collaborate with the team to evaluate business needs and priorities, liaise with key business partners, and address team requirements related to data systems and management.
  • Participate in project planning, identifying milestones, deliverables, resource requirements, and tracking activities and task execution.

Required Skills

  • Bachelor’s degree in Computer Science, Computer Engineering, or a relevant field.
  • 5 to 10 years of experience in a data engineering role.
  • Software engineering expertise in Scala, Java, or Python.
  • Advanced SQL skills, preferably with BigQuery.
  • Good knowledge of Google Managed Services such as Cloud Storage, BigQuery, Dataflow, Dataproc, and Data Fusion.
  • Experience with workflow management tools.
  • Strong understanding of GCP architecture for batch and streaming data processing.
  • Extensive knowledge of data technologies and data modeling.
  • Proficiency in building modern, cloud-native data pipelines and operations following an ELT philosophy.
  • Experience with data migration and data warehousing.
  • Ability to organize, normalize, and store complex data to support ETL processes and end-user needs.
  • Passion for designing ingestion and transformation processes for data from multiple sources to create cohesive data assets.
  • Good understanding of developer tooling, including CI/CD pipelines.
  • Excellent communication skills, with empathy for end users and internal customers.

Nice to have:

  • Experience with Big Data ecosystems such as Hadoop, Hive, HDFS, and HBase.
  • Experience with Agile methodologies and DevOps principles.
