Google Cloud Data Engineer Certification

NTT DATA Corporation

L'Aquila

On-site

EUR 50,000 - 80,000

Full-time

2 days ago

Job Description

A leading company in data solutions is seeking a Data Architect to enable data-driven decisions through effective data management. This role involves designing and maintaining scalable data processing systems in a cloud environment, with responsibilities ranging from implementing advanced data pipelines to collaborating with multidisciplinary teams. Ideal candidates will have a strong background in data engineering and solid hands-on experience with cloud technologies.

Skills

  • 5-10 years of experience in a data engineering role.
  • Proficient in Scala, Java, or Python.
  • Strong knowledge of data technologies and data modeling.

Duties

  • Build large-scale batch and real-time data pipelines on GCP.
  • Collaborate with teams to prioritize business needs and address data system requirements.
  • Participate in project planning and track deliverables.

Knowledge

  • Data processing frameworks
  • Analytical approaches
  • Collaboration
  • BigQuery
  • Python
  • Cloud-native data pipelines
  • ETL processes
  • Data modeling
  • Communication skills

Education

Bachelor’s degree in Computer Science

Tools

  • Google Managed Services
  • Big Data technologies
  • CI/CD pipelines

Responsibilities

A Data Architect is an IT expert responsible for enabling data-driven decision making by collecting, transforming, and publishing data. At NTT DATA, a Data Architect should be capable of designing, building, operationalizing, securing, and monitoring data processing systems with a focus on security, compliance, scalability, efficiency, reliability, fidelity, flexibility, and portability. The primary goal of a Data Architect is to convert raw data into information, generating insights and business value.

  • Build large-scale batch and real-time data pipelines using data processing frameworks on the GCP cloud platform.
  • Apply analytical, data-driven approaches to gain a deep understanding of rapidly changing business needs.
  • Collaborate with teams to evaluate business needs and priorities, liaise with key business partners, and address team requirements related to data systems and management.
  • Participate in project planning by identifying milestones, deliverables, and resource needs; track activities and task execution.

Required Skills

  • Bachelor’s degree in Computer Science, Computer Engineering, or a related field.
  • 5-10 years of experience in a data engineering role.
  • Proficiency as a software engineer using Scala, Java, or Python.
  • Advanced SQL skills, preferably with BigQuery.
  • Good knowledge of Google Managed Services such as Cloud Storage, BigQuery, Dataflow, Dataproc, and Data Fusion.
  • Experience with workflow management tools.
  • Solid understanding of GCP architecture for batch and streaming data processing.
  • Strong knowledge of data technologies and data modeling.
  • Expertise in building modern, cloud-native data pipelines and operations following an ELT approach.
  • Experience with data migration and data warehousing.
  • Ability to organize, normalize, and store complex data to support ETL processes and end-user needs.
  • Passion for designing ingestion and transformation processes for data from multiple sources to create cohesive data assets.
  • Good understanding of developer tools, CI/CD pipelines, etc.
  • Excellent communication skills, with empathy towards end users and internal customers.

Nice-to-have:

  • Experience with Big Data ecosystems such as Hadoop, Hive, HDFS, HBase.
  • Experience with Agile methodologies and DevOps principles.
