GCP Data Engineer / Data Architect

NTT

Bologna

On-site

EUR 50,000 - 80,000

Full-time

29 days ago

Job description

A leading company in IT services is seeking a GCP Data Engineer/Data Architect to join their Bologna team. The ideal candidate will be responsible for building data pipelines, ensuring data-driven decision-making, and collaborating with teams to meet business needs. This exciting opportunity requires substantial experience in data engineering and software development in a cloud environment, particularly Google Cloud.

Skills

  • 5-10 years of experience in a data engineering role.
  • Experience with Google Managed Services.
  • Ability to organize and store complex data for ETL processes.

Responsibilities

  • Build large-scale batch and real-time data pipelines.
  • Collaborate with the team on evaluating business needs.
  • Participate in project planning and tracking of activities.

Knowledge

Software engineering using Scala
Software engineering using Java
Software engineering using Python
Advanced SQL skills
GCP architecture knowledge
Data modeling
Communication skills

Education

Bachelor’s degree in Computer Science
Bachelor's degree in Computer Engineering

Tools

Google Cloud Platform
BigQuery
Cloud Storage
Dataflow
Data Warehousing

GCP Data Engineer/Data Architect, Bologna

Client: NTT

Location: Bologna, Italy

Job Category: Other

EU work permit required: Yes

Job Reference: 0b4577785896

Posted: 09.07.2025

Expiry Date: 23.08.2025

Job Description:

NTT DATA, Trusted Global Innovator, is among the leading players worldwide in IT services. With over 140,000 professionals in more than 50 countries, we are at the forefront of digital transformation, offering tailored, innovative technological solutions. Our people are our engine, each with their own uniqueness, talent, and attitude. We foster a "Smile Working Company" culture that prioritizes care, listening, well-being, and skill development. Our workspaces promote community and a constructive exchange of experiences.

We look to the future with the same passion as yesterday and need your talent!

Responsibilities

A Data Architect is an IT expert who enables data-driven decision-making by collecting, transforming, and publishing data. At NTT DATA, a Data Architect should be capable of designing, building, operationalizing, securing, and monitoring data processing systems with a focus on security, compliance, scalability, efficiency, reliability, fidelity, flexibility, and portability. The main mission is to transform raw data into information, creating insights and business value.

  • Build large-scale batch and real-time data pipelines using data processing frameworks on Google Cloud Platform (see the pipeline sketch after this list).
  • Apply analytical, data-driven approaches to understand rapidly changing business needs.
  • Collaborate with the team to evaluate business needs, liaise with key partners, and address data system management needs.
  • Participate in project planning, identifying milestones, deliverables, and resources; track activities and tasks.
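
To illustrate the kind of pipeline described above, here is a minimal, hedged sketch of a batch job using Apache Beam's Python SDK, which can be run on Dataflow. The bucket, project, dataset, table, and field names below are hypothetical placeholders, not details taken from this posting.

    import csv

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions


    def parse_row(line):
        """Turn one CSV line into a BigQuery-ready dict (hypothetical schema)."""
        user_id, amount, ts = next(csv.reader([line]))
        return {"user_id": user_id, "amount": float(amount), "ts": ts}


    # Runs locally with the DirectRunner by default; pass --runner=DataflowRunner
    # plus --project, --region, and --temp_location flags to execute on Dataflow.
    options = PipelineOptions()

    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            # Read raw files from a (hypothetical) Cloud Storage bucket.
            | "ReadCsv" >> beam.io.ReadFromText(
                "gs://example-bucket/events/*.csv", skip_header_lines=1)
            # Transform each line into a structured record.
            | "ParseRows" >> beam.Map(parse_row)
            # Append the records to a (hypothetical) BigQuery table.
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "example-project:analytics.events",
                schema="user_id:STRING,amount:FLOAT,ts:TIMESTAMP",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED)
        )

For the real-time half of the role, the same pipeline shape typically reads from Pub/Sub and applies windowing before writing out, still within Beam/Dataflow.
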
Required Skills
  • Bachelor’s degree in Computer Science, Computer Engineering, or relevant field.
  • 5-10 years of experience in a data engineering role.
  • Expertise in software engineering using Scala/Java/Python.
  • Advanced SQL skills, preferably with BigQuery.
  • Good knowledge of Google Managed Services such as Cloud Storage, BigQuery, Dataflow, Dataproc, and Data Fusion.
  • Experience with workflow management tools.
  • Strong understanding of GCP architecture for batch and streaming data processing.
  • Deep knowledge of data technologies and data modeling.
  • Experience building modern, cloud-native data pipelines with an ELT approach (see the ELT sketch after this list).
  • Experience with Data Migration and Data Warehousing.
  • Ability to organize, normalize, and store complex data for ETL processes and end-users.
  • Passion for designing ingestion and transformation of data from multiple sources.
  • Familiarity with developer tools and CI/CD processes.
  • Excellent communication skills, empathetic towards end users and internal customers.
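
As a hedged illustration of the ELT approach and BigQuery skills listed above, the sketch below loads raw data from Cloud Storage into a staging table and then transforms it in place with SQL, using the google-cloud-bigquery Python client. The project, bucket, dataset, and table names are hypothetical.

    from google.cloud import bigquery

    client = bigquery.Client(project="example-project")  # hypothetical project id

    # Extract/Load: copy raw CSV files from Cloud Storage into a staging table.
    load_job = client.load_table_from_uri(
        "gs://example-bucket/raw/orders_*.csv",      # hypothetical bucket
        "example-project.staging.orders_raw",        # hypothetical table
        job_config=bigquery.LoadJobConfig(
            source_format=bigquery.SourceFormat.CSV,
            skip_leading_rows=1,
            autodetect=True,
            write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
        ),
    )
    load_job.result()  # wait for the load job to finish

    # Transform: reshape the raw data inside the warehouse with SQL.
    client.query(
        """
        CREATE OR REPLACE TABLE `example-project.analytics.daily_revenue` AS
        SELECT DATE(order_ts) AS order_date, SUM(amount) AS revenue
        FROM `example-project.staging.orders_raw`
        GROUP BY order_date
        """
    ).result()

The point of the ELT pattern is that the heavy transformations run as SQL inside BigQuery after loading, rather than in an external ETL engine before loading.
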
Nice-to-have:
  • Google Cloud Data Engineer Certification.
  • Experience with Big Data ecosystems like Hadoop, Hive, HDFS, HBase.
  • Experience with Agile methodologies and DevOps principles.

Location: Milano, Bari, Bologna, Cosenza, Napoli, Roma, Salerno, Torino, Treviso, Pisa
