GCP Data Engineer/Data Architect

NTT

Torino

On-site

EUR 40,000 - 70,000

Full-time

30+ days ago

Job Description

A leading company in the tech sector is seeking a skilled Data Architect to enable data-driven decision making. You'll design and operationalize data systems that emphasize security, compliance, and scalability. With a focus on building efficient cloud-native data pipelines, this position will leverage advanced data technologies. Ideal candidates will have extensive experience in data engineering, cloud environments, and excellent communication skills, contributing significantly to business value.

Skills

  • 5-10 years’ experience in a data engineering role.
  • Expertise in Scala/Java/Python and Advanced SQL.
  • Experience with data migration and building cloud-native data pipelines.

Responsibilities

  • Build large-scale batch and real-time data pipelines.
  • Drive understanding of changing business needs.
  • Participate in project planning and track execution.

Knowledge

Data Processing
Data Modeling
Communication
Analytical Thinking
Python
SQL
Scala/Java
Cloud Architecture
Agile Methodologies
DevOps Principles

Education

Bachelor’s degree in Computer Science or relevant field

Tools

BigQuery
GCP
Cloud Storage
Dataflow
Hadoop
Hive

Responsibilities

A Data Architect is an IT expert who enables data-driven decision making by collecting, transforming, and publishing data. At NTT Data, a Data Architect should be able to design, build, operationalize, secure, and monitor data processing systems with a particular emphasis on security and compliance; scalability and efficiency; reliability and fidelity; and flexibility and portability. The main mission of a Data Architect is to turn raw data into information, creating insight and business value.

  • Build large-scale batch and real-time data pipelines with data processing frameworks on the GCP cloud platform.
  • Use an analytical, data-driven approach to develop a deep understanding of fast-changing business needs.
  • Work with the team to evaluate business needs and priorities, liaise with key business partners, and address team needs related to data systems and management.
  • Participate in project planning, identifying milestones, deliverables, and resource requirements; track activities and task execution.

Required Skills

  • Bachelor’s degree in Computer Science, Computer Engineering, or a relevant field
  • 5-10 years’ experience in a data engineering role
  • Software engineering expertise in Scala/Java/Python
  • Advanced SQL skills, preferably with BigQuery
  • Good knowledge of Google managed services such as Cloud Storage, BigQuery, Dataflow, Dataproc, and Data Fusion
  • Experience with workflow management tools
  • Good understanding of GCP architecture for batch and streaming workloads
  • Strong knowledge of data technologies and data modeling
  • Expertise in building modern, cloud-native data pipelines and operations with an ELT philosophy
  • Experience with data migration / data warehousing
  • Intuitive sense of how to organize, normalize, and store complex data, serving both ETL processes and end users
  • Passion for mapping and designing the ingestion and transformation of data from multiple sources into a cohesive data asset
  • Good understanding of developer tools, CI/CD, etc.
  • Excellent communication skills; empathy with end users and internal customers

Nice-to-have:

  • Experience with the big data ecosystem: Hadoop, Hive, HDFS, HBase
  • Experience with Agile methodologies and DevOps principles