AWS - Data Architect

Salerno
EUR 50,000 - 70,000
Job description

Overview

NTT DATA, Trusted Global Innovator, is one of the world's leading players in IT services. With more than 140,000 professionals in over 50 countries worldwide, we drive and accelerate digital transformation, offering our clients tailor-made, innovative technology solutions. People are the engine of NTT DATA, each with their own uniqueness, talent, and attitude. We have built a Smile Working Company in which caring for and listening to people, their well-being, and the development of their skills are our priority. We look to our tomorrow with the same passion as yesterday, and we need your talent too!

Responsibilities

  • Create and maintain optimal data pipeline architecture by designing and implementing data ingestion solutions on AWS, using AWS-native services (such as Glue and Lambda) or data management technologies (such as Talend or Informatica)
  • Design and optimize data models on the AWS Cloud using AWS data stores such as Redshift, DynamoDB, RDS, and S3
  • Design the operations architecture and conduct performance engineering for large-scale data lakes in production environments
  • Participate in client design workshops and present trade-offs and recommendations for building solutions
  • Mentor other engineers in coding best practices and problem solving

Required Skills / Qualifications

  • Working experience in a cloud-native environment on one of the three major public clouds (GCP, AWS, Azure); at least five years of experience on AWS is preferred
  • Experience with and knowledge of big data architectures, both in the cloud and on premises
  • Working experience with AWS infrastructure and networking
  • Main AWS collection services: Kinesis, Kafka, Database Migration Service
  • Main AWS storage services: S3, RDS, Redshift, DynamoDB
  • Main AWS compute services: EC2, Lambda, ECS, EKS
  • Experience in building and delivering proofs of concept to address specific business needs, using the most appropriate techniques, data sources, and technologies
  • Working experience in migrating workloads from on-premises to cloud environments
  • Experience in monitoring distributed infrastructure, using AWS tools such as CloudWatch or open-source ones such as Prometheus and the ELK stack
  • Proven experience in Java, Scala, Python, and shell scripting
  • Working experience with Apache Spark, Databricks, Azure Data Factory, Azure Synapse, and other Azure-related ETL/ELT tools
  • AWS Certification: AWS Certified Solutions Architect and/or AWS Certified Data Analytics
  • Working experience with Agile Methodology and Kanban
  • Knowledge of SQL
  • Experience working with source code management tools such as AWS CodeCommit or GitHub
  • Location: Bologna, Roma, Milano, Torino, Bari, Cosenza, Napoli, Treviso, Pisa and Salerno