
AWS - Data Architect

NTT

Torino

On site

EUR 55.000 - 75.000

Full time

4 days ago

Job description

A leading data solutions company in Torino is seeking an AWS Data Architect to support AWS Data Platform projects. The role requires expertise in cloud-native architectures, AWS services, and mentoring skills. Ideal candidates should have at least 5 years of AWS experience and proficiency in data ingestion solutions. This position offers a dynamic environment with opportunities for growth.

Skills

  • At least 5 years of experience on AWS preferred.
  • Experience in migrating workloads to cloud environments.
  • Knowledge of SQL language.

Duties

  • Create and maintain optimal data pipeline architecture.
  • Design and optimize data models on AWS Cloud.
  • Participate in client design workshops.

Knowledge

Cloud-native environment experience
Big Data Architectures knowledge
AWS infrastructure and networking
Proficient in Java, Scala, Python
Experience with Apache Spark

Education

AWS Certification: Solutions Architect/Data Analytics

Tools

AWS CodeCommit
GitHub
CloudWatch
Prometheus
ELK stack


Responsibilities

For its Data Intelligence Line, NTT DATA is looking for an AWS Data Architect.

Your role as an AWS Data Architect at NTT DATA will be to support customers in AWS Data Platform projects, working in partnership with business analysts and solution architects to understand use cases, data needs, and outcome objectives.

We are looking for an engineer who can be the bridge between Data Science and Data Engineering, with a clear understanding of both worlds.

Responsibilities:

  • Create and maintain optimal data pipeline architecture by designing and implementing data ingestion solutions on AWS, using AWS native services (such as Glue and Lambda) or data management technologies (such as Talend or Informatica)
  • Design and optimize data models on AWS Cloud using AWS data stores such as Redshift, DynamoDB, RDS, S3
  • Design operations architecture and conduct performance engineering for large-scale data lakes in production environments
  • Participate in client design workshops and provide tradeoffs and recommendations towards building solutions.
  • Mentor other engineers in coding best practices and problem solving

Required Skills

  • Working experience in a cloud-native environment on one of the three major public clouds (AWS, Azure, GCP); at least 5 years' experience on AWS is preferred
  • Experience with and knowledge of Big Data architectures, both cloud and on-premise
  • Working experience with AWS infrastructure and networking
  • Main AWS storage services: S3, RDS, Redshift, DynamoDB
  • Main AWS compute services: EC2, Lambda, ECS, EKS
  • Experience in building and delivering proofs of concept to address specific business needs, using the most appropriate techniques, data sources, and technologies
  • Working experience in migrating workloads from on-premise to cloud environments
  • Experience in monitoring distributed infrastructure, using AWS tools or open-source ones such as CloudWatch, Prometheus, and the ELK stack
  • Proven experience in: Java, Scala, Python, and shell scripting
  • Working experience with: Apache Spark, Databricks, Azure Data Factory, Azure Synapse, and other Azure-related ETL/ELT tools
  • AWS Certification: AWS Certified Solutions Architect and/or AWS Certified Data Analytics
  • Working experience with Agile Methodology and Kanban
  • SQL language knowledge
  • Experience working with source code management tools such as AWS CodeCommit or GitHub