
AWS - Data Architect

NTT

Bologna

On-site

EUR 50.000 - 80.000

Full-time

4 days ago

Job Description

NTT is seeking an AWS Data Architect to enhance data platform projects. The role involves designing data pipelines and solutions on AWS while collaborating with business analysts for effective outcomes. Ideal candidates should have extensive experience in cloud environments, especially AWS, along with strong programming skills in languages like Java and Python.

Skills

  • At least 5 years of experience on AWS preferred.
  • Experience in cloud migrations and workload management.
  • Strong knowledge in Java, Scala, and Python.

Duties

  • Design and implement data ingestion solutions on AWS.
  • Optimize data models and architecture for performance.
  • Mentor engineers in coding best practices.

Knowledge

Cloud Native Environment
Big Data Architectures
AWS Networking
Java
Python
SQL
Agile Methodology

Training

AWS Certified Solutions Architect
AWS Certified Data Analytics

Tools

AWS Services
Apache Spark
GitHub

Full Job Description

For its Data Intelligence Line, NTT DATA is looking for an AWS Data Architect.

Your role as an AWS Data Architect at NTT DATA will be to support customers on AWS Data Platform projects, working in partnership with business analysts and solution architects to understand use cases, data needs, and outcome objectives.

We are looking for an engineer who can act as the bridge between Data Science and Data Engineering, with a clear understanding of both worlds.

Responsibilities:

  • Create and maintain optimal data pipeline architecture by designing and implementing data ingestion solutions on AWS, using AWS-native services (such as Glue and Lambda) or data management technologies (such as Talend or Informatica)
  • Design and optimize data models on AWS Cloud using AWS data stores such as Redshift, DynamoDB, RDS, and S3
  • Design operations architecture and conduct performance engineering for large-scale data lakes in production environments
  • Participate in client design workshops and provide trade-offs and recommendations towards building solutions
  • Mentor other engineers in coding best practices and problem solving
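As an illustrative sketch (not part of the posting), an S3-triggered ingestion entry point of the kind the first responsibility describes might look like the following. The event shape is the standard S3 notification payload delivered to Lambda; the function name and the downstream hand-off comment are assumptions for illustration, and real code would use boto3 to start a Glue job or write to a data store.

```python
import json

def lambda_handler(event, context):
    """Minimal sketch of an S3-triggered ingestion Lambda.

    Parses the standard S3 event notification payload and collects the
    object URIs to ingest. In a real deployment this is where the object
    would be handed off (e.g. to an AWS Glue job) via boto3; here we only
    parse the event so the logic can be exercised locally.
    """
    ingested = []
    for record in event.get("Records", []):
        s3 = record.get("s3", {})
        bucket = s3.get("bucket", {}).get("name")
        key = s3.get("object", {}).get("key")
        if bucket and key:
            # Real code: trigger a Glue job, load into Redshift, etc.
            ingested.append(f"s3://{bucket}/{key}")
    return {"statusCode": 200, "body": json.dumps({"ingested": ingested})}
```

Because the handler only parses the event payload, it can be invoked locally with a hand-written test event (the bucket and key below are hypothetical), which is a common way to unit-test such functions before deployment.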

Required Skills

  • Working experience in a cloud-native environment on one of the three major public clouds (GCP, AWS, Azure); at least 5 years' experience on AWS is preferred
  • Experience with and knowledge of Big Data architectures, both cloud and on-premises
  • AWS infrastructure and networking working experience
  • Main AWS storage services: S3, RDS, Redshift, DynamoDB
  • Main AWS compute services: EC2, Lambda, ECS, EKS
  • Experience in building and delivering proofs of concept to address specific business needs, using the most appropriate techniques, data sources, and technologies
  • Working experience in migrating workloads from on-premises to cloud environments
  • Experience in monitoring distributed infrastructure using AWS tools or open-source ones such as CloudWatch, Prometheus, and the ELK stack
  • Proven experience in: Java, Scala, Python, and shell scripting
  • Working experience with Apache Spark, Databricks, Azure Data Factory, Azure Synapse, and other Azure-related ETL/ELT tools
  • AWS Certification: AWS Certified Solutions Architect and/or AWS Certified Data Analytics
  • Working experience with Agile Methodology and Kanban
  • SQL language knowledge
  • Experience working with source code management tools such as AWS CodeCommit or GitHub