
AWS Data Architect

Fractal

Dadri

On-site

INR 20,00,000 - 30,00,000

Full time

Yesterday


Job summary

A global analytics company located in Dadri, Uttar Pradesh, is seeking an experienced AWS Data Architect. The role involves designing and operationalizing large-scale enterprise data solutions on AWS, leveraging tools such as Spark and DynamoDB. Candidates should have 9 to 15 years of experience and hold a Bachelor's Degree in a relevant field. Strong proficiency in AWS services and big data architectures is essential. Join a passionate team driving innovative data-driven strategies!

Qualifications

  • 9 to 15 years of experience in the industry.
  • Proficiency in AWS main Compute Services: EC2, Lambda, ECS, EKS.
  • Experience in monitoring distributed infrastructure.

Responsibilities

  • Consult and build large scale enterprise data solutions.
  • Analyze and re-architect on-premises data stores to AWS.
  • Design data pipelines within a big data architecture.

Skills

Big Data Architectures
AWS Collection Services
Java
Scala
Python
SQL
Agile Methodology
Data Analytics

Education

Bachelor’s Degree in Computer Science

Tools

AWS Redshift
AWS S3
AWS Glue
AWS Lambda
Apache Spark
Snowflake on AWS

Job description

It's fun to work in a company where people truly BELIEVE in what they are doing!

We're committed to bringing passion and customer focus to the business.

AWS Data Architect

Fractal is a strategic AI and analytics partner to Fortune 500 companies, powering human decisions at scale by integrating AI, Engineering, and Design. With 5000+ consultants across 16 global locations, Fractal combines cutting‑edge technology with human‑centered design.

Responsibilities
  • Consult on, design, build, and operationalize large-scale enterprise data solutions using one or more AWS data and analytics services in combination with third-party tools - Spark, EMR, DynamoDB, Redshift, Kinesis, Lambda, Glue, Snowflake, and Databricks.
  • Analyze, re-architect, and re-platform on-premises data stores/databases to modern data platforms on the AWS cloud using AWS or third-party services.
  • Design and build production data pipelines from ingestion to consumption within a big data architecture, using Java, Python, Scala.
  • Design and optimize data models on AWS Cloud using AWS data stores such as Redshift, DynamoDB, RDS, S3
  • Design and implement data engineering, ingestion and curation functions on AWS cloud using AWS native or custom programming.
  • Perform detailed assessments of current-state data platforms and create an appropriate transition path to the AWS cloud as part of customer consultation and business proposals.
  • Participate in client design workshops and provide trade‑offs and recommendations towards building solutions.
  • Mentor other engineers in coding best practices and problem solving.
Requirements
  • 9 to 15 years’ experience in the industry.
  • Bachelor’s Degree in Computer Science, Information Technology, or another relevant field.
  • Experience and knowledge of big data architectures, both on cloud and on premises.
  • Proficiency in AWS Collection Services: Kinesis, Kafka, Database Migration Service
  • Proficiency in AWS main Storage Service: S3, EBS, EFS
  • Proficiency in AWS main Compute Service: EC2, Lambda, ECS, EKS
  • Proven experience in: Java, Scala, Python, and shell scripting.
  • Working experience with: AWS Athena and Glue Pyspark, EMR, DynamoDB, Redshift, Kinesis, Lambda, Apache Spark, Databricks on AWS, Snowflake on AWS
  • Proficient in AWS Redshift, S3, Glue, Athena, DynamoDB
  • AWS Certification: AWS Certified Solutions Architect and/or AWS Certified Data Analytics
  • Working experience with Agile Methodology and Kanban
  • Good knowledge of SQL.
  • Experience in building and delivering proofs‑of‑concept, in order to address specific business needs, using the most appropriate techniques, data sources and technologies
  • Experience partnering with executive stakeholders as a trusted advisor as well as enabling technical implementers
  • Working experience in migrating workloads from on premise to cloud environment
  • Experience in monitoring distributed infrastructure using AWS tools or open-source ones such as CloudWatch, Prometheus, and the ELK stack would be a big advantage.

If you like wild growth and working with happy, enthusiastic over‑achievers, you'll enjoy your career with us!
