Develop and optimize data pipelines using AWS Glue and Lambda functions.
Utilize Python to build reusable modules, APIs (FastAPI / Flask / Django), and automation scripts.
Lead project teams in designing end-to-end data pipelines and ETL workflows.
Troubleshoot data processing issues and optimize performance for ETL workflows.
Experiences :
Experienced in working with Python projects for data processing and data transformation, and in building reusable, modular components. Experience with data migration is a plus.
Experienced in both traditional Python development and notebook-based workflows (e.g., Jupyter).
Experienced in AWS components such as S3, Glue, Lambda, and Kinesis Data Streams (KDS).
Experienced with DevOps practices and infrastructure oversight.
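As a minimal sketch of the kind of reusable, modular Python described above, the example below (hypothetical names and logic, not a specific team's code) shows a pure transformation function wrapped in an AWS Lambda-style handler:

```python
# Hypothetical sketch: a reusable transformation function that a
# Lambda handler wraps. The pure function is easy to unit-test and
# reuse outside of AWS.
import json


def normalize_record(record: dict) -> dict:
    """Lowercase keys and strip whitespace from string values."""
    return {
        key.lower(): value.strip() if isinstance(value, str) else value
        for key, value in record.items()
    }


def lambda_handler(event, context):
    """Lambda-style entry point: transform each record in the payload."""
    records = json.loads(event["body"])
    cleaned = [normalize_record(r) for r in records]
    return {"statusCode": 200, "body": json.dumps(cleaned)}
```

Keeping the transformation logic separate from the handler is what makes such modules reusable across Glue jobs, Lambda functions, and local tests.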
Required Qualifications :
5+ years of experience in Data Engineering with a focus on cloud-based solutions.
Strong expertise in AWS services like S3, Glue, Lambda, and IAM configurations.
Bachelor’s degree in Computer Science, Information Technology, or related fields.
Soft Skills :
Excellent problem-solving skills with a collaborative mindset.
Strong leadership and communication skills for working with cross-functional teams.
Ability to manage multiple tasks, prioritize effectively, and deliver high-quality results.