Senior Data Engineer / AWS Migration Lead
Key Responsibilities
1. Migration & Technical Implementation
- Lead migration of on-premises and SharePoint/legacy systems to our cloud-based data platform.
- Collaborate with cloud engineering teams to develop migration strategies and implementation plans.
- Re-architect existing Python scripts and UiPath automation workflows for cloud-native services.
- Design and implement scalable ETL/ELT pipelines and data integration workflows; a minimal sketch follows this list.
- Migrate Tableau dashboards to modern BI platforms such as QuickSight, ensuring full functionality.
- Establish and validate data connections from multiple enterprise source systems.
- Ensure a seamless transition with minimal disruption to business operations.
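To illustrate the kind of cloud-native ETL work this role leads, below is a minimal AWS Glue (PySpark) sketch that reads raw CSV from S3, applies light cleansing, and writes partitioned Parquet; the bucket names, paths, and columns are hypothetical placeholders, not our actual conventions.

```python
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext
from pyspark.sql import functions as F

# Standard Glue job bootstrap: resolve arguments and initialise the job.
args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
spark = glue_context.spark_session
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read the raw zone (schema inferred here for brevity; prefer an explicit schema).
raw = (
    spark.read.option("header", "true")
    .csv("s3://example-raw-zone/sales/")  # hypothetical source bucket
)

# Light cleansing: drop fully-empty rows and stamp each record with a load date.
cleaned = raw.dropna(how="all").withColumn("load_date", F.current_date())

# Write curated, partitioned Parquet for downstream Athena/QuickSight use.
(
    cleaned.write.mode("overwrite")
    .partitionBy("load_date")
    .parquet("s3://example-curated-zone/sales/")  # hypothetical target bucket
)

job.commit()
```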
2. Data Infrastructure & Architecture
- Plan and guide data engineering processes across our digital systems.
- Develop and maintain data lakes, warehouses, and database infrastructure (see the catalog sketch after this list).
- Define data architecture standards and DevOps best practices.
- Support the development of pipelines integrating data from multiple platforms.
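As a concrete illustration of data-lake upkeep, the boto3 sketch below registers a curated dataset in the AWS Glue Data Catalog so it becomes queryable from Athena; the database, table, schema, and S3 locations are placeholder examples.

```python
import boto3

glue = boto3.client("glue")

# Register an external Parquet table over the curated zone (hypothetical names).
glue.create_table(
    DatabaseName="curated",
    TableInput={
        "Name": "sales",
        "TableType": "EXTERNAL_TABLE",
        "Parameters": {"classification": "parquet"},
        "StorageDescriptor": {
            "Location": "s3://example-curated-zone/sales/",
            "InputFormat": "org.apache.hadoop.hive.ql.io.parquet.MapredParquetInputFormat",
            "OutputFormat": "org.apache.hadoop.hive.ql.io.parquet.MapredParquetOutputFormat",
            "SerdeInfo": {
                "SerializationLibrary": "org.apache.hadoop.hive.ql.io.parquet.serde.ParquetHiveSerDe"
            },
            "Columns": [
                {"Name": "order_id", "Type": "string"},
                {"Name": "amount", "Type": "double"},
            ],
        },
        "PartitionKeys": [{"Name": "load_date", "Type": "date"}],
    },
)
```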
3. Data Quality & Governance
- Implement data validation, cleansing, and harmonisation processes to ensure data integrity; see the validation sketch after this list.
- Monitor pipelines for issues and performance bottlenecks.
- Support data governance, access control, and compliance needs.
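A lightweight sketch of the validation and cleansing work described above, assuming pandas and a hypothetical required-column contract; in production these rules would typically live in the pipeline framework or a dedicated data-quality tool.

```python
import pandas as pd

REQUIRED_COLUMNS = {"order_id", "amount", "load_date"}  # illustrative contract

def validate_and_clean(df: pd.DataFrame) -> pd.DataFrame:
    """Enforce a minimal data contract, then deduplicate and harmonise types."""
    missing = REQUIRED_COLUMNS - set(df.columns)
    if missing:
        raise ValueError(f"missing required columns: {sorted(missing)}")

    out = df.drop_duplicates(subset=["order_id"]).copy()
    out["amount"] = pd.to_numeric(out["amount"], errors="coerce")

    # Surface, rather than silently drop, rows that fail type coercion.
    bad_rows = int(out["amount"].isna().sum())
    if bad_rows:
        print(f"warning: {bad_rows} rows have non-numeric amounts")
    return out
```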
4. System Maintenance & Documentation
- Maintain cloud and on-prem data infrastructure.
- Troubleshoot and resolve data-related issues.
- Create and maintain documentation for engineering processes.
5. Stakeholder Collaboration
- Work with product teams, analysts, and stakeholders to translate business requirements into technical solutions.
- Develop and deploy data tables, marts, and visualisation layers for reporting; a mart-materialisation sketch follows this list.
- Support users in accessing analytics dashboards.
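For illustration, the sketch below materialises a simple reporting mart as an Athena CTAS table that a QuickSight dataset could point at; the database, table names, and S3 locations are hypothetical.

```python
import boto3

athena = boto3.client("athena")

# CTAS query: build a Parquet-backed daily sales mart (hypothetical schema).
CTAS = """
CREATE TABLE curated.daily_sales_mart
WITH (format = 'PARQUET',
      external_location = 's3://example-curated-zone/marts/daily_sales/') AS
SELECT load_date, SUM(amount) AS total_amount
FROM curated.sales
GROUP BY load_date
"""

response = athena.start_query_execution(
    QueryString=CTAS,
    QueryExecutionContext={"Database": "curated"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)
print("query execution id:", response["QueryExecutionId"])
```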
6. Training & Knowledge Transfer
- Mentor and train junior engineers.
- Conduct knowledge-sharing sessions.
- Prepare training materials for new systems.
7. Innovation & Continuous Improvement
- Stay up to date with emerging technologies and contribute to process enhancements.
- Participate in continuous learning and internal capability development.
Skills & Knowledge
- Proficiency in Python and familiarity with Java or Scala.
- Strong knowledge of SQL, data modelling, schema design, and ETL/ELT processes.
- Experience with AWS services such as S3, Lambda, Glue, SageMaker, Athena, RDS, QuickSight.
- Familiarity with big data and analytics frameworks (e.g., Spark, Databricks).
- Experience with version control (Git) and DevOps pipelines (GitLab CI/CD, Nexus).
- Understanding of serverless architectures and Infrastructure-as-Code (CloudFormation, YAML/JSON); a brief IaC sketch follows this list.
- Knowledge of data governance, security best practices, and enterprise access controls.
- Experience with BI tools such as Tableau, Power BI, or QuickSight.
- Strong communication, analytical thinking, and stakeholder management skills.
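To give a flavour of the Infrastructure-as-Code skills listed above, here is a minimal AWS CDK (Python) sketch that synthesises a CloudFormation template for one data-lake bucket; the stack and bucket names are illustrative.

```python
from aws_cdk import App, RemovalPolicy, Stack
from aws_cdk import aws_s3 as s3
from constructs import Construct

class DataLakeStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # Versioned, encrypted bucket for the raw zone of the data lake.
        s3.Bucket(
            self,
            "RawZoneBucket",
            versioned=True,
            encryption=s3.BucketEncryption.S3_MANAGED,
            removal_policy=RemovalPolicy.RETAIN,  # keep data if the stack is deleted
        )

app = App()
DataLakeStack(app, "DataLakeStack")
app.synth()  # emits the CloudFormation template
```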
Requirements
- Bachelor’s degree in Computer Science, Data Science, Engineering, IT, or a related field.
- 5–7+ years of experience in data engineering, cloud infrastructure, or platform engineering.
- Experience with cloud platforms (preferably AWS) and data pipeline development.
- Experience with data visualisation and business intelligence tools.
- Familiarity with agile development methodologies.
- Relevant certifications such as AWS Certified Data Engineer or Databricks Certified Data Engineer are advantageous.