Overview
Build and maintain AWS data lake infrastructure and develop robust data pipelines that ensure reliable, scalable data ingestion and processing.
Key Responsibilities
- Implement and optimize AWS data lake solutions to support business and technical requirements.
- Build, monitor, and maintain data integration pipelines for diverse data sources.
- Collaborate with cross-functional teams to ensure data quality, security, and compliance standards.
- Develop and maintain data pipeline documentation, monitoring, and alerting systems.
- Troubleshoot and resolve data pipeline issues to ensure high availability and performance.
Required Technical and Professional Expertise
- Strong hands-on experience with AWS services, particularly AWS Lake Formation, AWS Glue, Amazon S3, and related data services.
- 5-7+ years in data engineering, ETL development, or similar technical roles.
- Proficiency in programming languages such as Python and SQL, plus experience with data pipeline orchestration tools.
- Experience building and maintaining production data pipelines and data quality frameworks.
- Knowledge of data security best practices and compliance requirements, with public sector experience preferred.
- Experience with Infrastructure as Code (IaC) tools and DevOps practices for data platforms.