At Liberty, we employ more than 6,000 people across 7 businesses in 18 African countries.
Every day, our employees grow their knowledge by working with diverse groups of people who specialize in a wide range of skills across insurance, asset management, investment, and health products.
We continually seek to engage, develop, recognise, and reward the people who make our business great.
Job Description
We are looking for a Data Engineer to build and manage AWS data pipelines, ensuring our systems are scalable, reliable, and optimized for analytics, machine learning, and broader business needs.
Key Responsibilities
- Design and implement large-scale enterprise data solutions using technologies such as AWS Glue, AWS Step Functions, Amazon Redshift, AWS Lambda, Amazon Athena, AWS Lake Formation, Spark, and Python.
- Analyze, re-architect, and re-platform on-premises data warehouses to AWS cloud data platforms, utilizing AWS and third-party services.
- Design and build production data pipelines from ingestion to integration within a big data architecture, using PySpark, Python, and SQL.
- Design, implement, and support an analytical data infrastructure providing ad-hoc access to large datasets and computing power.
- Collaborate with other technology teams to extract, transform, and load data from various sources using SQL and AWS big data technologies.
- Research the latest big data and visualization technologies to enhance capabilities and efficiency.
- Implement advanced analytics algorithms for statistical analysis, prediction, clustering, and machine learning.
- Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
- Work with modern data practices, supporting highly available, distributed systems for data extraction, ingestion, and processing.
- Perform root cause analysis on data and processes to answer business questions and identify improvements.
- Build processes supporting data transformation, data structures, metadata, dependency, and workload management.
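To illustrate the ingestion-to-integration pattern described above, here is a minimal, self-contained sketch of an extract-transform-load (ETL) pipeline. It is a toy example only: the record shape and field names are hypothetical, an in-memory list stands in for a real source such as S3 or Kafka, and stdlib sqlite3 stands in for a warehouse like Amazon Redshift; in practice this role would build the equivalent with PySpark and AWS Glue.

```python
import sqlite3

# Hypothetical raw input; in production this would be read from S3, Kafka, etc.
RAW_RECORDS = [
    {"policy_id": "P-001", "premium": "1200.50", "status": "ACTIVE"},
    {"policy_id": "P-002", "premium": "n/a",     "status": "ACTIVE"},
    {"policy_id": "P-003", "premium": "890.00",  "status": "LAPSED"},
]

def extract(records):
    """Ingest raw records from the source (here, an in-memory list)."""
    return list(records)

def transform(records):
    """Clean and type the data: drop rows whose premium cannot be parsed."""
    cleaned = []
    for rec in records:
        try:
            premium = float(rec["premium"])
        except ValueError:
            continue  # a real pipeline would quarantine bad rows for review
        cleaned.append((rec["policy_id"], premium, rec["status"]))
    return cleaned

def load(rows):
    """Load the cleaned rows into a SQL store (sqlite3 as a stand-in)."""
    conn = sqlite3.connect(":memory:")
    conn.execute(
        "CREATE TABLE policies (policy_id TEXT, premium REAL, status TEXT)"
    )
    conn.executemany("INSERT INTO policies VALUES (?, ?, ?)", rows)
    return conn

# Run the pipeline end to end, then serve an ad-hoc SQL query.
conn = load(transform(extract(RAW_RECORDS)))
active_total = conn.execute(
    "SELECT SUM(premium) FROM policies WHERE status = 'ACTIVE'"
).fetchone()[0]
```

The same three-stage structure scales out directly: in a distributed setting the transform step becomes a Spark job and the load step writes to partitioned warehouse tables rather than a single connection.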
Minimum Requirements
- Proficiency in SQL, Python, and PySpark.
- Experience building and operating large-scale data pipelines and distributed systems.
- Experience with AWS services such as Glue, Athena, DynamoDB, Redshift, Lambda, and Step Functions.
- Bachelor's Degree in Computer Science, Information Technology, or a related field.
- Experience in data engineering roles supporting cross-functional teams.
Preferred Skills
- Experience with Amazon EC2, EMR, and RDS.
- Knowledge of software engineering best practices, agile methodologies, and DevOps.
Liberty Group Limited is an equal opportunity employer and encourages applications from diverse backgrounds, including people with disabilities.