Data Engineer (Contractor) – Insurance / Financial Services
Location: Johannesburg, Gauteng
The Role: Data Engineer (Contractor)
This contract role is key to designing, building, and maintaining high‑quality, secure, and scalable data solutions for a leading insurance client. You’ll work across modern data platforms and cloud environments, enabling accurate insights and supporting analytics and machine learning initiatives. The role is initially a 12‑month contract, with strong potential to extend based on performance and project needs.
Key Responsibilities
- Apply hands-on data engineering experience to deliver robust, efficient data solutions.
- Develop, maintain, and optimize data pipelines using Azure Databricks (a brief sketch follows this list).
- Integrate and manage Oracle databases to ensure optimal security and performance.
- Write and optimize SQL, applying strong data modeling principles.
- Collaborate with cross‑functional teams to ensure data quality, accessibility, and scalability.
- Support machine learning and analytical initiatives in the insurance domain.
- Monitor, troubleshoot, and enhance existing data infrastructure.
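For illustration only, here is a minimal PySpark sketch of the kind of Databricks pipeline step this role involves. It assumes a Databricks runtime with Delta Lake available; every path, table name, and column is a hypothetical placeholder, not part of the client's actual environment.

```python
# A minimal sketch of a Databricks pipeline step, assuming a Databricks
# runtime with Delta Lake available. All paths, table names, and columns
# are hypothetical placeholders for the client's actual data.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("policy_ingest").getOrCreate()

# Read raw policy records landed in cloud storage (hypothetical path).
raw = spark.read.json("/mnt/landing/policies/")

# Basic cleansing: deduplicate, enforce types, and drop unusable rows.
clean = (
    raw.dropDuplicates(["policy_id"])
       .withColumn("premium", F.col("premium").cast("decimal(18,2)"))
       .filter(F.col("policy_id").isNotNull())
)

# Publish to a curated Delta table for analytics and ML consumers.
clean.write.format("delta").mode("overwrite").saveAsTable("curated.policies")
```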
Key Qualifications
- 3‑5 years of hands‑on Data Engineering experience.
- Proven experience working with Oracle database environments.
- Solid SQL and data modeling capabilities.
- Experience in insurance or broader financial services is advantageous.
- Strong problem‑solving mindset, with the ability to work independently in a remote‑first setup.
- Collaborative team player with excellent communication skills.
Client Support Data Engineer – Quintessence
Location: Remote (South Africa)
Salary: R18 000 – R25 000 per month (depending on experience)
Provide technical and analytical support to ensure Quintessence implementations run smoothly and meet SLA requirements. The role requires a minimum of 2 years of experience in data analysis, modeling, and troubleshooting.
Key Responsibilities
- Configure and implement Quintessence software for client environments.
- Provide 2nd‑tier client support, including data enhancements and issue resolution.
- Understand and manage client data requirements within the financial markets.
- Build and maintain end‑to‑end data service solutions and integrations.
- Develop queries combining multiple data sources while ensuring data integrity (illustrated in the sketch after this list).
- Recommend improvements to data reliability, efficiency, and quality.
- Provide structured feedback to development teams on functionality and issues.
- Design user interfaces for data uploads and visualisation.
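As an illustration of the query-building responsibility above, here is a minimal Python/pandas sketch of combining two sources with an explicit integrity check. The file names and columns are hypothetical examples, not Quintessence artifacts.

```python
# A minimal sketch of combining two data sources while guarding data
# integrity. File names and columns are hypothetical examples.
import pandas as pd

positions = pd.read_csv("positions.csv")  # e.g. client portfolio holdings
prices = pd.read_csv("prices.csv")        # e.g. a market data feed

# validate="many_to_one" raises MergeError if the price feed contains
# duplicate instruments, catching a bad feed before it silently
# duplicates position rows.
merged = positions.merge(
    prices, on="instrument_id", how="left", validate="many_to_one"
)

# Surface unpriced positions explicitly rather than hiding the gap.
missing = merged[merged["close_price"].isna()]
if not missing.empty:
    print(f"{len(missing)} positions have no matching price")
```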
Key Qualifications
- 2+ years of experience working with data in a technical support or engineering role.
- Degree in Statistics, Mathematics, Engineering, Informatics, or related field.
- Strong SQL and Excel skills, plus a programming language such as Python (essential).
- Exposure to data visualisation tools (Power BI, Tableau, or QlikView) (advantageous).
- Knowledge of APIs, ETL processes, or data warehousing (advantageous).
- Background in financial services or asset management (distinct advantage).
- Client‑focused mindset with excellent communication skills.
- Ability to multitask, manage competing priorities, and meet deadlines.
AWS Data Engineer (6‑Month Contract)
Location: Remote (supporting an international client)
Key Responsibilities
- Design and optimize data pipelines and ETL processes (a brief sketch follows this list).
- Work with AWS services (S3, Glue, Redshift) and supporting tooling such as dbt, Spark, and Terraform.
- Support cloud integration and modernization projects.
- Ensure system performance, monitoring, and reliability.
- Enforce data security, governance, and compliance standards.
- Collaborate with global, cross-functional teams.
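To make the pipeline responsibility concrete, here is a minimal boto3 sketch of triggering and checking one AWS Glue job run. The job name, region, and arguments are hypothetical; in practice such resources would be defined in Terraform and invoked through CI/CD or a scheduler rather than an ad-hoc script.

```python
# A minimal sketch of orchestrating one Glue ETL step via boto3. Assumes
# AWS credentials are configured; the job name, region, and arguments are
# hypothetical placeholders.
import boto3

glue = boto3.client("glue", region_name="eu-west-1")

# Start a (hypothetical) Glue job that loads raw S3 files into Redshift.
run = glue.start_job_run(
    JobName="claims-daily-etl",
    Arguments={"--source_prefix": "s3://client-raw/claims/2024-06-01/"},
)

# Check the run state so monitoring can react to failures.
status = glue.get_job_run(JobName="claims-daily-etl", RunId=run["JobRunId"])
print(status["JobRun"]["JobRunState"])
```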
Qualifications
- 5+ years' experience as a Data Engineer (Intermediate‑Senior).
- Hands‑on expertise in AWS data services.
- Strong SQL, data modelling, and pipeline management skills.
- Familiarity with CI/CD, Git, and infrastructure‑as‑code.
- Excellent collaboration and problem‑solving skills.
Why Join?
Competitive contract compensation; work with cutting‑edge AWS technologies; collaborate with international teams.
Data Engineer – Education Focus (Sample)
- Education: Bachelor of Science in Computer Science, Big Data, Database Administration, or a related field.
- 3+ years in Advanced Data Engineering.
- Experience with on-premises and cloud data pipelines (Microsoft Fabric).
- Proficiency in SQL, Python, R, or Power BI; knowledge of Oracle, Teradata, or Snowflake.
- Experience in data warehousing and ETL; Yellowfin knowledge is beneficial.
- Experience with telecommunications / financial services or Fintech is a plus.
- Client‑focused mindset with excellent communication.
NOTE
All postings include standard requirements; some items reference specific clients such as Access Bank.