
A leading recruitment agency in South Africa is seeking a Senior Data Warehousing Engineer to manage and architect high-performance data ecosystems. The ideal candidate must have 4–7 years of hands-on experience in data engineering, strong proficiency in Python and SQL, and knowledge of cloud data platforms, particularly AWS. This role emphasizes mentoring junior engineers and collaborating with diverse teams to support AI-driven decision-making and analytics solutions.
We are seeking a Senior Data Warehousing Engineer with expertise in cloud data platforms, big data pipelines, and advanced analytics.
In this role, you'll architect and maintain scalable, high-performance data ecosystems that power machine learning models, BI dashboards, and AI-driven decision-making. You must have strong Python, SQL, and AWS skills, hands-on data warehousing experience, and a track record of managing and running large data projects and leading and mentoring data engineers. You'll combine hands-on engineering with strategic leadership: mentoring junior engineers, guiding best practices, and collaborating with data scientists, software engineers, and business stakeholders.
This role is ideal for someone who thrives in cloud‑native environments (AWS preferred) and wants to make a real impact with data.
"cloud data platforms" "big data pipelines" "data warehouse" "data warehousing" python "SQL" "AWS" "real‑time and batch data" "AI‑driven decision‑making" "machine learning workflows" "ETL / data ingestion" "Power BI, Tableau" "AWS data engineer"
Experience & Skills Required 4–7 years of hands‑on data engineering. Experience leading or mentoring junior engineers. Advanced proficiency in Python and SQL. Strong experience with BI tools like Power BI for data storytelling. Strong database design knowledge and experience with data warehousing techniques and modelling approaches. Experience building and maintaining cloud‑based data architecture (AWS preferred).
Hands‑on experience with data ingestion from, amongst others: Microsoft SQL Server, Oracle, MongoDB, Amazon S3 and other AWS data services, HTTP APIs, SFTP, and various file systems. Proficiency with Git, CI / CD pipelines, and Agile methodologies. Hands‑on designing and development of complex data pipelines from multiple sources into a central data platform / lakehouse. Familiarity with machine learning workflows and supporting analytics teams. Strong problem‑solving, analytical, and communication skills. Ability to work independently and take initiative on projects. Business acumen to translate technical work into business impact. Qualifications: Degree or diploma in Computer Science, Information Systems, Engineering, or a related field. AWS certifications or equivalent practical expertise.
Basic + Benefits (Company Contributions)