Claremont, South Africa | Posted on 18/03/
We are seeking a skilled and detail-oriented Analytics Engineer to join our data team.
This role is central to building, optimising, and maintaining the data pipelines that drive our business intelligence and analytics efforts.
The Analytics Engineer will collaborate closely with data scientists, data analysts, and other stakeholders to ensure data is accessible, reliable, and actionable.
If you're passionate about creating scalable data solutions and enabling data-driven decisions, we'd love to meet you.
Key Performance Outputs
- Data Pipeline Development: Design, build, and maintain robust ETL (Extract, Transform, Load) pipelines to collect, transform, and integrate data from various sources.
- Data Modelling: Develop and maintain clean, reusable, and reliable data models to support analytics, reporting, and machine learning.
- Data Quality: Ensure data integrity, accuracy, and consistency through regular monitoring, testing, and validation processes.
- Collaboration with Stakeholders: Work closely with data analysts, data scientists, and business stakeholders to understand requirements and deliver solutions that meet business needs.
- Tool Optimisation: Optimise analytics workflows and tool usage (e.g., dbt, Airflow, SQL) for scalability and efficiency.
- Documentation and Best Practices: Document data transformations, pipelines, and business logic to ensure knowledge sharing and alignment with best practices.
- Performance Tuning: Optimise queries and storage solutions for efficiency, scalability, and cost-effectiveness.
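To give candidates a concrete flavour of the pipeline and modelling work above, here is a minimal extract-transform-load sketch in Python against an in-memory SQLite database. It is purely illustrative; the table and column names are invented and do not describe our actual systems:

```python
import sqlite3

# Illustrative only: a tiny extract-transform-load step.
# Source and target schemas are invented for this sketch.
conn = sqlite3.connect(":memory:")

# Extract: raw order data lands in a staging table.
conn.execute("CREATE TABLE raw_orders (id INTEGER, amount_cents INTEGER, status TEXT)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [(1, 1250, "paid"), (2, 400, "cancelled"), (3, 999, "paid")],
)

# Transform: keep only paid orders and convert cents to currency units.
conn.execute(
    """
    CREATE TABLE fct_orders AS
    SELECT id, amount_cents / 100.0 AS amount, status
    FROM raw_orders
    WHERE status = 'paid'
    """
)

# Load/verify: the modelled table now holds clean, analysis-ready rows.
rows = conn.execute("SELECT id, amount FROM fct_orders ORDER BY id").fetchall()
print(rows)  # [(1, 12.5), (3, 9.99)]
```

In practice such transformations would live in a tool like dbt and be orchestrated by Airflow rather than hand-written scripts, but the shape of the work is the same.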
Requirements
- Education/Experience: Bachelor's degree in Computer Science, Data Engineering, Information Systems, or a related field (or equivalent experience).
- 3+ years of experience in data engineering, analytics engineering, or a similar role.
- Strong proficiency in SQL, with the ability to write complex queries and optimise them for performance.
- Experience with ETL tools and data orchestration frameworks (e.g., dbt, Apache Airflow, Prefect).
- Solid understanding of data warehousing concepts and data modelling techniques (e.g., star schema, snowflake schema).
- Familiarity with cloud data platforms (e.g., Snowflake, BigQuery, Redshift).
- Experience with version control systems like Git and familiarity with CI/CD pipelines.
- Ability to work independently and manage multiple tasks while meeting deadlines.
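For reference, the star-schema modelling mentioned above separates business measurements (a fact table) from descriptive attributes (dimension tables). A minimal, hypothetical example in SQLite-flavoured SQL, run here via Python, with all names and figures invented:

```python
import sqlite3

# Hypothetical star schema: one fact table joined to dimension tables.
conn = sqlite3.connect(":memory:")
conn.executescript(
    """
    CREATE TABLE dim_customer (customer_id INTEGER PRIMARY KEY, region TEXT);
    CREATE TABLE dim_date     (date_id INTEGER PRIMARY KEY, year INTEGER);
    CREATE TABLE fct_sales (
        sale_id     INTEGER PRIMARY KEY,
        customer_id INTEGER REFERENCES dim_customer(customer_id),
        date_id     INTEGER REFERENCES dim_date(date_id),
        amount      REAL
    );
    INSERT INTO dim_customer VALUES (1, 'Western Cape'), (2, 'Gauteng');
    INSERT INTO dim_date VALUES (20240101, 2024);
    INSERT INTO fct_sales VALUES (10, 1, 20240101, 150.0), (11, 2, 20240101, 90.0);
    """
)

# A typical analytical query: aggregate the fact table by a dimension attribute.
result = conn.execute(
    """
    SELECT c.region, SUM(f.amount)
    FROM fct_sales f
    JOIN dim_customer c USING (customer_id)
    GROUP BY c.region
    ORDER BY c.region
    """
).fetchall()
print(result)  # [('Gauteng', 90.0), ('Western Cape', 150.0)]
```

A snowflake schema differs only in that the dimension tables themselves are further normalised into sub-dimensions.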
Functional/Technical Skills
- Version Control (Git) – Managing code changes in analytics pipelines.
- BI & Visualisation Tools – Building reports and dashboards in tools such as Looker, Tableau, or Power BI.
- Data Governance & Quality – Implementing data validation, observability, and documentation standards.
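The data-validation work in the last bullet often amounts to automated checks asserted against a table before it is published downstream. A hypothetical sketch (the check names, fields, and rules are invented for illustration):

```python
# Hypothetical data-quality checks of the kind this role involves:
# verify expectations about a dataset before publishing it downstream.
rows = [
    {"id": 1, "email": "a@example.com", "amount": 12.5},
    {"id": 2, "email": "b@example.com", "amount": 9.99},
]

def validate(rows):
    """Return a list of human-readable failures (empty means the data passed)."""
    failures = []
    ids = [r["id"] for r in rows]
    if len(ids) != len(set(ids)):
        failures.append("duplicate primary keys")
    if any(r["amount"] is None or r["amount"] < 0 for r in rows):
        failures.append("negative or missing amounts")
    if any(not r["email"] for r in rows):
        failures.append("missing email addresses")
    return failures

print(validate(rows))  # [] -> data is clean
```

In production these checks would typically be expressed as dbt tests or similar declarative assertions rather than ad hoc Python, but the underlying logic is the same.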