Data Engineer @ Pepkor Lifestyle
Job purpose: The ideal candidate will use their passion for big data and analytics to provide insights to the business across a range of topics.
They will be responsible for conducting both recurring and ad hoc analysis for business users.
As a Data Engineer at Pepkor Lifestyle, you will play a critical role in the development and maintenance of our data infrastructure.
You will work closely with cross‑functional teams to ensure data availability, quality, and accessibility for analysis.
Responsibilities
- Collaborate with data scientists, analysts, and business stakeholders to understand data requirements.
- Design, develop, and maintain data pipelines and ETL processes.
- Implement and maintain data warehousing and data storage solutions.
- Optimize data pipelines for performance, scalability, and reliability.
- Ensure data quality and integrity through data validation and cleansing processes.
- Monitor and troubleshoot data infrastructure issues.
- Stay current with emerging technologies and best practices in data engineering.
- Design systematic ETL and data pipeline solutions in line with business user specifications.
- Develop and implement ETL pipelines aligned to the approved solution design.
- Ensure data governance and data quality assurance standards are upheld.
- Engage with customers in a customer-centric manner.
- Demonstrate effective self-management and teamwork.
Minimum Qualification and Experience
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience as a Data Engineer in a professional setting.
- Proficiency in data engineering technologies and programming languages (e.g., SQL, Python, Scala, Java).
- Strong knowledge of data storage, database design, and data modelling concepts.
- Experience with ETL tools, data integration, and data pipeline orchestration.
- Familiarity with data warehousing solutions (e.g., Snowflake, Redshift).
- Excellent problem‑solving and troubleshooting skills.
- Strong communication and collaboration skills.
- 5–10 years' experience designing and developing data warehouses according to the Kimball methodology.
- Adept at designing and developing ETL processes.
- SQL development experience; SAS Data Studio and AWS experience preferred.
- Ability to ingest and output CSV, JSON, and other flat-file formats, as well as related data sources.
- Proficiency in Python or R, or a willingness to learn.
- Experience in Retail, Financial Services, and Logistics environments.
- Experience with Redshift technologies.
- Understanding of data security and compliance best practices.
- Relevant certifications (e.g., AWS Certified Data Analytics, Google Cloud Professional Data Engineer).
Seniority level: Mid‑Senior level
Employment type: Full‑time
Job function: Other
Industries: IT System Data Services