Data Engineer

Top Vitae Recruitment

Gqeberha

On-site

ZAR 500 000 - 700 000

Full time

Yesterday

Job summary

A recruitment agency is seeking a Data Engineer to optimize data systems and pipelines for market research purposes. The ideal candidate will develop, maintain, and automate data workflows, ensuring data reliability and efficiency. A Bachelor's degree in Computer Science or related field, along with 3-5 years of relevant experience, is required. Expertise in technologies such as Python, SQL, and cloud platforms is essential for success in this role.

Qualifications

  • 3-5 years’ experience in data operations, data engineering, or software development roles.
  • Proven experience in building and maintaining data pipelines and automated workflows.
  • Exposure to market research or data analytics environments is advantageous.
  • Experience in managing data within cloud-based or hybrid infrastructures.

Responsibilities

  • Design, build, and maintain data pipelines and ETL processes.
  • Develop automation solutions for data quality and efficiency.
  • Monitor data workflows and resolve issues in data processing.
  • Work closely with research and data analytics teams.

Skills

Scripting languages (Python, R, DAX)
ETL/ELT pipeline design
SQL proficiency
Cloud platforms (AWS, Azure, GCP)
BI tools (Power BI, Tableau)
Microsoft Fabric
APIs and systems integration

Education

Bachelor’s degree in Computer Science or related field

Job description

FUNCTIONAL DEFINITION AND RESPONSIBILITY

The Data Engineer is responsible for developing, maintaining, and optimising the data systems and operational pipelines that support the company’s market research and analytics processes. The role ensures that data flows seamlessly from collection through transformation to reporting, enabling timely, accurate, and actionable insights for clients. This position bridges technical development and operational efficiency, ensuring data reliability, process automation, and continuous improvement across the data lifecycle.

RESPONSIBILITIES
Data Systems Development and Maintenance
  • Design, build, and maintain data pipelines and ETL processes for efficient ingestion, transformation, and storage of large, multi-source datasets.
  • Develop automation solutions that improve data quality, integrity, and processing efficiency across the research workflow.
  • Maintain and enhance database systems, ensuring data consistency, accuracy, and optimal performance.
  • Develop APIs and integration scripts to connect internal platforms with third-party data sources and tools.
Operational Data Management
  • Monitor data workflows to ensure timely delivery and availability for analysis and reporting.
  • Diagnose and resolve issues in data processing and system performance.
  • Implement process controls and error handling to minimise downtime and data discrepancies.
  • Support data validation and reconciliation to maintain confidence in reporting outputs.
Collaboration and Stakeholder Support
  • Work closely with the research, data analytics, and data science teams to understand operational needs and deliver scalable data solutions.
  • Translate data and workflow requirements into robust, maintainable technical systems.
  • Provide technical input into data architecture, governance, and process improvement initiatives.
  • Document data flows, system configurations, and development procedures for operational continuity.
Innovation and Continuous Improvement
  • Identify opportunities to optimise data pipelines and automate manual processes.
  • Evaluate and implement new tools and technologies to improve data efficiency and quality.
  • Contribute to the continuous improvement of data engineering standards, best practices, and workflows.
QUALIFICATIONS & EXPERIENCE
  • Bachelor’s degree in Computer Science, Data Engineering, Information Systems, or a related field (or equivalent experience).
  • 3-5 years’ experience in data operations, data engineering, or software development roles.
  • Proven experience in building and maintaining data pipelines and automated workflows.
  • Exposure to market research or data analytics environments is highly advantageous.
  • Experience in managing data within cloud-based or hybrid infrastructures.
TECHNICAL SKILLS
  • Proficiency in scripting languages (Python, R, DAX, or equivalent).
  • Expertise in designing and maintaining ETL/ELT pipelines.
  • Proficiency in SQL and data modelling for relational and non-relational databases.
  • Experience with cloud platforms (AWS, Azure, or GCP) and associated data services.
  • Knowledge of BI tools (Power BI, Tableau) and data warehousing concepts.
  • Knowledge of Microsoft Fabric.
  • Knowledge of APIs and systems integration.
CORE COMPETENCIES
  • Strong statistical and mathematical skills.
  • Excellent understanding of data operations within a commercial or research environment.
  • Ability to translate business requirements into scalable technical workflows.