Data Engineer

Optimal Growth Technologies

Gauteng

On-site

ZAR 400,000 - 700,000

Full time

15 days ago


Job summary

A leading technology company in Gauteng seeks a Data Engineer with 3-5 years of experience in data engineering. The role focuses on developing efficient data pipelines, collaborating with stakeholders, and ensuring data quality. Proficiency in SQL and Python and hands-on experience with cloud platforms are essential.

Qualifications

  • 3-5 years' experience in data engineering or a similar role.
  • Proficient in SQL and Python.
  • Experience with data pipeline tools and cloud platforms.

Responsibilities

  • Develop scalable data pipelines as per business requirements.
  • Collaborate with data owners and support data-driven initiatives.
  • Manage data integrity, security, and quality.

Skills

Data analysis
Data lifecycle management
Cross-functional collaboration
Communication
Presentation skills

Technical requirements

3-5 years experience in data engineering or similar role
Proficiency in SQL
Proficiency in Python
Experience with data pipeline tools (e.g., Apache Airflow, dbt)
Hands-on experience with cloud platforms (AWS, Azure)
Knowledge of data warehousing solutions (e.g., Snowflake, Redshift, BigQuery)
Familiarity with containerization tools (Docker, Kubernetes)
Experience with visualization tools (e.g., Power BI)
Experience with real-time processing frameworks (e.g., Kafka, Spark Streaming) is desirable
Experience with Smartsheet

Job description

  • Data Pipeline Development: Developing scalable and efficient data pipelines as per business requirements.
  • Collaboration & Stakeholder Engagement: Collaborating with functional data owners to resolve data-related queries and providing the necessary support for data-driven initiatives.
  • Infrastructure & Governance: Managing data integrity, security, quality, and accessibility.
  • Reporting Solutions: Assisting with the development of reporting solutions.

Core competencies, knowledge and experience

  • Experience with analysing data.
  • In-depth knowledge and practical experience of data lifecycle management.
  • Experience working with cross-functional teams and engaging with business stakeholders.
  • Ability to manage multiple priorities independently.
  • Strong communication and presentation skills.

Must-have technical / professional qualifications:

  • 3-5 years of experience in data engineering or a similar role.
  • Proficiency in SQL and Python.
  • Experience with data pipeline tools (e.g., Apache Airflow, dbt).
  • Hands-on experience with cloud platforms (AWS, Azure).
  • Strong knowledge of data warehousing solutions (e.g., Snowflake, Redshift, BigQuery).
  • Familiarity with containerization tools (Docker, Kubernetes) is a plus.
  • Experience in developing reporting solutions using visualization tools (e.g., Power BI).
  • Experience with real-time data processing frameworks (e.g., Kafka, Spark Streaming) is desirable.
  • Experience with Smartsheet is a plus.

Key performance indicators

  • Data Pipeline Efficiency & Uptime: Measure the reliability and performance of ETL (Extract, Transform, Load) or ELT workflows, ensuring minimal downtime and efficient data processing.
  • Data Quality & Accuracy: Track metrics like data completeness, consistency, and error rates to ensure high-quality, reliable datasets for analytics and decision-making.
  • Scalability & Optimization: Evaluate improvements in data infrastructure, including query