Software Developer / Data Engineer

Boardroom Appointments

Cape Town

On-site

ZAR 500 000 - 700 000

Full time

Job summary

A recruitment agency is looking for a Software Developer/Data Engineer for a 12-month contract in Cape Town. The successful candidate will have 4 to 8 years of experience working on data-driven ecosystems, expertise in AWS services, and strong skills in PySpark, Python, and SQL. Responsibilities include building data pipelines and mentoring junior developers. If you have a solid understanding of data architecture and are adept at implementing automation practices, we encourage you to apply.

Qualifications

  • Minimum 4 years of experience in Data Engineering or Software Development.
  • Experience with cloud technologies, particularly AWS.
  • Proficiency in coding languages: PySpark, Python, SQL.

Responsibilities

  • Implement scalable data pipelines and architectures.
  • Build distributed data systems using AWS Lambda and Glue.
  • Collaborate with the team to meet business requirements.

Skills

  • Data Engineering
  • Software Development
  • Data Modeling
  • AWS Lambda
  • PySpark
  • Python
  • SQL

Education

  • BSc Computer Science
  • BEng
  • AWS Professional Certification

Tools

  • AWS S3
  • AWS Athena
  • AWS Glue
  • AWS EC2
  • JIRA
  • Confluence
  • Git

Job description

Software Developer/Data Engineer - 12 Month Contract

Educational Qualifications
  • BSc Computer Science / BEng
Professional Qualifications
  • AWS Professional Certification
Years of Experience
  • 4 to 8 years of Data Engineering or Software Development experience working on data-driven ecosystems
Skills / Requirements
  • Required to code complex transformations using loader specifications provided by the business analyst (BA)
  • Experience working with big data sets and solving data-related challenges
  • Ability to automate ingestion by building ingestion pipelines using AWS Lambda or Glue (see the sketch after this list)
  • Proficiency in coding languages: PySpark, Python, SQL
  • Hands-on experience with AWS services, including S3, Athena, Lambda, Glue, and EC2
  • AWS experience and an AWS certification are required
  • Experience in data modeling and data architecture design
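
As a rough illustration of the kind of ingestion-and-transform work described above, the sketch below reads raw data from S3, applies a loader-style transformation, and writes curated, partitioned Parquet back to S3. The bucket paths, column names, and transformation rules are illustrative assumptions, not details from the advert.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("ingest-orders").getOrCreate()

# Read raw CSV landed in S3 (path is illustrative).
raw = spark.read.option("header", True).csv("s3://example-raw-bucket/orders/")

# Example transformation per a hypothetical loader specification:
# cast types, drop duplicate records, and add a partition column.
curated = (
    raw.withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("amount", F.col("amount").cast("double"))
       .dropDuplicates(["order_id"])
       .withColumn("ingest_date", F.current_date())
)

# Write partitioned Parquet so the curated data stays queryable
# through Athena and the Glue Data Catalog.
curated.write.mode("overwrite").partitionBy("ingest_date").parquet(
    "s3://example-curated-bucket/orders/"
)
```

Partitioning the curated output keeps Athena scans small and makes downstream Glue jobs easier to run incrementally.
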
Key Responsibilities
  • Implement scalable data pipelines and architectures using PySpark, Python, and SQL
  • Build distributed data pipelines and compute tiers operating on AWS Lambda and Glue (a minimal example follows this list)
  • Serve as a technical resource for team members and mentor junior engineers
  • Collaborate with the team to deliver high-quality solutions that meet business requirements
  • Ensure code is well designed, maintainable, and follows best practices and standards
  • Play a key role in shaping engineering practices by working in a scrum team to ensure sprint deliverables are met
  • Utilize project development tools such as JIRA, Confluence, and Git
  • Assist the DevOps engineer with automation and CI/CD practices
  • Evaluate and recommend new technologies to improve the performance, scalability, and reliability of software systems
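
One common way to combine Lambda and Glue for automated ingestion, as described above, is the pattern sketched below: an S3 put event invokes a Lambda function, which starts a Glue job against the newly landed object. The Glue job name and argument key are illustrative assumptions.

```python
import boto3

glue = boto3.client("glue")

def handler(event, context):
    # S3 event notification: extract the bucket and key of the new object.
    record = event["Records"][0]["s3"]
    bucket = record["bucket"]["name"]
    key = record["object"]["key"]

    # Start a (hypothetical) Glue curation job, passing the object's
    # location through as a job argument.
    response = glue.start_job_run(
        JobName="curate-orders",  # assumed Glue job name
        Arguments={"--source_path": f"s3://{bucket}/{key}"},
    )
    return {"job_run_id": response["JobRunId"]}
```

Keeping the Lambda function this thin leaves the heavy transformation work to Glue, where PySpark can scale it out.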