AWS Data Engineer

CodeConnect Staffing (Pty) Ltd

Pretoria

On-site

ZAR 600,000 - 800,000

Full time

20 days ago

Job summary

A leading company in the technology sector is looking for an experienced AWS Data Engineer to build and manage scalable data systems. The role involves developing ETL pipelines, ensuring compliance with security regulations, and collaborating with various teams to enhance data processes.

Qualifications

  • 5+ years of experience in Data Engineering and AWS services.
  • Strong programming skills in Python, particularly with PySpark.
  • Experience with SQL and NoSQL databases (e.g., PostgreSQL, MySQL, DynamoDB).

Responsibilities

  • Design and manage scalable data systems using AWS services.
  • Develop and optimize ETL pipelines using AWS Glue and PySpark.
  • Ensure compliance with GDPR and best security practices.

Skills

AWS Services
Data Engineering
Programming
Data Modeling & Optimization
Machine Learning
Compliance & Security

Education

Bachelor’s degree in Computer Science, Engineering, or related field
AWS certifications (e.g., AWS Certified Data Engineer)

Job description

An exciting opportunity has become available for an experienced AWS Data Engineer. In this role, you'll be responsible for building and managing scalable data systems, from setting up data sources to integrating analytical tools, using AWS services.

Key Responsibilities:

  • Data Architecture & Management: Design and maintain data systems using AWS services (e.g., S3, AWS Glue, Athena). Organize data effectively and ensure easy access through data partitioning and cataloging strategies.
  • ETL Pipeline Development: Develop and optimize ETL (Extract, Transform, Load) pipelines using AWS Glue and PySpark. Focus on improving performance, scalability, and cost-efficiency in batch and real-time data processing.
  • Automation & Monitoring: Automate workflows and ensure they run efficiently. Set up monitoring and alerts for data pipelines, and optimize AWS resources for scalability and performance.
  • Security & Compliance: Follow security best practices, including API authentication, data encryption, and compliance with GDPR, HIPAA, and SOC 2.
  • Collaboration & Mentorship: Work closely with data scientists, analysts, and backend teams to integrate data pipelines. Mentor junior team members and encourage growth and collaboration.
  • Quality Management: Maintain high software development standards by adhering to best practices and complying with industry frameworks such as ISO, CE, and FDA requirements.
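The ETL and partitioning duties above follow the classic extract-transform-load shape, with output grouped by a partition key much like date-partitioned S3 prefixes. As a rough illustration only (hypothetical record layout, plain Python, no actual AWS Glue or PySpark calls), the pattern can be sketched like this:

```python
from collections import defaultdict

def extract(rows):
    # Extract: parse raw comma-separated rows into dicts (hypothetical schema)
    return [dict(zip(("event_date", "user_id", "amount"), r.split(","))) for r in rows]

def transform(records):
    # Transform: cast types and silently drop malformed rows
    out = []
    for rec in records:
        try:
            out.append({"event_date": rec["event_date"],
                        "user_id": rec["user_id"],
                        "amount": float(rec["amount"])})
        except (KeyError, ValueError):
            continue
    return out

def load_partitioned(records):
    # Load: group by date, mirroring S3-style key partitioning such as
    # .../events/event_date=2024-01-01/part-0 (illustrative layout only)
    parts = defaultdict(list)
    for rec in records:
        parts[rec["event_date"]].append(rec)
    return dict(parts)

raw = ["2024-01-01,u1,10.5", "2024-01-01,u2,bad", "2024-01-02,u3,7.0"]
partitions = load_partitioned(transform(extract(raw)))
```

In a real Glue job the same three stages would typically be expressed as PySpark DataFrame reads, transformations, and partitioned writes rather than in-memory dicts.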

Key Skills and Experience (5+ Years Required):

  • AWS Services: Experience with AWS Glue, S3, Lambda, Athena, and CloudWatch.
  • Data Engineering: Proven experience developing and optimizing ETL pipelines. Familiarity with SQL and NoSQL databases (e.g., PostgreSQL, MySQL, DynamoDB).
  • Programming: Strong skills in Python, particularly using PySpark with AWS Glue.
  • Data Modeling & Optimization: Experience in data modeling, schema design, and database optimization.
  • Machine Learning: Experience integrating data pipelines with machine learning workflows and deploying models with AWS SageMaker.
  • Compliance & Security: Knowledge of data governance, API security, and compliance with industry standards and regulations.

Education & Certification Requirements:

  • Bachelor’s degree in Computer Science, Engineering, or a related field.
  • AWS certifications, such as AWS Certified Data Engineer, Solutions Architect, or Data Analyst, are highly desirable.