Data Engineer (Python + Airflow + Snowflake)

RAPSYS TECHNOLOGIES PTE. LTD.

Singapore

On-site

SGD 80,000 - 100,000

Full time

Job summary

A tech company in Singapore is seeking a Data Engineer experienced in Python, Apache Airflow, and AWS technologies to design and maintain data pipelines. The ideal candidate has a strong data engineering background with a focus on pipeline performance optimization. Responsibilities include developing complex workflows in Apache Airflow and building AWS-based data solutions. Candidates should hold a Bachelor's degree and communicate effectively.

Qualifications

  • Minimum 5 years of experience in data engineering.
  • Proficient with Python, PySpark, and SQL.
  • Strong knowledge of AWS services and data integration processes.

Responsibilities

  • Design and maintain complex data pipelines using Python.
  • Collaborate with teams to understand data requirements.
  • Optimize data pipelines for performance and reliability.

Skills

Python
PySpark
SQL
Data modeling
AWS (S3, Glue, EMR, Redshift)

Education

Bachelor’s degree in Computer Science or Engineering

Tools

Apache Airflow
Snowflake
Git

Job description

Role

Data Engineer (Python + Airflow + Snowflake)

Responsibilities
  • Design, develop, and maintain complex data pipelines using Python for efficient data processing and orchestration.
  • Collaborate with cross-functional teams to understand data requirements and architect robust solutions within the AWS environment.
  • Implement data integration and transformation processes to ensure optimal performance and reliability of data pipelines.
  • Optimize and fine-tune existing data pipelines and Airflow workflows to improve efficiency, scalability, and maintainability.
  • Troubleshoot and resolve issues related to data pipelines, ensuring smooth operation and minimal downtime.
  • Work closely with AWS services like S3, Glue, EMR, Redshift, and other related technologies to design and optimize data infrastructure.
  • Develop and maintain documentation for data pipelines, processes, and system architecture.
  • Stay updated with the latest industry trends and best practices related to data engineering and AWS services.
Requirements
  • Bachelor’s degree in Computer Science, Engineering, or a related field.
  • Proficiency in Python, PySpark, and SQL for data processing and manipulation.
  • Minimum 5 years of experience in data engineering, specifically working with Apache Airflow and AWS technologies.
  • Strong knowledge of AWS services, particularly S3, Glue, EMR, Redshift, and AWS Lambda.
  • Understanding of Snowflake, particularly for data lake workloads, is preferred.
  • Experience with optimizing and scaling data pipelines for performance and efficiency.
  • Good understanding of data modeling, ETL processes, and data warehousing concepts.
  • Excellent problem-solving skills and ability to work in a fast-paced, collaborative environment.
  • Effective communication skills and the ability to articulate technical concepts to non-technical stakeholders.
Preferred Qualifications
  • AWS certification(s) related to data engineering or big data.
  • Experience working with big data technologies like Snowflake, Spark, Hadoop, or related frameworks.
  • Familiarity with other data orchestration tools in addition to Apache Airflow.
  • Knowledge of version control with Git and hosting platforms such as Bitbucket.