Data Engineering Lead

TekSalt Solutions

Singapore

On-site

SGD 90,000 - 110,000

Full time

3 days ago

Job summary

A technology solutions provider in Singapore is seeking a Data Engineering Lead with expertise in AWS Glue and PySpark. The ideal candidate will have 5 to 8 years of experience and be responsible for designing scalable data processing pipelines and optimizing ETL processes in the cloud. Join us to lead innovative data solutions that drive business insights.

Qualifications

  • 5 to 8 years of experience in data engineering roles.
  • Expertise in AWS Glue and PySpark for ETL workflows.
  • Strong programming skills in Python for data transformation.

Responsibilities

  • Design and implement data processing pipelines using Spark and PySpark.
  • Build and optimize ETL processes for data extraction and transformation.
  • Utilize AWS Glue for serverless ETL jobs.
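The extract-transform-load pattern these responsibilities describe can be sketched in plain Python. This is an illustrative toy only: a real pipeline here would use PySpark DataFrames inside an AWS Glue job, and the field names (`name`, `amount`) are assumptions, not part of the role.

```python
import csv
import io

def extract(csv_text):
    """Extract: parse raw CSV text into a list of row dictionaries."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(rows):
    """Transform: normalize names, cast amounts, drop incomplete rows."""
    return [
        {"name": r["name"].strip().title(), "amount": float(r["amount"])}
        for r in rows
        if r["amount"]  # skip rows with a missing amount
    ]

def load(rows):
    """Load: aggregate in memory; a Glue job would write to S3 or Redshift."""
    return sum(r["amount"] for r in rows)

raw = "name,amount\n alice ,10.5\nBOB,2\ncarol,\n"
print(load(transform(extract(raw))))  # 12.5
```

In a Glue/PySpark version, `extract` becomes a `GlueContext` read from the Data Catalog, `transform` a chain of DataFrame operations, and `load` a write to S3 or Redshift.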

Skills

Apache Spark
PySpark
Python
AWS Glue
SQL
ETL processes
Data Pipeline Optimization
Apache Airflow

Tools

AWS services (S3, Lambda, Redshift)

Job description

At Teksalt Solutions, we specialize in connecting top-tier talent with leading companies to create dynamic, productive workforces. We are committed to delivering technology solutions that not only meet but exceed the demands of the modern business landscape.

We are currently seeking a Data Engineering Lead - AWS Glue & PySpark Specialist for a permanent full-time position in Bangalore. The ideal candidate should have 5 to 8 years of experience with skills in AWS Glue, PySpark, and Python.

Key Responsibilities:

  • Spark & PySpark Development: Design and implement scalable data processing pipelines using Apache Spark and PySpark for large-scale data transformations.
  • ETL Pipeline Development: Build, maintain, and optimize ETL processes for seamless data extraction, transformation, and loading across various data sources and destinations.
  • AWS Glue Integration: Utilize AWS Glue to create, run, and monitor serverless ETL jobs for data transformations and integrations in the cloud.
  • Python Scripting: Develop efficient, reusable Python scripts to support data manipulation, analysis, and transformation within the Spark and Glue environments.
  • Data Pipeline Optimization: Ensure that all data workflows are optimized for performance, scalability, and cost-efficiency on the AWS Cloud platform.
  • Collaboration: Work closely with data analysts, data scientists, and other engineering teams to create reliable data solutions that support business analytics and decision-making.
  • Documentation & Best Practices: Maintain clear documentation of processes, workflows, and code while adhering to best practices in data engineering, cloud architecture, and ETL design.

Required Skills:

  • Expertise in Apache Spark and PySpark for large-scale data processing and transformation.
  • Hands-on experience with AWS Glue for building and managing ETL workflows in the cloud.
  • Strong programming skills in Python, with experience in data manipulation, automation, and integration with Spark and Glue.
  • In-depth knowledge of ETL principles and data pipeline design, including optimization techniques.
  • Proficiency in working with AWS services such as S3, Glue, Lambda, and Redshift.
  • Strong skills in writing optimized SQL queries, with a focus on performance tuning.
  • Ability to translate complex business requirements into practical technical solutions.
  • Familiarity with Apache Airflow for orchestrating data workflows.
  • Knowledge of data warehousing concepts and cloud-native analytics tools.

If you are passionate about data engineering and have the required skills and experience, we welcome you to apply for this position. Join us at Teksalt Solutions, where a pinch of us makes all the difference in the world of technology.
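The SQL performance-tuning skill listed in the description can be illustrated with a minimal, self-contained sqlite3 sketch (the `events` table and its columns are hypothetical): adding an index on the filter column changes the query plan from a full table scan to an index search.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE events (id INTEGER PRIMARY KEY, user_id INTEGER, amount REAL)"
)
conn.executemany(
    "INSERT INTO events (user_id, amount) VALUES (?, ?)",
    [(i % 100, i * 0.5) for i in range(1000)],
)

query = "SELECT SUM(amount) FROM events WHERE user_id = 7"

# Without an index, SQLite must scan every row to evaluate the filter.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

# An index on the filter column lets SQLite seek directly to matching rows.
conn.execute("CREATE INDEX idx_events_user ON events (user_id)")
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

# The plan detail text is in the last column of each plan row.
print(plan_before[0][3])  # a SCAN of the events table
print(plan_after[0][3])   # a SEARCH using idx_events_user
```

The same principle (index the columns your filters and joins touch, then verify with the engine's query plan) carries over to Redshift and other warehouses, though the tooling differs.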
