Lead AWS Data Engineer

Technology Next

Singapore

Remote

SGD 60,000 - 80,000

Full time

8 days ago

Job summary

A leading company in data engineering seeks a Lead AWS Data Engineer to join its remote team. This role involves developing and managing data pipelines, using AWS services for efficient data processing, and requires expertise in Python, PySpark, SQL, and various data formats. If you have a strong background in consumer finance or a similar industry and want to work in a flexible environment, this opportunity is for you.

Qualifications

  • 10+ years total work experience, with 7+ years as Data Engineer.
  • Proficiency in Python, PySpark, SQL, and various AWS services.
  • Familiarity with JSON, XML, CSV, TSV, Parquet file formats.

Responsibilities

  • Develop and maintain data pipelines using AWS Glue.
  • Implement serverless solutions with AWS Lambda.
  • Design data workflows using AWS Step Functions.

Skills

Python
PySpark
SQL
AWS services
Data processing
Data analysis
CI/CD tools

Tools

Looker
Power BI
Jenkins
GitLab
GitHub
Jira
Confluence
Terraform

Job description

Title: Lead AWS Data Engineer
Mandatory skills: Python, PySpark, SQL, and AWS services
Years of experience: 10+ years
Target date: Looking for immediate joiners
Level of interview: 2 technical rounds
Salary: INR 1,00,000 - 1,20,000 per month in hand
Mode of work: Remote

We are seeking a skilled AWS Data Engineer with expertise in a range of AWS services. The ideal candidate will have hands-on experience with Lambda, Glue, SNS, SQS, Step Functions, PySpark, Python, Athena, CloudWatch, S3, and more, along with working experience with data file formats such as JSON, XML, CSV, and Parquet, proficiency in SQL, and experience with a visualization tool such as Looker or Power BI.

Responsibilities:

  1. Develop and maintain robust data pipelines using AWS Glue for efficient ETL processes.
  2. Implement serverless computing solutions with AWS Lambda to automate tasks and processes.
  3. Use SNS and SQS for efficient messaging and event-driven architecture.
  4. Design and orchestrate data workflows using AWS Step Functions.
  5. Leverage PySpark, Python, and SQL for data processing, analysis, and transformation.
  6. Implement and optimize queries using AWS Athena for efficient querying of large datasets.
  7. Monitor and manage resources and applications using AWS CloudWatch.
  8. Manage data storage and retrieval using AWS S3.
  9. Work with various data file formats, including JSON, XML, CSV, TSV, and Parquet, and execute SQL queries as needed.
  10. Use visualization tools such as Looker or Power BI for effective data representation.
  11. Build end-to-end data pipelines, from conception to implementation, ensuring scalability and efficiency.
  12. Apply hands-on experience with CI/CD tools such as Jenkins, GitLab/GitHub, Jira, Confluence, and other related tools.
  13. Use Delta Lake for efficient version control and data management.

Qualifications:

  • 7+ years of experience as a Data Engineer in consumer finance or an equivalent industry (consumer loans, collections, servicing, optional products, and insurance sales).
  • Proven experience as a Data Engineer with a strong focus on AWS services.
  • Proficiency in Python, PySpark, SQL, and AWS services for data processing and analysis.
  • Hands-on experience with AWS Lambda, Glue, SNS, SQS, Step Functions, Athena, CloudWatch, and S3.
  • Practical experience working with JSON, XML, CSV, TSV, and Parquet file formats.
  • Experience with visualization tools such as Looker or Power BI is a significant plus.
  • Good understanding of serverless architecture and event-driven design.
  • Hands-on experience with CI/CD tools, including Jenkins, GitLab/GitHub, Jira, Confluence, and other related tools.
  • Comfortable learning about and deploying new technologies and tools.
  • Organizational skills and the ability to handle multiple projects and priorities simultaneously while meeting established deadlines.
  • Good written and oral communication skills and the ability to present results to non-technical audiences.
  • Knowledge of business intelligence and analytical tools, technologies, and techniques.
  • Experience with Terraform is a plus.

Job types: Full-time, Contractual / Temporary
Contract length: 12 months

Experience:

  • Total work: 10 years (required)
  • Data Engineer, Python: 7 years (required)
  • AWS services: 6 years (required)
  • SQL, PySpark: 6 years (required)

Work location: Remote
