
Data Engineer

OPTIMUM SOLUTIONS (SINGAPORE) PTE LTD

Singapore

On-site

SGD 60,000 - 85,000

Full time


Job summary

A technology solutions provider in Singapore is seeking a Data Engineer to design and maintain data pipelines and architectures. The ideal candidate has at least 4 years of experience, strong skills in Python, SQL, and AWS services, and proficiency in data virtualisation tools such as Denodo. The role involves developing APIs and data models to support business intelligence needs, and offers a competitive package and a collaborative work environment.

Skills

Data Engineering
Python
SQL
VQL
AWS services
Data virtualisation
BI tools
Agile methodologies

Tools

AWS Glue
Denodo
Tableau
Power BI

Job description

Responsibilities:
Data Engineering and Platform Integration
  • Design, develop, and maintain data pipelines and ETL processes using AWS services (Glue, Athena, S3, RDS); see the Glue job sketch after this list
  • Work with data virtualisation tools like Denodo and develop VQL queries
  • Ingest and process data from various internal and external data sources
  • Perform data extraction, cleaning, transformation, and loading operations
  • Implement automated data collection processes including API integrations when necessary
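As a rough illustration of the first bullet, the sketch below shows a minimal Glue job script in Python (PySpark). The database, table, bucket, and column names are invented placeholders, not details from this posting.

  import sys
  from awsglue.context import GlueContext
  from awsglue.job import Job
  from awsglue.utils import getResolvedOptions
  from pyspark.context import SparkContext

  # Standard Glue job boilerplate: resolve the job name and initialise contexts.
  args = getResolvedOptions(sys.argv, ["JOB_NAME"])
  glue_context = GlueContext(SparkContext.getOrCreate())
  job = Job(glue_context)
  job.init(args["JOB_NAME"], args)

  # Read raw records registered in the Glue Data Catalog ("raw_db" is hypothetical).
  raw = glue_context.create_dynamic_frame.from_catalog(
      database="raw_db", table_name="transactions"
  )

  # Basic cleaning and transformation: drop rows missing the key, rename a column.
  cleaned = (
      raw.toDF()
      .dropna(subset=["transaction_id"])
      .withColumnRenamed("ts", "transaction_time")
  )

  # Write curated Parquet back to S3 so Athena can query it downstream.
  cleaned.write.mode("overwrite").parquet("s3://example-curated-bucket/transactions/")

  job.commit()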
Data Architecture
  • Design and implement data models (conceptual, logical, and physical) using tools like ER/Studio
  • Develop and maintain data warehouses, data lakes, and operational data stores
  • Develop and maintain data blueprints
  • Create data marts and analytical views to support business intelligence needs using Denodo and RDS (see the view sketch after this list)
  • Implement master data management practices and data governance standards
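The data-mart bullet can be made concrete with a small example. The posting names Denodo and RDS for this work; the sketch below uses Athena through boto3 instead, purely because its view DDL is plain SQL, and every identifier in it is a hypothetical placeholder.

  import boto3

  athena = boto3.client("athena", region_name="ap-southeast-1")

  # A simple analytical view: monthly totals aggregated from the curated
  # table produced by the Glue job sketched earlier.
  create_view = """
  CREATE OR REPLACE VIEW curated_db.monthly_sales AS
  SELECT date_trunc('month', transaction_time) AS month,
         product_id,
         SUM(amount) AS total_amount
  FROM curated_db.transactions
  GROUP BY 1, 2
  """

  athena.start_query_execution(
      QueryString=create_view,
      QueryExecutionContext={"Database": "curated_db"},
      ResultConfiguration={"OutputLocation": "s3://example-curated-bucket/athena-results/"},
  )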
Development and Analytics
  • Develop Python scripts in AWS Glue for data processing and automation
  • Write efficient VQL/SQL queries and stored procedures
  • Design and develop RESTful APIs using modern frameworks and best practices for data services
  • Work with Amazon SageMaker for machine learning model deployment and integration (see the API sketch after this list)
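To show how the API and SageMaker bullets might fit together, here is a hedged sketch of a small data-service endpoint built with FastAPI that forwards a feature vector to a deployed SageMaker endpoint. The endpoint name, route, and payload shape are all assumptions for illustration, not details from this posting.

  import boto3
  from fastapi import FastAPI
  from pydantic import BaseModel

  app = FastAPI(title="data-service")
  runtime = boto3.client("sagemaker-runtime", region_name="ap-southeast-1")

  class ScoreRequest(BaseModel):
      features: list[float]

  @app.post("/v1/score")
  def score(req: ScoreRequest) -> dict:
      # Forward the feature vector to a hypothetical SageMaker endpoint
      # that accepts CSV input.
      response = runtime.invoke_endpoint(
          EndpointName="demand-forecast-endpoint",
          ContentType="text/csv",
          Body=",".join(str(x) for x in req.features),
      )
      return {"prediction": response["Body"].read().decode("utf-8")}

A service like this would typically be run locally with something like uvicorn app:app --reload during development.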
Requirements:
  • At least 4 years of experience in data engineering or a similar role
  • Strong proficiency in Python, VQL, and SQL
  • Experience with AWS services (Glue, Athena, S3, RDS, SageMaker)
  • Knowledge of data virtualisation concepts and tools (preferably Denodo)
  • Experience with BI tools (preferably Tableau, Power BI)
  • Understanding of data modelling and database design principles
  • Familiarity with data governance and master data management concepts
  • Experience with version control systems (GitLab) and CI/CD pipelines
  • Experience working in Agile environments with iterative development practices
  • Strong problem-solving skills and attention to detail
  • Excellent communication skills and ability to work in a team environment
  • Knowledge of AI technologies (Amazon Bedrock, Azure AI, LLMs) would be advantageous