
Data Engineer

VINOVA PTE. LTD.

Singapore

On-site

SGD 60,000 - 80,000

Full time

2 days ago

Job summary

A data solutions firm in Singapore is seeking a Data Engineer to design and maintain data pipelines and ETL processes. You will work with AWS services and data virtualisation tools to ingest, clean, transform, and load data. Ideal candidates should have at least 3 years of experience, strong skills in Python and SQL, and familiarity with BI tools. This role emphasizes communication and collaboration within a team-oriented environment.

Qualifications

  • At least 3 years of experience in data engineering or a similar role.
  • Strong proficiency in Python, VQL, and SQL.
  • Experience with AWS services such as Glue, Athena, S3, RDS, and SageMaker.
  • Knowledge of data governance and master data management.

Responsibilities

  • Design, develop, and maintain data pipelines and ETL processes using AWS services.
  • Implement master data management practices and data governance standards.
  • Develop and maintain data warehouses, data lakes, and operational data stores.

Skills

Data engineering
Python
VQL
SQL
AWS services (Glue, Athena, S3, RDS)
Data governance
CI/CD pipelines
Agile environments
BI tools (Tableau, Power BI)
Problem-solving

Tools

Denodo
GitLab
ER Studio

Job description

  • Design, develop, and maintain data pipelines and ETL processes using AWS services (Glue, Athena, S3, RDS)
  • Work with data virtualisation tools like Denodo and develop VQL queries
  • Ingest and process data from various internal and external data sources
  • Perform data extraction, cleaning, transformation, and loading operations
  • Implement automated data collection processes, including API integrations when necessary
  • Design and implement data models (conceptual, logical, and physical) using tools like ER Studio
  • Develop and maintain data warehouses, data lakes, and operational data stores
  • Develop and maintain data blueprints
  • Create data marts and analytical views to support business intelligence needs using Denodo and RDS
  • Implement master data management practices and data governance standards

Requirements

  • At least 3 years of experience in data engineering or a similar role
  • Strong proficiency in Python, VQL, and SQL
  • Experience with AWS services (Glue, Athena, S3, RDS, SageMaker)
  • Knowledge of data virtualisation concepts and tools (preferably Denodo)
  • Experience with BI tools (preferably Tableau, Power BI)
  • Understanding of data modelling and database design principles
  • Familiarity with data governance and master data management concepts
  • Experience with version control systems (GitLab) and CI/CD pipelines
  • Experience working in Agile environments with iterative development practices
  • Strong problem-solving skills and attention to detail
  • Excellent communication skills and ability to work in a team environment
  • Knowledge of AI technologies (AWS Bedrock, Azure AI, LLMs) would be advantageous
