Data Engineer

NEUTRON PTE. LTD.

Singapore

On-site

SGD 60,000 - 80,000

Full time

Job summary

A leading data engineering firm in Singapore is looking for experienced Data Engineers to develop and maintain data products in their data platform. Responsibilities include creating data pipelines, ensuring data integration, and collaborating with cross-functional teams. The ideal candidate has at least 3 years of data engineering experience, strong Python and AWS skills, and works well in Agile environments.

Qualifications

  • At least 3 years of experience in data engineering or a similar role.
  • Strong proficiency in Python, VQL, SQL, and AWS services.
  • Knowledge of data virtualisation concepts and tools, preferably Denodo.

Responsibilities

  • Design, develop, and maintain data pipelines and ETL processes.
  • Create and maintain technical documentation for data models and systems.
  • Collaborate with cross-functional teams on data engineering initiatives.

Skills

Python
VQL
SQL
AWS services (Glue, Athena, S3, RDS, SageMaker)
Data virtualisation (Denodo)
Agile methodologies

Tools

AWS Glue
Denodo
Tableau
Power BI
GitLab

Job description

To support the development and maintenance of enterprise data products within our organisation's data platform, we are looking for experienced data engineers to join our team. The role's responsibilities include:

Data Engineering and Platform Integration
  • Design, develop, and maintain data pipelines and ETL processes using AWS services (Glue, Athena, S3, RDS)
  • Work with data virtualisation tools like Denodo and develop VQL queries
  • Ingest and process data from various internal and external data sources
  • Perform data extraction, cleaning, transformation, and loading operations
  • Implement automated data collection processes including API integrations when necessary
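To illustrate the extract, clean, transform, and load cycle described above, here is a minimal, standard-library-only Python sketch (the field names and cleaning rules are hypothetical examples, not part of this role's actual codebase):

```python
import csv
import io

def clean_and_transform(raw_csv: str) -> list[dict]:
    """Parse raw CSV text, drop incomplete records, and normalise fields."""
    rows = []
    for row in csv.DictReader(io.StringIO(raw_csv)):
        # Cleaning step: skip rows with missing key fields
        if not row.get("user_id") or not row.get("amount"):
            continue
        # Transformation step: trim whitespace, cast types, normalise casing
        rows.append({
            "user_id": row["user_id"].strip(),
            "amount": round(float(row["amount"]), 2),
            "country": row.get("country", "").strip().upper(),
        })
    return rows

raw = "user_id,amount,country\n u1 ,19.99,sg\n,5.0,sg\nu2,7,my\n"
print(clean_and_transform(raw))
```

In practice, a pipeline like this would typically run as a Glue job reading from and writing to S3 rather than operating on in-memory strings.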
Data Architecture
  • Design and implement data models (conceptual, logical, and physical) using tools like ER Studio
  • Develop and maintain data warehouses, data lakes, and operational data stores
  • Develop and maintain data blueprints
  • Create data marts and analytical views to support business intelligence needs using Denodo, RDS
  • Implement master data management practices and data governance standards
Technical Architecture and Integration
  • Ensure seamless integration between various data systems and applications
  • Implement data security and compliance requirements
  • Design scalable solutions for data integration and consolidation
Development and Analytics
  • Develop Python scripts in AWS Glue for data processing and automation
  • Write efficient VQL/SQL queries and stored procedures
  • Design and develop RESTful APIs using modern frameworks and best practices for data services
  • Work with AWS SageMaker for machine learning model deployment and integration
  • Manage and optimise database performance, including indexing, query tuning, and maintenance
  • Work in an Agile environment and participate in sprint planning, daily stand-ups, and retrospectives
  • Implement and maintain CI/CD pipelines for automated testing and deployment
  • Participate in peer code reviews and pair programming sessions
Documentation and Best Practices
  • Create and maintain technical documentation for data models and systems
  • Follow industry-standard coding practices, version control, and change management procedures
Stakeholder Collaboration
  • Partner with cross-functional teams on data engineering initiatives
  • Gather requirements, conduct technical discussions, implement solutions, and perform testing
  • Collaborate with Product Managers, Business Analysts, Data Analysts, Solution Architects, and UX Designers to build scalable, data-driven products
  • Provide technical guidance and support for data-related queries
Qualifications and Experience
  • At least 3 years of experience in data engineering or a similar role
  • Strong proficiency in Python, VQL, and SQL
  • Experience with AWS services (Glue, Athena, S3, RDS, SageMaker)
  • Knowledge of data virtualisation concepts and tools (preferably Denodo)
  • Experience with BI tools (preferably Tableau, Power BI)
  • Understanding of data modelling and database design principles
  • Familiarity with data governance and master data management concepts
  • Experience with version control systems (GitLab) and CI/CD pipelines
  • Experience working in Agile environments with iterative development practices
  • Strong problem-solving skills and attention to detail
  • Excellent communication skills and ability to work in a team environment
  • Knowledge of AI technologies (AWS Bedrock, Azure AI, LLMs) would be advantageous