Data Engineer

Maestro Human Resource Pte Ltd

Singapore

On-site

SGD 60,000 - 80,000

Full time

Job summary

A leading tech recruitment firm in Singapore is seeking a talented Data Engineer/Software Developer to join their product development team. The successful candidate will be responsible for designing and implementing data integration pipelines and event-driven data processes. Key responsibilities include collaboration with skilled engineers, developing scalable data models, and ensuring data quality and compliance. Candidates should have a degree in Computer Science or Engineering and proficiency in Java. This role provides a great opportunity for growth in a dynamic environment.

Qualifications

  • Bachelor's/Master's degree in Computer Science, Engineering, or related field.
  • Proficiency in at least one object-oriented language, Java preferred.
  • Experience with data lifecycle management.
  • Knowledge of event-driven architecture.
  • Familiarity with data analytics pipelines and data lakehouse concepts.

Responsibilities

  • Design and implement data integration pipelines.
  • Collaborate with Software and Data Engineers.
  • Develop and maintain scalable data models.
  • Design integration connectors for data integration.
  • Implement event-driven processing pipelines.

Skills

Data manipulation languages (SQL)
Object-oriented programming (Java)
Event-driven architecture
Data lifecycle management
Data transformation techniques
Communication skills
Problem-solving skills

Education

Bachelor's/Master's degree in Computer Science/Engineering

Tools

Solace PubSub+
Apache Kafka
MQTT
AMQP
Relational databases
Non-relational databases

Job description

We are looking for an aspiring Data Engineer/Software Developer to join our product development team. You will be responsible for the design and implementation of data integration pipelines and asynchronous data events, using an event-driven architecture to consolidate data from various sources and systems.

About the Role
  • Collaborate with a team of highly skilled Software Engineers and Data Engineers in deploying and delivering software products. Actively participate in product enhancement, particularly in the areas of integration and UI, to improve product quality and reconfigurability.
  • Take ownership of the development of key software components.
  • Design, develop, and maintain scalable data models and schemas using data modelling best practices.
  • Design and develop integration connectors to facilitate data integration from multiple sources and systems, including databases, APIs, log files, streaming platforms, IoT end devices, and external data providers.
  • Data transformation and processing: develop data transformation routines to clean, normalize, and aggregate data. Apply data processing techniques to handle complex data structures, resolve missing or inconsistent data, and prepare the data for analysis or reporting.
  • Apply strong knowledge of data manipulation languages such as SQL to build and maintain complex queries and data pipelines.
  • Implement event-driven processing pipelines using platforms and protocols such as Solace PubSub+, Apache Kafka, MQTT, and AMQP.
  • Ensure compliance with data governance policies and standards to maintain data quality, integrity, security, and consistency.
  • Work in a dynamic environment, managing senior stakeholders from different organisations and agencies.
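The transformation and event-driven responsibilities above can be sketched together in a short example. This is a minimal illustration only, using an in-memory stand-in for a broker such as Solace PubSub+ or Apache Kafka; the topic name, record fields, and temperature conversion are hypothetical and not taken from this role's actual systems.

```python
from collections import defaultdict

class InMemoryBroker:
    """Tiny in-memory pub/sub stand-in for a real event broker."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, event):
        # Deliver each published event to every subscriber of the topic.
        for handler in self._subscribers[topic]:
            handler(event)

def clean_and_normalize(record):
    """Drop records with a missing reading; normalize Fahrenheit to Celsius."""
    if record.get("temp_f") is None:
        return None
    return {
        "device_id": record["device_id"],
        "temp_c": round((record["temp_f"] - 32) * 5 / 9, 2),
    }

broker = InMemoryBroker()
curated = []  # stand-in for a downstream curated store (e.g. a lakehouse table)

def on_reading(event):
    # Consumer: transform each raw event and keep only valid records.
    transformed = clean_and_normalize(event)
    if transformed is not None:
        curated.append(transformed)

broker.subscribe("iot.readings", on_reading)

# Producer: raw events from a hypothetical IoT source.
for raw in [
    {"device_id": "a1", "temp_f": 212.0},
    {"device_id": "a2", "temp_f": None},  # inconsistent record, dropped
    {"device_id": "a3", "temp_f": 32.0},
]:
    broker.publish("iot.readings", raw)

print(curated)  # two cleaned records; the missing reading was filtered out
```

In a production pipeline the broker would be replaced by a real client (e.g. a Kafka consumer group), but the shape of the work — subscribe, transform, filter, sink — stays the same.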
Requirements
  • Bachelor's/Master's degree specializing in Computer Science, Engineering, Industrial Engineering, or a related field. Diploma holders with relevant experience are encouraged to apply.
  • Proficiency in at least one object-oriented language; Java skills are highly desirable.
  • Deep understanding of designing and implementing data lifecycle management.
  • Deep understanding of event-driven architecture, data producers, and data consumers.
  • Knowledge of data analytics pipelines and data lakehouse/warehouse concepts, including curating data for export to analytics tools so that processing pipelines deliver business analytics outcomes.
  • Experience with relational and non-relational databases.
  • Familiarity with product development and design-thinking approaches.
  • Excellent communication, collaboration, and problem-solving skills.
  • Ability to work independently and as part of a team.
  • 5-day work week in the AMK area.
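The data-curation requirement above, aggregating relational data for export to analytics tools, typically comes down to SQL. A minimal sketch using Python's built-in sqlite3 module as a stand-in for a production relational database; the `orders` table and its columns are hypothetical.

```python
import sqlite3

# Hypothetical "orders" table; SQLite stands in for a production RDBMS.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("north", 10.0), ("north", 15.0), ("south", 7.5)],
)

# Curate: aggregate per region, the kind of query an analytics export runs
# before handing results to a BI or data-processing tool.
rows = conn.execute(
    "SELECT region, COUNT(*) AS n, SUM(amount) AS total "
    "FROM orders GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # [('north', 2, 25.0), ('south', 1, 7.5)]
conn.close()
```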