Data Engineer

DUOTECH PTE. LTD.

Singapore

On-site

SGD 70,000 - 100,000

Full time

Job summary

A consulting firm in Singapore is seeking a Data Engineer to oversee data ingestion and integration. Responsibilities include building data pipelines, maintaining a data warehouse, and creating automated reporting systems. The ideal candidate will have strong SQL and Python skills, along with experience in data manipulation and cloud infrastructure. This role offers a great opportunity to work collaboratively within a dynamic team focused on digital transformation.

Qualifications

  • Strong command of relational databases and SQL.
  • Proficiency with Python or R for data manipulation.
  • Ability to learn new techniques and troubleshoot code.

Responsibilities

  • Maintain and build on our data warehouse and analytics environment.
  • Design, implement, and maintain data engineering solutions.
  • Build reports and data visualizations from data.

Skills

SQL
Python
R
Data Manipulation
APIs
Cloud Infrastructure (AWS)
Data Visualization (Power BI, Tableau)

Tools

GitLab

Job description
About Hytech

Hytech is a leading management consulting firm headquartered in Australia and Singapore, specializing in digital transformation for fintech and financial services companies. We provide comprehensive consulting solutions, as well as middle- and back-office support, to empower our clients with streamlined operations and cutting-edge strategies.

With a global team of over 2,000 professionals, Hytech has established a strong presence worldwide, with offices in Australia, Singapore, Malaysia, Taiwan, the Philippines, Thailand, Morocco, Cyprus, Dubai and more.

Overview of role

The Data Engineer will oversee the company’s data ingestion and integration work, including developing a data model, maintaining a data warehouse and analytics environment, and writing scripts for data ingestion, integration, and analysis. This role will work closely and collaboratively with members of the Data & Analytics and Development teams to define requirements, mine and analyze data, integrate data from a variety of sources, and deploy high-quality data pipelines in support of analytics needs. This person will also be responsible for creating automated reporting and data visualization systems based on requests from internal stakeholders.

Job Responsibilities
  • Maintain and build on our data warehouse and analytics environment, the home for various source data generated in the company.
  • Design, implement, test, deploy, and maintain stable, secure, and scalable data engineering solutions and pipelines in support of data and analytics projects, including integrating new sources of data into our central data warehouse, and moving data out to applications.
  • Build reports and data visualizations, using data from the data warehouse and other sources.
  • Produce scalable, replicable code and engineering solutions that help automate repetitive data management tasks.
  • Perform one-off data manipulation and analysis on a wide variety of source data.
  • Implement and monitor best-in-class security measures in our data warehouse and analytics environment.
  • Help other data team staff troubleshoot their SQL, Python, or R code.
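To illustrate the kind of extract-transform-load step the responsibilities above describe, here is a minimal sketch in Python. It uses the standard-library sqlite3 module as a stand-in for the actual warehouse, and the source records, table, and column names are hypothetical examples, not details of Hytech's systems:

```python
import sqlite3

# Extract: hypothetical source records, e.g. pulled from an upstream API
source_rows = [
    {"user_id": 1, "country": "sg", "amount": "42.50"},
    {"user_id": 2, "country": "SG", "amount": "17.00"},
]

# Transform: normalize types and country codes before loading
cleaned = [
    (r["user_id"], r["country"].upper(), float(r["amount"]))
    for r in source_rows
]

# Load: insert into the warehouse (sqlite3 stands in for the real database)
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE payments (user_id INTEGER, country TEXT, amount REAL)")
conn.executemany("INSERT INTO payments VALUES (?, ?, ?)", cleaned)

# Downstream reporting then queries the warehouse, not the raw source
total = conn.execute("SELECT SUM(amount) FROM payments").fetchone()[0]
```

In a production pipeline the same extract/transform/load stages would be scheduled, monitored, and version-controlled rather than run as a one-off script.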
Requirements
  • Strong command of relational databases and SQL, including the ability to Extract, Transform, and Load (ETL) data into a relational database.
  • Proficiency with Python or R, especially for data manipulation and analysis, and ability to build, maintain and deploy sequences of automated processes with these tools.
  • General data manipulation skills: read in data, process and clean it, transform and recode it, merge different data sets together, reformat data between wide and long, etc.
  • Demonstrated ability to learn new techniques and troubleshoot code without support, e.g. finding answers to common programming challenges on Google. In other words, be able to learn on the job.
  • Demonstrated ability to write clear code that is well-documented and stored in a version control system (we use GitLab).
  • Ability to use APIs to push and pull data from various data systems and platforms.
  • Experience working with cloud infrastructure services like Amazon Web Services is preferred.
  • Experience with data visualization (Power BI, Tableau) is helpful, but not required.
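The "general data manipulation skills" requirement above (cleaning, merging, and reshaping between wide and long formats) can be sketched with pandas. The data here is a made-up example, not anything from the role:

```python
import pandas as pd

# Hypothetical quarterly revenue in wide format: one column per quarter
wide = pd.DataFrame({
    "region": ["APAC", "EMEA"],
    "q1": [100, 80],
    "q2": [120, 90],
})

# Reshape wide to long: one row per (region, quarter) pair
long = wide.melt(id_vars="region", var_name="quarter", value_name="revenue")

# Merge with a second (hypothetical) lookup table of targets
targets = pd.DataFrame({"region": ["APAC", "EMEA"], "target": [110, 85]})
merged = long.merge(targets, on="region", how="left")
```

The reverse reshape (long back to wide) is the same idea with `pivot`, and the same merge/reshape patterns apply whether the data comes from SQL, an API, or flat files.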