Reporting and Analytics Developer/Data Engineer

INNOVATIQ TECHNOLOGIES PTE. LTD.

Singapore

On-site

SGD 60,000 - 90,000

Full time


Job summary

A dynamic technology company in Singapore is seeking a Data Engineer to enhance data processing and reporting solutions. The role involves collaboration with technical teams, creating detailed documentation, and optimizing data workflows using advanced tools like Python and SQL. The ideal candidate will have expertise in data modeling, analytics, and experience with various data platforms, ensuring high availability and performance in cloud environments.

Skills

SQL
Data Modelling
Data Analysis
Python
PySpark
Linux
ETL Tools
SAP BO
Tableau
DevOps
Data Virtualization

Job description

Key Responsibilities

• Analyse the Authority’s data needs and document the requirements.

• Refine data collection/consumption by migrating data collection to more efficient channels.

• Plan, design and implement data engineering jobs and reporting solutions to meet the analytical needs.

• Develop test plans and scripts for system testing, and support user acceptance testing.

• Build reports and dashboards according to user requirements.

• Work with the Authority’s technical teams to ensure smooth deployment and adoption of new solutions.

• Ensure the smooth operations and service level of IT solutions.

• Investigate and resolve production issues.

What we are looking for

• Good understanding of, and a record of completing, projects using waterfall or Agile methodologies.

• Strong SQL, data modelling and data analysis skills are a must.

• Hands-on experience building big data engineering jobs using Python, PySpark, Linux, and ETL tools such as Informatica.

• Hands-on experience with a reporting or visualisation tool such as SAP BO or Tableau is a must.

• Hands-on experience with DevOps deployment and data virtualisation tools such as Denodo will be an advantage.

• A track record of implementing systems using Hive, Impala, and the Cloudera Data Platform is preferred.

• Good understanding of analytics and data warehouse implementations.

• Ability to troubleshoot complex issues, from system resource bottlenecks to application stack traces.

• A track record of implementing highly available, high-performance, and secure systems hosted in data centres or hybrid cloud environments will be an added advantage.

• Passion for automation, standardization, and best practices.
