
Data Modeler - Cloud Architecture

Verinon Technology Solution

Kuala Lumpur

On-site

MYR 80,000 - 110,000

Full time


Job summary

A technology solutions firm in Kuala Lumpur is currently seeking a knowledgeable and skilled Data Modeler. The ideal candidate should have over 3 years of experience in cloud architecture and data modeling, with a strong background in SQL and NoSQL databases. The role involves developing data models, implementing data architecture solutions, and collaborating with business intelligence teams for analytics. Candidates should possess problem-solving skills and familiarity with the banking and financial services industry.

Qualifications

  • Minimum 3 years hands-on experience in Cloud Architecture and cloud-based databases.
  • Experience in data modelling tools and strong database development skills.
  • Solid knowledge of database programming languages.
  • Experience in Agile/Scrum development process.

Responsibilities

  • Develop conceptual, logical, and physical data models.
  • Implement data architecture and modelling solutions.
  • Normalise data to reduce redundancy and improve integrity.
  • Collaborate with BI Data Engineers for data model development.

Skills

Cloud Architecture (Azure, GCP, AWS)
Data modelling tools
Data warehousing concepts
SQL
NoSQL
Data visualization tools (e.g., Power BI, Tableau)
Python
DBT
Agile/Scrum

Education

Bachelor/Master's degree in Computer Science or related field

Job description

We are currently seeking a knowledgeable and skilled Data Modeler to join the team. The ideal candidate should have an excellent understanding of data modelling concepts and be able to understand the overall investment data structure. Successful candidates should be detail-oriented, analytical thinkers with a knack for understanding and working with complex investment data and operational trading systems.

Principal Accountabilities

Develop conceptual, logical, and physical data models, and implement RDBMS, operational data store (ODS), data mart, and data lake solutions on target platforms (SQL/NoSQL).
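
To illustrate the logical-to-physical progression this accountability describes, here is a minimal Python sketch: a dataclass stands in for a logical model of a hypothetical trade entity, and a small helper renders it as physical DDL for a relational target. All entity names, columns, and type mappings below are illustrative assumptions, not part of this role's actual schema.

```python
from dataclasses import dataclass, fields

# Hypothetical logical model for a trade record (names are illustrative).
@dataclass
class Trade:
    trade_id: int
    instrument_id: int
    quantity: float
    counterparty: str

# One possible logical -> physical mapping for a relational platform.
TYPE_MAP = {int: "BIGINT", float: "DECIMAL(18,4)", str: "VARCHAR(64)"}

def to_ddl(model, table_name):
    """Render a physical CREATE TABLE statement from a logical model."""
    cols = ",\n  ".join(f"{f.name} {TYPE_MAP[f.type]}" for f in fields(model))
    return f"CREATE TABLE {table_name} (\n  {cols}\n);"

print(to_ddl(Trade, "trade"))
```

In practice this mapping step is what distinguishes the physical model from the logical one: the same logical entity could equally be rendered as a NoSQL document schema or a dimensional table.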

Design, implement, and document data architecture and data modelling solutions, including relational, dimensional, and NoSQL databases. These solutions support enterprise information management, business intelligence, machine learning, data science, and other business interests.

Implement business and IT data requirements through new data strategies and designs across all data platforms (relational, dimensional, and NoSQL) and data tools (reporting, visualization, analytics, and machine learning).

Triage new requirements, provide impact assessments and appropriate estimates for amending data models in line with solution designs and requirements.

Normalize data to reduce redundancy and improve data integrity.

Identify the architecture, infrastructure, and interfaces to data sources, tools supporting automated data loads, security concerns, analytic models, and data visualization.

Work proactively and independently to address project requirements and articulate issues/challenges to reduce project delivery risks.

Collaborate with BI Data Engineers to translate business requirements into effective data models for Data Lake/Data Mart environments, ensuring data accuracy, integrity and consistency.

Coordinate with the project team on data model development and produce mapping documentation. Collaborate with business analysts on defect root-cause analysis, and resolve and implement fixes for data and data-model-related defects.

Oversee and govern the expansion of existing data architecture and the optimization of data query performance via best practices. The candidate must be able to work independently and collaboratively.

Perform administrative tasks to ensure project efficiency, including document preservation, reporting, and maintaining data warehouse system and data services.

Knowledge, Skills, and Abilities

Minimum 3 years hands-on experience in Cloud Architecture (Azure, GCP, AWS) & cloud-based databases (Synapse, Databricks, Snowflake, Redshift) and various data integration techniques (API, stream, file) using DBT, SQL/PySpark, Python.

Experience in data modelling tools, a strong grasp of data warehousing concepts, and strong database development skills such as complex SQL queries and stored procedures.

Strong understanding of data modelling concepts, including normalization techniques (1NF, 2NF, 3NF, BCNF).
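To illustrate what normalisation buys, here is a minimal sketch in plain Python (column names and data are hypothetical): repeated instrument attributes are split out of a denormalised trade table into their own table keyed by ISIN, removing redundancy in the spirit of 3NF.

```python
# Denormalised rows: issuer details are repeated on every trade (redundant).
rows = [
    {"trade_id": 1, "isin": "XS0000000001", "issuer": "Acme Corp", "qty": 100},
    {"trade_id": 2, "isin": "XS0000000001", "issuer": "Acme Corp", "qty": 250},
    {"trade_id": 3, "isin": "XS0000000002", "issuer": "Beta PLC",  "qty": 75},
]

# Toward 3NF: instrument attributes move to their own table keyed by ISIN,
# so each issuer is stored once per instrument rather than once per trade.
instruments = {r["isin"]: {"isin": r["isin"], "issuer": r["issuer"]} for r in rows}
trades = [{"trade_id": r["trade_id"], "isin": r["isin"], "qty": r["qty"]} for r in rows]

assert len(instruments) == 2  # two unique instruments, redundancy removed
```

The same decomposition, expressed in SQL, is what the normal forms (1NF through BCNF) formalise: each non-key attribute depends on the key, the whole key, and nothing but the key.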

Proven work experience with database and business intelligence systems and process automation.

Solid knowledge of database programming languages, mainly SQL and NoSQL, and of data visualisation tools to assist with dashboard development and report generation (e.g., MicroStrategy, Qlik, Power BI, Tableau).

Proficient in leveraging data platforms and tools to streamline data engineering and business intelligence workflows.

Experience with facilitating data exchange between systems (e.g., API, SFTP).

Experience working in an Agile/Scrum development process, managing various projects, maintenance, technical support, etc.

Excellent analytical skills for working with structured and unstructured datasets, excellent problem-solving and troubleshooting abilities, and excellent written and verbal communication skills.

Education, Experience and Certifications

Bachelor/Master's degree in Computer Science, Information Technology, Management Information Systems, Business Administration, Engineering, Mathematics, or Statistics

At least 3 years of experience working in Data Modeler or Solution Designer roles.

Prior experience on data engineering or ETL-intensive projects.

Domain knowledge and prior experience in the banking and financial services industry.

Experience in cloud database solutions.

Good understanding of investment and market data.
