Data Architect

Two95 International Inc.

Kuala Lumpur

On-site

MYR 120,000 - 150,000

Full time

30+ days ago

Job summary

A data solutions company in Kuala Lumpur is seeking a seasoned Data Architect to drive enterprise data initiatives. You will establish a robust data ecosystem, leveraging your extensive experience in Big Data and data management. A key responsibility will be to collaborate with various teams to ensure data security compliance and effective data governance across the organization. The ideal candidate has over 8 years of experience and a Bachelor's degree in a related field.

Qualifications

  • Minimum 8 years of experience architecting and developing large-scale data solutions.
  • Expertise in architecting modern data ingestion frameworks.
  • Hands-on experience in the Data Management Lifecycle and Data Governance.

Responsibilities

  • Execute Enterprise data initiatives to establish a governed data ecosystem.
  • Deliver Data & AI solutions and create architecture using leading tech frameworks.
  • Collaborate with teams to ensure compliance with data security and privacy policies.

Skills

Big Data analysis
Data management
Data modeling
Data security
Data warehousing
Data governance
Machine learning
Communication skills

Education

Bachelor's degree in Computer Science or related field

Tools

Hadoop
Kafka
Talend
SQL

Job description

SCOPE & AUTHORITY

Data Architecture:

  • Execute Enterprise data initiatives and programs to establish a governed, curated, and agile data ecosystem that enables the business to make data-driven decisions.
  • Translate strategic requirements into a usable Enterprise information architecture (e.g., enterprise data model, associated metamodel, common business vocabulary, and naming taxonomy).
  • Develop and maintain architecture artifacts, frameworks, and patterns as references for the development teams across the Group.
  • Deliver tangible Data & AI solutions: choose the right technology, evaluate architecture evolution, and create and maintain architectures using leading Data & AI technology frameworks.
  • Participate in key transformational project design reviews as part of the methodology process to ensure application designs adhere to enterprise information architecture guidelines.
  • Monitor regulatory guidelines such as consumer privacy laws, data retention policies, outsourced data, and specific industry guidelines to determine their impact on the enterprise information architecture.
  • Provide a capability assessment tool for Data Management & Governance across all dimensions of data management, adhering to DMBOK v2.
  • Establish and monitor the operations of the data governance organization across the Group.
  • Drive the implementation of corrective measures to ensure data governance policies and procedures are followed.
  • Publish data reference architecture, data architecture principles, best practices, and design patterns to enable data engineering teams to build scalable and resilient platforms.
  • Explore new technology trends and leverage them to simplify the data ecosystem (e.g., developing architectures, strategies, and policies around data governance, including master data management, metadata management, data quality, and data profiling).

Data Management (Regulatory & Compliance):

Academic Qualification:

Bachelor's degree in computer science, computer engineering, electrical engineering, systems analysis, or a related field of study.

Minimum 8 years of experience architecting, designing, and developing large-scale data solutions utilizing a mixture of Big Data and relational database platforms.

Skills Required:

Requires advanced knowledge of Big Data analysis and data management tools in order to recommend and provide industry best practices.

You will drive end-to-end data solutions and data management strategy across data and analytics platforms.

Enterprise-scale expertise in data analysis, modelling, data security, data warehousing, metadata management, and data quality.

Extensive knowledge of and experience in architecting modern data ingestion frameworks and highly scalable distributed systems using open-source and emerging data architecture patterns.
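For illustration only, a minimal sketch of the consuming edge of such an ingestion framework, assuming the kafka-python client; the topic, broker address, and group id below are hypothetical placeholders, not part of this role's actual stack:

```python
# Minimal Kafka ingestion sketch (kafka-python client).
# Topic, broker, and group id are hypothetical placeholders.
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "events",                                 # hypothetical topic
    bootstrap_servers=["localhost:9092"],     # placeholder broker
    group_id="ingestion-sketch",
    auto_offset_reset="earliest",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)

# In a real ingestion framework each record would be validated and
# landed in a raw zone (e.g. HDFS or object storage); here we just
# print it to show the consume loop.
for record in consumer:
    print(record.topic, record.partition, record.offset, record.value)
```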

Data/information modelling expertise at the enterprise level.

Experience with Master Data Management, Metadata Management, and Data Quality tools, as well as Data Security and Privacy methods and frameworks.

  • Hands-on experience in the Data Management Lifecycle, Data Modelling, and Data Governance.
  • Experience with Hadoop clusters, in-memory processing, GPGPU processing, and parallel distributed computing systems.
  • Experience building data pipelines using Kafka, Flume, and accelerated stream processing.
  • Deep understanding of Apache Hadoop 1/2 and the Hadoop ecosystem, with experience in one or more relevant tools (Sqoop, Flume, Kafka, Oozie, Hue, ZooKeeper, HCatalog, Solr, Avro).
  • Good knowledge of Elasticsearch and Solr.
  • Experience designing NoSQL, HDFS, Hive, and HBase data marts and creating data lakes.
  • Familiarity with one or more SQL-on-Hadoop technologies (Hive, Pig, Impala, Spark SQL, Presto).
  • At least 10 years of experience with data warehouse design for RDBMSs such as Oracle, MS SQL Server, PostgreSQL, and MySQL.
  • Experience with Service-Oriented Architecture (SOA), web services, enterprise data management, information security, applications development, and cloud-based architectures.
  • Experience with enterprise data management technologies, including database platforms, ETL tools such as Talend/Pentaho (developing Spark ETL jobs; a sketch of such a job follows this list), and SQL.
  • Experience in languages such as Java, PHP, Python, and/or R on Linux OS; experience with JavaScript, Scala, and Windows OS is also relevant.
  • Experience implementing machine-learning solutions, developing in multiple languages, and performing statistical analysis, with familiarity across the broader range of approaches used in practical machine-learning applications.
  • Experience in AI integration, Natural Language Processing, and AI application programming.
  • Experience with telecommunications, IoT, data visualization, and GIS projects; current hands-on implementation experience is required.
  • Database Administrator experience on big data projects and TOGAF certification will be an advantage.
  • Customer-facing skills to represent Big Data architectures well within the OpCo environments and to drive discussions with senior personnel regarding trade-offs, best practices, project management, and risk mitigation.
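As a point of reference for the Spark ETL bullet above, here is a minimal PySpark sketch of such a job; the paths, column names, and table layout are hypothetical and not taken from this posting:

```python
# Minimal PySpark ETL sketch: read raw landed CSV, cleanse it, and
# write a curated, partitioned Parquet table. All paths and column
# names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("curated-events-etl")
    .enableHiveSupport()          # allows reading/writing Hive tables
    .getOrCreate()
)

# Extract: raw CSV landed by the ingestion layer (hypothetical path).
raw = spark.read.option("header", True).csv("hdfs:///landing/events/")

# Transform: deduplicate, type the timestamp, derive a partition key.
curated = (
    raw.dropDuplicates(["event_id"])
       .filter(F.col("event_id").isNotNull())
       .withColumn("event_ts", F.to_timestamp("event_ts"))
       .withColumn("event_date", F.to_date("event_ts"))
)

# Load: partitioned Parquet in the curated zone (hypothetical path).
(curated.write
        .mode("overwrite")
        .partitionBy("event_date")
        .parquet("hdfs:///curated/events/"))

spark.stop()
```

In practice an ETL tool such as Talend or Pentaho would typically generate or orchestrate jobs of this shape rather than leaving them entirely hand-written.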

Collaborate with the OpCos' analytics, data science, and full-stack teams to ensure structured and unstructured data capture and ingestion per design requirements.

Work with Axiata's IT and security architects to ensure compliance with data security and privacy policies.

Manage and lead end-to-end data lifecycle management activities and ensure consistency, quality, and availability across data management (a responsibility shared largely with the Data Engineer role).

Excellent communication skills and the ability to convey complex topics through effective documentation as well as presentations.
