Data Platform Architect

Master-Works

Riyadh

On-site

SAR 200,000 - 300,000

Full time

2 days ago

Job summary

A leading technology consultancy is seeking an Enterprise Data Platform Architect in Riyadh, Saudi Arabia. This strategic role involves guiding the design and implementation of enterprise-wide data platforms, optimizing data flow, and ensuring compliance with standards. The ideal candidate will have extensive experience with Denodo and the Hadoop ecosystem, strong SQL skills, and at least 7 years in data projects. This is a crucial position for promoting data integrity and scalability across the organization.

Qualifications

  • Extensive experience with Denodo Platform and Cloudera Hadoop ecosystem.
  • Strong expertise in SQL and data modeling.
  • Proficient in big data tools like Spark, Hive, and Kafka.

Responsibilities

  • Architect, implement, and maintain enterprise-scale data solutions.
  • Design and monitor large-scale data pipelines for performance.
  • Collaborate with stakeholders to translate requirements into robust solutions.

Skills

Denodo Platform
SQL
Data modeling
Problem-solving
Cloud data services
Big data tools

Education

Bachelor’s or Master’s degree in Computer Science, Data Engineering, or related field

Tools

Cloudera
Hadoop
Spark
Hive
Kafka

Job description

Master Works is excited to invite applications for the position of Enterprise Data Platform Architect. In this strategic role, you will guide the design and implementation of enterprise-wide data platforms that facilitate effective data management, analytics, and governance. You will work collaboratively with stakeholders across the organization to develop data architecture strategies that empower the business while ensuring compliance with industry standards. Your expertise will play a crucial role in optimizing data flow, storage, and accessibility, making data a valuable asset for decision-making. As a champion for best practices in data architecture, you will lead initiatives to promote data integrity, security, and scalability throughout the enterprise, ultimately transforming the way Master Works leverages its data assets for business success.

Key Responsibilities
  • Architect, implement, and maintain enterprise-scale data solutions, combining data virtualization (Denodo) and big data ecosystem technologies (Cloudera, Hadoop, Spark, Hive, Kafka, etc.).
  • Integrate complex structured and unstructured data sources (SQL/NoSQL, cloud platforms, applications) into unified, high-performance data layers.
  • Design, optimize, and monitor large-scale data pipelines, virtual views, and workflows for high-performance, low-latency access (an illustrative sketch follows this list).
  • Implement and enforce data governance, security, and access control policies across all data platforms.
  • Collaborate with data engineers, analysts, and business stakeholders to translate requirements into scalable and robust solutions.
  • Troubleshoot, monitor, and continuously improve system performance, reliability, and scalability.
  • Maintain best practices, documentation, and knowledge sharing for enterprise data platforms.
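
For illustration only, the kind of pipeline described above might look like the following minimal Spark Structured Streaming sketch, which reads events from a Kafka topic and lands them in a Parquet-backed curated layer. The broker address, topic name, schema, and paths are hypothetical placeholders rather than details from this posting, and a production pipeline would also cover schema management, governance, and monitoring.

    # Illustrative sketch only: reads JSON order events from a (hypothetical)
    # Kafka topic and appends them to a Parquet-backed curated layer.
    # Requires the spark-sql-kafka connector on the classpath.
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col, from_json
    from pyspark.sql.types import StringType, StructField, StructType, TimestampType

    spark = SparkSession.builder.appName("orders-ingest-sketch").getOrCreate()

    # Assumed event schema; a real platform would manage this via a schema registry.
    event_schema = StructType([
        StructField("order_id", StringType()),
        StructField("customer_id", StringType()),
        StructField("event_time", TimestampType()),
    ])

    raw = (
        spark.readStream.format("kafka")
        .option("kafka.bootstrap.servers", "broker-1:9092")  # placeholder broker
        .option("subscribe", "orders")                        # placeholder topic
        .load()
    )

    # Kafka delivers the payload as bytes; parse it into typed columns.
    events = (
        raw.select(from_json(col("value").cast("string"), event_schema).alias("e"))
        .select("e.*")
    )

    query = (
        events.writeStream.format("parquet")
        .option("path", "/data/curated/orders")               # placeholder path
        .option("checkpointLocation", "/data/checkpoints/orders")
        .outputMode("append")
        .start()
    )
    query.awaitTermination()

Checkpointing to a durable location is what makes the stream restartable after failure, which is relevant to the reliability and monitoring responsibilities above.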

Requirements
  • Extensive experience with Denodo Platform, Cloudera Hadoop ecosystem, and enterprise data virtualization.
  • Strong expertise in SQL, data modeling, query optimization, and distributed computing concepts (a brief query sketch follows this list).
  • Proficient in big data tools: Spark, Hive, Impala, HBase, Kafka, and Sqoop.
  • Solid understanding of ETL processes, data integration, and cloud data services.
  • Proven ability to manage complex, enterprise-scale data projects with high-quality results.
  • Excellent problem-solving, analytical, and communication skills.
  • Bachelor’s or Master’s degree in Computer Science, Data Engineering, or related field.
  • Minimum of 7 years of experience in a related field.
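
Again purely as a sketch, and with hypothetical table, path, and column names, the SQL and data-modeling expectations above could translate into Spark SQL queries over that curated layer, for example:

    # Illustrative sketch only: registers the curated Parquet data as a
    # temporary view and runs a simple reporting aggregate over it.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("orders-reporting-sketch").getOrCreate()

    # Load the curated layer produced by the ingestion sketch above (assumed path).
    orders = spark.read.parquet("/data/curated/orders")
    orders.createOrReplaceTempView("orders")

    # A typical reporting query: daily order counts per customer.
    daily_counts = spark.sql(
        """
        SELECT customer_id,
               CAST(event_time AS DATE) AS order_date,
               COUNT(*)                 AS orders
        FROM   orders
        GROUP  BY customer_id, CAST(event_time AS DATE)
        """
    )
    daily_counts.show()

In an architecture like the one described here, such curated views would typically be exposed to consumers through the data virtualization layer (Denodo) rather than queried directly.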