Data Architect – Data Lakehouse Transformation

Confidential Company

Jeddah

On-site

SAR 80,000 - 120,000

Full time

30+ days ago

Job summary

An established industry player is seeking a highly skilled Data Architect to spearhead a transformative Data Lakehouse initiative for a prominent banking client. This role demands extensive experience in data architecture and platform design, particularly with Cloudera and Teradata, to develop scalable and secure data solutions. The ideal candidate will collaborate with cross-functional teams to ensure alignment with enterprise data strategies while driving modernization efforts. If you are passionate about data architecture and eager to make a significant impact in the banking sector, this opportunity is tailored for you.

Qualifications

  • 12+ years of experience in data architecture and platform design.
  • Strong hands-on experience in Cloudera, Teradata, and Informatica.

Responsibilities

  • Design end-to-end architecture for Data Lakehouse solutions.
  • Lead modernization initiatives from legacy DWH to Cloudera architecture.

Skills

Data Architecture
Cloudera
Teradata
Informatica
SQL
Data Security
Data Governance
Data Modeling
Performance Optimization
Data Ingestion

Education

Bachelor's Degree in Computer Science
Master's Degree in Data Science

Tools

Power BI
Kafka
Airflow
Control-M

Job description

Data Architect – Data Lakehouse Transformation

We are looking for a highly skilled Data Architect with a deep understanding of modern data architectures to support a large-scale Data Warehouse to Data Lakehouse Transformation initiative for a leading banking client. The ideal candidate will have a strong background in data platform architecture, solution design, and implementation, with expertise in Cloudera, Teradata, and Informatica, and a solid understanding of banking data domains.

This role will play a pivotal part in designing scalable, secure, and high-performance data solutions that align with the bank’s enterprise data strategy.

Key Responsibilities:

  • Design and define the end-to-end architecture for the Data Lakehouse solution covering Bronze, Silver, and Gold layers, metadata management, and data governance.
  • Lead data platform modernization initiatives involving migration from legacy DWH to modern Cloudera-based architecture.
  • Translate business and functional requirements into scalable data architecture solutions.
  • Collaborate with engineering, platform, analytics, and business teams to define data flows, ingestion strategies, transformation logic, and consumption patterns.
  • Ensure architectural alignment with enterprise data standards, security guidelines, and regulatory requirements.
  • Define data modeling standards and oversee data modeling efforts across layers (relational and big data).
  • Partner with the implementation oversight partner to review and validate logical and physical data models.
  • Drive architecture reviews, performance tuning, and capacity planning for the data ecosystem.
  • Guide and mentor data engineering teams on architectural best practices.

Required Skills and Experience:

  • 12+ years of experience in data architecture, data platform design, or enterprise architecture roles.
  • Strong hands-on experience in Cloudera (Hadoop ecosystem, Hive, HDFS, Spark), Teradata, Informatica PowerCenter/IDQ, and SQL-based platforms.
  • Deep understanding of data ingestion, curation, transformation, and consumption in both batch and near real-time.
  • Banking industry experience with familiarity across domains such as retail, corporate banking, credit risk, finance, and regulatory reporting.
  • Proficiency in designing for scalability, performance optimization, and data security/compliance.
  • Solid experience with data lakehouse concepts, open table formats (Iceberg/Delta), and layered architectures.
  • Experience integrating BI/reporting platforms (e.g., Power BI, Cognos) and downstream data products.

Preferred Attributes:

  • Experience with Kafka/NiFi for streaming ingestion, and with orchestration tools such as Control-M or Airflow.
  • Knowledge of metadata, lineage, and data catalog tools.
  • Familiarity with hybrid deployment models (on-prem and cloud) and DevOps/DataOps pipelines.
  • TOGAF, CDMP, or DAMA certification is a plus.

Employment Type

    Full Time

Company Industry

  • IT - Software Services

Department / Functional Area

  • IT Software

Keywords

  • Architect
  • Data Lakehouse
