Data Architect

New Era Solutions

Bengaluru

On-site

INR 25,00,000 – 31,00,000

Full time

Job summary

A leading tech firm located in Bangalore is seeking an experienced Data Architect to design and implement scalable data architectures. The role involves collaborating with cross-functional teams to develop data solutions using cloud platforms such as AWS and Azure. Applicants should have over 8 years of experience in data architecture and engineering, along with strong proficiency in SQL and various database technologies. The position offers a competitive annual salary between ₹25,00,000 and ₹31,00,000.

Qualifications

  • 8+ years of experience in data engineering, data architecture, or database management.
  • Proven expertise in designing data models, data warehouses, and large-scale distributed data systems.
  • Strong knowledge of database technologies and cloud data platforms.

Responsibilities

  • Design and architect scalable, secure, and high-performance data systems.
  • Develop conceptual, logical, and physical data models.
  • Implement data warehousing solutions using modern cloud platforms.

Skills

  • Proficiency in SQL
  • Data modeling tools (ERwin, ER/Studio)
  • ETL/ELT tools (Informatica, Talend, Azure Data Factory)
  • Big data tools (Hadoop, Spark, Kafka, Hive)
  • API integrations and microservices
  • Data governance and metadata management

Education

Bachelor’s or Master’s degree in Computer Science, IT, or a related field

Tools

  • SQL Server
  • Oracle
  • MySQL
  • PostgreSQL
  • NoSQL databases
  • AWS Redshift
  • Azure Synapse
  • Google BigQuery
  • Snowflake

Job description

Location: Bangalore

Salary: ₹25,00,000 – ₹31,00,000 per year

Role Overview

The Data Architect is responsible for designing, implementing, and maintaining scalable data architectures that support enterprise analytics, business intelligence, and data-driven decision‑making. This role involves building robust data models, optimizing data flows, ensuring data quality, and establishing governance frameworks. The Data Architect collaborates closely with engineering, product, and analytics teams to develop modern data solutions using cloud platforms, big data technologies, and advanced integration tools.

Key Responsibilities
  • Design and architect scalable, secure, and high‑performance data systems that support enterprise‑wide analytics and reporting.
  • Develop conceptual, logical, and physical data models aligned with business objectives.
  • Oversee the design of data pipelines, ETL/ELT workflows, and data integration frameworks.
  • Ensure data accuracy, reliability, and availability through effective governance and quality management practices.
  • Implement data warehousing solutions using modern cloud platforms such as AWS, Azure, or Google Cloud.
  • Define data architecture standards, best practices, and development guidelines.
  • Optimize database performance, storage architecture, and data lifecycle management.
  • Collaborate with cross‑functional teams to understand requirements and translate them into technical architecture solutions.
  • Evaluate and implement new technologies, tools, and platforms to enhance the data ecosystem.
  • Ensure compliance with security, privacy, and regulatory standards related to data management.

Essential Qualifications
  • Bachelor’s or Master’s degree in Computer Science, Information Technology, Data Engineering, or related fields.
  • 8+ years of experience in data engineering, data architecture, or database management.
  • Proven expertise in designing data models, data warehouses, and large‑scale distributed data systems.
  • Strong knowledge of database technologies such as SQL Server, Oracle, MySQL, PostgreSQL, or NoSQL databases.
  • Experience with cloud data platforms (AWS Redshift, Azure Synapse, Google BigQuery, Snowflake).

Skills Required

Technical Skills
  • Proficiency in SQL, data modeling tools (ERwin, ER/Studio), and database design.
  • Strong experience with ETL/ELT tools (Informatica, Talend, Azure Data Factory, dbt, SSIS).
  • Expertise in big data tools such as Hadoop, Spark, Kafka, Hive, or Databricks.
  • Familiarity with API integrations, microservices, and streaming data architectures.
  • Strong understanding of data governance, metadata management, MDM, and data quality frameworks.
  • Knowledge of security practices including encryption, masking, and compliance frameworks.