Job Description – Data Architect
Location: Bangalore
Salary: ₹25,00,000 – ₹31,00,000 per year
Role Overview
The Data Architect is responsible for designing, implementing, and maintaining scalable data architectures that support enterprise analytics, business intelligence, and data-driven decision-making. The role involves building robust data models, optimizing data flows, ensuring data quality, and establishing governance frameworks. The Data Architect collaborates closely with engineering, product, and analytics teams to develop modern data solutions using cloud platforms, big data technologies, and advanced integration tools.
Key Responsibilities
- Design and architect scalable, secure, and high‑performance data systems that support enterprise‑wide analytics and reporting.
- Develop conceptual, logical, and physical data models aligned with business objectives.
- Oversee the design of data pipelines, ETL/ELT workflows, and data integration frameworks.
- Ensure data accuracy, reliability, and availability through effective governance and quality management practices.
- Implement data warehousing solutions using modern cloud platforms such as AWS, Azure, or Google Cloud.
- Define data architecture standards, best practices, and development guidelines.
- Optimize database performance, storage architecture, and data lifecycle management.
- Collaborate with cross‑functional teams to understand requirements and translate them into technical architecture solutions.
- Evaluate and implement new technologies, tools, and platforms to enhance the data ecosystem.
- Ensure compliance with security, privacy, and regulatory standards related to data management.
Essential Qualifications
- Bachelor’s or Master’s degree in Computer Science, Information Technology, Data Engineering, or a related field.
- 8+ years of experience in data engineering, data architecture, or database management.
- Proven expertise in designing data models, data warehouses, and large‑scale distributed data systems.
- Strong knowledge of relational databases (SQL Server, Oracle, MySQL, PostgreSQL) and NoSQL databases.
- Experience with cloud data platforms (AWS Redshift, Azure Synapse, Google BigQuery, Snowflake).
Skills Required
Technical Skills
- Proficiency in SQL, data modeling tools (ERwin, ER/Studio), and database design.
- Strong experience with ETL/ELT tools (Informatica, Talend, Azure Data Factory, dbt, SSIS).
- Expertise in big data tools such as Hadoop, Spark, Kafka, Hive, or Databricks.
- Familiarity with API integrations, microservices, and streaming data architectures.
- Strong understanding of data governance, metadata management, master data management (MDM), and data quality frameworks.
- Knowledge of security practices including encryption, masking, and compliance frameworks.