Better Future Solutions - Data Modeler - Big Data

EbizON

Dehradun

On-site

INR 20,00,000 - 30,00,000

Full time

Today

Job summary

A leading data solutions company located in Dehradun, India, is seeking an experienced Data Architect. The role requires over 10 years of expertise in data modeling, proficiency in data architecture practices, and experience with the Azure Data Platform. The ideal candidate will excel in a collaborative, agile environment and contribute to enterprise data initiatives, supporting business goals through effective data structures and governance.

Qualifications

  • 10+ years of hands-on experience in data modeling in Big Data environments.
  • Expertise in OLTP, OLAP, dimensional modeling, and enterprise data warehouse practices.
  • Proficient in modeling methodologies including Kimball, Inmon, and Data Vault.

Responsibilities

  • Design and develop conceptual, logical, and physical data models.
  • Build, maintain, and optimize data models within Databricks Unity Catalog.
  • Collaborate with data engineers and stakeholders to ensure alignment with business goals.

Skills

Data modeling in Big Data
OLTP
OLAP
Dimensional modeling
Enterprise data warehouse practices
SQL
Apache Spark
Azure Data Platform
Cross-functional teamwork

Education

Bachelor's or Master's degree in Computer Science or related field

Tools

Databricks
Delta Lake
Azure Data Factory
ER/Studio
SQLDBM

Job description

Key Responsibilities

  • Design and develop conceptual, logical, and physical data models to support enterprise data initiatives.
  • Build, maintain, and optimize data models within Databricks Unity Catalog.
  • Develop efficient data structures using Delta Lake, optimizing for performance, scalability, and reusability.
  • Collaborate with data engineers, architects, analysts, and stakeholders to ensure data model alignment with ingestion pipelines and business goals.
  • Translate business and reporting requirements into robust data architecture using best practices in data warehousing and Lakehouse design.
  • Maintain comprehensive metadata artifacts including data dictionaries, data lineage, and modeling documentation.
  • Enforce and support data governance, data quality, and security protocols across data ecosystems.
  • Continuously evaluate and improve modeling processes.

Skills and Experience

  • 10+ years of hands-on experience in data modeling in Big Data environments.
  • Expertise in OLTP, OLAP, dimensional modeling, and enterprise data warehouse practices.
  • Proficient in modeling methodologies including Kimball, Inmon, and Data Vault.
  • Hands-on experience with modeling tools such as ER/Studio, ERwin, PowerDesigner, SQLDBM, dbt, or Lucidchart.
  • Proven experience in Databricks with Unity Catalog and Delta Lake.
  • Familiarity with modern storage formats like Parquet and ORC.
  • Strong command of SQL and Apache Spark for querying and transformation.
  • Hands-on experience with the Azure Data Platform, including Azure Data Factory, Azure Data Lake Storage, Azure Synapse Analytics, and Azure SQL Database.
  • Exposure to Azure Purview or similar data cataloging tools.
  • Strong communication and documentation skills, with the ability to work in cross-functional agile environments.

Qualifications

  • Bachelor's or Master's degree in Computer Science, Information Systems, Data Engineering, or related field.
  • Certifications such as Microsoft DP-203: Data Engineering on Microsoft Azure.
  • Experience working in agile/scrum environments.
  • Exposure to enterprise data security and regulatory compliance frameworks (e.g., GDPR, HIPAA) is a plus.