We are seeking an experienced Datamart / Semantic Layer Developer to build and implement business‑oriented datamarts and semantic layers on Teradata EDW, CDP Hive, and Trino platforms. The ideal candidate possesses strong SQL development skills, dimensional modeling knowledge, telecommunications domain expertise, and the ability to translate technical specifications into optimized analytics solutions.
Core Responsibilities
- Datamart Development – Develop and implement star schema and snowflake schema dimensional models on Teradata EDW. Build subject‑area datamarts (Customer, Revenue, Network, Product, Finance) based on design specifications. Create and optimize fact tables, dimension tables, bridge tables, and aggregate tables. Implement slowly changing dimensions (SCD Types 1, 2, 3) logic and dimensional hierarchies. Develop complex SQL queries, stored procedures, and views for datamart population. Implement data transformation and aggregation logic for business metrics and KPIs.
- Semantic Layer Development – Build semantic layers using TIBCO Data Virtualization on Teradata and CDP platforms. Create semantic models using Trino for distributed query processing and data access. Create virtual views, materialized views, and business‑friendly data abstractions. Implement business logic, calculated measures, KPIs, and derived metrics in the semantic layer. Develop data access policies, row‑level security, and governance rules. Optimize semantic layer performance through caching, indexing, and query optimization.
- Multi‑Platform Development – Work across Teradata, CDP Hive, and Trino platforms for datamart and semantic layer implementation. Develop HiveQL queries and tables in CDP (Cloudera Data Platform) environment. Integrate data from Teradata EDW and CDP Hive through Trino for unified semantic access. Create cross‑platform queries and federated views using Trino connectors. Implement partitioning, bucketing, and optimization strategies in Hive tables.
- Implementation & Optimization – Translate design documents (HLD, LLD) and mapping specifications into SQL code. Develop ETL/ELT processes to populate datamarts from EDW sources. Optimize query performance using indexing (PI, SI, NUSI), statistics, partitioning, and aggregations. Conduct unit testing, data validation, and reconciliation between source and target. Debug and troubleshoot performance issues in datamarts and semantic layers.
- Collaboration & Documentation – Work closely with datamart designers, EDW developers, BI teams, and business analysts. Implement business requirements and KPI calculations as per specifications. Create technical documentation: SQL scripts, deployment guides, data lineage. Support UAT activities and assist business users in validating data accuracy. Provide production support and resolve data or performance issues.
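The SCD Type 2 logic mentioned under Datamart Development can be sketched as a two‑step load. This is an illustrative ANSI‑SQL example only, with hypothetical table names (`dim_customer`, `stg_customer`) and tracked attributes (`segment`, `region`); the surrogate key `customer_sk` is assumed to be an identity column populated automatically:

```sql
-- Step 1: expire the current dimension row when a tracked attribute changes.
UPDATE dim_customer
SET    end_date   = CURRENT_DATE - 1,
       is_current = 'N'
WHERE  is_current = 'Y'
  AND  EXISTS (
         SELECT 1
         FROM   stg_customer s
         WHERE  s.customer_id = dim_customer.customer_id
           AND  (s.segment <> dim_customer.segment
                 OR s.region <> dim_customer.region)
       );

-- Step 2: insert a new "current" row for new customers and for customers
-- whose previous row was just expired (neither has an is_current = 'Y' row now).
INSERT INTO dim_customer
       (customer_id, segment, region, start_date, end_date, is_current)
SELECT s.customer_id, s.segment, s.region,
       CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM   stg_customer s
LEFT   JOIN dim_customer d
       ON  d.customer_id = s.customer_id
       AND d.is_current  = 'Y'
WHERE  d.customer_id IS NULL;
```

The two statements must run in this order within one load cycle; SCD Types 1 and 3 replace the expire‑and‑insert pattern with an in‑place update or an added "previous value" column, respectively.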
Required Skills
- Teradata SQL & Development – Advanced SQL development, stored procedures, performance tuning, and Teradata utilities (BTEQ, TPT). Strong understanding of Teradata architecture, indexing (PI, SI, NUSI), partitioning, and statistics collection.
- CDP Hive – HiveQL development, table creation, partitioning, bucketing, and optimization in Cloudera environment.
- Trino (formerly PrestoSQL) – SQL development using Trino, federated queries, and connector configuration.
- Expert‑level SQL across multiple platforms for complex queries and transformations.
- Oracle SQL and PL/SQL development experience.
- Semantic Layer & Tools – Hands‑on development experience with TIBCO Data Virtualization. Experience creating virtual views, business views, and semantic models. Understanding of data virtualization concepts and query federation. Knowledge of BI tool integration with semantic layers.
- Dimensional Modeling – Star and snowflake schema dimensional models, fact table design, dimension design, SCD implementations. Ability to translate dimensional models into physical database objects.
- Telecommunications Domain – Understanding of telecom business processes, KPIs, and data flows.
- OSS & BSS – OSS metrics: network performance, inventory, and fault management. BSS metrics: billing, customer analytics, revenue, churn, and product performance.
- Professional Skills – Full SDLC experience (Agile/Scrum, Waterfall). Analytical and debugging skills for performance troubleshooting. Good communication for technical collaboration. Unix/Linux scripting for automation.
- Version Control – Git, SVN.
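The federated‑query skill listed above can be illustrated with a single Trino statement that joins Teradata EDW and CDP Hive data. Catalog, schema, and table names here are hypothetical and depend entirely on the connector configuration (assumed catalogs: `teradata` and `hive`):

```sql
-- Illustrative Trino federated query: network usage from Hive
-- joined to customer and revenue data in Teradata.
SELECT d.region,
       SUM(f.usage_mb)    AS total_usage_mb,
       SUM(r.revenue_amt) AS total_revenue
FROM   hive.usage_mart.fact_daily_usage f
JOIN   teradata.edw.dim_customer d
       ON f.customer_sk = d.customer_sk
JOIN   teradata.edw.fact_revenue r
       ON  f.customer_sk = r.customer_sk
       AND f.usage_date  = r.revenue_date
WHERE  f.usage_date >= DATE '2024-01-01'
GROUP  BY d.region;
```

Trino pushes filters and projections down to each connector where possible, so predicate placement (such as the `usage_date` filter) materially affects cross‑platform query cost.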
Preferred Qualifications
- Bachelor's degree in Computer Science, Information Technology, or related field.
- Experience with data profiling and data quality tools.
- Knowledge of ETL tools (Ab Initio, Informatica).
- Understanding of data governance and metadata management.
- Experience with BI tools: Tableau, Power BI, Qlik.
Key Deliverables
- Developed and deployed datamarts (star/snowflake schema) on Teradata.
- Semantic layer implementations using TIBCO and Trino with business views and virtual tables.
- Optimized SQL code, stored procedures, and views for datamarts.
- HiveQL scripts and tables in CDP environment.
- Technical documentation: SQL scripts, deployment guides, data lineage.
- Unit‑tested code with data validation and reconciliation reports.
- Performance‑tuning recommendations and optimization implementations.
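As an illustration of the HiveQL deliverables above, a partitioned and bucketed fact table in CDP might be defined as follows. Table and column names are hypothetical, and the partition/bucket choices are examples of the optimization strategies the role calls for, not a prescribed design:

```sql
-- Illustrative CDP Hive DDL: daily usage fact, partitioned by date
-- for partition pruning, bucketed by customer key for efficient joins.
CREATE TABLE IF NOT EXISTS usage_mart.fact_daily_usage (
    customer_sk BIGINT,
    cell_id     STRING,
    usage_mb    DOUBLE
)
PARTITIONED BY (usage_date DATE)
CLUSTERED BY (customer_sk) INTO 64 BUCKETS
STORED AS ORC;
```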