Job Description:
- Creating, enhancing, maintaining, and supporting data storage structures in formats suitable for consumption by analytics solutions.
- Automating data pipelines used to ingest, prepare, transform, and model data for analytics products.
- Creating, enhancing, maintaining, and supporting dashboards and reports.
- Creating, enhancing, maintaining, and supporting analytics environments and implementing new technology to improve performance, simplify architecture patterns, and reduce cloud hosting costs.
- Conducting knowledge transfer sessions and producing documentation for technical staff on architecting, designing, and implementing continuous-improvement enhancements to analytics solutions. Sessions will be held as needed, on a case-by-case basis, and will involve walkthroughs of documentation, code, and environment setups.
Experience and Skill Set Requirements:
Data Storage and Preparation - 35%:
- The candidate must demonstrate experience with Azure Storage, Azure Data Lake, Azure Databricks Lakehouse, and Azure Synapse structures in real-world implementations.
Data Pipelines - 35%:
- The candidate must demonstrate experience automating data pipelines using the appropriate Microsoft Azure platform technologies (SQL, Python, Databricks, and Azure Data Factory).
Data Analytics - 15%:
- The candidate must demonstrate experience building Power BI reports and dashboards.
Knowledge Transfer - 15%:
- The candidate must demonstrate experience conducting knowledge transfer sessions and building documentation for technical staff related to architecting, designing, and implementing end-to-end analytics solutions.