Overview
Our client is looking to hire a Microsoft Fabric Data Analyst on a direct‑placement or contract‑to‑hire basis. The position is full‑time and remote‑friendly, but requires quarterly travel to Appleton, WI (local candidates preferred). The analyst will work closely with the product owner (the sole in‑house data warehouse team member) and will be assigned to business units such as HR or Finance to gather requirements, dig into data sources, and build data models in Microsoft Fabric.
Job Description
The analyst helps accelerate a modern, AI‑centered data platform on Microsoft Fabric by translating business questions into reliable, decision‑ready insights. They partner daily with Finance, Operations, Customer Service, and HR to shape requirements, validate data, and deliver print‑friendly Power BI reports and self‑service datasets. They collaborate with Data Engineers to convert curated warehouse tables into trusted metrics, dashboards, and narratives across business units (Fulfillment, Traditional, Chemical, Transportation).
Must‑Haves
- SQL (strong): confidently join across multiple sources and very large tables; write efficient queries (filters before joins, window functions, CTEs); interpret query plans to tune performance.
- Data sourcing & mapping: define and maintain source‑to‑target mappings, conformed dimensions/keys, lineage, and KPI definitions; reconcile semantics across systems.
- Cross‑source storytelling: pull data from multiple systems to deliver a single, coherent narrative and self‑service dataset; shape a consistent, print‑friendly user experience end‑to‑end.
- Power BI: semantic modeling, DAX, RLS, usability, and visualization best practices.
- Business partnership: engage Finance, Operations, Customer Service, and HR to clarify definitions, prioritize needs, and iterate quickly.
- AI in the loop: use Copilot for Power BI/Fabric and other AI tools to speed analysis, documentation, and the end‑user experience.
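The SQL expectations above (CTEs, window functions, filtering before aggregating or joining) can be illustrated with a minimal, runnable sketch. The table and column names (`orders`, `unit`, `amount`) are hypothetical, and an in-memory SQLite database stands in for a Fabric warehouse purely so the query executes:

```python
import sqlite3

# In-memory SQLite stands in for a Fabric warehouse table; all names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (
    order_id   INTEGER PRIMARY KEY,
    unit       TEXT,
    order_date TEXT,
    amount     REAL
);
INSERT INTO orders VALUES
    (1, 'Fulfillment', '2025-01-05', 120.0),
    (2, 'Fulfillment', '2025-01-20', 200.0),
    (3, 'Chemical',    '2025-01-11',  80.0),
    (4, 'Chemical',    '2025-02-02', 150.0);
""")

# A CTE filters and aggregates first; a window function then ranks
# monthly totals within each business unit.
query = """
WITH monthly AS (
    SELECT unit,
           substr(order_date, 1, 7) AS month,
           SUM(amount)              AS total
    FROM orders
    WHERE order_date >= '2025-01-01'   -- filter before aggregating
    GROUP BY unit, month
)
SELECT unit, month, total,
       RANK() OVER (PARTITION BY unit ORDER BY total DESC) AS rk
FROM monthly
ORDER BY unit, rk;
"""
rows = conn.execute(query).fetchall()
for row in rows:
    print(row)
```

The same CTE-plus-window-function pattern carries over to T-SQL against Fabric Warehouse tables; only the connection layer differs.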
What You’ll Do
- Run structured discovery with Finance, Operations, Customer Service, and HR; translate goals into well‑defined KPIs, dimensional definitions, and acceptance criteria.
- Shape semantic models and Power BI datasets (relationships, DAX measures, hierarchies, RLS) aligned to curated warehouse/lakehouse tables delivered by Engineering.
- Create readable, print‑friendly reports and dashboards that answer the "so what?".
- Profile data quality, reconcile totals, trace lineage, and author performant queries to validate numbers and accelerate insight generation.
- Define test cases, execute UAT, reconcile to source systems (WMS/OMS/ERP/HRIS), and document known caveats and SLAs.
- Use Copilot/NLQ to draft measures, generate narrative summaries, create how‑to guides, and surface "explain this insight" experiences.
- Maintain data dictionaries, KPI definition sheets, and short "how to use this report" guides; run training for power users and frontline managers.
- Monitor usage, gather feedback, prioritize enhancements, and partner with Data Engineers to address data gaps or performance issues.
- Apply RLS/OLS, certify datasets, tag sensitivity, and contribute to catalog/lineage for trusted data.
Required Qualifications
- 3+ years in analytics, BI, or FP&A‑adjacent analyst roles producing executive‑ready dashboards and KPI packs.
- SQL (strong): confidently write joins, window functions, and CTEs; build performance‑minded validation queries.
- Power BI expertise: semantic modeling, DAX, RLS, bookmarks/drill‑through; design clean, business‑friendly visuals.
- Experience partnering with Finance, Operations, Customer Service, and/or HR; reconciling to ledgers, inventory/shipments, labor/time, and service metrics.
- Familiarity with Microsoft Fabric concepts (Warehouse/Lakehouse tables, Dataflows Gen2, OneLake) and how analysts consume curated data.
- Excellent communication: define terms precisely, write clear documentation, and facilitate decisive UAT.
- Evidence of using AI tools (Copilot for Power BI/Fabric, LLM‑assisted docs) to improve speed and quality.
Nice to Have
- Statistics for business analysis (variance, cohort, seasonality), A/B or test‑and‑learn literacy.
- Excel power‑user skills (Power Query, pivots) for ad‑hoc reconciliation.
- Light Python for data wrangling or reproducible analysis notebooks.
- Exposure to 3PL/logistics domains (WMS/OMS/ERP), SLA/reporting in operational environments.
Reporting & Structure
Reports to: IT Manager, Data & Analytics (aligned with the Data Engineer team).
Collaborates with: Data Engineers, Product Owners/BAs, and business leaders across Finance, Operations, Customer Service, and HR.
Key Details
- Experience Level: Intermediate
- Job Type & Location: Permanent, fully remote with occasional on‑site visits to Appleton, WI.
- Pay Range: $90,000 – $110,000 per year.
- Application Deadline: February 17, 2026.
About TEKsystems
We’re partners in transformation. We help clients activate ideas and solutions to take advantage of a new world of opportunity. TEKsystems is an Allegis Group company.
The company is an equal‑opportunity employer and will consider all applications without regard to race, sex, age, color, religion, national origin, veteran status, disability, sexual orientation, gender identity, genetic information, or any characteristic protected by law.