Data Systems Analyst (D365 F&O, Azure Synapse, Fabric)

PrecisionERP / PrecisionIT

Ottawa

On-site

CAD 80,000 - 100,000

Full time


Job summary

A technology consulting firm in Canada is seeking a qualified Data Systems Analyst to work on D365 programs and enhance enterprise reporting and planning automation. The role requires more than 5 years of experience in data engineering, specifically with D365 F&O and Azure Synapse. Responsibilities include designing robust data pipelines, developing high-complexity financial reports, and collaborating with finance stakeholders. The contract runs 6 months with potential extensions.

Qualifications

  • 5+ years in data engineering / BI with D365 F&O exposure.
  • Strong SQL and data modeling skills.
  • Hands-on Power BI experience.

Responsibilities

  • Configure and optimize D365 F&O to Azure Synapse Link.
  • Build ELT pipelines in Fabric and Azure Synapse.
  • Partner with Finance for BPA reports and BPP cubes.

Skills

Data engineering
Business Intelligence (BI)
SQL (T-SQL / Spark SQL)
Power BI
Data modeling
Azure DevOps
Communication skills

Tools

Azure Synapse
D365 F&O
Power Query
Python / PySpark

Job description

PrecisionERP / PrecisionIT is seeking a qualified Data Systems Analyst to work on our client's D365 program. We're looking for a hands-on consultant who blends data engineering, analytics, and applied data science to accelerate enterprise reporting and planning automation.

You will be the technical backbone across D365 F&O, Synapse Link, BPA / BPP, Azure Synapse, and Fabric, designing robust pipelines, models, and reports that advance our maturity from diagnostic to predictive / prescriptive analytics.

The D365 program is moving reporting from on-premises systems to the D365 F&O data platform and Microsoft Fabric, with Business Performance Planning (BPP) and Business Performance Analytics (BPA) on top and Synapse Link plus Microsoft Fabric as the core.

Contract duration: 6 months with potential for extension

Eligibility for a federal security clearance is required

Tasks: D365 & Synapse Link
  • Configure and optimize D365 F&O → Azure Synapse Link (tables, change feeds, performance, known limitations).
  • Ensure reliable replication to Data Lake / Fabric Lakehouse; validate schemas, keys, and deltas.
Fabric & Azure Synapse Engineering
  • Build ELT pipelines in Fabric (Dataflows Gen2, Pipelines), Lakehouse / Warehouse, Spark / SQL and / or Azure Synapse.
  • Model conformed / serving layers for Finance, Operations, and Commerce domains.
  • Implement Direct Lake and Import / Composite models as appropriate.
BPA / BPP Reporting & Planning
  • Partner with Finance to deliver BPA reports and BPP cubes / workflows (budgeting / forecasting, departmental roll-ups).
  • Develop high-complexity financial reports (P&L by dimensions, contribution margin, variance / driver analysis).
Analytics & Data Quality
  • Prepare / transform data for analysis; create Power BI datasets, measures (DAX), and visuals for decision-making.
  • Apply statistical / ML methods (forecasting, A / B testing) where they add clear value.
  • Document data flows, definitions, lineage, and metadata in the data catalog; contribute to governance and continuous improvement.
DevOps & Ways of Working
  • Use Azure DevOps for user stories, CI / CD (Power BI / Fabric), testing, and release management across DEV / TEST / PROD.
  • Support UAT, cutover readiness, and knowledge transfer to internal teams.
Requirements
  • 5+ years in data engineering / BI with D365 F&O exposure and at least one production Synapse Link for D365 implementation.
  • Strong SQL (T-SQL / Spark SQL), data modeling (star schema for finance), and ELT on Fabric and / or Azure Synapse.
  • Hands‑on Power BI (Power Query, DAX, model security), including performance tuning and enterprise deployment patterns.
  • Practical experience with BPA / BPP (or equivalent EPM tools) supporting budgeting / forecasting workflows and financial reporting.
  • Version control and CI / CD (Azure DevOps), environment promotion, and testing practices.
  • Excellent communication with Finance / Operations / Commerce stakeholders and IT—able to explain trade‑offs and document clearly.
Nice to have (Assets)
  • Experience with Direct Lake semantic models and Fabric Warehouse vs. Lakehouse design choices.
  • Python / PySpark for transformations; performance tuning of large fact tables / slowly changing dimensions.
  • Data governance tools (e.g., Purview) and KPI / metrics standardization.
  • Forecasting methods (classical time series or ML) applied to finance, commerce, and operations datasets.
  • Public‑sector or Crown corporation experience.