Business System Analyst - Reference Data
This role is with a large IT services firm, supporting the capital markets division of a major bank.
Candidates must have legal work status in Canada.
Client location: downtown Toronto, close to GO and TTC.
Hybrid: in-office 3 days per week.
Permanent/full-time role with salary and benefits, or contract (6 months to start) at market rate.
Description:
We’re seeking a detail-oriented Business System Analyst with deep GoldenSource reference data expertise in Capital Markets. In this role, you will partner with front office, data governance, and technology teams to define, source, curate, and deliver critical reference data (e.g. instrument identifiers, legal entity data, product hierarchies) for trading, risk, compliance, and reporting systems. Your combination of domain knowledge and technical acumen will ensure our reference data pipeline is accurate, complete, and scalable.
Key skills: Capital Markets reference data; GoldenSource (or a similar golden-source platform such as Alveo or SunGard MDM); market data feeds (Bloomberg, Refinitiv, etc.); downstream consumers and ETL/ESB layers; identifiers (ISIN, CUSIP, FIGI, SEDOL).
Multiple roles are open; all require Capital Markets reference data experience, and at least one requires GoldenSource.
Responsibilities:
• Platform Assessment & Roadmap
• Conduct periodic reviews of our GoldenSource (or equivalent) implementation for securities master data.
• Evaluate vendor-provided enhancement packs, patches and major version upgrades; recommend which to adopt based on business value, ROI and technical feasibility.
• Maintain a multi-year roadmap for platform evolution, balancing stability with new feature adoption.
• Requirements Gathering & Analysis
• Collaborate with front office, middle office and operations teams to identify gaps in current reference data processes.
• Elicit and document functional and non-functional requirements for platform enhancements (e.g. data model extensions, workflow automation, exception management).
• Translate business needs into detailed technical specifications and user stories.
• Technical Design & Integration
• Work with architects and developers to design integrations between GoldenSource, market data feeds (Bloomberg, Refinitiv, etc.), downstream consumers and ETL/ESB layers.
• Define data mapping, transformation and business rule configurations within the platform’s workflow engine.
• Ensure high availability and performant B2B data delivery (APIs, file feeds, message queues).
• Testing, Deployment & Support
• Develop and execute test plans (unit, integration, regression and UAT) for platform upgrades and configuration changes.
• Coordinate cutover activities, rollback procedures and post go live support.
• Troubleshoot data-quality issues and production incidents; lead root cause analysis and remediation.
• Data Governance & Quality
• Enforce reference data governance policies: stewardship assignments, data lineage documentation and KPI monitoring (completeness, accuracy, timeliness).
• Partner with data quality teams to define SLAs, exception-handling workflows and reconciliation routines.
• Stakeholder Communication & Training
• Prepare executive level impact assessments, upgrade benefit analyses and status dashboards.
• Conduct training sessions and workshops, and update user documentation on new features and best practices.
Requirements:
• 10+ years in Capital Markets reference data, specifically securities master and instrument data.
• In-depth understanding of identifiers (ISIN, CUSIP, FIGI, SEDOL), corporate actions and product taxonomies.
• Hands on experience configuring, upgrading and supporting GoldenSource (or similar golden source platform: Alveo, SunGard MDM, etc.).
• Strong knowledge of platform modules: data model, workflow engine, certification, business rules and distribution services.
• Proficient in SQL and relational data modeling.
• Familiarity with middleware/ETL tools (Informatica, MuleSoft, IBM DataStage) and API frameworks (REST/SOAP).
• Solid grasp of version control (Git), CI/CD pipelines and virtualization/containerization (Docker).
• Excellent problem-solving skills and a process-optimization mindset.
• Strong written and verbal communication: able to distill complex technical topics for business audiences.
Preferred, but not required:
• Prior experience with B2B data-delivery protocols (SFTP/SCP, MQ, FIX).
• Scripting expertise (Python, Shell) for automation and data-validation routines.
• Exposure to cloud data platforms (AWS, Azure) and container orchestration (Kubernetes).
• Certifications in GoldenSource, CBIP or CDMP.
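For candidates curious about the kind of data-validation routine this role involves, a minimal sketch of an ISIN check-digit validator (the function name is illustrative, not a client artifact) might look like:

```python
def is_valid_isin(isin: str) -> bool:
    """Validate an ISIN: 2-letter country code, 9 alphanumerics, 1 check digit.

    Letters are expanded to two digits (A=10 ... Z=35) and the Luhn
    algorithm is applied to the resulting digit string.
    """
    isin = isin.strip().upper()
    if len(isin) != 12 or not isin[:2].isalpha() or not isin.isalnum():
        return False
    # Expand each character via base-36 value, e.g. 'U' -> "30", '7' -> "7".
    digits = "".join(str(int(c, 36)) for c in isin)
    total = 0
    for i, ch in enumerate(reversed(digits)):
        d = int(ch)
        if i % 2 == 1:  # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0
```

For example, `is_valid_isin("US0378331005")` returns True, while altering the final check digit makes it return False.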