Senior Data Engineer
Our client is a dynamic Toronto-based IT consulting boutique that strives for excellence in customer service and delivery within the financial services and fintech industries. They focus on technology advisory services, application development, cloud computing, integration solutions, and enterprise big data and analytics. With a team of highly knowledgeable business and technical experts, they provide strategic, mission-critical solutions to their clients. They are looking for a Senior Data Engineer with experience and expertise in designing and implementing complex systems and solutions. You will be responsible for providing leadership and for designing and implementing enterprise-level systems and solutions for financial services clients in the Greater Toronto Area.
Company location: Downtown Toronto. Hybrid work environment (3 days per week in office). Permanent/FTE role. Salary: market rate plus benefits and performance bonus.
Responsibilities
- Understand business requirements for data and information in the financial services industry.
- Collaborate with cross-functional teams to deliver analytics solutions.
- Design and implement scalable data pipelines and ETL processes.
- Drive technology modernization and migration projects to modern platforms.
- Develop and maintain data models and data warehouse infrastructure.
- Develop high performing data pipelines in Microsoft SSIS, Azure Data Factory, or Databricks.
- Develop database stored procedures based on Microsoft T‑SQL and/or Oracle PL/SQL.
- Develop data processing modules in the following programming languages: C#, Java, Python.
- Perform data profiling and data quality audits.
- Optimize performance and reliability of data systems in cloud and on‑prem environments.
- Design automated unit-testing frameworks as per project requirements.
- Integrate pipelines with Continuous Integration/Delivery framework as needed.
- Produce estimates for analysis, design, development, and testing for data pipelines.
- Ensure compliance with business, data, and technical requirements.
- Ensure that client’s enterprise architecture standards, policies, and procedures are followed.
Requirements
- Minimum of a Bachelor's degree in Computer Science or Engineering.
- Minimum 5 years' experience in data engineering and ETL development.
- Strong problem-solving, communication, and collaboration skills.
- Strong knowledge of data analysis, database development, the data warehousing life cycle, and data integration methodologies (ETL, ELT, EAI, EII).
- Strong SQL knowledge.
- Strong knowledge of Python for data processing and automation.
- Hands‑on experience with cloud‑based data platforms (AWS or Azure).
- Familiarity with AWS tools such as SageMaker, AWS Unified Studio, and other data services, or with Azure Data Factory.
- Experience in tech modernization/migration projects.
- Familiarity with the capabilities of the various platform services (e.g., Compute, Storage, Web, Developer, Integration, Data, Security, Management) offered by the cloud vendors.
- Working knowledge of at least one data analytics tool (such as Microsoft Power BI).
- Knowledgeable in both relational and dimensional data modeling (both Kimball and Inmon approaches).
- Knowledge of data management, REST-oriented APIs, and Continuous Integration and Delivery (CI/CD) principles.
- Business knowledge of financial industries (i.e., one or more of retail banking, commercial banking, capital markets, wealth management, insurance, pension funds, and fintech).
- Ability to work independently and excel in a team environment.