A. THE DELIVERABLES
1.1 Background
As our client, a crown corporation, continues to expand its data-driven culture, the role of Data Engineers is critical in enabling scalable, secure, and high-performance data solutions. Data Engineers work closely with business units and analytics professionals to design, build, and maintain data pipelines and architectures that support advanced analytics, reporting, and operational systems. This includes integrating data from legacy systems, cloud platforms, and modern data lakehouse environments to ensure data is accessible, reliable, and optimized for use.
1.2 Description of Requirements
To be considered, the Data Engineer resource MUST have a minimum of 5 years of recent experience (within the last 7 years) in modern data management principles, including but not limited to ETL, practical data design, architecture, management, modelling, quality, and analytics. The successful candidate will demonstrate a broad and solid understanding of these principles.
The successful Data Engineer candidate will have:
- Degree or Diploma in Computer Science, Engineering, Data Sciences, or a quantitative discipline
- 5+ years’ recent experience in ETL, data design, data architecture, data management, and data modeling
- Relevant job experience in North America
Core Technical Skills:
- SQL Server & SSIS: Expert proficiency with SQL Server (on-premises), including stored procedures, and SSIS package-level deployment.
- Data Pipelines: Proven experience designing, creating, and maintaining robust data pipelines and ETL processes.
- Monitoring: Skilled in monitoring and troubleshooting database issues to ensure compliance with policies and regulations.
- Python for ETL: Advanced Python skills applied to developing ETL processes following software development best practices (including automated testing and code reviews).
- Big Data Tools: Proficient in leveraging big data technologies, including PySpark and SparkSQL for large-scale data processing.
- Cloud Expertise: Hands-on experience with cloud-based platforms such as Databricks, Azure Data Factory, and Azure Data Lake.
- Lakehouse Architecture: Knowledgeable in implementing lakehouse architectures using Delta format and optimization strategies (see the first sketch after this list).
- API Integration: Experience working with external third-party APIs as ETL sources, including Microsoft Graph APIs to automate tasks across Microsoft services (see the second sketch after this list).
- Automation & Deployment: Familiar with CI/CD processes and tools—including Databricks asset bundles (DABs) for managing workflows—and proficient with version control systems (e.g., Git) for ETL deployments.
Data Management & Communication:
- Understanding of data management principles as outlined by DAMA/DMBoK
- Ability to provide insights on evolving database integration, storage, and utilization needs
- Overseeing integration from legacy/on-premises systems to new solutions
- Clear communication of technical information and the ability to train/support staff
- Knowledge of data privacy and confidentiality regulations (e.g., PIPEDA)
Typical activities in this role include, but are not limited to:
- Work closely with the Enterprise Analytics team to create and maintain ELT processes;
- Assemble large, complex datasets that meet functional and non-functional business requirements;
- Consult DT&S and business leaders on data and information management practices and governance;
- Identify, design, and implement internal process improvements, such as automating manual processes, optimizing data delivery, and re-designing infrastructure for greater scalability;
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and other technologies (see the sketch after this list);
- Work with stakeholders, including the Executive, to assist with data-related technical issues and support their data infrastructure needs;
- Create data tools for analytics and data science team members that assist them in building and optimizing our business;
- Work with data and analytics experts to strive for greater functionality in our data systems;
- Champion efforts to improve business performance through enterprise information capabilities, such as master data management (MDM), metadata management, analytics, content management, data integration, and related data management or data infrastructure;
- Provide insight into the changing database integration, storage, and utilization requirements for the company and offer suggestions for solutions;
- Monitor and understand Information Management trends and emerging technologies.
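
As a companion to the activities above, here is a minimal sketch of one extraction-and-load step from an on-premises SQL Server source into a Delta table, assuming a Spark environment (such as Databricks) with the SQL Server JDBC driver available; all connection details below are placeholders.

```python
# Hypothetical sketch: extracting an on-premises SQL Server table into the
# lakehouse via JDBC. Hostname, database, table, and credentials are placeholders;
# in practice the password would come from a secret scope, not the code.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

jdbc_url = "jdbc:sqlserver://onprem-sql.example.corp:1433;databaseName=Claims"

# Extract: read a source table through the Microsoft SQL Server JDBC driver.
policies = (
    spark.read.format("jdbc")
    .option("url", jdbc_url)
    .option("dbtable", "dbo.Policies")
    .option("user", "etl_service")
    .option("password", "<from-secret-scope>")
    .option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
    .load()
)

# Load: land the extract as a bronze Delta table for downstream transformation.
policies.write.format("delta").mode("overwrite").saveAsTable("bronze.policies")
```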