
Data Engineer

Vantix Systems Inc.

Edmonton

On-site

CAD 80,000 - 110,000

Full-time

Yesterday

Job summary

A leading technology company is seeking a Data Engineer to design and maintain data pipelines, optimize dimensional models, and ensure data quality. The role involves collaborating with engineers and analysts to integrate data and develop insightful analytics solutions. Candidates should have a Bachelor's degree in Computer Science or related fields, along with substantial experience in data engineering, analytics, and developing dashboards using tools like Power BI. Experience with Azure Data Factory and DAX is essential.

Qualifications

  • Bachelor's degree in Computer Science, IT, or a related field.
  • 3+ years designing efficient dimensional models (star and snowflake schemas).
  • 3+ years ensuring data quality, security, and governance.
  • 3+ years of experience with Microsoft tabular models and DAX.
  • 5+ years as a Data Analyst, Data Engineer, or in a similar role.
  • 2+ years using Git, CI/CD pipelines, and containerization.
  • 3+ years developing dashboards and reports.
  • 5+ years extracting and integrating data from diverse sources.
  • 3+ years of experience with SSIS and Azure Data Factory.

Responsibilities

  • Design, build, and maintain data pipelines on-premises and in the cloud.
  • Create and optimize dimensional models for improved query performance.
  • Integrate data from SQL, NoSQL, APIs, and files.
  • Improve ETL/ELT processes for efficiency and scalability.
  • Manage data lakes and warehouses with proper governance.

Skills

Data analysis
Dimensional modeling
DAX
Python
Statistical methods
Data quality assurance
SSIS
Azure Data Factory

Education

Bachelor's degree in Computer Science, IT, or a related field

Tools

Power BI
Git
Docker
Kubernetes
Terraform
CloudFormation

Job description

Data Engineering

  • Design, build, and maintain data pipelines on-premises and in the cloud (Azure, GCP, AWS) to ingest, transform, and store large datasets. Ensure pipelines are reliable and support multiple business use cases.
  • Create and optimize dimensional models (star/snowflake) to improve query performance and reporting. Ensure models are consistent, scalable, and easy for analysts to use.
  • Integrate data from SQL, NoSQL, APIs, and files while maintaining accuracy and completeness. Apply validation checks and monitoring to ensure high‑quality data.
  • Improve ETL/ELT processes for efficiency and scalability. Redesign workflows to remove bottlenecks and handle large, disconnected datasets.
  • Build and maintain end-to-end ETL/ELT pipelines with SSIS and Azure Data Factory. Implement error handling, logging, and scheduling for dependable operations.
  • Automate deployment, testing, and monitoring of ETL workflows through CI/CD pipelines. Integrate releases into regular deployment cycles for faster, safer updates.
  • Manage data lakes and warehouses with proper governance. Apply security best practices, including access controls and encryption.
  • Partner with engineers, analysts, and stakeholders to translate requirements into solutions. Prepare curated data marts and fact/dimension tables to support self‑service analytics.
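To give candidates a concrete feel for the validation and monitoring work described above, here is a minimal sketch of a row-level quality check in an ingestion pipeline. It is illustrative only, written in Python (one of the role's listed skills); the field names and rules are invented, not taken from any actual system at Vantix.

```python
from datetime import datetime

# Hypothetical sketch of a validation step in an ingestion pipeline.
# Field names and rules are invented for illustration.
REQUIRED_FIELDS = {"order_id", "customer_id", "amount", "order_date"}

def validate_row(row: dict) -> list[str]:
    """Return a list of validation errors for one ingested record."""
    errors = []
    missing = REQUIRED_FIELDS - row.keys()
    if missing:
        errors.append(f"missing fields: {sorted(missing)}")
    if "amount" in row and not isinstance(row["amount"], (int, float)):
        errors.append("amount is not numeric")
    if "order_date" in row:
        try:
            datetime.fromisoformat(str(row["order_date"]))
        except ValueError:
            errors.append("order_date is not ISO-8601")
    return errors

def split_valid(rows: list[dict]) -> tuple[list, list]:
    """Partition rows into clean records and rejects paired with their errors."""
    valid, rejected = [], []
    for row in rows:
        errs = validate_row(row)
        if errs:
            rejected.append((row, errs))
        else:
            valid.append(row)
    return valid, rejected
```

In a production pipeline the reject list would typically be routed to a quarantine table and surfaced through monitoring, rather than silently dropped.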

Data Analytics

  • Analyze datasets to identify trends, patterns, and anomalies. Use statistical methods, DAX, Python, and R to generate insights that inform business strategies.
  • Develop interactive dashboards and reports in Power BI using DAX for calculated columns and measures. Track key performance metrics, share service dashboards, and present results effectively.
  • Build predictive or descriptive models using statistical and machine-learning methods in Python or R. Design and integrate data models to improve service delivery.
  • Present findings to non-technical audiences in clear, actionable terms. Translate complex data into business-focused insights and recommendations.
  • Deliver analytics solutions iteratively in an Agile environment. Mentor teams to enhance analytics fluency and support self‑service capabilities.
  • Provide data-driven evidence to guide corporate priorities. Ensure strategies and initiatives are backed by strong analysis, visualizations, and models.
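As a small taste of the measure-building work described above (which in Power BI would be expressed in DAX with functions such as CALCULATE and time intelligence), the same idea can be sketched in plain Python. Every table and column name here is hypothetical, purely for illustration.

```python
from collections import defaultdict

# Hypothetical sketch: a year-over-year growth "measure" over a tiny fact table.
# Real work in this role would express this in DAX or a warehouse query.
def total_by_year(sales: list) -> dict:
    """Sum sale amounts per year from (year, amount) fact rows."""
    totals = defaultdict(float)
    for year, amount in sales:
        totals[year] += amount
    return dict(totals)

def yoy_growth(sales: list, year: int):
    """Growth rate of `year` vs the prior year, or None if either total is absent."""
    totals = total_by_year(sales)
    prev = totals.get(year - 1)
    curr = totals.get(year)
    if prev in (None, 0) or curr is None:
        return None
    return (curr - prev) / prev
```

The DAX equivalent would define the prior-year total as a separate measure and divide, which is the pattern the dashboarding duties above refer to.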

Qualifications

  • Bachelor's degree in Computer Science, IT, or a related field of study.
  • Designing efficient dimensional models (star and snowflake schemas) for warehousing and analytics. (3+ years)
  • Ensuring data quality, security, and governance. (3+ years)
  • Experience with, and technical knowledge of, Microsoft tabular models and DAX. (3+ years)
  • Experience as a Data Analyst, Data Engineer, or in a similar role. (5+ years)
  • Experience using Git, collaborative workflows, CI/CD pipelines, containerization (Docker/Kubernetes), and Infrastructure as Code (Terraform, ARM, CloudFormation) to deploy and migrate data solutions. (2+ years)
  • Experience developing dashboards and reports. (3+ years)
  • Experience with manipulating and extracting data from diverse on-premises and cloud-based sources. (5+ years)
  • Experience with SSIS, Azure Data Factory (ADF), and using APIs for extracting and integrating data across multiple platforms and applications. (3+ years)
  • Experience working with Power BI. (3+ years)
  • Performing migrations across on-premises, cloud, and cross-database environments. (2+ years)