Senior Data Engineer - Remote / Telecommute

Cynet Systems Inc.

Remote

CAD 80,000 - 110,000

Full time

Posted today

Job summary

A technology solutions company based in Canada seeks a Senior Data Engineer to design and maintain scalable data solutions. The role involves building reliable data pipelines, ensuring data quality, and developing interactive dashboards with Power BI. Candidates should hold a Bachelor's degree in Computer Science or a related field and have experience in data engineering and ETL processes. Knowledge of DAX and strong analytical skills are essential. The position spans both on-premises and cloud environments.

Qualifications

  • Bachelor’s degree in Computer Science, Information Technology, or a related field.
  • Experience designing efficient dimensional models for data warehousing and analytics.
  • Strong experience with Microsoft tabular models and DAX.
  • Experience developing dashboards and reports.
  • Extensive experience in data engineering, analytics, ETL development, and cloud-based data solutions.

Responsibilities

  • Design and build data pipelines on-premises and in cloud platforms.
  • Integrate data from SQL, NoSQL, and APIs while maintaining accuracy.
  • Create and optimize dimensional models for reporting.
  • Develop interactive dashboards using Power BI.

Skills

Data engineering
Data analytics
ETL processes
Power BI
DAX
Statistical analysis

Education

Bachelor's degree in Computer Science
Bachelor's degree in Information Technology

Tools

SSIS
Azure Data Factory
PostgreSQL
MongoDB
Azure Synapse
Talend

Job description
  • The Senior Data Engineer is responsible for designing, building, and maintaining scalable data engineering and analytics solutions across on-premises and cloud environments.
  • This role supports data ingestion, transformation, governance, analytics, and reporting to enable data-driven decision-making across the organization.
Responsibilities
  • Design, build, and maintain data pipelines on-premises and in cloud platforms to ingest, transform, and store large datasets.
  • Ensure data pipelines are reliable and support multiple business use cases.
  • Create and optimize dimensional models using star and snowflake schemas to improve query performance and reporting.
  • Integrate data from SQL, NoSQL, APIs, and file-based sources while maintaining data accuracy and completeness.
  • Apply validation checks, monitoring, and logging to ensure high-quality data.
  • Improve ETL and ELT processes for efficiency and scalability by redesigning workflows and removing bottlenecks.
  • Build and maintain end-to-end ETL and ELT pipelines using SSIS and Azure Data Factory.
  • Implement error handling, scheduling, and monitoring for dependable operations.
  • Automate deployment, testing, and monitoring of ETL workflows using CI/CD pipelines.
  • Manage data lakes and data warehouses with appropriate governance, security controls, and encryption practices.
  • Collaborate with engineers, analysts, and stakeholders to translate business requirements into technical solutions.
  • Prepare curated data marts and fact and dimension tables to support self-service analytics.
  • Analyze datasets to identify trends, patterns, and anomalies using statistical methods, DAX, Python, and R.
  • Develop interactive dashboards and reports using Power BI and DAX.
  • Build predictive and descriptive models using statistical and machine learning techniques.
  • Present analytical findings to non-technical stakeholders in clear, actionable terms.
  • Deliver analytics solutions iteratively in an Agile environment and mentor teams to improve analytics fluency.
Requirements / Must Have
  • Bachelor’s degree in Computer Science, Information Technology, or a related field.
  • Experience designing efficient dimensional models for data warehousing and analytics.
  • Experience ensuring data quality, security, and governance.
  • Strong experience with Microsoft tabular models and DAX.
  • Experience working as a Data Engineer, Data Analyst, or in a similar role.
  • Experience using Git, collaborative workflows, CI/CD pipelines, containerization, and infrastructure-as-code tools.
  • Experience developing dashboards and reports.
  • Experience extracting and manipulating data from diverse on-premises and cloud-based sources.
  • Experience building ETL pipelines using SSIS, Azure Data Factory, and APIs.
  • Experience working with Power BI.
  • Experience performing migrations across on-premises, cloud, and cross-database environments.
Should Have
  • Experience with databases and data integration tools such as PostgreSQL, MongoDB, Azure Cosmos DB, Azure Synapse, and Talend.
  • Exposure to AI and machine learning tools and workflows within cloud platforms.
  • Experience with enterprise IT environments, shared services, or large-scale data platforms.
Experience
  • Extensive experience in data engineering, analytics, ETL development, and cloud-based data solutions.
Qualifications and Education
  • Bachelor’s degree in Computer Science, Information Technology, or a related field.