
Business Data Engineer

Telesat

Ottawa

On-site

CAD 80,000–100,000

Full time

Yesterday

Job summary

A leading satellite communications provider in Ottawa seeks a Business Data Engineer. This hybrid role requires 2–5 years of experience in Data Engineering, focusing on building data pipelines and delivering analytics. The ideal candidate has strong skills in Databricks, Power BI, and SQL, along with proficiency in Python. Candidates must be able to work in Canada and obtain the necessary clearance under the Canadian Controlled Goods Program (CGP).

Qualifications

  • 2–5 years in a Data Engineering or Business Data Analysis role.
  • Strong experience with Databricks and Delta Lake.
  • Proficiency in Python or Scala for data processing.

Responsibilities

  • Design and maintain data pipelines for analysis and reporting.
  • Collaborate with teams to gather requirements and validate data.
  • Optimize SQL for data transformation and cost-efficient queries.

Skills

Databricks
Power BI
Advanced SQL
Python
Data Integration

Education

Bachelor’s degree in Computer Science or related

Tools

Fivetran
Scala

Job description

We are seeking a motivated Business Data Engineer with 2–5 years of experience in Lakehouse environments. This hybrid role combines the technical rigor of a Data Engineer with the business insight of a Data Analyst, building reliable data pipelines while partnering with stakeholders to deliver impactful analytics and reporting.

The ideal candidate has hands‑on experience with Databricks, Power BI, advanced SQL, Fivetran, and data integration from on‑prem and cloud systems (including Workday and Salesforce), along with a solid understanding of Python and Databricks Notebooks.

Key Responsibilities
  • Design, develop, and maintain data pipelines and workflows for ingestion, transformation, and delivery of clean, reliable business data for analysis and reporting.
  • Collaborate with business teams to gather requirements, perform data validation, and support UAT/demos.
  • Extract, integrate, and transform data from diverse systems including Workday, Salesforce, on‑prem and SaaS applications using APIs, JDBC/ODBC, and native/direct connections.
  • Write and optimize advanced SQL for data modeling, transformation, and cost‑efficient query execution.
  • Build and optimize Power BI datasets, models, and dashboards for business insights and performance tracking.
  • Use Databricks Notebooks with Python and/or Scala for data preparation, automation, and analysis.
  • Monitor and optimize compute resources and job performance for cost control and efficiency.
  • Document data pipelines, transformation logic, and architecture for transparency and maintainability.
Education and Experience
  • 2–5 years in a Data Engineering or Business Data Analysis role.
  • Strong hands‑on experience with Databricks (including Delta Lake, Spark SQL, and Notebooks).
  • Strong working knowledge of Power BI (data modeling, DAX, dashboard design, publishing).
  • Advanced SQL skills for large‑scale data transformation and optimization.
  • Proficiency in Python and/or Scala for data processing in Databricks.
  • Proven experience with Fivetran or similar ETL/ELT tools for automated data ingestion.
  • Experience integrating data from Business Applications like Workday and Salesforce (via APIs, reports, or connectors).
  • Ability to manage and transform data from on‑premises and cloud systems.
  • Strong communication skills with experience in business requirement gathering and data storytelling.
  • Bachelor’s degree in Computer Science, Data Engineering, Information Systems, Statistics, or a related field.
  • Relevant certifications (e.g., Databricks Certified Data Engineer, Microsoft Power BI Data Analyst, Workday Reporting Specialist) are a plus.
Preferred / Nice‑to‑Have
  • Fundamental knowledge of Apache Spark (architecture, RDDs, DataFrames, optimization).
  • Experience in query and compute cost optimization within Databricks or similar platforms.
  • Familiarity with data governance, security, and metadata management.
  • Exposure to CI/CD for data pipelines using Git or DevOps tools.
  • Experience with GenAI agents and/or machine learning (ML).
Decision Making and Supervision
  • Work under minimal supervision.
  • Make decisions and recommendations requiring analysis and interpretation within established procedures.
Working Conditions
  • Generally comfortable working conditions; occasional lifting and on‑site installations may be required.
  • Moderate visual concentration required when using a video display terminal.

The successful candidate must be able to work in Canada and obtain clearance under the Canadian Controlled Goods program (CGP).

