DataOps Engineer

Citylogix

Remote

CAD 85,000 - 120,000

Full time

Job summary

A data analytics firm in Canada is seeking a DataOps Engineer to develop sophisticated data processing pipelines and ensure robust performance. The successful candidate will design scalable ETL/ELT workflows using big data frameworks like Spark and Hadoop, along with workflow orchestration tools such as Airflow or Prefect. This role offers the opportunity to integrate diverse data sources and create monitoring solutions within a dynamic and collaborative environment.

Tools

Apache Spark
Hadoop
Airflow
Prefect

Job description

Citylogix is a leading provider of data and analytics for smart city transportation infrastructure. The company leverages LiDAR, 360° imaging, and AI-powered analytics to create detailed digital maps and deliver predictive insights for proactive asset management.

About the role

The DataOps Engineer position is central to developing sophisticated data processing pipelines that handle diverse data formats and volumes. This role involves designing and implementing scalable ETL/ELT workflows using modern big data frameworks and orchestration tools. The engineer will create reusable components and optimize processing efficiency to meet demanding performance requirements. The position requires deep technical expertise in distributed systems and data processing patterns. The successful candidate will work at the intersection of data engineering and operations, ensuring pipelines are both functionally robust and operationally excellent.
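
The posting does not prescribe implementation details beyond the named frameworks, but a minimal PySpark sketch of the kind of ETL/ELT workflow described above could look like the following; the bucket paths, sensor-event schema, and column names are hypothetical:

```python
# Minimal PySpark ETL sketch: ingest raw CSV sensor events, normalize them,
# and write partitioned Parquet. Paths, schema, and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("sensor-event-etl").getOrCreate()

# Extract: read the raw CSV feed from the landing zone
raw = spark.read.option("header", True).csv("s3://raw-zone/sensor-events/")

# Transform: cast timestamps, drop bad rows, derive a partition column,
# and deduplicate on the event key
clean = (
    raw.withColumn("event_ts", F.to_timestamp("event_ts"))
       .filter(F.col("event_ts").isNotNull())
       .withColumn("event_date", F.to_date("event_ts"))
       .dropDuplicates(["event_id"])
)

# Load: write analytics-ready Parquet, partitioned for efficient scans
clean.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://curated-zone/sensor-events/"
)
```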

What you'll do

  • Implement orchestration with Airflow/Prefect (see the DAG sketch after this list)
  • Build reusable components and shared libraries
  • Optimize processing performance and costs
  • Manage schema evolution and data versioning
  • Integrate diverse data sources and formats
  • Create monitoring dashboards and metrics
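
For the orchestration responsibility above, a minimal Airflow DAG sketch might look like the following. The DAG id, schedule, and task bodies are hypothetical placeholders, Prefect would express the same dependencies as a flow, and the `schedule` argument assumes Airflow 2.4+:

```python
# Minimal Airflow DAG sketch: a daily extract -> transform -> metrics chain.
# DAG id, schedule, and task callables are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    ...  # pull raw files from the landing zone

def transform():
    ...  # run the Spark job that cleans and partitions the data

def publish_metrics():
    ...  # push row counts and latency to the monitoring dashboard

with DAG(
    dag_id="sensor_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_metrics = PythonOperator(
        task_id="publish_metrics", python_callable=publish_metrics
    )

    # Declare task ordering: extract feeds transform, which feeds metrics
    t_extract >> t_transform >> t_metrics
```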

Qualifications

  • 3+ years building production data pipelines
  • Expertise in big data frameworks such as Spark and Hadoop
  • Experience with workflow orchestration tools like Airflow or Prefect