Sr. Data Engineer II

Enable International

Toronto

On-site

CAD 80,000 - 120,000

Full time

30+ days ago

Job summary

An established industry player is seeking a senior data engineer to strengthen its data platform team. This role involves designing scalable data solutions, leveraging advanced technologies such as Snowflake and Kubernetes. You will contribute to the development of high-quality, secure software that empowers domain teams within a dynamic data mesh ecosystem. With a focus on innovation and collaboration, you’ll mentor peers, lead projects, and drive improvements in engineering practices. If you are passionate about data engineering and eager to make a significant impact at a fast-growing company, this opportunity is perfect for you.

Benefits

Paid Time Off
Wellness Benefit
Comprehensive Insurance
Retirement Plan
Lucrative Bonus Plan
Equity Program
Career Growth
Free Food
Training
Pets

Qualifications

  • 5+ years of experience as a data engineer in cloud-based SaaS products.
  • Proficiency in Python and experience with Snowflake application development.

Responsibilities

  • Design and evolve data platform solutions within a Data Mesh architecture.
  • Develop and maintain production-level applications primarily using Python.

Skills

Problem-solving
Attention to detail
Communication skills
Collaboration
Ownership-driven mindset

Education

Bachelor’s degree in Computer Science or a related field (required)
Master’s degree in a relevant technical field (preferred)

Tools

Snowflake
Kubernetes
Python
Terraform
Git/GitHub
Azure
Airflow
Databricks

Job description

At Enable, we are transforming the supply chain with our cutting-edge rebate management software. We see rebates as a strategic advantage, strengthening partnerships, driving smarter decisions, and unlocking significant value across the entire supply chain – from manufacturers to consumers.

After securing $276M in Series A-D funding, we are positioned for continued, significant growth. Since the launch of our flagship product in 2016, we have rapidly scaled our client base and product offerings and built a team of top-tier talent committed to reshaping the industry.

Want a glimpse into life at Enable? Visit our Life at Enable page to learn how you can be part of our journey.

Job Summary

You’ll work as a senior voice within the data platform team, building and evolving the core tools, infrastructure, and processes that allow domain teams in our data mesh ecosystem to develop and maintain their own data products. You’ll ensure our data solutions (including Kubernetes-based deployments, Snowflake application development, and event-driven architectures) are reusable, standardized, and self-service for those teams, and you’ll contribute to the technical design, implementation, testing, deployment, and ongoing support and maintenance of our data platform on Snowflake and Azure. We go beyond simply implementing new features: we focus on customer experience and on building high-quality, secure, and highly scalable software. You’ll use and further develop your full range of skills, and help your colleagues grow theirs, including:

  • Problem-solving, including the ability and confidence to tackle complex data and platform challenges.
  • Peer code reviews to maintain quality, reliability, and security.
  • Modern big data architecture design, encompassing data orchestration and choreography.
  • Ability to prioritize and meet deadlines in a dynamic environment.
  • Attention to detail and solid written and verbal English communication skills.
  • Willingness and enthusiasm to work within existing processes and methodologies, while driving improvements where needed.

We want all our people to be whoever they want to be and are committed to creating a truly inclusive culture at Enable. We believe that bringing your full, authentic self to work helps us build the best quality software, and by creating a truly diverse workforce we bring innovation into everything we do.

This is a senior technical role focused on the development of our SaaS products—suited to a highly focused, ownership-driven engineer. You will regularly leverage Python, Kubernetes, and Snowflake for both data and application development. Development is a part of the role, but you’ll also be expected to contribute to all areas of our engineering work, including product and feature design, leading and mentoring peers, and helping us to continually improve.

You’ll have focused professional experience as a data engineer, preferably in cloud-based SaaS products. Ideally, you’ll have at least five years of experience, but we focus on skill and ability, not tenure.

Duties and Responsibilities - Architecture Design
  • Plan, design, and evolve data platform solutions within a Data Mesh architecture, ensuring decentralized data ownership and scalable, domain-oriented data pipelines.
  • Apply Domain-Driven Design (DDD) principles to model data, services, and pipelines around business domains, promoting clear boundaries and alignment with domain-specific requirements.
  • Collaborate with stakeholders to translate business needs into robust, sustainable data architecture patterns.
Duties and Responsibilities - Software Development & DevOps
  • Develop and maintain production-level applications primarily using Python (Pandas, PySpark, Snowpark), with the option to leverage other languages (e.g., C#) as needed; an illustrative sketch follows this list.
  • Implement and optimize DevOps workflows, including Git/GitHub, CI/CD pipelines, and infrastructure-as-code (Terraform), to streamline development and delivery processes.
  • Containerize and deploy data and application workloads on Kubernetes, leveraging KEDA for event-driven autoscaling and ensuring reliability, efficiency, and high availability.
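To give a concrete flavor of the Python and Snowpark development described above, here is a minimal sketch of a Snowpark transformation job. It is illustrative only and not taken from Enable's codebase; the connection settings and table names (RAW.ORDERS, ANALYTICS.DAILY_ORDER_TOTALS) are hypothetical placeholders.

```python
# Illustrative sketch only: a minimal Snowpark (Python) transformation job.
# Connection settings and table names are hypothetical placeholders.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, sum as sum_, to_date


def build_daily_totals(session: Session) -> None:
    # Read a (hypothetical) raw table already registered in Snowflake.
    orders = session.table("RAW.ORDERS")

    # Keep only completed orders and aggregate order amounts per day.
    daily = (
        orders
        .filter(col("STATUS") == "COMPLETED")
        .group_by(to_date(col("ORDER_TS")).alias("ORDER_DATE"))
        .agg(sum_(col("AMOUNT")).alias("TOTAL_AMOUNT"))
    )

    # Persist the result as a managed table for downstream data products.
    daily.write.mode("overwrite").save_as_table("ANALYTICS.DAILY_ORDER_TOTALS")


if __name__ == "__main__":
    # Credentials would normally come from a secrets store, not literal values.
    session = Session.builder.configs({
        "account": "<account>",
        "user": "<user>",
        "password": "<password>",
        "warehouse": "<warehouse>",
        "database": "<database>",
    }).create()
    try:
        build_daily_totals(session)
    finally:
        session.close()
```

In practice, a job like this would be shipped through the Git/GitHub, CI/CD, and Terraform workflows listed above and run as a containerized workload on Kubernetes.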
Duties and Responsibilities - Big Data Processing
  • Handle enterprise-scale data pipelines and transformations, with a strong focus on Snowflake or comparable technologies such as Databricks or BigQuery.
  • Optimize data ingestion, storage, and processing performance to ensure high-throughput and fault-tolerant systems.
Duties and Responsibilities - Data Stores
  • Manage and optimize SQL/NoSQL databases, Blob storage, Delta Lake, and other large-scale data store solutions.
  • Evaluate, recommend, and implement the most appropriate storage technologies based on performance, cost, and scalability requirements.
Duties and Responsibilities - Data Orchestration & Event-Driven Architecture
  • Build and orchestrate data pipelines across multiple technologies (e.g., dbt, Spark), employing tools like Airflow, Prefect, or Azure Data Factory for macro-level scheduling and dependency management; a brief Airflow sketch follows this list.
  • Design and integrate event-driven architectures (e.g., Kafka, RabbitMQ) to enable real-time and asynchronous data processing across the enterprise.
  • Leverage Kubernetes & KEDA to orchestrate containerized jobs in response to events, ensuring scalable, automated operations for data processing tasks.
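As a rough illustration of the macro-level scheduling mentioned above, the sketch below shows a minimal Airflow DAG with two dependent tasks. It assumes Airflow 2.4 or later; the DAG id, schedule, and task bodies are hypothetical and not part of the role description.

```python
# Illustrative sketch only: a minimal Airflow DAG (Airflow 2.4+) showing
# macro-level scheduling of two dependent pipeline steps.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def ingest_raw_events() -> None:
    # Placeholder: a real task might trigger a Snowflake load or a Spark job.
    print("ingesting raw events")


def refresh_data_product() -> None:
    # Placeholder: a real task might run dbt models or a Snowpark job.
    print("refreshing downstream data product")


with DAG(
    dag_id="daily_data_product_refresh",  # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    ingest = PythonOperator(
        task_id="ingest_raw_events",
        python_callable=ingest_raw_events,
    )
    refresh = PythonOperator(
        task_id="refresh_data_product",
        python_callable=refresh_data_product,
    )

    # Macro-level dependency: the refresh runs only after ingestion completes.
    ingest >> refresh
```

Event-driven processing of the kind described in the last two bullets would typically run alongside a scheduler like this, with Kafka or RabbitMQ consumers deployed as KEDA-scaled workloads on Kubernetes rather than on a fixed schedule.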
Duties and Responsibilities - Scrum Methodologies
  • Participate fully in Scrum ceremonies, leveraging tools like JIRA and Confluence to track progress and collaborate with the team.
  • Provide input on sprint planning, refinement, and retrospectives to continuously improve team efficiency and product quality.
Duties and Responsibilities - Cloud
  • Deploy and monitor data solutions in Azure, leveraging its native services for data and analytics.
Duties and Responsibilities - Collaboration & Communication
  • Foster a team-oriented environment by mentoring peers, offering constructive code reviews, and sharing knowledge across the organization.
  • Communicate proactively with technical and non-technical stakeholders, ensuring transparency around progress, risks, and opportunities.
  • Take ownership of deliverables, driving tasks to completion and proactively suggesting improvements to existing processes.
Duties and Responsibilities - Problem Solving
  • Analyze complex data challenges, propose innovative solutions, and drive them through implementation.
  • Maintain high-quality standards in coding, documentation, and testing to minimize defects and maintain reliability.
  • Exhibit resilience under pressure by troubleshooting critical issues and delivering results within tight deadlines.
Required Education and Experience
  • Bachelor’s degree in Computer Science, Information Systems, Engineering, or a related field (or equivalent professional experience).
  • Proven experience with Snowflake (native Snowflake application development is essential).
  • Proficiency in Python for data engineering tasks and application development.
  • Experience deploying and managing containerized applications using Kubernetes (preferably on Azure Kubernetes Services).
  • Understanding of event-driven architectures and hands-on experience with event buses (e.g., Kafka, RabbitMQ).
  • Familiarity with data orchestration and choreography concepts, including the use of scheduling/orchestration tools (e.g., Airflow, Prefect) and of eventual-consistency and distributed-systems patterns that avoid centralized orchestration at the platform level.
  • Hands-on experience with cloud platforms (Azure preferred) for building and operating data pipelines.
  • Solid knowledge of SQL and database fundamentals.
  • Strong ability to work in a collaborative environment, including cross-functional teams in DevOps, software engineering, and analytics.
Preferred Education and Experience
  • Master’s degree in a relevant technical field.
  • Certifications in Azure, Snowflake, or Databricks (e.g., Microsoft Certified: Azure Data Engineer, SnowPro, Databricks Certified: Data Engineer).
  • Experience implementing CI/CD pipelines for data-related projects.
  • Working knowledge of infrastructure-as-code tools (e.g., Terraform, ARM templates).
  • Exposure to real-time data processing frameworks (e.g., Spark Streaming, Flink).
  • Familiarity with data governance and security best practices (e.g., RBAC, data masking, encryption).
  • Demonstrated leadership in data engineering best practices or architecture-level design.
Supervisory Responsibilities
  • This position may lead project-based teams or mentor junior data engineers, but typically does not include direct, ongoing management of staff.
  • Collaboration with stakeholders (Data Architects, DevOps engineers, Data Product Managers) to set technical direction and ensure high-quality deliverables.
Total Rewards:

At Enable, we’re committed to helping all Enablees grow. During the interview process, we assess your level based on experience, expertise, and role scope, aligning it with our compensation bands. Starting pay is determined by factors like location, skills, experience, market conditions, and internal parity.

Salary/TCC is just one component of Enable’s total rewards package. Enable is committed to investing in the holistic health and wellbeing of all Enablees and their families. Our benefits and perks include, but are not limited to:

  • Paid Time Off: Take the time you need to relax and recharge
  • Wellness Benefit: Quarterly incentive dedicated to improving your health and well-being
  • Comprehensive Insurance: Health and life coverage for you and your family
  • Retirement Plan: Build your future with our retirement savings plan
  • Lucrative Bonus Plan: Enjoy a rewarding bonus structure subject to company or individual performance
  • Equity Program: Benefit from our equity program with additional options tied to tenure and performance
  • Career Growth: Explore new opportunities with our internal mobility program

Additional Perks:
  • Free Food: Complimentary meals, snacks, and drinks on-site in our global offices
  • Training: Access a range of workshops and courses designed to boost your professional growth and take your career to new heights
  • Pets: Bring your pets to our welcoming, pet-friendly offices

According to LinkedIn's Gender Insights Report, women apply for 20% fewer jobs than men, despite similar job search behaviors. At Enable, we’re committed to closing this gap by encouraging women and underrepresented groups to apply, even if they don’t meet all qualifications.

Enable is an equal opportunity employer, fostering an inclusive, accessible workplace that values diversity. We provide fair, discrimination-free employment, ensuring a harassment-free environment with equitable treatment.

We welcome applications from all backgrounds. If you need reasonable adjustments during recruitment or in the role, please let us know.

