AWS Cloud Developer, Data Streaming
BMO Financial Group
Toronto · Hybrid · Full time
CAD 75,000 - 142,000

Job summary

A leading financial institution is seeking an experienced AWS Data Engineer to design and optimize robust data platforms on AWS. This hybrid role emphasizes the development of scalable real-time and batch ETL pipelines and offers competitive pay, flexibility, and continuous learning opportunities. Candidates should have extensive experience in cloud technologies and a strong foundation in data engineering principles.

Benefits

Competitive compensation
Continuous learning opportunities
Flexible work arrangements

Qualifications

  • 3+ years of experience building real-time data streaming pipelines with Amazon Kinesis.
  • Batch ETL skills using AWS Glue, EMR, and Redshift.
  • 5+ years of experience with Python, PySpark, and SQL.

Responsibilities

  • Define standards for real-time and batch ETL data processing.
  • Design and implement low-latency, event-driven pipelines.
  • Architect fault-tolerant streaming solutions.

Skills

Real-time data streaming with Amazon Kinesis
Batch ETL with AWS Glue and Amazon Redshift
Python programming
AWS CDK or CloudFormation
DevOps/CI tools familiarity
CloudWatch and CloudTrail experience
Performance Tuning and Optimization

Job description

Overview

The AWS Data Engineer is responsible for the design, development, and optimization of large-scale real-time and batch data platforms on AWS. This role requires deep hands-on development expertise as well as ownership of architecture, best practices, and technical direction for data pipelines that support analytics, reporting, and advanced data products.

Your main goal is to ensure the availability and consistent performance of various applications. You will design, build, and maintain scalable real-time and batch ETL pipelines on the AWS cloud. You will work closely with business analysts, data scientists, and business teams to ensure high-quality, reliable, and performant data platforms.

Accelerate your career in cloud technology. As an AWS Cloud Developer in our Treasury business, you’ll gain hands-on experience with advanced AWS tools, SQL Server, and modern automation practices while building future-ready applications. We offer continuous learning, certifications, and mentorship to help you grow—plus competitive pay, flexibility, and a culture that values your ideas.

This is a hybrid role, with an expectation of two days per week in the office. Please note that this may change in the future based on business needs.

Why Join Us?

  • Work on mission-critical, future-ready applications for Treasury.
  • Gain exposure to cutting-edge AWS technologies and modern automation practices.
  • Enjoy competitive compensation, flexible work arrangements, and continuous learning opportunities.
  • Be part of a collaborative, inclusive culture that values innovation and impact.

Responsibilities

  • Define standards, patterns, and best practices for real-time streaming and batch ETL data processing; implement data ingestion from multiple sources, including databases, APIs, logs, and event streams.
  • Design and implement low-latency, event-driven pipelines using Amazon Kinesis (Streams & Firehose), Amazon MSK (Kafka), AWS Lambda, or Spark Structured Streaming (a minimal consumer sketch follows this list).
  • Architect fault-tolerant and scalable streaming solutions for high-volume data, and integrate CDC pipelines where required.
  • Design and optimize batch ETL/ELT pipelines using Amazon S3, AWS Glue, Amazon EMR (Spark), and Amazon Redshift.
  • Build robust data transformation frameworks using Python, PySpark, and SQL. Transform raw data into curated datasets following data modeling best practices (star/snowflake schemas); a batch transformation sketch also follows this list.
  • Optimize ETL workloads for performance, scalability, and cost efficiency. Build and manage enterprise data lakes using S3 and Glue Catalog.
  • Implement monitoring, alerting, and observability for pipelines using CloudWatch, CloudTrail, Airflow/MWAA, or custom frameworks.
  • Implement data quality checks, validation rules, and reconciliation logic.
  • Troubleshoot production failures and perform root-cause analysis.
  • Use GitHub to deploy AWS resources and implement CI/CD pipelines for data workflows.
  • Ensure security best practices including IAM roles, encryption, and data access controls.
  • Analyze, solve, and correct issues in real time; provide suggestions for solutions and automate regular processes; track issues and document changes.
  • Provide support for critical production systems and perform scheduled maintenance and after-hours release deployment activities.
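
To make the streaming responsibilities above concrete, here is a minimal sketch (not part of the posting) of an AWS Lambda handler consuming a Kinesis Data Stream: it decodes records, applies an illustrative data quality check, and reports partial batch failures so that only failed records are retried. The required-field schema and the downstream sink are assumptions for illustration.

```python
import base64
import json
import logging

logger = logging.getLogger()
logger.setLevel(logging.INFO)

# Required fields used for the illustrative data quality check (placeholder schema).
REQUIRED_FIELDS = {"event_id", "event_time", "payload"}


def lambda_handler(event, context):
    """Consume a batch of Kinesis records and report failed sequence numbers.

    Returning {"batchItemFailures": [...]} lets Lambda retry only the failed
    records when the event source mapping has ReportBatchItemFailures enabled.
    """
    failures = []

    for record in event.get("Records", []):
        sequence_number = record["kinesis"]["sequenceNumber"]
        try:
            # Kinesis record payloads arrive base64-encoded.
            raw = base64.b64decode(record["kinesis"]["data"])
            message = json.loads(raw)

            # Basic data quality / validation rule (illustrative only).
            missing = REQUIRED_FIELDS - message.keys()
            if missing:
                raise ValueError(f"missing required fields: {sorted(missing)}")

            # Placeholder for the real sink (Firehose, S3, Redshift staging, ...).
            logger.info("processed event %s", message["event_id"])
        except Exception:
            logger.exception("failed to process sequence %s", sequence_number)
            failures.append({"itemIdentifier": sequence_number})

    return {"batchItemFailures": failures}
```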
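
Likewise, the batch ETL and transformation duties might resemble the following PySpark sketch, which curates raw JSON landed in S3 into a partitioned Parquet dataset. Bucket paths, column names, and the partitioning scheme are assumptions rather than details from the posting; an AWS Glue job would typically wrap similar logic in a GlueContext.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("curate-transactions").getOrCreate()

# Raw landing zone and curated zone locations (placeholder bucket/prefix names).
RAW_PATH = "s3://example-raw-bucket/transactions/"
CURATED_PATH = "s3://example-curated-bucket/fact_transactions/"

raw = spark.read.json(RAW_PATH)

# Curate: type the columns, derive a date partition key, drop obviously bad rows.
curated = (
    raw.withColumn("amount", F.col("amount").cast("decimal(18,2)"))
       .withColumn("event_ts", F.to_timestamp("event_time"))
       .withColumn("event_date", F.to_date("event_ts"))
       .filter(F.col("event_id").isNotNull() & F.col("amount").isNotNull())
       .select("event_id", "account_id", "amount", "event_ts", "event_date")
)

# Write a partitioned, columnar dataset suitable for Redshift Spectrum or Athena.
(
    curated.write.mode("overwrite")
           .partitionBy("event_date")
           .parquet(CURATED_PATH)
)
```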

Skills and Qualifications

  • 3+ years of experience building real-time data streaming pipelines using Amazon Kinesis (Data Streams/Firehose), AWS Lambda, and Amazon MSK (Kafka)
  • Batch ETL skills with AWS Glue, EMR, Amazon Redshift, and Amazon S3
  • 5+ years of experience with Python, PySpark, and SQL programming
  • Strong expertise with AWS CDK or CloudFormation (see the infrastructure-as-code sketch after this list)
  • Familiarity with DevOps/CI tools and practices (Git, Bitbucket)
  • 2+ years of experience with CloudWatch, CloudTrail, Airflow/MWAA
  • 3 years of experience with performance tuning and optimization
  • 3 years of experience with backups, restores, and recovery models
  • Sense of ownership and pride in your performance and its impact on the company’s success
  • Critical thinker and problem-solving skills
  • Team player
  • Strong time-management skills
  • Great interpersonal and communication skills
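
As an illustration of the AWS CDK expectation above, a minimal CDK sketch (Python, assuming aws-cdk-lib v2) wiring a Kinesis stream to a Lambda consumer might look like this. Construct names, the runtime version, and the asset path are assumptions for illustration only.

```python
from aws_cdk import App, Duration, Stack, aws_kinesis as kinesis, aws_lambda as _lambda
from aws_cdk.aws_lambda_event_sources import KinesisEventSource
from constructs import Construct


class StreamingPipelineStack(Stack):
    """Illustrative stack: a Kinesis stream feeding a Python Lambda consumer."""

    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # Raw event stream (retention and naming are placeholder choices).
        stream = kinesis.Stream(
            self, "RawEvents",
            retention_period=Duration.hours(24),
        )

        # Consumer function; "lambda" is an assumed local directory with handler.py.
        consumer = _lambda.Function(
            self, "Consumer",
            runtime=_lambda.Runtime.PYTHON_3_12,
            handler="handler.lambda_handler",
            code=_lambda.Code.from_asset("lambda"),
        )

        # Event source mapping with partial batch failure reporting enabled.
        consumer.add_event_source(
            KinesisEventSource(
                stream,
                starting_position=_lambda.StartingPosition.LATEST,
                batch_size=100,
                report_batch_item_failures=True,
            )
        )


app = App()
StreamingPipelineStack(app, "StreamingPipelineStack")
app.synth()
```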

Additional Context

The role translates user requirements into technical specifications, writes code, and supports the software development lifecycle, with a focus on security requirements, debugging, and presenting technical solutions. It emphasizes data structures, algorithms, and platform-specific tooling aligned with business needs.

Salary and Benefits

Salary: CAD 75,900.00 - 141,900.00

Pay Type: Salaried

The above represents the employer’s pay range and type. Salaries vary based on location, skills, experience, and education. Benefits may include health insurance, tuition reimbursement, and retirement plans. For more details on total rewards, please visit the employer’s total rewards page.

About Us

BMO Financial Group is committed to an inclusive, equitable and accessible workplace. Accommodations are available on request for candidates taking part in all aspects of the selection process. To request accommodation, please contact your recruiter.
