
Data Engineer - Databricks

Citco GSGS

Toronto

On-site

CAD 85,000 - 105,000

Full time


Job summary

A global financial services provider in Toronto is seeking a Data Engineer to build and maintain data pipelines on the Databricks Lakehouse Platform. The ideal candidate holds a Bachelor's degree in Computer Science and has over four years of data engineering experience, particularly with Python and AWS services. The role offers the opportunity to collaborate with diverse teams and contribute to innovative data solutions, with benefits that support work-life balance.

Job description

About Citco:

Since the 1940s, Citco has provided specialist financial services to alternative investment funds, investors, multinationals, and private clients worldwide. With over 6,000 employees in 45 countries, we pioneer innovative solutions that meet our clients’ evolving needs and deliver exceptional service.

Our continuous investment in learning means our people are among the best in the industry. And our corporate social responsibility programs provide meaningful and fulfilling work in the community.

A career at Citco isn’t just a job – it’s an opportunity to excel in an environment that genuinely supports your personal and professional development.

About the Role:

You will be working in a cross-functional team, using agile methodologies to build and maintain data pipelines on the Databricks Lakehouse Platform for the financial services industry. As a Data Engineer, you'll collaborate with data scientists, analysts, and business stakeholders to transform raw financial data into actionable insights. Using modern data engineering practices, you'll develop scalable ETL/ELT processes, implement data quality controls, and ensure data governance standards are met. Working within our AWS cloud environment, you'll help build robust data solutions that power critical business operations while maintaining the highest standards of data security and compliance.

Qualifications

About You:
  • Bachelor's degree in Computer Science, Engineering, or related field
  • 4+ years of experience in data engineering
  • 1+ years of hands-on experience with the Databricks platform
  • Strong programming skills in Python
  • Experience with Spark and distributed computing
  • Working knowledge of AWS services (S3, Glue, Lambda)
  • Experience with Delta Lake and Lakehouse architecture
  • Familiarity with data modelling and SQL
  • Understanding of ETL/ELT principles and patterns
  • Experience with version control systems (Git)
  • Good communication and collaboration abilities
  • Experience with CI/CD for data pipelines
  • Familiarity with Agile development methodologies
  • Experience with real-time data processing is a plus
  • Self-motivated with the ability to work independently

Our Benefits

Your well-being is of paramount importance to us and central to our success. We provide a range of benefits, training and education support, and flexible working arrangements to help you succeed in your career while balancing personal needs. Ask us about specific benefits in your location.

We embrace diversity, prioritizing the hiring of people from diverse backgrounds. Our inclusive culture is a source of pride and strength, fostering innovation and mutual respect.

Citco welcomes and encourages applications from people with disabilities. Accommodation is available upon request for candidates taking part in all aspects of the selection process.

Responsibilities

Your Role:
  • Participate in and contribute to all team activities such as Sprint Planning, Sprint Execution, and Daily Scrum
  • Develop and maintain data pipelines using Databricks Lakehouse and Delta Lake
  • Implement ETL/ELT workflows using Spark (Python) in the Databricks environment
  • Work with AWS services (S3, Glue) for data lake storage and catalog management
  • Create and optimize Spark jobs for efficient data processing and cost management
  • Build and maintain data quality checks and monitoring systems
  • Configure and manage Databricks notebooks and jobs
  • Implement proper security and access controls using Unity Catalog
  • Participate in code reviews and documentation efforts
  • Stay current with Databricks features and data engineering best practices
  • Support real-time data processing using structured streaming when required