Data Engineer

73 Strings

Toronto

On-site

CAD 80,000 - 120,000

Full time

3 days ago

Job summary

An innovative company is seeking a Data Engineer to design and maintain robust data pipelines using Azure and Databricks. The role involves developing analytics solutions and ensuring data quality and compliance, while collaborating within a dynamic team.

Qualifications

  • Proven experience with Azure cloud services, particularly Databricks.
  • Strong programming skills (Python, SQL, Scala).
  • Solid understanding of Spark architecture and distributed data processing.

Responsibilities

  • Develop and maintain ETL/ELT pipelines using Azure Data Factory and Databricks.
  • Build scalable data architectures, including data lakes.
  • Collaborate with cross-functional teams to understand data requirements.

Skills

Azure cloud services
Python
SQL
Scala
Databricks
ETL/ELT pipelines
API integration
Spark architecture
Data security

Education

Master’s degree in Computer Science or Engineering

Tools

Azure Data Factory
Data Lake

Job description

OVERVIEW OF 73 STRINGS:

73 Strings is an innovative platform providing comprehensive data extraction, monitoring, and valuation solutions for the private capital industry. The company's AI-powered platform streamlines middle-office processes for alternative investments, enabling seamless data structuring and standardization, monitoring, and fair value estimation at the click of a button. 73 Strings serves clients globally across various strategies, including Private Equity, Growth Equity, Venture Capital, Infrastructure and Private Credit.

Our 2025 $55M Series B, the largest in the industry, was led by Goldman Sachs, with participation from Golub Capital and Hamilton Lane and continued support from Blackstone, Fidelity International Strategic Ventures, and Broadhaven Ventures.

About the role

We are seeking a Data Engineer with hands-on experience in Azure, Databricks, and API integration. You will design, build, and maintain robust data pipelines and solutions that power analytics, AI, and business intelligence across the organization.

Key Responsibilities

- Develop, optimize, and maintain ETL/ELT pipelines using Azure Data Factory, Databricks, and related Azure services.

- Build scalable data architectures, including data lakes and data warehouses.

- Integrate and process data from diverse sources via REST and SOAP APIs.

- Design and implement Spark-based data transformations in Databricks using Python, Scala, or SQL (a minimal sketch follows this list).

- Ensure data quality, security, and compliance across all pipelines and storage solutions.

- Collaborate with cross-functional teams to understand data requirements and deliver actionable datasets.

- Monitor, troubleshoot, and optimize Databricks clusters and data workflows for performance and reliability.

- Document data processes, pipelines, and best practices.
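
For illustration only, here is a minimal sketch of the kind of Spark-based transformation described above, assuming a Databricks workspace with a mounted data lake and Delta Lake available. The paths, column names, and aggregation are hypothetical placeholders, not details of the actual role or codebase.

    # Minimal PySpark sketch; paths, columns, and the Delta target are assumed, not real.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("example-etl").getOrCreate()

    # Read raw JSON events from a hypothetical data lake mount.
    raw = spark.read.json("/mnt/datalake/raw/events/")

    # Drop malformed rows, derive a date column, and aggregate events per day.
    daily = (
        raw.dropna(subset=["event_id", "event_ts"])
           .withColumn("event_date", F.to_date("event_ts"))
           .groupBy("event_date")
           .agg(F.count("event_id").alias("event_count"))
    )

    # Write the curated result to a Delta path (Delta Lake assumed available).
    daily.write.format("delta").mode("overwrite").save("/mnt/datalake/curated/daily_events/")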

Required Skills & Qualifications

- Proven experience with Azure cloud services, especially Databricks, Data Lake, and Data Factory.

- Strong programming skills in Python, SQL, and/or Scala.

- Experience building and consuming APIs for data ingestion and integration.

- Solid understanding of Spark architecture and distributed data processing.

- Familiarity with data modeling, data warehousing, and big data best practices.

- Knowledge of data security, governance, and compliance within cloud environments.

- Excellent communication and teamwork skills.

Preferred

- Experience with DevOps tools, CI/CD pipelines, and automation in Azure/Databricks environments.

- Exposure to real-time data streaming (e.g., Kafka) and advanced analytics solutions.

Education

- Master’s degree in Computer Science, Engineering, or a related field, or equivalent experience.


Similar jobs

Cloud Data Engineer- Ministry Experience

Huntel Global

Toronto

Remote

CAD 100,000 - 150,000

Yesterday

Data Engineer - Databricks

Lumenalta

Toronto

Remote

CAD 75,000 - 100,000

Yesterday

Intermediate DataOps/Cloud Data Engineer

Akkodis group

Toronto

Remote

CAD 80,000 - 110,000

3 days ago

Data Engineer - Snowflake

Lumenalta

Toronto

Remote

CAD 100,000 - 140,000

4 days ago

Intermediate DataOps/Cloud Data Engineer - Remote / Telecommute

Cynet Systems Inc

Toronto

Remote

CAD 90,000 - 130,000

3 days ago

Data Engineer

AmeriLife

Vaughan

Remote

CAD 85,000 - 120,000

3 days ago

Data Engineer - Snowflake

Lumenalta

Toronto

Remote

CAD 90,000 - 130,000

8 days ago

QA Engineer - Data Platform

Veeva Systems

Toronto

Remote

CAD 65,000 - 115,000

8 days ago

Senior Data Engineer

The Score

Toronto

Remote

CAD 90,000 - 130,000

8 days ago