Int. Data Engineer to build a data pipeline to transfer the data from the enterprise data lake [...]

S I Systems

Toronto

On-site

CAD 80,000 - 100,000

Full time

Job summary

A leading company is seeking an experienced Data Engineer to design and build data pipelines that support AI use cases. The role involves working in an agile team, utilizing technologies like Google Cloud Platform and Python. Candidates should have strong SQL skills and experience in data engineering practices.

Qualifications

  • 4+ years Python development experience.
  • Extensive SQL query and stored procedure knowledge.

Responsibilities

  • Design and maintain data-driven applications.
  • Build data pipelines from the enterprise data lake.

Skills

Python
SQL
Agile

Tools

Google Cloud Platform
JIRA
Visual Studio Code

Job description

Int. Data Engineer to build a data pipeline to transfer the data from the enterprise data lake for enabling AI use cases - 19364

Duration of Contract: 7 months (until end of 2025)

As a Data Engineer, you will be responsible for designing, building, and maintaining data-driven applications that enable innovative, customer-centric digital experiences.

You will work as part of a collaborative, cross-disciplinary agile team, helping to solve problems across functions. As a custodian of customer trust, you will employ best practices in development, security, accessibility, and design to ensure high-quality service.

Our development environment utilizes technologies such as Google Cloud Platform (GCP), PySpark, Dataflow, BigQuery, Looker Studio, Google Cloud Scheduler, Shell scripting, Pub/Sub, Elasticsearch, LLMs, Gemini Pro, GitHub, and Terraform, among others, to create modern, user-friendly data pipelines.

You will contribute to building a data pipeline that transfers data from our enterprise data lake to support AI use cases.
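
For illustration only (not part of the posting), the sketch below shows one way such a lake-to-warehouse pipeline might look, using PySpark and the Spark BigQuery connector from the tool list above. Every bucket, project, dataset, table, and column name is a hypothetical placeholder.

```python
# Hypothetical sketch: read raw events from a data-lake bucket with PySpark,
# aggregate them, and load the result into BigQuery for downstream AI use cases.
# All paths, project/dataset/table names, and columns are illustrative placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("lake-to-bigquery-sketch")
    .getOrCreate()
)

# Source: Parquet files landed in the enterprise data lake (placeholder path).
events = spark.read.parquet("gs://example-enterprise-lake/events/")

# Light transformation before serving the data to AI/analytics consumers.
daily_counts = (
    events
    .withColumn("event_date", F.to_date("event_timestamp"))
    .groupBy("event_date", "event_type")
    .count()
)

# Sink: BigQuery, via the Spark BigQuery connector (placeholder table and bucket).
(
    daily_counts.write
    .format("bigquery")
    .option("table", "example-project.analytics.daily_event_counts")
    .option("temporaryGcsBucket", "example-temp-bucket")
    .mode("overwrite")
    .save()
)
```

The same flow could equally be expressed as a Dataflow (Apache Beam) job or scheduled with Google Cloud Scheduler, consistent with the environment described above.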

We are seeking a highly technical, passionate data engineer and fast learner who is eager to collaborate with multidisciplinary experts, sharpen their skills, and contribute to our data development practices.

Must-Have Skills:
  • Python development experience using Visual Studio Code, Cline/Copilot, or other AI-based coding tools – 4+ years
  • Extensive knowledge of constructing complex SQL queries and stored procedures
  • Experience with JIRA and Agile methodologies
Nice-to-Have Skills:
  • Telecom domain knowledge
  • Interest and ability to learn new languages & technologies as needed
  • Familiarity with executing ETL flows using Google Cloud Platform
  • Experience with Spark, Beam, Airflow, Cloud SQL, BigQuery, MSSQL
  • Basic understanding of data warehouse, data lake, OLAP, and OLTP systems
Great-to-Haves:
  • Candidates with 4-6 years of relevant experience
  • Experience with big data tools and technologies
  • Proficiency in SQL, Unix, Shell scripting
  • Experience with data visualization tools such as Tableau, Domo, Looker

Selection Process: An offline coding assignment on Google Cloud, BigQuery, and Python, followed by technical and behavioral interviews with leadership.