Data Engineer 2 - 19364

emergiTEL Inc.

Toronto

On-site

CAD 80,000 - 110,000

Full time

2 days ago

Job summary

A technology solution provider in Toronto is seeking a Data Engineer to design and build data-driven applications. You will work with modern technologies like Google Cloud Platform and contribute to a multidisciplinary agile team to enhance customer-centric digital experiences. Ideal candidates should have strong experience in Python, SQL, and Google Cloud technologies. Competitive compensation and growth opportunities are offered.

Qualifications

  • Familiarity with the execution sequence of ETL flows on Google Cloud platforms.
  • Intermediate-level candidates with 4-6 years of relevant experience.
  • Experience with Big Data-related tools and technologies.

Responsibilities

  • Design, develop, test, deploy, maintain, and improve the analytics pipeline.
  • Assist in evaluating technology choices and rapidly test solutions.
  • Collaborate closely with multiple teams in an agile environment.

Skills

Python on Visual Studio Code
Google Cloud - Vertex AI
Extensive knowledge of building complex SQL queries
Tableau
JIRA and Agile Methodology

Tools

Google Cloud Platform
Databricks
Tableau

Job description

As a Data Engineer, you will be responsible for designing, building, and running the data-driven applications that enable innovative, customer-centric digital experiences.
You will work as part of a friendly, cross-discipline agile team whose members help each other solve problems across all functions. As a custodian of customer trust, you will employ best practices in development, security, accessibility, and design to achieve the highest quality of service for our customers.
Our development team uses a range of technologies to get the job done: Google Cloud Platform (GCP), PySpark, Dataflow, BigQuery, Looker Studio, Google Cloud Scheduler, shell scripting, Pub/Sub, Elasticsearch, LLMs, Gemini Pro, GitHub, Terraform, and more, to provide a modern, easy-to-use data pipeline.

You will be part of the team building a data pipeline that transfers data from our enterprise data lake to enable our AI use cases.
You are a fast-learning, highly technical, passionate data engineer looking to work within a team of multidisciplinary experts to improve your craft and contribute to the data development practice.

Here’s how
● Learn new skills & advance your data development practice
● Design, develop, test, deploy, maintain and improve the analytics pipeline
● Assist in evaluating technology choices and rapidly test solutions
● Assist the outcome teams in understanding how to best measure their web properties
● Collaborate closely with multiple teams in an agile environment

You're the missing piece of the puzzle
● A passion for data
● Interest and ability to learn new languages & technologies as needed
● Familiarity with the execution sequence of ETL flows on Google Cloud platforms
● Experience with Spark, Beam, Airflow, Cloud SQL, BigQuery, MSSQL
● Basic understanding of data warehouses, data lakes, and OLAP and OLTP applications

Great-to-haves
● Intermediate-level candidates with 4-6 years of relevant experience
● Experience with Big Data-related tools and technologies
● Experience with SQL, Unix, Shell scripting
● Experience with data visualization tools such as Tableau, Domo, Looker

Must-Have Skills (min. 3 skills please):
1. Python on Visual Studio Code, Cline/Copilot, or other AI-based coding experience - 4+ years of experience
2. Google Cloud - Vertex AI, ML Pipelines, Gen AI models, BigQuery, Dataflow, Terraform, YAML, PySpark
3. Extensive knowledge of building complex SQL queries and stored procedures
4. Tableau or Looker Studio
5. JIRA and Agile methodology

Nice-to-Have Skills (min. 3 skills please):
1. Google Workspace, App Scripts
2. JavaScript, React JS, Node JS
3. Telecom Domain Knowledge
