
19409 Data Engineer 1

emergiTEL

Vancouver

On-site

CAD 60,000 - 80,000

Full time


Job summary

A growing tech company in Vancouver is seeking a Junior Data Engineer to design and maintain data pipelines on Google Cloud Platform. Ideal candidates will have a passion for data, familiarity with Python and SQL, and a desire to learn cloud technologies. This full-time position offers the opportunity to work within an agile team. New graduates are welcome to apply.

Qualifications

  • Basic understanding of ETL/ELT processes and data pipeline concepts.
  • Familiarity with SQL and basic programming concepts.
  • Interest and ability to learn new cloud technologies.

Responsibilities

  • Assist in designing, developing, testing, deploying and maintaining analytics pipelines on GCP.
  • Collaborate closely with multiple teams in an agile environment.
  • Support in evaluating Google Cloud technology choices and testing solutions.

Skills

Python
SQL
Problem-solving skills
Data pipeline concepts

Tools

Google Cloud Platform (GCP)
Tableau
Git

Job description


As a Junior Data Engineer, you will be responsible for learning to design, build, and run data-driven applications on Google Cloud Platform (GCP) that enable innovative, customer-centric digital experiences. This is an excellent opportunity for early-career professionals to grow their skills in modern cloud data engineering.

You will be working as part of a friendly, cross-discipline agile team that helps each other solve problems across all functions. As a custodian of customer trust, you will learn and apply best practices in development, security, accessibility, and design to achieve the highest quality of service for our customers.

Our development team uses modern Google Cloud technologies to get the job done:

  • BigQuery
  • Dataflow
  • Cloud Storage
  • Cloud Composer
  • Pub/Sub

These are coupled with Python, SQL, and Infrastructure as Code to provide modern, scalable data pipelines.
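To make that stack concrete, here is a minimal sketch of the kind of pipeline step this role involves: loading a CSV file from Cloud Storage into a BigQuery table with the Python client. It assumes the google-cloud-bigquery package is installed, and the project, bucket, dataset, and table names are placeholders, not details from this posting.

    # Minimal sketch (placeholder names): load a CSV from Cloud Storage into BigQuery.
    from google.cloud import bigquery

    client = bigquery.Client(project="example-project")  # placeholder project ID

    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,   # skip the CSV header row
        autodetect=True,       # infer the table schema from the file
        write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
    )

    # Start the load job and block until it completes.
    load_job = client.load_table_from_uri(
        "gs://example-bucket/raw/events.csv",   # placeholder source file
        "example-project.analytics.events",     # placeholder destination table
        job_config=job_config,
    )
    load_job.result()

    table = client.get_table("example-project.analytics.events")
    print(f"Loaded {table.num_rows} rows into analytics.events")

In a production pipeline, a step like this would typically be scheduled and monitored through Cloud Composer (Airflow) rather than run by hand.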

You will be part of the team building cloud-native data pipelines to transfer and process data from our enterprise data lake on GCP, enabling our AI and analytics use cases.

You are a fast learner, eager to grow technically and passionate about data engineering, looking to work within a team of multidisciplinary experts to develop your craft and contribute to our data engineering practice.

Responsibilities:

  • Learn new skills and advance your cloud data engineering practice
  • Assist in designing, developing, testing, deploying, and maintaining analytics pipelines on GCP
  • Support in evaluating Google Cloud technology choices and testing solutions
  • Help outcome teams understand how to best measure and analyze their data
  • Collaborate closely with multiple teams in an agile environment
  • Gain hands-on experience with modern cloud data engineering tools and practices

You're the Missing Piece of the Puzzle:

  • A passion for data and eagerness to learn
  • Interest and ability to learn new cloud technologies and programming languages
  • Basic understanding of ETL/ELT processes and data pipeline concepts
  • Familiarity with SQL and basic programming concepts
  • Understanding of fundamental data concepts (databases, data warehouses, data lakes)
  • Strong problem-solving skills and attention to detail

Great-to-Haves:

  • 0-2 years of relevant experience (new graduates welcome)
  • Basic experience with Google Cloud Platform (GCP) services
  • Experience with Python, SQL, or other programming languages
  • Familiarity with version control (Git) and basic DevOps concepts
  • Experience with data visualization tools such as Looker, Tableau, or similar
  • Understanding of basic cloud computing concepts

Must-Have Skills:

  • Python
  • GCP BigQuery SQL Stored Procedures
  • Tableau

Nice-to-Have Skills:

  • GCP Certification
  • GSuite AppScripts
  • VS Code & GitHub

Key Skills

Apache Hive, S3, Hadoop, Redshift, Spark, AWS, Apache Pig, NoSQL, Big Data, Data Warehouse, Kafka, Scala

Employment Type: Full Time

Experience: years

Vacancy: 1
