Data Engineer (Analytics)

T-Net British Columbia

Vancouver

Hybrid

CAD 80,000 - 120,000

Full time

3 days ago

Job summary

A dynamic startup is seeking a Data Engineer to join its Analytics Team in Vancouver. The role involves designing scalable data pipelines, collaborating with cross-functional teams, and ensuring data quality. Ideal candidates have a Bachelor's degree in Computer Science, strong ETL skills, and experience with large-scale data systems. This hybrid position offers a competitive salary and perks including health benefits, stock options, and a collaborative work environment.

Benefits

Health benefits with EAP starting on Day 1
Stock options
Fully equipped gym in office building
Bike locker in office building
Monthly in-office social and snacks

Qualifications

  • 2+ years of hands-on experience in data engineering or related fields.
  • Experience building and managing both stream and batch processing pipelines.
  • Excellent problem-solving skills and strong attention to detail.

Responsibilities

  • Design and maintain scalable data pipelines and workflows.
  • Integrate disparate data sources for unified analytics.
  • Monitor and troubleshoot data quality and workflows.

Skills

Data Modeling
ETL Design
Problem Solving
Communication
Collaboration

Education

Bachelor's degree in Computer Science

Tools

Python
SQL
Hadoop
Spark
AWS

Job description

Netskrt is seeking a talented and motivated Data Engineer to join our Analytics Team. This is a hybrid role (3 days WFH) located in our beautiful downtown Vancouver office, next to Burrard SkyTrain station.

At Netskrt, we're a highly driven team focused on building innovative products and services that enhance the customer experience of streaming video at the edge of the network. We've developed a suite of interrelated technologies aimed at businesses offering customers Wi-Fi in bandwidth-constrained environments.

This role offers hands-on experience across data infrastructure, networking, security, and cloud technologies—all while solving complex problems in a dynamic startup setting, alongside accomplished engineers and a leadership team with a strong track record of success.

If you're passionate about designing scalable data pipelines, experimenting with complex datasets, and uncovering insights from diverse data sources, this is an exciting opportunity to have a meaningful impact on our technology landscape.

Key Responsibilities:

  • Collaborate with cross-functional teams to design and maintain robust, scalable data pipelines that automate data extraction, transformation, and loading from sources such as databases, APIs, and flat files.
  • Integrate and unify disparate data sources for analytical and reporting purposes.
  • Develop and maintain structured data models and warehousing using industry best practices.
  • Design and optimize ETL processes to handle both real-time streams and batch workloads.
  • Monitor and troubleshoot data workflows for performance and scalability.
  • Work closely with Data Scientists, Analysts, and Business Intelligence teams to deliver impactful solutions.
  • Champion data quality, integrity, and compliance across all workflows.

Required Skills & Qualifications:

  • Bachelor's degree in Computer Science or equivalent professional experience.
  • 2+ years of hands-on experience in data engineering or related fields.
  • Proficiency in building and managing both stream and batch processing pipelines, and in knowing when to use each.
  • Strong understanding of data modeling, ETL design, and warehouse architecture.
  • Experience working with large-scale, distributed data systems.
  • Excellent problem-solving skills and strong attention to detail.
  • Strong communication skills and an aptitude for collaboration.
  • Ability to multitask and thrive in a fast-paced, high-growth environment.
  • Passion for continuous learning and staying on top of data engineering innovations.

Desired Qualifications:

  • Programming: Python, Scala, or Java
  • Big Data: Hadoop, Spark, Kafka, or similar frameworks
  • Databases: SQL and NoSQL systems such as PostgreSQL, ClickHouse, or Cassandra
  • Workflow Automation: Apache Airflow or equivalent orchestration tools
  • Version Control: Git (GitHub/GitLab/Bitbucket)
  • APIs: Experience with RESTful APIs and building scalable data pipelines
  • Cloud: Experience with AWS (e.g., Redshift, S3, Lambda), Google Cloud (BigQuery, Dataflow), or Azure (Data Lake, SQL Data Warehouse)

Benefits:

  • Competitive salary based on experience
  • Collaborative team environment
  • Health benefits with EAP starting on Day 1
  • Stock options
  • Snacks and monthly in-office social
  • Fully equipped gym in office building
  • Bike locker in office building
