
Sr. Data Engineer - Lululemon

Red Pico

Vancouver

Hybrid

CAD 100,000 - 130,000

Full time

Yesterday

Job summary

A leading tech company is seeking a Senior Data Engineer to support their Global Trade technology team in Vancouver. The ideal candidate will have extensive experience in data engineering, focusing on microservices and cloud data infrastructures. You will play a key role in ensuring production readiness and efficiency while delivering quality code using modern development tools and methodologies.

Qualifications

  • Senior level experience as a Data Engineer (7+ years).
  • Hands-on experience in SQL and Python with cloud-native data pipelines.
  • Strong understanding of CI/CD and DevOps practices.

Responsibilities

  • Lead development and operations with a focus on data engineering.
  • Define technical infrastructure and develop scalable microservices.
  • Perform ad hoc analysis for stakeholders and troubleshoot product issues.

Skills

SQL
Python
Data Engineering
Cloud Data Architectures
CI/CD Practices

Tools

Git
Databricks
Snowflake

Job description

This role will support the Global Trade technology team, focusing on data engineering and microservices development as needed. You will be counted on as a leader in your technology space, as you contribute to all areas of development and operations (pre-production to production). Senior Engineers take production readiness and performance personally and help drive continuous improvement. You will be part of a day-to-day production release team and may perform on-call support functions as needed. You will be considered a primary caretaker of the Global Trade technology stacks and focus on how the platform is delivered and how to optimize it for efficiency, security, resiliency, and reliability.

A day in the life:

  • Attend daily stand-up calls to review open dev and support tickets
  • Perform ad hoc analysis for stakeholders to identify new applications of data analytics and machine learning algorithms to support business outcomes
  • Define technical infrastructure and data model requirements
  • Design and develop scalable, secure, modern microservices using test-driven development and cloud-native design patterns
  • Use algorithms, data structures, programming languages, and programming paradigms to create, test, and operate sustainable client-side or server-side software applications and services
  • Deliver software that meets architectural and operational requirements and performs to expectations
  • Work with teams to add and refine acceptance criteria and the definition-of-done to user stories
  • Deliver quality code using test driven development (TDD) approach with minimal bugs
  • Partner with Dev and QA team members on permanent fixes for recurring issues
  • Conduct research to aid in product troubleshooting and optimization efforts
  • Work with QA team to test tech debt fixes and coordinate with Dev teams and release management on any tech debt deployments
  • Attend Post-Mortems and work on action items from deployment defects
  • Use innovation and automation where possible to provide sustainable solutions
  • Apply out-of-the-box thinking to alleviate bottlenecks during development and production support

Must haves:

1) Senior-level experience as a Data Engineer (7+ years). The ideal candidate will have a strong background with hands-on experience in SQL and Python, as well as a deep understanding of data systems architecture, including how database operations and transactions work.

2) Strong understanding of CI/CD processes and DevOps practices. The ideal candidate will have experience using GitLab (experience with GitHub or Jenkins also qualifies), so that if an error occurs during deployment, they know how to handle it or who to reach out to.

3) Experience working in a Java environment. Most in-house applications have been built with Java/Spring in AWS, so the ability to read and understand the code at a high level would be helpful (the ability to write Java code is not required).

4) Hands-on experience with AWS cloud services, including Glue, EMR, Lambda, S3, and RDS. The ideal candidate will have the ability to design, implement, and optimize cloud-native data pipelines.

Nice to haves:

  • Experience working with Kafka
  • Experience with Snowflake

Workplace Type

Hybrid

About lululemon

lululemon is a yoga-inspired technical apparel company up to big things. The practice and philosophy of yoga informs our overall purpose to elevate the world through the power of practice. We are proud to be a growing global company with locations all around the world, from Vancouver to Shanghai, and places in between. We owe our success to our innovative product, our emphasis on our stores, our commitment to our people, and the incredible connections we get to make in every community we are in.

The Trade team, rooted under the Global Supply Chain Technology domain, is all about ensuring customs compliance with global regulatory agencies for the movement of our goods and unlocking capabilities to take advantage of preferential trade agreements in a scalable manner. We work closely with other supply chain teams, such as our upstream product, logistics, and distribution systems.

The Global Logistics and Trade Technology department is committed to unlocking business potential with our Global Fulfillment and broader Supply Chain partners. Supporting both vendor and proprietary systems, the teams provide solutions that enable our partners to streamline the lifecycle of product movement: from our manufacturers to our DCs, from our DCs to our stores and guests, and across international trade, spanning logistics and settlement activities including but not limited to product classification, import/export process management, and commercial invoice information.

Additional Skills Tags

Data warehouse

Additional Skills & Qualifications

  • 7+ years of software engineering experience with a focus on cloud data engineering and analysis
  • Highly proficient in SQL and Python programming and modern development tools (e.g., Git, Databricks)
  • Experience with the Snowflake and Power BI (PBI) platforms
  • Familiarity with structured and unstructured data types; experience with cloud data architectures and the ability to develop model data sets from multiple internal data sources
  • Experienced with infrastructure as code and multi-cloud strategy
  • Proficiency in CI/CD processes and DevOps practices
  • Experience with Kafka, API integration patterns, and Apigee
  • Self-starter and has pride in taking initiative
  • Excellent project management and problem-solving skills
  • Strong business acumen and written and verbal skills
  • Challenges the status quo, champions change and influences others to change
  • Possesses an entrepreneurial spirit and continuously innovates to achieve great results
  • Strong time management skills and ability to prioritize tasks
  • Integrates fun and joy as a way of being and working, aka doesn’t take themselves too seriously
  • Strong knowledge of the ITIL Framework (Incident, Problem, Change, Release Management)

Start Date

Jul-13-2025

Interview Information

2 rounds

Business Challenge

Support the trade systems team as they navigate changes to tariffs.

