Data Engineer (AI Platforms), Canada

OpenVPN Inc.

Canada

Remote

CAD 80,000 - 110,000

Full time

4 days ago

Job summary

A leading tech company in Canada is seeking a skilled Data Engineer passionate about building data foundations for AI. You'll design and optimize data pipelines, collaborate with AI/ML teams, and help modernize our data structures. The role is fully remote, offering autonomy and a chance to significantly impact our data landscape. Ideal candidates will have 4+ years of data engineering experience, deep SQL expertise, and a self-managed mindset.

Benefits

Competitive pay rates
Fully remote work environment
Self-managed time off

Qualifications

  • 4+ years of experience in a data engineering role.
  • Experience working in a fully remote environment.
  • Ability to articulate complex technical concepts.

Responsibilities

  • Design and optimize data pipelines.
  • Collaborate with AI/ML teams to productionize models.
  • Lead the charge in migrating legacy systems.

Skills

Proven experience in data engineering
Deep expertise in SQL
Hands-on experience with Google BigQuery and Cloud SQL
Programming experience with Python or Java
Excellent communication skills

Tools

Google BigQuery
Cloud SQL
Docker
Apache Kafka
dbt

Job description

Are you passionate about building the data foundations for a new generation of AI? We're looking for a skilled Data Engineer to be a major contributor to our company's intelligent future. You won't just be maintaining systems; you'll be at the heart of building, scaling, and deploying the data and AI platforms that will redefine how we deliver data solutions.

This is an opportunity to make a significant impact by transforming our data landscape and enabling cutting-edge AI and agentic workflows.

We are a small, close-knit team, and we care deeply about you:

  • Competitive pay rates

  • Fully remote work environment

  • Self-managed time off

Important:

  • This is a full-time remote position on a long-term B2B contract.

What You'll Do
  • Design, build, and optimize robust, scalable data pipelines, leading the migration from legacy systems to our modern, AI-centric platform.

  • Evolve our data models and schemas to better support complex analytics, AI training, and fine-tuning workloads.

  • Collaborate with AI/ML teams to productionize models, streamline training data delivery, and support the development of sophisticated agentic systems.

  • Empower the organization by partnering with BI developers and analysts to design highly efficient queries and unlock new insights.

  • Champion data governance and compliance, ensuring our data handling practices remain secure and trustworthy as we innovate.

Challenges You'll Help Us Tackle
  • Modernize Our Data Backbone: Lead the charge in migrating our historical data flows to cutting-edge, AI-driven workflows.

  • Shape the Future of our AI: Redesign our datasets and schemas so they are well suited for training and fine-tuning next-generation models.

  • Build the Brains of the Operation: Play an important role in building the infrastructure that supports powerful, data-driven AI agents.

  • Scale with Intelligence: Help us build a data ecosystem that is not only powerful today but is ready for the demands of tomorrow's AI.

What You'll Bring
  • Proven experience (4+ years) in a data engineering role, with a track record of building and managing complex data systems.

  • Deep expertise in SQL and query optimization.

  • Hands-on experience with cloud data warehouses and databases, specifically Google BigQuery and Cloud SQL (PostgreSQL).

  • Programming experience with Python or Java.

  • A proactive, self-motivated, and self-managed mindset, perfect for a fully remote environment with a high degree of autonomy.

  • Excellent communication and documentation skills; you can clearly articulate complex technical concepts to diverse audiences.

  • The ability to work a flexible schedule and the readiness to respond to occasional off-hours emergencies.

Bonus Points For
  • AI/ML Tooling: Experience with Google's Vertex AI platform.

  • Programming Languages: Proficiency in Go.

  • Data Engineering Tools: Familiarity with dbt and Airflow.

  • Streaming Data: Familiarity with event-streaming platforms like Apache Kafka.

  • Streaming Analytics: Experience with real-time streaming analytics.

  • DevOps & Infrastructure: Experience with containerization (Docker) and serverless compute (Google Cloud Run).

  • Legacy Systems: Experience with Perl or PHP is a plus.
