Data Engineer

Monark Group

White Rock

On-site

CAD 80,000 - 100,000

Full time

14 days ago

Job summary

A technology solutions company in White Rock is seeking a motivated Data Engineer with 2-3 years of experience. You will design, build, and optimize data pipelines and analytics workflows, collaborating with cross-functional teams to transform raw data into actionable insights. Knowledge of Apache Airflow, AWS, Snowflake, and Python is essential, with a focus on data warehousing and AI/ML model support.

Skills

Apache Airflow
AWS
Snowflake
Python
Streamlit
Data Modeling
AI/ML

Education

Bachelor's degree in Computer Science or related field

Job description

We are seeking a motivated Data Engineer with 2-3 years of experience to join our innovative team. In this role, you will design, build, and optimize data pipelines and analytics workflows using modern cloud data platforms and tools. You will collaborate with cross-functional teams to transform raw data into actionable insights that power intelligent product features, ML models, and strategic decision-making.

Responsibilities

  • Design and implement robust ETL pipelines using Apache Airflow (see the sketch after this list)
  • Build and maintain data warehouses in Snowflake to support scalable data ingestion and transformation
  • Develop ETL workflows for structured and semi-structured data from various sources using AWS services
  • Collaborate with data scientists and ML engineers to prepare and transform data for AI/ML model training and inference
  • Build interactive data applications using Streamlit
  • Design and implement data models to support machine learning workflows and analytics
  • Integrate data from APIs, databases, and cloud storage using Python
  • Implement data quality checks and monitoring systems to ensure data integrity
  • Document data models and pipeline architectures
  • Stay updated on advancements in Airflow, AWS, Snowflake, Streamlit, and AI/ML technologies
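
The first responsibility above centers on building ETL pipelines with Apache Airflow. As a rough sketch only (assuming Airflow 2.4 or later; the DAG id, schedule, and task logic are hypothetical placeholders, not details from this posting), such a pipeline might be wired up like this:

```python
# A minimal Airflow DAG sketch (Airflow 2.4+); dag_id, schedule, and
# task logic are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Pull raw records from a source system (placeholder logic).
    return [{"id": 1, "value": 42}]


def transform(**context):
    # Reshape the extracted records pulled from XCom.
    records = context["ti"].xcom_pull(task_ids="extract")
    return [{**r, "value": r["value"] * 2} for r in records]


def load(**context):
    # Write the transformed rows to the warehouse (placeholder).
    rows = context["ti"].xcom_pull(task_ids="transform")
    print(f"Loading {len(rows)} rows")


with DAG(
    dag_id="example_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> transform_task >> load_task
```

The extract/transform/load split and the `>>` dependency chaining are standard Airflow idioms; the actual pipelines in this role would depend on the team's sources and warehouse.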

Requirements

  • 2-3 years of professional experience in data engineering or a related field
  • Hands-on experience with Apache Airflow for building and orchestrating ETL pipelines
  • Practical experience with AWS services
  • Experience working with Snowflake for data warehousing workloads
  • Strong Python programming skills for data processing and automation
  • Experience with Streamlit for building data applications (see the sketch after this list)
  • Solid understanding of data modeling concepts
  • Familiarity with AI/ML workflows, including data preparation for model training
  • Bachelor's degree in Computer Science, Data Engineering, or a related technical field (or equivalent experience)
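
To illustrate the Streamlit requirement above, here is a minimal sketch of a data app; the CSV file and column names are hypothetical placeholders, and a real app would more likely read from Snowflake than a local file:

```python
# A minimal Streamlit data app sketch; the file and column names are
# hypothetical placeholders.
import pandas as pd
import streamlit as st

st.title("Daily Orders Dashboard")

# Load a prepared extract into a DataFrame.
df = pd.read_csv("daily_orders.csv", parse_dates=["order_date"])

# A simple interactive control: filter out small orders.
min_total = st.sidebar.slider("Minimum order total", 0, 500, 0)
filtered = df[df["order_total"] >= min_total]

st.metric("Orders shown", len(filtered))
st.line_chart(filtered.set_index("order_date")["order_total"])
st.dataframe(filtered)
```

Run with `streamlit run app.py`; Streamlit re-executes the script on each widget interaction, which is what makes the slider-driven filter work without any callback code.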

Preferred Qualifications

  • Advanced experience with AWS data services
  • Deep expertise in Snowflake optimization and performance tuning
  • Experience building complex Streamlit applications
  • Advanced Python skills for data wrangling and automation (see the Snowflake loading sketch after this list)
  • Experience with AI/ML model deployment and data modeling for machine learning
  • Knowledge of Airflow best practices for ETL pipeline design
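
As a sketch of the Snowflake-plus-Python work these qualifications describe, the snippet below bulk-loads a pandas DataFrame with the snowflake-connector-python package. The credentials, database, and table names are hypothetical placeholders, not details from this posting:

```python
# Sketch: bulk-load a pandas DataFrame into Snowflake. Credentials,
# database, and table names are hypothetical placeholders.
import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

df = pd.DataFrame({"ID": [1, 2], "VALUE": [10.5, 20.0]})

conn = snowflake.connector.connect(
    user="YOUR_USER",
    password="YOUR_PASSWORD",
    account="YOUR_ACCOUNT",
    warehouse="COMPUTE_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)
try:
    # write_pandas stages the frame and issues a COPY INTO for the load.
    success, n_chunks, n_rows, _ = write_pandas(
        conn, df, table_name="EXAMPLE_TABLE", auto_create_table=True
    )
    print(f"Loaded {n_rows} rows (success={success})")
finally:
    conn.close()
```

`write_pandas` is generally preferred over row-by-row INSERTs because it stages the data and bulk-loads it in one COPY INTO operation.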