Senior Analytics Engineer

Alpaca

Remote

CAD 90,000 - 120,000

Full time

Job summary

A leading financial technology company is seeking an experienced Analytics Engineer to define and execute the vision for its data transformation layer. You will work remotely with Data Engineers and Data Scientists to develop scalable data models that support diverse business needs. Ideal candidates have 4+ years of experience in analytics engineering, proficiency in SQL and dbt, and strong technical versatility. The position offers a competitive salary and benefits, including a home-office setup allowance and a monthly stipend.

Qualifications

  • 4+ years of analytics or data engineering experience with a focus on transformation.
  • Proficiency in creating and maintaining scalable data models.
  • Expert-level SQL skills and experience with dbt.

Responsibilities

  • Design and maintain scalable data models using dbt and SQL.
  • Collaborate with various teams to deliver reliable data products.
  • Implement improvements to data warehouse performance.

Skills

Analytics engineering
SQL
dbt
Python

Tools

Airflow
Postgres
GCP

Job description

Our Team Members

We're a dynamic team of 230+ globally distributed members who thrive working from our favorite places around the world, with teammates spanning the USA, Canada, Japan, Hungary, Nigeria, Brazil, the UK, and beyond! We are searching for passionate individuals eager to contribute to Alpaca's rapid growth. If you align with our core values—Stay Curious, Have Empathy, and Be Accountable—and are ready to make a significant impact, we encourage you to apply.

About the Role

We are seeking an Analytics Engineer to own and execute the vision for our data transformation layer. You will be at the heart of our data platform, which processes hundreds of millions of events daily from a wide array of sources, including transactional databases, API logs, CRMs, payment systems, and marketing platforms. You will join our 100% remote team and work closely with Data Engineers, Data Scientists, and the Business Users who consume your data models. Your primary responsibility will be to use dbt and Trino on our GCP-based, open-source data infrastructure to build robust, scalable data models. These models are critical for stakeholders across the company, from finance and operations to the executive team, and are delivered via BI tools, reports, and reverse ETL systems.
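
For a sense of what that transformation work looks like in practice, here is a minimal sketch of a dbt staging model, the kind of building block this role would own. The source, model, and column names are hypothetical illustrations, not Alpaca's actual schema:

    -- models/staging/stg_events.sql (hypothetical names; illustrative only)
    -- Cleans raw API events into a typed, deduplicated staging model.
    {{ config(materialized='view') }}

    with ranked as (
        select
            *,
            -- keep only the latest record per event_id
            row_number() over (partition by event_id order by occurred_at desc) as rn
        from {{ source('raw', 'api_events') }}
    )

    select
        event_id,
        account_id,
        lower(event_type)                 as event_type,
        cast(occurred_at as timestamp(6)) as occurred_at,
        cast(amount as decimal(18, 2))    as amount_usd  -- exact decimal, not float
    from ranked
    where rn = 1

Downstream marts would then ref() this model, which is how dbt builds the dependency graph across the transformation layer.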

What You'll Do
  • Own the Transformation Layer: Design, build, and maintain scalable data models using dbt and SQL to support diverse business needs, from monthly financial reporting to near-real-time operational metrics.
  • Set Technical Standards: Establish and enforce best practices for data modelling, development, testing, and monitoring to ensure data quality, integrity (up to cent-level precision), and discoverability; see the test sketch after this list.
  • Enable Stakeholders: Collaborate directly with finance, operations, customer success, and marketing teams to understand their requirements and deliver reliable data products.
  • Integrate and Deliver: Create repeatable patterns for integrating our data models with BI tools and reverse ETL processes, enabling consistent metric reporting across the business.
  • Ensure Quality: Champion high standards for development, including robust change management, source control, code reviews, and data monitoring as our products and data evolve.
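
As a sketch of the monitoring and cent-level integrity standards described above: in dbt, a singular test is simply a SQL file that returns the rows violating an expectation, and the build fails if any come back. The model and column names here are hypothetical:

    -- tests/assert_ledger_balances_to_the_cent.sql (hypothetical names)
    -- Fails the dbt build if any account's debits and credits do not net
    -- out exactly at decimal (cent-level) precision.
    with balances as (
        select
            account_id,
            sum(case when direction = 'debit'  then amount_usd else 0 end) as debits,
            sum(case when direction = 'credit' then amount_usd else 0 end) as credits
        from {{ ref('fct_ledger_entries') }}
        group by account_id
    )

    select *
    from balances
    where debits <> credits
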
What You Need (Must-Haves)
  • 4+ years of experience in analytics engineering or data engineering with a strong focus on the "T" (transformation) in ELT.
  • Proven track record of owning data products end-to-end, applying analytics and data engineering best practices to ensure data quality, scalability, and robust data models.
  • Comfortable working with ambiguity and collaborating with stakeholders to define requirements; able to take ownership with minimal oversight in a fast-paced environment.
  • Experience proactively identifying and implementing improvements to data warehouse performance and ETL efficiency (see the incremental-model sketch after this list).
  • Technical Versatility: Expert-level SQL and dbt skills for complex queries and data transformations.
  • Proficiency in Python for transformations that extend beyond SQL.
  • Hands-on experience with query optimization across OLTP and OLAP systems (e.g., Postgres, Iceberg).
  • Proficiency with Semantic Layer modelling (e.g., Cube, dbt Semantic Layer).
  • Experience owning CI/CD workflows and establishing team-wide standards for version control and code review (e.g., Git).
  • Familiarity with cloud environments (GCP or AWS).
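
To illustrate the performance point above: in a dbt-on-Trino stack like the one this posting describes, tuning often means incremental models with explicit partitioning, so that date-filtered OLAP queries prune partitions instead of scanning whole tables. This sketch assumes a dbt-trino project writing to Iceberg, with the table properties following dbt-trino conventions; all names are hypothetical:

    -- models/marts/fct_daily_activity.sql (hypothetical; dbt-trino + Iceberg assumed)
    -- Incremental build: only recent days are reprocessed, and the table is
    -- partitioned by activity_date so date-filtered queries prune partitions.
    {{ config(
        materialized='incremental',
        incremental_strategy='merge',
        unique_key=['activity_date', 'account_id'],
        properties={'partitioning': "ARRAY['activity_date']"}
    ) }}

    select
        cast(date_trunc('day', occurred_at) as date) as activity_date,
        account_id,
        count(*)                                     as event_count,
        sum(amount_usd)                              as total_amount_usd
    from {{ ref('stg_events') }}
    {% if is_incremental() %}
    -- look back a few days to absorb late-arriving events; merge deduplicates
    where occurred_at >= current_timestamp - interval '3' day
    {% endif %}
    group by 1, 2
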
Nice to Haves
  • Experience with data ingestion tools (e.g., Airbyte) and orchestration tools (e.g., Airflow).
  • Domain experience in brokerage operations, or a passion for financial markets and modelling financial datasets.
How We Take Care of You
  • Competitive Salary & Stock Options
  • New Hire Home-Office Setup: One-time USD $500
  • Monthly Stipend: USD $150 per month via a Brex Card

Alpaca is proud to be an equal opportunity workplace dedicated to pursuing and hiring a diverse workforce.
