Data Analyst 2 - R&D Engine

Menlo Ventures

Canada

On-site

CAD 70,000 - 90,000

Full time

Job summary

A technology investment firm is seeking an experienced Data Analyst to build and maintain data pipelines and dashboards while optimizing infrastructure costs. The role requires strong expertise in SQL, dbt, and Python, enabling high-frequency reporting and ad hoc analyses. Ideal candidates will have 3+ years of experience in analytical environments and a Bachelor's degree in a quantitative field. This position is based in Canada and offers a collaborative work environment.

Qualifications

  • 3+ years of experience working with large-scale data.
  • Proficiency in writing complex SQL queries.
  • Hands-on experience using dbt for data transformation.
  • Experience with a programming language for data exploration.
  • Strong grasp of data analyst best practices.

Responsibilities

  • Build and maintain data pipelines and dashboards.
  • Support reporting across various R&D initiatives.
  • Partner with stakeholders to define and optimize metrics.

Skills

SQL
dbt
Python
Data visualization

Education

Bachelor's degree in Data Science, Mathematics, Statistics, Computer Science, or related field

Tools

Looker
Tableau
Power BI

Job description

Join our R&D Engine team as a Data Analyst, where you will build and maintain core data pipelines and dashboards supporting a broad range of R&D metrics, with a primary focus on infrastructure cost optimization, alongside areas such as AI enablement, engineering productivity, and other critical initiatives. In this highly collaborative role, you'll work closely with engineers and finance leadership to define, track, and optimize the metrics that matter most for our organization's efficiency and innovation.

Bringing your expertise in SQL, dbt, and Python, you will ensure our team can deliver high-frequency reporting, provide ad hoc analyses, and enable rapid feedback loops for cost and operational effectiveness. Your work will multiply our team's impact across all R&D initiatives.

Key Responsibilities
  • Build, enhance, and maintain data pipelines and dashboards that drive transparency and optimization within our infrastructure cost program (across AWS, Azure, and related platforms)
  • Support reporting and data development across AI enablement, engineering productivity, product usage, and other R&D-focused initiatives
  • Partner with engineers and technical stakeholders to define, track, and optimize actionable metrics; participate in metric design, not just execution
  • Apply strong SQL, dbt, and Python skills to automate measurement, ensure data quality, and maintain reliable operational metrics

Qualifications
  • 3+ years of experience working with large-scale data in analytical or product-oriented environments, with a strong focus on data exploration, interpretation, and communication.
  • Proficiency in writing complex SQL queries and building clean, reliable data models to support reporting and analysis.
  • Hands-on experience using dbt (data build tool) to transform and organize data for downstream analytics workflows.
  • Strong grasp of data analyst best practices, including analysis reproducibility, validating results through testing and sanity checks, and communicating insights clearly to both technical and non-technical audiences.
  • Experience with at least one programming language (e.g., Python or R) for data exploration, statistical analysis, and automating reporting workflows; familiarity with Databricks is a plus.
  • Bachelor's degree in a quantitative field such as Data Science, Mathematics, Statistics, Computer Science, Information Systems, or a related discipline.
  • Demonstrated ability to translate ambiguous business questions into structured analyses and data models optimized for insight generation.
  • Strong track record of collaboration with cross-functional partners to deliver high-impact data solutions.

Preferred but not required:
  • MS degree in Data Science, Mathematics, Statistics, Computer Science, Information Systems, or a related field.
  • Experience integrating and analyzing cloud cost and operational datasets (AWS, Azure, Databricks).
  • Experience with predictive modeling or statistical analysis techniques to support deeper insights and forecasting.
  • Proficiency with business intelligence tools (e.g., Looker, Tableau, Power BI) to build intuitive dashboards and communicate insights effectively.

#LI-RT1

Abnormal AI is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability, protected veteran status, or other characteristics protected by law.
