- Explore and analyse millions of rows of tabular data to uncover meaningful insights and build advanced machine-learning models.
- Design and implement causal analysis models to assess the impact of system performance on user experience, providing clear, actionable insights into customer behaviour.
- Develop and deploy machine learning models and workflows, transforming terabytes of traffic data into actionable insights that drive key business decisions.
- Lead the development and deployment of models, ensuring robustness, scalability, and reliability in production environments.
- Build automated solutions for business needs, such as bot detection using advanced machine learning and statistical methods.
- Collaborate closely with product owners, engineers, and other stakeholders to translate analytical findings into impactful features and product improvements.
- Take ownership of technical direction, contribute to architectural decisions, identify technical debt, and advocate for opportunities for improvement.
- Mentor and support junior data scientists, enhancing team productivity, improving code quality, and fostering a culture of collaboration and learning.
- Continuously monitor and enhance model performance in partnership with the engineering team, improving model impact on user experience and system effectiveness.
Qualifications
- A degree in Engineering, Computer Science, Mathematics, or another quantitative field.
- 10+ years of demonstrable experience in data science or related data roles, including at least 3 years in a lead or senior individual-contributor role.
- Expertise in causal analysis methods (e.g., propensity score matching, A/B testing, uplift modeling), with a demonstrated ability to analyse tabular data.
- Strong experience in Python (including Pandas, NumPy, and Scikit-Learn) for data processing and machine learning model construction.
- Proficiency in SQL, with the ability to write complex queries and optimise data retrieval from relational databases.
- Strong communication skills, with the ability to articulate complex technical concepts to non-technical stakeholders.
- Familiarity with big data technologies such as Spark or Snowpark for processing and analysing large datasets efficiently.
- Hands-on experience working with Snowflake, particularly using Snowpark for scalable data engineering and machine learning workflows.
- Experience with AWS services (e.g., S3, Lambda, EC2) for managing machine learning infrastructure and deploying models in a cloud-native environment.
- Hands-on experience with data visualisation tools like Plotly, Seaborn, or other Python-based libraries to convey data insights effectively.
- Familiarity with data pipeline orchestration tools (e.g., Airflow, Luigi) to manage ETL/ELT workflows.
- Ability to operate in a fast-paced, dynamic environment, effectively prioritising multiple projects with competing deadlines.
Additional Information
- All Insights team members are expected to travel one to two times per year for team meetings and events.