Data Engineer

United Kingdom

London

Hybrid

GBP 50,000 - 90,000

Full time

30+ days ago

Job summary

Join an innovative firm as a Data Engineer, where you'll tackle large-scale data challenges and build cutting-edge data pipelines. This role offers the chance to work with massive datasets and collaborate with diverse teams in a fast-paced environment. You'll contribute to a new approach in digital marketing by harnessing big data and machine learning, making a significant impact on the company's growth. If you're passionate about technology and eager to be part of a dynamic team that values creativity and fun, this opportunity is perfect for you.

Benefits

Healthcare insurance & cash plans
Pension
Parental Leave Policies
Learning & Development Program
Wellness Resources
Equity
Annual Leave Entitlement
Paid holidays
Team events

Qualifications

  • Advanced degree in computer science or related field required.
  • Solid programming skills in Python and SQL are essential.

Responsibilities

  • Design and implement data engineering systems for high-volume data.
  • Work with teams to translate market needs into solutions.

Skills

Python
SQL
Data Engineering
Performance Optimization
Commercial Mindset

Education

Bachelor's Degree in Computer Science
Master's Degree in Computer Science

Tools

Google Cloud Platform
Apache Airflow
Dataflow
BigQuery
Apache Druid
Elasticsearch
Terraform
DBT
Looker

Job description

Skimlinks, a Connexity and Taboola company, drives e-commerce success for 50% of the Internet’s largest online retailers. We deliver $2B in annual sales by connecting retailers to shoppers on the most desirable retail content channels. As a pioneer in online advertising and campaign technology, Connexity is constantly iterating on products, solving problems for retailers, and building interest in new solutions.

We have recently been acquired by Taboola to build the first Open-Web source for publishers, connecting editorial content to product recommendations so that readers can easily buy products related to the stories they are reading.

Due to our explosive growth, we are seeking an experienced Data Engineer to help with a variety of large-scale data challenges and build pipelines that can analyze the torrent of data we collect. Our ideal candidate enjoys a fast-paced environment and brings a positive, collaborative attitude to working across multiple complex projects and teams.

About the role

We are looking for a Data Engineer to join our team in London. We are creating a fundamentally new approach to digital marketing, combining big data with large-scale machine learning. Our data sets are on a truly massive scale - we collect data on over a billion users per month and analyze the content of hundreds of millions of documents a day.

As a member of our Data Platform team your responsibilities will include:

  • Designing, building and implementing data engineering systems across all parts of our data platform, from high-volume data collection, scheduling and enrichment through to automated analysis of large datasets.
  • Working with our Product Managers and commercial teams to understand the needs of the market and our customers, and how those needs can be translated into solutions.
  • Crafting innovative solutions to complex technical problems and making design decisions in line with our technical strategy and high engineering standards.
  • Ensuring adherence to software development best practices.
  • Sharing your knowledge across the business and mentoring others in your areas of deep technical expertise.

Requirements

Here at Skimlinks we value dedication, enthusiasm, and a love of innovation. We are disrupting the online monetization industry and welcome candidates who want to be part of this ambitious journey. It's not all hard work, though; we definitely appreciate a bit of quirkiness and fun along the way.

We’re looking for a Data Engineer with the following:

  • An advanced degree (Bachelor's/Master's) in computer science or a related field.
  • Solid programming skills in both Python and SQL.
  • Proven work experience in Google Cloud Platform or other clouds, developing batch (Apache Airflow) and streaming (Dataflow) scalable data pipelines.
  • Experience processing large datasets at scale (BigQuery, Apache Druid, Elasticsearch).
  • Familiarity with Terraform, DBT & Looker is a plus.
  • A passion for performance optimization and cost reduction.
  • A commercial mindset; you are passionate about creating outstanding products.

Voted one of the “Best Places to Work,” we have a culture driven by self-starters, team players, and visionaries. Headquartered in Los Angeles, California, the company operates sites and business services in the US, UK, and EU. We offer top benefits including Annual Leave Entitlement, paid holidays, competitive compensation, team events, and more!

  • Healthcare insurance & cash plans.
  • Pension.
  • Parental Leave Policies.
  • Learning & Development Program (educational tool).
  • Wellness Resources.
  • Equity.

We are committed to providing a culture at Connexity that supports the diversity, equity, and inclusion of our most valuable asset, our people. We encourage individuality and are driven to represent a workplace that celebrates our differences, and provides opportunities equally across gender, race, religion, sexual orientation, and all other demographics. Our actions across Education, Recruitment, Retention, and Volunteering reflect our core company values and remind us that we’re all in this together to drive positive change in our industry.

This position is hybrid (1-2 days/week) and based in our London office.
