Senior Data Engineer (GCP)

Ipsos

London Borough of Harrow

Hybrid

GBP 45,000 - 65,000

Full time

8 days ago

Job summary

Ipsos is seeking a Data Engineer for the Route project, focusing on developing and maintaining data infrastructure. This role will be critical in enhancing audience measurement through innovative data-driven solutions in a hybrid work environment, providing excellent benefits and opportunities for professional growth.

Benefits

25 days annual leave
Pension contribution
Income protection
Life assurance
Health & wellbeing benefits
Professional development opportunities

Qualifications

  • Minimum of 3 years' relevant commercial experience.
  • Experience with scalable data pipelines, SQL on BigQuery, and Python libraries like Pandas.
  • Familiarity with containerisation and orchestration.

Responsibilities

  • Develop robust data pipelines and optimise data storage solutions on GCP.
  • Implement ETL processes and CI/CD pipelines.
  • Collaborate with data scientists to integrate models into production.

Skills

Data pipelines
ETL processes
Cloud storage systems
APIs
Data quality
AI/ML model deployment
Excellent communication

Tools

GCP
Docker
Kubernetes
Terraform

Job description

The Audience Measurement team at Ipsos uses a deep understanding of people to make sense of audiences and how they consume media. We use these insights to influence media strategy, helping clients answer crucial questions such as how to target audiences, maximise attention across platforms, enhance audience experience, and demonstrate or increase audience value.

We are recruiting a Data Engineer for one of our flagship accounts, Out Of Home – Route.

Your work will be essential to the Route project's success. By building robust, scalable infrastructure and acting as the linchpin between the data platform and data science teams in an ‘advisory-like’ role, you will enable data scientists to focus on research and innovation in synthetic data, enhancing data-driven solutions for clients and for Ipsos. You will often support the productionisation and scaling of locally developed ML models, guiding the data science teams and providing guardrails as they develop and iterate.

What will I be doing?

As a Data Engineer on the ground-breaking Route project, you'll develop and maintain the data infrastructure for a first-of-its-kind synthetic travel survey, generating audience figures for Out-of-Home media in Great Britain. Reporting to a Principal Data and Platform Engineer, you will collaborate with data scientists to design, build, and optimise data pipelines for high-quality audience measurement data.

Your key responsibilities will include:

  • Develop robust, scalable data pipelines and optimise data storage solutions on GCP.
  • Implement ETL processes and CI/CD pipelines to ensure clean, structured data ready for use (a minimal sketch follows this list).
  • Work with data scientists to integrate synthetic models into production environments and provide the guardrails and advice as they develop.
  • Provide technical support, troubleshoot issues, and research new technologies to enhance capabilities.
  • Document pipelines and the platform, including architectures and user guides, helping to enforce data management standards.
  • Participate in agile ceremonies and provide occasional client interaction.
  • Engage in DataOps practices and improve data delivery performance.
You will work across the following GCP services: GCS, BigQuery, GKE, Artifact Registry, Vertex AI, App Engine, Datastore, Secret Manager and Pub/Sub.
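
For illustration, here is a minimal sketch of what one such pipeline step might look like in Python, using Pandas and the google-cloud-bigquery client. The function, table path and column names are hypothetical assumptions for illustration, not details of the role:

    import pandas as pd
    from google.cloud import bigquery

    def load_survey_batch(df: pd.DataFrame, table_id: str) -> None:
        """Apply basic quality checks, then append the batch to BigQuery."""
        # Drop rows missing key fields and de-duplicate (hypothetical schema).
        df = df.dropna(subset=["respondent_id", "timestamp"])
        df = df.drop_duplicates(subset=["respondent_id", "timestamp"])

        client = bigquery.Client()  # Credentials come from the environment.
        job = client.load_table_from_dataframe(
            df,
            table_id,
            job_config=bigquery.LoadJobConfig(write_disposition="WRITE_APPEND"),
        )
        job.result()  # Wait for the load to finish; raises on failure.

    # Example usage (table path is hypothetical):
    # load_survey_batch(batch_df, "my-project.route_dataset.survey_responses")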

What do I need to bring with me?

It is essential that your personal attributes complement your technical skills. To be successful in this role you will need the following skills and experience:

  • A minimum of 3 years' relevant commercial experience, including building scalable data pipelines using Argo on GKE, SQL on BigQuery, and Python libraries such as Pandas.
  • Comfortable with APIs and Cloud storage systems.
  • Experience with containerisation (Docker) and orchestration (Kubernetes).
  • Familiarity with Terraform and data systems optimisation.
  • Commitment to data quality, and experience with synthetic data and AI/ML model deployment (see the sketch after this list).
  • Excellent communication skills and a collaborative and positive mindset.
  • Willingness to learn.
  • GCP certifications are a plus.
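
To give a flavour of the guardrails mentioned above, here is a minimal, hypothetical sketch of validating synthetic-model output before publishing it downstream with the google-cloud-pubsub client. The schema, thresholds, project and topic names are all illustrative assumptions:

    import json
    import pandas as pd
    from google.cloud import pubsub_v1

    REQUIRED_COLUMNS = {"panel_id", "location", "audience_estimate"}  # hypothetical schema

    def publish_if_valid(predictions: pd.DataFrame, project: str, topic: str) -> None:
        """Reject malformed model output, then publish accepted rows as JSON."""
        missing = REQUIRED_COLUMNS - set(predictions.columns)
        if missing:
            raise ValueError(f"Model output missing columns: {missing}")
        if (predictions["audience_estimate"] < 0).any():
            raise ValueError("Audience estimates must be non-negative")

        publisher = pubsub_v1.PublisherClient()
        topic_path = publisher.topic_path(project, topic)
        for record in predictions.to_dict(orient="records"):
            # publish() expects bytes; result() surfaces errors immediately.
            publisher.publish(topic_path, json.dumps(record).encode("utf-8")).result()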

We offer a comprehensive benefits package designed to support you as an individual. Our standard benefits include 25 days' annual leave, pension contribution, income protection and life assurance. In addition, there is a range of health & wellbeing and financial benefits, as well as professional development opportunities.

We realise you may have commitments outside of work and will consider flexible working applications - please highlight what you are looking for when you make your application. We have a hybrid approach to work and ask people to be in the office or with clients for 3 days per week.

We are committed to equality, treating people fairly, promoting a positive and inclusive working environment and ensuring we have diversity of people and views. We recognise that this is important for our business success - a more diverse workforce will enable us to better reflect and understand the world we research and ultimately deliver better research and insight to our clients. We are proud to be a member of the Disability Confident scheme, certified as Level 1 Disability Confident Committed. We are dedicated to providing an inclusive and accessible recruitment process.

