Data Engineer

Cognizant

City Of London

Hybrid (3 days on-site)

GBP 65,000 - 85,000

Full time

30+ days ago

Job summary

A technology consulting firm in the City of London is seeking an experienced Data Engineer to design and implement robust data pipelines. The role involves developing ETL processes using GCP services, optimizing BigQuery models, and ensuring compliance with security standards. Candidates should have strong data engineering experience and hold relevant Google Cloud certifications. The position offers a hybrid work model with three days on-site.

Qualifications

  • Extensive experience in data engineering.
  • Strong hands-on experience with GCP.
  • Experience in cloud migration and real-time data processing is a plus.

Responsibilities

  • Design and implement robust ETL/ELT pipelines using GCP services.
  • Develop and maintain data models and marts in BigQuery.
  • Implement GCP security best practices including IAM, VPC Service Controls.
  • Provide architectural guidance for cloud migration.

Job description

Hybrid – 3 days on site

Key Responsibilities

Data Pipeline Development

  • Design and implement robust ETL/ELT pipelines using GCP services like Dataflow, Dataproc, Cloud Composer, and Data Fusion.

  • Automate data ingestion from diverse sources (APIs, databases, flat files) into BigQuery and Cloud Storage.

Data Modelling & Warehousing

  • Develop and maintain data models and marts in BigQuery.

  • Optimize data storage and retrieval for performance and cost efficiency.

Security & Compliance

  • Implement GCP security best practices including IAM, VPC Service Controls, and encryption.

  • Ensure compliance with GDPR, HIPAA, and other regulatory standards.

Monitoring & Optimization

  • Set up monitoring and alerting using Cloud Monitoring and Cloud Logging (formerly Stackdriver).

  • Create custom log metrics and dashboards for pipeline health and performance.

Collaboration & Support

  • Work closely with cross-functional teams to gather requirements and deliver data solutions.

  • Provide architectural guidance and support for cloud migration and modernization initiatives.

Skillset

Technical Skills

  • Languages: Python, SQL, Java (optional)

  • GCP Services: BigQuery, Dataflow, Dataproc, Cloud Storage, Cloud SQL, Cloud Functions, Composer (Airflow), App Engine

  • Tools: GitHub, Jenkins, Terraform, DBT, Apache Beam

  • Databases: Oracle, Postgres, MySQL, Snowflake (basic)

  • Orchestration: Airflow, Cloud Composer

  • Monitoring: Stackdriver, Logging & Alerting

Certifications

  • Google Cloud Certified – Professional Data Engineer

  • Google Cloud Certified – Associate Cloud Engineer

  • Google Cloud Certified – Professional Cloud Architect (optional)

Soft Skills

  • Strong analytical and problem-solving skills

  • Excellent communication and stakeholder management

  • Ability to work in Agile environments and manage multiple priorities

Experience Requirements

  • Extensive experience in data engineering

  • Strong hands-on experience with GCP

  • Experience in cloud migration and real-time data processing is a plus
