Database Administration (Level 3)

LA International Computer Consultants Ltd

Scotland

On-site

GBP 60,000 - 80,000

Full time

21 days ago

Job summary

A leading technology consultancy is seeking a Database Administrator (Level 3) for a 6-month contract. The successful candidate will work on PostgreSQL and Snowflake databases, with responsibilities including database provisioning, performance tuning, and implementing data ingestion pipelines. The position requires strong scripting skills in Shell, Python, or Ansible, along with cloud computing experience. Competitive daily rate offered, with a maximum of 3 days onsite in Glasgow.

Qualifications

  • Extensive experience with PostgreSQL, Snowflake, and Greenplum databases.
  • Proficient in building data ingestion pipelines using tools like Informatica and Talend.
  • Strong knowledge of database performance tuning and optimization.

Responsibilities

  • Manage the day-to-day activities such as database provisioning, maintenance, and upgrades.
  • Perform data migrations using gpcopy, minimizing downtime.
  • Develop and maintain backup strategies to ensure data integrity.
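The backup-strategy bullet above lends itself to a small worked example. The following is a minimal, hypothetical Python sketch of one retention rule (keep the newest N backups); the dates and the keep-7 policy are invented for illustration and are not part of this role's specification:

```python
# Hypothetical sketch of a retention rule within a backup strategy:
# keep the most recent `keep` backups and report the rest as candidates
# for deletion. A real rotation scheme would also use weekly/monthly
# tiers and verify restores before pruning anything.
from datetime import date

def backups_to_prune(backup_dates, keep=7):
    """Return the backup dates that fall outside the most recent `keep`."""
    ordered = sorted(backup_dates, reverse=True)  # newest first
    return sorted(ordered[keep:])  # oldest ones are the prune candidates

# Example: ten daily backups (dates invented for illustration).
dates = [date(2024, 5, d) for d in range(1, 11)]
print(backups_to_prune(dates, keep=7))
# [datetime.date(2024, 5, 1), datetime.date(2024, 5, 2), datetime.date(2024, 5, 3)]
```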

Skills

PostgreSQL
Snowflake
Greenplum
Data ingestion pipelines
SQL
Patroni
Shell scripting
Python scripting
Ansible
Cloud computing

Job description

Database Administrator (Level 3) – 6-month contract initially – Based: max 3 days onsite in Glasgow – Rate: £450 - £550 p/d (via umbrella company)

Job Overview

We have a great opportunity with a world-leading organisation, where you will be provided with all the support and development you need to succeed: a progressive organisation where you can really make a difference.

Key Responsibilities
  • Extensive working knowledge of PostgreSQL, Snowflake, and Greenplum databases.
  • Snowflake internals and integration with other data processing technologies.
  • Data lakes, data structures, and data models suited to Snowflake architecture.
  • Snowflake modelling – roles, schemas, and databases.
  • Experience building data ingestion pipelines with tools like Informatica, Talend, etc.
  • Manage data from various source formats such as JSON, XML, and CSV.
  • Expertise in Patroni for HA/DR (high availability and disaster recovery) and streaming replication.
  • Strong knowledge of Postgres/Greenplum database backup and recovery strategies.
  • Performance tuning and optimization for Postgres, Snowflake, and Greenplum.
  • Develop and maintain backup strategies to ensure data integrity and availability.
  • Manage day‑to‑day activities such as database provisioning, maintenance, and upgrades in Postgres.
  • Perform data migrations using gpcopy and other tools, minimizing downtime and data loss.
  • Manage database bloat and leverage statistics for performance optimization.
  • Understand and utilize GPText for troubleshooting complex issues.
  • Leverage Greenplum utilities like gpload, pxf, and GP Spark for efficient data loading and integration.
  • Ensure database security, compliance, and adherence to best practices.
  • Knowledge of ETL processes and their implementation.
  • Knowledge of SQL and complex query writing.
  • Implement monitoring solutions to proactively identify and resolve issues.
  • Develop scripts in Shell, Python, or Ansible to automate routine database tasks and improve operational efficiency.
  • Cloud computing experience with AWS, Microsoft Azure, or Google Cloud.
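As a sketch of the scripting the responsibilities above describe, here is a minimal, self-contained Python example that flags bloated tables from PostgreSQL-style tuple statistics. The function name, threshold, and sample numbers are illustrative assumptions; in practice the figures would come from `pg_stat_user_tables`:

```python
# Hypothetical sketch: flag tables whose dead-tuple ratio suggests bloat,
# the kind of routine check this role would automate in Shell/Python/Ansible.
# In production the statistics would be queried from pg_stat_user_tables;
# here they are passed in as plain dicts so the example is self-contained.

def tables_needing_vacuum(stats, dead_ratio_threshold=0.2):
    """Return names of tables whose dead/live tuple ratio exceeds the threshold.

    stats: iterable of dicts with keys 'relname', 'n_live_tup', 'n_dead_tup'.
    """
    flagged = []
    for row in stats:
        live = row["n_live_tup"]
        dead = row["n_dead_tup"]
        total = live + dead
        if total == 0:
            continue  # empty table, nothing to vacuum
        if dead / total > dead_ratio_threshold:
            flagged.append(row["relname"])
    return flagged

# Example statistics (invented for illustration):
sample = [
    {"relname": "orders", "n_live_tup": 10_000, "n_dead_tup": 4_000},
    {"relname": "customers", "n_live_tup": 50_000, "n_dead_tup": 1_000},
]
print(tables_needing_vacuum(sample))  # ['orders']
```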

Key Skills & Experience
  • Database provisioning, maintenance, and upgrades for PostgreSQL.
  • Data migrations using gpcopy.
  • Performance optimization leveraging statistics.
  • Manage database bloat; understand and use GPText for troubleshooting.
  • Experience with Greenplum utilities like gpload, pxf, and GP Spark.
  • Data lakes, data structures, and models suited to Snowflake architecture.
  • Snowflake modelling – roles, schemas, and databases.
  • SQL and complex query writing.
  • Script development in Shell, Python, or Ansible.
  • Expertise in Patroni for HADR and streaming replication.

Desirable Skills / Knowledge / Experience
  • ETL processes and implementation.
  • Building data ingestion pipelines with Informatica, Talend, etc.
  • Cloud computing experience with AWS, Microsoft Azure, or Google Cloud.
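To illustrate the multi-format data handling listed above, here is a minimal, hypothetical Python sketch that normalises JSON and CSV payloads to a common row shape using only the standard library. The field names are invented; a real pipeline here would typically be built with Informatica or Talend:

```python
# Hypothetical sketch of one normalisation step in a data ingestion
# pipeline: records arrive as JSON or CSV text and are reduced to a
# common list-of-dicts shape before loading. Field names are invented
# for illustration only.
import csv
import io
import json

def normalise(payload, fmt):
    """Parse a JSON or CSV payload into a list of row dicts."""
    if fmt == "json":
        data = json.loads(payload)
        return data if isinstance(data, list) else [data]
    if fmt == "csv":
        return list(csv.DictReader(io.StringIO(payload)))
    raise ValueError(f"unsupported format: {fmt}")

json_rows = normalise('[{"id": "1", "name": "Ada"}]', "json")
csv_rows = normalise("id,name\n2,Grace\n", "csv")
print(json_rows + csv_rows)
# [{'id': '1', 'name': 'Ada'}, {'id': '2', 'name': 'Grace'}]
```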

We welcome applications from all sections of the community and people with diverse experience and backgrounds.
