Database Administrator (Level 3)

Onyx-Conseil

Beith

Hybrid

GBP 60,000 - 80,000

Full time

Posted yesterday

Job summary

A leading IT consultancy in the UK is looking for a Database Administrator (Level 3) for an initial 6-month contract. The role involves managing PostgreSQL, Snowflake, and Greenplum databases, with responsibilities including data migrations, performance tuning, and building data ingestion pipelines. Candidates must have extensive experience with these technologies and with scripting in Shell/Python/Ansible. On-site work in Glasgow is required for up to 3 days per week, and the role offers a competitive rate of £450 - £550 per day.

Qualifications

  • Extensive knowledge of PostgreSQL and Snowflake databases.
  • Ability to manage database provisioning, maintenance, and upgrades.
  • Experience with data migrations and performance optimization.

Responsibilities

  • Administer and manage PostgreSQL, Snowflake, and Greenplum databases.
  • Build data ingestion pipelines with tools like Informatica and Talend.
  • Ensure database security and adherence to best practices.

Skills

PostgreSQL
Snowflake
Greenplum
Shell scripting
Python programming
Ansible automation
ETL processes
Data ingestion
AWS
Microsoft Azure
Google Cloud

Job description

Database Administrator (Level 3)

6-month contract initially. Location: up to 3 days per week on-site in Glasgow. Rate: £450 - £550 per day (via umbrella company).

This is a great opportunity with a world-leading organisation where you will be given all the support and development you need to succeed, and a progressive organisation where you can really make a difference. We are looking for a number of Database Administrators (Level 3) to join the team.

Database Administration of Greenplum / PostgreSQL / Snowflake
  • Role 1: Snowflake DBA
  • Role 2: Greenplum/PostgreSQL DBA (Snowflake knowledge is good to have)
Key Responsibilities
  • The candidate will have extensive working knowledge of PostgreSQL, Snowflake, and Greenplum databases.
  • Snowflake internals and integration with other data processing technologies.
  • Data lakes, data structures and data models suited to Snowflake architecture.
  • Snowflake modeling – roles, schemas and databases.
  • Experience building data ingestion pipelines with tools such as Informatica and Talend.
  • Effective management of data from various sources such as JSON, XML, and CSV.
  • Expertise in Patroni for HADR and streaming replication.
  • Strong knowledge of Postgres/Greenplum database backup and recovery strategies.
  • Proven experience in performance tuning and optimization for Postgres/Snowflake and Greenplum.
  • Develop and maintain backup strategies to ensure data integrity and availability.
  • Manage day‑to‑day activities such as database provisioning, maintenance, and upgrades in Postgres.
  • Perform data migrations using gpcopy and other tools, minimizing downtime and data loss.
  • Manage database bloat and leverage statistics for performance optimization.
  • Understand and utilize GPTEXT for troubleshooting complex issues.
  • Leverage Greenplum utilities such as gpload, pxf, and GP Spark for efficient data loading and integration.
  • Ensure database security, compliance, and adherence to best practices.
  • Knowledge of ETL processes and their implementation.
  • Knowledge of SQL and complex query writing.
  • Implement monitoring solutions to proactively identify and resolve issues.
  • Develop scripts in Shell/Python/Ansible to automate routine database tasks and improve operational efficiency (a minimal sketch of this kind of check follows this list).
  • Cloud computing experience with AWS, Microsoft Azure, and Google Cloud.
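
Several of the points above (bloat management, monitoring, and Shell/Python/Ansible scripting) come down to small operational scripts. Purely as an illustration of that kind of routine check, with connection details and thresholds that are placeholders rather than anything taken from the advert, a Python script using psycopg2 might report dead-tuple bloat and streaming-replication lag like this:

    # Hypothetical sketch: a routine health check for table bloat and replication lag.
    # Connection details and thresholds are placeholders, not taken from the advert.
    import psycopg2

    BLOAT_QUERY = """
        SELECT schemaname, relname, n_live_tup, n_dead_tup
        FROM pg_stat_user_tables
        WHERE n_dead_tup > 10000          -- arbitrary placeholder threshold
        ORDER BY n_dead_tup DESC
        LIMIT 10;
    """

    LAG_QUERY = """
        SELECT application_name,
               pg_wal_lsn_diff(pg_current_wal_lsn(), replay_lsn) AS replay_lag_bytes
        FROM pg_stat_replication;
    """

    def main():
        # A real script would read credentials from a config file or secrets store.
        conn = psycopg2.connect(host="db.example.internal", dbname="postgres",
                                user="monitor", password="change-me")
        try:
            with conn.cursor() as cur:
                cur.execute(BLOAT_QUERY)
                for schema, table, live, dead in cur.fetchall():
                    ratio = dead / max(live + dead, 1)
                    print(f"{schema}.{table}: {dead} dead tuples ({ratio:.0%} of rows)")

                cur.execute(LAG_QUERY)
                for name, lag_bytes in cur.fetchall():
                    print(f"replica {name}: {lag_bytes} bytes behind the primary")
        finally:
            conn.close()

    if __name__ == "__main__":
        main()

In practice a check like this would feed a monitoring system or schedule VACUUM and other maintenance work rather than print to standard output.
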
Key Skills & Experience
  • Extensive working knowledge of PostgreSQL, Snowflake, and Greenplum databases.
  • For Postgres: database provisioning, maintenance, and upgrades.
  • For Greenplum
    • Data Migrations using gpcopy.
    • Performance optimization leveraging statistics.
    • Managing database bloat, and understanding and using GPTEXT to troubleshoot complex issues.
    • Experience with Greenplum utilities such as gpload, pxf, and GP Spark for efficient data loading and integration.
  • For Snowflake
    • Data lakes, data structures and data models suited to Snowflake architecture.
    • Snowflake modeling – roles, schemas, and databases (a minimal sketch follows this list).
    • SQL and complex query writing.
    • Expertise in Shell/Python/Ansible scripting.
    • Expertise in Patroni for HADR and streaming replication.
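
The Snowflake side of the role centres on modelling with roles, schemas, and databases. As a minimal sketch only, with an invented account identifier, credentials, and object names, provisioning a database, a schema, and a read-only role through the snowflake-connector-python package could look like this:

    # Hypothetical sketch: creating a database, schema and read-only role in Snowflake.
    # The account identifier, credentials and object names are invented for illustration.
    import snowflake.connector

    STATEMENTS = [
        "CREATE DATABASE IF NOT EXISTS ANALYTICS_DB",
        "CREATE SCHEMA IF NOT EXISTS ANALYTICS_DB.RAW",
        "CREATE ROLE IF NOT EXISTS ANALYST_RO",
        "GRANT USAGE ON DATABASE ANALYTICS_DB TO ROLE ANALYST_RO",
        "GRANT USAGE ON SCHEMA ANALYTICS_DB.RAW TO ROLE ANALYST_RO",
        "GRANT SELECT ON ALL TABLES IN SCHEMA ANALYTICS_DB.RAW TO ROLE ANALYST_RO",
    ]

    def main():
        conn = snowflake.connector.connect(
            account="xy12345.eu-west-2",   # placeholder account identifier
            user="dba_user",
            password="change-me",
            role="ACCOUNTADMIN",           # placeholder; see the note on role separation below
            warehouse="ADMIN_WH",
        )
        try:
            cur = conn.cursor()
            for statement in STATEMENTS:
                cur.execute(statement)
                print(f"applied: {statement}")
        finally:
            conn.close()

    if __name__ == "__main__":
        main()

In a real deployment, object creation and role/grant administration would normally be split between SYSADMIN and SECURITYADMIN rather than run under a single role.
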
Desirable skills/knowledge/experience
  • ETL processes and their implementation.
  • Building data ingestion pipelines with tools such as Informatica and Talend (a simplified sketch follows this list).
  • Cloud computing experience with AWS, Microsoft Azure, and Google Cloud.
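
The ingestion pipelines above are described in terms of Informatica and Talend; purely as a simplified stand-in for the load step such tools perform, and with an invented file path, table name, and connection details, a plain Python sketch that bulk-loads a CSV extract into a Postgres staging table with COPY might look like this:

    # Hypothetical sketch: bulk-load a CSV extract into a Postgres staging table.
    # The file path, table name and connection details are invented for illustration.
    import psycopg2

    DDL = """
        CREATE TABLE IF NOT EXISTS staging.customer_feed (
            customer_id bigint,
            full_name   text,
            signup_date date
        );
    """

    COPY_SQL = "COPY staging.customer_feed FROM STDIN WITH (FORMAT csv, HEADER true)"

    def main():
        conn = psycopg2.connect(host="db.example.internal", dbname="warehouse",
                                user="etl", password="change-me")
        try:
            # The connection context manager commits on success and rolls back on error.
            with conn, conn.cursor() as cur:
                cur.execute("CREATE SCHEMA IF NOT EXISTS staging;")
                cur.execute(DDL)
                with open("/data/inbound/customer_feed.csv") as source:
                    cur.copy_expert(COPY_SQL, source)
        finally:
            conn.close()

    if __name__ == "__main__":
        main()

A tool-based pipeline would add scheduling, validation, and error handling around this core load step.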

This is an excellent opportunity on a great project of work. If you are looking for your next exciting opportunity, apply now and your CV will reach me directly; we will respond as soon as possible.

About LA International

LA International is an HMG-approved ICT Recruitment and Project Solutions Consultancy, operating globally from the largest single site in the UK, acting as an IT Consultancy or as an Employment Business & Agency depending upon the precise nature of the work, for security-cleared jobs or non-clearance vacancies. LA International welcomes applications from all sections of the community and from people with diverse experience and backgrounds.

Award-winning LA International, winner of the Recruiter Awards for Excellence, Best IT Recruitment Company, Best Public Sector Recruitment Company and overall Gold Award winner, has now secured the most prestigious business award that any business can receive, The Queen's Award for Enterprise: International Trade, for the second consecutive period.
