
SQL, BigQuery/Consultant Specialist

HSBC

Hyderabad City Taluka

On-site

PKR 2,000,000 - 3,200,000

Full time

3 days ago

Job summary

A global banking institution is seeking an experienced Consultant Specialist in Hyderabad City. The role requires 8-12 years of relevant experience, with strong skills in advanced SQL development, GCP BigQuery, and Python programming. The ideal candidate will be adept at automating data pipelines and collaborating with cross-functional teams. This position offers significant opportunities for professional growth within a leading financial services organization.

Qualifications

  • 8-12 years of experience in data engineering with strong knowledge of SQL and Python.
  • Experience with Google Cloud Platform, specifically BigQuery and Google Cloud Storage.
  • Ability to design and maintain data pipelines using Airflow.

Skills

Advanced SQL Development
GCP BigQuery and GCS
Airflow DAG Development
Python Programming
Shell Scripting
Continuous Learning
Communication

Job description

Some careers shine brighter than others


If you’re looking for a career that will help you stand out, join HSBC and fulfill your potential. Whether you want a career that could take you to the top or simply take you in an exciting new direction, HSBC offers opportunities, support, and rewards that will take you further.


HSBC is one of the largest banking and financial services organizations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and ultimately helping people fulfill their hopes and realize their ambitions.


We are currently seeking an experienced professional to join our team as a Consultant Specialist. The ideal candidate will have 8-12 years of experience with the following skills and requirements:

  1. Advanced SQL Development: writing complex SQL queries, optimizing for performance, and understanding query execution plans (see the first sketch after this list).

  2. GCP BigQuery and GCS: working with Google BigQuery for data warehousing and analytics, and managing data with Google Cloud Storage.

  3. Airflow DAG Development: designing, developing, and maintaining workflows and automating data pipelines (see the second sketch after this list).

  4. Python Programming: developing, maintaining, debugging, and optimizing Python scripts for data processing.

  5. Shell Scripting: writing and debugging basic shell scripts for automation.

  6. Continuous Learning: staying updated with data engineering tools and technologies, demonstrating adaptability.

  7. Communication: collaborating with cross-functional teams and communicating technical concepts effectively.
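
To make the SQL, BigQuery, and GCS expectations concrete, here is a minimal sketch in Python, assuming the google-cloud-bigquery client library and application-default credentials; the project, dataset, table, and bucket names are illustrative assumptions, not HSBC systems.

  # Minimal sketch: run an analytical query on BigQuery and export the result to GCS.
  # Assumes the google-cloud-bigquery client library and application-default credentials;
  # the project, dataset, table, and bucket names below are hypothetical placeholders.
  from google.cloud import bigquery

  client = bigquery.Client(project="my-project")

  sql = """
      SELECT customer_id,
             SUM(amount) AS total_amount
      FROM `my-project.analytics.transactions`
      WHERE txn_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
      GROUP BY customer_id
  """

  # Dry run first: BigQuery reports the bytes the query would scan without executing it,
  # a cheap way to sanity-check cost and catch accidental full-table scans.
  dry_cfg = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)
  dry_job = client.query(sql, job_config=dry_cfg)
  print(f"Query would process {dry_job.total_bytes_processed} bytes")

  # Real run, writing the result into a destination table.
  dest_table = "my-project.analytics.customer_totals_last_30d"
  run_cfg = bigquery.QueryJobConfig(
      destination=dest_table,
      write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
  )
  client.query(sql, job_config=run_cfg).result()  # blocks until the job finishes

  # Export the destination table to Google Cloud Storage as compressed CSV shards.
  extract_cfg = bigquery.ExtractJobConfig(
      destination_format=bigquery.DestinationFormat.CSV,
      compression=bigquery.Compression.GZIP,
  )
  client.extract_table(
      dest_table,
      "gs://my-bucket/exports/customer_totals_last_30d-*.csv.gz",
      job_config=extract_cfg,
  ).result()

The dry run shows how many bytes a query would scan before it is executed, which is much of what "optimizing for performance" tends to mean on BigQuery.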

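In the same spirit, here is a minimal sketch of an Airflow DAG for the pipeline-automation side of the role, assuming Airflow 2.x; the DAG id, tasks, schedule, and callable are illustrative assumptions, not a description of HSBC's actual pipelines.

  # Minimal sketch of a daily Airflow 2.x DAG: extract a source file, then load it to BigQuery.
  # The DAG id, schedule, bash command, and Python callable are hypothetical placeholders.
  from datetime import datetime

  from airflow import DAG
  from airflow.operators.bash import BashOperator
  from airflow.operators.python import PythonOperator


  def load_to_bigquery(**context):
      # Placeholder: a real task might call the BigQuery client as in the previous
      # sketch, using context["ds"] (the logical date) as the partition to load.
      print(f"Loading data for {context['ds']}")


  with DAG(
      dag_id="daily_customer_totals",
      start_date=datetime(2024, 1, 1),
      schedule_interval="@daily",  # Airflow 2.4+ prefers the `schedule` argument
      catchup=False,
  ) as dag:
      extract = BashOperator(
          task_id="extract_source_file",
          bash_command="echo 'pull the source extract for {{ ds }}'",
      )

      load = PythonOperator(
          task_id="load_to_bigquery",
          python_callable=load_to_bigquery,
      )

      # extract must finish before load starts
      extract >> load

With catchup=False the scheduler runs only the latest interval instead of backfilling from start_date, a common choice when the upstream extract cannot be reproduced for past dates.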