An excellent opportunity has emerged for a Senior GCP Data Engineer to join the Cyber Security domain of an investment bank.
In this role, you will be responsible for delivering high-quality data capabilities, applying your engineering skills and making full use of the possibilities offered by cloud technology.
Key responsibilities:
- API Development: Design, develop, and maintain robust and scalable backend systems and APIs.
- Data Ingestion: Develop and maintain data pipelines to extract data from various sources and load it into Google Cloud environments.
- Data Transformation: Implement data transformation processes, including data cleansing, normalization, and aggregation, to ensure data quality and consistency (see the sketch after this list).
- Data Modeling: Develop and maintain data models and schemas to support efficient data storage and retrieval in Google Cloud platforms.
- Data Integration: Integrate data from multiple sources, both on-premises and cloud-based, using Cloud Composer or other relevant tools.
- Data Lakes: Build data lakes using Google Cloud services such as Cloud Storage and BigQuery.
- Performance Optimization: Optimize data pipelines and queries for improved performance and scalability in Google Cloud environments.
- Collaboration: Work with product managers to ensure high-quality product delivery that drives business value and transformation.
- Documentation: Document data engineering processes, data flows, and system configurations for future reference and knowledge sharing.
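To illustrate the Data Ingestion and Data Transformation responsibilities above, the following is a minimal, purely illustrative Python sketch: it loads a CSV export from Cloud Storage into a BigQuery staging table and then runs a simple cleansing and aggregation query. The project, bucket, dataset, table, and column names are hypothetical placeholders; in practice such a pipeline would typically be orchestrated with Cloud Composer or Dataflow rather than run as a standalone script.

```python
# Illustrative sketch only: project, bucket, dataset, table, and column names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # placeholder project ID

# Ingestion: load a CSV export from Cloud Storage into a raw staging table.
load_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)
load_job = client.load_table_from_uri(
    "gs://example-bucket/exports/security_events.csv",  # placeholder source object
    "example-project.raw.security_events",              # placeholder staging table
    job_config=load_config,
)
load_job.result()  # wait for the load job to complete

# Transformation: cleanse and aggregate the staged data into a curated table.
transform_sql = """
CREATE OR REPLACE TABLE `example-project.curated.daily_event_counts` AS
SELECT
  DATE(event_timestamp) AS event_date,
  LOWER(TRIM(event_type)) AS event_type,   -- basic normalization
  COUNT(*) AS event_count
FROM `example-project.raw.security_events`
WHERE event_timestamp IS NOT NULL          -- basic cleansing
GROUP BY event_date, event_type
"""
client.query(transform_sql).result()  # wait for the query to complete
```

A production pipeline would layer schema management, data-quality checks, and monitoring on top of these steps.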
Essential skills/knowledge/experience:
- 8+ years of experience in data engineering, with a strong focus on Google Cloud Platform (GCP)-based solutions.
- Proficiency in GCP services such as BigQuery, Dataproc, Cloud SQL, Dataflow, Pub/Sub, Cloud Data Fusion, and Cloud Composer, together with strong Python and SQL skills.
- Experience designing, developing, and deploying scalable, reliable, and secure cloud-based solutions using GCP.
- Ability to translate business requirements into technical specifications.
- Proficiency with core GCP services like Compute Engine, GKE, Cloud Storage, Cloud Functions, Cloud SQL, and BigQuery.
- Experience implementing GCP networking configurations.
- Experience using Infrastructure as Code (IaC) tools such as Terraform for automation.
- Experience with CI/CD pipelines and automation of deployment processes.
- Experience implementing security best practices to safeguard cloud infrastructure and data.
- Ability to identify and resolve performance bottlenecks.
- Experience with data analytics and Big Data technologies.
- Knowledge of cloud security standards and compliance.
- Experience working within agile development methodologies.
- GCP certifications (e.g., Google Cloud Certified Professional Cloud Developer) are advantageous.