A leading global consultancy is seeking a skilled Data Architect to design and deliver robust, scalable data platforms on Google Cloud Platform (GCP). The ideal candidate will bring extensive experience in building and maintaining large-scale data warehouses, ensuring high standards of data quality, and adhering to best practices in data architecture.
Key Responsibilities
- Define and develop the overall data strategy in line with client requirements and compliance standards.
- Lead and mentor a team of data engineers to implement scalable and efficient data solutions.
- Translate business needs into technical solutions by collaborating with both technical and non-technical stakeholders.
- Ensure the design and deployment of data platforms follow GCP best practices for performance, reliability, and security.
Required Skills and Experience
- Proven background in designing data warehouse and data lake architectures.
- Demonstrable experience in building and operating large-scale data platforms on Google Cloud Platform, with a focus on data quality at scale.
- Hands-on expertise in core GCP data services such as BigQuery, Composer, Dataform, Dataproc, and Pub/Sub.
- Strong programming skills in PySpark, Python, and SQL.
- Proficiency in ETL processes, data mining, and data storage principles.
- Experience with BI and data visualisation tools, such as Looker or Power BI.
- Excellent communication skills, with the ability to effectively bridge technical and business discussions.
Desirable Qualifications and Experience
- Degree (BSc, MSc, or PhD) in Computer Science, Mathematics, or a related field.
- Familiarity with alternative cloud and data platforms such as Databricks, Snowflake, Azure, or AWS.
- Knowledge of DevOps/DataOps methodologies and experience with CI/CD pipelines.
- Understanding of monitoring, logging, and troubleshooting in cloud-based environments.
- Google Cloud Professional Data Engineer certification (or equivalent) is highly advantageous.