Our client, a leading financial institution in Downtown Toronto, is looking for a Bilingual Data Engineer – GCP to migrate data from on-premises systems to the cloud, working with stakeholders in Mexico, Chile, and Peru. The successful candidate will have the opportunity to work with one of the Top 5 Banks in Canada.
Typical Day in Role:
- Build data pipelines to ingest data into the cloud from multiple technologies across different countries.
- Manage the flow of data from source repositories into the cloud, including data management, connections, and ingestion patterns, ensuring clarity and standardization across sources.
- Design, develop, and maintain data pipelines and infrastructure on Google Cloud Platform (GCP) to support scalable data products, reporting to the Senior Data Engineer.
- Lead data migration efforts.
- Receive and prioritize use cases from business units such as mortgage, anti-money laundering (AML), and analytics.
- Perform data modeling to meet various business needs.
- Standardize data from different sources and countries for Canadian banking stakeholders.
- Translate local business processes into global standards.
- Produce documentation in English; collaborate with remote teams in Spanish.
- Utilize GCP and BigQuery for data storage, and Cloud Composer (Apache Airflow) for data transformation.
- Build and maintain ETL/ELT processes, ensuring reliable data flow.
- Collaborate with cross-functional teams to understand requirements and deliver solutions.
- Implement data models, schemas, and transformations for analytics and reporting.
- Monitor and optimize data pipelines for quality and performance.
- Ensure compliance with data governance, security, and standards within GCP.
- Troubleshoot and resolve pipeline issues to minimize downtime.
- Contribute to best practices, documentation, and testing.
- Stay updated on GCP services and data engineering trends.
Must-Have Skills:
- 8+ years of experience as a Data Engineer, with recent ETL experience.
- Bilingual in English and Spanish (professional proficiency).
- 3+ years of building data pipelines on cloud platforms, preferably GCP.
- 3+ years of experience with GCP tools such as BigQuery, Dataflow, Pub/Sub, and Cloud Composer (Airflow).
- 5+ years of programming experience in Python or Java, plus advanced SQL skills.
Nice-To-Have Skills:
- 3+ years in data modeling, schema design, and data warehousing.
- Experience with version control (e.g., Git) and CI/CD practices.
- Knowledge of data governance and security in cloud environments.
- Experience in financial services or banking.
- Experience working in Agile teams.
Soft Skills:
- Strong problem-solving abilities and teamwork skills.
- Effective communication to explain technical concepts to non-technical stakeholders.
Education:
- Bachelor’s degree in Computer Science, Data Engineering, IT, or a related field.
We are committed to creating an inclusive environment and welcome applicants of all backgrounds and abilities, providing an accessible candidate experience. We value diversity and inclusion in our hiring process.