Client:
Meta Resources Group
Location:
Lyon, France
Job Category:
Other
EU work permit required:
Yes
Job Reference:
ffdd84cbdf7d
Job Views:
1
Posted:
18.05.2025
Expiry Date:
02.07.2025
Job Description:
Our client, a global healthcare company, is seeking an Enterprise Data Platform Lead Consultant, highly proficient in AWS, Python, and Snowflake, to spearhead the design and implementation of their Enterprise Data Platform (EDP) solutions for cross-domain reporting and analytics. You will drive cloud-based data integration, storage, and curation using AWS, Python, and Snowflake, ensuring alignment with strategic initiatives of client programs, including but not limited to the Spectra to Quest Lab transition. This role demands technical leadership in scenarios where data gravity necessitates EDP-based reporting outside SAP Datasphere.
This is a remote contract role with minimal travel. The contract runs through the end of 2025, with a strong likelihood of renewal into 2026.
Job Responsibilities:
- Lead the development of Enterprise Data Platform reporting capabilities for cross-domain reporting and analytical needs.
- Architect and deliver cloud-based analytical solutions leveraging AWS, Python, and Snowflake.
- Design and implement end-to-end data integration, storage, and curation pipelines for high-performance analytical use cases.
- Serve as the technical lead in implementing EDP solutions that support data-intensive initiatives within the client's program, especially where reporting must occur outside SAP Datasphere due to data gravity considerations.
- Collaborate with data engineers, analysts, and business units to capture requirements and translate them into effective data models and pipelines.
- Ensure scalability, governance, and security are core to the EDP solution design.
- Support and guide project teams, enforcing data platform architecture best practices and performance optimization strategies.
Requirements:
- 5+ years in designing Enterprise Data Platforms, with expertise in AWS (certifications preferred), Python (Pandas, PySpark), and Snowflake.
- Proficiency in data integration tools (e.g., Apache Airflow, dbt, Fivetran) and SQL/NoSQL databases.
- Hands-on experience with data lakehouses, real-time analytics, and cloud security frameworks.
- Experience leading large-scale migrations (e.g., legacy to cloud) and multi-domain data curation.
Preferred Qualifications:
- AWS Certified Solutions Architect, Snowflake SnowPro Core/Advanced, or Python certifications.
- Familiarity with Databricks, Tableau, or Power BI is a plus.
- Fluent in English; ability to collaborate with global teams across EU time zones.
- Strong problem-solving skills and stakeholder management for technical and non-technical audiences.