Enterprise Data Platform Lead Consultant (AWS, Python, Snowflake)

TN France

Lyon

Remote

EUR 70,000 - 90,000

Full time

3 days ago
Job Summary

A global healthcare company is seeking an Enterprise Data Platform Lead Consultant proficient in AWS, Python, and Snowflake. This remote contract role involves designing and implementing data solutions for cross-domain reporting and analytics. The position emphasizes technical leadership and collaboration with various teams to ensure effective data management and governance.

Qualifications

  • 5+ years in designing Enterprise Data Platforms.
  • Expertise in AWS, Python (Pandas, PySpark), and Snowflake.

Responsibilities

  • Serve as lead for Enterprise Data Platform reporting capabilities.
  • Architect and deliver cloud-based analytical solutions.
  • Design and implement end-to-end data integration pipelines.

Skills

AWS
Python
Snowflake
SQL
NoSQL
Problem Solving

Certifications

AWS Certified Solutions Architect
Snowflake SnowPro Core/Advanced
Python certifications

Tools

Apache Airflow
dbt
Fivetran
Databricks
Tableau
Power BI

Job Description

Client:

Meta Resources Group

Location:

Lyon, France

Job Category:

Other

EU work permit required:

Yes

Job Reference:

ffdd84cbdf7d

Posted:

18.05.2025

Expiry Date:

02.07.2025

Job Description:

Our client, a global healthcare company, is seeking an Enterprise Data Platform Lead Consultant, highly proficient in AWS, Python, and Snowflake, to spearhead the design and implementation of their Enterprise Data Platform (EDP) solutions for cross-domain reporting and analytics. You will drive cloud-based data integration, storage, and curation using AWS, Python, and Snowflake, ensuring alignment with strategic initiatives of client programs, including but not limited to the Spectra to Quest Lab transition. This role demands technical leadership in scenarios where data gravity necessitates EDP-based reporting outside SAP Datasphere.

This is a remote contract role with minimal travel. The contract runs through the end of 2025, with a strong likelihood of renewal well into 2026.

Job Responsibilities:
  • Serve as lead for Enterprise Data Platform reporting capabilities for cross-domain reporting and analytical needs.
  • Architect and deliver cloud-based analytical solutions leveraging AWS, Python, and Snowflake.
  • Design and implement end-to-end data integration, storage, and curation pipelines for high-performance analytical use cases.
  • Function as the technical leader in implementing EDP solutions that support data-intensive initiatives within the client's program, especially where reporting must occur outside of SAP Datasphere due to data gravity considerations.
  • Collaborate with data engineers, analysts, and business units to capture requirements and translate them into effective data models and pipelines.
  • Ensure scalability, governance, and security are core to the EDP solution design.
  • Support and guide project teams, enforcing data platform architecture best practices and performance optimization strategies.

Requirements
  • 5+ years in designing Enterprise Data Platforms, with expertise in AWS (certifications preferred), Python (Pandas, PySpark), and Snowflake.
  • Proficiency in data integration tools (e.g., Apache Airflow, dbt, Fivetran) and SQL/NoSQL databases.
  • Hands-on experience with data lakehouses, real-time analytics, and cloud security frameworks.
  • Experience leading large-scale migrations (e.g., legacy to cloud) and multi-domain data curation.

Preferred Qualifications:

  • AWS Certified Solutions Architect, Snowflake SnowPro Core/Advanced, or Python certifications.
  • Familiarity with Databricks, Tableau, or Power BI is a plus.
  • Fluent in English; ability to collaborate with global teams across EU time zones.
  • Strong problem-solving skills and stakeholder management for technical and non-technical audiences.


