Senior Data Engineer with Crypto

SKM Group

Nice

On-site

EUR 50 000 - 70 000

Full-time

5 days ago

Job summary

A leading company is seeking a Data Engineer to migrate their data infrastructure to Databricks Unity Catalog. The role involves developing data pipelines, ensuring data governance, and collaborating with teams to meet business needs. Ideal candidates will have strong skills in Python, Spark, and SQL, along with experience in cloud services and data orchestration tools. The company offers a supportive team environment and opportunities for influence and growth.

Benefits

Broad freedom and real influence
A team approach to meeting challenges
Company apartments in cool cities across Europe

Qualifications

  • Advanced proficiency in Python and strong software engineering fundamentals.
  • Proven experience with Spark and SQL in data engineering.

Responsibilities

  • Migrate data infrastructure to Databricks Unity Catalog and DBXaaS.
  • Develop, maintain, and optimize data pipelines using Spark and SQL.

Skills

Python
Spark
SQL
Data Governance
Communication

Tools

Databricks
Airflow
Docker
Kubernetes

Job description

A large, global data infrastructure platform needs to be comprehensively migrated to Databricks Unity Catalog, including a transition to DBXaaS. This migration involves the transfer of tables, pipelines, permissions, and related components across all relevant environments and assets. Our goal is to ensure data security, manageability, and scalability while maintaining consistency and governance across the platform.

Key Responsibilities
  1. Migrate data infrastructure to Databricks Unity Catalog and DBXaaS.
  2. Develop, maintain, and optimize data pipelines using Spark and SQL.
  3. Design and implement robust data pipelines for blockchain data processing.
  4. Ensure data governance, security, and compliance across environments.
  5. Optimize data architecture for scalability and performance.
  6. Collaborate with cross-functional teams to meet business requirements.
Required Qualifications
  1. Advanced proficiency in Python and strong software engineering fundamentals.
  2. Proven experience with Spark and SQL in data engineering and analysis.
  3. Hands-on experience with Databricks for data ingestion and transformation.
  4. Proficiency with data orchestration tools (Airflow or similar).
  5. Experience with one or more cloud services (Azure preferred, AWS, GCP).
  6. Solid understanding of RDBMS / NoSQL data stores.
  7. Experience with Docker (required) and Kubernetes (preferred).
  8. Familiarity with Delta Lake and data warehousing concepts.
  9. Understanding of blockchain data structures and cryptography (preferred).
  10. Experience with blockchain indexing and data extraction (a plus).
  11. Excellent communication, interpersonal, and presentation skills.
Key Attributes for Success
  1. High standards of quality and professional conduct.
  2. Detail-oriented with a focus on dev testing and adherence to best practices.
  3. Ownership and pride in work delivered.
  4. Proactive, solution-focused mindset.
  5. Ability to receive and give constructive feedback.
  6. Strong team player with a "can-do" attitude.
What do we offer you?
  1. Broad freedom and real influence.
  2. No unhealthy competition; a team approach to meeting challenges.
  3. Company apartments in cool cities across Europe: work and enjoy a memorable getaway.