
Databricks Solution Architect (f/m/d)

Axpo Group

Madrid

On-site

EUR 65,000–95,000

Full-time

Posted 21 days ago

Vacancy Description

A leading company in renewable energy is seeking a Databricks Solution Architect to spearhead its enterprise data transformation efforts. This role involves designing scalable solutions on the Databricks Lakehouse platform and collaborating with cross-functional teams. Ideal candidates will have a strong background in data engineering, architecture, and solid experience with Databricks and Apache Spark. This position presents an exciting opportunity to contribute to impactful data-driven decisions in the energy sector.

Qualifications

  • 5+ years in data engineering and 3+ years in architecture roles.
  • Deep experience designing solutions on Databricks and Apache Spark.
  • Hands-on knowledge of CI/CD, Git, and orchestration tools.

Responsibilities

  • Design secure, scalable Lakehouse architectures.
  • Collaborate with business stakeholders to deliver innovative solutions.
  • Guide teams in implementing technical best practices.

Skills

Python
SQL
Apache Spark
Delta Lake
Data Governance
Stakeholder Management

Education

Degree in Computer Science
Degree in Data Engineering
Degree in Information Systems

Tools

Databricks
Microsoft Azure
Terraform
Airflow
Power BI

Job Description

Location: Baden, Madrid | Workload: 80–100%

Who We Are
Axpo is driven by a single purpose: to enable a sustainable future through innovative energy solutions. As Switzerland's largest producer of renewable energy and a leading international energy trader, we leverage cutting-edge technologies to serve customers in over 30 countries. We thrive on collaboration, innovation, and a passion for driving impactful change.

About the Team
You’ll report to the Head of Development and work closely with the Chief Data & Analytics Office (CDAO) as part of a cross-functional effort to build a secure, scalable, and business-aligned data platform. Our mission is to empower Axpo’s decentralized business hubs with self-service analytics and AI capabilities, combining the strengths of engineering, governance, and business ownership.

What You Will Do
As a Databricks Solution Architect, you will play a pivotal role in Axpo’s enterprise data transformation by designing and governing scalable and secure solutions on the Databricks Lakehouse platform.

You will:

  • Lead the design of performant, secure, and cost-effective Lakehouse architectures that adhere to enterprise data governance and domain-modeling standards defined by the CDAO.
  • Collaborate with business stakeholders, engineers, and data scientists to design end-to-end solutions that enable innovation and data-driven decision making.
  • Guide engineering teams on implementing technical best practices, ensuring alignment with CDAO-defined data models and stewardship principles.
  • Collaborate with the CDAO office to implement Unity Catalog policies for access control, lineage, and metadata management.
  • Support platform observability, data quality monitoring, and operational excellence in partnership with data governance stakeholders.
  • Evaluate new Databricks features (e.g., Delta Sharing, governance enhancements) and lead their integration into platform capabilities.
  • Establish solution review processes and mentor engineers and analysts on architectural thinking and Databricks capabilities.
  • Support security, compliance, and cost-optimization efforts in close collaboration with platform and cloud teams.

What You Bring & Who You Are

You are a strategic thinker with hands-on technical expertise and a strong focus on business value. You bring:

  • A degree in Computer Science, Data Engineering, Information Systems, or a related field.
  • 5+ years in data engineering and 3+ years in architecture roles, with deep experience designing solutions on Databricks and Apache Spark.
  • Strong grasp of Delta Lake, Lakehouse architecture, and Unity Catalog policy implementation in coordination with data governance functions.
  • Expertise in Python, SQL, and optionally Scala; strong familiarity with dbt and modern ELT practices.
  • Proven experience integrating Databricks with Azure services (e.g., Data Lake, Synapse, Event Hubs).
  • Hands-on knowledge of CI/CD, GitOps, Terraform, and orchestration tools (e.g., Dagster, Airflow).
  • Sound understanding of enterprise data architecture, data governance, and security principles (e.g., GDPR).
  • Strong communication and stakeholder management skills, able to bridge technical and business domains.
  • Fluency in English; other European languages a plus.

Technologies You’ll Work With

  • Core: Databricks, Spark, Delta Lake, Unity Catalog, dbt, SQL, Python
  • Cloud: Microsoft Azure (Data Lake, Synapse, Storage, Event Hubs)
  • DevOps: Bitbucket/GitHub, Azure DevOps, Terraform
  • Orchestration & Monitoring: Dagster, Airflow, Datadog, Grafana
  • Visualization: Power BI
  • Other: Confluence, Docker, Linux

Nice to Have

  • Knowledge of Microsoft Fabric or Snowflake
  • Familiarity with Dataiku or similar low-code analytics platforms
  • Experience with enterprise metadata and lineage solutions
  • Background in energy trading or related industries