
Data Engineer: Databricks, Azure Analysis Services and Power BI

Kapres Technology, S.L.

Madrid

Remote

EUR 35,000 - 50,000

Full-time

Today

Job description

A technology-sector company is looking for a Data Engineer to work in a 100% remote environment. The candidate will be responsible for building data pipelines, developing tabular cubes and creating dashboards with Power BI. At least 2 years of experience with tools such as Databricks and Azure Analysis Services is required.

Requirements

  • Minimum 2 years of experience designing and building data pipelines.
  • Hands-on experience with Databricks, Azure Analysis Services and Power BI.

Responsibilities

  • Build data pipelines using PySpark.
  • Develop tabular cubes with Analysis Services (DAX).
  • Create dashboards using Power BI.

Skills

PySpark
Power BI
Azure Analysis Services
Databricks
Overview

For a major client in the insurance sector, we are looking for a Data Engineer with Databricks, Azure Analysis Services and Power BI. The position is 100% remote, and we offer a permanent contract with us.

1 / Context

The Monitoring Solutions product provides monitoring capabilities covering application performance, log events and BAU KPIs.

As part of client initiatives to increase the quality of service, we are building several QoS dashboards following an industrialized approach.

As a Data Engineer, your responsibilities will include building data pipelines using PySpark, developing tabular cubes with Analysis Services (DAX), and creating dashboards using Power BI.
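The pipeline work described above typically amounts to ingesting raw events, computing quality-of-service KPIs and filtering on them. Below is a minimal pure-Python sketch of that kind of calculation; the real pipeline would use PySpark DataFrames, and the event data, KPI definition and threshold here are hypothetical illustrations:

```python
from collections import defaultdict

# Hypothetical log events as (application, status) pairs, as a pipeline
# might see them after ingestion. In the actual role this logic would
# live in PySpark DataFrame transformations, not plain Python.
events = [
    ("billing", "OK"), ("billing", "OK"), ("billing", "ERROR"),
    ("claims", "OK"), ("claims", "OK"), ("claims", "OK"),
]

def availability_by_app(events):
    """Share of OK events per application (a simple BAU-style KPI)."""
    totals, ok = defaultdict(int), defaultdict(int)
    for app, status in events:
        totals[app] += 1
        if status == "OK":
            ok[app] += 1
    return {app: ok[app] / totals[app] for app in totals}

def below_threshold(kpis, threshold=0.9):
    """Keep only applications whose availability falls below the threshold."""
    return {app: value for app, value in kpis.items() if value < threshold}

kpis = availability_by_app(events)
print(below_threshold(kpis))  # only 'billing' (2/3 ≈ 0.67) falls below 0.9
```

In PySpark the same steps would map onto `groupBy`/`agg` for the KPI and `filter` for the threshold, with the result feeding an Analysis Services cube or a Power BI dashboard.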

2 / Services
  • Assessment of needs to draw up a list of functional and technical requirements
  • Supporting coordination with third-party teams (Data Integration team & Data Lake team)
  • Implementation of data ingestion and transformation
  • Implementation of specific calculations and filtering according to the requirements
  • Implementing security and access control measures
3 / Deliverables
  • Design of the technical solution covering the needs (Data model, Integration flow, Data mapping)
  • Build the integration layer based on Azure Data Lake: DAX, PySpark, Databricks, Azure Analysis Services, Azure DevOps
  • Power BI dashboards connected to Azure cubes
  • Implementation of specific calculations
  • Years of expertise required: minimum 2 years
  • Mandatory tools and expertise: Databricks, Azure Analysis Services and Power BI (hands-on experience with these tools required)
  • Level of expertise on these tools: open to all levels (intermediate, advanced, expert)