Data Engineer with Azure DevOps in Fabric Environment

buscojobs España

Palencia

On-site

EUR 40,000 - 60,000

Full-time

Posted 11 days ago

Vacancy description

An innovative company is looking for a Data Engineer specialised in Azure DevOps to support its data reporting platform. The position offers an international, flexible working environment, with benefits such as private medical insurance and a day off on your birthday. The ideal candidate will have Big Data skills and a good level of Python, and be able to work independently.

Benefits

Private medical insurance
Flexible compensation
Day off on your birthday
Annual salary reviews
Gifts for special occasions
Free week in an apartment near Valencia

Qualifications

  • Development experience with Python and Spark or other ETL/Big Data tools.
  • Knowledge of ETL architectures such as Delta and Lambda.
  • Ability to create data mapping documents.

Responsibilities

  • Support the data reporting platform.
  • Diagnose and resolve application issues.
  • Collaborate with development teams to prioritise critical issues.

Skills

ETL / Data Warehouse
Big Data
Source code management
Requirements analysis
Developing and maintaining DevOps pipelines
Data governance

Tools

Microsoft Azure
Python
Java
Databricks
Jira
Confluence

Job description

At Principal33, we strive to make happiness at work a reality. Because it's not just about the money; it's also about the work environment and appreciation. We aim to create the best team setup you can imagine and encourage involvement in your passions. Join us for a fun and productive experience!

We support innovation and allow our employees to pursue their true passions. Our strategy aligns with our vision to become a leading IT service company and promote a better work-life balance. With over 200 employees from diverse countries, we are shaping the future of work.

About the Job

We are seeking an experienced Data Engineer with expertise in Azure DevOps within a Microsoft Fabric environment. The ideal candidate will be self-motivated and able to work independently.

Responsibilities

  • Support our data reporting platform by collaborating with cross-functional teams and troubleshooting issues.
  • Diagnose and resolve application problems.
  • Conduct testing, debugging, and documentation of changes.
  • Work with development teams to escalate and prioritize critical issues.
  • Assist in deploying and configuring application updates.
  • Provide technical support and training to team members as needed.
  • Keep accurate records of support requests and resolutions.

Requirements

Skill set:

  • ETL / Data Warehouse
  • Big Data
  • Source code management
  • Design and architecture
  • Requirements analysis
  • Developing and maintaining DevOps pipelines
  • Experience with Jira and Confluence
  • Release management in Jira
  • Data governance

Tools and Technologies:

  • ETL Architecture – Delta, Lambda, Data Mesh, Hub & Spoke
  • Data Modeling – Relational (3NF, Data Vault), Dimensional (Star/Snowflake/Galaxy Schema)
  • Programming – Python, Java
  • Cloud – Microsoft Azure and basic GCP knowledge
  • Testing – Pytest, coverage.py, flake8/pep8
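
As a rough illustration of the testing stack listed above, the sketch below shows a minimal Pytest file for a small cleansing helper. The clean_customer_records function, its column names, and the pandas dependency are assumptions for the example, not details taken from the role description.

    # test_cleaning.py - hypothetical sketch of Pytest-style tests for ETL helpers.
    # The clean_customer_records() helper and its columns are assumptions.
    import pandas as pd


    def clean_customer_records(df: pd.DataFrame) -> pd.DataFrame:
        """Drop rows without a customer_id and normalise email casing."""
        cleaned = df.dropna(subset=["customer_id"]).copy()
        cleaned["email"] = cleaned["email"].str.lower().str.strip()
        return cleaned


    def test_rows_without_customer_id_are_dropped():
        raw = pd.DataFrame(
            {"customer_id": [1, None], "email": ["A@Example.com", "b@example.com"]}
        )
        assert len(clean_customer_records(raw)) == 1


    def test_emails_are_normalised():
        raw = pd.DataFrame({"customer_id": [1], "email": ["  A@Example.COM "]})
        assert clean_customer_records(raw).loc[0, "email"] == "a@example.com"

Coverage and style checks would then run alongside the tests, for example with coverage run -m pytest followed by coverage report, and flake8 for PEP 8 compliance.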

Technical Skills:

  • Requirement analysis and participation in design discussions for data warehouse solutions.
  • Create and review data mapping documents.
  • Develop code using Python and Spark or other ETL/Big Data tools like Databricks (see the sketch after this list).
  • Write complex SQL queries for debugging, troubleshooting, and data processing.
  • Develop lint tests to enforce coding standards.
  • Create utilities with PowerShell for automation in Azure.
  • Build CI/CD pipelines with Azure Pipelines YAML and Jenkins.
  • Act as a data architect for complex data solutions.
  • Design and plan modern data platform architectures, including Medallion architecture.
  • Prepare data architecture blueprints.
  • Support QA and end-to-end testing to minimize bugs.
  • Manage releases through Octopus Deploy.
  • Support deployment and resolve issues during UAT and production releases.
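
As a rough illustration of the Python and Spark work described in the list above, the sketch below promotes raw records from a bronze Delta table to a cleaned silver table in a Medallion-style layout. The table paths, column names, and the Delta-enabled Spark runtime (e.g. Databricks or Fabric) are assumptions, not details taken from the project.

    # medallion_bronze_to_silver.py - hypothetical sketch, not project code.
    # Assumes a Spark session with Delta Lake support (Databricks or Fabric).
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("bronze_to_silver").getOrCreate()

    # Read raw ingested events from the (assumed) bronze layer.
    bronze = spark.read.format("delta").load("/lakehouse/bronze/sales_events")

    # Basic cleansing: deduplicate, enforce types, drop incomplete rows.
    silver = (
        bronze.dropDuplicates(["event_id"])
        .withColumn("event_ts", F.to_timestamp("event_ts"))
        .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
        .filter(F.col("customer_id").isNotNull())
    )

    # Overwrite the silver table so downstream reporting reads a clean, typed dataset.
    silver.write.format("delta").mode("overwrite").save("/lakehouse/silver/sales_events")

In a real pipeline a step like this would typically run as a scheduled job and feed the gold/reporting layer consumed by the data reporting platform.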

What we offer:

Way of working: Mostly remote with one trip to Dublin per quarter. Only apply if you have a valid European work permit.

Benefits include private medical insurance (Spain), flexible compensation, a day off on your birthday, annual salary reviews, gifts for special occasions, and more.

Enjoy an international, multicultural environment and a complimentary week-long stay at our apartment near Valencia, Spain (subject to availability).

Participate in company events like summer parties and engage in continuous professional development through training and tech community involvement.
