
Data Engineer with Azure Devops in FABRIC environment

Principal33 España

Cádiz

On-site

EUR 35.000 - 55.000

Full-time

17 days ago


Job overview

A leading IT service company is seeking an experienced Data Engineer with expertise in Azure DevOps and a strong background in ETL and Big Data. This role offers flexible remote work options, participation in an international environment, and opportunities for continuous self-development. The ideal candidate will play a key role in collaborating on data projects, troubleshooting application issues, and supporting the deployment of critical updates.

Benefits

Private medical insurance applicable in Spain
Flexible compensation and work hours
Day off on your birthday
Referral bonuses
Annual salary reviews
Gifts for special occasions
Free week-long accommodation at our apartment near Valencia

Qualifications

  • Hands-on experience with data engineering and DevOps pipelines.
  • Strong background in ETL architecture and big data technologies.
  • Experience with Python, Azure, and building CI/CD pipelines.

Responsibilities

  • Collaborate with cross-functional teams to support data reporting.
  • Diagnose and resolve application issues and conduct testing.
  • Maintain records of support requests and resolutions.

Skills

ETL
Data Warehouse
Big Data
Source code management
Design and architecture
Requirements analysis
DevOps pipelines
Data Governance

Tools

Python
Jira
Confluence
Spark
Azure
GCP

Job description

About us
At Principal33, we strive to make happiness at work a reality. It's not just about the money; it's also about the work environment and appreciation. It's about creating the best team setup you can imagine and getting involved in the things you're passionate about. And you can be a part of it, because it's fun to get things done!

We want our employees to innovate, and we give them the freedom to work on what they are truly passionate about. Based on this conviction, Principal33 aligns its strategy around its vision: to become a leading IT service company while promoting a better work-life balance. With over 200 employees from different countries, we are actively shaping the future of work.

About the Job
We are seeking an experienced Data Engineer with Azure DevOps expertise in a Microsoft Fabric environment. The successful candidate will be self-motivated and capable of working independently.

Responsibilities

  • Collaborate with cross-functional teams to support our data reporting platform and troubleshoot issues.
  • Diagnose and resolve application issues.
  • Conduct thorough testing, debugging, and documentation of any changes.
  • Collaborate with development teams to escalate and prioritize critical issues.
  • Assist in the deployment and configuration of application updates.
  • Provide technical support and training to other team members when necessary.
  • Maintain accurate records of support requests and resolutions.

Requirements & Skill Set

  • ETL / Data Warehouse
  • Big Data
  • Source code management
  • Design and architecture
  • Requirements analysis
  • Developing and maintaining DevOps pipelines
  • Hands-on experience with Jira and Confluence
  • Release management in Jira
  • Data Governance

Tools and Technologies: Experience in a Microsoft Fabric environment.

Development:

  • ETL Architecture – Delta, Lambda, Data Mesh, Hub & Spoke
  • Data Modelling – Relational (3NF, Data Vault), Dimensional (Star / Snowflake / Galaxy Schema)

Programming:

  • Python, Java
  • Cloud – Microsoft Azure and basics of GCP
  • Unit / integration testing – pytest with coverage.py; linting – flake8 (PEP 8)

Technical

  • Requirement analysis and participation in design discussions to derive optimal data warehouse design.
  • Create/review data mapping documents.
  • Produce/develop codebase using Python, Spark, or other recommended ETL/Big Data tools such as Databricks.

Additional tasks include debugging, troubleshooting, code analysis, data processing, developing lint tests, and creating PowerShell utilities for automation in Azure. The role also involves implementing CI/CD pipelines with Azure YAML pipelines and Jenkins, acting as a data architect for complex data solutions, designing and planning modern data platforms, and supporting deployment and release processes across QA, UAT, and production environments.

What we offer:
Mostly remote work, with one trip to Dublin per quarter. Please only apply if you have a valid European work permit.

Benefits include private medical insurance (applicable in Spain), flexible compensation, flexible work hours, a day off on your birthday, referral bonuses, annual salary reviews, gifts for special occasions, an international environment, and a free week-long accommodation at our apartment near Valencia, Spain (subject to availability).

Events: Summer party!

Self-Development: Continuous training, participation in tech communities, and opportunities to attend local and international tech events.
