Data Engineer with Azure DevOps in a Microsoft Fabric Environment

Burgos
Remote
EUR 35.000 - 50.000
Posted 4 days ago
Job Description

About us: At Principal33, we strive to make happiness at work a reality. It's not just about the money; it's also about the work environment and feeling appreciated. We aim to build the best possible team setup and involve employees in what they are passionate about, making work fun and fulfilling.

We encourage innovation and allow our employees to pursue what they are truly passionate about. Our strategy is aligned with our vision to become a leading IT service company and promote a better work-life balance. With over 200 employees from diverse countries, we are shaping the future of work.

About the Job: We are seeking an experienced Data Engineer with Azure DevOps experience in a Microsoft Fabric environment. The ideal candidate is self-motivated and capable of working independently.

Responsibilities:

  • Collaborate with cross-functional teams to support our data reporting platform and troubleshoot issues.
  • Diagnose and resolve application issues.
  • Conduct thorough testing, debugging, and documentation of changes.
  • Collaborate with development teams to escalate and prioritize critical issues.
  • Assist in deploying and configuring application updates.
  • Provide technical support and training to team members as needed.
  • Maintain accurate records of support requests and resolutions.

Requirements and Skills:

  • ETL / Data Warehouse expertise
  • Big Data technologies
  • Source code management
  • Design and architecture skills
  • Requirements analysis
  • Developing and maintaining DevOps pipelines
  • Hands-on experience with Jira and Confluence
  • Release management in Jira
  • Data Governance knowledge

Tools and Technologies: Experience with the Microsoft Fabric environment

Development:

  • ETL Architecture: Delta, Lambda, Data Mesh, Hub & Spoke
  • Data Modelling: Relational (3NF, Data Vault), Dimensional (Star, Snowflake, Galaxy Schema)

Programming:

  • Python, Java
  • Cloud: Microsoft Azure and basic GCP knowledge
  • Testing: pytest, coverage.py, flake8 / PEP 8

Technical:

  • Requirement analysis and design discussions for data warehouse solutions
  • Create and review data mapping documents
  • Develop codebases using Python, Spark, or other ETL/Big Data tools like Databricks

Additional Tasks:

  • Develop lint tests and utilities for automation in Azure using PowerShell
  • Implement CI/CD pipelines with Azure Pipelines YAML and Jenkins
  • Act as a data architect for complex data solutions
  • Design and plan modern data platform architectures, including the Medallion architecture
  • Prepare data architecture blueprints
  • Support QA and end-to-end testing, fix bugs, and ensure smooth releases
  • Manage deployment and releases on UAT and production environments

What we offer:

Mainly remote work, with one quarterly trip to Dublin. Please apply only if you hold a valid European work permit.

Benefits include private medical insurance (Spain only), flexible compensation and hours, day off on your birthday, referral bonuses, annual salary reviews, gifts for special occasions, an international environment, and a free week-long stay at our apartment near Valencia, Spain (subject to availability).

Events: Summer parties and more.

Self-Development: Continuous training, participation in tech events, and professional growth opportunities.