At Principal33, we strive to make happiness at work a reality. Happiness at work isn't just about the money; it's also about the work environment and appreciation. We aim to create the best team setup you can imagine and encourage involvement in your passions. Join us for a fun and productive experience!
We support innovation and allow our employees to pursue their true passions. Our strategy aligns with our vision to become a leading IT service company and promote a better work-life balance. With over 200 employees from diverse countries, we are shaping the future of work.
About the Job
We are seeking an experienced Data Engineer with expertise in Azure DevOps within a Microsoft Fabric environment. The ideal candidate will be self-motivated and able to work independently.
Responsibilities
- Support our data reporting platform by collaborating with cross-functional teams and troubleshooting issues.
- Diagnose and resolve application problems.
- Conduct testing, debugging, and documentation of changes.
- Work with development teams to escalate and prioritize critical issues.
- Assist in deploying and configuring application updates.
- Provide technical support and training to team members as needed.
- Keep accurate records of support requests and resolutions.
Requirements
Skill set:
- ETL / Data Warehouse
- Big Data
- Source code management
- Design and architecture
- Requirements analysis
- Developing and maintaining DevOps pipelines
- Experience with Jira and Confluence
- Release management in Jira
- Data governance
Tools and Technologies:
- ETL Architecture – Delta, Lambda, Data Mesh, Hub & Spoke
- Data Modeling – Relational (3NF, Data Vault), Dimensional (Star/Snowflake/Galaxy Schema)
- Programming – Python, Java
- Cloud – Microsoft Azure and basic GCP knowledge
- Testing – pytest, coverage.py, flake8 (PEP 8 style checks)
Technical Skills:
- Analyze requirements and participate in design discussions for data warehouse solutions.
- Create and review data mapping documents.
- Develop code using Python and Spark, or other ETL/big data tools such as Databricks.
- Write complex SQL queries for debugging, troubleshooting, and data processing.
- Develop lint tests to enforce coding standards.
- Create utilities with PowerShell for automation in Azure.
- Build CI/CD pipelines with Azure Pipelines (YAML) and Jenkins.
- Act as a data architect for complex data solutions.
- Design and plan modern data platform architectures, including Medallion architecture.
- Prepare data architecture blueprints.
- Support QA and end-to-end testing to minimize bugs.
- Manage releases through Octopus Deploy.
- Support deployment and resolve issues during UAT and production releases.
What we offer:
Way of working: mostly remote, with one trip to Dublin per quarter. A valid European work permit is required.
Benefits include private medical insurance (Spain), flexible compensation, a day off on your birthday, annual salary reviews, gifts for special occasions, and more.
Enjoy an international, multicultural environment and a complimentary week-long stay at our apartment near Valencia, Spain (subject to availability).
Participate in company events like summer parties and engage in continuous professional development through training and tech community involvement.