Data Engineer | MS Fabric

Loanworks Technologies

Metro Manila

Hybrid

PHP 700,000 - 1,200,000

Full time

Job description
DATA ENGINEER

This is for a full-time position.

Job Summary

The Data Engineer will be responsible for designing and implementing data pipelines on the Microsoft Fabric Data Platform, including creating Notebooks to extract, transform, and load data. The Data Engineer will receive technical guidance from a Senior Data Engineer based in Sydney and day-to-day support from the Data Team Leader.

Responsibilities
  • Design, build, and maintain data pipelines to support data ingestion, preparation, and transformation processes across multiple layers.
  • Develop and manage Notebooks for transforming data between Bronze, Silver, and Gold layers, ensuring proper version control and adherence to best practices.
  • Implement and maintain diverse ingestion methods, such as Shortcuts and Database Mirroring, to enable efficient data integration.
  • Conduct quality assurance and testing on ingestion processes, validating data accuracy and consistency between source systems and Microsoft Fabric.
  • Monitor and optimize pipeline performance, identifying inefficiencies and implementing improvements to enhance reliability and scalability.
  • Troubleshoot and resolve pipeline errors and alerts, ensuring minimal downtime and smooth data flow.
  • Document and maintain existing data pipelines, including creating clear technical documentation and optimizing workflows for long-term sustainability.
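For illustration only, the Bronze-to-Silver transformation mentioned above can be sketched in plain Python. The record fields, cleaning rules, and function name here are hypothetical assumptions, not part of the role description; in practice this logic would live in a Fabric Notebook.

```python
from datetime import datetime


def bronze_to_silver(bronze_rows):
    """Promote raw Bronze-layer records to validated Silver-layer records.

    Hypothetical example: drop rows missing a business key, normalise the
    amount to a float, and parse the ingestion date into a date object.
    """
    silver = []
    for row in bronze_rows:
        if not row.get("loan_id", "").strip():
            continue  # reject records without a business key
        silver.append({
            "loan_id": row["loan_id"].strip(),
            "amount": float(row["amount"]),
            "ingested_at": datetime.strptime(row["date"], "%Y-%m-%d").date(),
        })
    return silver


# One valid row and one invalid row (missing loan_id) for demonstration.
bronze = [
    {"loan_id": " LN-001 ", "amount": "1500.50", "date": "2024-06-01"},
    {"loan_id": "", "amount": "200", "date": "2024-06-02"},
]
silver = bronze_to_silver(bronze)
```

The same pattern (validate, normalise, promote) repeats at each layer boundary; only the rules differ.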
Qualifications
  • Bachelor's degree in Computer Science, Data Engineering, or a related field.
  • At least 2 years' experience in data engineering or ETL development.
  • Proficient in Python and SQL for data processing and analysis.
  • Skilled in working with REST APIs for data integration.
  • Hands-on experience with Microsoft Fabric or similar data platforms.
  • Strong understanding of data modeling and transformation techniques.
  • Strong problem-solving and troubleshooting ability.
  • Effective communication and documentation skills.
  • Ability to work collaboratively in a team environment.
  • Willingness to learn and try new things.
  • Willing to work a day-shift schedule in Ortigas (AU business hours, hybrid setup).