Data Engineer

EQ Bank | Equitable Bank

Regina

On-site

CAD 80,000 - 100,000

Full time

Yesterday

Job summary

A financial technology company in Canada is seeking a Data Engineer specializing in Azure technologies. The role involves designing and maintaining the enterprise data warehouse, developing data pipelines, and ensuring efficient data flow and integration. Ideal candidates will have over 3 years of experience and strong skills in T-SQL and Python programming. This role offers opportunities for innovative problem-solving in a collaborative environment.

Qualifications

  • 3+ years of proven experience as a Data Engineer, specifically working with Microsoft Azure technologies.
  • Understanding of data warehousing concepts and best practices.
  • Experience with data modeling.
  • Experience in working with structured, semi-structured, and unstructured data.
  • Excellent written and verbal communication skills.
  • Excellent multi-tasking and organizational skills.

Responsibilities

  • Develop scalable Synapse pipelines for data ingestion and processing.
  • Optimize T-SQL and Python-based data processing scripts.
  • Design mapping dataflows for data transformations.
  • Implement API integrations for data flow between systems.
  • Monitor and optimize performance of Azure data services.

Skills

T-SQL
PySpark
Mapping Dataflow
Python programming
Azure Data Factory
Azure Data Lake
Azure Synapse
REST APIs
Data warehousing concepts

Education

Bachelor's degree in Computer Science or Engineering

Tools

Azure Function App
Azure SQL Database
Azure Logic App
Event Hub

Job description

Purpose of Job:

The Data Engineer will be responsible for the design and maintenance of the enterprise data warehouse across various projects, as well as for providing ongoing support for activities impacting all enterprise databases. The ideal candidate will have hands-on experience with Azure Data Factory, Azure Data Lake, Azure Synapse, Azure Function App, Logic App, REST APIs, mapping dataflows, and Event Hub. As a key member of our team, you will collaborate with various stakeholders to ensure efficient data flow, storage, and integration, while employing best practices in data engineering.


Main Activities:
  • 1. Development of Data Pipelines (50%)
    · Design, develop, and maintain scalable Synapse pipelines for data ingestion, processing, and transformation on the Azure data platform.
    · Develop and optimize T-SQL and Python-based data processing scripts and integrate them with Synapse pipelines for ETL tasks and data transformations.
    · Design and develop mapping dataflow solutions for complex data transformations.
    · Develop and maintain data storage solutions, ensuring efficient data retrieval and storage in Azure Data Lake and databases.
    · Implement and maintain API integrations, including REST APIs, for seamless data flow between systems.
    · Design and test the end-to-end process for data ingestion, transformation, cleansing, delivery, quality, and timeliness.
    · Implement and ensure data security and privacy measures, complying with industry standards and regulations.
    · Monitor and optimize the performance of Azure data services, Azure Synapse pipelines, Function Apps, and Logic Apps.
    · Improve the scalability, efficiency, and cost-effectiveness of data pipelines.
  • 2. Requirements Gathering and Technical Documentation (10%)
    · Participate in Joint Application Development sessions with business units to gather and understand their data and reporting requirements.
    · Create and maintain technical documentation for tools, development, and procedures.
  • 3. Data Strategy and Planning (10%)
    · Provide input and assist in the enablement of cloud-based data solutions.
    · Develop and recommend innovative approaches to solve business and technical problems.
    · Collaborate with the larger team to exchange knowledge, solutions, and practices to build a more consistent, robust approach to development.
  • 4. Support and Maintenance of Enterprise Data Warehouse and BAU Tasks (30%)
    · Continuously improve performance and proactively identify and resolve bottlenecks that will reduce the time to build and deliver our products.
    · Troubleshoot and resolve data-related issues in a timely manner for production and other environments (DEV, UAT, QA).
    · Perform business-as-usual (BAU) tasks as assigned.

Skills/Knowledge Requirements:
  • Bachelor's degree or equivalent in the field of Computer Science or Engineering
  • 3+ years of proven experience as a Data Engineer, specifically working with Microsoft Azure technologies.
  • Proficient in T-SQL, PySpark, mapping dataflows, and Python programming.
  • Hands-on experience with Azure data services, including Azure Data Factory, Azure Data Lake, Azure Function App, Azure Logic App, Azure SQL Database, Azure Synapse Analytics, mapping dataflows, and Event Hub.
  • Understanding of data warehousing concepts and best practices.
  • Experience with data modeling.
  • Experience in working with structured, semi-structured, and unstructured data.
  • Excellent written and verbal communication skills.
  • Excellent multi-tasking and organizational skills.
