
Data Engineer

Cree

Belfast

On-site

GBP 40,000 - 80,000

Full time

Today

Job summary

Join a forward-thinking company as a Data Engineer where you'll expand and optimize data architectures and pipelines. This role involves collaborating with software developers and data scientists to ensure optimal data delivery across projects. You'll be responsible for building data solutions using cutting-edge tools like Snowflake and Azure, while also leveraging your skills in Python and SQL. If you're an intelligent, driven individual who thrives in a team environment and is eager to learn, this position offers a unique opportunity to make a significant impact in the data landscape. Be part of a dynamic team that believes in achieving what others think is impossible!

Qualifications

  • 3+ years of experience in Data Engineering or Software Engineering focused on data.
  • Hands-on skills with programming languages like Python, Java, or Go.

Responsibilities

  • Design, develop, and support data solutions for business partners.
  • Build and maintain data solutions using Snowflake, Azure, and Python.

Skills

Data Engineering
Python
SQL
Data Pipeline Development
Agile Methodologies
DevOps
ETL Tools
Cloud Computing

Education

Bachelor's in Computer Science or related field

Tools

Snowflake
dbt (data build tool)
Fivetran
Azure Cloud
Azure DevOps
Docker

Job description

The Role

As part of a global team, you will be responsible for expanding and optimizing our data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data solutions and building them from the ground up. The Data Engineer will support our software developers, database architects, data analysts and data scientists while ensuring that optimal data delivery architecture is consistent throughout ongoing projects. The Data Engineer will be responsible for the day-to-day activities related to the implementation of new services and support for existing services.

We are looking for intelligent, driven individuals who are passionate about what they do and have exceptional teamwork skills. The skills and experience needed for this role are listed below. However, we understand that there might be a few requirements that you don’t meet or skills that you don’t yet have. That’s OK! If you are a smart, passionate, hardworking individual who is eager to learn, we would like to speak with you about joining our wolf pack!

Your day-to-day – We do what others say can’t be done

  • Provide technical expertise and execute the design, development and support of data solutions for Wolfspeed business partners, including configuration, administration, monitoring, performance tuning, debugging, and operationalization.
  • Build and maintain data solutions using Snowflake, dbt (data build tool), Fivetran, Azure Cloud (storage, VMs, containers, Azure Data Factory), Python, Docker and SQL (a minimal example follows this list).
  • Participate in the development lifecycle using Agile/DevOps methodologies in Azure DevOps.
  • Translate simple to complex requirements into functional and actionable tasks.
  • Serve as a subject matter expert for Wolfspeed operations and data integration from enterprise applications (SAP, Oracle, ModelN, Salesforce, Workday, etc.), using that knowledge to craft data solutions that provide maximum visibility to global stakeholders.

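For a sense of what day-to-day work with this stack can look like, below is a minimal, illustrative Python sketch of querying Snowflake with the snowflake-connector-python package. The connection settings, warehouse, schema and orders table are assumptions made for illustration only, not details taken from this role.

# Minimal, illustrative sketch only: the connection settings, database objects
# and query below are hypothetical, not taken from this job posting.
import os

import snowflake.connector  # pip install snowflake-connector-python


def fetch_daily_order_counts():
    """Run a simple aggregate query against a hypothetical orders table."""
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="ANALYTICS_WH",  # hypothetical warehouse
        database="RAW",            # hypothetical database
        schema="SALES",            # hypothetical schema
    )
    try:
        cur = conn.cursor()
        cur.execute(
            """
            SELECT order_date, COUNT(*) AS order_count
            FROM orders
            GROUP BY order_date
            ORDER BY order_date
            """
        )
        return cur.fetchall()
    finally:
        conn.close()


if __name__ == "__main__":
    for order_date, order_count in fetch_daily_order_counts():
        print(order_date, order_count)

In practice, transformations like this are more likely to live in dbt models and be scheduled through Azure Data Factory or Fivetran; the point here is simply the kind of SQL-plus-Python work the role describes.
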
Your Profile – Ready to join the Pack?

  • 3+ years’ experience in a Data Engineering role, or a Software Engineering role with a focus on data.
  • Hands-on skills with a programming language such as Python, Java, Go, etc.
  • Public cloud experience (Azure, AWS or GCP).
  • Writing complex SQL queries.
  • Experience with ETL tools (Fivetran, Azure Data Factory) or writing custom data extraction applications (see the sketch after this list), plus data modeling, data warehousing and working with large-scale datasets.
  • Experience leveraging DevOps and lean development principles such as Continuous Integration and Continuous Delivery/Deployment using tools like Azure DevOps, GitHub, GitLab, etc.
  • Designing and building modern data pipelines and data streams.
  • This role may require additional duties and/or assignments as designated by management.
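
As a rough illustration of "writing custom data extraction applications", here is a small, hypothetical Python sketch that pages through a REST API and writes newline-delimited JSON, a common staging format for loading into a warehouse. The endpoint, pagination scheme and output layout are assumptions, not details from this posting.

# Illustrative only: the API endpoint, pagination scheme and output layout are
# assumptions, not details from this posting.
import json

import requests  # pip install requests

BASE_URL = "https://api.example.com/v1/orders"  # hypothetical source system


def extract_orders(output_path: str, page_size: int = 500) -> int:
    """Page through a hypothetical REST API and write newline-delimited JSON."""
    written = 0
    page = 1
    with open(output_path, "w", encoding="utf-8") as out:
        while True:
            resp = requests.get(
                BASE_URL,
                params={"page": page, "per_page": page_size},
                timeout=30,
            )
            resp.raise_for_status()
            records = resp.json()
            if not records:
                break
            for record in records:
                out.write(json.dumps(record) + "\n")
                written += 1
            page += 1
    return written


if __name__ == "__main__":
    print(f"wrote {extract_orders('orders.ndjson')} records")

A managed connector such as Fivetran would usually replace a hand-written extractor like this where one exists; custom code tends to fill the gaps.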