Lead Data Engineer

On-site

CAD 60,000 - 80,000

Full time

Job summary

A global leader in data innovation is seeking a Lead Data Engineer in Alberta. You will influence data architecture, lead projects, and mentor engineers. The role requires expertise in data modeling and ETL pipelines, with at least six years of experience. Strong skills in cloud technologies and programming are essential. This position offers competitive pay and a collaborative environment focused on innovation and quality. Join this dynamic team to shape the future of data engineering.

Benefits

Subsidized health plans
Paid sick leave
Retirement plans with match

Qualifications

  • Minimum 6 years of relevant work experience.
  • Advanced experience with big data technologies.
  • Strong desire to learn and mentor others.

Responsibilities

  • Define and communicate technical environment requirements.
  • Lead the development of technical solutions.
  • Design and implement distributed data processing pipelines.
  • Drive collaboration with architecture teams.

Skills

Data modeling
ETL pipelines
Cloud technology
SQL
Python
Problem-solving

Education

Bachelor’s degree in computer science or equivalent

Tools

Databricks
Snowflake
Airflow
GitHub
Terraform

Job description

Job Detail

Placement Type: Temporary

Salary: $59.40 - $66.00 per hour

Start Date: 12.01.2025

W-2, no C2C

Must work in Pacific Time Zone

Overview

Aquent is proud to partner with a global leader in innovation, a company that constantly explores potential, breaks barriers, and pushes the edges of what’s possible. As a Lead Data Engineer, you will be at the forefront of shaping the future of data and analytics, directly influencing critical decisions and driving the evolution of our client’s data architecture.

About the Role

We are seeking a visionary Lead Data Engineer to join a highly motivated, global team dedicated to building cutting‑edge data and analytic solutions for a prominent enterprise. This is a hands‑on leadership role where you will define development standards, frameworks, and best practices, significantly impacting the efficiency and quality of data engineering efforts. You will be instrumental in designing and developing critical data pipelines, streamlining architecture, and ensuring data quality and reliability across data lakes and warehouses. This role offers an exciting opportunity to lead, mentor, and innovate within a dynamic environment.

Key Responsibilities

  • Define and communicate technical environment requirements, determine project scope, and provide technical estimates or capacity planning.
  • Translate product backlog items into robust engineering designs and logical units of work.
  • Lead the development of technical solutions that align with architectural standards and meet business needs.
  • Drive collaboration with architecture and platform teams on integration needs and designs, creating advanced technical designs and reviewing proof‑of‑concept efforts.
  • Design and implement data products and features in collaboration with product owners, data analysts, and business partners using Agile/Scrum methodologies.
  • Define and apply appropriate data acquisition, processing, and consumption strategies for various technical scenarios.
  • Design and implement distributed data processing pipelines using industry‑standard tools and languages.
  • Profile and analyze data to design scalable solutions.
  • Drive technical strategies for new data projects and optimize existing solutions.
  • Troubleshoot complex data issues and perform root cause analysis to proactively resolve product and operational challenges.
  • Build utilities, user‑defined functions, libraries, and frameworks to enhance data flow patterns and implement complex automated routines using workflow orchestration tools (an illustrative sketch follows this list).
  • Lead collaborative reviews of design, code, test plans, and dataset implementations to uphold data engineering standards.
  • Identify and remove technical bottlenecks for your engineering squad, providing leadership, guidance, and mentorship to other data engineers.
  • Anticipate, identify, and resolve data management issues to improve data quality.
  • Build and incorporate automated unit tests and participate in integration testing efforts.
  • Utilize and advance software engineering best practices, including source control, code review, testing, and continuous integration/delivery (CI/CD) on cloud infrastructure.
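
For illustration only, here is a minimal sketch of the kind of orchestration routine described in the bullet on workflow orchestration tools above, written as an Airflow 2.x DAG. The DAG id, schedule, and task callables are hypothetical placeholders, not the client's actual code.

```python
# Hypothetical example only: a minimal Airflow 2.x DAG illustrating an
# orchestrated extract-and-load routine of the kind this role would own.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Stub: pull raw records from a source system.
    return [{"id": 1, "value": 42}]


def load(**context):
    # Stub: read the extracted records from XCom and load them downstream.
    records = context["ti"].xcom_pull(task_ids="extract")
    print(f"Loading {len(records)} records")


with DAG(
    dag_id="example_daily_ingest",   # hypothetical name
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task
```
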
Qualifications

Must‑Have Skills & Experience

  • Bachelor’s degree in computer science, data science, software engineering, or a related field, or an equivalent combination of education, experience, and training.
  • Minimum 6 years of relevant work experience in designing and implementing innovative data engineering capabilities and end‑to‑end solutions.
  • Advanced experience with data modeling, warehousing, and building ETL pipelines, including experience with ETL tools such as Matillion and/or PySpark (an illustrative sketch follows this list).
  • Expertise in building and operating highly available, distributed systems for data extraction, ingestion, and processing of large datasets, with the ability to deliver end‑to‑end projects independently.
  • Advanced experience building cloud‑scalable, real‑time, and high‑performance data lake solutions, preferably with Databricks, Snowflake, and/or AWS.
  • Advanced experience with big data technologies such as Hadoop, Hive, Spark, EMR, and orchestration tools like Airflow.
  • Advanced proficiency in SQL and modern scripting or programming languages, such as Python and Shell.
  • Experience in CI/CD Pipeline for code deployment, with exposure to tools like GitHub, Jenkins, Terraform, and Databricks Assets Bundles.
  • Strong problem‑solving and interpersonal communication skills.
  • Demonstrated ability to deliver results on multiple projects in a fast‑paced, agile environment.
  • Strong desire to learn, share knowledge, and coach team members.
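
For illustration only, a minimal PySpark sketch of the ETL-pipeline experience listed above: read raw JSON from a lake path, apply a simple cleanse-and-aggregate transform, and write partitioned Parquet. All paths, column names, and the aggregate itself are hypothetical placeholders.

```python
# Hypothetical example only: a minimal PySpark ETL of the kind referenced above.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("example_etl").getOrCreate()

# Extract: read raw events from a placeholder data lake location.
raw = spark.read.json("s3://example-bucket/raw/events/")

# Transform: basic cleansing plus a daily aggregate per event type.
daily = (
    raw.filter(F.col("event_type").isNotNull())
       .withColumn("event_date", F.to_date("event_ts"))
       .groupBy("event_date", "event_type")
       .agg(F.count("*").alias("event_count"))
)

# Load: write partitioned Parquet to a placeholder curated location.
daily.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://example-bucket/curated/daily_event_counts/"
)
```
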
Nice‑to‑Have Skills & Experience

  • Certifications in Databricks and/or AWS.
  • Experience with Matillion (ETL).
  • Experience with data migration projects, particularly from Snowflake to Databricks.
  • Leadership behavior and the ability to work independently.
  • Familiarity with best practices around documentation.

About Aquent Talent

Aquent Talent connects the best talent in marketing, creative, and design with the world’s biggest brands.

Our eligible talent get access to amazing benefits like subsidized health, vision, and dental plans, paid sick leave, and retirement plans with a match. More information on our awesome benefits!

Aquent is an equal‑opportunity employer. We evaluate qualified applicants without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, veteran status, and other legally protected characteristics. We’re about creating an inclusive environment—one where different backgrounds, experiences, and perspectives are valued, and everyone can contribute, grow their careers, and thrive.
