Data Architect

JR United Kingdom

Slough

On-site

GBP 60,000 - 80,000

Full time

30+ days ago

Job summary

A global technology company in Slough is seeking a Data Engineer to design and maintain ETL/ELT pipelines and ensure data quality. The ideal candidate will have strong SQL skills and experience with data modeling, specifically using dbt. Familiarity with Apache Airflow and cloud platforms is essential. This role offers competitive compensation and opportunities for growth.

Qualifications

  • Strong proficiency in SQL, particularly with Snowflake.
  • Extensive experience with dbt for data modeling.
  • Proficiency in Python for data manipulation and automation.

Responsibilities

  • Design, develop, and maintain ETL/ELT pipelines using Snowflake.
  • Implement data transformations and build analytical data models.
  • Orchestrate and schedule data workflows using Apache Airflow.

Skills

SQL proficiency
Data transformation with dbt
Workflow orchestration with Apache Airflow
Python scripting
Cloud platforms (AWS, Azure, GCP)
Data governance best practices

Job description

HCLTech is a global technology company, home to more than 220,000 people across 60 countries, delivering industry-leading capabilities centered around digital, engineering, cloud and AI, powered by a broad portfolio of technology services and products. We work with clients across all major verticals, providing industry solutions for Financial Services, Manufacturing, Life Sciences and Healthcare, Technology and Services, Telecom and Media, Retail and CPG, and Public Services. Consolidated revenues as of 12 months ending December 2024 totaled $13.8 billion.

Responsibilities:

  • Design, develop, and maintain robust and scalable ETL/ELT pipelines using Snowflake as the data warehouse.
  • Implement data transformations and build analytical data models using dbt (data build tool).
  • Develop and optimize complex SQL queries for data extraction, transformation, and loading in Snowflake.
  • Orchestrate and schedule data workflows using Apache Airflow, including developing custom Python operators and DAGs (see the sketch after this list).
  • Write efficient and maintainable Python scripts for data processing, automation, and integration with various data sources and APIs.
  • Ensure data quality, integrity, and governance throughout the data lifecycle.
  • Collaborate with data analysts, data scientists, and business stakeholders to understand requirements and deliver data solutions.
  • Implement and maintain CI/CD pipelines for data engineering processes, including version control with Git.
  • Monitor data pipelines, troubleshoot issues, and optimize performance for efficiency and cost-effectiveness.
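
By way of illustration, a minimal Airflow DAG of the kind described above might look like the sketch below: a dbt run against the Snowflake warehouse followed by a simple post-load check. This assumes a recent Airflow 2.x installation with the Snowflake provider package; the DAG name, schedule, connection ID, dbt paths, and table are hypothetical placeholders, not details from this role.

from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

with DAG(
    dag_id="daily_elt_pipeline",          # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Build/refresh the analytical models with dbt (paths are placeholders).
    run_dbt_models = BashOperator(
        task_id="run_dbt_models",
        bash_command="dbt run --project-dir /opt/dbt --profiles-dir /opt/dbt",
    )

    # Simple post-load data-quality check executed directly in Snowflake.
    check_row_counts = SnowflakeOperator(
        task_id="check_row_counts",
        snowflake_conn_id="snowflake_default",        # hypothetical connection ID
        sql="SELECT COUNT(*) FROM analytics.orders",  # hypothetical table
    )

    run_dbt_models >> check_row_counts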

Qualifications:

  • Strong proficiency in SQL, particularly with Snowflake's features and functionalities.
  • Extensive experience with dbt for data modeling, transformations, testing, and documentation.
  • Solid experience with Apache Airflow for workflow orchestration and scheduling.
  • Proficiency in Python for data manipulation, scripting, and automation.
  • Experience with cloud platforms (e.g., AWS, Azure, GCP) and relevant data services.
  • Understanding of data warehousing concepts, dimensional modeling, and ELT principles.
  • Familiarity with data quality, governance, and security best practices.
  • Excellent problem-solving, analytical, and communication skills.
  • Ability to work independently and collaboratively in a fast-paced environment.