Big Data Lead

FalconSmartIT

London

On-site

GBP 70,000 - 100,000

Full time

6 days ago


Job summary

A leading company in data solutions is seeking a Senior Data Engineer to enhance their data pipelines and cloud services. You will lead a team, utilize modern technologies like AWS and Databricks, and implement data governance best practices. This role is critical for maintaining high-performance data infrastructure in a fast-paced environment.

Qualifications

  • Senior experience in Data Engineering with expertise in AWS and Databricks.
  • Proficiency with IBM tools such as DB2 and DataStage.
  • Experience in delivering complex data pipelines is highly valuable.

Responsibilities

  • Lead Data Engineering teams to design and maintain scalable data infrastructures.
  • Architect and manage data platforms using IBM and Databricks technologies.
  • Implement best practices for data governance and data quality.

Skills

AWS
Databricks
Python
SQL
Data Governance
ETL
Team Management
DevOps

Job description

This role requires senior-level experience in Data Engineering and in building automated data pipelines on IBM DataStage & DB2, AWS, and Databricks, from source systems to operational databases through to the curation layer, using modern cloud technologies. Experience delivering complex pipelines will be highly valuable to how D&G maintains and delivers world-class data pipelines.

Knowledge in the following areas is essential:

  • Databricks: Expertise in managing and scaling Databricks environments for ETL, data science, and analytics use cases.
  • AWS Cloud: Extensive experience with AWS services such as S3, Glue, Lambda, RDS, and IAM.
  • IBM Skills: DB2, DataStage, Tivoli Workload Scheduler, UrbanCode.
  • Programming Languages: Proficiency in Python, SQL.
  • Data Warehousing & ETL: Experience with modern ETL frameworks and data warehousing techniques.
  • DevOps & CI/CD: Familiarity with DevOps practices for data engineering, including infrastructure-as-code (e.g., Terraform, CloudFormation), CI/CD pipelines, and monitoring (e.g., CloudWatch, Datadog).
  • Big Data: Familiarity with technologies such as Apache Spark, Hadoop, or similar.
  • ETL/ELT Tools: Experience creating common data sets across on-prem (IBM DataStage ETL) and cloud data stores.

Key responsibilities:

  • Leadership & Strategy: Lead Data Engineering team(s) in designing, developing, and maintaining highly scalable and performant data infrastructures.
  • Customer Data Platform Development: Architect and manage our data platforms using IBM (legacy platform) & Databricks on AWS technologies (e.g., S3, Lambda, Glacier, Glue, EventBridge, RDS) to support real-time and batch data processing needs.
  • Data Governance & Best Practices: Implement best practices for data governance, security, and data quality across our data platform. Ensure data is well-documented, accessible, and meets compliance standards.
  • Pipeline Automation & Optimisation: Drive the automation of data pipelines and workflows to improve efficiency and reliability.
  • Team Management: Mentor and grow a team of data engineers, ensuring alignment with business goals, delivery timelines, and technical standards.
  • Cross Company Collaboration: Work closely with all levels of business stakeholders including data scientists, finance analysts, MI, and cross-functional teams to ensure seamless data access and integration with various tools and systems.
  • Cloud Management: Lead efforts to integrate and scale cloud data services on AWS, optimizing costs and ensuring the resilience of the platform.
  • Performance Monitoring: Establish monitoring and alerting solutions to ensure the high performance and availability of data pipelines and systems, preventing impact on downstream consumers.