Role Description:
As a Senior Lead within Software Engineering, you'll design and implement functionality with a focus on data engineering. You'll ingest and distribute semi-structured data on a Microsoft Fabric-based platform, modernizing data products and distribution channels. You'll drive the software development lifecycle for continuous data delivery and lead the evaluation and adoption of emerging technologies.
Key Responsibilities:
- Partner with and support SMEs and Tech Leads to ensure delivery on commitments.
- Build and maintain secure and compliant production data processing pipelines on Microsoft Fabric and Azure to ingest, land, and transform data into data products.
- Ensure that data pipelines and data stores are high-performing, efficient, organized, and reliable, based on business requirements and constraints.
- Design, implement, monitor, and optimize data platforms to meet functional and non-functional data pipeline requirements.
- Handle data-related implementation tasks including provisioning data storage, ingesting streaming and batch data, transforming data, implementing security, data retention policies, monitoring, and accessing external data sources.
- Design and operationalize large-scale enterprise data solutions using Azure data and analytics services.
- Implement data solutions using Microsoft Fabric (Lakehouse), Delta Lake, Azure Cosmos DB, Data Factory, Spark, and Azure Blob Storage, among others.
Skills and Experience:
- Relevant experience in data platforms within the Financial Services industry, including familiarity with Azure PaaS/SaaS offerings such as Fabric, Synapse, Purview, ADF, and Data Lake Storage.
- Proven experience as a data engineer or similar role, focusing on cloud-based distributed data processing platforms like Spark and open table formats such as Delta or Iceberg.
- Strong experience with Azure services including Synapse Analytics, Data Factory, Data Lake, Databricks, and others.
- Proficiency in Spark, SQL, and Python/Scala/Java.
- Experience building Lakehouse architectures using open-source table formats and tools such as Jupyter Notebook.
- Understanding of security best practices, including Azure Key Vault, IAM, RBAC, and monitoring tools.
- Ability to integrate, transform, and consolidate data from various sources for analytics solutions.
- Knowledge of data exploration, retention, validation, visualization, and related processes.
- Strategic thinking and operational capability for day-to-day delivery.
- Ability to understand business requirements and their implications on roadmaps.
- Basic understanding of Azure DevOps and Agile methodologies (Scrum, Kanban).
- Strong communication, presentation, documentation, and interpersonal skills.
- Self-management skills to work independently in a dynamic environment.
Location: Preferred locations are London or Bangalore.
LSEG is a leading global financial markets infrastructure and data provider committed to driving financial stability, empowering economies, and enabling sustainable growth.
Our values of Integrity, Partnership, Excellence, and Change underpin our culture. We foster a diverse, collaborative, and innovative environment, supporting sustainability and inclusive growth.
We offer benefits including healthcare, retirement plans, paid volunteering days, and wellbeing initiatives.
We are an equal opportunities employer, committed to non-discrimination and reasonable accommodations for religious practices, mental health, or physical disabilities.
Please review our privacy notice regarding the handling of your personal data.
Recruitment agency partners should ensure candidates are aware of this privacy notice.