Overview
We are seeking a highly experienced Senior Data Engineer with strong expertise in Azure Databricks. This role will focus on building, supporting, and administering scalable, high-performance data pipelines that power real-time and batch analytics for trading, risk, and operational use cases. The ideal candidate will have a deep background in Databricks data engineering and administration as well as capital markets data, and will thrive in a fast-paced Agile environment.
Key Responsibilities
- Design, develop, and maintain robust data pipelines using Azure Databricks, Confluent, Delta Live Tables (DLT), Spark, and Delta Lake to support trading and market data workflows.
- Independently learn and enhance the existing data pipelines, ensuring continuity, scalability, and ongoing performance improvements.
- Provide production pipeline support, including job monitoring, incident resolution, and performance tuning in production environments.
- Administer Databricks workspaces and Unity Catalog, including cluster configuration, job scheduling, access control, and workspace optimization.
- Build and maintain CI/CD pipelines using GitLab, enabling automated testing, deployment, and versioning of data engineering code.
- Follow and enforce best practices in code management, including modular design, code reviews, and documentation using GitLab workflows.
- Collaborate with fellow team members, business analysts, and data architects to understand data requirements and deliver high-quality solutions.
- Build reusable components and frameworks to accelerate development and ensure consistency across data platforms.
- Actively participate in Agile ceremonies (e.g., sprint planning, stand-ups, retrospectives) and contribute to continuous improvement of team processes.
Qualifications
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
- 5+ years of experience in data engineering, with at least 2 years working with Azure Databricks.
- Strong proficiency in PySpark, SQL, and Python.
- Experience supporting production pipelines, including monitoring, alerting, and troubleshooting.
- Experience with GitLab CI/CD, including pipeline configuration, runners, and integration with cloud services.
- Familiarity with the capital markets domain, including market data feeds, order books, trade execution, and risk metrics.
- Proven ability to work effectively in Agile development environments.
- Azure certifications (e.g., Azure Data Engineer Associate).
- Experience with real-time data processing using Kafka or Event Hubs.
Benefits & Perks
We believe in fairly compensating all our people, providing a world-class health insurance plan and a range of core and flex benefits to suit individual preferences. You will receive:
- Hybrid working policy
- Discretionary performance-related bonus
- Personalized flex benefits
- A focus on your wellbeing, including talks and access to self-development tools
- Medical insurance for employees
- Comprehensive leave package of 40 days, inclusive of public holidays
What you will love about Exinity
Freedom to succeed is our core belief. In this role you will learn from colleagues and from new projects, exchange information and best practices in an open-minded environment, advance your career, and prosper by acquiring new skills and nurturing a team of professionals.
Exinity is an equal opportunities employer and positively encourages applications from suitably qualified and eligible candidates regardless of gender, sexual orientation, marital or civil partner status, gender reassignment, race, colour, nationality, ethnic or national origin, religion or belief, disability or age.