A leading SaaS organization is seeking a Data Engineer for a 6-month contract role. The successful candidate will design web crawlers, build ETL pipelines, and perform data analysis to improve the quality of insights across a financial services platform. This predominantly remote position includes occasional office visits and offers a competitive daily rate.
Data Engineer - Contract
£450-£500 per day | Outside IR35
6-Month Initial Contract
Predominantly Remote | Occasional Office Visits Required
We are working with a fast-growing SaaS organization that provides data-driven solutions across the financial services sector. As they scale their products, they are looking to expand their data capabilities and improve the quality and availability of insights across their platform.
This role is crucial for enhancing their current architecture, integrating diverse data sources, and enabling predictive and prescriptive analytics that will directly influence business strategy and client delivery.
Key responsibilities
Design, deploy, and maintain Python-based web crawlers using tools such as Scrapy, BeautifulSoup, or Selenium (a minimal Scrapy sketch follows this list)
Implement scalable and reliable web scraping frameworks for high-volume data extraction across websites and social media platforms
Perform data cleaning, standardization, and normalization to ensure consistency and quality across all datasets (see the second sketch after this list)
Build and maintain ETL pipelines for processing structured and unstructured data
Conduct data analysis and modeling using tools like Pandas, NumPy, Scikit-learn, and TensorFlow
Leverage financial data expertise to identify trends, patterns, and anomalies within complex datasets
Support and improve SQL-based queries and work with database systems including PostgreSQL and MySQL
Collaborate with cross-functional teams, including data scientists, analysts, and product stakeholders, to support data-driven decision-making
Work with cloud environments such as AWS, Azure, or GCP, and explore opportunities to scale infrastructure
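By way of illustration, here is a minimal sketch of the kind of crawler described above, using Scrapy. The target site, CSS selectors, and field names are hypothetical; a production spider would also configure politeness settings such as download delays and robots.txt handling.

    import scrapy

    class ListingSpider(scrapy.Spider):
        # Hypothetical spider: site, selectors, and fields are illustrative only
        name = "example_listings"
        start_urls = ["https://example.com/listings"]

        def parse(self, response):
            # Yield one record per listing row on the page
            for row in response.css("div.listing"):
                yield {
                    "title": row.css("h2::text").get(),
                    "price": row.css("span.price::text").get(),
                }
            # Follow the pagination link, if one is present
            next_page = response.css("a.next::attr(href)").get()
            if next_page:
                yield response.follow(next_page, callback=self.parse)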
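Similarly, a minimal sketch of the cleaning and normalization step feeding an ETL load, using Pandas. The column names, input file, and SQLite target are assumptions for illustration; SQLite stands in for a relational store such as PostgreSQL or MySQL.

    import pandas as pd
    import sqlite3

    # Hypothetical input: the file path and column names are illustrative only
    df = pd.read_csv("raw_listings.csv")

    # Standardize text fields and drop exact duplicate rows
    df["title"] = df["title"].str.strip().str.lower()
    df = df.drop_duplicates()

    # Normalize price strings like "£1,250" into floats
    df["price"] = (
        df["price"]
        .str.replace(r"[£,]", "", regex=True)
        .astype(float)
    )

    # Load the cleaned frame into a relational table
    with sqlite3.connect("listings.db") as conn:
        df.to_sql("listings", conn, if_exists="replace", index=False)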
Required experience and skills
3-5 years of experience in a data engineering or similar role
Proficiency in Python for web crawling using libraries like Scrapy, BeautifulSoup, or Selenium
Strong understanding of data cleaning, standardization, and normalization techniques
Experience building ETL/ELT pipelines and working with large-scale data workflows
Hands-on experience with data analysis and machine learning libraries such as Pandas, NumPy, Scikit-learn, or TensorFlow
Familiarity with SQL and relational database systems (e.g., PostgreSQL, MySQL)
Exposure to cloud platforms such as AWS, Azure, or GCP
Experience with big data tools such as Spark and Hadoop
Previous experience working with financial data, including understanding of financial metrics and industry trends