Technical Architect – Data Science
Location: Leicester
Job Purpose
TESTQ Technologies is an IT services and solutions company with expertise across multiple industry sectors. The Technical Architect – Data Science is responsible for designing, developing, and implementing end‑to‑end data and AI solutions. This role bridges data engineering, data science, and architecture by defining scalable frameworks, guiding model deployment, and ensuring optimal use of cloud and big‑data technologies.
Responsibilities
- Design and architect end‑to‑end data science and AI solutions aligned with enterprise strategy.
- Define scalable data architectures for ingestion, processing, storage, and analytics.
- Lead the design of machine learning pipelines, model deployment frameworks, and MLOps solutions.
- Collaborate with data scientists, engineers, and analysts to operationalize ML models in production.
- Evaluate and recommend tools, frameworks, and best practices for data science and AI initiatives.
- Ensure compliance with data governance, security, and privacy standards.
- Provide technical leadership and mentorship to the data science and engineering teams.
- Optimize cloud and on‑premises data architectures for performance, cost, and scalability.
- Drive innovation through proofs of concept (POCs) and pilot implementations of emerging AI/ML technologies.
Key Skills, Qualifications and Experience Needed
- Bachelor's degree in computer science, information technology, or a related discipline.
- 3 to 4 years of professional experience in a Technical Architect – Data Science or similar role.
- Proficiency in programming and scripting languages such as Python, R, SQL, Java, and Scala, along with shell scripting.
- Expertise in data science and machine learning libraries including NumPy, Pandas, Scikit‑learn, TensorFlow, PyTorch, Keras, XGBoost, and LightGBM.
- Solid understanding of data engineering and big‑data ecosystems with hands‑on experience using Apache Airflow, Luigi, and dbt for data workflow orchestration, and knowledge of Hadoop, Spark, Hive, Kafka, and Flink for distributed data processing.
- Experience with relational and NoSQL databases such as PostgreSQL, MySQL, Oracle, MongoDB, Cassandra, and Redis, and with managing data lakes and warehouses such as Snowflake, Databricks, Amazon Redshift, Google BigQuery, and Azure Synapse.
- Deep experience with cloud platforms (AWS, Azure, GCP) and designing scalable cloud‑native data solutions.
- Proficiency in MLOps and DevOps tools such as MLflow, Kubeflow, DVC, TensorFlow Extended (TFX), and CI/CD pipelines with Jenkins, GitHub Actions, Azure DevOps, or CircleCI.
- Knowledge of containerization and orchestration through Docker, Kubernetes, and Helm.
- Experience with model monitoring and governance tools such as Evidently AI, WhyLabs, and Neptune.ai.
- Strong understanding of API design and integration (REST, GraphQL), version control systems, and data security and compliance frameworks such as GDPR and HIPAA.
- Strong data visualization and business intelligence skills with tools like Power BI, Tableau, Looker, Superset, Plotly, and Dash.
Qualifications
Bachelor's degree or above, awarded in the UK or an equivalent qualification.
Salary
GBP 55,000 to GBP 65,000 per annum.
Additional Information
Published Date: 03 November 2025
Closing Date: 02 December 2025
Evaluation Process: CV Review, Technical Test, Personal and Technical Interview, and References
Job Type: Full‑time, Permanent (Part‑time and Fixed‑term options available)