Cognitive Chemical Manufacturing


Cognitive Chemical Manufacturing (CCM) is a £2.5M EPSRC project between the University of Leeds, AstraZeneca, IBM, Swagelok, UCL, the University of Nottingham and Promethean Particles that started in November 2018. The project aims to develop an Industry 4.0 approach that revolutionises the transfer from laboratory to production using advanced data-rich and cognitive computing technologies.

It combines IBM's expertise in algorithm development, UCL researchers' work on automated model generation and discrimination, the experimental and automation expertise of the Institute of Process Research and Development (iPRD) at Leeds, and the advanced hydrothermal reactors developed at the University of Nottingham.

We use an automated platform capable of self-optimising chemical reactions using new algorithms and evolving kinetic motifs that merge data analysis with the generation of further experiments. Cloud-based services generate experiment set-points, which are delivered to automated laboratory platforms (LabBots). A key capability is that the analysis services can receive and analyse results and send further experiments to the LabBots, creating a closed loop between data generation and data analysis. This enables the application of machine learning to chemical development: the system continuously learns from previous iterations, increasing in confidence and knowledge over time.
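
The closed loop described above can be sketched in miniature: an analysis service proposes the next set-point from the results so far, a (here simulated) LabBot runs it, and the loop repeats. The function names, the two-variable set-point and the response surface below are illustrative assumptions, not the project's actual interfaces.

```python
import random

def run_experiment(set_point):
    """Simulated LabBot run: returns a measured yield for (temp, time).
    Stand-in for the real cloud round-trip; peak at 120 C, 30 min."""
    temp, time = set_point
    return 100 - 0.01 * (temp - 120) ** 2 - 0.05 * (time - 30) ** 2

def propose_next(history):
    """Stand-in analysis service: perturb the best set-point seen so far."""
    best, _ = max(history, key=lambda h: h[1])
    return (best[0] + random.uniform(-5, 5), best[1] + random.uniform(-2, 2))

random.seed(0)
start = (100.0, 20.0)
history = [(start, run_experiment(start))]
for _ in range(50):                      # closed loop: analyse, then run again
    sp = propose_next(history)
    history.append((sp, run_experiment(sp)))

best_sp, best_yield = max(history, key=lambda h: h[1])
print(best_sp, round(best_yield, 1))
```

In the real platform the analysis service would of course use the project's learning algorithms rather than random perturbation; the point here is only the result-in, experiment-out loop structure.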

Using the same cloud-based platform, this process understanding can be rapidly transferred to larger-scale systems that use the same data transfer protocols but operate on a multi-kg/day scale. The platform is then validated via in-depth case studies related to current manufacturing challenges faced by AstraZeneca (pharmaceuticals) and Promethean Particles (nanoparticles).

Automated Self-Optimising Reactors for Multistage Processes


Flow chemistry is recognised as an essential technique for modern chemistry, aligning the goals of safe, "green" and cost-effective processing. It enables novel reaction regimes that reduce the hazards inherent in today's chemical industry, and it has provided practical solutions that widen the availability of pharmaceutical products to the world's populations. As an underpinning technology, flow chemistry has the potential for far greater impact than currently realised: faster time to market, improved consistency and control, safer medicines, and processing regimes that are not readily achievable via conventional pharmaceutical processing.

This project is intended to: widen the applicability of self-optimising systems and explore the nature of multi-process step optimisations, contrasting different machine learning algorithms; push the process envelope to cover more challenging problems for flow systems; build on the current knowledge, extending objective functions to include cost and environmental factors.

The project is supported by Dr Reddy's Laboratories, a global pharmaceutical company based in India that specialises in making essential medicines more affordable and available to the world's populations. Recent publications have demonstrated the concept of self-optimising automated reactors by combining reactor control with feedback optimisation algorithms: the approach was applied to the final stage in the synthesis of an EGFR kinase inhibitor and, more recently, to multi-objective optimisations using Pareto fronts.
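
To illustrate the multi-objective idea, a Pareto front can be extracted from a set of experimental results by discarding dominated points. The numbers below are invented for illustration (maximise yield, minimise cost); they are not data from the cited studies.

```python
def pareto_front(points):
    """Keep non-dominated (yield, cost) points: maximise yield, minimise cost.
    q dominates p if q has yield >= p's and cost <= p's (and q != p)."""
    return [p for p in points
            if not any(q != p and q[0] >= p[0] and q[1] <= p[1]
                       for q in points)]

# Invented (yield %, cost per kg) results for candidate operating conditions.
experiments = [(62, 40), (75, 55), (75, 48), (88, 70), (90, 95), (50, 90)]
print(pareto_front(experiments))  # the non-dominated trade-off curve
```

Each surviving point represents a genuinely different trade-off, which is what a multi-objective self-optimisation presents to the process chemist rather than a single "best" condition.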

The project is a collaboration between Dr Richard Bourne (Leeds – Institute of Process Research and Development) and Dr Reddy's Laboratories Ltd. This interdisciplinary project will focus on several key elements:

• Automated generation of multi-step optimisation profiles
• Model generation from catalytic and multi-phase reaction systems
• Development and comparison of optimisation algorithms

FLEXICHEM: Flexible Digital Chemical Manufacturing Through Structure/Reactivity Relationships

We will create an artificially intelligent system that self-optimises chemical manufacturing to adapt flexibly to variation in external factors (e.g. supply chain issues, cost, environment), using chemical molecular property maps to suggest suitable alternative reagents, catalysts and solvents. This rapid, flexible system will be essential for promoting manufacturing through a more responsive chemical manufacturing framework. The routes will be tailored for agrochemical applications in line with our industrial partners' interests, but the components of the technology will be transferable across different chemical manufacturing sectors. We will assemble and program a system capable of conducting several discrete chemical processing options, including (i) changing catalyst or reagent choice, (ii) altering reactor configuration (e.g. batch to CSTR), and (iii) responding to differing requirements driven by external influences (e.g. cost changes due to COVID-19). The system will be driven by computationally intelligent algorithms that self-optimise the processes without user interaction (i.e. invisibly), made accessible through a user-friendly interface.


NanoMan: Self-Optimising Nanoscale Manufacturing Platforms for Achieving Multiscale Precision

Improving our current lifestyle and ensuring the health of a growing population relies on the development of more advanced consumer products. Many of these engineered products have advanced functionality delivered by particles with nanometre dimensions, many thousands of times smaller than the width of a human hair. The exact size of these nanoparticles determines the mechanism of action and performance for the specific application. In healthcare, many drugs require encapsulation within polymer nanoparticles for several reasons, including dissolving insoluble drugs, protecting drugs from unwanted degradation (e.g. mRNA vaccines) and providing efficient delivery (anti-cancer drugs). In electronics, the colour and intensity of emitted light can be finely tuned by controlling the size of quantum dot nanoparticles, resulting in much higher quality displays, ultra-thin smart coatings (e.g. for wearable technologies), advanced diagnostics, high-intensity medical imaging and high-efficiency solar panels. The accuracy required to produce these materials is phenomenal and often only achieved reproducibly in dedicated research laboratories by specialist scientists. There has therefore been little progress on scaling up in a cost-effective or sustainable manner.

In this project we will build platform technologies, comprising advanced chemical reactors underpinned by computational intelligence, which can scale up production of advanced nanoparticle products without losing the precise control over structural dimensions achieved in research laboratories. We will build laboratory reactors that can be programmed to monitor the nanoparticle formation process in real time and relate conditions to particle properties. Throughout the manufacturing process, machine learning algorithms will direct the reactors towards the desired specification through 'self-optimisation' of conditions. A critical part of the project is then using the data obtained in the laboratory experiments to build a relationship between process and product that can be transferred onto equipment capable of making the materials at a commercially relevant scale, in a process called augmented lossless scale-up. We will take the optimised laboratory nanoparticle formation processes and demonstrate scale-up in several manufacturing environments, including R&D process laboratories and commercial manufacturing facilities at our partners' sites. Such demonstrations will encourage further innovation beyond the lifetime of the project, working towards realising advanced materials currently confined to research laboratories.

Self-Learning Reactor Systems for Automated Development of Kinetic Models


A major bottleneck in transitioning from chemistry research to process development is the lack of quantitative chemical synthesis information, in particular reaction kinetics: both the reaction model and its kinetic parameters. If readily available, this information would allow the application of classic reaction engineering principles to shorten process development time and lower costs. During the chemical development life-cycle of fine chemicals and pharmaceuticals, experimental data from batch and semi-batch reactors tend to be used to estimate the unknown kinetic parameters of a proposed reaction model. This approach is still seen as a resource-intensive and specialised activity.
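
As a minimal illustration of kinetic parameter estimation, a first-order rate constant can be recovered from batch concentration data by linearising the rate law and applying least squares. The data below are simulated for illustration, not drawn from the project.

```python
import math

# Simulated batch data for A -> B assumed first order: C = C0 * exp(-k t).
times = [0.0, 5.0, 10.0, 20.0, 40.0]      # min
conc  = [1.00, 0.78, 0.61, 0.37, 0.135]   # mol/L

# Linearise the rate law: ln C = ln C0 - k t, then least squares for the slope.
y = [math.log(c) for c in conc]
n = len(times)
tbar, ybar = sum(times) / n, sum(y) / n
slope = (sum((t - tbar) * (yi - ybar) for t, yi in zip(times, y))
         / sum((t - tbar) ** 2 for t in times))
k = -slope
print(f"estimated k = {k:.4f} min^-1")
```

In practice the model form is rarely known in advance, which is exactly why the automated model discrimination this project targets is valuable.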

This project proposes to couple the automated reactor platforms developed at Leeds and AstraZeneca to mixed integer linear programming techniques capable of kinetic model discrimination to create a truly autonomous system for the evaluation and development of scalable process models.
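
The underlying idea of model discrimination can be sketched simply: fit each candidate rate law to the same batch data and keep the one with the smaller residual error. The project's mixed integer linear programming formulation is far more sophisticated than this; the data and rate constants below are invented for illustration.

```python
import math

# Simulated batch concentrations actually generated by second-order decay.
times = [0.0, 2.0, 4.0, 8.0, 16.0]
conc  = [1.00, 0.715, 0.55, 0.385, 0.24]   # mol/L, C0 = 1

def first_order(t, k):   # candidate 1: C = C0 * exp(-k t)
    return math.exp(-k * t)

def second_order(t, k):  # candidate 2: C = C0 / (1 + k C0 t)
    return 1.0 / (1.0 + k * t)

def fit(model):
    """Crude grid search for the rate constant minimising squared error."""
    def sse(k):
        return sum((model(t, k) - c) ** 2 for t, c in zip(times, conc))
    return min((sse(k / 1000), k / 1000) for k in range(1, 1001))

(sse1, k1), (sse2, k2) = fit(first_order), fit(second_order)
chosen = "second-order" if sse2 < sse1 else "first-order"
print(chosen, k2)
```

An autonomous system closes the loop further: rather than fitting a fixed data set, it designs the next experiment to maximally discriminate between the surviving candidate models.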

Machine Learning for Chemical Manufacture


This exciting project combines IBM's expertise in the development of optimisation algorithms, UCL researchers' work on automated model generation and discrimination, the experimental automation expertise of the Institute of Process Research and Development at Leeds, and the advanced hydrothermal reactors developed at the University of Nottingham. This research capability will be used to develop new algorithms for machine-learning-based generation of chemical process design knowledge and to couple these algorithms to a cyber platform for automated experimentation. The combined cyber-physical system will be validated via in-depth case studies related to current pharmaceutical manufacturing challenges.

This project aims to develop an Industry 4.0 approach revolutionising the transfer from laboratory to production using advanced data-rich and cognitive computing technologies. We will develop new algorithms based on Bayesian optimisation and evolving kinetic motifs that merge data analysis with the generation of further experiments. Cloud-based machine learning services (hubs) will generate experiment set-points delivered through the cloud to automated laboratory platforms (LabBots). A key novelty is that the analysis services can receive and analyse results and post further experiments to the LabBots, creating a closed loop between data generation and data analysis. This enables the application of machine learning to chemical development: the system will continuously learn from previous iterations, increasing in confidence and knowledge over time.
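
A minimal sketch of such a Bayesian-optimisation loop, assuming a single temperature set-point and a simulated yield response: a Gaussian-process surrogate is fitted to the experiments so far, and the next set-point is the one maximising expected improvement. The kernel, response surface and all parameter values below are illustrative assumptions, not the project's algorithms.

```python
import math
import numpy as np

def rbf(a, b, length=10.0):
    """Squared-exponential kernel on 1-D inputs (unit signal variance)."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length ** 2)

def gp_posterior(X, y, Xs, jitter=1e-6):
    """Posterior mean and standard deviation of a zero-mean GP at Xs."""
    K_inv = np.linalg.inv(rbf(X, X) + jitter * np.eye(len(X)))
    Ks = rbf(X, Xs)
    mu = Ks.T @ K_inv @ y
    var = 1.0 - np.einsum('ij,ik,kj->j', Ks, K_inv, Ks)
    return mu, np.sqrt(np.clip(var, 1e-12, None))

def expected_improvement(mu, sigma, best):
    """EI for maximisation, via the standard normal pdf and cdf."""
    z = (mu - best) / sigma
    cdf = 0.5 * (1.0 + np.array([math.erf(v / math.sqrt(2)) for v in z]))
    pdf = np.exp(-0.5 * z ** 2) / math.sqrt(2 * math.pi)
    return (mu - best) * cdf + sigma * pdf

def measured_yield(T):
    """Hypothetical yield response, peak near 120 C (stand-in for a LabBot)."""
    return 90.0 - 0.02 * (T - 120.0) ** 2

grid = np.linspace(60.0, 180.0, 241)     # candidate temperature set-points
X = np.array([70.0, 160.0])              # two initial experiments
y = measured_yield(X)

for _ in range(8):                       # closed loop: analyse, propose, run
    yn = (y - y.mean()) / y.std()        # standardise for the zero-mean GP
    mu, sigma = gp_posterior(X, yn, grid)
    T_next = grid[np.argmax(expected_improvement(mu, sigma, yn.max()))]
    X = np.append(X, T_next)
    y = np.append(y, measured_yield(T_next))

print(round(float(X[np.argmax(y)]), 1), round(float(y.max()), 1))
```

Expected improvement balances exploiting regions the surrogate already predicts well against exploring regions of high posterior uncertainty, which is what lets the system gain confidence over successive iterations.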

Objectives:

- Develop optimisation algorithms that combine global and local search methodologies for rapid optimisation and definition of operating space
- Develop simulation tools for evaluation of optimisation algorithms on a variety of chemical kinetic profiles
- Perform case studies on real chemical systems demonstrating enhancements via benchmarking against traditional approaches
- Develop optimisation routines for non-reactive processes such as liquid-liquid separation and chromatography
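
The first objective, combining global and local search, can be illustrated with a two-phase sketch: random sampling across the whole operating space, followed by a shrinking-step pattern search around the best sample. The response surface, bounds and step sizes below are invented for illustration.

```python
import random

def response(x):
    """Invented yield surface over (temperature, residence time)."""
    temp, time = x
    return 90.0 - 0.02 * (temp - 135.0) ** 2 - 0.1 * (time - 22.0) ** 2

bounds = [(60.0, 180.0), (5.0, 60.0)]
random.seed(1)

# Global phase: random sampling across the whole operating space.
samples = [tuple(random.uniform(lo, hi) for lo, hi in bounds)
           for _ in range(30)]
best = max(samples, key=response)

# Local phase: coordinate pattern search with shrinking steps.
steps = [10.0, 2.0]
while max(steps) > 0.1:
    improved = False
    for i in (0, 1):
        for d in (steps[i], -steps[i]):
            cand = list(best)
            cand[i] = min(max(cand[i] + d, bounds[i][0]), bounds[i][1])
            if response(tuple(cand)) > response(best):
                best, improved = tuple(cand), True
    if not improved:
        steps = [s / 2 for s in steps]

print(tuple(round(v, 1) for v in best), round(response(best), 2))
```

The global phase guards against settling in a poor region of the operating space, while the local phase delivers the rapid refinement and operating-space definition the objective describes.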