The days of single targeting in medicine may be over
As pharma looks to provide personalized medicines, the industry has no option but to better inform the drug discovery and development process.
Despite the increase in pharma’s R&D spending in recent years, the number of innovative products reaching the market hasn’t seen a proportionate rise. This is a sobering fact when many medicines face the patent cliff. The risk of failure in drug discovery is certainly high, but there are many areas in which the process could be de-risked. Products often fail in later development stages due to issues that could have been prevented at the design stage.
The era of precision medicine will only demand more efficiency in drug discovery and development. To preserve the bottom line, market authorization holders will need to discover medicines faster and more efficiently. Many are seeking to strengthen discovery methods with techniques such as automation, machine learning and artificial intelligence.
Artificial intelligence (AI) has a lot to contribute to drug discovery, and is already doing so in parts of the industry. However, key gaps in the data need to be closed to ensure that the conclusions AI reaches are valid.
New approaches to discovery
At BioTrinity’s panel discussion on informing drug discovery through artificial intelligence, it was proposed that multi-targeting may be essential to adequately master the biomechanisms of a disease.
Biomarkers are key to understanding how to modulate a disease, but the unknown side effects of modifying a marker may make it a poor target. It may not be feasible to reduce complex disease biology to a single target and still achieve the desired changes. The panel noted that it may in fact be more accurate to focus on a combination of five or six targets.
In the Q&A session, an attendee noted that closing the loop between clinical trial outcomes and discovery is a key area for improvement, enriching the data that directs the development of new candidates. This would aid understanding of the reasons behind certain outcomes, trends and gene expression profiles, and of how certain clinical profiles influence a disease mechanism.
Data on failed studies and compounds need to be shared by industry members and academics to prevent others in the market from repeating the same mistakes. For example, complexity in a molecule can cause reactions to fail; these aspects should be captured to inform future experiments and prevent researchers from wasting time.
Next generation chemistry
It was noted that chemistry hasn’t undergone the revolutions that have simplified and streamlined other industries. A panel member at the event observed that genomes can now be sequenced in an afternoon for around £1,000, yet a key stumbling block in developing a drug is getting the chemistry support needed once a target has been identified.
The panel discussion on next generation chemistry illustrated that to generate value the industry needs to focus on generating very different molecules. The universe of possible chemical molecules is so vast that to date we have sampled only a tiny portion.
Lee Cronin, Regius Chair of Chemistry at Glasgow University, explained his work on digitizing chemistry to explore and sanity-check the chemical space. Machine learning can be used to shape the process of discovering new molecules. Digitized chemical synthesis involves loading a code into a machine that then generates the molecule.
Digitized chemistry would be productive for reducing the duplication of efforts, using cloud technology to compare similar experiments being conducted in labs across the world. It was highlighted that this digital approach would rely on data of a good quality.
Investment and funding
BioTrinity 2018 had a strong focus on how finance is driving technology use within science: deploying machine learning and AI to decode the human genome, fast-track drug discovery, reduce cash burn and pre-empt late-stage attrition. Life science funding is certainly evolving, and experts noted that it is imperative to find the right type of investors for your type of company rather than being swayed by high valuations.
Molecular biotools can be very expensive. Some in big pharma are looking to smaller, niche companies to illustrate success and the best paths forward.
Nanomedicine: A new golden age
One area in need of more funding in the UK is nanomedicine: the use of nanotechnologies to manipulate biological functions and to diagnose and treat conditions. These materials include biocompatible nanoparticles and nanorobots that can be built into devices or incorporated into existing compounds. Pfizer, Roche, Merck and AstraZeneca all have a presence in this space.
Nanotechnologies can provide worthwhile adjustments to existing medicines (nanoreformulation). In chemotherapy, nanodelivery systems can allow scientists to reach tumours and hidden metastases in a way that isn’t possible with a small or large molecule.
Nanomedicine could potentially be applied to brain disorders to slow their progression. Regarding the onset of Alzheimer’s, the expert panel said nanomedicine could top up the brain’s natural protective mechanisms and pathways as the disease sets in, and even repair existing damage.
The era of precision medicine does indeed bring great potential, but its bespoke nature places heavy burdens in terms of funding and labour intensity. In order to shrink these burdens, dementia could be tackled by shipping nanomedicine in bulk to sites that can then adapt the technology to specific patients.
Nanomedicine does have its risks, but it provides strong technical solutions and more precise delivery compared with the accuracy risks associated with stem cells.
As the fabric of pharma’s set-up begins to shift, manufacturers will need to commence a detox to keep pace or lead the way: stripping out inefficiencies in discovery and development and informing decisions with smarter technologies and methodologies.