Top five takeaways from SmartLab Digital 2020

Pharma leaders offer key insights on increasing efficiency, enhancing connectivity and driving superior quality from their journeys to the lab of the future

SmartLab Digital 2020 brought together leading names from the world of pharma to offer their insights and best practices to drive innovation in the lab. As labs go “paperless” and capture increasing volumes of data, pharma companies are having to explore new ways to collect, store and manage data.

As Shivanjali Joshi-Barr, Solution Scientist at Clarivate Analytics, remarked during the event, “It is interesting to see how artificial intelligence (AI) and machine learning technology is being used to predict adverse outcomes for drug-drug interactions. AI is using adverse events reported for known drugs to create a new emerging area in data prediction.”

Bringing together expert analysis, data-driven benchmarks and peer-proven best practices from Pfizer, Roche, IDBS and Dotmatics, among others, the 2020 online event supported moving future visions of intelligent lab systems into strategy execution.

Unlocking the potential of your lab through master data management

Unjulie Bhanot, Solution Owner (Biologics Development) at IDBS, went beyond traditional lab management and shared insight on how the right informatics tools and platform can enable cutting-edge analysis.

Bhanot’s session homed in on how digital transformation can underpin a more effective data strategy to gain maximum potential from the lab, and explained why enhancing your master data management is key to a structured, consistent and aligned data ecosystem.

Understanding the landscape of your organization and considering its reporting needs, process steps and legacy instrumentation can greatly determine how to move from one data management implementation to the next, explained Bhanot.

Automated data acquisition

During his session at SmartLab Digital 2020, Rob Brown, VP Product Marketing at Dotmatics, demonstrated how lab scientists could save up to 80 per cent of the time they spend on manual data transfer and curation by integrating Internet of Lab Things and informatics capabilities.

With a key focus on automating workflows from instrument to analyst review, Brown tackled the recurring issue of reproducibility and reduced error in the lab by limiting human intervention. By taking the manual data schema through process creation to the final interaction with scientists in the lab, the workflow becomes much more scalable; enterprises dealing with high volumes of data can now automate contract research organization data exchanges.
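The instrument-to-review idea can be sketched in a few lines: raw instrument output is parsed automatically and only out-of-range results are flagged for a human to look at. This is a minimal illustration only; the function names (`parse_raw`, `stage_for_review`) and the CSV layout are assumptions for the example, not the Dotmatics product's API.

```python
# Minimal sketch of an automated instrument-to-analyst-review pipeline.
# The raw export format and function names are illustrative assumptions.
import csv
import io

def parse_raw(raw_text):
    """Parse a raw CSV-style instrument export into typed records."""
    records = []
    for row in csv.DictReader(io.StringIO(raw_text)):
        records.append({"sample_id": row["sample_id"],
                        "value": float(row["value"])})
    return records

def stage_for_review(records, lower=0.0, upper=100.0):
    """Flag out-of-range results so the analyst only reviews exceptions."""
    for r in records:
        r["flagged"] = not (lower <= r["value"] <= upper)
    return records

raw = "sample_id,value\nS1,42.0\nS2,120.5\n"
staged = stage_for_review(parse_raw(raw))
print([r["sample_id"] for r in staged if r["flagged"]])  # → ['S2']
```

Because a human never retypes the numbers, the same pipeline behaves identically on the tenth run as on the first, which is the reproducibility point Brown was making.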

Making data meaningful and actionable

Stephen Tierney, President at XiltriX North America, collaborated with Mark Narcy, Senior Facilities and Operations Manager at Crinetics Pharmaceuticals, to explain how to integrate multiple data sources into a single system while making data meaningful and actionable.

To avoid data monitoring solutions that seem cost-effective but lack access to the right information and hinder your ability to prove compliance when audited, Tierney said it was key to futureproof your measurement capabilities and integrate data from multiple systems into one all-inclusive platform.

It has become a matter of overcoming data challenges, such as error-prone manual tasks, working with legacy IT systems and varying data outputs, and utilizing the technology available in the industry to be more solution-driven, he added.
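The "varying data outputs" challenge usually comes down to normalization: each monitoring source emits readings in its own layout, and integration means mapping them all onto one record format. The sketch below assumes two hypothetical source formats (a semicolon-delimited legacy feed and a dict-based modern one); it is an illustration of the pattern, not XiltriX's implementation.

```python
# Illustrative sketch: normalizing heterogeneous monitoring feeds into one
# common record shape. Both source layouts are hypothetical examples.

def normalize_legacy(row):
    """Legacy system emits semicolon-delimited strings, e.g. 'FREEZER-3;temp_c;-80.2'."""
    name, metric, value = row.split(";")
    return {"asset": name, "metric": metric, "value": float(value)}

def normalize_modern(payload):
    """Modern system emits dicts, but with its own key names."""
    return {"asset": payload["device"],
            "metric": payload["measurement"],
            "value": float(payload["reading"])}

# Once normalized, records from any source can live in the same system.
feed = [normalize_legacy("FREEZER-3;temp_c;-80.2"),
        normalize_modern({"device": "CO2-INC-1", "measurement": "co2_pct",
                          "reading": "5.0"})]
print(sorted(r["asset"] for r in feed))  # → ['CO2-INC-1', 'FREEZER-3']
```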

Making better decisions in the lab using AI

Peter Henstock, Machine Learning and AI Technical Lead at Pfizer, and Stéphane Vellay, Senior Solution Scientist at Dassault Systèmes, discussed the impact of AI on the productivity of lab staff through two different use cases.

Both cases looked at how AI provides a framework for leveraging data to make better decisions, efficiently optimize processes to guide the choice of direction and enable scientists to perform richer, more complex experiments. Vellay went one step further and dove into the challenges of formulating new drug candidates, where bottlenecks continued to occur. Leveraging machine learning helped predict properties such as solubility or viscosity, and ensured models continuously updated as new data arrived from other experiments.
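The "continuously updating model" idea can be shown with a toy online regressor that takes one gradient step per new experiment, so predictions improve as data streams in. Everything here is a simplification for illustration: a single descriptor `x` standing in for a formulation, a synthetic linear property, and plain stochastic gradient descent rather than any vendor's actual modeling stack.

```python
# Toy sketch of a continuously updated property model (e.g. solubility).
# A single-feature linear model trained by SGD; purely illustrative.

class OnlineModel:
    def __init__(self, lr=0.01):
        self.w, self.b, self.lr = 0.0, 0.0, lr

    def predict(self, x):
        return self.w * x + self.b

    def update(self, x, y):
        """One gradient step each time a new experimental result arrives."""
        err = self.predict(x) - y
        self.w -= self.lr * err * x
        self.b -= self.lr * err

model = OnlineModel()
# Stream in experiments one at a time (synthetic data: property = 2 * x).
for _ in range(2000):
    for x in (1.0, 2.0, 3.0):
        model.update(x, 2.0 * x)
print(round(model.predict(4.0), 1))  # → 8.0
```

The point is the update loop: the model never retrains from scratch, it simply absorbs each new result, which is what keeps predictions current as experiments from elsewhere in the pipeline arrive.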

Reusing data to minimize costs

Dr. Felipe Albrecht, Bioinformatics and Computer Scientist at Roche, shared insights on how to implement a data lake for incorporating FAIR standards to minimize costs associated with accessing mass spectrometry data.

Diving straight into the key goal of implementing a mass spectrometry data lake – to improve data reuse for humans and machine learning training – Dr. Albrecht also highlighted how a data lake can save scientists a lot of time and money in pharma R&D discovery. The Roche scientist said it would allow scientists to organize data and metadata, access the data and its content, and develop methods to extract features from data all in-house.
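A data lake of this kind is, at its core, payloads indexed by rich metadata so that datasets are findable and accessible without re-running experiments. The sketch below shows that pattern with an in-memory store; the `DataLake` class, its methods and the metadata fields are assumptions for illustration, not Roche's implementation.

```python
# Minimal sketch of a metadata-indexed data lake in the FAIR spirit:
# every dataset is registered with an ID and queryable metadata.
# Class and field names are illustrative assumptions.

class DataLake:
    def __init__(self):
        self._store = {}  # dataset_id -> (metadata, payload)

    def register(self, dataset_id, metadata, payload):
        """Findable + Accessible: each dataset gets an ID and metadata."""
        self._store[dataset_id] = (dict(metadata), payload)

    def find(self, **criteria):
        """Query by metadata fields instead of repeating the experiment."""
        return [ds_id for ds_id, (meta, _) in self._store.items()
                if all(meta.get(k) == v for k, v in criteria.items())]

    def get(self, dataset_id):
        return self._store[dataset_id][1]

lake = DataLake()
lake.register("run-001", {"instrument": "QTOF", "project": "A"}, [101.2, 99.8])
lake.register("run-002", {"instrument": "Orbitrap", "project": "A"}, [55.1])
print(lake.find(project="A", instrument="QTOF"))  # → ['run-001']
```

Separating metadata from payload is what enables both the human reuse and the machine learning training Dr. Albrecht described: a training job can select runs by metadata and pull only the payloads it needs.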

SmartLab Digital 2021

Join us in March 2021, with IDBS confirmed as our headline sponsor alongside key leaders in digital solutions including Osthus and Dotmatics, to continue strategizing your adoption pathway and build a connected lab environment that meets the needs of your scientists and researchers.
