Why future-proofing your lab systems cannot be ignored
Marc Siladi shares why the lab of the future won’t comprise a single system and what needs to happen to future-proof systems for scientists
It’s hard to believe that any one solution will be able to address the diversity of challenges the research lab faces. What will be key as we look forward is the ability to future-proof systems, address integration issues and limit the operational burden.
In this interview, Marc Siladi, Data Analytics Product Manager at Thermo Fisher Scientific, offers a solution to this dilemma and addresses how the future of work will change for many scientists.
Deep Dive: Join Marc Siladi for his webinar at Smart Labs Digital. Secure your place here
Pharma IQ: Why do you believe the lab of the future won’t comprise of one single system?
Marc: When you think about our industry and all the diversity of research that exists throughout the world today, it’s not just one vertical. It’s not just small molecule or biologics drug discovery or gene therapy. It’s really an amalgam of several verticals within healthcare. And each vertical has a unique set of problems that need to be solved.
Across and within these verticals, there is a clear need to capture and structure information and to derive insights from scientific data. And there’s not going to be one company or software program or programming language or set of instruments that solves all these problems. A platform that hosts a marketplace of integrated solutions will be needed to drive science forward.
There is a clear need to capture and structure data in the right way so you can gain new insights
Ultimately, we want to uncover insights hidden in our data, which may exist in many different silos, but data can be large, complex, diverse, and ever-changing. Data standards also change from company to company and from industry to industry, and evolve over time. And we need to make sure we can expose and aggregate all this information to uncover multiparametric insights. But it’s really going to require multiple different software systems to analyse data, cleanse and prep data and share information. Data is the focal point; it’s the fabric or glue that binds together all these disparate processes.
A paramount challenge for data scientists, statisticians and life scientists is keeping up with rapidly changing technologies. A platform with the interoperability to expand outwards and incorporate different software programmes, programming languages and best-in-class tools is the answer; this platform needs to be an inclusive, adaptable system.
The pace of scientific change is accelerating far beyond the capabilities of static legacy data management
Think about where we have come in the last 40 years. It would be unreasonable to assume the landscape won’t change at a faster rate in the next decade. This pace of scientific change is accelerating far beyond the capabilities of static legacy data management systems.
Pharma IQ: How will the lab of the future centralize these requirements and cooperate with multiple systems?
Marc: Our vision and solution is the Thermo Fisher Scientific Platform for Science.
Founded by lab scientists who wanted to easily collect, store, access, and analyze scientific data, the Platform for Science puts scientific relationships and data integrity at its center from the start. The Platform for Science is designed to support workflows using a flexible and extensible data model based on FAIR data principles. All products and application-based solutions work together on top of the Platform for Science, preserving a single point of truth for all data.
Find out more: How to enable the lab through advanced analytics
Pharma IQ: How does this type of software change things on the ground for those using the system?
Marc: Designed from the start to run on the Amazon Web Services (AWS) cloud, the Platform for Science’s services and compute power can scale along with our customers’ scientific needs. The cloud-based deployment model enables collaboration around the globe, providing secure and appropriate access to external partners. For anyone working with scientific and/or laboratory data, the Platform for Science makes it possible to store, share, and use that data with more confidence and less frustration. All data lives in the same platform architecture, streamlining analysis and providing a single source of truth. Scientists can easily add capabilities over time through apps, solution sets, and configurations – with no custom code. And they can extend the platform using the industry-leading OData API to integrate with best-in-class tools.
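OData is an open standard, so queries against an OData API are built from well-known system query options such as `$select`, `$filter` and `$top`. As a minimal sketch of what such a query looks like — the base URL and entity name below are hypothetical placeholders, not actual Platform for Science endpoints:

```python
from urllib.parse import urlencode

def build_odata_query(base_url, entity, select=None, filter_expr=None, top=None):
    """Build an OData query URL from the standard $select/$filter/$top options."""
    params = {}
    if select:
        params["$select"] = ",".join(select)
    if filter_expr:
        params["$filter"] = filter_expr
    if top is not None:
        params["$top"] = str(top)
    query = urlencode(params, safe="$")  # keep the leading $ of each option readable
    return f"{base_url}/{entity}" + (f"?{query}" if query else "")

# Hypothetical endpoint and entity names -- for illustration only.
url = build_odata_query(
    "https://example.com/odata",
    "Samples",
    select=["Name", "CreatedDate"],
    filter_expr="Status eq 'Active'",
    top=10,
)
print(url)
```

Any HTTP client or analysis tool that speaks OData can then page through the results, which is what makes this style of API a natural integration point for external best-in-class tools.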
Scientists need to find ways to store, share and use data with more confidence and less frustration
We believe modern infrastructure and IT resources are too valuable to squander on maintaining informatics systems. When daily maintenance fills the IT queue, essential workflow modifications and important change orders can take weeks or months to receive attention from the IT team or a third-party vendor. Further, requests for current workflow changes or new types of data to be captured and indexed only increase the IT maintenance backlog, compounding the problem. Science, analytical techniques and experiment design are evolving rapidly.
Pharma IQ: Why is future proofing the systems that you use an important consideration?
Marc: We don’t want our scientists to learn new systems or user interfaces every couple of years because these systems can’t keep up with the pace of science and how the lab changes. You want a system that can evolve. You also want to lower the activation energy required to get accustomed to the system so that the end users can use it and interact with it successfully.
Another way we’ve been able to achieve this is by having an open system from an API perspective. We’re betting big on our supported integration with R, Shiny and RStudio products. This integration supports our scientific end users, who can harness the power of R through modern scientific web applications, and our superusers, admins and developers, who want to expand the platform through the development of applications to meet their individual needs.
We don't want scientists to have to learn new systems every couple of years because the system cannot keep up with the pace of science
All users now have access to analytics and visualization options and to the R community, with more than 10,000 statistical packages available from the Comprehensive R Archive Network (CRAN), Bioconductor and GitHub. Our users can communicate with developers, domain experts, and testers to help drive their science. When a new technique or algorithm is peer reviewed and published, it can be configured into a Shiny data analysis application or web service that integrates directly with Platform for Science software.
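The pattern described here — a published analysis step wrapped behind a data-in, result-out web service — is language-neutral. The article’s own stack for this is R and Shiny; as a minimal illustrative sketch of the same idea, here is a Python version where the z-score step and the JSON handler shape are assumptions for illustration, not Platform for Science APIs:

```python
import json
import statistics

def zscore(values):
    """A simple published-style analysis step: standardize a numeric series."""
    mean = statistics.fmean(values)
    sd = statistics.stdev(values)
    return [(v - mean) / sd for v in values]

def handle_request(body: str) -> str:
    """JSON-in/JSON-out handler, the shape a wrapping web service would expose."""
    payload = json.loads(body)
    result = zscore(payload["values"])
    return json.dumps({"zscores": result})

# A platform integration would POST data to this handler and receive the result.
response = handle_request(json.dumps({"values": [1.0, 2.0, 3.0]}))
print(response)
```

Because the analysis logic is isolated behind a stable request/response contract, the underlying algorithm can be swapped or upgraded as new methods are published without changing how the platform calls it.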