Can your data integrity practices withstand growing scrutiny?
In light of the FDA's recent warnings, we consider the areas where the pharmaceutical industry needs to improve.
Last month, FDA inspectors continued to find failings across Aurobindo Pharma’s sterile drug plant in Telangana. A key issue observed by inspectors was the lack of adequate data integrity systems to ensure the completeness, consistency and accuracy of all raw data.
There was also a notable gap in the plant’s ability to review and investigate electronic raw data. The plant often failed to implement relevant corrective and preventive actions (CAPAs) when issues were discovered and failed to comprehensively address all potential root causes.
The plant also lacked appropriate controls over its computers and related systems, meaning that unauthorized personnel were able to make changes to master production and control records.
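In software terms, the control the inspectors found missing is a simple one: changes to a master record should be gated by role-based authorization and logged in an audit trail. The sketch below illustrates the principle only; the role names, record IDs and fields are illustrative assumptions, not any specific system's API.

```python
# Minimal sketch of role-based change control for a master production record.
# Roles, users and record IDs here are invented for illustration.
from datetime import datetime, timezone

AUTHORIZED_ROLES = {"qa_manager", "production_supervisor"}

class MasterRecord:
    def __init__(self, record_id, content):
        self.record_id = record_id
        self.content = content
        self.audit_trail = []  # append-only log of who changed what, and when

    def update(self, user, role, new_content, reason):
        # Unauthorized roles are rejected before the record is touched.
        if role not in AUTHORIZED_ROLES:
            raise PermissionError(
                f"{user} ({role}) is not authorized to change {self.record_id}"
            )
        # Record the change in the audit trail alongside the change itself.
        self.audit_trail.append({
            "user": user,
            "role": role,
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "old": self.content,
            "new": new_content,
            "reason": reason,
        })
        self.content = new_content

record = MasterRecord("MPR-042", "Granulation time: 12 min")
record.update("a.smith", "qa_manager", "Granulation time: 14 min",
              "Process validation update")
```

An update attempted by a user without an authorized role raises an error and never alters the record, so every change that does land is attributable and reviewable.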
FiercePharma noted that as Aurobindo Pharma has grown in prominence, currently claiming to be the second-largest generics producer for the U.S., regulatory scrutiny of its actions has increased. In the past three years alone, it has received 10 Form 483s. This is a trend being seen across the industry, with the regulatory focus on data integrity principles and practices continuing to rise.
The FDA’s growing focus on quality control
In a recent webinar with Pharma IQ, Steven Brown, Data Integrity Lead at Novartis, shared how in their efforts to continuously improve data integrity practices, the company conducted a review of all FDA warning letters since 2015.
In many of these letters, audit trails, eSecurity and good documentation practice (GDP) were highlighted, but the standout issue, appearing in 70% of the letters, was quality control (QC).
The issues raised under the heading of quality control were wide-ranging. They included testing into compliance, disregarding or retesting results, failing to put adequate access controls in place and not taking a risk-based approach to data review. Two other prominent QC issues were poor document or sample management and human factors, including poor data recording and falsification.
Given this focus, scrutiny of these areas is likely to increase from all regulatory bodies. In advance of this, Novartis has already begun to strengthen its data integrity, implementing comprehensive systems and processes.
Companies are strongly advised to take a hard look at their data integrity processes and at how diligently their staff follow these protocols. A single warning letter can have a significant impact on production and can expose a growing number of further issues. With the FDA taking a closer look at all data practices, the pharmaceutical industry must prepare.
Taking a holistic approach to data integrity
During the same webinar, Dr. Daniela Janssen, Director of Product Marketing at Dassault Systèmes, shared that there are two key reasons why teams may not be following quality processes.
The first is that they may not be aware of the relevance of the process that has been put in place; this can be resolved with training and further guidance. The second is that the task itself may be too cumbersome; this can be addressed by deploying technology that makes the data transfer automatic instead of manual. Automation limits the operational burden on employees as they seek to maintain data integrity and also improves the overall quality and reliability of the data.
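The automated-transfer idea can be sketched in a few lines: instead of an analyst re-typing a result, the software parses the instrument's export directly and attaches a checksum of the raw file, so any later alteration of the raw data is detectable. The file format, field names and function below are assumptions for illustration, not a particular vendor's interface.

```python
# Minimal sketch of automating an instrument-to-record data transfer.
# The CSV layout and field names are invented for this example.
import csv
import hashlib
import io

def capture_result(raw_csv: str) -> dict:
    """Parse an instrument export and return a record entry with a
    checksum of the raw data, avoiding manual transcription entirely."""
    reading = next(csv.DictReader(io.StringIO(raw_csv)))
    return {
        "sample_id": reading["sample_id"],
        "result": float(reading["result"]),
        # Checksum ties the stored entry back to the untouched raw file.
        "checksum": hashlib.sha256(raw_csv.encode()).hexdigest(),
    }

raw = "sample_id,result\nS-1001,98.7\n"
entry = capture_result(raw)
```

Because the checksum is derived from the raw export, re-computing it during a later data review confirms that the stored entry still matches the original instrument output.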
Daniela believes that to truly ensure data integrity is maintained across the lab, there must be a culture of quality, driven simultaneously from the top down and the bottom up. Management must be willing to encourage and incentivize good data practices, while on the ground data champions, working across all levels, reiterate the importance of data integrity.
For Daniela, good data integrity and strong data management practices are the foundation of good science. As she puts it, “data integrity offers real confidence that you have access to both good quality and reliable data”. She continues that “this stops you having to base your innovations on assumptions, incorrect data or incomplete results”. Decisions become more trustworthy and reliable, the impact of poor-quality data is limited, and the organization continues to meet regulatory standards and maintain good practices.