By: Sandra Rodriguez, Market Analyst and Ellyn McMullin, Research Associate, Axendia Inc.
One of the basic tenets of investigative reporting is getting to the facts by asking ALL the relevant questions: Who, what, where, when and why? The issues, concerns and proposed solutions surrounding Data Integrity deserve no less rigorous an analysis.
While several articles have been published on data integrity addressing the who, what, where and when, the most important question is left unanswered: WHY is Data Integrity the Foundation of Good Science?
Answering who, what, where and when is a mechanical process.
- Create a procedure requiring people to sign (wet or electronic): Who
- Record the data element: What
- Establish the source of the data element: Where
- Record the time: When
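The mechanical answers above amount to capturing four fields on every record, plus a signature binding them together. As a minimal sketch, assuming a simple hash-based electronic signature (the field names and signing scheme below are illustrative, not a prescribed format):

```python
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
import hashlib
import json

@dataclass(frozen=True)
class DataRecord:
    who: str    # the person signing (wet or electronic)
    what: str   # the recorded data element
    where: str  # the source of the data element (instrument, system, site)
    when: str   # timestamp captured at the moment of recording

def sign(record: DataRecord, signer_key: str) -> str:
    """Hypothetical electronic signature: a SHA-256 hash over the
    canonically serialized record plus a signer-specific key."""
    payload = json.dumps(asdict(record), sort_keys=True)
    return hashlib.sha256((payload + signer_key).encode()).hexdigest()

record = DataRecord(
    who="J. Analyst",
    what="pH = 7.2",
    where="QC Lab 3, pH meter #12",
    when=datetime.now(timezone.utc).isoformat(),
)
signature = sign(record, signer_key="analyst-key")
```

Because the record is frozen and the signature covers every field, any later change to who, what, where or when invalidates the signature.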
The “Why” is the rationale for the procedure: “Data Integrity IS The Foundation of Good Science.”
For sound data to be the foundation for good decisions and good science, there can be no questions with regards to its integrity. Patient outcomes, product quality, safety and efficacy depend on it.
Without data integrity, all analysis and conclusions drawn from the data are suspect at best. The challenge is that some organizations focus on telling people what to do, rather than explaining why. Regulators have made it clear that data integrity is ultimately the responsibility of executive management. Without executive commitment, companies will struggle to answer the “Five Ws”.
This brings us to Axendia’s recent white paper, Data Integrity – the Foundation of Good Science, which addresses the question: Why should companies focus on Data Integrity?
- Data integrity is the foundation that supports good science
- Data integrity leads to better decision-making
- FDA is shifting its focus from compliance to quality through Quality Metrics.
With the implementation of FDASIA Title VII Section 706, the FDA is shifting its focus to requesting information in advance of, and in some cases in lieu of, an inspection, in either electronic or physical form. This shift to e-Inspections of drug manufacturers involves massive amounts of data and information, and the results can only be as good as the quality of the available data and the analytic tools used. As part of this shift in focus, FDA intends to take a close look at a company’s commitment to quality. Howard Sklamberg, J.D., Deputy Commissioner for Global Regulatory Operations and Policy at the U.S. FDA, stated:
- We will also collect data, or “quality metrics” that will help us measure a firm’s commitment to quality.
- Our evaluation of a firm would be anchored by data that prove that the risks of manufacturing problems in a certain facility are minimal.
- We believe that by improving the inspection process in this way, future “metrics” that define quality will be understood and aspired to by manufacturers – no matter where they are in the world.
In addition, according to Paula Katz, Director of Manufacturing Quality Guidance and Policy Staff at CDER Office of Compliance, “data integrity problems are not always intentional; sometimes they result from poorly controlled systems.”
The problem is not that we don’t have enough systems; rather, we have too many. As a result, managing data across multiple systems – seeking a single source of truth – becomes a challenge.
Having a platform that allows you to connect the dots and turn that data into knowledge is critical to support this process. But answering the WHY without providing the HOW would not be helpful. An optimal solution begins with a Data Integrity strategy – one that is consistent across all departments, branches and facilities, and throughout the product life cycle from laboratory through production. A platform approach allows a company to integrate data applications, processes and resources across the organization.
Data sources can be integrated with multiple data consuming applications ranging from business, manufacturing, laboratory and quality areas. Integration allows users across the value chain to access, organize, analyze and share scientific, quality and process data with full confidence of its integrity.
At the same time, the solution should have capabilities for enterprise quality and data management, laboratory informatics and data analytics. This approach supports digital continuity – the ability to maintain digital information in such a way that it remains available as needed. It focuses on making sure that information is complete, available and therefore usable.
To achieve data integrity, companies need controls (whether electronic or paper) over the system, along with validation and supporting evidence. Technical controls include access control and passwords, audit trails, and metadata.
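One common technical control is a tamper-evident audit trail. As a minimal sketch (the class, field names and hash-chaining approach are illustrative assumptions, not a regulatory requirement), each entry can be chained to the previous one by hash, so that any later edit to a recorded entry breaks the chain:

```python
import hashlib
import json
from datetime import datetime, timezone

class AuditTrail:
    """Append-only audit trail: each entry embeds the hash of the
    previous entry, so altering any past entry invalidates the chain."""

    GENESIS = "0" * 64  # placeholder "previous hash" for the first entry

    def __init__(self):
        self.entries = []

    def append(self, user: str, action: str, metadata: dict) -> None:
        prev_hash = self.entries[-1]["hash"] if self.entries else self.GENESIS
        entry = {
            "user": user,          # who acted (ties into access control)
            "action": action,      # what was done
            "metadata": metadata,  # supporting context (instrument, method, ...)
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "prev_hash": prev_hash,
        }
        body = json.dumps(entry, sort_keys=True)  # canonical serialization
        entry["hash"] = hashlib.sha256(body.encode()).hexdigest()
        self.entries.append(entry)

    def verify(self) -> bool:
        """Recompute every hash; returns False if any entry was altered."""
        prev = self.GENESIS
        for e in self.entries:
            if e["prev_hash"] != prev:
                return False
            body = json.dumps(
                {k: v for k, v in e.items() if k != "hash"}, sort_keys=True
            )
            if hashlib.sha256(body.encode()).hexdigest() != e["hash"]:
                return False
            prev = e["hash"]
        return True
```

A quiet after-the-fact edit to any entry, however small, makes `verify()` fail, which is exactly the property an audit trail is meant to provide.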
Key Takeaways:
- For sound data to be the foundation for good decisions and good science, there can be no questions with regards to its integrity.
- Patient outcomes, product quality, safety and efficacy of products rely upon vast amounts of data that are generated throughout the product lifecycle.
- Paper is still the gold standard in many organizations and is heavily relied upon, but manual data entry from paper records increases the risk of errors.
- Data is often produced by a multitude of disconnected stand-alone or redundant systems that were generally implemented by functional areas to address specific needs. This disconnect can result in a lack of data integrity and consistency.
- Patients, brand equity and reputations have the potential to suffer without a strong foundation for data integrity. By taking a platform approach, companies can integrate data consuming applications that support business, manufacturing, laboratory and quality areas.
As previously stated, regulators have made it clear that data integrity is ultimately the responsibility of executive management. These teams must be committed to ensuring that a master data management system supports governance around data in order to support its integrity. However, for the system to have value, participation and commitment by staff at all levels (both internal and external resources, particularly outsourcing organizations) is necessary in order to support a single source of truth.
Is your company doing good science? Without data integrity, the results of that science can be called into question.