Tony Ettwein, Former Sr. Manager and Sr. Auditor at Pfizer, and Managing Expert for YourEncore’s Quality Center of Excellence, is one of YourEncore’s premier subject matter experts in IT Quality, collaborating with clients to improve and innovate their GxP procedures and Continuous Improvement best practices. In this Expert Q&A session, Tony provides valuable insights into audits in IT and the different A.I. systems that will soon affect the Quality landscape.
What is the most common observation you see when doing IT audits? How would you advise companies to handle these scenarios?
Two of the most common observations we see when doing IT audits are fairly basic: inadequate training, and failure to test thoroughly against all requirements.
It may not be surprising that lack of adequate training is a top observation, because it's frequently a root cause of other issues. I've also seen it as a common observation in other areas of GxP, including the Good Clinical, Laboratory, Manufacturing, and Pharmacovigilance standards that IT systems support. One good way to address the issue is to have a means of confirming the effectiveness of the training, whether that's quizzes, certification by an instructor, or another process. The benefits of ensuring that actions related to IT processes are performed correctly and consistently are obvious, but in practice, the effectiveness of the training isn't always fully considered.
Failure to adequately test critical functionality of a computer system can result in a system that doesn't do what we think it's doing. It's obviously not good if a computer system is working incorrectly and we can observe that it's working incorrectly. But if the system is working incorrectly and we don't know it, and decisions are then made on that basis, the negative impact in work such as ours can be enormous. One way our industry has improved in this area is through comprehensive risk-based programs that determine and document everything the computer system will be used for and, based on that, what needs to be tested and how thoroughly. The assessment should take into account what could go wrong, the probability that it will go wrong, and the potential negative impact in the event of failure. The likelihood of non-detection should also be considered in the assessment.
Do you feel that companies are preparing appropriately for the onset of Artificial Intelligence systems, especially in pharmacovigilance?
My answer to that question would probably have been different if you'd asked me just a year ago, because the actual or planned use of Artificial Intelligence (A.I.) and machine learning has been one of the fastest-changing areas of the industry in the last two years. In addition to pharmacovigilance, A.I. is being used in the relatively early stages of many areas of R&D, including drug discovery, genomics, and discovering new purposes for existing compounds. A number of companies have established partnerships with firms that have already developed A.I. systems for these purposes. Broadly speaking, A.I. applies algorithms loosely modeled on the way our brains process information, except, of course, machines can do it much faster and deal with much larger amounts of data in a meaningful way. Some processes, such as signal detection, can require large amounts of input data in order to perform their functions most effectively.
In the area of pharmacovigilance, A.I. systems potentially have the ability to detect safety signals earlier in the process, thereby saving time and allowing personnel to better direct their efforts, all the way from molecular discovery to clinical study design and follow-up.
Of course, using a new paradigm such as A.I. requires a new understanding of how these systems fit into existing processes. For this reason, instead of replacing existing processes outright, some companies are adding A.I. processes alongside those they already have in place, running the two in tandem for some length of time.
At any rate, we’d certainly expect any computerized processes to be validated according to the company’s procedures, as well as existing laws and regulations.
Take 21 CFR Part 11 for example; do you feel that Health Authorities provide enough guidance to companies for IT quality?
From what I've heard from people in our industry, health authorities on balance provide a good level of guidance for IT quality. Laws and regulations from health authorities tend to set the overall standard (what has to be done) and allow companies within the industry to develop their own processes and procedures (how it's done).
Many of us remember that it was originally industry that asked FDA to provide guidance for what was then a new world of electronic signatures. The result, of course, was 21 CFR Part 11, which remains the regulation governing electronic records as well as electronic signatures. It took much of the industry years to get up to speed on Part 11. In fact, some will still cite Part 11 as the source of the requirement to validate computerized systems, whereas those requirements were already in place in the predicate rules, as FDA has clarified in subsequent guidance.
One concept I feel strongly about is that laws and regulations from health authorities not only protect the public, but also provide a model for good business. How many of our IT projects over the years went forward without proper planning or requirements, only for the company to wind up spending tens of thousands of dollars on a system that didn't work according to user requirements, or that was the wrong system altogether? Good Clinical, Laboratory, and Manufacturing Practice regulations and related documents actually require planning and requirements to be in place in the early stages of development.
What I tend to see today is a healthy working relationship between IT experts in FDA and the industry, who work together toward the common interest of doing what’s right for the public. To that end, I’ve seen a greater degree of transparency in both directions over time.
If a company is struggling to know how or where to start, or to continue to improve their IT inspection strategy, what advice would you give?
As with starting any undertaking, or engaging in continuous improvement of an existing practice, there's a lot to know, and the more you know, the better. Engaging with other IT experts at conferences, industry forums, and in established quality organizations can have a huge positive impact. For IT inspection and quality programs that are relatively new, there are also a number of books on the market that provide good strategies. And if I were to select a single most useful document as a starting place, it would be ISPE's GAMP 5, which applies across the GxP spectrum, not just to automated manufacturing practices.
Consultants can also provide very useful information, in addition to providing another set of eyes to see the trees for the forest (and vice versa). But be sure that you know why you’re contracting that consultant, and I strongly recommend that you make sure to understand how the advice from that consultant will add value to what your business does. If a company creates a new document just to check it off a list, is that document really adding value?
QA is a proactive discipline. How has the evolution of QMS and CAPAs helped the world of IT?
One of the key differences between GMP and both GCP and GLP is that with GMP, the primary product is the pill, injectable, ointment, or replacement knee, whatever it is that's being manufactured. In the GCP and GLP areas, the primary product is information, especially the study report and its associated data. So it's not surprising that in the pharmaceutical industry, QMS and CAPA processes originated primarily in the GMP area, where reproducibility of the product is essential. In fact, 21 CFR 820.5 of the medical device regulations (the Quality System Regulation) specifically requires a quality system, and 21 CFR 820.100 of those regulations specifically requires corrective and preventive action (CAPA) procedures.
Over time, the benefits of developing and implementing a QMS and CAPAs have been adopted within the IT area, as well as GCP, GLP, and other areas. Having a quality management system, as the name suggests, helps to ensure that the processes, procedures, resources, and management support are in place to have an effective IT quality program. Similarly, an effective CAPA program helps to ensure that any problems are addressed at a root cause level, and that controls are in place to prevent or minimize future issues.
How should companies have oversight of vendor IT systems?
The processes that govern the selection and management of IT vendors are generally the same as for other vendors, including vendors of supplies and services. In fact, I frequently see effective IT vendor management done as part of an overall vendor management program.
One key to remember, especially for management, is that acquiring or licensing IT systems for our industry that are not secure, have bugs or other issues, or are not appropriate for the processes they support, can have disastrous results, with the cost being safety, financial, or both. Importantly, internal processes should help to ensure that no commitments are made to the supplier until your company has determined that the software is fit for use. A risk-based supplier management program should determine what type of assessment should be done for each supplier. Although a “desk audit” may be appropriate for some suppliers, I’d strongly recommend an on-site audit for the company’s critical suppliers.
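As a rough sketch of the risk-based tiering described above, a supplier assessment program might map simple risk criteria to an assessment type. The criteria, thresholds, and tier names here are hypothetical illustrations, not requirements from any regulation or guidance.

```python
# Hypothetical risk-based supplier assessment tiering.
# The three yes/no criteria and the tier thresholds are assumptions
# chosen for illustration; a real program would define its own.

def assessment_type(gxp_impact: bool, patient_safety_risk: bool,
                    custom_software: bool) -> str:
    """Pick an assessment depth from simple yes/no risk criteria."""
    risk_factors = sum([gxp_impact, patient_safety_risk, custom_software])
    if patient_safety_risk or risk_factors >= 2:
        return "on-site audit"        # critical supplier
    if risk_factors == 1:
        return "desk audit"           # moderate risk
    return "questionnaire only"       # low risk

# A GxP-impacting system with direct patient-safety relevance
# lands in the most rigorous tier.
print(assessment_type(gxp_impact=True, patient_safety_risk=True,
                      custom_software=False))  # on-site audit
```

The design point is that the decision is documented and reproducible: two assessors evaluating the same supplier against the same criteria should reach the same assessment type.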