Hi All, have you noticed some confusion out there in the use of the terms verification and validation as they relate to medical device software? Some say that validation is the actual testing and that verification covers all of the other activities, such as code reviews, walkthroughs, inspections, etc. I am, however, seeing another school of thought that describes verification as the actual software testing activities (such as testing against the test cases). What are your thoughts and experience? Thanks. source: https://www.linkedin.com/groups/2070960/2070960-6054681159286210564
Private answer
David Sullivan
Software Verification is used to verify that we are building what we said we would build.
Verification is more than testing. It includes the items you listed, such as code reviews, walkthroughs, inspections, and specification reviews (functional, design, standards), as well as testing. Software Validation is used to validate that we are building the right thing. Validation may occur during the concept, development and release phases of a product. Validation activities may include the initial conversations with a customer (identifying requirements), and building prototypes and presenting them (screen shots, a piece of the functionality, etc.) to the customer to confirm that we are building the right thing and that we understood the requirements. We can also work with field personnel and customers on alpha or beta testing a product before release to production.
Private answer
Arthur Brandwood
Yes, there is confusion, arising partly from the historically different uses of the word validation among software engineers generally versus the more closely defined use of the term in the context of formal medical device design controls.
But David has it right. Verification is testing that you have produced what you said you would build. More precisely, verification is testing that the design output (software specification/code) meets the design input (requirements), or more precisely still, verification is "confirmation through provision of objective evidence that specified requirements have been fulfilled" (ISO 9001 and IEC 62304). Validation is confirmation that the software works in the real world; it is concerned with clinical usage and field testing/evaluation.
Private answer
Anil Bhalani
The FDA has probably defined verification/validation and documentation guidelines for software better than anyone else. These guidance documents are fairly mature, and software engineers should not have any doubt about the software documentation requirements. Please search the internet for "FDA software medical device premarket software submission" and you will probably get the FDA guidance at the very top. There is also an FDA guidance on software verification and validation, and one on off-the-shelf software.
Generally: validation is confirming market requirements (done at the system level); verification is confirming design input requirements (can be done at the system or sub-system level).
Private answer
Clarisa Tate
Yes, I've seen this confusion myself. Definitions are important for that reason. If terms are not defined well enough that everyone in the company can understand them, reports and protocols end up being labeled incorrectly, and when those incorrectly labeled reports are submitted to FDA, there will most likely be lots of issues. I tell individuals to drop their own idea of what these words mean to them, because the only thing that matters is the FDA's/regulatory bodies' definition of these terms.
Private answer
Rich Meader
Thanks everyone for your input and insight. This has been a helpful discussion.
Private answer
Indeed, David's explanation is widely shared in the medical device industry. However, in some other industries (like semiconductors, embedded systems, ...) you will often find different interpretations.
Rather than fighting about the right definitions, I propose first making sure that the work gets done. Cheers, Bernd www.grahlmann.net
Private answer
Christopher Smith
I learned it as: you verify specifications and validate requirements. Requirements tell you what you are making, and specifications tell you how to build it. Unit testing of software and hardware comes prior to verification and includes white-box testing of hardware and software subsystems. Verification is generally "as a user would use it" style black-box testing and can be performed on alpha- and beta-level products. Validation is a subset of verification, generally "happy path" testing on release-level software/hardware/integrated systems.
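As a rough, hypothetical illustration of the white-box / black-box split described above (not taken from any of the posts; the dose_volume() function, its numbers, and the requirement ID REQ-012 are invented), the first test below is written with knowledge of the implementation's internal guard clause, while the second exercises only observable behaviour against a stated requirement:

```python
# Minimal sketch; the function, values, and requirement ID are invented.
import pytest

def dose_volume(dose_mg: float, concentration_mg_per_ml: float) -> float:
    """Return the volume (ml) needed to deliver dose_mg at the given concentration."""
    if concentration_mg_per_ml <= 0:
        raise ValueError("concentration must be positive")
    return dose_mg / concentration_mg_per_ml

# White-box (unit-level) test: probes an internal guard the tester knows about.
def test_rejects_non_positive_concentration():
    with pytest.raises(ValueError):
        dose_volume(10.0, 0.0)

# Black-box (requirement-level) verification test: checks only input/output
# behaviour, traced to the hypothetical requirement REQ-012.
def test_req_012_volume_is_dose_divided_by_concentration():
    assert dose_volume(50.0, 25.0) == pytest.approx(2.0)
```

Both of these are verification activities in the sense used in this thread; validation of the same feature would still mean confirming, with representative users on the finished device, that the calculated volume fits the intended clinical use.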
Private answer
David Sullivan
I have taken some of the suggestions and done some searching on the web for how different standards define verification and validation activities, and all were pretty much in agreement.
The standards I looked at were the IEEE standard, ISO 9001, 21 CFR 820.30 and the CMMI model. The FDA seems to place validation at the system level, toward the end of the development cycle. I think it needs to be carried out throughout the development cycle, from concept through release. It is possible to work with a customer, agree on a set of requirements, and develop a product which meets the requirements (passing verification) but fails on the implementation of those requirements (validation).
Private answer
Armin Beck
Yes, I have seen confusion between validation and verification regarding design many times. It is actually very simple. Verification is used to confirm that the output meets the input. Validation is used to confirm that the device meets the “intended use”. There is no difference between software development and hardware development.
Hope that helps. Armin
Private answer
Rich Meader
Again, thanks to all for their contribution to this topic.
Private answer
Ernesto Staroswiecki, Ph.D.
Hi all, long time reader, first time caller here... :-)
I have seen the confusion between validation and verification come up many times, particularly since they are often bundled together as V&V. I like most of the definitions I have seen here, but I also wanted to comment on some of the differences that result from these definitions. I have heard many discussions about making design changes and what kind of new verification and validation effort that would require. If both verification and validation activities have been taking place during the entire development (as I strongly believe they should) and have accounted for change management, the extra effort should be only marginal. However, in general I find that most of the verification effort goes into designing the activities, and something like a change of compiler would only require re-running tests that have already been designed. With automated tools this should take little time. Yet actually running many of the validation tests, since they need to run on fully assembled systems, may take significant time, so re-running them even for a compiler change would take significant effort. Just a couple of cents more... E. S.
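A small, hypothetical sketch of the automation point made above (my illustration, not Ernesto's): if automated verification tests are tagged separately from validation protocols, the whole verification suite can be re-run with one command after a toolchain change, while validation on the assembled device remains a planned, largely manual activity. The marker names and the protocol ID VAL-007 are invented.

```python
# Hypothetical sketch using pytest markers; all names and IDs are invented.
import pytest

def alarm_threshold(setting_percent: int) -> int:
    """Stand-in for device logic under verification: clamp a setting to 0-100 %."""
    return max(0, min(setting_percent, 100))

@pytest.mark.verification
def test_alarm_threshold_is_clamped_to_valid_range():
    # Automated, requirement-level check; cheap to re-run on every build.
    assert alarm_threshold(120) == 100
    assert alarm_threshold(-5) == 0

@pytest.mark.validation
def test_nurse_acknowledges_alarm_on_assembled_unit():
    # Placeholder for a protocol executed on the finished device with
    # representative users; tracked here but not run in automated regression.
    pytest.skip("executed manually per validation protocol VAL-007")
```

With the markers registered in pytest.ini, re-running verification after, say, a compiler upgrade is just `pytest -m verification`, which is why the incremental verification effort can stay small.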
Private answer
John E. Lincoln
Here's the U.S. FDA's definition:
This document uses the terms "verification" and "validation" (also referred to as "V&V") as they are defined in the QS regulation. Verification "means confirmation by examination and provision of objective evidence that specified requirements have been fulfilled." 21 CFR 820.3(aa). In a software development environment, software verification is confirmation that the output of a particular phase of development meets all of the input requirements for that phase. Software testing is one of several verification activities intended to confirm that the software development output meets its input requirements. Other verification activities include: walk-throughs, various static and dynamic analyses, code and document inspections, module-level testing, and integration testing. Design validation "means establishing by objective evidence that device specifications conform with user needs and intended use(s)." 21 CFR 820.3(z)(2). Use of the term validation in this document is limited to design validation and does not include process validation as defined in 21 CFR 820.3(z)(1). One component of design validation is software validation. Software validation refers to establishing, by objective evidence, that the software conforms with the user needs and intended uses of the device. Software validation is a part of design validation of the finished device. It involves checking for proper operation of the software in its actual or simulated use environment, including integration into the final device where appropriate. Software validation is highly dependent upon comprehensive software testing and other verification tasks previously completed at each stage of the software development life cycle. Planning, verification, traceability, configuration management, and many other aspects of good software engineering are important activities that together help to support a conclusion that software is validated.
From their Guidance for the Content of Premarket Submissions for Software Contained in Medical Devices, revised May 11, 2005. Source: http://www.fda.gov/medicaldevices/deviceregulationandguidance/guidancedocuments/ucm089543.htm
In harmony with the above and its predecessor, I have used a working definition of verification as testing, inspection and checking, including testing of the requirements, and a working definition of validation as the sum total of all verifications for the entire software "package". I have used the same definitions for hardware, equipment and processes for many years, in FDA submissions, validation reports / packages, et al. That format has withstood countless FDA and ISO notified-body audits. The 1996 FDA training on [Device] Design Control shows a graphic with verification activities, e.g. output = input, as a subset of the overall product design validation. The key point is to define your "working" terms within the boundaries of the references quoted throughout this discussion in your SOPs, and then follow your SOPs.
Private answer
John E. Lincoln
That same guidance document, in Table 3, lists the 11 elements / documents the FDA requires to be part of a software validation "package" to be partially or wholly submitted as part of a 510(k) where software / firmware is involved.
The updated FDA source on design control, which shows the graphic I mentioned, is the Design Control Guidance For Medical Device Manufacturers dated March 11, 1997; it can be found at http://www.fda.gov/medicaldevices/deviceregulationandguidance/guidancedocuments/ucm070627.htm Although its focus is on the device, per the previous guidance, software for or as a device is to be included in the design control process of 21 CFR 820.30 (device CGMPs).
Private answer
John E. Lincoln
And I have used the 11 elements from the FDA's "Guidance for the Content of Premarket Submissions for Software Contained in Medical Devices" as the "model" for all software / firmware V&V, not just for devices, i.e. also for production equipment, test equipment, and processes. I include IQ, OQ, and PQs in the 8th element, and usually include 21 CFR Part 11 elements (if CGMP e-records / signatures are involved) in the OQ / PQ portion of that same 8th element / document, which the FDA calls "V&V". This is where all my test cases / scripts are written against the requirements, et al.
By no means is this the only approach, but it has worked for me and my clients for many years, and it has been used to get other consultant clients of mine out of review problems with the FDA on 510(k) submissions for devices containing software or firmware.
Private answer
John E. Lincoln
An interesting comment from the U.S. FDA "Design Control Guidance For Medical Device Manufacturers", dated March 11, 1997, referring to the previously mentioned graphic, in Section III, para. 6:
"As the figure illustrates, design validation encompasses verification and extends the assessment to address whether devices produced in accordance with the design actually satisfy user needs and intended uses." Further, the other referenced software guidance discusses traceability, with the expectation that requirements, e.g. the SRS, are traced to the other elements / documents in the validation package, especially the test cases / scripts, so that all requirements (user, functional, standards, guidances ...) are shown to exist and to work in their intended environment / platforms.
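As a toy illustration of that traceability expectation (my own sketch; the requirement IDs, test IDs, and descriptions are invented, and a real trace matrix would normally be exported from the requirements and test management tools), a simple trace table can be checked automatically so that no SRS requirement is left without at least one verifying test case:

```python
# Toy requirements-to-test traceability check; all IDs and text are invented.
SRS_REQUIREMENTS = {
    "SRS-001": "Display the measured value within 2 seconds",
    "SRS-002": "Raise an alarm when the value exceeds the configured limit",
    "SRS-003": "Log every alarm event",
}

# Which requirements each test case claims to verify.
TEST_TRACE = {
    "TC-010": ["SRS-001"],
    "TC-011": ["SRS-002"],
}

def untraced_requirements(requirements: dict, trace: dict) -> set:
    """Return requirement IDs that no test case claims to verify."""
    covered = {req_id for req_ids in trace.values() for req_id in req_ids}
    return set(requirements) - covered

if __name__ == "__main__":
    missing = untraced_requirements(SRS_REQUIREMENTS, TEST_TRACE)
    if missing:
        # SRS-003 is deliberately left uncovered here to show the gap being flagged.
        print("Requirements without a verifying test case:", sorted(missing))
    else:
        print("Every SRS requirement traces to at least one test case.")
```

Passing such a check does not by itself verify anything; it only supports the traceability argument that the verification evidence covers every stated requirement.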
Private answer
Mr. Vogel has a book that describes all these terms.
I like it a lot, and for me the definitions given there ring true and make a lot of sense. Here is the link: http://www.amazon.com/Medical-Software-Verification-Validation-Compliance/dp/1596934220 I do not make a penny on it, no worries. There is a lot of descriptive content, but the following diagram shows the interrelation neatly: http://1.bp.blogspot.com/-hlU-4jE4p3g/U3owFhYy1zI/AAAAAAAAPl0/dJwbkV7mAdM/s1600/ValidationUmbrella.jpg Other than that, common sense is a good help when in doubt.
Private answer
Rajani Kumar Sindavalam, PMP®
Hi Rich,
Thanks for bringing up this discussion. I completely agree with you. Verification is not just code reviews, walkthroughs and inspections; it also includes the actual testing performed on the device to ensure that the design requirements are met. I see validation as more the intended-use testing. When I say validation, I mean that the population used for the testing is real-life users. For example, putting together a study with RNs to see whether they follow the on-screen instructions and the accompanying labeling, to ensure that they understand what I as the designer am trying to convey and that they use the device in line with the design intent.
Private answer
Ginger Cantor
Software validation would indeed be use validation, either actual or simulated. Verification is the code reviews, test cases, etc.
Private answer
Marie Suetsugu
The above discussion somewhat complicates the matter, it seems to me :-S
Testing is a technical check of each function or each (smaller) set of functions. Verification is the final overall technical check of whether the software as a system satisfies the specifications that the programmer(s) have written. Validation, on the other hand, is done from the user's point of view, to check whether the software as a product/device satisfies the user's needs/requirements (which the technical specifications verified above may not actually satisfy). Please let me know if I'm mistaken...
Private answer
Jamal Alnaser, EE
It's simple. Validation is ensuring that we are developing the right software, and verification is ensuring that we are developing the software right.
Verification is more concerned with modules or phases, and validation focuses more on the entire product. Walkthroughs, code reviews, etc. can, and should, be done for both. Both can use automation, with human-based activities (code reviews, documentation review, etc.) being more effective, and playing a more important role, in verification than in validation. Another point of comparison is that in validation you can, and should, run the code, but in some verification (due to the nature of the module or phase) you can't. So, verification is for modules, validation is for completed products. I believe that the IEEE also uses the same concept to define the two.
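One small, hypothetical example of the kind of static verification activity mentioned above, where the code is examined without being run (my sketch, not from the post; the checked rule and the example file name are assumptions): a script that flags public functions missing docstrings, which could run as an automated check alongside manual code reviews.

```python
# Hypothetical static verification check: parse the source without executing
# it and flag public functions that lack a docstring. The target file name in
# the usage comment is just an example.
import ast
import sys

def undocumented_functions(path: str) -> list:
    """Return names of public functions in the given file that have no docstring."""
    with open(path, encoding="utf-8") as source:
        tree = ast.parse(source.read(), filename=path)
    missing = []
    for node in ast.walk(tree):
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
            if not node.name.startswith("_") and ast.get_docstring(node) is None:
                missing.append(node.name)
    return missing

if __name__ == "__main__":
    # Example: python check_docstrings.py dose_calculator.py
    problems = undocumented_functions(sys.argv[1])
    if problems:
        print("Undocumented public functions:", ", ".join(problems))
        sys.exit(1)  # fail the static verification step
    print("Static documentation check passed.")
```

Validation, by contrast, has no equivalent shortcut: the code has to run on the finished product in a realistic use setting.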
Private answer
Alex Bromberg
Are you all able to handle your validation/qualification/verification needs in-house, or do you pull in outside resources to help?
Private answer
John E. Lincoln
As a consultant, I'm called in to direct such projects, and I then train and utilize in-house personnel, with the goal that they can then do similar projects themselves. But this is usually for U.S. FDA and EU medical products CGMP SW V&V, not software development / testing, with a few exceptions, hence my references to FDA guidance documents for my definitions and "models".
Private answer
Alex Bromberg
Do managers here tend to use skilled consultants/contractors like John for software validation?
Private answer
Marie Suetsugu
Could people clarify, please? The person who does validation, doing it from the user's (rather than the developer's) point of view, needs no knowledge of the code... am I right? I feel this may be where the confusion begins.
Private answer
John E. Lincoln
If you're doing V&V under the FDA's CGMPs, and the software is custom rather than COTS (commercial off-the-shelf) SW, then someone other than the programmer(s) involved in the coding, i.e. someone impartial, needs to do a "white" or "glass" box verification of the code and its algorithms, in addition to the "black" box verification of the software running on its platform / hardware, as I discussed above. The depth of both is product risk-based (tied to end-user / patient risk). Most of the projects I've been involved in are COTS. In the few that involved custom SW requiring a code review, the company had qualified personnel who weren't involved in that particular program do it. If not, another contractor or consultant would have to be brought in, or that portion of the overall validation (the glass-box verification) farmed out. Again, this is the approach I use, with no audit problems to date and, more importantly, proof that the software works as intended in its intended environment.
Some FDA references: General Principles of SW Validation: http://www.fda.gov/medicaldevices/deviceregulationandguidance/guidancedocuments/ucm085281.htm#_Toc517237933 FDA SW Glossary: http://www.fda.gov/ICECI/Inspections/InspectionGuides/ucm074875.htm
Private answer
John E. Lincoln
A complex example:
A few years back I did a V&V of a vision-system inspection station for a printed scale on a device. It used (1) a PLC for the conveyor and system triggers (lights / camera triggers, accept / reject routing, etc.), running custom ladder logic; (2) custom BASIC as an interface between the PLC and (3) the vision system SW (a COTS application which controlled the camera's comparison of product sample digital image pixels against an allowable range of variance from the ideal image), which was further (4) user-"programmable" with icons. The overall vision system was validated using the 11 elements discussed previously, but several of those elements were further broken down to address the four types of SW verification required here. Both the PLC ladder logic and the BASIC were also white-box verified by other, non-project-involved programmers (these were simple programs, thankfully). The entire system (PLC and BASIC interface) with its COTS vision system and its user programming was black-box verified. In addition to the normal PQ runs to address allowable part input variances and similar production / parameter variances, additional PQs were run to qualify three separate operators. All of the verifications mentioned here became part of the overall vision system validation report, which in this case included both the software and the hardware.
Private answer
Marie Suetsugu
My understanding is that validation (from the user's point of view) does NOT include verification (from the technical point of view); they are two separate ways of checking the overall software - whilst verification (technical check of the overall software system) may include tests (technical checks of software items/units).
Now, we are a SaMD manufacturer and our QMS is possibly more in line with IEC 62304 than with US regulations (as the FDA hasn't come to audit our QMS yet), but some of you talking about validation as if it includes verification really confuses me. Is there such a huge difference between IEC 62304 and US regulations?
Private answer
John E. Lincoln
Marie, I am talking about validation as the covering activity, consisting of a series of verifications. The majority of the applications I'm involved with are not just the manufacture of SaMD or the SW in a device, where IEC 62304 is primarily focused, but the end stage after development, when the finished SaMD, or device with software, or process / equipment with software, is validated (by a series of verifications) as acceptable for use (either on a patient, or in a production or test environment), as required by the U.S. FDA's CGMPs (similar to ISO 9001 and ISO 13485), Design Control (820.30), et al, as cited in my other responses. One of the 11 elements in the FDA's guidance on SW in devices, in Table III (from their Guidance for the Content of Premarket Submissions for Software Contained in Medical Devices, revised May 11, 2005, http://www.fda.gov/medicaldevices/deviceregulationandguidance/guidancedocuments/ucm089543.htm ), is the Software Development Environment Description. It is here that I would cite and explain the custom SW developers' use of IEC 62304, or similar methodologies used in the development and verification of the software leading to its manufacture (for subsequent validated use).
Private answer
Marie Suetsugu
Further to my comment above... I don't see that the US guidance is all that different from IEC 62304, to be honest:
'3.1.2 Verification and Validation
The Quality System regulation is harmonized with ISO 8402:1994, which treats "verification" and "validation" as separate and distinct terms. On the other hand, many software engineering journal articles and textbooks use the terms "verification" and "validation" interchangeably, or in some cases refer to software "verification, validation, and testing (VV&T)" as if it is a single concept, with no distinction among the three terms. Software verification provides objective evidence that the design outputs of a particular phase of the software development life cycle meet all of the specified requirements for that phase. Software verification looks for consistency, completeness, and correctness of the software and its supporting documentation, as it is being developed, and provides support for a subsequent conclusion that software is validated. Software testing is one of many verification activities intended to confirm that software development output meets its input requirements. Other verification activities include various static and dynamic analyses, code and document inspections, walkthroughs, and other techniques. Software validation is a part of the design validation for a finished device, but is not separately defined in the Quality System regulation. For purposes of this guidance, FDA considers software validation to be "confirmation by examination and provision of objective evidence that software specifications conform to user needs and intended uses, and that the particular requirements implemented through software can be consistently fulfilled."' (General Principles of Software Validation; Final Guidance for Industry and FDA Staff http://www.fda.gov/medicaldevices/deviceregulationandguidance/guidancedocuments/ucm085281.htm#_Toc517237938 )
Private answer
John E. Lincoln
Marie, we're out of sync here in our comment sequence. But per the U.S. FDA guidance documents I cited early on, including the U.S. FDA / Health Canada graphic and its explanation, validation is the sum total of a series of verifications / tests. That's also my interpretation. I have seen companies treat validation (resultant device or system = user requirements) as a high-level parallel activity to the lesser verifications (proving output = input), and do it successfully. Either will work if clearly defined in SOPs and those SOPs are then followed. The former is the approach I've used successfully for over a decade, but it's not the only proper approach.
Private answer
Marie Suetsugu
Oh sorry, John, our comments crossed in the post (as I deleted mine so as to add some more quotes from 'General Principles of Software Validation; Final Guidance for Industry and FDA Staff')...
I haven't reread it this time, but of course we read and used 'Guidance for the Content of Premarket Submissions for Software Contained in Medical Devices' as we were establishing our QMS (and preparing FDA 510(k) and EU CE Marking certification applications), and didn't have problems understanding these two concepts/processes :-S I seem to be having some difficulty understanding your last comment... are you saying my understanding is problematic, in light of what I've said so far...?
Private answer
Marie Suetsugu
Oh dear, our comments crossed again! So what I had difficulty with was your second-to-last comment, and I do agree with what you've just said above! Thanks ;-)
Private answer
Ernesto Staroswiecki, Ph.D.
Going a few comments back up, to Alex's questions and Marie's and John's responses to them, my experience has been a bit different. Most of the projects I have helped with involve companies with a definitely undersized SW group. Since the verification needs to be done by people who have not been involved in writing the code, this typically leads to hiring firms that specialize in V&V to plan and execute these steps.
Private answer
Jamal Alnaser, EE
I believe that it's important to focus on the essence of the rule, not the letter. Guidance documents are just that: guidance documents. They are intended to outline one set of concepts/processes, out of many possible ones, with the main objective of ensuring the safety and efficacy of devices/software. That is the essence of it. So, whether it's CGMP, IEC, an EU directive, or any other regulation, the important objective of all of them is to produce a safe and effective device/software. There is nothing in any rule that forces manufacturers to follow one set of processes over another. In fact, if that happened and regulators dictated HOW to comply, many smaller companies would go out of business.
With that said, whatever label (validation, verification, review, etc.) you wish to attach to a certain activity is not really important. What's important is the activity itself. Ensuring safety and efficacy with respect to design testing comes in two forms that both need to be carried out: 1) modular testing (what we would call verification), which tends to be more static, and 2) holistic testing (what we would call validation), which tends to be more dynamic. In my previous life, where I led the regulatory and QA teams of major US medical device manufacturers, and in my current life, where I advise governments in "emerging" markets on the setup of regulatory and compliance frameworks and operating models, I have always believed that achieving objectives is much more important than the mechanism by which you achieve them. This allows for more innovative means by which manufacturers can comply, which translates into higher efficiency in the regulatory process without compromising regulatory requirements. A rigid set of regulatory compliance processes is not necessarily always a good thing. So, bottom line, call it what you wish. The important issue is carrying out the necessary processes to achieve the desired objective. Sorry for the lengthy post.
Private answer
John E. Lincoln
Basically, in principle, I couldn't agree more with Jamal's comment, hence my point about developing "working" definitions for your company and then following them. However (there's always a "however"), if a company wants to clear a SaMD, or a device with SW / firmware, through the FDA's 510(k) process for sale in the U.S., it will HAVE to follow the 11 documentation requirements from the cited FDA guidance document (which, as stated in those guidances, is merely a distillation of 20 years of practice in the SW industry); the specific terminology can vary somewhat, as long as the 11 categories are clearly addressed. I have had to assist other consultants, and my client companies, to do just that to successfully complete the 510(k) process. Beyond that, each company has a degree of leeway, subject to the points raised in this forum, the applicable guidances (per the FDA they are optional, BUT either they or something as good must be utilized, with justification, to address the regulatory concerns / recommended acceptable solutions spelled out in the guidance), and/or what a company's notified body requires if pursuing markets outside the US / CE marking (it is with notified bodies that I have seen stricter requirements and arguments over definitions, with some tendency toward a "my way or the highway" mentality). When that guidance came out, I called the FDA and spoke to the lead person on it, and he agreed that it would also make a good model for all SW validation project documentation, and that's what I've done for roughly the last 10 years, with good results with both FDA and notified-body auditors. It also provides consistency and project predictability, which many other approaches do not. And that's all I'm presenting here -- my decade-long, field-tested approaches and their rationale / justification. I'm sure that in the future other approaches will come to the fore and gain acceptance by regulatory bodies.
Private answer
John E. Lincoln
Per Marie's earlier comment, I too have seen the term V&VT (verification and validation testing) used in the industry (and I have used it), and the FDA guidance on the 11 elements calls the actual test cases / scripts sections "V&V". So "working" definitions are clearly required.
Private answer
Marie Suetsugu
Also... unless you've already got a system established (in which case, I thought, you wouldn't be asking this sort of question), it's actually much easier to follow the rule(s) recognised 'worldwide'.
Private answer
Edwin Bills, ASQ Fellow, RAC
For those of us who are visual learners, I have found that the diagram in Annex A (I think) of IEC 62304 does a good job of explaining the entire software development process, including verification and validation.
Private answer
Marie Suetsugu
Edwin,
Perhaps you mean 'Figure C.2 – Software as part of the V-model' in Annex C...?
Private answer
Edwin Bills, ASQ Fellow, RAC
Marie, that is correct. I was operating from memory, which sometimes is not as good as it once was. Thanks for correcting me for everyone's benefit.
Private answer
Tom Mariner
The comments fit my experience: verification assures that the output meets the specifications, while validation confirms the product is "fit for clinical use". Obviously simplistic.
Private answer
Alex Bromberg
When hiring contractors/consultants to handle validation/verification/qualification needs, what do you find to be the biggest challenges/headaches?
Private answer
John E. Lincoln
As a consultant, one of the biggest push-backs I receive is on the need to have a product (device or drug) risk document / file [to the patient / user level] in place. The client feels this is scope creep if they don't have one already, though it's a regulatory requirement anyway. However, SW V&V is a risk-based activity due to its complexity (algorithms / code), and defining its depth and extent requires more than just a subjective "minimum", "moderate" or "major" decision on the part of the client.
Private answer
Marco CATANOSSI
That is the worst misunderstanding when choosing a consultant in regulatory / safety: trying to buy professional work the same way you negotiate the price of fruit, and forgetting what the intended use of the documentation is. If you pay less, you will get a less capable consultant or less time dedicated to the task. The company may discover what the real stakes were when that documentation counts most, e.g. in court after an accident has happened.
Private answer
Armin Beck
Ciao Marco. You are right, but service quality, including documentation, is not seen as a value in most companies. I wish a CEO would see value in quality rather than only in fake revenue.
Private answer
Software verification should verify that the change meets the expected requirements, and how testing is done depends on what type of development methodology is used: test-driven development, iterative (agile), etc. Software validation needs to be performed at the system level if the software is part of a medical device.
Private answer
John E. Lincoln
And to add another definition to those I referenced earlier in this discussion:
"All production and/or quality system software, even if purchased off-the-shelf, should have documented requirements that fully define its intended use, and information against which testing results and other evidence can be compared, to show that the software is validated for its intended use." -- General Principles of Software Validation; Final Guidance for Industry and FDA Staff, 2002 http://www.fda.gov/downloads/medicaldevices/deviceregulationandguidance/guidancedocuments/ucm085371.pdf
Private answer
John E. Lincoln
A further quote from the same guidance document (in this case, notice the references to various types of verification in the overall validation process; though elsewhere in the guidance the terms are used separately -- as I mentioned previously, I prefer the usage that follows):
"For purposes of this guidance, FDA considers software validation to be “confirmation by examination and provision of objective evidence that software specifications conform to user needs and intended uses, and that the particular requirements implemented through software can be consistently fulfilled.” In practice, software validation activities may occur both during, as well as at the end of the software development life cycle to ensure that all requirements have been fulfilled. Since software is usually part of a larger hardware system, the validation of software typically includes evidence that all software requirements have been implemented correctly and completely and are traceable to system requirements. A conclusion that software is validated is highly dependent upon comprehensive software testing, inspections, analyses, and other verification tasks performed at each stage of the software development life cycle. Testing of device software functionality in a simulated use environment, and user site testing are typically included as components of an overall design validation program for a software automated device." -- General Principles of Software Validation; Final Guidance for Industry and FDA Staff, 2002
Private answer
I consider verification to include executing tests against a system-level specification; it is not considered done until all coding, unit testing, and integration have been completed, and that final development work has been verified through testing against the full system-level specification (no matter what development lifecycle is chosen). It must also verify all system-level risk mitigations.
Validation is the execution of tests (or other formal evaluation) against a user specification (or, arguably, its more nebulous surrogate "intended use") that reflects actual workflow (considering manufacture, transportation, installation, user, and patient scenarios).
|
|
Private answer
Marie Suetsugu
Might be of interest to some:
Define medical device software verification and validation (V&V): http://medicaldeviceacademy.com/define-medical-device-software-verification-and-validation/
Private answer
Christopher Smith
Old-school definition: you verify specifications (how you implement the requirements), and you validate requirements (what you want the system to do). Where you draw the line depends on your level of comfort and the regulations you are covered by. In my experience (software and integrated system testing), validation is more a subset of verification, covering "happy path" testing, field tests (beta) and regulatory testing.