Chapter 7: Validation and quality assurance

This chapter covers how to prove that your digital pathology system is safe for diagnostic use, how to keep it that way over time, and how to document everything in a way that satisfies regulators and accrediting bodies.


7.1 Why validation is mandatory, not optional

What you need to know

  • Moving from glass to digital slides is a major change to the diagnostic process; regulators expect you to prove that your digital system is at least as safe as your previous workflow.
  • Validation has three main purposes:
    • Show that diagnoses made on WSI are concordant with those made on glass slides for your own pathologists and case mix.
    • Define the scope of safe use (for example, which specimen types, stains, and clinical scenarios are covered).
    • Provide a documented basis for accreditation and for patient safety governance.
  • Good validation studies:
    • Use enough cases to represent the spectrum of work in your lab, including challenging diagnoses.
    • Separate training from testing (pathologists should not validate on cases they just used to learn the system).
    • Include a washout period between glass and digital review.
  • Professional guidelines give concrete recommendations for case numbers, washout periods, and documentation expectations; following them closely saves time when you are inspected.

Reference
Evans AJ, Brown RW, Bui MM, et al. Validating whole slide imaging systems for diagnostic purposes in pathology: Guideline update from the College of American Pathologists in collaboration with the American Society for Clinical Pathology and the Association for Pathology Informatics. Arch Pathol Lab Med. 2022;146(4):440–450. doi:10.5858/arpa.2020-0723-CP. Available at: https://doi.org/10.5858/arpa.2020-0723-CP


7.2 What counts as a “system” and when you need to re‑validate

What you need to know

  • Validation is about the whole system that produces and displays images for clinical use, typically including:
    • Scanner hardware and software.
    • Image management system.
    • Viewers and displays.
    • Network and storage environment.
  • Significant changes to any of these components may require some degree of re‑validation, for example:
    • New scanner model or major firmware upgrade.
    • Change in viewer or image management platform.
    • New display model for primary diagnosis.
    • New staining protocols or specimen types that were not covered in the original study.
  • Guidelines distinguish between:
    • Full validation for initial deployment and major system changes.
    • Targeted validation or verification when adding new use cases or making smaller changes.
  • A simple way to think about it: if a change could plausibly affect what the pathologist sees or how they interact with the slide, you likely need at least some validation work.

Reference
Evans AJ, Brown RW, Bui MM, et al. Validating whole slide imaging systems for diagnostic purposes in pathology: Guideline update from the College of American Pathologists in collaboration with the American Society for Clinical Pathology and the Association for Pathology Informatics. Arch Pathol Lab Med. 2022;146(4):440–450. doi:10.5858/arpa.2020-0723-CP. Available at: https://doi.org/10.5858/arpa.2020-0723-CP


7.3 Designing a validation study: cases, concordance, and washout

What you need to know

  • Classic validation studies compare diagnoses made on WSI with those made on glass slides for the same cases.
  • Key design elements:
    • Case selection: include a representative mix of your real workload, not only “easy” cases; make sure key subspecialties and specimen types are covered.
    • Number of cases: professional guidelines set minimums (the CAP guideline recommends at least 60 cases for initial validation), but larger samples give narrower confidence intervals for the concordance estimate.
    • Read order and washout: reduce recall bias by separating the glass and digital reads of each case in time; the CAP guideline recommends a washout period of at least 2 weeks, and published studies have used anywhere from 2 weeks to several months.
    • Concordance definitions: define what counts as “major” versus “minor” discrepancy and how you will adjudicate disagreements.
  • Validation should focus on diagnostic outcomes, not just image quality:
    • The question is not “does this slide look nice?” but “does it lead to the same clinically relevant diagnosis and grading as glass?”
  • Published studies provide concrete examples of how to design and report validation work, including how they dealt with borderline discrepancies and how they interpreted results.
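The concordance arithmetic behind these designs is simple enough to check by hand. As a minimal sketch, the snippet below computes an observed concordance rate and a 95% confidence interval for a hypothetical validation set; the case counts are invented, and the Wilson score interval is one common choice for a proportion interval (an assumption, not a guideline requirement):

```python
import math

def wilson_ci(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Wilson score 95% confidence interval for a proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - half, centre + half

# Hypothetical result: 58 of 60 cases concordant after adjudication.
concordant, total = 58, 60
rate = concordant / total
low, high = wilson_ci(concordant, total)
print(f"concordance {rate:.1%}, 95% CI {low:.1%}-{high:.1%}")
```

Running the same calculation with more cases at the same observed rate illustrates the point in the bullet above: the interval narrows, so a marginal result becomes easier to interpret.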

Reference
Bauer TW, Schoenfield L, Slaw RJ, et al. Validation of whole slide imaging for primary diagnosis in surgical pathology. Arch Pathol Lab Med. 2013;137(4):518–524. doi:10.5858/arpa.2011-0678-OA. Available at: https://doi.org/10.5858/arpa.2011-0678-OA


7.4 Ongoing quality assurance and monitoring after go‑live

What you need to know

  • Validation is the starting point; ongoing quality assurance keeps the system safe over time.
  • Important components of a QA program:
    • Daily or per‑shift checks that scanners, viewers, and displays are functioning as expected.
    • Periodic review of rescans, scanner errors, and viewer crashes.
    • Monitoring of turn‑around times to detect workflow issues.
    • Routine checks that displays remain within acceptable performance ranges.
  • External quality assurance and proficiency testing can incorporate digital slides; this both tests your system and gives pathologists practice with digital material.
  • It is useful to define and track a small number of key performance indicators (KPIs) for digital pathology, such as:
    • Scanner uptime.
    • Rescan rates.
    • Percentage of cases reported fully digitally versus requiring glass.
  • QA processes should be integrated into your laboratory quality management system rather than treated as something separate for “digital only.”
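Tracking the KPIs listed above needs very little tooling. The sketch below derives the three headline indicators from raw monthly counts; the record structure, field names, and numbers are all hypothetical, and a real laboratory would pull these counts from its scanner logs and LIS:

```python
from dataclasses import dataclass

@dataclass
class MonthlyStats:
    """Hypothetical raw counts collected for one scanner over one month."""
    slides_scanned: int
    rescans: int
    cases_total: int
    cases_fully_digital: int
    scanner_hours_up: float
    scanner_hours_scheduled: float

def kpis(m: MonthlyStats) -> dict[str, float]:
    """Derive the three headline KPIs as fractions in [0, 1]."""
    return {
        "scanner_uptime": m.scanner_hours_up / m.scanner_hours_scheduled,
        "rescan_rate": m.rescans / m.slides_scanned,
        "fully_digital_rate": m.cases_fully_digital / m.cases_total,
    }

# Hypothetical month; thresholds for flagging would be agreed locally.
month = MonthlyStats(12_000, 180, 3_100, 2_950, 700.0, 720.0)
for name, value in kpis(month).items():
    print(f"{name}: {value:.1%}")
```

Keeping the indicators as plain fractions makes it easy to trend them month on month and to feed them into the wider quality management system alongside non-digital metrics.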

Reference
Cross S, Furness P, Igali L, Snead D, Treanor D. Best practice recommendations for implementing digital pathology. The Royal College of Pathologists; 2018. Available at: https://www.rcpath.org/static/f465d1b3-797b-4297-b7fedc00b4d77e51/Best-practice-recommendations-for-implementing-digital-pathology.pdf


7.5 Documentation, governance, and communicating scope

What you need to know

  • Clear documentation is essential for inspectors, for colleagues, and for your future self:
    • Validation protocols and reports.
    • Lists of systems and versions included in each validation.
    • Descriptions of which specimen types and stains are in scope.
    • Records of re‑validation after system changes.
  • Governance structures help decision‑making and accountability:
    • A digital pathology steering group or committee.
    • Named clinical and technical leads.
    • Defined process for approving new use cases or system changes.
  • Clinicians and managers outside pathology need simple, honest explanations of what your digital system can and cannot safely do.
  • Good practice is to:
    • Embed digital pathology validation and QA into your wider clinical governance framework.
    • Make sure that all users (including trainees and locums) know where to find policies and whom to contact when problems arise.
  • Using published guidelines as the backbone for your local documentation makes it easier to defend your choices during accreditation and helps to keep your practices aligned with international standards.

Reference
Evans AJ, Brown RW, Bui MM, et al. Validating whole slide imaging systems for diagnostic purposes in pathology: Guideline update from the College of American Pathologists in collaboration with the American Society for Clinical Pathology and the Association for Pathology Informatics. Arch Pathol Lab Med. 2022;146(4):440–450. doi:10.5858/arpa.2020-0723-CP. Available at: https://doi.org/10.5858/arpa.2020-0723-CP