Chapter 1: Digital pathology overview and evidence

1.1 Big picture overview and definitions of digital pathology and WSI

What you need to know

  • Digital pathology is the use of digital technology to acquire, store, view, share, and analyze pathology material (especially whole-slide images) as part of routine diagnostics, QA, education, and research.
  • Whole-slide imaging (WSI) turns a glass slide into a high-resolution digital slide that you can navigate on screen: you zoom and pan with a mouse instead of moving a mechanical stage.
  • A functioning digital pathology service is not “a scanner in a room”; it is a stack of components:
    • scanners to digitize slides
    • image management and archives
    • viewers for reporting and teaching
    • integration with LIS and enterprise imaging
    • networks, security, backup, and support.
  • This has become practical only recently because storage, networks, and scanner technology are now good enough, and regulators have started to approve WSI systems for primary diagnosis in surgical pathology.
  • Once your slides are digital, you are no longer just looking at pretty pictures; you are building a data platform that supports QA, audit, remote reporting, research, and AI.
  • A vendor-neutral archive (VNA) is an important concept: images are stored in a standards-based way so that different viewers, tools, and AI pipelines can access them, reducing vendor lock-in over the lifetime of the archive.

Reference
Pantanowitz L, Sharma A, Carter AB, Kurc T, Sussman A, Saltz J. Twenty years of digital pathology: an overview of the road travelled, what is on the horizon, and the emergence of vendor-neutral archives. J Pathol Inform. 2018;9:40. doi:10.4103/jpi.jpi_69_18.


1.2 Clinical drivers and use cases for digital pathology

What you need to know

  • Digital pathology is driven by very practical clinical needs, such as:
    • getting slides to the right subspecialist when they are at another site or working remotely
    • avoiding courier delays and lost slides for tumor boards and consults
    • making it easier to retrieve and review old cases for QA, audit, or teaching.
  • Core clinical use cases you should be able to list:
    • Teleconsultation and second opinions: sending digital slides instead of shipping glass, particularly for rare or complex cases.
    • Tumor boards / MDTs: sharing digital slides on screen so all participants can see the same fields and navigate quickly.
    • Remote or home reporting: primary sign-out using WSI in services that have validated digital workflows.
    • Frozen sections and urgent intraoperative consults: enabling on-call pathologists to review slides from other sites in real time.
    • Education and exams: building stable, sharable case sets without handling fragile glass.
    • Archiving and re-use: keeping digital copies before blocks are exhausted by deeper levels, immunos, or molecular workup.
  • Digital slides are also powerful for quality and safety:
    • easier case review and double reading
    • support for EQA schemes and peer review
    • targeted review of specific case groups (for example, all positive margins over a time period).
  • These same images and metadata form the backbone for metrics, research, and AI, because you can find, label, and reuse cases at scale.
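The "find, label, and reuse cases at scale" idea above can be made concrete with a toy sketch: once case metadata is digital and searchable, a targeted QA review (for example, all positive margins over a period) is a simple query. The record fields and case identifiers below are hypothetical; a real LIS or archive schema will differ.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical case record; real LIS/archive schemas carry many more fields.
@dataclass
class Case:
    case_id: str
    specimen: str
    sign_out_date: date
    margin_status: str  # "positive" | "negative" | "n/a"

def positive_margins(cases, start, end):
    """Targeted review set: all positive-margin cases signed out in a date range."""
    return [c for c in cases
            if c.margin_status == "positive"
            and start <= c.sign_out_date <= end]

archive = [
    Case("S24-0001", "breast WLE", date(2024, 3, 2), "negative"),
    Case("S24-0102", "breast WLE", date(2024, 4, 9), "positive"),
    Case("S24-0310", "prostatectomy", date(2024, 6, 20), "positive"),
    Case("S23-0999", "prostatectomy", date(2023, 11, 5), "positive"),
]

review = positive_margins(archive, date(2024, 1, 1), date(2024, 12, 31))
# the 2023 case falls outside the review window and is excluded
```

The point is not the code itself but that a glass archive cannot be queried this way at all; a digital archive with clean metadata can.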

Reference
Farahani N, Parwani AV, Pantanowitz L. Whole slide imaging in pathology: advantages, limitations, and emerging perspectives. Pathol Lab Med Int. 2015;7:23–33. doi:10.2147/PLMI.S59826.


1.3 Evidence that WSI is safe and noninferior for primary diagnosis

What you need to know

  • Safety of WSI for primary diagnosis has been tested in formal noninferiority studies, not just small pilots.
  • The pivotal trial you should recognize:
    • multicenter, blinded, randomized noninferiority study
    • 1992 surgical pathology cases from several institutions
    • pathologists reported cases on glass and on WSI with a washout period
    • major discrepancies were compared with a reference diagnosis.
  • The key result:
    • the major diagnostic discordance rate on WSI did not exceed that on glass by more than the pre-specified noninferiority margin (4 percentage points)
    • for the case mix studied, WSI was therefore noninferior to conventional microscopy for primary diagnosis.
  • Additional single-center and multi-center studies have shown:
    • equivalent diagnostic performance
    • similar or improved efficiency once pathologists are comfortable with digital sign-out.
  • Important caveats you should be able to explain:
    • each lab still needs a local validation because scanners, viewers, staining, and case mix differ
    • some domains (for example, certain hematopathology or cytology tasks) may need more caution or glass fallback depending on local experience
    • pathologists require a training and adaptation phase, with dual reporting, before fully relying on WSI.
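The noninferiority logic above can be sketched numerically. All counts below are hypothetical (see the trial report for the actual figures), and the Wald interval for a difference of two independent proportions is a deliberate simplification of the trial's paired analysis; the structure of the conclusion is the same: noninferiority is declared only if the upper confidence bound for the excess WSI discordance stays below the margin.

```python
import math

n = 1992                  # cases per arm (paired in the real trial)
major_disc_wsi = 98       # major discordances on WSI (hypothetical count)
major_disc_glass = 91     # major discordances on glass (hypothetical count)
margin = 0.04             # pre-specified noninferiority margin, 4 percentage points

p_wsi = major_disc_wsi / n
p_glass = major_disc_glass / n
diff = p_wsi - p_glass    # excess discordance on WSI vs glass

# Approximate 95% CI for the difference of two proportions (unpaired Wald).
se = math.sqrt(p_wsi * (1 - p_wsi) / n + p_glass * (1 - p_glass) / n)
upper_bound = diff + 1.96 * se

# Noninferiority holds if even the upper bound of the excess discordance
# is below the margin; a point estimate below the margin is not enough.
noninferior = upper_bound < margin
```

Note the asymmetry: this design can show WSI is "not meaningfully worse" than glass, which is the clinically relevant question, rather than that the two are identical.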

Reference
Mukhopadhyay S, Feldman MD, Abels E, et al. Whole slide imaging versus microscopy for primary diagnosis in surgical pathology: a multicenter blinded randomized noninferiority study of 1992 cases. Am J Surg Pathol. 2018;42(1):39–52. doi:10.1097/PAS.0000000000000948.


1.4 Implementation, change management, and service redesign

What you need to know

  • Implementing digital pathology is a service redesign, not just an equipment purchase:
    • it changes how specimens are processed, how slides are tracked, how cases appear in worklists, and how reports are produced and checked.
  • Key elements of a safe rollout:
    • a clearly defined scope (for example, start with specific subspecialties, case types, or frozen sections)
    • explicit governance with named leads from pathology, IT, and management
    • a realistic infrastructure plan for scanners, storage, network, backup, and support.
  • Pathologist training and individual validation are central:
    • each pathologist needs a defined validation set of cases reported on both glass and digital
    • there should be documented acceptance criteria before they use WSI for primary diagnosis
    • training must cover the viewer, navigation, measurement tools, and indications for pulling glass.
  • You should be able to describe core workflow redesign questions:
    • when slides are scanned relative to staining and coverslipping
    • how barcodes and identifiers travel from LIS to scanner to viewer
    • what the digital worklist looks like, including how urgent cases are flagged
    • how late immunos, additional levels, and re-cuts fit into the digital workflow.
  • Safety measures before go-live:
    • clear rules for when to revert to glass
    • tested downtime procedures for scanner, viewer, or network failures
    • ongoing QA and audit to monitor image quality, turnaround time, and error rates.
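One of the workflow questions above, how urgent cases are flagged in the digital worklist, comes down to an ordering rule. A minimal sketch, assuming a simple two-level priority (urgent vs routine) with oldest-first ordering within each level; field names and case identifiers are hypothetical.

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical worklist entry; real LIS worklists carry many more fields
# (site, subspecialty queue, scan status, assigned pathologist, ...).
@dataclass
class WorklistEntry:
    case_id: str
    received: datetime
    urgent: bool = False  # e.g. frozen section or clinician-flagged case

def ordered_worklist(entries):
    """Urgent cases first; within each group, oldest received first."""
    return sorted(entries, key=lambda e: (not e.urgent, e.received))

entries = [
    WorklistEntry("S24-0501", datetime(2024, 5, 1, 9, 0)),
    WorklistEntry("S24-0502", datetime(2024, 5, 1, 9, 30), urgent=True),
    WorklistEntry("S24-0500", datetime(2024, 5, 1, 8, 45)),
]

queue = ordered_worklist(entries)
# the urgent case sorts to the front even though it arrived last
```

In practice the flag must travel with the case from the LIS through the scanner to the viewer, which is exactly why the barcode/identifier question in the list above matters.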

Reference
Cross S, Furness P, Igali L, Snead D, Treanor D. Best practice recommendations for implementing digital pathology. London: The Royal College of Pathologists; 2018. (G162).


1.5 Where AI and computational pathology fit into the picture

What you need to know

  • AI in pathology depends on digital pathology:
    • algorithms require large numbers of well-scanned, well-annotated slides plus clinical data
    • without routine WSI and good data management, serious AI development and deployment are very difficult.
  • Most current tools fall into a few practical categories:
    • Detection / triage (for example, screening prostate biopsies, detecting micrometastases in lymph nodes)
    • Quantification (for example, Ki-67 index, ER/PR/HER2 scoring, PD-L1 proportion scores)
    • Grading and pattern analysis (for example, Gleason grading, tumor budding, tumor-infiltrating lymphocytes)
    • Prediction of molecular alterations or outcomes directly from H&E in research contexts.
  • The realistic current state:
    • many models show strong performance in research and translational cohorts
    • only a smaller number of tools are fully regulated and deployed in routine diagnostics, usually for narrow, well-defined tasks
    • human pathologists remain firmly in the loop, interpreting AI output rather than being replaced.
  • Limitations you should be able to articulate:
    • generalizability problems when models trained on one lab’s data are applied to another’s without adaptation
    • bias and data quality issues if the training set is unrepresentative
    • the need for prospective validation, governance, and monitoring, just like any other diagnostic device or assay.
  • How this ties back to your digital program:
    • good scanning quality, robust LIS integration, and clean, searchable archives make AI safer and more useful
    • pathologists need a basic understanding of how these systems work so they can spot failures, explain results, and participate in procurement and governance decisions.
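As a toy illustration of the "quantification" category above: a Ki-67 proportion index is simply positive tumor nuclei divided by total tumor nuclei, pooled over the scored region. The per-field counts below are hypothetical; real tools derive them from nucleus detections across whole-slide regions, and cut-offs and scoring regions are defined by clinical guidelines, not by the arithmetic.

```python
def ki67_index(positive_nuclei: int, total_nuclei: int) -> float:
    """Ki-67 proportion index: positive tumor nuclei / total tumor nuclei."""
    if total_nuclei == 0:
        raise ValueError("no tumor nuclei counted")
    return positive_nuclei / total_nuclei

# Hypothetical (positive, total) nucleus counts per scored field,
# e.g. from an algorithm's detections in selected tumor regions.
fields = [(34, 210), (18, 190), (41, 250)]
positive = sum(p for p, _ in fields)
total = sum(t for _, t in fields)

index = ki67_index(positive, total)  # pooled index across all fields
print(f"Ki-67 index: {index:.1%}")
```

Pooling counts before dividing (rather than averaging per-field percentages) weights each nucleus equally, which is why fields with more tumor contribute more to the final index; this is one of the implementation details a pathologist reviewing AI output should be able to question.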

Reference
Baxi V, Edwards R, Montalto M, Saha S. Digital pathology and artificial intelligence in translational medicine and clinical practice. Mod Pathol. 2022;35(1):23–32. doi:10.1038/s41379-021-00919-2.