Research Article | Practice Management

Technical Peer Review: Methods and Outcomes

Andrew M. Keenan, Toni Cranston, Kelsey Hill and Derek J. Stocker
Journal of Nuclear Medicine Technology December 2017, 45 (4) 309-313; DOI: https://doi.org/10.2967/jnmt.117.198473
Nuclear Medicine Service, Department of Radiology, Walter Reed National Military Medical Center, Bethesda, Maryland

Abstract

Peer review is routine among physicians, nurses, and pharmacy staff yet is uncommon in the field of nuclear medicine technology. Although not required by regulatory agencies, nuclear medicine technical peer review can greatly enhance the quality of patient care in both hospital and outpatient settings. To date, detailed methods for accomplishing this task have not been published. Methods: In total, 19,688 nuclear medicine studies performed at a single institution over a 5-y period were critically reviewed. Major findings (errors with the potential to change physician interpretation of the study or that resulted in prescription error) and minor findings (errors without an adverse effect on study outcome or interpretation) were identified and tabulated monthly according to finding type, study type, and individual staff member. Results: The technical peer review method used at our institution provided a comprehensive means of measuring the rate and types of errors. Over time, this system tracked the performance of nuclear medicine staff and students, providing feedback that led to a measurable reduction in errors. Conclusion: We present a technical peer review system, based on our own experience, that can be adapted by other nuclear medicine facilities to fit their needs.

  • peer review
  • performance improvement
  • nuclear medicine technology

Medical peer review has been a longstanding process for practitioners, dating back to the 19th century (1). Peer review is also common in other clinical disciplines, such as nursing and pharmacy. The U.S. Congress enacted the Medicare Improvements for Patients and Providers Act of 2008 (2), which sets requirements for providers of advanced diagnostic imaging. These include a mandate for accreditation, effective January 1, 2012, which carries implications for reimbursement. Many regulatory agencies base their assessments of medical staff in part on ongoing performance-based evaluations that include peer review (3). Currently, such agencies as the Joint Commission, American College of Radiology, Accreditation Council for Graduate Medical Education, and Intersocietal Accreditation Commission have set standards for purposes of accreditation, certification, licensing, credentialing, or privileging of medical and technical staff. Furthermore, the Society of Nuclear Medicine and Molecular Imaging Technologist Section has published the “Nuclear Medicine Technologist Scope of Practice and Performance Standards” (4), and the Intersocietal Accreditation Commission has published standards for technical quality review (5). However, performance evaluation for nuclear medicine (NM) technology through a formal peer review process has yet to be addressed.

The American College of Radiology has developed a peer review scoring system for radiologists, entitled RADPEER, in which a qualified radiologist scores the original interpretation using a scale from 1 to 4: 1 denotes “concur with interpretation”; 2, “difficult diagnosis, not ordinarily expected to be made”; 3, “diagnosis should be made most of the time”; and 4, “diagnosis should be made almost every time, misinterpretation of findings” (6). Presently, no such scoring system for comprehensive NM technical peer review has been reported.

We present here our methods and outcomes using a simpler grading scale: minor and major findings, and acceptable and unacceptable studies. Results were reviewed to identify trends, to monitor the performance improvement of student technologists and newly hired employees, and to provide ongoing, constructive feedback to all technical staff members. We conducted an extensive, meticulous review of all NM studies performed, in part because our institution serves as a phase II site for the Nuclear Medicine Technologist Training Program, Medical Education and Training Campus, Fort Sam Houston, Texas.

MATERIALS AND METHODS

Quality assessment data collected as part of an ongoing NM technical peer review process over the 5-y period January 1, 2012, through December 31, 2016, were retrospectively reviewed and tabulated. In total, 19,688 NM studies were included in this review. Each study was critically appraised for errors and deficiencies in specific categories by a senior NM technologist assigned to this purpose. Findings were grouped into the general categories of patient information (patient identification, study orders, other administrative errors), radiopharmaceutical (prescription error, misadministration), and imaging (subcategorized here into planar/SPECT and PET/CT), as shown in Tables 1–4, and were classified as major or minor. Major findings were errors that had the potential to change physician interpretation of the study or that resulted in prescription error; minor findings were errors without an adverse effect on study outcome or interpretation.

TABLE 1. Patient Order, Information, and Administrative Errors

TABLE 2. Radiopharmacy and Prescription Errors

TABLE 3. Image Errors on Planar and SPECT Studies

TABLE 4. Image Errors on PET/CT Studies

The technical peer review report was compiled at the end of each month from the daily data collected by the NM technologist assigned to perform the quality control review. The review examined every study and determined whether it met the criteria for acceptability on the basis of the number of major and minor findings. A study with no major findings and fewer than 4 minor findings was technically acceptable; a study with a major finding or with 4 or more minor findings (4 minor findings equaling 1 major finding) was technically unacceptable. Findings were further tabulated for each study type and for each individual NM staff member using an anonymous code number known only to that individual and to the supervisory technologist. The frequency of findings per month or per year, expressed as a percentage, was calculated by dividing the total number of findings associated with each NM staff member by the total number of studies in which that staff member participated; because a single finding may be attributed to more than one staff member, this calculation can overstate the error rate of an individual.
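
As an illustration, the acceptability rule and the error-rate calculation can be written out directly. The following is a minimal sketch in Python of the logic described above; the data structures and names are illustrative and do not represent our actual record-keeping system.

```python
# Minimal sketch of the acceptability rule and per-staff error-rate
# calculation described in the text. Names and structures are
# illustrative, not our actual record-keeping system.
from dataclasses import dataclass

@dataclass
class Finding:
    category: str  # "patient information", "radiopharmaceutical", or "imaging"
    major: bool    # True if the error could change physician interpretation
                   # or resulted in a prescription error

def study_acceptable(findings: list[Finding]) -> bool:
    # Unacceptable: any major finding, or 4 or more minor findings
    # (4 minor findings equaling 1 major finding).
    majors = sum(1 for f in findings if f.major)
    minors = sum(1 for f in findings if not f.major)
    return majors == 0 and minors < 4

def error_rate(total_findings: int, studies_participated: int) -> float:
    # Monthly or yearly rate for one staff member: total findings
    # attributed to that member divided by the total studies in which
    # the member participated. A finding shared among several staff
    # members counts against each of them, so rates can overstate
    # individual error.
    return total_findings / studies_participated

# Example: 4 minor findings alone make a study unacceptable.
assert not study_acceptable([Finding("imaging", major=False)] * 4)
```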

RESULTS

The 19,688 NM studies performed over the 5-y period were reviewed and are summarized in Table 5. The findings were further tabulated monthly according to finding type, study type, and individual staff member. The goal for the number of unacceptable cases was set at no more than 5% of cases reviewed per month and per year. For the 12 mo of the 2016 review process, 3,710 studies were reviewed; 92.5% were judged acceptable and 7.5% unacceptable, exceeding the 5% goal (Table 6).
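
For scale, converting these percentages into counts (our arithmetic; the counts themselves are not quoted from Table 6):

$$0.075 \times 3710 \approx 278 \ \text{unacceptable studies}, \qquad 0.05 \times 3710 \approx 186 \ \text{permitted under the 5\% goal}.$$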

TABLE 5. Total Findings

TABLE 6. 2016 Results

The number and types of individual findings were tabulated for each study to identify the most common errors. The errors frequent enough to be considered a trend were errors of omission or inattention to detail, most often a lack of appropriate documentation on images or forms. These errors were not study-specific; the same types occurred regardless of study type. Examples include incorrect study labels or patient information; incorrect acquisition, processing, or formatting of screen saves; and missing images.

On the other hand, several frequent findings were mainly study-specific. These included, in PET/CT scans, not performing the acquisition at 60 ± 10 min after injection or entering the injection time or dose incorrectly into the SUV program; in bone scans, starting the blood-flow study too early or too late, failing to acquire one or more required images, or acquiring with an incorrect time per frame, total time, or total counts; and, in lung scans, omitting the “right” and “left” labels on the images.
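
To illustrate why the injection time and dose entries are critical in PET/CT, recall the conventional body-weight SUV definition (shown here for illustration; this is the standard formula, not a transcription of our protocol):

$$\mathrm{SUV} = \frac{C_{\text{tissue}}(t)}{D_0 \cdot 2^{-\Delta t / T_{1/2}} / W},$$

where $C_{\text{tissue}}(t)$ is the measured activity concentration at scan time, $D_0$ is the assayed dose at injection, $\Delta t$ is the time from injection to acquisition, $T_{1/2}$ is the radionuclide half-life (109.8 min for 18F), and $W$ is the patient's body weight. An error of $\epsilon$ minutes in the recorded injection time scales the decay-corrected dose, and therefore every SUV in the study, by $2^{\epsilon/T_{1/2}}$; a 10-min error, for example, shifts 18F SUVs by $2^{10/109.8} \approx 6.5\%$.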

Many errors were identified and corrected immediately on discovery, before completion of the study, and had no adverse impact (e.g., pharmacy label corrected, site reimaged, or study reprocessed); these findings were nevertheless recorded for peer review purposes only.

Among the pharmacy group, the most common findings were a missing pharmacy label or a label on which the date of birth or identification number was incorrect or the patient’s name was misspelled. There were no radiopharmaceutical misadministrations, unexpected adverse reactions, or reportable events.

DISCUSSION

Preventable medical errors carry a heavy price in both human lives and dollars (7). The practice of NM technology involves numerous critical steps to achieve optimal results; therefore, the potential for error, ranging from inconsequential to life-threatening, is great at any point from the time the patient first enters the department until the study is presented for final physician interpretation. Regularly scheduled reviews by a qualified medical physicist are useful for proper license maintenance and provide feedback and guidance to medical and technical staff, but they focus mostly on regulatory compliance, documentation, and equipment performance rather than on the day-to-day actions of individual NM staff members.

As in any profession, error-rate measurement alone does not improve performance; feedback and retraining must be ongoing for outcomes to improve. This is best illustrated in the aviation industry, where small errors can yield disastrous outcomes yet are extremely rare because of rigorous review and retraining programs (8). In medical imaging, a real-time comment-enhanced program for radiologist peer review has been reported to produce measurable improvement in radiologist compliance (9). Similar results were observed in our experience, in which most NM staff members showed noticeable improvement (Table 7). All 13 staff members with at least 2 y of data showed a decrease in error rate, from a mean of 21.9% (SD, 12.1%) in their first year to 14.8% (SD, 9.0%) in their second year (P = 0.001, paired t test). NM staff were further categorized by the number of years active at this institution as new (<5 y) or senior (≥5 y). NM staff members 5 and 6, both hired in 2013, showed a large decline in percentage of findings, from greater than 30% during the first year to less than 10% after 2 y. Review of the 8 senior NM staff (NM staff members 9–16) also showed a significant change in error rate over time (P = 0.014), from a mean of 19.8% (SD, 13.9%) in 2012 to 13.1% (SD, 10.8%), 10.7% (SD, 6.2%), 11.0% (SD, 5.8%), and 11.4% (SD, 8.2%) in 2013, 2014, 2015, and 2016, respectively.

TABLE 7. NM Staff Members, 2012–2016
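
The paired comparison reported above can be reproduced with standard statistical tooling. The following is a minimal sketch using scipy.stats.ttest_rel; the error rates listed are hypothetical placeholders, not the values in Table 7.

```python
# Minimal sketch of the paired t test described in the text. The rates
# below are hypothetical placeholders, not the values in Table 7.
from scipy import stats

# First- and second-year error rates for the same staff members,
# in the same order (paired observations).
year1 = [0.35, 0.31, 0.28, 0.22, 0.19, 0.15, 0.12, 0.10]
year2 = [0.09, 0.08, 0.20, 0.15, 0.14, 0.11, 0.10, 0.08]

t_stat, p_value = stats.ttest_rel(year1, year2)
print(f"paired t = {t_stat:.2f}, P = {p_value:.3f}")
```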

Prevention of errors is essential to performance improvement in any endeavor. Recently reported results from the Australian Radiation Incident Register demonstrate that in 85.6% of NM incidents, the primary cause was failure to comply with time-out protocols, with incorrect radiopharmaceutical being the most common error (10). In our institution, technical peer review has led to implementation of a prestudy checklist unique to each examination type, and mandatory time-out protocols are in place for all therapeutic and quality-management-program procedures.

Peer review findings should be discussed in a group setting so that lessons can be shared, specific elements of study performance can be presented as teaching points, and staff receive an ongoing learning experience. In our institution, errors are reviewed in detail with the technical staff at regularly scheduled meetings, taking care not to disclose the identities of individual staff members. Assessment of findings by study type allows NM staff as a group to recognize study-specific pitfalls, and applicable training sessions can be held with the goal of reducing those errors. Additionally, review of findings for each individual NM student and staff member can be used to privately counsel that individual and, when needed, guide remedial actions to reduce error. This gives NM staff members a record of exactly what types of errors they have made over the past year so that they can concentrate on improving in those areas.

CONCLUSION

The peer review system presented here is intended as an example that can be adapted by other NM facilities. Such a system can be used to track the progress of NM students and newly employed NM staff and to provide a mechanism for quality improvement among all NM staff. Technical peer review can be time-consuming; it is best performed daily or weekly, if possible, to avoid a burdensome backlog, and it should be performed by a designated, experienced NM technologist. A checklist of indicators and a simple scoring system, as shown here, can standardize and streamline the technical peer review process, making it more efficient and cost-effective. Individual institutions are encouraged to learn from our experience and to adapt a technical peer review process of their own, using the elements best suited to their needs, with the goal of reducing error.

DISCLOSURE

The views expressed in this article are those of the authors and do not reflect the official policy of the Department of the Army, Navy, or Air Force; the Department of Defense; or the U.S. Government. No potential conflict of interest relevant to this article was reported.

Footnotes

  • Published online Aug. 10, 2017.

REFERENCES

1. Allen TC. Medical peer review. Pathol Case Rev. 2012;17:148–156.
2. Medicare Improvements for Patients and Providers Act of 2008, section 135(e): imaging provisions, accreditation requirement for advanced diagnostic imaging services. Public Law 110–275; 2008.
3. Mahgerefteh S, Kruskal JB, Yam CS, Blachar A, Sosna J. Peer review in diagnostic radiology: current state and a vision for the future. Radiographics. 2009;29:1221–1231.
4. SNMMI-TS Scope of Practice Task Force. Nuclear medicine technologist scope of practice and performance standards. J Nucl Med Technol. 2017;45:325–336.
5. The IAC Standards and Guidelines for Nuclear/PET Accreditation. Ellicott City, MD: Intersocietal Accreditation Commission; 2016 (section 2.1.2C):50.
6. Jackson VP, Cushing T, Abujudeh HH, et al. RADPEER scoring white paper. J Am Coll Radiol. 2009;6:21–25.
7. Kohn LT, Corrigan JM, Donaldson MS, eds. To Err Is Human: Building a Safer Health System. Washington, DC: National Academies Press; 1999.
8. Larson DB, Nance JJ. Rethinking peer review: what aviation can teach radiology about performance improvement. Radiology. 2011;259:626–632.
9. Swanson JO, Thapa MM, Iyer RS, Otto RK, Weinberger E. Optimizing peer review: a year of experience after instituting a real-time comment-enhanced program at a children's hospital. AJR. 2012;198:1121–1125.
10. Kearney N, Denham G. Recommendations for nuclear medicine technologists drawn from an analysis of errors reported in Australian radiation incident registers. J Nucl Med Technol. 2016;44:243–247.
  • Received for publication July 5, 2017.
  • Accepted for publication August 7, 2017.