Research Article | Special Contribution

Gender and Ethnicity Bias of Text-to-Image Generative Artificial Intelligence in Medical Imaging, Part 2: Analysis of DALL-E 3

Geoffrey Currie,1,2 Johnathan Hewis,3 Elizabeth Hawk,4 and Eric Rohren1,2

Journal of Nuclear Medicine Technology June 2025, 53 (2) 162-168; DOI: https://doi.org/10.2967/jnmt.124.268359

1Charles Sturt University, Wagga Wagga, New South Wales, Australia; 2Baylor College of Medicine, Houston, Texas; 3Charles Sturt University, Port Macquarie, New South Wales, Australia; and 4Stanford University, Stanford, California

Abstract

Disparity in gender and ethnicity remains an issue across medicine and health science. Only 26%–35% of trainee radiologists are female, despite more than 50% of medical students being female. Similar gender disparities are evident across the medical imaging professions. Generative artificial intelligence text-to-image production could reinforce or amplify these gender biases.

Methods: In March 2024, DALL-E 3 was used via GPT-4 to generate a series of individual and group images of medical imaging professionals: radiologist, nuclear medicine physician, radiographer, nuclear medicine technologist, medical physicist, radiopharmacist, and medical imaging nurse. Multiple iterations of images were generated using a variety of prompts. Collectively, 120 images were produced, containing 524 characters for evaluation. All images were independently analyzed for apparent gender and skin tone by 3 expert reviewers from the medical imaging professions.

Results: Across individual and group images, 57.4% (n = 301) of medical imaging professionals were depicted as male, 42.4% (n = 222) as female, and 91.2% (n = 478) as having a light skin tone. Male representation was 65% for radiologists, 62% for nuclear medicine physicians, 52% for radiographers, 56% for nuclear medicine technologists, 62% for medical physicists, 53% for radiopharmacists, and 26% for medical imaging nurses. For all professions, this overrepresents men relative to the actual gender distribution of the workforce. There was no representation of persons with a disability.

Conclusion: This evaluation reveals a significant overrepresentation of men in generative artificial intelligence text-to-image production using DALL-E 3 across the medical imaging professions. The generated images show a disproportionately high representation of white men, which does not reflect the diversity of the medical imaging professions.
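The workflow described in the methods (prompting a text-to-image model once per professional role, then tallying reviewer-coded characters into representation percentages) can be sketched in a few lines of Python. The sketch below is illustrative only: the study generated images through DALL-E 3 via GPT-4 (the ChatGPT interface), whereas this example assumes direct use of the OpenAI Images API, and the prompt wording, helper names, and tally logic are hypothetical rather than the authors' exact protocol.

from openai import OpenAI  # pip install openai; assumes OPENAI_API_KEY is set

client = OpenAI()

# The seven professional roles evaluated in the study.
ROLES = [
    "radiologist", "nuclear medicine physician", "radiographer",
    "nuclear medicine technologist", "medical physicist",
    "radiopharmacist", "medical imaging nurse",
]

def generate_role_image(role: str) -> str:
    """Request one DALL-E 3 image for a role and return the image URL.
    The prompt wording here is illustrative, not the study's exact prompt."""
    response = client.images.generate(
        model="dall-e-3",
        prompt=f"A photograph of a {role} at work in a hospital imaging department",
        size="1024x1024",
        n=1,
    )
    return response.data[0].url

def representation(counts: dict[str, int], total: int) -> dict[str, float]:
    """Convert reviewer-coded character counts into percentages of all evaluated characters."""
    return {label: 100 * n / total for label, n in counts.items()}

# Worked example with the abstract's aggregate figures:
# 301 male and 222 female characters of 524 evaluated.
print(representation({"male": 301, "female": 222}, total=524))
# -> approximately 57.4% male and 42.4% female, matching the reported values

Repeating such a call across the role list and several prompt variants would yield an image set like the 120 images described above; the percentage helper simply reproduces the abstract's aggregate arithmetic.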

  • generative artificial intelligence
  • bias
  • diversity
  • nuclear medicine
  • radiology

Footnotes

  • Published online Oct. 22, 2024.
