ARTIFICIAL INTELLIGENCE

The Role and Accuracy of Generative Pretrained Transformer Tools in Men’s Health: Bridging the Gap

By: Rohit Reddy, MD, University of South Florida–HCA, Brandon; Roei Golan, BS, Florida State University College of Medicine, Tallahassee; Ranjith Ramasamy, MD, Desai Sethi Urology Institute, University of Miami Miller School of Medicine, Florida | Posted on: 19 Jan 2024

Purpose

The rapid evolution of generative pretrained transformers (GPTs) has ushered in a new era of information dissemination, particularly in the health care sector. This article examines the potential of GPT tools to deliver men’s health information, emphasizing their accuracy, benefits, risks, and the current state of research.

Introduction

GPTs, a subset of large language models, have shown promise in various applications, from language translation to creative content generation. Their integration into health care has been particularly noteworthy, offering patients personalized information and support. However, the accuracy of these tools, especially concerning sensitive topics like men’s health, warrants further investigation.

Benefits of GPT in Men’s Health

GPTs offer a personalized approach, tailoring information to a patient’s medical history and symptoms. Their ability to translate medical data into various languages and formats enhances accessibility, addressing barriers such as language and disability.1 Furthermore, GPTs serve as a source of support, answering queries and offering encouragement throughout treatment.2 Urologists who want the latest information on restorative therapies for Peyronie’s disease, for example, can employ GPTs to swiftly gather updated research articles, potential treatment advancements, and patient testimonials. By harnessing the vast body of literature on which these models are trained, urologists can stay at the forefront of medical innovation in Peyronie’s disease. Collaboration, continuous investigation, and regular updating of reliable GPT information will be crucial to maintaining the efficacy and accuracy of these tools for medical professionals.

Personalized health information: Tools such as Ada Health3 and K Health4 have been designed to assimilate a patient’s medical history and symptoms, subsequently generating tailored information about their condition, potential treatments, and prognosis.

Accessibility: Platforms like Babylon Health5 and HealthTap6 have the capability to translate medical data into various languages and formats, including text, audio, and video. This not only broadens the reach of the information but also makes it more accessible to non-English speakers and individuals with disabilities.

Support and encouragement: Solutions like Glass Health’s AI clinical assistant7 and MedChat8 stand out by offering patients motivational messages and addressing treatment-related queries, providing a pillar of support throughout treatment and recovery.

Historically, men have been less likely to seek medical care, especially for sensitive topics related to sexual health. Societal stigmas and personal apprehensions often prevent men from initiating health discussions. GPTs offer a transformative solution to this challenge: by providing a discreet and immediate platform, they let men initiate discussions about their health concerns, symptoms, and potential next steps. A man seeking care for erectile dysfunction, for instance, can describe his specific symptoms to a GPT and receive an initial understanding of potential causes, treatments, and next steps.9 This tailored information can serve as a preliminary guide before an in-person consultation, helping to alleviate anxieties and misconceptions. While this method of accessing information equips individuals with valuable knowledge, we hope it will also act as a catalyst, motivating them to ultimately consult medical experts rather than avoid care altogether.
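As a rough illustration of this workflow, the minimal sketch below sends a symptom description to a general-purpose LLM API and asks for educational guidance. The vendor SDK (OpenAI’s Python client), the model name, and the prompt wording are illustrative assumptions on our part; this article does not evaluate or endorse any particular service.

```python
# Minimal sketch: querying a general-purpose LLM with a symptom description.
# The vendor, model name, and prompt wording are illustrative assumptions.
from openai import OpenAI

client = OpenAI()  # reads the API key from the OPENAI_API_KEY environment variable

symptoms = (
    "45-year-old male, gradual onset of erectile dysfunction over six months, "
    "no known cardiovascular disease."
)

response = client.chat.completions.create(
    model="gpt-4o",  # hypothetical model choice
    messages=[
        {
            "role": "system",
            "content": (
                "You provide general, educational men's health information. "
                "Always remind the user to confirm anything important with a physician."
            ),
        },
        {
            "role": "user",
            "content": f"My symptoms: {symptoms} What are possible causes and sensible next steps?",
        },
    ],
)

print(response.choices[0].message.content)
```

Output from a sketch like this is a starting point for an in-person conversation, not a diagnosis; as discussed below, it must always be corroborated with trusted sources.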

Risks and Limitations of GPT in Men’s Health

Accuracy, bias, and misleading information: GPTs, trained on vast datasets, may not always provide up-to-date or accurate advice. They can also reflect biases from their training data, leading to potentially skewed or culturally insensitive guidance. And while they can generate technically correct information, they may lack context, mislead, or misjudge the reliability and quality of the texts they draw on.

Data privacy and storage concerns: Engaging with GPTs often involves sharing personal health details. There’s a risk associated with how this data is stored, managed, and potentially accessed. While many platforms prioritize user privacy, the digital nature of these tools always carries a degree of risk.10

Regulatory, ethical, and overreliance issues: The integration of artificial intelligence (AI) in health care is still navigating regulatory waters, and clear guidelines might not be present in all regions. This can lead to ethical dilemmas, especially if there’s a conflict between AI-generated advice and established medical practices. Moreover, there’s a concern that individuals might overly rely on GPTs, sidelining the nuanced care that medical professionals provide.11

While GPTs offer a novel avenue for accessing men’s health information, it’s essential to use them judiciously, always corroborating their advice with trusted health care sources.

Our Recommendations

Leverage personalization: Utilize tools for tailored health information based on individual medical history and symptoms.

Enhance accessibility: Opt for platforms that present information in translated and varied formats, catering to diverse user needs.

Seek support: Use platforms for motivational messages and treatment-related queries.

Cross-reference information: Always corroborate GPT-generated advice with trusted medical sources or professionals.

Prioritize data privacy: Engage with platforms that emphasize user data privacy, understand their data storage and management policies, and share identifying details sparingly (see the sketch after this list).

Stay updated on regulations: Be aware of regional regulatory and ethical guidelines governing AI in health care.12

Avoid overreliance: Use GPTs as supplementary tools, not replacements for professional medical consultations.13
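To make the privacy recommendation above concrete, here is a minimal sketch, assuming Python and a few regular expressions, of scrubbing obvious identifiers from a query before it is sent to a third-party service. The patterns and the redact() helper are hypothetical and fall far short of a complete de-identification pipeline; names, for example, would require named-entity recognition.

```python
# Minimal sketch: redact obvious identifiers before a query leaves the device.
# The patterns below are illustrative, not a complete de-identification tool.
import re

PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "PHONE": re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b"),
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "DATE": re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"),
}

def redact(text: str) -> str:
    # Replace each match with a [LABEL] placeholder; names would still
    # slip through and need named-entity recognition to catch.
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

query = "DOB 03/14/1978, phone 305-555-0147, erectile dysfunction for six months."
print(redact(query))
# -> "DOB [DATE], phone [PHONE], erectile dysfunction for six months."
```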

Conclusion

GPT tools hold significant promise in revolutionizing men’s health information dissemination. Their potential to provide accurate, personalized information can address the traditional reluctance many men feel toward discussing health issues. However, as with all tools, careful and informed application is crucial.

  1. Golan R, Reddy R, Muthigi A, Ramasamy R. Artificial intelligence in academic writing: a paradigm-shifting technological advance. Nat Rev Urol. 2023;20(6):327-328.
  2. Golan R, Ripps SJ, Reddy R, et al. ChatGPT’s ability to assess quality and readability of online medical information: evidence from a cross-sectional study. Cureus. 2023;15(7):e42214.
  3. Ada Health. Ada. Accessed October 1, 2023. https://ada.com
  4. K Health. K Health. Accessed October 5, 2023. https://khealth.com
  5. Babylon Health. Babylon. Accessed October 2, 2023. https://www.babylonhealth.com
  6. HealthTap. HealthTap. Accessed October 4, 2023. https://www.healthtap.com
  7. Glass Health. AI in Healthcare. Accessed October 3, 2023. https://glass.health/ai/
  8. MedChat. MedChat. Accessed October 6, 2023. https://medchatapp.com/
  9. Cascella M, Montomoli J, Bellini V, Bignami E. Evaluating the feasibility of ChatGPT in healthcare: an analysis of multiple clinical and research scenarios. J Med Syst. 2023;47(1):33.
  10. Haupt CE, Marks M. AI-generated medical advice—GPT and beyond. JAMA. 2023;329(16):1349-1350.
  11. Shen Y, Heacock L, Elias J, et al. ChatGPT and other large language models are double-edged swords. Radiology. 2023;307(2):e230163.
  12. WHO calls for safe and ethical AI for health. World Health Organization. May 16, 2023. Accessed May 29, 2023. https://www.who.int/news/item/16-05-2023-who-calls-for-safe-and-ethical-ai-for-health
  13. Nazario-Johnson L, Zaki HA, Tung GA. Use of large language models to predict neuroimaging. J Am Coll Radiol. 2023;20(10):1004-1009.
