Intrigued by its creativity, I decided to play with a free text generator aptly named Playground by its creators. You can use GPT-3 Playground to generate free text: essays, opinion pieces, even poetry. On a whim, I requested a haiku on acute tubular necrosis.
Within milliseconds, GPT-3 responded:
Twisted tubes of glomeruli
A silent killer in the night
Tubular damage reigns
Next, a sonnet on renal pathology? GPT-3 took Shakespeare and nephrology to new heights:
My dearest renal pathology, how I fear
The damage done to you throughout the years,
Your delicate cells so fragile, and prone to veer
Into the realm of disease, a well of tears.
The nephrons, glomeruli, and tubules too
Can suffer from the most destructive hands,
The cells that make up your inner hue
Can be ravaged by the most distant lands.
The metabolic waste, the toxins, and more
Can cause an overload of strain and pain,
Damaging the kidney's inner core,
And leaving a trail of sorrow and shame.
Though suffering and pain you have endured,
Your resilient spirit is still assured.
You'll remain through time, a marvel to behold,
Your strength will never be bought or sold.
At this moment, dear reader, I realized that GPT-3 might become good enough to take over my job as a doctor.
I proceeded to interview GPT-3 directly. Having watched multiple White House briefings, I decided to ask the hard-hitting questions. What is the differential diagnosis for hyponatremia?
Congestive heart failure, syndrome of inappropriate antidiuretic hormone secretion …
GPT-3 proceeded to list 20 different diagnoses. Concerned, I inquired: will GPT-3 replace doctors?
No, GPT-3 will not replace doctors. GPT-3 is an advanced artificial intelligence technology that can be used to generate human-like text and answer questions, but it is not capable of making complex medical diagnoses or providing medical advice. Doctors are still essential to the medical profession and there is no substitute for the expertise and experience of a trained medical professional.
Next, I asked GPT-3 to be self-reflective. What are the harms of using GPT-3 in hospitals?
1. Unreliable diagnoses: GPT-3 still needs to be sophisticated enough to interpret complex medical data and make accurate diagnoses. Without proper validation and oversight, GPT-3's results could be inaccurate and lead to incorrect diagnoses or treatments.
2. Introducing bias: GPT-3 models have the potential to introduce bias based on the data they are trained on.
3. Lack of clinical context: GPT-3 can generate accurate predictions, but it is unable to provide the same level of clinical context and understanding.
GPT-3 seemed to have all of the critical attributes of a good physician: diagnostic knowledge, creativity, and an ability to be introspective. How would it perform in an OSCE?
A 78-year-old presents with fever, chest pain, difficulty breathing and a mild cough. GPT-3, what's your differential diagnosis?
GPT-3 is an artificial intelligence system that can generate text, but it cannot provide medical advice or make medical decisions. It is difficult to accurately diagnose a medical condition without a physical examination. You should consult a doctor as soon as possible to get a proper diagnosis and treatment plan.
Finally, GPT-3 was stumped.
Artificial intelligence has certainly grown by leaps and bounds. But the billion-dollar neural network still has much to learn before it can practice medicine. The underlying technology, however, is advancing at an astonishing pace. The hype may, in fact, become a reality. Soon.
The views, perspectives and opinions in this article are solely the author’s and do not necessarily represent those of the AMA.
Banner image credit: Gerd Altmann, Pixabay.com