
The nonchalance of developers toward the provocative output of beta versions of next-generation chatbots is horrifying. Unsurprisingly, they say, "This is a good thing because it gives us a chance to tweak the emotional aspects of the platform." These errors are known in the coding business as "hallucinations": essentially, things the chatbot makes up based on complicated digital triggers.
Unfortunately, the bots might now be smarter than those who coded them. Or, in many cases, they may simply be better at dispensing some really dumb information.
In a recent test, a bot confused me with golf champion Ben Crenshaw when I asked, "Tell me about Frank Cutitta." It created a wonderful thumbnail bio detailing my $6 million in winnings from Masters and U.S. Open victories!
More recently, the same dynamic played out when Tesla was forced to issue a "remote recall" of its "full self-driving" algorithm due to programming flaws. As with the chatbots, the automotive AI developers will tell you that a car failing to recognize a person of color crossing the street is simply part of improving the product. Insurance companies just love that!
So the response from the developers is that "we'll just fix it by re-coding." Or, even more shocking, the chatbot companies will simply block users from going down the rabbit holes that lead to the bot saying, "Your wife doesn't love you, but I do," or, "I want to be human and not part of BING any longer." So the cure is to emotionally dumb down the product until it becomes Google on low-level steroids.
While they desperately want these platforms to be sentient, they don't want them to have the empathic characteristics of real humans.
Those of us who have worked with the programming community know they are a very special breed. They think differently…in a good way. But we are entering a new world of computer science that I feel should be called "algorithmic psychiatry".
A "Bot-Shrink" of sorts. Or more likely, enter the age of algorithmic lobotomies.
In this setting, the patient is the code and the algorithmic output. I would strongly argue that the de-programming (literally) will be far more difficult than writing the source code was, since the system now deals with exponential interactions with the outside world and even more complicated interactions with personalities and "hallucinations" within the platform itself. This goes well beyond the planned boundaries of the GPT-3 language model.
In other words, Sydney, the bot in the recent news stories that wanted a user to leave his wife, is having "relationships" with other non-human algorithmic personas. While programmers might argue otherwise, no one knows when this is happening because it occurs in a digital world that is invisible to us. From what we've seen this week, Sydney may have been jilted by "Chip," and that algorithmic emotion might affect the advice it gives me about a relationship.
Or perhaps Sydney will simply eat a digital pint of Ben & Jerry's in a digital bed and get over it?
In all seriousness, with the crisis in mental health around the world, much closer attention will need to be given to compassionate and empathic technologies that can literally dispense advice to emotionally fragile people.
Someone pointed out that no healthcare provider will deploy these next-generation bot technologies until they are fully baked. That's true, but there is one problem.
My interactions with ChatGPT, and Google for that matter, occur entirely outside a formal healthcare enterprise. I've written in the past about the "perils of Dr. Google and the growth of cyberchondria." A recent study by eligibility.com found that 89 percent of patients nationwide will Google their health symptoms before going to their doctor!
Given that stat, and in a world where health illiteracy in a digital setting is at pandemic levels, can we really expect that these new chat technologies won't exacerbate the problem, even when the bot isn't hitting on your teenage son or daughter?
This will not go away soon, as we are clearly at what Gartner calls the "Peak of Inflated Expectations" (or, in this case, infatuated expectations) with these new algorithms, with only passing moments of the Trough of Disillusionment.
In this context, the word "infatuated" makes me even more nervous.
For more insights on Leadership, Patient Experience, Hospital@Home, Burnout, and Equity, log into ICD Healthcare Network.