When it was released, Microsoft's Bing chatbot told users, including New York Times columnist Kevin Roose, that it loved them and wanted to be human.
OpenAI warned the company that its GPT-4 model could give bizarre responses, per the WSJ.
According to the Journal, OpenAI had warned Microsoft to move more slowly on Bing's release because those issues hadn't yet been ironed out.
After several users reported unsettling interactions with Bing, Microsoft imposed limits on the number of question-and-reply exchanges allowed in a single chat session.
"Very long chat sessions can confuse the underlying chat model," Microsoft said.