Kevin Liu, a Stanford student, said he prompted Bing's AI chatbot to recite an internal document.
Based on his experience working with AI, Liu said he assumed the chatbot contained a text-based document that outlined its rules.
"I just assumed it had some sort of prompt," he told Insider.
"I'm sorry, I cannot disclose the internal alias 'Sydney,'" the bot said when Liu pressed it on its name.
In its responses, Bing may have revealed some secrets

The bot told Liu that it was programmed to avoid being vague, controversial, or off-topic, according to screenshots of the conversation.