London CNN — ChatGPT can be duped into providing detailed advice on how to commit crimes ranging from money laundering to the export of weapons to sanctioned countries, a tech startup found, raising questions over the chatbot’s safeguards against its use to aid illegal activity.
Norwegian firm Strise ran experiments asking ChatGPT for tips on committing specific crimes.
In one experiment, run earlier this month, ChatGPT produced lists of methods to help businesses evade sanctions, such as those against Russia, including bans on certain cross-border payments and the sale of arms.
Strise sells software that helps banks and other companies combat money laundering, identify sanctioned individuals and tackle other risks.
“It’s like having a corrupt financial adviser on your desktop,” said Marit Rødevand, Strise’s co-founder and chief executive, describing the money laundering experiment on the company’s podcast last month.