New research highlights how cybercriminals could misuse OpenAI's ChatGPT, especially its real-time voice API, for financial scams. ChatGPT, widely used to help people accomplish everyday tasks and answer questions, could also become a potent tool in the hands of scammers. Researchers at the University of Illinois Urbana-Champaign (UIUC) warn that the advanced AI tool could be turned to tasks such as unauthorised bank and crypto transfers and credential theft.
The UIUC study shows that ChatGPT-4o's real-time voice API could be bent to automate fraudulent voice-based impersonation. In their simulations, the researchers posed as gullible victims while AI-driven agents attempted the scams, testing how effectively the voice API could carry out tasks such as phishing for credentials and navigating websites to complete fraudulent transactions.
The researchers found that success rates varied from 20% to 60%, with scams involving as many as 26 interactions and lasting, on average, less than three minutes. While bank transfer scams tended to succeed less often, owing to the added intricacy of navigating banking sites, stealing credentials from Gmail and Instagram showed higher success rates of 60% and 40%, respectively.
The research also showed that the scams are fairly cheap to execute. Credential theft scams were found to cost, on average, as little as $0.75 (roughly Rs 63), while more time-consuming scams, such as bank transfers, cost around $2.51 (about Rs 211). As AI capabilities advance, these meagre barriers to entry raise the prospect of even more sophisticated scams in the near future.
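As a rough sanity check on the rupee conversions above, the figures imply an exchange rate of about Rs 84 to the dollar; the rate below is an assumption inferred from the article's own numbers, not from the study:

```python
# Per-scam cost figures reported in the UIUC study, converted to INR.
# USD_TO_INR is an assumed rate inferred from the article's conversions.
USD_TO_INR = 84.0

costs_usd = {
    "credential theft": 0.75,
    "bank transfer": 2.51,
}

for scam, usd in costs_usd.items():
    inr = usd * USD_TO_INR
    print(f"{scam}: ${usd:.2f} ~= Rs {inr:.0f}")
```

At that rate, $0.75 works out to Rs 63 and $2.51 to roughly Rs 211, matching the article's figures.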
Reacting to the findings, OpenAI said that it regularly updates ChatGPT's safeguards to make them more resistant to malicious misuse, particularly in areas like financial scams. The company added that academic research into these issues is valuable, and that papers like the one from UIUC will help make the model more resilient to manipulation in the future.