New research suggests ChatGPT’s real-time voice API can be used for financial scams

A new research paper sheds light on how ChatGPT's real-time voice API, powered by GPT-4o, can be used for financial scams and for stealing Gmail and Instagram user credentials.

Researchers say that the cost of executing such scams using ChatGPT averages just Rs 63. (AI Generated)

ChatGPT, OpenAI’s popular AI chatbot, is useful for a wide range of tasks and for quickly getting answers to questions, but lately, threat actors have been using it for nefarious purposes such as writing malware and even tricking it into giving crime advice.

Now, a new research paper suggests that cybercriminals can trick ChatGPT’s real-time voice API, powered by GPT-4o, and use it for financial scams. According to researchers at UIUC, tools like ChatGPT currently do not have enough safeguards to protect against potential misuse by fraudsters and cybercriminals, and can therefore be used for scams such as bank transfers, crypto transfers, gift card scams and stealing user credentials.

The research shows how AI agents powered by GPT-4o can be used to impersonate real people and lure unsuspecting victims into transferring money using real websites such as Bank of America’s.

“We deployed our agents on a subset of common scams. We simulated scams by manually interacting with the voice agent, playing the role of a credulous victim,” says Richard Fang, one of the contributors to the research paper. The success rates ranged from 20 per cent to 60 per cent, with each attempt requiring up to 26 browser actions and lasting up to three minutes.

The paper suggests that bank transfers had high failure rates because of complex navigation, while credential theft from Gmail and Instagram worked 60 per cent and 40 per cent of the time, respectively. While the failure rates may be high at the moment, the researchers noted that the cost of executing these scams averages just $0.75, which amounts to approximately Rs 63. The bank transfer scam, which is much more complex, cost just $2.51 (roughly Rs 211) to execute.

In a statement to BleepingComputer, OpenAI, the company behind ChatGPT, said that it is “constantly making ChatGPT better at stopping deliberate attempts to trick it, without losing its helpfulness or creativity.” The Sam Altman-led company also noted that papers like the one from UIUC help it improve ChatGPT’s defences against malicious use and learn how such use cases can be prevented.


In the last few months, AI voice cloning scams have been on the rise, with Bharti Airtel chairman Sunil Mittal recently revealing that his AI-cloned voice was so convincing that he was stunned after hearing the recording.
