The growing trade in stolen ChatGPT Premium accounts lets cybercriminals circumvent OpenAI’s geofencing restrictions and gain unrestricted access to ChatGPT, according to Check Point Research (CPR).
One of the most thriving markets in the hacker underworld and on the dark web is the trade in account takeovers (ATOs): stolen accounts for various online services.
Historically, this industry concentrated on stolen emails, social media, online dating sites, and financial services accounts (banks, online payment systems, etc.).
OpenAI imposes geofencing restrictions on ChatGPT, blocking access to the platform from certain countries, including Iran, China, and Russia.
“ChatGPT accounts store the recent queries of the account’s owner. So when cybercriminals steal existing accounts, they gain access to the queries from the account’s original owner. This can include personal information, details about corporate products and processes, and more,” says Check Point.
Trade in Stolen ChatGPT Accounts
Cybercriminals frequently exploit the fact that users reuse the same password across multiple platforms.
Armed with leaked credentials, attackers load sets of email-and-password combinations into specialized software (known as an account checker) and run them against a target online platform to find combinations that successfully log in.
The result is an account takeover: a malicious actor seizes control of the account without the account holder’s consent.
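The workflow described above can be sketched in a few lines. The snippet below is a simplified, hypothetical illustration of how an account checker replays leaked combos; `try_login` is a stand-in stub that simulates a target platform’s login check, not a real API, and all credentials shown are invented.

```python
# Illustrative sketch of credential-stuffing logic (hypothetical, simplified).
# A real account checker would send HTTP login requests; here try_login is a
# stub simulating the target platform so the flow can be shown safely.

# Credentials the simulated platform actually accepts (invented stand-in data).
VALID_ACCOUNTS = {"alice@example.com": "hunter2", "bob@example.com": "s3cret"}

def try_login(email: str, password: str) -> bool:
    """Stub for a single login attempt against the simulated platform."""
    return VALID_ACCOUNTS.get(email) == password

def run_checker(combo_list: list[tuple[str, str]]) -> list[str]:
    """Replay leaked email/password combos; return emails that matched."""
    hits = []
    for email, password in combo_list:
        if try_login(email, password):  # password was reused on this platform
            hits.append(email)
    return hits

# Leaked combos from some other breach (invented stand-in data).
combos = [
    ("alice@example.com", "hunter2"),   # reused password -> hit
    ("bob@example.com", "wrongpass"),   # not reused -> miss
]
print(run_checker(combos))  # -> ['alice@example.com']
```

The attack succeeds only where a victim reused a breached password, which is why password reuse is the enabling condition the researchers highlight.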
Researchers say those accounts are sold; however, some actors also distribute stolen ChatGPT Premium accounts for free to promote their own services or the tools used to steal them.
SilverBullet is a web testing suite that lets users send requests to a target web application. It can be used for various tasks, including data scraping and parsing, automated penetration testing, and unit testing with Selenium.
Cybercriminals also frequently use this tool to conduct credential stuffing and account-checking attacks against different websites and thus steal accounts for online platforms.
“As SilverBullet is a configurable suite, making a checking or brute-forcing attack against a certain website requires a ‘configuration’ file that adjusts this process for a specific website and allows cybercriminals to steal accounts from that website in an automated way,” says Check Point Research.
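To make the “configuration” idea concrete, a per-site config can be thought of as a small data structure telling a generic checking tool how to log in to one particular site and how to recognize success. The sketch below is a hypothetical simplification for illustration only; it is not SilverBullet’s actual config format, and the URL and field names are invented.

```python
# Hypothetical sketch of a per-site "config" for a generic account checker.
# NOT SilverBullet's real format; it only illustrates how one generic tool
# is adapted to a specific website via configuration.

SITE_CONFIG = {
    "login_url": "https://target.example/login",  # invented endpoint
    "user_field": "email",
    "pass_field": "password",
    "success_marker": "dashboard",  # substring seen after a good login
    "failure_marker": "invalid",    # substring seen after a bad login
}

def classify_response(config: dict, response_body: str) -> str:
    """Decide what a simulated login response means for this site's config."""
    body = response_body.lower()
    if config["success_marker"] in body:
        return "success"
    if config["failure_marker"] in body:
        return "failure"
    return "unknown"  # e.g. CAPTCHA page or rate limiting

print(classify_response(SITE_CONFIG, "Welcome to your dashboard"))  # success
print(classify_response(SITE_CONFIG, "Invalid credentials"))        # failure
```

Because only the config changes between targets, one tool can be pointed at many different websites, which is what makes traded config files valuable in these communities.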
Another cybercriminal, calling himself “gpt4”, focuses solely on fraud and abuse against ChatGPT products. He advertises ChatGPT accounts for sale in his threads, along with a setup for another automated tool that verifies credentials.
An English-speaking online criminal began promoting a ChatGPT Plus lifetime account service on March 20th, guaranteeing customers’ total satisfaction.
“The lifetime upgrade of a regular ChatGPT Plus account (opened via an email provided by the buyer) costs $59.99, while OpenAI’s original legitimate pricing for this service is $20 per month. However, to reduce the costs, this underground service also offers the option to share access to a ChatGPT account with another cybercriminal for $24.99, for a lifetime,” the researchers explain.
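The price gap the researchers describe is easy to quantify: at $20 per month for a legitimate subscription, the $59.99 “lifetime” price undercuts legitimate spending after just three months. A quick check of the quoted figures:

```python
import math

# Prices quoted in the researchers' report.
LEGIT_MONTHLY = 20.00          # OpenAI's legitimate ChatGPT Plus price
UNDERGROUND_LIFETIME = 59.99   # advertised stolen "lifetime" upgrade
SHARED_LIFETIME = 24.99        # shared-access option

# Months of legitimate subscription after which the "lifetime" price is cheaper.
months_to_break_even = math.ceil(UNDERGROUND_LIFETIME / LEGIT_MONTHLY)
print(months_to_break_even)  # -> 3
```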