Data is a valuable currency. Unfortunately, where there is something to be had, criminals often lurk. Recently, more than 100,000 stolen credentials for OpenAI's chatbot ChatGPT were offered for sale on dark-web marketplaces. These marketplaces are part of the so-called darknet: areas of the Internet that are hidden from normal search engines and can only be reached with special browsers. 12,632 of the stolen credentials come from India alone.

What exactly happened?

So-called info stealers are behind the attacks. This type of malware can extract passwords, cookies, credit card details, and other sensitive information directly from browsers.

Cybercriminals have gained unauthorized access to ChatGPT accounts. ChatGPT is an advanced chatbot that communicates in natural language using artificial intelligence and machine learning. The criminals used info-stealer malware to harvest login credentials and other sensitive data, which they then offered for sale on the dark web.

The cybercriminals' success suggests that many users neither use one-time passwords nor have two-factor authentication enabled. The anonymity of the dark web also makes it difficult for law enforcement to stop such activity.

What are the risks?

The biggest danger is that the stolen accounts' chat histories may contain further personal information. If you have ever entered personal details, passwords, or private code fragments into ChatGPT, that information could now be in the hands of cybercriminals.

The possible consequences: What could criminals do with this data?

There are several worrying ways this data could be misused:

  1. Identity theft: Criminals could use your personal information to impersonate you, open credit cards in your name, or commit other fraudulent acts.
  2. Access to other accounts: If you have entered passwords or credentials for other online accounts into ChatGPT, criminals could use this information to gain access to these accounts.
  3. Corporate sabotage: If you have entered code fragments or credentials relevant to your company into ChatGPT, criminals could use this information to harm your company.

Conclusion: It is better to be safe than sorry

This case shows how important it is to handle our data carefully. Entering sensitive data into a chatbot or online platform is always a bad idea. Even if you think your data is safe, there is always a risk that it could fall into the wrong hands. So be careful and always remember: your data is valuable and worth protecting.
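One practical precaution that follows from this advice: check whether a password has already surfaced in a known breach before relying on it. The Have I Been Pwned "Pwned Passwords" range API supports this with k-anonymity, meaning only the first five characters of the password's SHA-1 hash ever leave your machine. A minimal sketch (the helper names are our own, not part of the API):

```python
import hashlib
import urllib.request

def sha1_range_parts(password: str) -> tuple[str, str]:
    """Split the uppercase SHA-1 hex digest into the 5-character
    prefix sent to the API and the suffix compared only locally."""
    digest = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    return digest[:5], digest[5:]

def breach_count(password: str) -> int:
    """Return how often the password appears in known breaches
    (0 if never seen), via the k-anonymity range endpoint."""
    prefix, suffix = sha1_range_parts(password)
    url = f"https://api.pwnedpasswords.com/range/{prefix}"
    with urllib.request.urlopen(url) as resp:
        body = resp.read().decode("utf-8")
    # Each response line is "<hash suffix>:<count>"; match ours locally.
    for line in body.splitlines():
        candidate, _, count = line.partition(":")
        if candidate == suffix:
            return int(count)
    return 0
```

Because the server only ever sees a five-character hash prefix shared by hundreds of passwords, it cannot tell which one you actually checked.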

Source: The Hacker News


If you enjoyed this post and value the importance of well-founded information, become part of the exclusive Mimikama Club! Support our work and help us promote awareness and combat misinformation. As a club member you receive:

📬 Special Weekly Newsletter: Get exclusive content straight to your inbox.
🎥 Exclusive video* “Fact Checker Basic Course”: Learn from Andre Wolf how to recognize and combat misinformation.
📅 Early access to in-depth articles and fact checks: always be one step ahead.
📄 Bonus articles, just for you: Discover content you won't find anywhere else.
📝 Participation in webinars and workshops: Join us live or watch the recordings.
✔️ Quality exchange: Discuss safely in our comment function without trolls and bots.

Join us and become part of a community that stands for truth and clarity. Together we can make the world a little better!

* In this special course, Andre Wolf teaches you how to recognize and effectively combat misinformation. After completing the video, you have the opportunity to join our research team and actively contribute to public education, an opportunity reserved exclusively for our club members!


Notes:
1) This content reflects the state of affairs at the time of publication. The reproduction of individual images, screenshots, embeds, or video sequences serves to discuss the topic.
2) Individual contributions were created with machine assistance and were carefully checked by the Mimikama editorial team before publication. (Reason)