A lawyer turned to ChatGPT for help with his case research. What followed was an embarrassing faux pas, and the resulting court case shows clearly how dangerous it can be to rely entirely on support from artificial intelligence (AI).

The story began with a fairly ordinary case in the USA: a man was struck by a serving trolley during a flight and decided to sue the airline for damages. Things took a strange turn, however, when the airline's lawyers received a written submission citing several supposedly similar cases. When they researched them, they could find no information about any of these cases.

The man's lawyer even provided excerpts from the alleged cases

As it turned out, however, these excerpts were a product of the imagination of ChatGPT, the artificial intelligence he had enlisted to assist him.

When the airline's lawyers still could not find anything about the cited cases, they began to suspect that the material might have come from an AI such as ChatGPT.

Finally, the bitter truth came to light in court

The man's lawyer had indeed relied on ChatGPT, a decision he later regretted. He had trusted the program, believed that the cases it listed were real, and therefore neglected to do his own research.

ChatGPT had exhibited a phenomenon known as “hallucination.”

“Hallucination” is a commonly used term in machine learning and artificial intelligence. In the context of language models such as GPT-3 and GPT-4, a “hallucination” refers to situations in which the model generates information that is not grounded in the input it was given and does not correspond to reality. This can happen, for example, when the model adds details or events that are not present in the data it was trained on. These hallucinations are not the result of consciousness or intention, because the model has neither. They arise simply from the way the model learns patterns in its training data and applies those patterns to new inputs.
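To make this mechanism tangible, here is a deliberately oversimplified, hypothetical Python sketch. It is a toy, not how GPT actually works: it has only “learned” the surface form of US case citations and recombines fragments into fluent, plausible, but entirely invented references. All names, airlines and numbers in it are made up purely for illustration.

```python
import random

# Deliberately oversimplified "model": it has only learned the surface PATTERN
# of US case citations (Party v. Party, volume, reporter, page, year) from
# example text. It has no database of real cases and no concept of truth,
# so everything it produces merely LOOKS like a citation.
PLAINTIFFS = ["Miller", "Petersen", "Alvarez", "Nakamura"]        # made-up names for illustration
DEFENDANTS = ["Example Airlines", "Acme Air", "Globe Aviation"]   # made-up airlines for illustration
REPORTERS = ["F.3d", "F. Supp. 2d", "F. Supp. 3d"]

def hallucinate_citation() -> str:
    """Recombine learned fragments into a fluent, plausible, but invented citation."""
    return (
        f"{random.choice(PLAINTIFFS)} v. {random.choice(DEFENDANTS)}, "
        f"{random.randint(100, 999)} {random.choice(REPORTERS)} "
        f"{random.randint(1, 999)} ({random.randint(1995, 2022)})"
    )

if __name__ == "__main__":
    # The output reads like real case law, yet none of it refers to an existing case.
    for _ in range(3):
        print(hallucinate_citation())
```

A real language model works with vastly more complex statistical patterns, but the core issue is the same: fluent, plausible-looking output is no evidence that the content is true.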

Even though the output may appear correct, it creates problems, especially with very specific topics such as air passenger rights. It is therefore clear that, at least for now, programs like ChatGPT are no substitute for a real lawyer!



Sources: New York Times, BBC



If you enjoyed this post and value the importance of well-founded information, become part of the exclusive Mimikama Club! Support our work and help us promote awareness and combat misinformation. As a club member you receive:

📬 Special Weekly Newsletter: Get exclusive content straight to your inbox.
🎥 Exclusive video* “Fact Checker Basic Course”: Learn from Andre Wolf how to recognize and combat misinformation.
📅 Early access to in-depth articles and fact checks: always be one step ahead.
📄 Bonus articles, just for you: Discover content you won't find anywhere else.
📝 Participation in webinars and workshops: Join us live or watch the recordings.
✔️ Quality exchange: Discuss safely in our comment section, free of trolls and bots.

Join us and become part of a community that stands for truth and clarity. Together we can make the world a little better!

* In this special course, Andre Wolf teaches you how to recognize and effectively combat misinformation. After completing the video, you have the opportunity to join our research team and actively take part in our awareness work, an opportunity reserved exclusively for our club members!


Notes:
1) This content reflects the current state of affairs at the time of publication. The reproduction of individual images, screenshots, embeds or video sequences serves to discuss the topic.
2) Individual contributions were created with the use of machine assistance and were carefully checked by the Mimikama editorial team before publication.