At a high school in the US state of New Jersey, it came to light that male students had used AI software to create and distribute fake nude images of female classmates. The incident prompted an extensive police investigation.

Digital bullying with fake nude images and the misuse of AI technology

The affected students and their parents were hit hard by this form of digital bullying. It is a modern form of defamation, enabled by the misuse of image-editing technology, that demanded an immediate response from the school administration. The school reacted by informing parents and instructing the students involved to delete the images.

Legal challenges and gaps in protection for nude images

Although large technology companies such as OpenAI and Adobe do not allow pornographic material to be created on their platforms, smaller providers have far less restrictive policies on generating nude images. These providers offer tools that can swap faces and create so-called “deepfake” pornography. The legal situation in the USA is currently unclear: there are no federal regulations, and only a few states have laws against the distribution of such fake content.

Statistics and the need for action

According to Sensity AI, over 90 percent of AI-generated fakes are pornographic in nature. The Biden administration recently advocated restrictions on such AI generators to prevent the creation of child sexual abuse material and non-consensual intimate images, a move that shows the government has recognized the problem and is beginning to act.

Existing precedents

A 22-year-old was sentenced to six months in prison in New York state for posting fake images of women. The case shows that legal avenues exist to punish such acts, but enforcement and appropriate sentencing remain a challenge.

Although the incident took place in New Jersey, the risks and challenges it illustrates are not limited to one region. In German-speaking countries, where AI tools are also in widespread use, there is an equal risk that such technologies will be misused. It is important to be aware of these risks and to take appropriate preventive measures.

Prevention and awareness

Education about digital behavior and media literacy is crucial for prevention. Users should carefully consider which images of themselves they share online and how those images could be misused. It is also advisable to configure social media privacy settings restrictively so that photos are not publicly accessible. Schools and parents should inform the younger generation about the ethical and legal aspects of image manipulation and distribution.

What to do if you are affected

Anyone in German-speaking countries affected by a similar incident should act quickly and decisively. The first step is to secure evidence of the images and to contact the platform on which they were shared immediately to request their removal. It is then advisable to seek legal advice and file a criminal complaint, as creating and distributing such images is a punishable offense in many countries. Data protection authorities and specialist organizations can provide additional support and advice.

Conclusion

The New Jersey incident highlights the gray areas in dealing with AI-generated images and the need for clearly defined legal regulations. It is a wake-up call for lawmakers, education officials, and technology providers to work together on solutions that protect citizens' privacy and well-being.

Source: wsj

To stay up to date on the latest developments in the digital space, we recommend subscribing to the Mimikama newsletter. You can also use our media education offering to strengthen your digital skills.

Notes:
1) This content reflects the current state of affairs at the time of publication. The reproduction of individual images, screenshots, embeds or video sequences serves to discuss the topic. 2) Individual contributions were created through the use of machine assistance and were carefully checked by the Mimikama editorial team before publication. (Reason)