Scientists achieve almost 100 percent accuracy using artificial intelligence: a new method for deepfake detection.

Researchers at the Thapar Institute of Engineering and Technology and the Indraprastha Institute of Information Technology have developed a method that reliably detects deepfake videos. To do this, they use the same technology that was used to produce the counterfeits: artificial intelligence. The detection rate ranged from 98.21 to 99.62 percent, achieved after only a short training time.

Detection on two levels

The algorithm detects on two levels. In the first, the videos are slightly edited, for example zoomed. In the second, two techniques are applied. The first is a "convolutional neural network" (CNN), a machine-learning concept inspired by biological processes in the visual system.
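The article does not say exactly which edits the first level applies, so the following is only a minimal sketch of one plausible operation: a slight center-crop "zoom" on a video frame. The function name `center_zoom` and the zoom factor are assumptions for illustration, not the researchers' actual preprocessing.

```python
import numpy as np

def center_zoom(frame: np.ndarray, factor: float = 1.2) -> np.ndarray:
    """Crop the center of a frame, simulating a slight zoom (illustrative only)."""
    h, w = frame.shape[:2]
    ch, cw = int(h / factor), int(w / factor)   # cropped height and width
    top, left = (h - ch) // 2, (w - cw) // 2    # offsets so the crop is centered
    return frame[top:top + ch, left:left + cw]

# A dummy 120x160 RGB frame stands in for a real video frame.
frame = np.zeros((120, 160, 3), dtype=np.uint8)
zoomed = center_zoom(frame)
print(zoomed.shape)  # (100, 133, 3)
```

Such slight edits can expose deepfake artifacts, because manipulated regions often degrade differently from genuine footage when rescaled.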

The second technique is called "long short-term memory" (LSTM), an architecture used to train artificial neural networks on sequences. It carries earlier misjudgments forward and only settles once it arrives at the correct result: the decision as to whether a video is real or fake.
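The combination of the two techniques can be sketched roughly as follows: a CNN stage extracts spatial features from each frame, and an LSTM stage aggregates those features over time into a single real-vs-fake score. This is a toy NumPy sketch with random, untrained weights; the layer sizes, kernel, and scoring step are all assumptions, not the architecture from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv2d(img, kernel):
    """Valid 2-D convolution: a toy stand-in for a CNN feature extractor."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class LSTMCell:
    """Minimal LSTM cell with randomly initialized (untrained) parameters."""
    def __init__(self, n_in, n_hidden):
        self.W = rng.normal(0.0, 0.1, (4 * n_hidden, n_in + n_hidden))
        self.b = np.zeros(4 * n_hidden)
        self.n_hidden = n_hidden

    def step(self, x, h, c):
        z = self.W @ np.concatenate([x, h]) + self.b
        n = self.n_hidden
        i, f, o = sigmoid(z[:n]), sigmoid(z[n:2 * n]), sigmoid(z[2 * n:3 * n])
        g = np.tanh(z[3 * n:])
        c = f * c + i * g           # cell state: long-term memory
        h = o * np.tanh(c)          # hidden state: per-step output
        return h, c

# Toy "video": 8 grayscale frames of 16x16 pixels.
frames = rng.normal(size=(8, 16, 16))
kernel = rng.normal(size=(3, 3))
cell = LSTMCell(n_in=14 * 14, n_hidden=32)
h, c = np.zeros(32), np.zeros(32)
for f in frames:
    feat = conv2d(f, kernel).ravel()  # CNN stage: spatial features per frame
    h, c = cell.step(feat, h, c)      # LSTM stage: temporal aggregation
score = sigmoid(h.sum())              # real-vs-fake score in (0, 1)
print(0.0 < score < 1.0)
```

In a trained system the kernel and LSTM weights would be learned from labeled videos; here they only illustrate how the two stages hand data to each other.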


First, the Indian researchers had to train their algorithm. For this purpose, they compiled a dataset of 200 videos showing similar-looking pairs of politicians: 100 were genuine, the other 100 had been manipulated using deepfake technology. The developers labeled only some of them as real or fake.
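Such a partially labeled dataset can be sketched in a few lines. The article does not state how many of the 200 videos were labeled, so the 60/140 split and the file names below are purely illustrative.

```python
import random

random.seed(42)

# Hypothetical stand-in for the 200-video dataset: 100 real, 100 deepfaked.
dataset = [{"video": f"real_{i:03d}.mp4", "label": "real"} for i in range(100)]
dataset += [{"video": f"fake_{i:03d}.mp4", "label": "fake"} for i in range(100)]
random.shuffle(dataset)

# Only some clips keep their label, mimicking the partial labeling in the article;
# the rest are left unlabeled for the algorithm to judge on its own.
labeled = dataset[:60]
unlabeled = [{"video": d["video"]} for d in dataset[60:]]

print(len(labeled), len(unlabeled))  # 60 140
```

Training on a labeled subset and letting the model classify the rest is a common semi-supervised setup; whether the researchers proceeded exactly this way is not specified in the article.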

These labels helped the algorithm learn to make decisions on its own. The researchers then tested the new method on 181,608 examples of real and fake videos. Social media platforms may thus gain a reliable tool for identifying and deleting often dangerous fakes.

You might also be interested in: Deep Fake: Chancellor Sebastian Kurz as a crooner

Source: press text
Article image: Shutterstock / By metamorworks

Notes:
1) This content reflects the current state of affairs at the time of publication. The reproduction of individual images, screenshots, embeds or video sequences serves to discuss the topic.
2) Individual contributions were created through the use of machine assistance and were carefully checked by the Mimikama editorial team before publication.