3,530 comments, 25,750 likes, 20,000 views and 5,100 followers: That's 300 euros.

This is what communication researchers from Stratcom (the NATO Strategic Communications Centre of Excellence) found: for little money, you can buy a great deal of engagement on the various social networks. The platforms do combat manipulation, but it is easier than you might think to plant manipulative content on Facebook, Twitter, Instagram or YouTube. You simply hire one of the relatively inexpensive providers that specialize in exactly this kind of manipulation.


More than 18,700 different accounts in use

According to Stratcom, hundreds of such services, almost all of Russian origin, are currently on offer. Some of these companies have a large pool of employees and, operating openly on the market, are easily accessible to anyone.

Stratcom commissioned 16 different service providers to investigate the extent to which Twitter, Facebook, Instagram and YouTube can be manipulated. The providers were to spread their services across 105 different posts on the networks. To fulfil the orders, they used more than 18,700 different accounts, according to Stratcom.

Does Facebook detect manipulative content?

Stratcom scheduled a test period of four months. It examined, first, how cheaply and easily manipulation can be bought and deployed; second, how long the purchased content remains on the platforms; and third, how the networks respond when deliberately planted fakes are reported.

The results: at the end of the test period, four out of five fakes were still online. Targeted reports of fake accounts also went unanswered; 95 percent of the reported accounts remained online.

Self-regulation inadequate: the networks have failed

The EU code of conduct against disinformation was therefore followed only inadequately. In Stratcom's view, self-regulation by the social networks has failed, and that applies to all of them.


The underlying problem remains that there is no clear definition of disinformation, not even in legal terms. In the absence of clear, legally binding guidelines, the EU settled on self-regulation as the preferred approach.

The code needs to be formulated more specifically

A revision of the code of conduct is therefore necessary. Stratcom suggests reviewing and specifying it so that a "guide" emerges against which all criteria can be objectively applied and checked.

The full study is available for download, along with details on each network.

Related to the topic: WhatsApp, Facebook and Instagram banned by the court. The judgment.

Source: t3n.de
Article image: Shutterstock / 13_Phunkod



Notes:
1) This content reflects the state of affairs at the time of publication. Individual images, screenshots, embeds or video sequences are reproduced for the purpose of discussing the topic. 2) Individual contributions were created with machine assistance and were carefully checked by the Mimikama editorial team before publication.