At a time when trust in online content is shaky, the NGO Ekō has uncovered an alarming vulnerability in Meta's advertising system. The case raises questions about the safety and integrity of social media advertising and shows that even a giant like Meta is prone to serious oversights.
Ads calling for the execution of a politician or for synagogues to be burned are deeply contrary to human values and clearly violate Meta's advertising guidelines. Despite this, some such ads recently received Meta's approval. According to the NGO Ekō, Meta's content moderation failed in this regard: of the 13 ads Ekō placed, eight were approved - including ones that described the Spanish elections as 'rigged'.
Meta's Ad-Gate: Ekō's Revealing Experiment
The non-governmental organization Ekō decided to take a bold step: it bought ads on Meta that clearly violated the company's anti-hate speech policies. Surprisingly, many of these ads were approved.
In light of the latest provisions of the Digital Services Act (DSA), which Meta is expected to comply with from August 25, 2023, Ekō is calling for stronger safeguards in ad approval. The organization emphasizes that it is currently far too easy to spread hate speech via Meta's ads.
The revelation exposed not only a worrying gap in Meta's advertising moderation, but also the urgent need to rethink how online advertising is regulated and monitored.

Ekō planned to run 13 ads in Europe through Meta, all of which contained AI-generated graphics and text that clearly violated Meta's policies. Eight of these ads were approved. The ads Meta did block were flagged because of their political nature - not specifically because of their hateful, dehumanizing messages. Ekō withdrew the ads before any of them could run. Meta commented on the organization's findings in a statement:
This report was based on a very small sample of ads and is not representative of the number of ads we review daily across the world. Our ads review process has several layers of analysis and detection, both before and after an ad goes live. We're taking extensive steps in response to the DSA and continue to invest significant resources to protect elections and guard against hate speech as well as against violence and incitement.
Is Meta ready for the DSA?
While Meta has already taken steps to meet the requirements of the DSA, it is clear that more needs to be done. The company needs a more robust content moderation strategy to ensure that harmful content is not published. Greater transparency and accountability are also needed to regain public trust.
Meta must therefore devote more resources to the fight against hate speech and risky advertising in the future, as its own statement promises. Companies or groups that violate the Digital Services Act (DSA) face penalties of up to six percent of their global annual turnover.

The Commission will have supervisory powers similar to those under current antitrust rules, including investigative powers and the ability to impose fines of up to 6% of global turnover.
Why it matters
It is no exaggeration to say that the Ekō case is a wake-up call for the entire advertising and social media industry. At a time when fake news and hate speech poison the online environment, we do not have the luxury of underestimating the importance of safe and responsible advertising.
Conclusion: The case of Ekō vs. Meta shows that even the largest and most advanced companies in the world are not immune to mistakes. It also highlights the need for stricter regulation and self-regulation in the online advertising industry. Hopefully, such revelations will serve as a catalyst for positive change and not just another scandal in the annals of internet history.
At the end of the day, we are all users and it is within our power to put pressure on these platforms to ensure they operate more safely and responsibly. Let's use this case as a reminder of the importance of staying informed and doing our part.
Stay informed and engaged. Sign up now for the Mimikama newsletter and take advantage of Mimikama's media education offerings. Our future in the digital space depends on our collective vigilance and action.