“We have reviewed the page you reported for hate messages or symbols and determined that it does not violate our Community Standards.” Users often receive these and similar answers from Facebook after reporting a page. The result is often incomprehension, and people complain about the “bots” that supposedly decide what violates the standards and what doesn’t.

Chris Green, a reporter for the UK newspaper “The Independent”, was the first to be allowed into Facebook’s European headquarters in Dublin and look over the shoulder of the so-called “Community Operations Team”.


(Screenshot: The Independent)

Dublin is Facebook’s most important location outside its California headquarters. Reports not only from Europe, but also from the Middle East, Africa and much of Latin America are coordinated here and assigned to the individual teams. Sonia Flynn, the managing director of Facebook Ireland, says that while the majority of reports are harmless in nature, there are also cases that call for quick and decisive action.

To cover all languages, Dublin alone employs people from 44 nations who not only speak the respective languages fluently but also have to be familiar with the respective national cultures. Facebook attaches great importance to this, says Sonia Flynn.

Ciara Lyden, Facebook’s content regulations manager, dispels a myth that is often held against Facebook: namely, that so-called bots decide whether a post or page was legitimately reported. “Often after I help someone on Facebook, the response I get is ‘Thanks, even though you might just be a bot’. However, I can assure you that every single request is handled by real people.”

Many users have the impression that they are dealing with bots, since there is hardly any personal contact between Facebook and its users, an objection Lyden understands. However, she says, with 1.39 billion Facebook users who are online around the clock and report millions of posts per day, there simply isn’t enough time for direct contact with each person.

To reduce the number of reports, Facebook now relies more on interaction between users. Previously, you could simply report something, and staff would review the report and decide whether or not it violated the Community Standards. Now Facebook also offers the option of contacting the person whose post you want to report directly, so the matter can be sorted out between the two of you.

“In most cases, this system works,” says Flynn. “In real life I also approach a person and tell them what bothers me. We try to make this natural behavior possible for our users too.”

Facebook has a total of four headquarters worldwide: in California and Texas (USA), Dublin (Ireland) and Hyderabad (India), which between them cover all time zones. This means there is never a “night shift” in which fewer employees than usual handle reports, something Facebook considers important, since some reports are particularly urgent.

When a user clicks “Report”, the report is routed to the relevant team depending on its severity and urgency. Julie de Bailliencourt, the safety manager for Europe, Africa and the Middle East, says: “Reports of announced murders or suicides, all those postings that are particularly urgent, are no longer passed through the teams but are forwarded directly to the police and security authorities. We cannot risk someone carrying out their supposed plan because of our hesitation, and through this procedure we have already been able to save a number of lives.”

But there are also some points of criticism. For example, images of breastfeeding mothers have been deleted, something de Bailliencourt describes as “human error.” The multicultural team often has heated discussions about whether something should be deleted or not. “Sometimes it’s like a UN meeting here,” she says. The team knows that every decision, whether a post is deleted or not, will draw criticism from one side or the other, so each case has to be considered carefully.

Here are some examples:

Pictures of breastfeeding mothers

In 2011, a group that promoted breastfeeding was deleted. A little later the page was restored; Facebook apologized for the error and emphasized that it wants to support every person’s right to free expression.

Incitement to violence

In 2011, a page called “Third Palestinian Intifada” was also removed from Facebook. It called on Palestinians to protest peacefully against Israel. The page had been reported many times, but it was only removed when the Israeli government contacted Facebook directly, as by then it had turned into a hodgepodge of hate messages and calls for the murder of Jews.

Freedom of speech

In 2012, a fan page was created for James Holmes, the man who shot and killed 12 people at the premiere of a Batman film in Colorado. Facebook did not delete this group because, although it was incredibly distasteful, it did not violate the rules and fell within freedom of speech and expression.

Suicide prevention

In 2013, New York police officers prevented the suicide of a teenager who wanted to jump off a bridge and had announced this on Facebook shortly beforehand. The police responded to his status update and distributed photos of him to patrol officers until the teenager was found and taken to a hospital.

“It is also a myth that reporting a page or post more than once helps,” says de Bailliencourt. A report is never handled by just one individual, but always by several members of a team. Decisions are sometimes reversed, but it is never the number of reports that counts.

There will always be controversy over posts and pages on Facebook. What we Germans find deeply tasteless, such as Hitler fan pages, only makes Americans smile. Many people may not like this, but that is the freedom of speech and expression Facebook has stood for from the beginning. At the same time, Facebook counts on the maturity of its users, on interaction between users, on the exchange of ideas and opinions. Because that is how Facebook became what it is today: a platform that often sparks controversy among its users, but that has also made the world a little smaller.


Author: Ralf, mimikama.org

Source:
http://www.independent.co.uk/life-style/gadgets-and-tech/features/what-happens-when-you-report-abuse-the-secretive-facebook-censors-who-decide-what-is--and-what-isnt-abuse-10045437.html?origin=internalSearch
