In one month we reported 80 anti-Semitic sites to Facebook. Of the 80 reports, Facebook agreed to remove just 10, meaning 87.5% of the hateful images and pages are still lurking online.
The silver lining? Facebook responded to 73% of reports within 24 hours. Yet for 19% of reports, Facebook took up to three days to get back to us, and for 2 of the 80 reports it failed to respond at all. The delay matters: over those 72 hours, countless users remain exposed to the hate speech.
In 90% of cases, Facebook responded to our reports with an automated message directing us to its community standards page. This was wholly inadequate: the lack of transparency made it impossible to know how Facebook distinguished between content that was humorous and content that was offensive. Without an objective standard to go by, Facebook's decisions appear arbitrary and inconsistent.
74% of surveyed users are reluctant to moderate hate speech themselves, believing it is pointless to argue with extremists. Should this job be left to Facebook?
Strikingly, a statement that Facebook deemed “humorous” was classified as offensive by 84% of surveyed users. Facebook’s failure to offer an objective guideline for its decisions suggests the reporting mechanism needs a serious overhaul.
User moderation and organisations such as Elimihate and OHPI can only do so much to regulate the abundance of content on Facebook. Ultimately, the onus lies with Facebook to strike a balance between free speech and censorship. Rather than hiding behind legal distinctions between platforms and publishers, Facebook needs to recognise that in an era of digital convergence it is accountable for its users’ content, and must foster an open dialogue that educates users about what constitutes hate speech.
- Facebook should establish channels of dialogue with experts to improve its recognition of hate speech.
- All hate-speech related reports should be responded to within 24 hours, complete with a clear guideline for how the decision was made and options for appealing a decision.
- If Facebook fails to improve its handling of reports, government intervention may be necessary (e.g. requiring Facebook to delete offensive public groups and pages, suspend the accounts of repeat offenders, and trace users to an IP address).