Facebook has come under renewed scrutiny over the way it moderates content on its platform, after a newspaper investigation revealed the guidance it gives staff on how to handle posts involving violence, hate speech, child abuse and pornography.
The training manuals, which were published by the Guardian on Monday, reveal how the social media group’s 4,500 global moderators judge when to remove or allow offensive content.
They show how posts that threaten to kill Donald Trump, US president, are banned because heads of state are considered “vulnerable”, but violent threats against others are permitted.
For example, the manuals advise moderators that it is permissible to allow users to post messages such as “to snap a bitch’s neck, make sure to apply all your pressure to the middle of her throat”, on the grounds that the company does not consider them “credible” threats.
The manuals also show how the platform, which now has 1.94bn users, will allow livestreaming of attempts to self-harm because Facebook “doesn’t want to censor or punish people in distress who are attempting suicide”.
Facebook said safety organisations had advised the social media group that leaving posts of this nature online allowed people to seek help.
The revelations come as the group is under mounting pressure from politicians and campaigners who argue that it should take more responsibility for the content that appears on its website.
Last month, Facebook removed a video of a man in Thailand who murdered his baby daughter before killing himself. Two days later, another man in the US livestreamed his suicide by gunshot.
The guidelines only come into play after users flag potentially offensive content to Facebook. The company says it already has automated systems in place to stop the publication of certain types of material such as child sex abuse and terrorism.
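The mechanics described here lend themselves to a short sketch: a narrow set of categories is blocked automatically before publication, while everything else reaches human moderators only after a user reports it. The code below is a purely hypothetical illustration of that flag-then-review workflow; the class and method names (ModerationPipeline, on_publish, on_user_report) are invented for this example and do not describe Facebook’s actual systems.

```python
# Hypothetical sketch of the workflow the article describes: automated filters
# block certain categories at publication time; everything else is reviewed by
# human moderators only after a user report. Names are illustrative only.
from dataclasses import dataclass, field
from typing import List, Optional

# Categories the article says are stopped automatically before publication.
AUTO_BLOCKED = {"child_sexual_abuse", "terrorism"}


@dataclass
class Post:
    post_id: int
    text: str
    labels: List[str] = field(default_factory=list)  # assumed upstream classifier output


class ModerationPipeline:
    def __init__(self) -> None:
        self.review_queue: List[Post] = []  # posts flagged by users, awaiting moderators

    def on_publish(self, post: Post) -> bool:
        """Return True if the post may go live; only a narrow set is blocked up front."""
        if any(label in AUTO_BLOCKED for label in post.labels):
            return False
        return True  # everything else is published without prior review

    def on_user_report(self, post: Post) -> None:
        """The guidelines only come into play once a user flags the post."""
        self.review_queue.append(post)

    def next_for_review(self) -> Optional[Post]:
        """A human moderator pulls the next flagged post and applies the manual."""
        return self.review_queue.pop(0) if self.review_queue else None
```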
Facebook confirmed the authenticity of the manuals, some of which were reproduced by the Guardian, but added that some of the material was out of date.
The company said the manuals were drawn up by a group of “highly trained people” with regular advice and input from external groups such as safety campaigners and non-governmental organisations. It added that the material was under constant review with the group holding weekly meetings.
Monika Bickert, Facebook’s head of global policy management, admitted that there were “grey areas” in policing content on its website.
“For instance the line between satire and humour and inappropriate content is sometimes very grey,” Ms Bickert told the Guardian. “It’s very difficult to decide whether some things belong on the site or not.”
In the UK, politicians have been stepping up the pressure on the social network. Last week, the Conservative party manifesto set out plans to reform the way technology and social media groups operate.
“We want social media companies to do more to help redress the balance and will take action to make sure they do,” said Theresa May, prime minister.
A report by the select committee for culture, media and sport said earlier this month that “the biggest and richest social media companies are shamefully far from taking sufficient action to tackle illegal or dangerous content”.
Charlie Beckett, director of Polis, a media think-tank based at the London School of Economics, said he was reassured by the leaks because they suggested that Facebook was taking the issue of content on its platform “seriously”.
But he added: “Facebook is making taste decisions. What we may find offensive in the UK is likely to be very different to what people in Saudi Arabia find offensive. I’m concerned that freedom of expression campaigners will say even if it’s offensive, why not have it out there?”