Facebook’s Combat Against Problematic Content

To maintain the integrity of content across its family of apps, Facebook has pursued a strategy called “remove, reduce, and inform” since 2016. The strategy involves removing content that violates its policies, reducing the spread of problematic content that does not violate its policies, and giving users context so they can make informed choices about what to click, read, or share.

Remove

Facebook’s Community Standards page sets out what content is and is not permitted, and is designed to create a communication environment that promotes real-world safety, diversity, and equity.

Facebook recently added a section to the Community Standards page where users can keep up with the latest policy updates. A new feature named “Group Quality” has also been rolled out to help admins of Facebook groups better understand how Facebook enforces its standards.

Reduce

Some types of content, such as misinformation and clickbait, do not violate the Community Standards but are nonetheless unwelcome among Facebook users. Combining technology and human review in its fight against photo- and video-based false news, Facebook has been gaining momentum in enforcing against fake accounts and coordinated inauthentic behavior. Measures have been introduced to help users identify false news, and certified fact-checking partners assess content in 24 languages.

Going forward, Facebook will continue consulting academics, fact-checking experts, journalists, and other organizations to find new ways to stamp out fake news. Groups found to have repeatedly shared false content will see their overall News Feed distribution reduced.

Inform

Features such as the Context Button have been launched to provide background information on News Feed content, helping users evaluate source credibility.

As part of its upcoming plan, Facebook is set to extend the Context Button to images as well. Existing features such as the Page Quality tab will be enhanced, and more features like the Trust Indicators will be added to provide clarity on a news organization’s ethics and standards.

Messenger

Facebook will be bringing its Verified Badge to Messenger to help users avoid scammers. Messenger users will also be given more control over who can reach them and whom to block. To curb the spread of false news, Messenger will get its own Context Button and Forward Indicator.

As scammers constantly evolve their techniques, Facebook is determined to keep stepping up its efforts in the fight against misinformation.

© 2023, NDN Group (SH) Limited. All Rights Reserved.