TikTok now warns users about videos spreading misinformation
TikTok is playing its part in reducing the spread of misinformation online by adding new warning labels to videos with questionable information.
These warnings will appear on videos whose content could not be verified by fact-checkers, to discourage users from sharing content that is unsubstantiated but not definitively untrue. The warning prompt is essentially designed to dissuade users from sharing such videos.
While TikTok has been fact-checking content for some time now, it is the first time it has decided to publicly flag such videos with a warning label that reads, “Caution: Video flagged for unverified content.”
Creators will also receive a notification if a warning label is added to one of their videos. Videos that outright violate TikTok's misinformation policy will still be removed.
TikTok has not explained how its fact-checking process works, how many videos it fact-checks each day, or how it chooses which videos to review. However, a company spokesperson said that fact-checking focuses on content related to elections, vaccines, and climate change, as reported by The Verge.
It was also reported that a video's popularity is not what qualifies it for review, so videos of all kinds may be subjected to fact-checks.
Other social media platforms, such as Facebook and Twitter, have also tried to curb the spread of misinformation and fake news with similar warning labels. However, more advanced social media analytics will be needed to provide greater insight into how effective such warning labels actually are.