Facebook Inc. (NASDAQ: FB) will start fact-checking images and videos, the company said Thursday, expanding its review efforts to posts that are traditionally harder to monitor.
"People share millions of photos and videos on Facebook every day. We know that this kind of sharing is particularly compelling because it's visual. That said, it also creates an easy opportunity for manipulation by bad actors," Facebook said in a blog post.
Edited photos and strong visuals were common among the posts by Russian agents attempting to interfere with the 2016 U.S. presidential election and other global elections, according to examples released by members of Congress.
Facebook has been ramping up fact-checking efforts and third-party human review in recent months in an effort to protect future elections from foreign interference. The company has already detected what it called "coordinated inauthentic behavior" ahead of the midterm elections in November.
"Many of our third-party fact-checking partners have expertise evaluating photos and videos and are trained in visual verification techniques, such as reverse image searching and analyzing image metadata, like when and where the photo or video was taken," Facebook said.
"Fact-checkers are able to assess the truth or falsity of a photo or video by combining these skills with other journalistic practices, like using research from experts, academics or government agencies."
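The metadata analysis Facebook describes typically means reading a file's EXIF tags, which can record when and with what device a photo was taken. A minimal sketch of what that looks like in practice, using the Pillow imaging library (the file name and the sample tag are illustrative, not drawn from Facebook's tools):

```python
from PIL import Image
from PIL.ExifTags import TAGS

def read_exif(path):
    """Return a dict mapping human-readable EXIF tag names to values."""
    with Image.open(path) as img:
        return {TAGS.get(tag_id, tag_id): value
                for tag_id, value in img.getexif().items()}

# Demo: create a small JPEG carrying a DateTime tag, then read it back.
img = Image.new("RGB", (8, 8), "white")
exif = Image.Exif()
exif[306] = "2018:09:13 12:00:00"   # EXIF tag 306 = DateTime
img.save("demo.jpg", exif=exif)
print(read_exif("demo.jpg"))         # includes 'DateTime'
```

In a real verification workflow, a fact-checker would compare tags like the capture date or camera model against the claim accompanying the image; a photo purportedly from a recent event but timestamped years earlier is an immediate red flag.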
Facebook users can also flag photos and videos for review.