Pittsburgh, Pa. — Plenty of people are concerned about fake news. Maybe in the past they fell for something bogus. Or maybe they worry that fake or exaggerated articles could impact how people vote. Ryan Beam, 16, wondered just what it would take to stop fake news in its tracks. So he did a science experiment. It showed that it takes more than just a tiny icon to keep people from spreading “facts” that aren’t true.
But a tiny icon will make a few people think twice.
Ryan, a sophomore at Scotts Valley High School in Santa Cruz, Calif., presented his results here, this week, at the Intel International Science and Engineering Fair (ISEF). Created by the Society for Science & the Public (SSP), this competition brought together almost 1,800 students from 81 countries to share their winning science projects with the public. ISEF was sponsored this year by Intel. (SSP also runs Science News for Students and this blog.)
The teen is quite familiar with fake news. He once almost fell for some himself. “I remember one [headline] about the Pope endorsing Donald Trump,” he says. “It was the most shared article on Facebook.” And he almost bought it. “I didn’t not believe it at first,” he admits. “It seemed like it was unusual but maybe possible.” (It was, in fact, untrue.)
Ryan’s not the only one to get fooled. Worries about fake news spreading on social media spiked after the 2016 U.S. presidential election. For a while, Facebook tried putting small flags next to items that might be untrue. But this social media platform stopped in December 2017, saying the flags didn’t work.
Sites like Facebook aren’t very open with their data about how fake news gets shared. “I thought I would try and get some of the data for myself,” Ryan says. He gathered 10 articles. Seven had real headlines, such as “Air Force One Needs New Refrigerators. They Cost $24 Million.” The other three were headlines for news that turned out to be fake, such as “Indictment Handed Out in Russian Bribery Case Involving Uranium One, Hillary Clinton.”
How did he confirm the fake news items were untrue? “I used fact-checker websites,” Ryan says. “I would make sure there was unanimity that an article was deliberately misleading.”
For the new study, he took all of those headlines and put them together into three newsfeeds. Each feed looked like something someone might see on Twitter or Facebook. The first was just a list of the articles, with options to “like” or “share.” In the second newsfeed, he added a little red warning sign next to the fake articles. That warning sign was meant to highlight articles that might not be real.
The third newsfeed went a step further. The headline of any news item that might be fake was covered with a note warning the reader that the news might not be true. The reader then had to click a button to see the article.
Ryan put all three newsfeeds on Mechanical Turk, a website where people get paid to complete tasks such as scientific surveys. The teen waited until 150 people had visited each newsfeed. He then checked to see how often each of the 10 links had been “shared” by his volunteers.
In the first condition, which listed all the headlines, “the most shared article was a piece of fake news,” Ryan found. But the two other pieces of fake news were the least shared. The fake news that was most shared was highly sensational, which might be why it was so popular — even though it wasn’t true. “Misleading information has the sensationalist factor,” the teen observes. “Once it goes viral it can reach a lot of people.”
The second condition, with the small warning signs, did make the fake news less popular. But whether the fake news got shared depended on someone’s political-party affiliation. Democrats and Republicans remained more likely than others to share false news. Independents, though, “steered away from the fake news,” Ryan says. “They became the least likely to share misleading information.”
The third condition — where users had to make an extra click just to see the fake news — did make the fake news articles the least shared of the 10. This treatment also made Ryan very uncomfortable. “It felt dishonest trying to hide stuff, trying to redirect people,” he says. “This is the closest I got to outright censorship.”
So the teen doesn’t want people adopting his third newsfeed — the one that hides the fake news. He would rather that there be warning signs, like the ones Facebook deployed. It may not change the minds of Democrats or Republicans, he notes, but “it’s the undecideds who swing elections.” So if the goal is protecting elections, he says, his second method might work well enough.
Ryan concedes that age also might play a role. Younger readers are probably more skeptical, he says. “In school,” he explains, “we get classes now about how to identify legitimate sources online.” As such, he says, “We’re being prepared to enter the world where not everything is the truth.” So given enough time, he hopes that identifying fake news “may not be as much of an issue.”
For now, Ryan definitely keeps an eye out for phony news: “I take everything I read with a grain of salt.”
Update: For his project, Ryan won an honorable mention at ISEF from the American Psychological Association.