Study sheds light on how online news algorithms can skew your picture of reality
A news aggregation app is viewed more favorably when it offers politically personalized content to users, but this personalization might reduce attention to high-quality mainstream sources, according to new research from Louisiana State University. The findings were published in Public Opinion Quarterly.
“This study was part of a larger news portal experiment where we manipulated the availability of different types of news content, as well as some interface features of the news website. The effects that we looked at were both attitudinal (whether people think of issues, politicians, and news organizations differently after using the portal for two weeks) and behavioral (whether people click/scroll more when certain content or interface features are present),” explained study author Kirill Bryanov (@KBryanov), who is now a research fellow at the Higher School of Economics in Saint Petersburg.
“As we were sketching the design of this massive study, we realized that we can simulate the effect of a real-life personalization algorithm by presenting our participants who initially indicated their party preference with an increased proportion of content from politically friendly sources.”
“There are studies galore looking at the effects of personalized content on people’s political attitudes and news selection, but we decided to go a somewhat different route. We were interested in what political personalization of the newsfeed does to the metrics that news platforms such as Google News care about the most: the amount of user attention and people’s perceptions of the platform,” Bryanov said.
“Since platforms’ design choices are primarily driven by business logic, we wanted to know whether tipping the balance of mainstream to personalized partisan news can be economically desirable to them. And if it is, the next question is: how does this personalization affect users? We think our study adds a novel angle to the debate over platforms’ communication power and their responsibilities to society.”
Bryanov and his colleagues developed their own mobile news app and paid 2,343 U.S. residents to use it for 12 days. The app, which updated every hour, aggregated real news articles from Google News. For some participants, the app also aggregated content from partisan sources that were supportive of their preferred party.
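The manipulation described above can be illustrated with a minimal sketch: for treated users, a fixed share of feed slots is filled with stories from sources aligned with their stated party. This is purely hypothetical code, not the authors' implementation; the function names, the 30% boost share, and the feed size are all illustrative assumptions.

```python
import random

def build_feed(mainstream, partisan_by_party, user_party=None,
               boost_share=0.3, size=20, seed=0):
    """Assemble a feed of `size` stories.

    If `user_party` is given (the treatment condition), roughly
    `boost_share` of the slots are drawn from sources friendly to
    that party; the rest come from mainstream outlets. With
    `user_party=None` (the control condition), the feed is entirely
    mainstream. All parameters here are illustrative, not the
    study's actual settings.
    """
    rng = random.Random(seed)
    feed = []
    if user_party is not None:
        n_partisan = int(size * boost_share)
        pool = partisan_by_party.get(user_party, [])
        feed += rng.sample(pool, min(n_partisan, len(pool)))
    # Fill remaining slots with mainstream stories, then shuffle so
    # partisan items are interleaved rather than grouped at the top.
    feed += rng.sample(mainstream, size - len(feed))
    rng.shuffle(feed)
    return feed
```

A control user would receive `build_feed(stories, partisan, user_party=None)`, while a treated Democrat-leaning user might receive `build_feed(stories, partisan, user_party="D")`, which swaps a fraction of mainstream slots for politically friendly ones.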
The researchers found that participants who received politically aligned news articles spent more time scrolling through the app, but the presence of partisan-sourced news did not lead to more clicks on articles in the news feed. Those who received politically aligned news were also more likely to rate the app as reliable and comprehensive.
“Online aggregators where many people access their news are run by self-interested organizations, which want folks to return to their website/app and spend as much time as possible with it. The most powerful tool they have is the ability to tailor the mix of stories that they serve up to each individual user’s tastes. If they see that a certain level of political personalization is good for business, they will most likely adjust their algorithm accordingly,” Bryanov told PsyPost.
“Why, then, do the major news aggregators today present mostly centrist, mainstream news to the majority of users instead of flooding them with news from politically like-minded sources? This is because the share of vehement partisans in our society is actually pretty small, and the overwhelming majority still prefer quality news from reputable publications to feel-good opinionated content.”
But the researchers also found evidence that the presence of partisan content could make people less likely to engage with non-partisan mainstream sources.
“The demand for partisan content can go up, and our study looks at what can happen if news platforms respond as we expect them to: by driving up the share of partisan content in people’s newsfeeds. If they do, users will like them more and will spend more time on the website. At the same time, because less mainstream news will be available to them, people will read less mainstream news, which can skew their picture of social reality,” Bryanov said.
“Notably, in our experiment these effects were nearly identical among both left- and right-leaning participants, meaning that it doesn’t matter which side of the political spectrum you belong to. You are as likely as your political opponent to enjoy it when news algorithms give you more of what you like.”
“In sum, news platforms are powerful. As they chase user attention and loyalty, they can exacerbate, deliberately or not, some negative tendencies that exist in our society. People should begin to think more critically about their online news consumption habits and be mindful of the fact that recommendation algorithms exist not just to give them a pleasant experience online,” Bryanov explained.
The study, like all research, includes some limitations.
“The effects we observed were the result of two weeks of portal usage. This is better than in many other experiments, but likely still short of the timeframe needed to form a media consumption habit,” Bryanov said.
“Would users like the politically personalized platform twice as much if they used it for a month instead? How many days does it take for the effect to first kick in? These questions are important for the platforms’ behavior as well, so we should answer them to have a better idea of what the algorithms can do to digital news users.”
The study, “Effects of Partisan Personalization in a News Portal Experiment”, was authored by Kirill Bryanov, Brian K. Watson, Raymond J. Pingree, and Martina Santia.