What is the effect of algorithmically delivered news?
Algorithmically delivered news based on user actions is reducing the diversity of the news we receive and amplifying the echoing of ideas.
By Chrissy Clary
I was lucky enough to work in a real newsroom right before the industry started to tumble. Every day, the editors gathered at 10 a.m. to discuss what would make the paper the next day. It was not unusual for a heated argument to erupt over what story should be featured. These people believed in what they were doing, they wore their ethics on their sleeves, and they cared deeply about the community they influenced. They were the gatekeepers.
Today, much of our news is delivered through social media platforms. According to the Pew Research Center, 62 percent of US adults get their news from social media (Gottfried and Shearer, 2016):
News plays a varying role across the social networking sites studied. Two-thirds of Facebook users (66%) get news on the site, nearly six-in-ten Twitter users (59%) get news on Twitter, and seven-in-ten Reddit users get news on that platform. On Tumblr, the figure sits at 31%, while for the other five social networking sites it is true of only about one-fifth or less of their user bases (Gottfried and Shearer, 2016).
The thing is, there is no daily editor meeting at Facebook. There are no local groups of people or community members deciding what is important. The news is delivered to you algorithmically, based on a variety of data points gathered by that particular platform to identify your likes, friends, interests, and actions. Such formulas for news delivery take only you into account. Does this method really give you the information you need to be a healthy, contributing part of a local community? Isn’t that what the editors were doing?
In a video posted in the Facebook Newsroom, Adam Mosseri, VP of product management for News Feed, explains how News Feed works. He also comments that the goal is to “connect people with the stories that matter most to them” (Mosseri, 2016).
In 2016, Facebook updated the News Feed algorithm. Now, “what you see will depend more on who your friends are, what they share, what you click on” (Sunstein, 2017).
News Feed uses data on its users to make decisions about what those users most likely want to read. Thus, News Feed is helping us sort through thousands of articles and delivering exactly what we want, when we want it (Mosseri, 2016). That doesn’t sound so bad, right?
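To make the idea concrete, here is a toy sketch of engagement-based ranking. This is not Facebook’s actual News Feed algorithm (which is proprietary and far more complex); it is a hypothetical illustration of the basic pattern the article describes: scoring stories by a user’s past interests, so that familiar topics rise to the top.

```python
# Toy sketch of personalization -- NOT Facebook's real algorithm,
# just a hypothetical illustration of interest-based ranking.

def rank_feed(stories, user_interests):
    """Order stories by how many of their topics the user already engages with."""
    def score(story):
        # Count overlap between the story's topics and the user's interests.
        return sum(1 for topic in story["topics"] if topic in user_interests)
    return sorted(stories, key=score, reverse=True)

stories = [
    {"title": "City council budget vote", "topics": {"local", "politics"}},
    {"title": "Celebrity gossip roundup", "topics": {"entertainment"}},
    {"title": "New phone released",       "topics": {"tech"}},
]

user_interests = {"tech", "entertainment"}
ranked = rank_feed(stories, user_interests)
# Stories matching prior clicks float up; the local-politics story a
# human editor might have featured sinks to the bottom of the feed.
```

Even this simple sketch shows the echo-chamber mechanism: nothing in the scoring rewards civic importance or diversity, only similarity to what the user has already chosen.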
In an attempt to deliver the news you are most likely to interact with, News Feed appears to be strengthening the echo-chamber effect. With regard to media, the echo-chamber effect occurs when your opinions and preferences are echoed back at you.
As a point of reference, an echo chamber can be described as “a bounded, enclosed media space that has the potential to both magnify the messages delivered within it and insulate them from rebuttal” (Jamieson and Cappella, 2010).
In an opinion piece for Wired, Kartik Hosanagar, a professor at the Wharton School of the University of Pennsylvania, calls echo chambers problematic because “social discourse suffers when people have a narrow information base with little in common with one another” (Hosanagar, 2016).
In his book #Republic, Cass Sunstein cites research by Facebook employees that appears to indicate that the algorithms are responsible, in part, for our political echo chambers: “Evidence shows the algorithm suppresses exposure to diverse content by 8 percent for self-identified liberals and 5 percent for self-identified conservatives” (2017).
In his book The Internet of Us, Michael Patrick Lynch raises the concern that only reading about the things we already agree with is giving rise to “group polarization – that we are becoming increasingly isolated tribes” (2016).
Wasn’t the Internet supposed to open us all up to new people and cultures? It appears the opposite is happening. We are being profiled based on our online actions. Without proactive steps on the part of the user to counteract these effects, it is possible that our scope of knowledge and understanding will shrink.