News

Facebook’s opaque algorithm for filtering its News Feed likely distorts access to information and public debate. By Guy Healy.

Facebook’s distortion of news access

A police officer shot and killed African-American teenager Michael Brown in Ferguson, Missouri, two years ago, setting off national protests and a chorus of outrage on social media site Twitter. However, Facebook’s treatment of people’s posts about Ferguson has raised questions about the potential distorting, “black box” effects of a platform with 1.79 billion users, 15 million of them in Australia.

United States social media researcher Zeynep Tufekci says that while many of her friends were “furiously” posting about Ferguson on Facebook and Twitter, her Facebook News Feed was instead dominated by the highly shareable “ice bucket challenge”, where people poured buckets of freezing water over themselves for charity.

Tufekci, an associate professor at the University of North Carolina, reports that hundreds of Twitter users complained of what they described as an “information blackout” about Ferguson on their Facebook News Feed. But she says this distorting process was neither a blackout, nor “old-style censorship”, but a phenomenon she described as “algorithmic burying or dampening”.

Facebook’s opaque, proprietary algorithm for deciding which stories are given prominence changes often and “can cause huge shifts in news traffic,” making or breaking stories and even affecting whole media outlets, Tufekci says.

Facebook’s algorithm – an autonomous computer program that sorts and ranks the information displayed on pages such as the News Feed – had “decided” stories such as Ferguson were not “relevant” to selected users. In contrast, Twitter – which doesn’t rank users’ posts with an algorithm, but presents them chronologically – allowed for the timely expression of “millions of tweets” from concerned citizens, Tufekci says.

“Facebook’s News Feed algorithm prioritises all the posts from your friends and pages you follow according to an internal and secret algorithm, choosing which ones you see first, and which ones are buried deep, and never seen unless you scroll to see every single post,” Tufekci tells The Saturday Paper.

“This algorithmic prioritisation and dampening is not transparent – Facebook changes this algorithm whenever it wants and however it wants. Since the most prominent mechanisms are ‘like’ and ‘share’, things that are easier to like and share tend to have an easier time going viral. But there is no way for people to indicate ‘this is important’ or ‘this is troubling but crucial’.”

According to research by Facebook employees published in Science last June, Facebook is a growing source of news and civic information. But the researchers – who examined what millions of Facebook users shared, what they were presented with, and what they ultimately consumed – conceded users are exposed to significantly less “cross-cutting” or ideologically challenging content as a result of the algorithm.

“People encountered roughly 15 per cent less cross-cutting content in their News Feeds due to algorithmic ranking, and clicked through to 70 per cent less of this cross-cutting content,” the researchers reported in their study.

But compared with algorithmic ranking, individuals’ choices played a stronger role in limiting exposure to cross-cutting content, the researchers said.

The potential for distortion of what people know about their world appears significant. The Pew Research Center reported in May that Facebook reaches 67 per cent of American adults. The two-thirds of Facebook users who get news from the platform amount to 44 per cent of the general population.

Tufekci says that as more and more of the advertising money that used to support journalism is funnelled to Facebook, and as people around the world increasingly get their news from the site, its algorithm has emerged as a crucial question for the public sphere.

There are growing calls for a better public understanding of how algorithms on major internet platforms operate. Two months ago, German chancellor Angela Merkel called on the major internet platforms to make their algorithms more transparent.

“Algorithms, when they are not transparent, can lead to a distortion of our perception – they can shrink our expanse of information,” Merkel told a media conference in Munich.

The Washington-based digital watchdog Electronic Privacy Information Center has called for autonomous devices to reveal both their identity and the basis of their decisions.

The problem is compounded by the fact that many people aren’t aware their feeds on social media platforms such as Facebook are filtered in this way, according to a 2015 study presented at the prestigious ACM Conference on Human Factors in Computing Systems. The researchers reported that more than 62 per cent of respondents didn’t know about the algorithm at all, with many expressing surprise and anger when told.

When Tufekci asked her university class of 20 bright students about it, only two knew of Facebook’s algorithm. When friends didn’t respond to their posts, the students assumed they were being ignored, rather than considering that Facebook might not have shown them the post.

“The algorithm also doesn’t let people know who sees what, so we are in the dark there, too,” she says. “If an important news item we shared gets no response, is it because our friends ignored it, didn’t like it, disagreed with it, or is it because the algorithm buried it? Only Facebook knows.”

Speaking to The Saturday Paper from Facebook’s Menlo Park headquarters outside San Francisco, Adam Mosseri, vice-president of product management, denies Facebook filtered or suppressed users’ content over Ferguson posts.

“We looked into this and saw lots of interactions around Ferguson, whether it was likes, comments, shares and articles about Ferguson,” Mosseri says. “There was also a lot of activity about the ice bucket challenge as well. What we could see when we looked into this – there was no blackout. We are always trying to improve the News Feed.”

Mosseri says the News Feed is based on three “signals” aimed at delivering stories that people find the most interesting and meaningful to their lives. These signals are the closeness of a person’s relationships with other people; the type of media the person favours, such as text or video; and acclamation.

“We look at all the different stories you can see and we order them according to what we think will interest you.”

This is done for every post, billions of times a day, in the hope that people will see the content they find most interesting and meaningful, he says. Mosseri also counters concerns that Facebook’s News Feed makes less “cross-cutting” content visible.

“Prior to Facebook, people got their news from fewer sources. Now with Facebook we find this acts as a natural force for more diversity,” he says.

News Feed is not filtered but ranked, he says. Every post from your friends is visible if you keep scrolling long enough, but most people have limited time.
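
Taken together, Mosseri’s account describes a scoring-and-sorting process rather than removal: every post gets a score and a position, and “buried” simply means ranked too low to be seen. The real algorithm, its signals and its weights are proprietary and unknown, but a minimal, hypothetical sketch of ranking-not-filtering along the three signals he names might look like this:

```python
# Hypothetical sketch of "ranking, not filtering": every post receives a score
# and is sorted; nothing is deleted, but low-scoring posts sink so far down
# the feed that few users ever scroll to them. The signals and weights here
# are illustrative guesses, not Facebook's actual algorithm.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    media_type: str   # "text", "photo", "video", ...
    likes: int
    comments: int
    shares: int

def rank_feed(posts, closeness, media_preference):
    """Return every post, ordered by a weighted combination of the three
    'signals' Mosseri describes: relationship closeness, favoured media
    type, and acclamation (likes, comments, shares)."""
    def score(post):
        relationship = closeness.get(post.author, 0.0)           # 0..1
        media_fit = media_preference.get(post.media_type, 0.0)   # 0..1
        acclamation = post.likes + 2 * post.comments + 3 * post.shares
        return 0.5 * relationship + 0.2 * media_fit + 0.3 * acclamation / 100.0
    # Crucially, this sorts rather than removes: a "buried" post is still in
    # the list, just far enough down that most users never reach it.
    return sorted(posts, key=score, reverse=True)
```

In a sketch like this, Tufekci’s complaint becomes concrete: a hard-to-like story about Ferguson can be outscored by an easy-to-share ice bucket video without ever being “blacked out”.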

Making the most of people’s attention has become a focus of non-government organisations, which regard social media sites such as Facebook as critically important mobilisation tools, enabling them to reach hundreds of thousands of people through peer-to-peer sharing. But they say the algorithm-based ranking doesn’t always help their task.

“The Facebook algorithm is brutal in reading the initial engagement with a piece of content and then killing its reach or boosting its reach on this basis,” says Kathryn McCallum, communications and mobilisation manager for the Australian Conservation Foundation. “The algorithm is primed for engagement. When you put content up, if people immediately interact with that content – whether it’s clicking, sharing, liking or commenting – Facebook will immediately see that and show it to a bigger audience. If you put it up and it tanks, Facebook will kill it, and won’t show it to so many people.”
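
What McCallum describes is a feedback loop: early engagement widens a post’s distribution, which invites more engagement, while silence shrinks it. A rough, hypothetical illustration of that dynamic – the thresholds and multipliers are invented for the example, not drawn from Facebook – is sketched below:

```python
# Hypothetical sketch of the engagement feedback loop McCallum describes:
# early interactions expand a post's audience, a weak start throttles it.
# Thresholds and multipliers are invented for illustration only.
def update_reach(current_reach, early_interactions, boost_threshold=0.05):
    """Expand or contract a post's audience based on its early engagement
    rate (interactions per person initially shown the post)."""
    engagement_rate = early_interactions / max(current_reach, 1)
    if engagement_rate >= boost_threshold:
        return int(current_reach * 2.0)   # "boosting its reach"
    return int(current_reach * 0.2)       # "killing its reach"

# A post shown to 1,000 people that draws 80 quick interactions doubles its
# audience; one that draws only 10 is cut back to a fraction of it.
print(update_reach(1000, 80))   # 2000
print(update_reach(1000, 10))   # 200
```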

The main problem the ACF faces is the organised exploitation of social media accounts by climate sceptics, especially in relation to a “polluter pays” campaign. “When you see a Facebook account with no followers posting every two minutes for an entire day, you have to question whether that’s normal behaviour,” she says.

However, GetUp! national spokesman Adrian Dodd says his organisation has learnt not to rely on Facebook’s News Feed, since the algorithm can change quickly and Facebook imposes substantial costs when NGOs seek to communicate directly with their members en masse.

“Facebook pushes people [such as GetUp!] in the direction of advertising more. The idea of blasting all the members of our Facebook page with a piece of information is a thing of the past,” Dodd says.

“We have hired crack digital marketers from the ad industry. We no longer use polling research. We do message testing on sponsored Facebook posts for targeted demographics and for fundraising. Facebook pushes those posts because we pay for them.”

Other research, by the University of Southern California’s Kjerstin Thorson, shows social pressures in Facebook’s networked environment can “train” young people – often passionate about activism and non-major-party politics – to keep political commentary neutral. Thorson wrote in Information, Communication & Society that as a Facebook post may be seen by a diverse range of schoolfriends, workmates, family members and even strangers, people either self-censor or adopt a tone of political neutrality – much like a politician’s stump speech – because it’s less likely to lead to conflict online.

The precise effect these factors have on political discussion is unknowable. But researchers are examining the reporting and commenting on cases such as the Ferguson shooting for signs the giant social media companies are influencing our social behaviour, especially the way we communicate with one another via algorithmic gatekeepers.

This article was first published in the print edition of The Saturday Paper on Dec 17, 2016 as "News front".

Guy Healy
is a journalist and doctoral researcher in digital screen ecologies.