Facebook just announced that they will remove humans from the production process for their “trending” news section (the list of stories on the right of the page, not the main news feed). Previously, they’ve had to defend themselves from accusations of liberal bias in how these stories were selected. Removing humans is designed, in part, to address these concerns.
The reaction among technologically literate press and scholars (e.g. here, here, and here) has been skeptical. They point out that algorithms are not unbiased; they are created by humans and operate on human data.
I have to disagree with my colleagues here. I think this change does, or could, remove an important type of bias: a preference along the US liberal-conservative axis. Further, relying on algorithmic processes rather than human processes leads to a sort of procedural fairness. You know that every story is going to be considered for inclusion in the “trending” box in exactly the same way. (Actually, I don’t believe that Facebook’s trending topics were ever politically biased — the evidence was always thin — but this is as much about appearance and legitimacy as any actual wrongdoing.)
Of course algorithms are not at all “unbiased.” I’ve been one of many voices saying this for a long time. I’ve written about the impossibility of creating an objective news filtering algorithm. I teach the students in my computational journalism class how to create such algorithms, and we talk about this a lot. Algorithmic techniques can be biased in all sorts of ways: they can be discriminatory because of the data they use for reference, they can harm minorities due to fundamental statistical problems, and they can replicate the biased ways that humans use language.
And yet, removing humans really can remove an important potential source of bias. The key is recognizing what type of bias Facebook’s critics are concerned about.
There are many ways to design a “trending topics” algorithm. You can just report which stories are most popular. But this might hide important news behind a wall of Kim Kardashian, so most trending algorithms also include a “velocity” component that responds to how fast a story is growing (e.g. Twitter). Facebook’s trending topics are also location-specific and personalized. None of this is “objective.” These are choices about what it is important to see, just as an editor makes choices. And perhaps Facebook is making choices that make them the most money, rather than the supposedly neutral and public-service-oriented choices of an editor, and that’s a type of bias too. It’s also true that algorithmic systems can be gamed by groups of users working together (which is either a feature or a bug, depending on what you feel deserves coverage). Users can even work together to suppress topics entirely.
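The popularity-plus-velocity idea can be sketched in a few lines of Python. Everything here is an illustrative assumption on my part — the scoring formula, the weight, the hour-long windows, and the share counts — and not Facebook’s actual, unpublished formula.

```python
# A toy trending score: raw popularity plus a bonus for fast growth.
# The velocity_weight and the hour-long windows are invented for
# illustration; this is not any platform's real formula.

def trending_score(shares_last_hour, shares_prev_hour, velocity_weight=2.0):
    popularity = shares_last_hour
    # Relative growth: a story that jumps from 2,000 to 20,000 shares
    # has velocity 9.0; a flat blockbuster has velocity near 0.
    velocity = (shares_last_hour - shares_prev_hour) / max(shares_prev_hour, 1)
    return popularity + velocity_weight * velocity * shares_last_hour

stories = {
    "celebrity gossip": (100_000, 99_000),  # huge audience, but flat
    "breaking news": (20_000, 2_000),       # smaller, but exploding
}
ranked = sorted(stories, key=lambda s: trending_score(*stories[s]), reverse=True)
# "breaking news" outranks "celebrity gossip" despite far fewer shares
```

Note that nothing in a scoring function like this knows or cares about a story’s politics, which is the sense in which the procedure itself is neutral.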
But none of this is left-right political bias, and that’s the kind of bias that everyone has been talking about. I can’t see anything in the design of these types of trend-spotting algorithms that would make them more favorable to one political orientation or another.
This doesn’t mean the results of the algorithm — the trending news stories themselves — are going to be politically neutral. The data that the algorithms operate on might be biased, and probably will be. Facebook monitors the articles that are being shared on their platform, and there is no guarantee that a) news sources produce and promote content in some “neutral” way and b) the users that share them are unbiased. If it turns out that more Facebook users are liberal, or liberal Facebook users are more active, then liberal-friendly articles will be more popular by definition.
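That last point — a neutral ranking algorithm faithfully reproducing the skew of its users — can be made concrete with a toy example. The 60/40 split below is invented purely for illustration; it is not a claim about Facebook’s actual user base.

```python
# Toy demonstration: a politically symmetric "rank by share count"
# algorithm, applied to a (hypothetical) 60% liberal user base, still
# surfaces the liberal-leaning story on top.

users = ["liberal"] * 60 + ["conservative"] * 40  # assumed skew

shares = {"liberal-leaning story": 0, "conservative-leaning story": 0}
for user in users:
    # Suppose each user shares only the story matching their politics.
    shares[user + "-leaning story"] += 1

# The algorithm itself is neutral: it just counts shares.
top_story = max(shares, key=shares.get)
# top_story == "liberal-leaning story", purely because of the user mix
```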
However, this is a bias of the users, not Facebook itself. Every social software platform operates under a set of rules that are effectively a constitution. They define what can be said and how power is distributed. And some platform constitutions are more democratic than others: the administrators have power or the users have power in varying degrees over various things. Facebook has previously made other changes to reduce human judgment; this can be seen as a continual process of devolving control to the users, although it’s probably more to do with reducing costs through automation.
By removing humans entirely from the trending topics, Facebook is saying that the trending algorithm itself — which is very likely neutral with regard to the liberal/conservative axis — is the governing law of the system. The algorithm may not be “objective” in any deep way, but it is democratic in a certain sense. We can expect the trending stories to mirror the political preferences of the users, rather than the political preferences of Facebook employees. This is exactly what both Facebook and its critics want.
Personally, I think that humans plus machines are a terrific way to decide what is newsworthy. The former trends curators did important editorial work highlighting stories even when they weren’t popular: “People stopped caring about Syria … if it wasn’t trending on Facebook, it would make Facebook look bad.” This is exactly what a human editor should be doing. But Facebook just doesn’t want to be in this business.