There is no polite way to say this: while a person doubtless must be very intelligent to get a job writing Facebook's algorithms, the algorithms themselves are completely stupid, primarily because they are utterly unable to read anything in context.
And, therefore, those who put those algorithms in charge of evaluating news stories are both stupid and irresponsible.
If you have a Facebook account, you've seen it yourself: post a withering insult about your least-favorite politician, and Facebook's algorithms will recommend that you “Like” his page and donate to his re-election campaign.
A friend of mine, a fan of the British science-fiction series Doctor Who, spent one summer getting constant Facebook recommendations to read news articles about a then-current drug scandal in European competitive bicycling; he eventually figured out it was because the articles all mentioned a doctor who allegedly helped athletes cheat.
Lack of standards
So in some ways, it's no surprise to see this Boston Globe article complaining that “Facebook draws fire on 'related articles' push,” nor any surprise to see that in the article's third paragraph, an unnamed Facebook spokeswoman blamed the problem on “algorithms.”
But read more closely and you'll notice that algorithms aren't the problem; lack of standards is. Or you could call it reckless disregard for facts.
New criticism of Facebook focuses not merely on articles unlikely to interest specific individuals (there's nothing inherently wrong with news stories about European bike racers; there's just no reason to think Doctor Who fans are especially interested in them), but on Facebook's pushing of articles that are demonstrably false. As the Globe said:
A surprise awaited Facebook users who recently clicked on a link to read a story about Michelle Obama’s encounter with a 10-year-old girl whose father was jobless.
Facebook responded to the click by offering what it called “related articles.” These included one that alleged a Secret Service officer had found the president and his wife having “S*X in Oval Office,” and another that said “Barack has lost all control of Michelle” and was considering divorce.
Facebook's algorithms are proprietary, so nobody knows exactly how they calculate what will and will not appear in users' “feeds,” but Facebook has indicated two factors: word association (obviously) and how “popular” an article is. That's all; Facebook does no fact-checking or other verification of the content its algorithms promote.
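Since the actual algorithms are secret, the following is purely a hypothetical sketch of the kind of context-blind scoring described above: shared keywords multiplied by raw popularity, with no truth check anywhere. Every name and number here is invented for illustration; it simply shows how a popular doping story can beat a relevant Doctor Who story for a Doctor Who fan.

```python
# Hypothetical, simplified "related articles" scorer: keyword overlap
# weighted by popularity, with no fact-checking and no reading of context.
# All article data below is invented for illustration.

def related_score(source_words, candidate):
    # Count words shared with the user's post, ignoring their meaning...
    overlap = len(set(source_words) & set(candidate["words"]))
    # ...then weight by raw click counts, so popular stories dominate.
    return overlap * candidate["clicks"]

articles = [
    {"title": "Doctor Who season finale review",
     "words": {"doctor", "who", "finale", "tardis"}, "clicks": 500},
    {"title": "Cycling doctor accused of doping scheme",
     "words": {"doctor", "who", "cycling", "doping"}, "clicks": 90000},
]

user_post_words = {"doctor", "who", "fan"}
best = max(articles, key=lambda a: related_score(user_post_words, a))
print(best["title"])  # the doping story wins on popularity alone
```

Both articles match the same two words ("doctor" and "who"), so popularity is the tiebreaker, and the irrelevant but heavily clicked doping story is recommended.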
Were Facebook positioning itself purely as a social media site, focusing on popularity would be a perfectly legitimate tactic. The problem is that Facebook is also trying to position itself as a source of actual news, where mere popularity is supposed to matter far less than whether something is actually true.