Saturday, December 3, 2016

Worse Than The Disease

Politico's Jack Shafer declares Facebook's potential ability to squash fake news to be "worse than the disease." He makes a few points:

  • Fake news of sorts has been around for a while--so why squash it now?
  • People who consume fake news are like people at a magic show--they know some of the news they see will be fake--but that's okay because it's a rush.
  • Political bias contributes greatly to who believes a fake-news story: Clinton-voters were less fooled because they didn't want the news (anti-Clinton stories) to be true.
So he thinks that the moral panic around fake news should be allowed to burn itself out without creating a new technology around restriction of speech and information.

Well, Is He Right?

No. He is catastrophically wrong about fake news and therefore his conclusion is a bit, well, stupid.

Uh, That Escalated Quickly

Lest Shafer, who will have this tweeted at him in short order, think The Omnivore is just another Internet hater, let The Omnivore assure the reader that this is not so. Shafer on the whole isn't stupid. His take on the fake-news moral panic isn't necessarily off base. No, this piece is wrong because The Omnivore (apparently) knows things about fake news that (apparently) Shafer doesn't.

What Does The Omnivore Know?

The Omnivore knows 'fake news' from the inside. The Omnivore has watched its creation, propagation, and fin-crest into the mainstream media. The Omnivore knows what kinds of fake news spread better and faster. The Omnivore knows who consumes it and how they think. The Omnivore has spoken with people fooled by fake news--and has a history of (trying to) debunk fake news with its adherents.

In short, The Omnivore is an expert (right now Jack is going "The hell he is. That claim up there? Probably fake.") and The Omnivore knows Shafer is wrong. Let's look deeper.

Fake Newsies Want To Be Fooled

This, alas, is not true: fake newsies want emotional vindication / validation. Fake news overwhelmingly provides a feeling that "you are right" to its consumers. It's a statement that stands in the face of the mainstream media, which keeps relentlessly saying "you are wrong" (Hillary did not order a Benghazi stand-down!--She didn't try to get those people killed!). The mainstream media also sometimes says "You are racist."

Fake (or just heavily biased) news says "No, you're OK! You should fear the spreading plague of The Knock Out Game."

So, no, it's not like going to a magician. It's far, far deeper rooted than that, which makes it far more psychologically powerful.

But They Know It's Fake, Right?

No. They don't. They may acknowledge that they aren't sure what's fake and what's not--but that applies as strongly, or more so, to the mainstream media as to random Internet news. They aren't just playing a game: fake newsies think that the mainstream media cooks up lies as much as anyone else, and that the only time it tells non-leftist truths is when events of great magnitude force it to.

So, no. They don't know it's fake.

What About The Great History of Fake News?

The idea that fake news has been around for a long time is almost intentionally deceptive. Facebook hasn't been around for a long time, neither has Twitter, neither has Internet advertising. What we are seeing here is a new thing. It should, at least, be treated as such.

Fake News Is Easily Debunked By Experts

Jack points out that in other arenas, such as sports or automotive news, fake news is quickly and easily debunked. He also notes that Hillary voters were rarely fooled compared to Trump voters--but he identifies confirmation bias as the major factor here.

Well, yes: that's why Hillary voters were largely not fooled by fake news created for Trump voters--but why didn't they fall for fake news created for them? It was out there.

Here's why: when you trust the mainstream media, it takes you two seconds to tell if a blockbuster story is true or not: go to a mainstream outlet and see if the story is on there. Look at ABC or NBC or CNN, and if you don't see the big news about Obama selling Alaska to the Chinese, you know something's fishy right away.

Conservative consumers of fake news have no such back-stop: they have already come to the conclusion that the overwhelming bias of the liberal media is such that they will do almost anything to squelch a story they don't like.

The reason this same dynamic applies easily in, say, sports is that there are plenty of recognized authorities. That's not the case in America anymore. Not on the right, at least.

What About The Cure?

The cure--where Facebook does something to flag news as fake, or likely fake, and then impedes its spread--could be used to stop other things. That's true. What's questionable is "What other things?"

The example given is other "moral panics"--which, fine, if we accept the classification of concern about fake news as a moral panic. But what if it's not? What if fake news is more like a disease? How so? Well, firstly, it "infects and spreads." It has a pattern that is like a disease. Secondly, while some people build up an immunity, others, if hit with enough of it, might "get infected" (decide it's true) even if it isn't.

Finally, it comes through contact--nodes that are connected in our social graphs. Fake news, unlike a magic show, isn't something people seek out (yes: people go to The Onion--but they go for humor. Most fake news isn't funny). If you get fake news from trending stories, that's kind of like going out into the big wide world and being exposed: you may well reject it. But when it comes from someone you know and trust?

That's different.

How Facebook Helps

What Facebook will do is essentially automate the fact-checking part of the equation: people can still email stories around, after all. A lot of people--and herein The Omnivore includes his highly educated cousin--have no idea how to use Google to fact-check a story or a quote. They don't especially like being told they're forwarding fake news, but they also don't (generally) intend to forward fake news.

If Facebook can convince moderate skeptics that its algorithm is decent, there's a good chance people will like it.

This is very different from Facebook then re-purposing its technology to stop some other form of news from spreading: without the support of the user-base, the backlash would be severe (it may be severe anyway), and the barrier to leaving is high--but as we saw from Facebook trying hard to accommodate conservative critics . . . Facebook itself doesn't think it's that high.
