The Fake News problem will not be solved by technology

One reason we struggle to find a solution to the fake news problem is that we have never defined the problem properly. The term “fake news” originally referred to publications that look like news but are entirely fabricated. It then migrated to cover news articles that are merely grossly inaccurate, and later expanded further to include news one simply dislikes and tries to dispute.

It is amusing to see how we seek technical mitigation for a problem that is entirely semantic. Just as a lie detector does not detect untruths but only the physiological artifacts of a lying person, the technologies considered for fighting fake news do not detect untruths; at best they detect willful propaganda. But just like plain deception, propaganda comes in many shades of grey, so whatever solutions we find, we will never be happy with them.

We should recalculate our route.

How did we get here?

How did this get started? Most would say: with Web 2.0. As soon as every Internet user became a publisher, we lost control.

Web 2.0, and the new ways of using the Internet in general, brought the message of decentralization. This trend favors breaking centralized distribution models into peer-to-peer architectures. Examples are: peer-to-peer (P2P) file sharing, peer-to-peer payments (with Bitcoin at the top, operating with zero centralization), and, of course, peer-to-peer content distribution, news included.

The history (and present) of computing tells the story of quite a few pairs of opposing approaches, where one often seemed far better than the other (particularly when it was newer or fancier), whereas in reality they are equally good, just not for the same uses. Examples of such pairs are: thin computing versus client-based computing, RISC versus CISC processor architectures, and, of course, centralization versus decentralization.

Decentralization is the hot trend of this decade, and so, with just a little lobbying by a few large commercial stakeholders, we were all made to treat it as the one true destiny of computing and sharing. There is nothing wrong with decentralization, but there is nothing wrong with centralization either.

There are strong cases for moving from centralized to decentralized models in many areas. Some intermediaries are choke points better cut out for the sake of reducing cost and improving quality. In other cases, however, the intermediary plays a significant role, and we should be less eager to do without it.

The need for attribution

As opposed to other areas, where the intermediary is a facilitator or conveyor serving a purely technical role, in the supply and distribution of news the news agency has a role beyond facilitation: it is the nucleus of liability.

Nothing technological can protect us against false news, whether entirely fabricated or just unintentionally inaccurate. We are protected against it by two measures, both non-technical: social reputation and legal liability. Nothing else. If the Washington Post were to start fabricating news items, it would lose its reputation, and hence its clientele and revenues. If it went as far as publishing falsehoods that cause damage to someone, the people responsible might also be dragged to court.

Neither of these two measures, on which the reliability of our news system rests, can operate without the waterfall distribution model. Both rely on the source of news items having a name (at least a pseudonym, for reputation to be bound to) and an address.

This does not at all mean that only large corporate players can create and distribute news. A centralized approach does not imply that there is just one or a few centers, and the waterfall analogy does not imply that there are just a few waterfalls; there can be millions. What it does imply, however, is that we should treat news distribution as something that has a source (a news agency) and sinks (the consumers), and that the source matters. Decentralization blurs attribution, and this is the root of the problem. Unlike music and other forms of art, and unlike other goods where the quality is evident in the item itself, news quality cannot be determined without attribution to its real source. And determining the source is most convenient under a waterfall distribution model. If you visit https://www.nytimes.com, you know you’re reading items by the New York Times. (There are other means of determining sources, such as digital signatures, sketched below, but we are not there yet.)
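
To make the signature idea concrete, here is a minimal sketch, assuming a publisher signs each item with an Ed25519 key and readers already know the publisher's public key. It uses the Python `cryptography` package; the key names and article text are hypothetical, for illustration only.

    # Minimal sketch: a publisher signs an article so readers can verify attribution.
    # Assumes the 'cryptography' package (pip install cryptography); the key names
    # and article text here are hypothetical.
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
    from cryptography.exceptions import InvalidSignature

    # Publisher side: generate a key pair once; the public half would be published
    # (say, on the news agency's website) so readers can tie items to the source.
    publisher_key = Ed25519PrivateKey.generate()
    public_key = publisher_key.public_key()

    article = b"Headline: ... Body: ..."
    signature = publisher_key.sign(article)

    # Reader side: verification succeeds only if the article is untampered and was
    # really signed by the holder of the publisher's private key.
    try:
        public_key.verify(signature, article)
        print("Attribution verified: this item comes from the key holder.")
    except InvalidSignature:
        print("Verification failed: altered, or not from the claimed source.")

Note that such a scheme moves attribution from the delivery channel to the item itself, which is exactly why it could, one day, make the distribution path irrelevant; the hard part it does not solve is distributing and trusting the public keys in the first place.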

Going forward

Tech giants monetize our use of their platforms to distribute and consume information. The more information being distributed, the more ad revenue is created. What happens to be conveyed in this flow is far less interesting, as long as it’s engaging enough to cause more exposure to ads. We cannot expect those tech giants to promote dismissing the decentralized parts of their platforms for news consumption. Moreover, decentralization of news delivery does not prevent waterfall delivery from coexisting on the same platform (the New York Times can distribute its news over, say, Facebook with proper attribution), so those tech giants merely provide a multi-purpose tool for anyone to use as they see fit. The problem is ours, not theirs.

As always, technology has no values; it has capabilities. We have centralized and decentralized information distribution models, and having both is a good thing. The duty of using the right tools for the right purposes is solely ours.

