Facebook faces the task of using its algorithms to fight fake news, but does it know the real problem it's fighting against?
Three years into Donald Trump's presidency, the moral panic over fake news and post-truth has not abated. If anything, it has now blossomed into a full-blown culture war. Conservatives insist that their views are suppressed by Facebook and Twitter; progressives accuse the same platforms of not doing enough to crack down on hate speech and foreign manipulation of elections.
Mark Zuckerberg's recent testimony in the US Congress, where politicians competed to deal him the lethal rhetorical blow, doesn't bode well for Silicon Valley. The Valley's only savior, at this point, is the Communist Party of China. Only an indefinite trade war with China will prevent US lawmakers from regulating the strategic tech sector; to break up the industry would weaken Washington's global standing. The Trump administration is not blind to these risks.
Invoking the Chinese threat has bought the tech companies some time, but it won't work forever. The impending burst of the tech bubble will only deepen everyone's hatred of Silicon Valley; the calls for action will grow louder. The public humiliations of WeWork and Uber, the former darlings of tech investors, are signs that public tolerance of highfalutin technology platforms (and their leaders) is already running short. More government regulation is, indeed, likely to follow, and stemming the tide of fake news would be one of its highest priorities.
But just how strong is that tide? What remains unexamined, in the public debate but also in many academic discussions of post-truth, is the background assumption that ours is the time of postmodernism on steroids: a time when no firm truths hold and no single narrative can survive the assault of radically different worldviews grounded in diverse material, cultural, and racial experiences.
To deny that something like this is happening, facilitated by the business models of digital platforms, their algorithmic nudges, and the filter bubbles that result from them, would be disingenuous. But the fragmentation of truth is only one, and perhaps not the most important, part of the story.
One unappreciated paradox of today's digital condition is that it celebrates post-truth and hyper-truth simultaneously. As narratives get fragmented, allowing competing truths to proliferate, there is also a concurrent effort to deploy bots, ledgers, and algorithms to produce a singular, objective, and eternal truth.
The first stage of this objectification began with Wikipedia. Although the platform could be used to provide multiple readings and interpretations of any subject or phenomenon, a decision was taken that a community of editors and writers, armed with trustworthy and reliable sources, would converge upon a single interpretation of history.
While the critics of Wikipedia zeroed in on the fact that it was, in a truly radical manner, democratizing the production of knowledge (everyone could contribute!), they missed a more fundamental, conservative side of the project: while many controversial topics featured lengthy and often bitter discussions among the editors, the front-end presentation often gave no explicit sign of internal disagreement. The controversy and disagreement were thus hidden from the average reader.
Instead, the proliferation of editorial and citation guidelines and regulations on Wikipedia ensured that those rules carried more weight in determining the content of a page than the information supplied by the very subject of the entry. Hence the many curious cases of people complaining that Wikipedia has wrong information about them but that they cannot change it, as they are not presumed to be authoritative sources about themselves. This adherence to rationality and rules is the truly modernist side of Wikipedia that has, so far, befuddled many of its observers.
The second stage of the objectification of narrative began with the rapid explosion of blockchain technology. It created the illusion that everything can be encoded in digits and eventually presented, in an unalterable manner, on the ledger: the final truth, set in stone, not to be altered by anyone.
Applied to the narrow world of commercial transactions or computer events, this assumption appears harmless. Applied, however, to more substantial matters (politics, the arts, journalism), this epistemology of the blockchain creates the rather perverse expectation that, unless and until something has been packaged in a blockchain-friendly way, it must be corrupted by subjectivity, venality, or bias. Subjectivity is the enemy; opacity is sinful.
In other words, we're starting to see an irony of the post-truth world: the democratization of knowledge has been matched by the intensification of the bureaucratic model. This time, however, the human side of bureaucracy is presented as archaic and uncool, to be replaced by objective algorithms and ledgers. The one true utopia of this mode of thinking, already glimpsed in places like Singapore or Estonia, is a fully automated bureaucratic system enforcing the rules with Prussian efficiency.
The digital culture that ensues makes for a very odd beast. Not surprisingly, it's conducive to the kind of cognitive dissonance feeding the alt-right. On the one hand, in a populist manner reminiscent of Wikipedia, it dispenses with expertise, as everyone is assumed to be equal to everyone else, much like the nodes on a blockchain network (another myth). On the other hand, it intensifies the modernist faith in rules and regulations and in the possibility of finding, by some quantitative means, the single truth, which can then be made available to all, without any intermediation by forces other than technology. If one had to come up with a label for this ideology, "populist modernism" would be quite appropriate.
The contradictions of such a bizarre ideological mix are quite apparent: in dispensing with the experts, it replaces them with faith in technology and progress. But since such accounts usually lack any meaningful discussion of the political economy of technology (let alone that of progress), they have nowhere to fall back upon to explain historical change. What, after all, drives and shapes all that technology around us?
In such accounts, technology is usually just a euphemism for a class of uber-human technologists and scientists who, in their spare time, are ostensibly saving the world, mostly by inventing new apps and products. The experts, thus, are brought in through the back door, but without any formal acknowledgement (or possibility of democratic contestation). These experts, whether Wikipedia editors or blockchain engineers, are presented as mere appendages to the sheer force of technology and progress, when in reality they're often its drivers.
This is hardly the sort of secure, reliable foundation on which democratic culture can flourish. It's one thing, in a typical postmodernist move, to celebrate situated knowledges and multiple epistemes, refuting any appeals to the one and only truth; a visit to a grad school seminar in the humanities will confirm that this kind of language is still very much alive in academia. It's quite another to do so while also building a system to algorithmically enforce the truth through the zealous application of bureaucratic rules and regulations that would make Otto von Bismarck look like a carefree bricoleur.
Facebook, which is built on the populist assumption that horizontal communication among users trumps vertical preaching by experts, exemplifies this dilemma: for all its populism, it now faces the unenviable task of using its algorithms to fight fake news. This, however, cannot be done without accepting the virtues of expertise and grounding one's approach in a singular, coherent worldview.
The problem with Facebook is that it doesn't even know that it has this problem: it will thus most likely continue its schizophrenic efforts of groping in the dark, erecting the sort of expert-led bureaucracy that it was supposed to demolish.
Nothing good will come of such efforts, but they do highlight a fundamental truth that we seem to have forgotten: both fake news and its opposite, the excessive quest for hyper-rationalization, are the consequences (not the causes!) of our problems. Postmodernism did not begin in Mark Zuckerberg's dorm room.
Evgeny Morozov is a Guardian US columnist