Last week, Facebook released several tips on how to identify false and misleading news stories along with new tools users can use to report whether a news source is fake or not. This is part of Facebook’s post-election awakening as the company has come under fire for the proliferation of fake news on its platform.
Yet a glance at these tips doesn’t reveal any real understanding of what the “fake news” phenomenon is. The first tip: be skeptical of headlines, as they are often catchy and misleading, doubly so when they appear in all caps. But what makes a headline catchy, and how do I know when it is misleading? Other tips caution users to evaluate the sources and the evidence. Does the author have a good reputation? Is the publisher trustworthy? Are the studies accurate? But, as media scholar danah boyd points out, there is no shared definition of what constitutes a trustworthy source. It’s entirely possible for two people to investigate the same news source and come to different conclusions about its trustworthiness.
Instead, the tips rely upon vague definitions of trust, truth, and accuracy that shift responsibility away from the platform and onto the user. These definitions allow the user to use preconceived ideas of what constitutes a trustworthy news source regardless of whether the source is actually accurate. Despite the fact that Facebook users will have different levels of media literacy, trust in various authorities, and ideological commitments, the vagueness of the tips relies upon the assumption that we all make sense of information in the same way and that our definition of truth is universally shared.
Yet, research shows that people of all backgrounds tend to believe information that confirms their existing understandings. Further, when presented with information that contradicts their beliefs, the commitment to existing beliefs is often intensified. This doesn’t mean that people are stupid or lack critical faculties. It means that information processing and the formation of belief is based upon more than “truth” and “rational” evaluation. Appealing only to truth and accuracy does nothing to change the underlying structures that shape how we define these terms in the first place.
What the tips ignore is that many of the characteristics of fake news are also characteristics of mainstream news sources. Trustworthy sources also rely on shocking headlines, unnamed sources, and misleading photos and graphs. In addition, the modular nature of social media platforms means that fake and true news sources alike rely upon the same social forces and infrastructure to spread their content. From Facebook’s perspective, any news is revenue-generating, shareable content. From a news source’s perspective, there is no difference between a share or click from someone who believes a story and one from someone who rejects it: all that matters is going viral.
Further, the emphasis on the individual user as the arbiter of truth aligns smoothly with the American obsession with personal responsibility and the bootstrap myth. Users can flag news stories as false with new reporting tools, and enough red flags may trigger third-party evaluators to check the accuracy of a story. But what is to stop people from distrusting the third-party evaluations themselves? And if someone is misled by a news story, is it their own fault for being misled? If we know that different people have different, and sometimes conflicting, definitions of what constitutes trustworthy news, how can we expect this process to produce a shared consensus of meaning?
The problem of fake news is larger than Facebook. To place the responsibility for this problem on Facebook is not only to misunderstand the deeper issue but to grant even more power to a corporation that already has an incredible amount of influence. The company has already demonstrated that it is willing to play with conventions of truth and shareability when it comes to its community standards and its expansions into new markets. The problem is rooted in the deeper sociocultural processes through which we determine what truth and trustworthiness mean. It will take more than Facebook to solve this problem, and I believe we should scrutinize the role social media companies want to play in addressing it.