Char Sample, Michael J. Jensen, Keith Scott, John McAlaney, Steve Fitchpatrick, Amanda Brockinton, David Ormrod, and Amy Ormrod
The misleading and propagandistic tendencies in American news reporting have been a part of public discussion since the country's earliest days as a republic (Innis, 2007; Sheppard, 2007). “Fake news” is hardly new (McKernon, 1925), and the term has been applied to a variety of distinct phenomena ranging from satire to news that one simply finds disagreeable (Jankowski, 2018; Tandoc et al., 2018). However, the problem has become increasingly acute in recent years, with the Macquarie Dictionary declaring “fake news” the word of the year for 2016 (Lavoipierre, 2017). The international recognition of fake news as a problem (Pomerantsev and Weiss, 2014; Applebaum and Lucas, 2016) has led to a number of initiatives to mitigate perceived causes, with varying levels of success (Flanagin and Metzger, 2014; Horne and Adali, 2017; Sample et al., 2018). The inability to create a holistic solution continues to stymie researchers and other interested parties. A significant contributor to the problem is the interdisciplinary nature of digital deception. While technology enables the rapid and wide dissemination of digitally deceptive data, the design and consumption of that data rely on a mixture of psychology, sociology, political science, economics, linguistics, marketing, and fine arts. The authors of this effort discuss the history of deception, both old and new, from an interdisciplinary viewpoint, and then examine how various disciplines can aid in detecting and countering fake news narratives. A discussion of the main types of fake news (printed stories, staged events, altered photographs, and deep fakes) follows, covering the technologies currently used to identify each, the shortcomings of those technologies, and how insights from other disciplines can be incorporated to improve outcomes. A three-point evaluation model is introduced, focusing on contextual data evaluation, pattern spread, and archival analysis of both the author and the publication. While the model put forth cannot distinguish fact from fiction, the ability to measure distance from fact across various domains provides a starting point for evaluating the veracity of a news story.
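The abstract names the three evaluation points but does not specify how they are combined. As a minimal, purely illustrative sketch, one could treat the model's output as a weighted aggregate of three normalized component scores; the class, function, and weight choices below are assumptions for illustration, not the authors' method.

```python
from dataclasses import dataclass

# Hypothetical component scores, each normalized to [0, 1], where 0 means
# "consistent with established fact / typical behaviour" and 1 means
# "maximally anomalous". Names and weights are illustrative assumptions.
@dataclass
class StoryEvidence:
    contextual_score: float   # drift of the story's claims from corroborating context
    spread_score: float       # anomaly of its propagation pattern (e.g., bot-like bursts)
    archival_score: float     # departure from the author's and outlet's track record

def distance_from_fact(e: StoryEvidence,
                       weights: tuple = (0.4, 0.3, 0.3)) -> float:
    """Combine the three components into a single distance-from-fact estimate.

    The weighted average is a placeholder aggregation; the abstract does not
    commit to a particular scoring function.
    """
    w_ctx, w_spread, w_arch = weights
    total = w_ctx + w_spread + w_arch
    return (w_ctx * e.contextual_score
            + w_spread * e.spread_score
            + w_arch * e.archival_score) / total

if __name__ == "__main__":
    story = StoryEvidence(contextual_score=0.7, spread_score=0.9, archival_score=0.4)
    print(f"distance from fact: {distance_from_fact(story):.2f}")  # higher = less trustworthy
```

As the abstract notes, such a score does not label a story true or false; it only ranks how far a story sits from corroborated fact across the three evaluation domains.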