SCET will host the Innovation X Roundtable: Algorithms and AI for Media in the 21st Century on Oct. 21, 2021, to further discuss the topic of truth and disinformation online.
I was recently at the Advisory Board Meeting of the BI Business School in Norway. In this remarkably well-run meeting, we were asked to list new concerns as we emerge from the pandemic. One topic that surfaced was the disturbing trend of violence and social unrest that we have observed recently. Many recent articles have documented the same pattern.
While the increase of hate and violence is an alarming signal, it is actually the consequence of a larger issue, and it is not by itself the root cause of the global tensions today. The far bigger problem is that we may have lost the concept of truth and our ability to resolve truth as a society.
Currently, we are living in a world where silos of people hold vastly different views of what is real or true. These factions now live fully in their own social media echo-chambers, increasingly armed with conflicting facts and differing viewpoints of history. The trend can only result in even more emotionally charged conflicts and a sequence of even larger problems.
Many will argue that we may never have had access to truth and that our beliefs were always shaped by government and society. That is a valid point to consider. However, as a population, we certainly don't have consensus on what is true today.
The issue of truth is also central and practical in our work and lives. It has implications for the future of news and media, which now compete indirectly with Facebook and other social media firms. It also has implications for educators, policymakers, and in fact for all of us as consumers of information.
How did it get to this point?
Point #1: Truth today is largely decided by algorithms which do not actually evaluate any facts directly. For example, if a large number of people post and link to something absurdly false on the Internet, then Google's PageRank algorithm is more likely to raise that false result to the top of any related web search. In this model, the truth is not a function of the facts, but of the loudest voices in the population. To be clear, this is not how we used to determine truth. In the past, we had knowledgeable, trusted scholars or community members who would review details, discuss, and then communicate their findings. In the new democratized truth model, a scientific fact is just one possible option for truth, to be compared with many conspiracy-oriented alternatives.
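The point above can be made concrete with a toy sketch. Below is a minimal power-iteration version of the PageRank idea (the link graph and page labels are hypothetical illustrations, not real data): a false claim that three pages link to ends up ranked above a factual page that only one page links to, because the algorithm scores link popularity, not accuracy.

```python
import numpy as np

# Hypothetical toy link graph: pages 0-2 all link to page 3 (a false
# claim), while page 4 (a factual source) receives only one link.
links = {0: [3], 1: [3], 2: [3, 4], 3: [0], 4: [3]}
n = 5
damping = 0.85  # standard PageRank damping factor

# Build a column-stochastic transition matrix from the link graph.
M = np.zeros((n, n))
for src, dsts in links.items():
    for dst in dsts:
        M[dst, src] = 1.0 / len(dsts)

# Power iteration: repeatedly redistribute rank along links.
rank = np.full(n, 1.0 / n)
for _ in range(100):
    rank = (1 - damping) / n + damping * (M @ rank)

# The heavily linked false page (3) outranks the factual page (4).
print(rank.argmax())  # -> 3
```

Nothing in the computation ever inspects what the pages say; swapping the true and false labels would change nothing about the ranking.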
Point #2: Social media has further amplified this problem. It is ironic: when the Internet was being developed, we believed that it would bring the world together by offering common information and education. In reality, it has allowed people to spend their time talking not with their neighbors, but only within specialized communities. While this is great for a group of experts that cannot meet in person due to distance, it is bad when people are in groups with narrow, homogenous, and often distorted views of the world. From the illusory truth effect (first documented in 1977 by researchers at Villanova and Temple universities), we know that repetition of information from familiar people reinforces its perceived validity. Roughly stated, your beliefs are the average of the five people or sources you interact with the most.
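That "average of your five closest sources" claim can be illustrated with a small simulation. The sketch below is a hypothetical DeGroot-style averaging model (the cluster sizes and initial opinions are invented for illustration): when two communities only listen to themselves, repeated averaging drives each silo to its own internal consensus, with no shared truth between them.

```python
import numpy as np

# Hypothetical two echo chambers: people 0-2 and people 3-5, where each
# person listens only to members of their own cluster.
W = np.zeros((6, 6))
W[:3, :3] = 1.0 / 3   # cluster A averages only within itself
W[3:, 3:] = 1.0 / 3   # cluster B likewise

# Initial opinions on some claim, on a 0-1 scale (illustrative values).
beliefs = np.array([0.9, 0.8, 1.0, 0.1, 0.0, 0.2])

# DeGroot dynamics: each person repeatedly adopts the mean of their circle.
for _ in range(50):
    beliefs = W @ beliefs

# Each silo converges to its own consensus; the gap between them persists.
print(beliefs.round(2))  # -> [0.9 0.9 0.9 0.1 0.1 0.1]
```

If even a single cross-cluster link were added to `W`, the two groups would slowly drift toward a common value, which is exactly the mixing that echo-chambers prevent.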
Why This Matters
Truth is fundamental to trust, and the loss of both results in widespread conflict. When we do not have the ability to discern truth, we cannot trust recordings of history, and we end up suffering from a widespread case of collective amnesia. Factions armed with conflicting facts and differing viewpoints of history can only produce an even greater level of emotionally charged conflict and a sequence of even larger problems.
Truth and trust are not only necessary to avoid conflict; they are also necessary for innovation. The intensity of innovation has been shown to depend on both trust and diversity of thought. So as a whole, the current situation is leading to conflict as well as a breakdown in progress. The loss of truth is actually the biggest global sustainability challenge that we face today.
Is there anything we can do?
For sure, this is a topic that we all need to discuss, understand, and work to resolve. This is likely the most important issue of our time. In the case of the Internet, we may have created a technology that affects each of us so deeply that we all are collectively afflicted by it.
In the discussion that I have observed so far, I have noticed these themes:
- Should we be teaching our students to defend themselves from the echo-chambers that we are all subject to in our environment (and to realize that what we think is true may actually not be true)?
- Should we focus on the opportunity to fix the technology? Should algorithms be assessed for the global damage they do? Do we need a new class of technology or AI that is more sophisticated than the current version, which only amplifies people's biases? Just as running water became more valuable once lead pipes were replaced with copper, the technology's value only grows once it is also safe to consume.
- Should we be fixing this problem with policy that regulates the effects of the social networks and their corresponding technologies? Is it possible for those in their current echo-chamber to step out long enough and think clearly enough to be able to devise a better path forward? Per the analogy, if you have been drinking water from lead pipes, are you still able to function well enough to see the problem and then invent a better pipe?
We have a myriad of global challenges ranging from climate change to world hunger, but no challenge is more limiting to our global sustainability than the loss of truth driven by social network echo-chambers and the sub-standard AI that we currently use to differentiate fact from fiction.