Not The But A Reality: Our Own Personal Truth

Arryk
3 min read · Apr 23, 2022

In Douglas Adams’s novel “Dirk Gently’s Holistic Detective Agency,” one plot point is a piece of software called “Reason.” Unlike traditional reasoning, in which you gather information and then draw a conclusion from it, this software lets you start with a conclusion (the example used in the story being “I want to invade Switzerland”) and then works backwards, finding reasons why your decision is a good idea.

In the real world, there is an online phenomenon called the social media bubble. Because people can curate their experiences on these platforms, two people using the same platform can reach the point where, for any given story, each opens their feed, hears about it, and comes away with a completely different understanding of how its events occurred and where the fault, the blame, and the explanation lie. Moreover, people in these bubbles likely wouldn’t even hear about most of the same stories in the first place; they would instead be presented stories that fit their preexisting worldviews.

The technology that enables this phenomenon is built on machine learning. The recommendation algorithms behind these platforms have one goal: keep people on the platform for as long as possible. As it happens, this often means showing people information that reinforces a particular worldview, whether it is the one the user arrived with or one the platform nudges them toward through recommended posts.
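To make the bubble mechanism concrete, here is a deliberately tiny simulation (all names and numbers are invented for illustration, not any real platform’s code). Each post has a one-dimensional “topic position,” the feed ranks posts by closeness to the user’s interest profile, and two users with different profiles end up seeing almost none of the same posts from the exact same pool:

```python
import random

def recommend(posts, profile, k=3):
    """Return the k posts predicted to be most engaging: closest to the profile."""
    return sorted(posts, key=lambda topic: abs(topic - profile))[:k]

def shared_fraction(profile_a, profile_b, rounds=20, seed=0):
    """Fraction of feed items two users see in common, drawing from one shared pool."""
    rng = random.Random(seed)
    overlap, total = 0, 0
    for _ in range(rounds):
        pool = [rng.random() for _ in range(50)]   # the same 50 candidate posts
        feed_a = set(recommend(pool, profile_a))
        feed_b = set(recommend(pool, profile_b))
        overlap += len(feed_a & feed_b)
        total += len(feed_a)
    return overlap / total

identical = shared_fraction(0.5, 0.5)   # same profile: identical feeds
diverged = shared_fraction(0.3, 0.7)    # modestly different profiles: near-disjoint feeds
```

Even a modest difference in starting profile (0.3 versus 0.7) is enough to make the two feeds almost completely disjoint, which is the bubble in miniature.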

With all this in mind, it would be difficult to describe the online reality people interact with as truthful. Instead, it seems as if everybody has their own truth, one crafted from a tapestry of different articles, reactions, impressions, and anecdotes presented to them: so many truths that the “real” one is buried in the mound.

The sad thing is, it’s hard to call any of these tailor-made truths “lesser.” Under the coherence theory of truth, truth isn’t an absolute value that can be determined for any statement in isolation; it arises when a set of statements cohere with one another. A statement becomes true when it is compatible with the others. This explains why the narratives fed through social media algorithms are so powerful: the user isn’t shown a screen telling them what to believe. They are put through a deluge of stories and statements that support an overarching idea, one they can arrive at at their own pace.
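That acceptance dynamic can be sketched in a few lines. In this toy model (the topics and the compatibility rule are invented for illustration), a statement “feels true” if it doesn’t contradict anything already believed, so a feed of mutually coherent statements builds a worldview in which a later contradicting fact simply bounces off:

```python
def coheres(statement, beliefs):
    """A statement feels true when it doesn't contradict any existing belief."""
    topic, claim = statement
    return beliefs.get(topic, claim) == claim

def consume_feed(feed, beliefs=None):
    """Accept each coherent statement into the belief set; reject the rest."""
    beliefs = dict(beliefs or {})
    accepted, rejected = [], []
    for statement in feed:
        if coheres(statement, beliefs):
            topic, claim = statement
            beliefs[topic] = claim
            accepted.append(statement)
        else:
            rejected.append(statement)
    return beliefs, accepted, rejected

feed = [
    ("planes_dangerous", True),     # first story sets the frame
    ("regulators_failing", True),   # coheres with it, so it's accepted too
    ("planes_dangerous", False),    # a contradicting fact arrives too late
]
beliefs, accepted, rejected = consume_feed(feed)
```

Order is everything here: whichever statement arrives first becomes the anchor that later statements must cohere with, which is why the same fact lands differently in different bubbles.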

So what then? Is humanity doomed to an infinitely fractured understanding of reality, everybody living in a personal AI-tailored world where all their biases are validated and all their experiences are predetermined? Probably.

Annoyingly, it is difficult to imagine technology getting us out of the hole that it in many ways dug. Even if you could write an algorithm that guaranteed every article recommended on Facebook, Instagram, YouTube, or anywhere else was factually true, that wouldn’t solve the issue. You would have to show not only articles that don’t lie, but articles in proportion to their importance to reality. To see why this matters, imagine a person shown only true stories, but shown every single story about a plane accident: they would very likely come to fear planes far more than is reasonable. And “importance to reality” is an entirely subjective value judgement. An algorithm would have to capture both factual and emotional truth.
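The plane example is really a base-rate problem, and a few invented numbers make it stark. In this sketch (the event counts are made up), every story in the feed is true, yet a selection rule that surfaces only crash stories leaves the reader with a perceived crash rate thousands of times higher than the actual one:

```python
# A made-up world: 100,000 flights, 10 of which crashed. Every story is true.
TRUE_EVENTS = {"safe_flight": 99_990, "plane_crash": 10}

def perceived_rate(events, selector):
    """Crash rate as it appears from whatever stories the feed chooses to show."""
    shown = {kind: count for kind, count in events.items() if selector(kind)}
    total_shown = sum(shown.values())
    return shown.get("plane_crash", 0) / total_shown

# The world as it is: crashes are one flight in ten thousand.
actual_rate = TRUE_EVENTS["plane_crash"] / sum(TRUE_EVENTS.values())

# A feed that only runs crash stories: every single story shown is a crash.
biased_rate = perceived_rate(TRUE_EVENTS, lambda kind: kind == "plane_crash")
```

Nothing shown to the reader was false; the distortion lives entirely in the selection, which is why a fact-checking algorithm alone can’t fix the feed.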

Maybe they can invent an AI which maximizes how nice we are to each other, so that all our hearts can be one.
