In today's chaos of information and misinformation, finding moral reference points can be quite difficult. Everyone wants to get along with their friends, so if one says something that sounds a bit morally awkward, we might shrug it off. I believe some moral premises should be regarded as red flags that warrant further discussion. For instance: the zero-sum game.
A zero-sum game is a model of interaction in which any participant's gains are balanced out by another's losses. In other words: there is no scenario in which all players benefit. In order for one to gain, another must lose. As a moral premise, it is necessarily malicious.
As such, I think we should listen for it, especially in social or political discourse.
Listen for it in ideas like:
“I wish we didn’t have to keep bombing the Middle East to stay safe.”
A much more insidious place this premise hides is in variants of “If we don’t hurt them, they will hurt us,” spoken by those who aren’t facing a clear and present danger. Softer variants include “if I’m not comfortable, they don’t deserve to be comfortable” and “if I don’t feel safe, they don’t deserve to feel safe.” In application, the context can range from warzones to courtrooms to interpersonal relationships.
This is not to say that people who voice these ideas are evil, but the ideas should be elaborated on and scrutinized. Often this sort of rhetoric is used to support causes that, on the whole, have been good for the world. Despite this, I believe that actions carried out on the basis of these ideas tend to do more harm than good. Do not let friends and loved ones act on the zero-sum game unchallenged.
I think political discourse among those who disagree is much more fruitful when it addresses moral premises rather than debating the details of recent events.
No amount of research or raw information will affect the actions of the politically polarized, be they Congresspeople or new voters. We have to engage morally, because that’s where the divides must be bridged.
The time for abstract armchair philosophy and detached hypothetical scenarios is over, at least for me and mine. Now is the time to engage and act on however you think you can contribute to your idea of a better world. Find where we agree.
Violent conflicts may be an inevitable feature of human nature, but indifference is a moral blight.
Humanity’s relationship with the basic concept of information has undergone very rapid, jarring changes over the course of history. The first great leap is attributed to the printing press in the mid-15th century. Nate Silver’s book, The Signal and the Noise, does an excellent job of describing its impact on civilization, an impact I think should be closer to the forefront of everyone’s conscious mind as we live through these chaotic times.
The original revolution in information technology came not with the microchip, but with the printing press. Johannes Gutenberg’s invention in 1440 made information available to the masses, and the explosion of ideas it produced had unintended consequences and unpredictable effects. It was a spark for the Industrial Revolution in 1775, a tipping point in which civilization suddenly went from having made almost no scientific or economic progress for most of its existence to the exponential rates of growth and change that are familiar to us today. It set in motion the events that would produce the European Enlightenment and the founding of the American Republic.
But the printing press would first produce something else: hundreds of years of holy war. As mankind came to believe it could predict its fate and choose its destiny, the bloodiest epoch in human history followed.
With the printing press, one didn’t necessarily need to be in good standing with the church to create and distribute books with ideas that may have conflicted with religious doctrine. While this also spawned the European Renaissance, the chaos that ensued should not be forgotten.
If it’s not obvious why I brought this up: the internet is causing another information revolution. Before the printing press, the basic mechanisms of civilizations were designed around people having no more knowledge than what could be remembered offhand. If few people are literate, how can one distinguish a written lie from a truth?
Today, we are entering another stage of this problem. In the modern world, information has been ubiquitous for centuries. Every public school has a library filled with books that children are taught to reference whenever they seek more information to absorb and convert into knowledge that can be shared or applied indefinitely. In fact, for centuries, simply holding certain stores of knowledge in one’s head well enough to regurgitate on command was a viable skill; now, even that has changed.
Now, we have the same problem with information as the 15th–18th centuries had: more of it than we know what to do with. Academic institutions still require students to memorize things despite ubiquitous tools that store and regurgitate information for us. Bureaucratic institutions take several times longer to process physical forms than it would take to Google everything about the process and perhaps even design a better one.
In homes, parents don’t know what to tell their children about surfing the internet safely because the internet is so fundamentally different from what it was just ten years ago. There’s no institutional knowledge about how tablets affect toddlers or at what age children are liable to wander into the darker alleys of the internet… or whether that’s even correlated with age. What content is good or bad for them? Is that even possible to measure? I remember a time before the internet when ideas backed by information were stronger than ideas that weren’t. Now, every idea can easily find information to support it, but not all information is equal. Therein lies the problem.
Our ability to distinguish valuable, relevant information from noise has not grown proportionally with our access to it. With books, humanity eventually developed filtration methods, primarily in the form of literacy and critical thinking. Only rigorously refined ideas warranted the effort to be studied, reproduced, and incorporated into reference texts and archives. Schoolchildren, and especially college students, are carefully taught to distinguish good sources from bad and to cite their assertions.
We don’t quite have that yet for the internet. Digital activity is monetized by clicks, and measured by attention (time spent viewing content), so the economic incentive of any web-based company is to present you, the digital denizen, with information that you react to, which is probably what you like.
If you read something in a newspaper, heard it on the radio, or even saw it on TV, describing it to a friend would require some degree of processing and mental digestion, during which many baseless ideas might get filtered out. Now, with the touch of a screen or click of a mouse, any headline that inspires you to share, even for a split second, can within seconds be presented to hundreds or thousands of others, many of whom might have a similar reaction and continue the chain. I’ve fallen victim to this mentality many times, and I am thankful for friends who hold me accountable and prevent me from harboring false ideas.
I implore you to ask: What is your standard for truth, specifically in the realm of politics? Whom do you believe, and why? I don’t have a clear answer myself, but these are questions I want to think about and discuss with as many people as possible. Given recent events, I think it’s evident that the average American’s understanding of the political climate is, at best, guesswork. That might be fine if democracy didn’t depend on us understanding one another. But it does. We’re all on this rock hurtling through space together; let’s at least try to get along, however bleak the task may seem.
And so, I’m starting this series, primarily as my own outlet for meditation on the goings-on of the world.