Only two things are infinite, the universe and human stupidity, and I'm not sure about the former.
~ Albert Einstein
Stupidity and intelligence work on a sliding scale: each of us has some degree of both. The most fascinating aspect of this simple truth is that we can slide the scale in whichever direction we choose.
Just as emotional intelligence can be increased, actual intelligence can also be increased.
A recent study seems to offer fairly conclusive evidence. Dr. Adrian Owen explains:
When we looked at the data, the bottom line is the whole concept of IQ — or of you having a higher IQ than me — is a myth.
The concept is astonishing, but what are we supposed to do with that information?
For starters, make yourself smarter.
There are (at least) two approaches to this: make yourself smarter in a particular area, or make yourself smarter in general.
The first option is quite a liberating one. How many of us do what we dreamt of doing as kids? Too few of us, certainly. I wonder, then, why that is the case.
Most of us are familiar with the 10,000 hours concept: that it takes 10,000 hours of rigorous activity in a given field to become an expert. Prodigies are the exception, of course, but they are just that: an exception. The vast majority of us simply believe ourselves to be of inferior talent or smarts. The only difference, though, between amateurs and experts (allowing exceptions for the prodigies) is the time applied, the hard work.
When put in this light, it becomes obvious that intelligence itself is malleable. Great men and women, be they thinkers, painters, businessmen, or rodeo clowns, all began with a simple curiosity that pushed them to begin. They acquired some small sliver of knowledge, a foundation, then they slowly built on that knowledge, piece by piece, until they had put in enough time to be considered experts. Even when that point is reached, you’ll notice, few acknowledge it: they cannot shake their curiosity long enough to stop building their dreams.
That is option one: to find something you love, and make yourself better at doing it. The second option is to increase your general intelligence with no specific aim in mind. Read a great book. Discuss something with someone smarter than you. Travel to learn about a new culture. Or simply open your laptop. There’s more there than cat videos and status updates in that box of transistors; there are free courses at Khan Academy and Coursera, there are brilliant articles, there are actual people on the other side of the world with whom you can converse.
If you want to be more intelligent, pick up an easy book. Then pick up a harder one. Then a harder one. You’ll get to where you want to go. You won’t get there if you keep watching Honey Boo-Boo.
So, those are your two options: general or specific knowledge (option three would seem to be some form of brain training, but the study mentioned above debunks that sort of training as a myth).
You do not get the option of taking neither approach, of doing nothing. That runs contrary to the opinion that we should all be free to be as stupid as we want, that somehow liberty entails the right to do nothing.
It does not. Liberty is not an absolute. The right to liberty ends when it infringes on the rights of others.
Here’s the thing: your ignorance on many a subject is detrimental to others. That very fact puts the onus on you to learn more than you do right now. Here’s an example.
I was as emotional as anyone else in this country on Saturday morning; so much so, in fact, that I did something very out of character: I ranted. On Facebook, of all places. I wrote in the midst of a surge of anger which would not be contained.
That, of course, was a mistake, yet, instead of regretting the decision, I chose to think about why I was angry enough to do something so out of character. It didn’t take long for the answer to become clear.
I was mind-bogglingly infuriated that someone would take the lives of innocent children, true, but it was the knowledge that this didn’t have to happen — that sheer stupidity had allowed this to happen — that drove me over the edge.
This was an anger that had been building for some time, in response to a number of issues politicized in the past few years. To explain:
There are some issues in this country that are genuinely a matter of principle. The definition of freedom can be debated. So, too, can spending vs. saving, the role of taxes and of higher education, the role of government.
Some things, however, are not debatable: namely, those things for which science, also known as reality, is in direct contradiction of a particular stance.
Creationism, for example, is false, period. It simply does not mesh with the evidence.
Likewise, global warming is happening, whether you like it or not.
Next up (surely you saw where this was going) is gun control. Not only is there no evidence that more guns make a society safer, there is direct and overwhelming evidence to the contrary.
Some would say that there is no harm in allowing someone their delusions, and in theory, that is true. The problem occurs when those delusions permeate the reality of others. There is no harm in a person believing that the world was made in six days. When that belief graduates to trying to teach my daughter, in a public school, that science is wrong, we have a problem.
Similarly, when the scientific world proves that fewer guns always mean fewer deaths, you don’t have a right to dispute that (other sociologists, etc., do, but they have facts and data to argue with). When lives are at stake, you don’t get to ignore the evidence to protect your delusions. (If you’d like to dig deeper into the subject, Jason Kottke has dedicated his site to it of late, or you can peruse this fantastic list of articles.)
I realize that I am blurring the lines a bit between stupidity and ignorance. Ignorance is the absence of knowledge. Stupidity is the blatant disregard of it, and it is that disregard that I’m addressing. Ignorance is forgivable. Stupidity is not.
It is ignorance that the age we’re living in best addresses. It’s almost impossible to be ignorant of such pervasive issues in the age of the internet. The information is available to us, and, by and large, we’re aware that it’s there; we don’t even have to leave our front doors to get to it.
The internet provides us with a foundational framework with which to combat stupidity in a number of ways, the most undeniable of which is collective intelligence: the phenomenon by which we can store information in an area outside of our brains (the web), so long as that information is easily accessible. Simply put, if we can Google something, our brains do not need to retain that information. The result is that we are able to tap into the collective intelligence of the entire web. We are all smarter by association. To ignore the intelligence that exists at our fingertips is to do so not only at our own peril, but at the peril of others as well.
There is a small comfort in knowing that, in the history of our species, science and reason, thus far, have always won out. But it is indeed a small comfort, because the triumph can never come soon enough when reason’s opponents inflict so much suffering on others.
The world is too rich, too full of the beautiful, both known and (as yet) inexplicable, for us to cause needless suffering. And since we are each solely responsible for our own intelligence, we are also responsible for the damage our ignorance inflicts. Sometimes the damage is minimal: perhaps we’re accountants instead of the astronauts we wanted to be. Sometimes, the damage is greater. Sometimes, children die.
All this is not to say that some are stupid and some are not; just as intelligence works on a sliding scale, it is also issue-specific. A rocket scientist can be quite the ignoramus when asked about psychology. There are a great many things about which I am embarrassingly lacking in knowledge, and it is for that very reason that, when someone corrects or enlightens me on a matter, I am grateful. They have given me a gift, a gem of education to be added to the treasure chest of knowledge I’ve spent my life filling.
I’m also susceptible to the very human tendency to see that gem as a lump of coal if it is delivered as an insult or a personal attack and to discard that gem accordingly, so we must take great care to deliver our gems as the precious stones that they are. After all, if our intelligence is in our own hands, it must be properly cared for.
There has been a brilliant — and necessary — discussion springing forth from Matt Haughey’s Why I love Twitter and Barely Tolerate Facebook. It has elicited a wide range of responses.
After all, if any had doubts about the profound effect of social media on our culture, those doubts should be dissipated by now; the effect is real. The next phase is to attempt to understand it. To that end, let’s continue the conversation.
Rian van der Merwe pins down the crux of the problem in Facebook and the Imperfect Past: that we are torn as to how to approach social media.
From the beginning (of Twitter, of Facebook), we’ve endeavored to put our best foot forward online. When we open that little box, when we’re presented with that blinking cursor asking us “What’s happening?”, our first instinct is to present the very best version of ourselves.
Look at me! Look how wonderful everything is in my world!
That predisposition is a natural one. Were it not inherent in human nature, status symbols would not exist. No one buys a BMW for the experience. BMWs sell because they proclaim to the world that the owner is someone to be envied. He has arrived, world: take note.
I’ve been as guilty as anyone in presenting this polished version of myself, and my motivation, I suspect, is not uncommon: we want to provide those who are interested enough to follow us with some value. We don’t want to drag them down; we want to uplift.
More, we present this ideal of ourselves as something to aspire to. Having spent years in sales, I’m quite aware of the power of creating a unified image of your goals and positioning that image so that it stares you in the face, saying “If you want this life, come and get it.” It works, and, incidentally, that tendency may even be an adaptive advantage.
A shift is underway, though: a potentially transformative way to approach social media — indeed, our entire online identity — that I like to call the new normal.
Rian explains it well:
I’m slowly coming around to the idea that if we’re going to embrace public living (in the form of social networks) at all, we should either go all in with the full spectrum of our emotions, or rather not bother. Because if we only share a small, perfect sliver of our lives, we start to create unrealistic expectations for ourselves, and the people who know us.
JD Bentley expands on that thought in The Imperfect Past:
What I find most intolerable about social media—the ability to “only share a small, perfect sliver”—is what the most ardent social media users seem to like best.
With a few clicks and keystrokes, a perfectly boring, mediocre and unenjoyed life can look downright meaningful.
The word I latched onto here is ‘meaningful.’ We present this meaning as a form of value. Our tweets, our status updates are only valuable insofar as they provide some sort of meaning to our followers. Those of us who don’t post pictures of cats stuck in a box or gifs of laughing babies post things because of their perceived value.
Here is the conundrum, then: how do we present the entire spectrum of our lives, our selves, while still providing value?
The answer begins in the difference between Facebook’s Timeline and Twitter’s ‘now, now, now’ approach. I’m not a Facebook fan, by and large, but I have to concede that the Timeline feature is somewhat brilliant. It is perhaps the greatest example of doing something with our updates, our data, something Frank Chimero expanded upon in The Anthologists:
The desire to archive things for posterity is the itch that makes yearbooks and timelines feel necessary. We create edges and impose order on documentation to help us understand time, experiences, and ideas.
We create edges to contain our data, to tell a story, and it is the story that can create the value derived from the full spectrum of our lives.
Rian touches on the story we tell through our updates:
Obviously Facebook only tells the story it knows, and most of the time it only knows about your happy times. What we sometimes forget is that it’s conflict that makes the story of our life interesting.
He goes on to quote Donald Miller in an excerpt from A Million Miles in a Thousand Years:
When we watch the news [and stories about violence come on], we grieve all of this, but when we go to the movies, we want more of it. Somehow we realize that great stories are told in conflict, but we are unwilling to embrace the potential greatness of the story we are in.
The story is how we create value, and a good story must necessarily contain equal parts light and darkness. Steven Pressfield explains:
The antagonist is the dark side of the protagonist.
A story succeeds precisely because of the presence of darkness, of struggle, because, as Pressfield puts it, “these stories provide us with models for dealing with adversity.”
We can create value by linking to thought-provoking articles, by quoting the occasional inspirational thinker, by spouting off a snarky comment about the latest story to break.
But we can also create value by telling our story. The hopes, the dreams, but also the heartache, the struggle. The world in which we live, like a good story, contains equal parts light and darkness. We’d do well to present each to the world, to better mirror the intricacies of the offline in the online.
I’m not convinced, though, that that reflection should be a perfect one. JD disagrees:
People like me would prefer that the online world was a better reflection of the offline world. People who drive social media prefer the opposite. Social media is a platform for publishing exaggerated accounts that support one’s values. It’s not a collection of facts as much as it’s a collection of persuasive arguments for particular worldviews.
To hope for a better reflection seems necessary, but it can only go so far: no one wants to know what I had for lunch, or that I just stubbed my toe. We can and should be true to our story, but we must also respect the time of our followers, friends, and colleagues. If we fail to do so, they, like us, face the prospect of an endless stream of banalities.
JD gets that, of course. Lamenting the data-mining motivation behind the Timeline, he goes on:
So, on the one side, you have a company that cares about telling a profitable story. On the other side, people who care about telling a good story.
Social media isn’t interested in telling the whole of the truth.
Leaving aside, for now, the motivation behind the platforms (I think we need to delve much further into that discussion, but that’s another post), a couple of points can be made.
Firstly, those people who care about telling a good story are those who will drive the future of social media. As early adopters (read: geeks) are those who now drive, to a large extent, the future of our culture (what is geek culture today is mainstream culture tomorrow), so the passionate voices in the online world have the power to shape the direction of that world.
That’s us. We’re the passionate ones, as evidenced by the fact that we’re having this discussion, that you’re reading this piece.
On the other hand, there will always be the others. There will always exist those for whom banality is the mainstay of their existence: just take a look at the TV ratings for Honey Boo-Boo or the prevalence of celebrity gossip.
We have to be okay with those people living out their existence in the same space as we do. The solution is simple: unfriend, unfollow.
Then move on to witnessing the stories of those who actually give a damn.
Recently, I’ve come to a crossroads in my writing: I’ve written many essays over the past year or so, but I’ve reached a point at which the things I want to say are too large, too abstract, or too fundamental to properly convey in an essay. The things I want to say are those types of truths that can only be properly told through a good story.
Socrates and Jesus told parables for a reason: they had a greater impact than any other form. We are creatures of story, in fact: it is the thing that our entire world consists of. We tell stories to our children and grandchildren to teach them of our own past. We read newspapers and magazines for their stories of success or failure: the poverty-stricken rise to success through ingenuity; a celebrity meeting his demise at the hands of an ill-advised bender.
As I made the transition to telling stories, I began to think more and more about subjectivity. Even now, as I write this, I struggle with the distinction: do I outline this piece, form a coherent structure, then fill in the blanks? Or do I write it as it comes, so that it becomes a mirror for my stream of consciousness? One implies objectivity by mimicking a sort of journalistic narrative; the other implies subjectivity by letting the thoughts in my head spill out onto the page.
In the end, I’ve chosen a combination of the two.
Let me give you an example of the distinction we’re discussing.
To my mind, there are two titans of Russian literature who perfectly epitomize the effects of subjectivity and objectivity.
Tolstoy was a master of objectivity, if not the master. Take this excerpt from War and Peace:
For him, it was no new conviction that his presence at all ends of the world, from Africa to the steppes of Muscovy, struck people in the same way and threw them into the madness of self-oblivion. He ordered his horse brought and rode to his camp.
This is a remarkably poignant observation, made all the more effective by its striking tone of objectivity. The subject of the paragraph is Napoleon, and Tolstoy gives us extraordinary insight into his psyche by remaining detached. “Threw them into the madness of self-oblivion” is particularly effective. The phrase conjures such a wide array of emotion and thought in the reader in so few words. Tolstoy could describe the “madness of self-oblivion,” but he doesn’t, instead opting to allow room for the reader’s thoughts to echo, enhancing the effect. By remaining objective, he allows the reader to transfer something of himself onto the text. The last sentence tells us of the sociopathic demeanor of Napoleon: though he causes this absurdly strong reaction in people, he himself couldn’t care less; he simply goes about his business.
Tolstoy gives us the facts, allowing us to draw our own conclusions from them.
Dispassionate objectivity is itself a passion, for the real and for the truth.
~ Abraham Maslow
Now, take a passage from Dostoyevsky’s Crime and Punishment:
Coughing stopped her breath, but the tongue-lashing had its effect. Obviously, Katerina Ivanovna even inspired some fear; the tenants, one by one, squeezed through the back door, with that strange feeling of inner satisfaction which can always be observed, even in those who are near and dear, when a sudden disaster befalls their neighbor, and which is to be found in all men, without exception, however sincere their feelings of sympathy and commiseration.
Here, the subjectivity leaps from the page. Dostoyevsky is describing a scene which may or may not be interesting in and of itself, but it is the imposition of his own beliefs onto the scene which lends the passage its depth of emotion. The simple scene of tenants leaving a room becomes quite powerful when we think of the opinion that all men, every single one, feel a sense of satisfaction at the misfortune of others. In this case, the misfortune is death, and it’s appalling to think of the tenants squeezing out of the house with smug smirks of satisfaction on their faces. But once we are sufficiently appalled, Dostoyevsky then reveals to us that we, the reader, should also be appalled at ourselves, because these feelings of satisfaction are present in “all men, without exception,” you, dear reader, included.
The focus of subjectivity is a distorting mirror.
~ Hans-Georg Gadamer
Now, it’s quite easy to contrast subjectivity and objectivity in terms of dead Russian writers, but we’re interested in how it affects our writing.
Suppose I’m writing a piece for Sssimpli, a profile of a new startup. I want to give the reader the facts: this is what the startup does, this is the problem it solves, these are the devices it works on, etc. There is an objective element.
I don’t want to remain completely objective, though. The things I write about on Sssimpli (and indeed everywhere) are things that I feel strongly about. I’m excited about this particular startup, and I want to convey that excitement to the reader. I can only do that (or, rather, I can do it much more effectively) with subjectivity.
A similar chord is struck on link blogs: take a couple of examples from Rian van der Merwe’s homepage. One recent entry points to a New York Times op-ed: Rian offers a simple opening statement, a quote from the piece, then ends with: “I don’t want to spoil it, so I’ll just link there quietly.” He offers no insight, no dissection. He simply lets the piece speak for itself. He recognizes the impact that objectivity can have here.
In another piece, Rian recognizes the value that some personal insights can offer, so he breaks down some of the recent decisions at Twitter, interjecting his own subjective thoughts into their design process. Had he stuck to the facts, the piece would not be nearly as useful.
This is the line we must walk, and I’m convinced that it is not only a remarkably difficult line to walk, but also the litmus test of a true writer: he who masters the use of these two extremes is a craftsman. Whether the text requires complete subjectivity, complete objectivity, or a deft mixture of the two, that mastery is what transforms a mediocre read into words that find a home in our soul, never to let go.