Can Knowledge Bring About a Mystical Experience?

It is a truth nearly universally recognized that you cannot become a superb mechanic unless — as a minimum requirement — you ~~have a girlie calendar in your garage~~ first acquire a great deal of knowledge.  Moreover, that knowledge needs to be of two sorts.

In no particular order, you must both learn the sort of stuff you can learn from instructors and books, and you must also learn the sort of stuff you can only learn hands-on, by doing something — sometimes over and over again.  The ancient Greeks called the first kind of knowledge logos, and the second kind, gnosis.  We tend to call them “knowledge” and “know-how”.

The question arises, however, whether you need one, the other, or both kinds of knowledge in order to become a superb mystic.

Put differently, must you be of any particular religious tradition or school in order to attain to a mystical experience, the sine qua non of which is a perception of all things being in some sense or way one?  Must you follow any specific teachings or practices?  And do teachings and practices even help?  Or are they really mere ways of idling away the hours before ~~your boy or girl calendar arrives in the mail~~ you attain to a mystical experience?

I suspect many good folks would say “yes” to most of those questions.  Perhaps it is even a cultural assumption — at least in the West — that knowledge is key here.  Especially, book or scriptural knowledge, because we in the West are so accustomed to seeing the bible as key to our spiritual development.  So if we start pursuing a mystical path, we naturally think in terms of how learning all the right things will help us.

But is that assumption borne out by the evidence?  I actually think not.  At least, not nearly so much as we might assume.

For the past forty years, I’ve been an amateur collector of mystical experiences.  When I guess someone might have had one, I prompt them a bit to tell me about it, and they sometimes do.  And something I’ve noticed: a fair number of people have been stumped as to what caused the experience, for they can recall having done nothing to bring it about.

Again, a while back, a group of researchers solicited over 2000 accounts of mystical experiences and found that about 20% of them came from nontheists — atheists and agnostics.  Presumably, many of those fine folks had done nothing intentionally to bring about their experiences, although some might have.

It seems to me likely then that any kind of knowledge might not be as key to having a mystical experience as we might at first suppose.  But if that’s true, then it raises a fascinating question.

What, if anything, does the fact knowledge has relatively little to do with mystical experiences tell us about the experiences themselves?

I think it underscores how fundamentally mystical experiences are shifts in perception, rather than gestalt-like experiences.  By “gestalt” I mean an event in which what you know “comes together at once” to create a new understanding of something.

Of course, that is sometimes called “a new perception” of something, but that’s not what I mean by perception here.  “Perception” in this context is the frame of our sensory fields.  For instance, we see, taste, touch, and feel thirst.  Each of those is a sensory field.  Perception frames them in the sense that it provides or adds to them certain characteristics that are not the things we see, taste, etc.

An example would be our sense that what we are seeing at the moment is real.  That perception of realness is not a property of the thing itself, but rather a property or characteristic of our sensory field of sight.  Again, we perceive things as being us or not us.  That once more is a characteristic of our perceptual frames.

A mystical experience can be seen as an abrupt shift from how we normally perceive things to a different way of perceiving things.   One of the frames that is lost during that shift is our perception of the world as divided between us and not us.

So now we might ask: Do we need to know much for such a shift to come about?

I think not.  For one thing, the neural sciences have now revealed that mystical experiences involve at least a reduction in activity in the brain’s parietal lobe, most likely an increase in activity in the thalamus, and perhaps a change in dopamine levels, among other things.  Those are things that could be influenced by some kind of conscious or subconscious knowledge, but they don’t necessarily need to be.

Again, there are no techniques for bringing about a mystical experience that guarantee you will have one, but Eastern meditative techniques have been relatively successful.  These, however, do not generally depend on more than the knowledge of how to practice them.

Last, some traditions, such as Zen, are full of stories of people who attain to mystical experiences without need of scriptures, or much need of instruction.  So I think it’s pretty clear that, for those and other reasons, knowledge is not much use in bringing about a mystical experience.

Jiddu Krishnamurti, by the way, was adamant on that point.  He routinely went even further to state that too much knowledge hindered or prevented mystical experiences.  One of his key points was that, if you set off in pursuit of a specific experience, you were likely to find it sooner or later, but it would turn out to be a construct of your knowledge.

So there might be a way in which knowledge is not only unnecessary, but actually detrimental.  But where does that leave us?

In a word, calendars!  Krishnamurti had a beautiful metaphor.  Imagine you are inside your house (i.e. your mind) and wish for a breeze to come in (i.e. a mystical experience).  What can you do to make that breeze happen?  Nothing, of course.

You simply can’t force a breeze to rise.  However,  you can open your windows and doors.  That is, you can remove the obstacles to a breeze coming in, should one arise.

Rumi also said much the same thing: “Your task is not to seek for love, but merely to seek and find all the barriers within yourself that you have built against it.”

Rationalism and the Imminent Death of All Religions

When I was growing up, there were arguably few good fantasy novels.  Lord of the Rings was yet to become popular in my home town, but I didn’t feel I was missing anything because science fiction attracted me like no other genre.  Hardcore science fiction.

No unicorns, no dragons — and usually no gods.  Just stuff based on the science or scientific speculations of the day.  Isaac Asimov and Arthur C. Clarke.  In fact, I believe it might have been in Clarke’s book, The Deep Range, where I for the first time came across the notion that rational science was replacing irrational religion in the hearts and minds of all the world’s peoples.

I simply assumed Clarke had a point.  After all, he surely knew more about it than me.  A few years later, I carried the idea with me to university, where I signed up for my first course in comparative religious studies at least half convinced religion would be taught as part of the history department within twenty years or so.

I have since then been thoroughly disabused of that notion.  I was actually a bit surprised the other day when someone brought it up again.

Granted, there are plenty of reasons to believe that religion is on the decline in the industrialized world.  Numerous surveys seem to demonstrate that beyond doubt.  For instance,  a 2016 Norwegian study found that 39% of Norwegians “do not believe in God”,  while a 2015 Dutch Government survey found that 50.1% of the population were “non-religious”.   And even in the US, which remains the most religious industrialized nation, younger people are notably less religious than their elders.

Yet, to me these studies are very difficult to interpret for at least two reasons.  They don’t always seem to have clear enough categories, and they often seem to have too few categories.

I’m out of my league in any language but English, so I haven’t studied the non-English language studies, but  I’m suspicious of categories that get translated as “non-religious” or are based on questions that get translated as, “Do you believe in God?”

“Non-religious” can mean so many different things to different people.  I would describe myself as “non-religious” meaning not an adherent of any organized religion, but I’m also a bit of a mystic, and to some people, that’s quite religious.

Beyond that, these surveys usually don’t have enough categories to satisfy my insatiable appetite to categorize things.  Don’t believe in god?  Fine, but do you consider yourself an atheist, an agnostic, someone who believes in spirits, ghosts, etc., a Christian atheist (big in the Netherlands), a believer in a “transcendent reality” — or do you perhaps feel “there just must be something out there”?

But putting aside my uninformed suspicions about the studies I’ve seen, I think there are at least two compelling reasons to suppose religion will survive rational science so long as we’re Homo sapiens.  Both reasons are rooted in the origin and nature of religions.

Now, anytime you speak about the origin and nature of religions, some folks are bound to bring up the traditional ideas about that: religions began as proto-sciences that tried to explain natural phenomena, such as thunder, in terms of supernatural beings.  Thunder becomes a thunder god, in that view.

Freud thought religions began as a desire for a father figure that turned into a god.  Feuerbach, following some ancient Greeks, thought religion began as an idealization of a great man, such as a notable leader, following his death.  Others have argued that religion was begun by people seeking a sense of purpose or meaning in life.  And so on.

I myself would not actually argue against any of those traditional notions.  For all I know, they and many other such notions at least played some role in getting religion off to a start.  But I do think there are two more influential candidates.

There is general agreement these days among cognitive scientists that religion involves the architecture of the brain.  That is, religion is based in our genes, and most likely evolved early in our history.  Beyond that, there is much debate and a handful of theories about exactly what our brain’s architecture has to do with religion.

For reasons of space, I’ll stick to the one theory I favor.  According to that view, we evolved functional brain modules: modules that allow us to think of others as having beliefs, desires, and intentions (Theory of Mind), that organize events into stories or narratives (Etiology), or that predispose us to respond to signs of danger in ways that might save our lives if the danger is actually real (Agent Detection).  Depending upon who you consult, there are up to two dozen or so such modules.

One way these modules might come together is this:  You’re sitting around a campfire one night, partying over an antelope carcass, when you hear a rustle in the bushes and perhaps even an indistinct growl that you might only be imagining.  You startle, the hair on your neck rises, and chills run down your spine.  “Something is out there!”

That’s Agent Detection speaking.  The rustle could be from a breeze or a harmless small animal.  The growl might only be imagined.  But the key thing here is that you react with fright just as you would if it were known to be a lion.

A few minutes later, you and your buddies pick up your spears to investigate.  Can’t very well get to sleep with a possible lion that close in.  But you find nothing.

This is repeated a few times during the night.  Each time you find nothing, but then it happens the next night, and so on.  Sooner or later, your best story-teller cooks up a narrative (Etiology) in which a malevolent spirit is “out there”, prowling around your camp, perhaps waiting for the moment to strike.  And your sense of Agent Detection predisposes you to think there must be something there.  Being a spirit, you cannot see him, but you don’t need to — what else could explain something making noises that have no body behind them?

Last, as time goes on, you start ascribing more and more beliefs, desires, and intentions to the spirit (Theory of Mind), until one day you have perhaps a god.  Or maybe not, maybe you and your buddies are devout spiritualists without any recognizable deities.  Whatever the case, you’ve now got something “religious”, in at least some sense of the word.

If the above is true, then we now have one deep root of human religiosity.   A root so firmly grounded in our brain’s architecture that it must be genetically based.  A clear implication is that, having evolved it, we would need to evolve out of it to be entirely free of its influence on us.  Until or unless we do that, we will be born with a predisposition to some kind of religiosity.

But is there another root, equally well grounded?  It seems curious to me that a second root of human religiosity is so often ignored.  Even if one dismisses mystical experiences as “rare hallucinations”, that would not actually demonstrate they were of little or no influence on the world’s religions.  Indeed, they seem core to at least Hinduism, Buddhism, Jainism, and Taoism, and a significant theme in others, such as Christianity, Islam, and Judaism.

Now, there seem to be about 12-16 different kinds of experiences that are commonly called, “mystical”,  so I should take special care here to clearly distinguish what I mean by the word.  I mean only one quite specific kind of experience, which I call “the mystical experience”,  for lack of being inspired to come up with any other name for it.

The problem here is that, while it is easy to come up with words to describe the  content of that experience, it is impossible to come up with words capable of communicating that content to anyone but people who have had the experience.

Buddhists sometimes describe nirvana as a “cessation of suffering”,  and Christian mystics describe their experiences as “experiences of God”,  but neither phrase is able to communicate what those things mean to anyone other than the people who use them.  The problem is the nature of words themselves.  Words are symbols that ultimately depend on shared experiences to communicate much of anything.  If you had a barn, and I had never seen anything like it,  you would be reduced to describing your barn in terms of what I had seen.  “It’s like your mud hut, Paul, only much, much bigger.”

However, I have had some luck describing mystical experiences as involving a dissolution of subject/object perception, replaced by a perception of all things being in some sense one.  The key is to grasp that subject/object perception is perceiving the world in such a way that you divide the things you perceive into self and non-self.

That is, I not only see the tree in my yard, but I see the tree as “not me”.   That’s the normal, everyday mode of waking consciousness.  But if and when that breaks down and you perceive — if only for a moment or two — the tree and you as unified by some sense of oneness, then you’re having a mystical experience.  The sine qua non of those experiences is that breakdown into oneness.

In addition to that, there is much other content typical of a mystical experience, but it’s much harder for most of us to understand how mundane joy differs from mystical bliss than it is for us to understand that we have suddenly lost or abandoned our sense of things being either “me” or “not me”.

Hence, I am only concerned with that one kind of mystical experience, but that’s not to say there are no other kinds — most of them probably more interesting than the mystical experience.

As I said, Christian mystics tend to interpret their experiences as experiences of the Christian God, but so too do most people around the world, and throughout the ages (except they aren’t usually talking about the Christian god).  Not the Buddha, of course, nor Lao Tzu, but so many others use “god” or virtual synonyms for god.  So, although there are an appreciable number of atheists and agnostics who have had mystical experiences, it’s easy to see how the experience could create a sense of deity.

Mystical experiences seem to be as deeply rooted in our genes as the first root discussed above.  The neural sciences have revealed that they are associated with, at the very least, changes in the activity levels of the parietal lobe and the thalamus.  There also seems to be evidence that they might have something to do with “brain chemicals” like dopamine.  So, although our understanding of them is still quite limited, I think we now know enough to safely say they are genetically rooted in us.

Of course, the implication is that “god won’t go away anytime soon”.   But I think that can be more clearly seen when we consider that the sciences have no means for disproving the notion god might be behind, or the ultimate cause of, such experiences.

Even if we knew everything about their natural causes, we would have no means of knowing anything about whether or not there were supernatural causes to them also.

Now, if all of the above is true enough, then I think it’s safe to say the imminent death of all religions is not exactly “around the next corner”.  We would most likely need to evolve so far as to become a new species — with a new kind of brain — for that to happen.  So, while people may shift from one form of religiosity to another, I think most of us will retain some kind of religiosity.

I hope the future brings us ever more benign forms of religiosity.

Late Night Thoughts: Intellectual Honesty, Social Engineering, Meditation, and Sex Lives (July 1, 2018)

A couple weeks ago, I looked out my door to see a doe trailed by two spotted fawns passing through my yard in broad daylight — quite an unusual time of day to spot deer moving about so near to the center of the city.

A day or so later, presumably the same doe and fawns — but I’m not sure about that, since I didn’t get their names the first time around.

Since then, just the usual three or four raccoons, and those at night — nearly every night.

♦♦♦

The truth isn’t neutral, is it?  I don’t mean “neutral” in the most important sense — in the sense of being objective.  I mean, rather, that we so often feel emotionally about it as being either for or against what we believe or are willing to accept.  As everyone from Plato to the present has known, emotional attachments or aversions can distort rational thinking.

On a perhaps more abstract level, we are subject to cognitive biases — genetically inherited, systematic ways our brains function that cause us to deviate from rational thinking.  The most famous of them seems to be “confirmation bias” — a tendency to “search for, interpret, focus on, and remember information” in a way that confirms our existing notions and expectations.

So far as I can see, there are only two practical remedies.  First, a good education in critical thinking skills,  beginning early in life, and very much including the effects of cognitive biases on us, but also including logic and semantics.  Still, I don’t think that would be enough.

To me, the key is to recognize how much easier it is for us to notice that someone else has gone off the rails in their reasoning than it is for us to notice that we ourselves have.

And then build on that.  Teach the kids to seek out and find people throughout their lives whom they can reliably trust to give them honest and accurate feedback or reality-checks on their reasoning.

I suspect a likely side-effect of that kind of an education would be a general awareness of the importance of intellectual honesty.

Yet, I have little hope any such education will become generally available — at least not anytime soon.  We don’t have the best tradition in America of funding schools well, for one thing.

♦♦♦

July Fourth.  I do not know how far we have departed from the concept of “citizenship” that folks like Alexis de Tocqueville noticed we embraced back in the early days of the Republic, but I suspect it’s a great deal further than most of us would be comfortable with — assuming we were to fully grasp what we have lost.

Volumes can and have been written about that, but I would like to focus on Edward Bernays and what he called, “The Engineering of Consent”.

Bernays believed, back in the 1920s when he founded the public relations industry in America, that the social and psychological sciences had advanced to the point where they could be used to engineer consent — that is, to systematically get folks to “support ideas and programs”, as he sometimes put it.

Not just through normal, more or less amateurish, means of persuasion, but through greatly more effective and reliable “scientific” means.

Now, despite his goals, Bernays was not the evil villain of Hollywood melodramas.  For one thing, he urged professionals in his newly created field to guard against any temptation that might involve them in such nefarious things as undermining the Constitution — especially, the “freedoms of press, speech, petition and assembly”.   Moreover,  his motives seem decent enough in some ways.

Bernays was Sigmund Freud’s nephew, a Jew, and quite aware of how mobs could quickly turn into pogroms against innocent people. Like many people, he thought democracies were especially susceptible to mob rule and violence.  So, it seems that one of his goals was to find ways to defuse those mobs before they even happened.

Yet, regardless of his motives, Bernays made what I regard as more or less a pact with the devil, for his strategy to make democracy safe for everyone has now had a hundred years to bear fruit — and what fruit!

In a nutshell, this was his strategy: Persuade people to seek self-fulfillment through consumerism so that they would be so satisfied with the acquisition of ever more and more material goods and services, they would not feel any need or desire to “take on” or change the status quo.  In short, they would be content with their lot.

Put differently, he sought to change the American culture and mindset from a people intimately concerned with politics as a means to at least create the best possible conditions under which people could seek self-fulfillment, to a people intimately concerned with consumption as the best possible means.

I think that if de Tocqueville can be at all relied on for a glimpse of the political activism of the early Republic, then a comparison of that activism with today’s relatively insipid and dispirited activism is instructive.  We have, to some large extent, realized Bernays’ dream of turning us from a nation of citizens into a nation of consumers.

Should you be interested to learn some of the details, I recommend the award-winning documentary, The Century of the Self.

♦♦♦

I read a startling statistic a while back.  About 40% of married, middle-age women in America report no longer being interested in their sex lives, and that their husbands no longer satisfy them.

Perhaps it’s selfish of me to have immediately thought of myself, but it’s just a fact that I do take pride in how satisfied my two ex-wives were during our marriages.  A whole lot was wrong in both marriages, but not so much the sex.

I often heard them say the sex was “extraordinary”, “mind-blowing”, or even once or twice, “Had never been better”.   At least, those are the sorts of things they would tell me on the nights they came home very late.

♦♦♦

There are so many hard things in life, and I think most of us are all too aware of at least the big ones.  Raising kids, saving up enough money for the rainy days that come too soon and too often, being laid off,  looking for work, struggling for a promotion, and so forth.  The list just goes on.

One of those things, though, is especially curious to me: appreciating, as fully as possible, the people we are profoundly familiar with.  Most of the time, I think I do.

But sometimes I meet a new person, and after I’ve gotten to know them a bit, I have the strangest moment of discovery when I realize that I quite likely appreciate them more than most anyone greatly familiar to me.

What to do about that?

New Year’s resolutions and other self-admonishments just don’t work for me here.  They’re ok up to a point, maybe.  So long as I keep reminding myself of them, I seem to make some progress, but then within a few short weeks, I fall off the bandwagon.

Trying to make a habit of appreciating someone also doesn’t work.  When I get into a genuine habit of “appreciating” someone, it soon becomes artificial.  “It’s Tuesday — time again to tell my brother how much he means to me.”

About the best thing I’ve found has been meditation.  Meditation seems to sharpen my senses a bit, making me more aware for at least a little while of what’s going on inside (e.g. hunger) and outside of me (e.g. the raccoon crossing my yard, a shadow in the night).  In an analogous manner, it seems to sharpen my awareness or appreciation of people on the days I meditate.

Moreover, if I meditate frequently enough, then appreciation seems to become, if not permanent, at least somewhat more lasting than the other methods I’ve tried.

Last, it can have the peculiar effect of my seeing someone, not just in terms of what he or she means to me, but somewhat more objectively.  Perhaps.

♦♦♦

Colorado Springs is a conservative town.  It also has quite a few “city deer”, and they are so numerous now that they are viewed by many of us as a problem.

A while back, there was a serious proposal put before the City Council to solve the deer problem by legalizing hunting the animals within city limits.  With rifles and shotguns.

Not all my conservative friends are as bonkers as I am, but it’s sometimes reassuring that at least some of them are.  So long as they don’t make the rules.

Could Star Trek’s Mr. Spock Really Exist?

(About a 5 minute read)

Like most sensible people, I am firmly convinced that around 2,400 years ago in Athens, Greece, Plato invented Mr. Spock.

Of course, I do not believe that Plato invented all the details of Mr. Spock right down to his curiously arched eyebrows and pointy ears.  So far as I know, those details were worked out by Gene Roddenberry, Leonard Nimoy, and their band.  But the essential notion that a hyper-rational person would have few or no emotions — that was Plato.

In Plato’s view, emotions and thought were clearly distinct, and the only connection between the two was that emotions could mess with thought.  That is, while emotions could cause us to reason poorly, they had little or no positive impact on reasoning.  Apparently, Plato was the first to come up with those ideas — ideas which went on to become commonplace assumptions of Western thought.  And Roddenberry and company seized on those assumptions to create Mr. Spock.

Of course, there are some rather obvious ways in which Plato was right.  Most likely everyone has had some experience with their emotions overwhelming their capacity for reason.  Every child is cautioned not to act in anger or other strong emotional state, lest they do something irrational.  And many of us — perhaps even most of us — know that we tend to be more gullible when listening to someone present their views with a great deal of passion than when listening to someone present their views coldly.  “I don’t think Snerkleson is quite right in his views, but he’s so passionate about them that he must honestly see some merit to them.  Maybe there’s at least some truth to what he says about dog turds replacing petroleum as the fuel of the future.”  There are clearly ways emotions can interfere with thought, as Plato knew.

As it happens, though, the notion that emotions only have a negative impact on thought is not borne out by the evidence.

In the early 1990s, a man — who has come to be known as “Elliot” — was referred to Antonio Damasio, a neuroscientist, by his doctors.  Elliot had applied for disability assistance despite the fact that, “[f]or all the world to see, Elliot was an intelligent, skilled, and able-bodied man who ought to come to his senses and return to work”.  His doctors wanted Damasio to find out if Elliot had a “real disease”.

Damasio found that Elliot tested well when given an IQ test and other measures of intelligence.  His long-term memory, short-term memory, language skills, perception, and handiness with math were unquestionably sound. He was not stupid. He was not ignorant.  Yet, when Damasio started digging into Elliot’s past job performance, he found that Elliot had often behaved as if he was indeed stupid and ignorant.

For instance, Elliot had at least once spent half a day trying to figure out how to categorize his documents.  Should he categorize them by size, date, subject, or some other rule?  Elliot couldn’t decide.  Moreover, he had been fired for leaving work incomplete or in need of correction.  And when Damasio studied what had happened to Elliot after his job loss, he found the same pattern of poor decision-making and incompetence.  Elliot had gotten divorced, then entered into a second marriage that quickly ended in another divorce.  He had then made some highly questionable investments that brought about his bankruptcy.  He couldn’t make plans a few hours in advance, let alone months or years.  Unable to live on his own, he was staying with a sibling.  His life was in ruin.

When Damasio looked at Elliot’s medical history, he found that the turning point for Elliot had come about when he developed a brain tumor.   Before the tumor, Elliot had been highly successful in his business field.  He was even a role model for the junior executives.  And he had had a strong, thriving marriage.  Although the brain tumor had been successfully removed,  Elliot had suffered damage to some of the frontal lobe tissues of his brain having to do with the processing of emotions.

Damasio began testing Elliot for his emotional responses to things.  In test after test, Elliot showed little or no emotional response to anything.  He was, Damasio concluded, cognitively unaware of his own emotions.  Then Damasio had a revelation.  “I began to think that the cold-bloodedness of Elliot’s reasoning prevented him from assigning different values to different options,” Damasio wrote.

Damasio went on from Elliot to look at other case studies of people who had suffered brain injuries preventing them from being cognitively aware of their emotional states.  He found the same pattern over and over:  When emotions were impaired, so was decision-making.

The findings of Damasio and other scientists have largely revolutionized how scientists view the relationship between emotion and thought.  It now seems that emotions are, among other things, the means by which we sort out information: The relevant from the irrelevant, the high-priority from the low-priority, the valuable from the worthless.

And Mr. Spock?  Well, a real life Mr. Spock might spend hours trying to figure out whether to set his phaser to stun or kill.  Without emotions, decision-making becomes extraordinarily problematic.

Late Night Thoughts: Ice Cream, Reasoning, Robots, Wisdom, and More

(About a 6 minute read) 

The other day I woke up feeling pretty much under the weather.  I stumbled onto my blog bleary-eyed and somehow deleted a whole post while trying to fix a mistake in grammar.  After that, I spilled half a pound of coffee beans on the floor while getting almost not a one of them into my grinder.  Not yet recognizing that it wasn’t my day, I wrote 500 words for a blog post before realizing I wasn’t making any sense even by my lax standards.  This time the delete was intentional.  A sane man would have gone back to bed at that point.  Naturally, I didn’t.

Instead, I somehow got it into my head to catch up on what’s going on in politics.  ~~I was still catatonic when the paramedics found me two days later.~~  After reading three or four articles, the thought occurred to me that any sensible and informed person these days must feel a whole lot like I felt that morning: our hopes and intentions are so far out of line with the bizarre reality of the times.  It almost seems as if the feeling, “This isn’t my day”, has become expanded to include most of the world.

◊◊◊

It is sometimes said that a difference between liberals and conservatives is that liberals are more concerned with humanity than they are with individuals, while conservatives are more concerned with individuals than they are with humanity.  As Dostoevsky put it in The Brothers Karamazov,  “The more I love humanity in general the less I love man in particular”.

It seems to me that — regardless of whether one is a liberal or a conservative — those two extremes are both inadequate in and of themselves.  The liberal position leads to treating the people one knows like dogs; the conservative position leads to treating the people one doesn’t know like dogs.

Now, the older I get the more I expect to find such “twists” in life.  That is, I have come to largely agree with Immanuel Kant:  “Out of the crooked timber of humanity no straight thing was ever made.”

What could our human nature not accomplish if our human nature did not stand in our way?

◊◊◊

I recently came across an article stating that eating ice cream for breakfast improves brain performance.  I immediately began dancing around my cottage for half an hour in gratitude to whatever deity or deities had arranged the world such that eating ice cream could be thought of as a duty.

Ever since, I have been eating ice cream for breakfast, but alas!  With no discernible results.

Still, this is not something to be lightly dismissed.  One has a duty, you know.  I must redouble my efforts.  Obviously, the problem is I have not been eating enough ice cream to see any results yet.  Obviously.

◊◊◊

I think it was W. Edwards Deming who used to begin his graduate seminars with an experiment.  He would place a large glass jar full of marbles in front of the class, which typically numbered about thirty students.  Then he would ask the students to guess how many marbles were in the jar.

Their individual answers were typically wildly off the mark — either way too high, or way too low.  And yet — consistently in class after class — when their answers were averaged, the result was within 5% of the actual number of marbles.   As a group, the students were always more accurate than most of them were as individuals.
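That averaging effect is easy to play with, by the way.  Here’s a quick, purely illustrative simulation in Python (the marble count, the number of students, and the size of each student’s guessing error are all numbers I made up for the sketch, not Deming’s data):

```python
import random

# Illustrative simulation of the marble-jar effect (made-up numbers,
# not Deming's actual figures).  Each "student" guesses with a large
# random error; the class average comes out far more accurate.
random.seed(7)
actual = 1000  # hypothetical number of marbles in the jar

# 30 students, each off by as much as 60% in either direction
guesses = [actual * random.uniform(0.4, 1.6) for _ in range(30)]

class_average = sum(guesses) / len(guesses)
mean_individual_error = sum(abs(g - actual) for g in guesses) / len(guesses)
average_error = abs(class_average - actual)

print(f"class average: {class_average:.0f}")
print(f"typical individual error: {mean_individual_error:.0f} marbles")
print(f"error of the class average: {average_error:.0f} marbles")
```

The individual errors are large and random, but because they scatter on both sides of the true count, they mostly cancel when averaged.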

◊◊◊

It seems to me quite possible that how people reason might be almost as subject to fashion as how people dress.

The rules for what constitutes good reasoning might not change much, but certainly what constitutes “acceptable” reasoning can change quite a bit.   By “acceptable” I mean what a majority — or at least a large minority — of us think is good reasoning.

I suspect many of us don’t learn how to reason from a competent instructor so much as from media figures such as talk show hosts and their often questionable guests.  Even advertisements teach a form of reasoning.  It might not often be a sound form of reasoning, but it’s a form nonetheless.  It would make an interesting study to see if the popularity of certain kinds of arguments changed from one decade to the next.

◊◊◊

It seems possible that robots will at some point become sophisticated enough that someone will start making “lovebots”.  That is, artificial lovers.   At which point one wonders when sex education classes will become as hands-on as instruction in tennis or driving.

I have no idea whether such a thing will become commonplace in public education, but I can certainly foresee special academies for it — private schools that use robots to teach love making.

Then again, I think it’s only a matter of time before genetics advances to the point that we have pets with glow-in-the-dark fur.  I am, quite obviously, bonkers.

◊◊◊

Is chocolate also good brain food?  Might be.   Better eat some just to be on the safe side.  Is duty.

◊◊◊

According to Barry Lopez, the Inuit word for “wise person” literally translates as, “one who makes wisdom visible [through their behavior]”.   If we in the West had a corresponding translation for “wise person” it would doubtlessly be something along the lines of, “one who speaks wisely”, for we typically assume that someone who says wise things is actually wise.

◊◊◊

Often enough, great intelligence, or great wisdom, is shown less by what someone says or does than by what they do not say or do.

◊◊◊

An inability to laugh at oneself can be as creepy as showing up in a clown costume at a funeral.

◊◊◊

We so often blame our emotions for the bad behavior of our psychological self.  We say, for instance, that our anger at Smith got out of hand.  But before there was our anger, there was our ego’s perception that Smith slighted us.   Without that perception, we would not have been angry at Smith in the first place.

Speaking Ill of the Dead

(About a 3-minute read)

The death this morning of Roger Ailes prompted someone to ask why it is customary in the West to not speak ill of the dead, and whether there was still any merit to the custom.

Ailes, the co-founder of Fox News who for decades was a leading force in conservative politics in America, died this morning at the age of 77.  You can find The New York Times report here.  He was, to put it mildly, a controversial figure, one who is certain to be spoken ill of in many quarters today, custom notwithstanding.  And, of course, many people will scold those who do speak ill of him on the grounds that it is neither customary nor seemly to do so.   But is the custom justified?

Like so many Western cultural traits, the custom of not speaking ill of the dead seems to go back to the ancient Greeks.  Around 600 B.C., Chilon of Sparta — one of the “Seven Sages” of ancient Greece — is reputed to have said, “Don’t badmouth the dead”.  Around 2000 years later, during the Italian Renaissance, his words were popularized by a humanist monk as,  “Of the dead, nothing unless good”.  And thus the notion comes down to us today.

I have heard it said that we should not speak ill of the dead in order to honor them, or at least to honor the good they did in life.  But I don’t buy into those notions.   I think there are people who were so vile that honoring them is borderline immoral.  And to honor the good they did amounts to a species of dishonesty in light of the evil they did.

If there is today still some reason not to speak ill of the dead, that reason might have more to do with us than with them.   Death is one of the most poignant and powerful reminders that, in the end, we are all human.  It seems to me that a brief period of grace — perhaps only the time between one’s death and one’s burial — during which we do not speak ill of the deceased would drive home the lesson of our common humanity.

We live in an age in which nearly everyone is at risk of having their humanity denied by other people at some time or another.  All you need do to see the truth of that is go on any one of tens of thousands of websites and announce a political opinion that’s unpopular on that site.  Sooner or later thereafter someone — perhaps many people — will vilify you, demonize you, dehumanize you.  And that is a dangerous situation:  At a minimum, it is not conducive to liberal democracy, which rests on compromise; and at worst, wars and genocides are made of such things.  A society — or world — can only hold together when it is widely recognized that our commonalities outweigh our differences.

The remembrance that we all share the same ultimate fate would help, I think, to put things in perspective for many of us.   Moreover, a few days in which we do not speak ill of the dead might go far in reminding us of that.

Having said all that, I think remarkably controversial figures, such as Ailes, present a special problem.  Their deaths almost invariably become political occasions.  There is a rush by politicians, pundits, and others to make use of their passing in order to further agendas.  It might be noble to refrain from criticizing the dead under such circumstances, but certainly, it is not always practical to refrain.

In general, though, I think the practice of not speaking ill of the dead is a good one.  But what do you think?  Your comments, views, thoughts, and feelings are welcome.

“The Point of Most Religions is the Betterment of Mankind”

(About a 4-minute read)

“The point of most religions is the betterment of mankind.”  — Posted on an internet religious forum.

A dear friend of mine is a kind, sweet lady who, with her husband, belongs to a fundamentalist church in the Midwestern county I grew up in. Her church means everything to her.

She’s retired now and spends most of her time doing one thing or another for her church community. Beyond keeping her busy, that community presents to her a sort of oasis of love, charity, kindness, compassion, and all around goodness in an otherwise rather disturbing larger world whose values are often alien to hers.

I suspect she would largely agree with the above quote. From where she’s at, the quote must make a lot of sense. She only needs to look at the way her church community took up a collection for the family whose breadwinners were out of work, or the way her pastor visits and comforts the sick, or how most of her church buddies believe in the ideal of treating each other with loving kindness — she only needs to look at those things to agree the point of her religion is the betterment of mankind.

Of course, her church is officially a busybody that’s intolerant of premarital sex, abortion, homosexuality, and many other private things it has no real business being intolerant of. Its pastor is also a staunch supporter of neocons in general, Bush and Cheney in particular, the War in Iraq, the War on Terror, and his side in the so called “Culture Wars”. And many of the people in her church community are bigoted, narrow-minded folk who would never vote for a Black, a Muslim, or a woman to be president. So, to an outsider, her church might appear anything but an oasis of love, charity, kindness, compassion, and all around goodness — let alone dedicated to the betterment of mankind.

Yet, how is she expected to stand back from her church community — which occupies her days and means nearly everything to her — and clearly see the moral ugliness of people who reserve their best “Christian” behavior for insiders just like themselves, while damning and condemning every outsider from scientists to liberals and beyond?

She would much rather help her elderly neighbor get out and about, or bake something to raise money for a needy family, than to consider her pastor’s outrageous notion that homosexuals undermine and destroy the sanctity of her marriage.

I recall a young fundamentalist here in town a while back who I overheard blithely telling her friend that when Jesus said, “Love your neighbor”, he meant love those who belong to your church.

She was certain she was thereby realizing the highest Christian principle of universal love — because, after all, most of the people who belonged to her church were strangers to her, and hence her love for them was “altruistic”.

Yet, even the Bible says there is nothing remarkable about loving only those who are members of our own group.

Humans evolved as social animals living in small groups. Most of us need little prompting to treat the members of our own group with respect, compassion, kindness — even love. After all, we evolved to do that. It’s to a large extent instinctual. We’re almost always ready to “better mankind” so long as “mankind” is the group of people we hang out with.

On the other hand, there are very few Gandhis, very few Martin Luther Kings, very few people like Jesus — very few people who somehow realize in practice the notion the whole world should be treated with kindness, compassion, respect, and love. To most of us, such a notion is “wild”, suspect, perhaps even immoral.

Today, the world — the entire world — is involved in a grand experiment. An experiment to see whether we can all get along together in dignity, freedom, peace and sustainable prosperity. No one seems to have wanted that extraordinarily daring and risky experiment, but it’s now imposed upon all of us nonetheless.

So, what’s going to be the outcome? Will the world descend into endless wars as some think likely? Will it sink into corporate fascism as some others think likely? Will it be the birth of a new golden age for humanity — as very few seem to think likely? Or will something else happen?

More to the point, just what is going to be the role of the world’s religions in bringing about the “New World Order” — whatever that Order actually turns out to be? Are religions going to finally live up to their own professed ideals of universal compassion, kindness, charity, love, generosity, etc.? Will they ever, really, make “the betterment of mankind” their honest “point”?

Frankly, I strongly suspect that any sustained progress towards a world in which most people live in dignity, freedom, peace and sustainable prosperity will ultimately come — not from religions for the most part — but from Humanism. If such progress comes at all.


Originally published on this blog January 15, 2008.  Lightly edited May 6, 2017 to better reflect my current views.