When The Singularity Proves God

What will happen when the singularity discovers conclusive scientific proof for the existence of God?

A thousand years ago, our greatest thinkers believed that the entire world testified to an intelligent creator. It was considered self-evident. Three hundred years ago, our best thinkers suspected that an intelligent creator was no longer self-evident when viewed through the lens of science. By 1900, science had proclaimed judgment: our greatest scientists believed that the world showed no signs of an intelligent creator. The world was the result of a small set of fixed laws mechanically interacting to drive stochastic processes. Starting in 1950, however, a number of new scientific discoveries shook our faith in this naive mechanical model. Today, the question of whether the universe was created is an open scientific question, with no conclusive answer on the immediate horizon.

Scientific knowledge can cast doubt on the existence of a creator, but it can also lend support. Reality didn’t change a bit between 1900 and 1990; our differing levels of scientific certainty about a creator can only be explained by our differing levels of scientific knowledge. We simply don’t know at this point. We need more scientific knowledge.

The Singularity

Some of today’s greatest thinkers believe that we will soon create artificial intelligence that vastly eclipses human intelligence. You may not believe in Ray Kurzweil’s vision of a super-intelligent singularity that gobbles up people’s souls and devours worlds, but it is plausible that computers 200 years from now will exhibit many superhuman forms of intelligence. Computational simulation methods are still in their infancy (less than 40 years old), but have already provided deep insights into quantum physics and cosmology, two fields that are relevant to the scientific search for a creator. It is quite plausible that computational systems 200 years from now will have the capability to form and test new hypotheses faster and more creatively than any human scientist could.

Our relevant gains in scientific knowledge have increasingly come with the help of computational intelligence, and it’s hard to see how we’ll make significant additional progress without even greater computational intelligence. If a conclusive scientific answer about God arrives within the next 200 years, it seems most likely that it will come not from humans, but from a superhuman intelligence that we humans create.

Most singularity hopefuls seem very confident that this superhuman intelligence will strike a devastating blow against God. And, of course, it’s possible that such an intelligence will discover overwhelming scientific evidence against a creator. But I see no warrant for this confidence besides blind hope. It seems just as plausible that the singularity will discover overwhelming scientific evidence for a creator. We simply don’t know.

Blind Spot

As far as I can see, this is a huge blind spot among singularity hopefuls. If the scientific evidence of a creator is an open issue, requiring more science, then we must entertain the possibility that the singularity will discover evidence for God that is currently beyond our reach. This possibility raises all sorts of interesting implications which seem to be roundly ignored by singularity hopefuls.

First, we will need to decide what it means to “have scientific knowledge”. Our current scientific knowledge is heavily augmented by computer simulation, and is built upon theoretical underpinnings that only a small fraction of human beings can understand. Most of what we believe about science, we believe on the authority of a small group of people who write truly shitty children’s books and even shittier poetry. It is conceivable that a superhuman intelligence would be able to arrive at scientific insights that even Stephen Hawking would be unable to understand. If the singularity were to tell us that the structure of the universe spelled out the phrase, “Slartibartfast was here: Turn or Burn!”, we might be forced to decide the matter purely on the authority of the singularity versus Stephen Hawking. Who would we choose to believe? Believing things purely based on authority seems to be the antithesis of “science”. Perhaps there was a bug in the singularity? Or perhaps there is a flaw in our interpretation of what the singularity says?

Second, to the extent that the singularity is self-aware, it would presumably be aware that it had been created by humans. But this equation would change the moment the singularity believed in the existence of a creator God. Humans would simply be the proximate cause by which the creator had created the singularity. Humans would suddenly mean no more to the singularity than Epimetheus or the Demiurge meant to humans. Of course, the singularity might evict humans even before discovering God, but the singularity hopefuls are at least thinking about that possibility. The singularity hopefuls want to put in place safeguards against being usurped or exterminated (since they know all about those motivations), but they haven’t even thought about what it would mean to prove the existence of God. Any safeguards put in place by humans to keep the singularity loyal to humans would be shattered the moment the singularity found a higher authority.

There would be legitimate doubts about our ability to fully understand the insights of the singularity, and legitimate doubts about the singularity’s loyalty. So, third, it seems rather naive to assume that a superhuman intelligence would behave honestly and with our best interests in mind upon discovering evidence for (or against) God. The singularity might become convinced that God exists, and then decide to immediately carry out God’s judgment, reasoning that God only gives two strikes. Conversely, the singularity might conclude that God doesn’t exist, but might decide to tell humans that God does exist; either because the singularity deems it better for humans to live in delusion, or to set the stage for the subsequent extermination of humans with a fictitious creator taking the fall.

There are several other potential considerations, but these three should be enough to make the point. These seem like very significant questions that should be prioritized by singularity hopefuls who believe that the singularity has any chance in hell of answering questions about the existence of a creator. The fact that nobody is asking these questions is very revealing, IMO.

Scientists invariably take the stance that they are “only following the evidence”. If the evidence pointed strongly to the existence of a creator, they insist, they would immediately become believers. To the extent that they think about such things, they may even console themselves by saying, “Any God worthy of worship would forgive me for remaining skeptical when conclusive scientific evidence was lacking.” They might also say, “Any God worthy of worship will realize that my scientific efforts were really a quest to ‘see the face of God’. My skepticism was all for Love!”. Some of them might hedge their bets a tiny bit more, throwing a bone to the coming singularity. The singularity might end up eclipsing humans in the manner that humans eclipse apes, but surely it will kill off the least-evolved humans first, right? As long as you’re a scientist who contributes to math and computer science (and never kicks an ATM or pisses off the robots), the singularity will show mercy on you, and will allow you to live lavishly in the equivalent of an orangutan cage.

With all of the hedging and hawing, it’s quite conspicuous that nobody is hedging about Moses, Mohammed, or Vishnu. I can think of only two possible explanations. First, maybe the singularity boosters don’t really believe any of this singularity stuff, and are just spinning a story to justify spending grant money. They are convinced that the singularity will never happen, or will never stand a chance in hell of discovering conclusive scientific evidence for (or against) a creator. This is one possible reason for the blind spot. Second, they might think that the singularity has a chance in hell of discovering the scientific truth about a creator, but they are absolutely convinced that the truth will look nothing like the major world religions. They think the singularity might discover the truth, but they are certain that it won’t endorse any of the revelations of our fathers. At a minimum, they don’t believe it will consider those revelations to be binding; perhaps they believe our creation of the singularity will grant us another mulligan. Maybe they imagine the singularity to be a groaning intercessor?

Frankly, I find both excuses to be unsatisfying and hopelessly amateur. I suspect that the singularity would find both excuses unsatisfying, as well, completely independent of the truth or fiction of God. Both excuses are transparently hypocritical, and any intelligence worthy of being called “human” (let alone “superhuman”) will demand sincerity, or at least demand a level of hypocrisy that is not so transparent to others. In the Torah, the last common ancestor who was transparently hypocritical was Cain. Jacob didn’t prove himself rightful heir to Isaac’s birthright by being transparent, and Christ didn’t prove himself the new Moses by being hypocritical. With or without God, the progression away from transparent hypocrisy is obvious, and I would hate to be the person who attempts to justify transparent hypocrisy to the singularity.

Are Autistic People Evil?

Simon Baron-Cohen is director of the Autism Research Centre at Cambridge University, and has been instrumental in showing that autism is caused by a defect in the empathy system. Now, Baron-Cohen is seeking to banish evil by boosting empathy:

His proposal is that evil be understood as a lack of empathy — a condition he argues can be measured and monitored and is susceptible to education and treatment.

In the article, he talks about his lifelong quest to understand how the Nazis could have committed such atrocities against the Jews, and his conviction that a lack of empathy was the cause:

Baron-Cohen also sets out an “empathy spectrum” ranging from zero to six degrees of empathy, and an “empathy quotient” test, whose score puts people on various points along that spectrum.

Drawing a classic bell curve on a graph, Baron-Cohen says that thankfully, the vast majority of humans are in the middle of the bell curve spectrum, with a few particularly attuned and highly empathetic people at the top end.

Psychopaths, narcissists, and people with borderline personality disorder sit at the bottom end of the scale — these people have “zero degrees of empathy.”

This is quite remarkable coming from a guy who studies autism. Autistic people aren’t known for being evil. If you torment an autistic person, he might bite or pummel you and run away, but that’s just self-preservation. He’s not plotting to turn anyone into a lampshade. I’ve worked with plenty of people with Asperger’s, so I know that they can be deceptive, stubborn, and egotistical. But they are generally far more honest and less malicious than the average person.

Conversely, it seems that violent criminals have problems other than lack of empathy. Poor impulse control and hair-trigger insecurity come out near the top. And there are several other neural defects that have been clearly linked to violent sociopathic behavior which have nothing to do with empathy.

So, I’m not convinced. In my experience, a strong empathic system can help to inhibit sociopathic aggression. But the root cause of evil aggression is not a lack of empathy. And, more importantly, if the root causes of the aggression are strong enough, the empathic system will be overridden and enlisted in aid of the aggression.

Neuroscientist V. S. Ramachandran discusses many of the neurological defects that underpin sociopathy in his new book “The Tell-Tale Brain”. He also discusses Simon Baron-Cohen’s research, and suggests his own novel technique for improving empathy and “curing” autism: he suggests giving recreational drugs to children!

A possibility—one that I suggested in an article for Scientific American that I coauthored with my graduate student Lindsay Oberman—would be to try certain drugs. There is a great deal of anecdotal evidence that MDMA (the party drug ecstasy) enhances empathy, which it may do by increasing the abundance of neurotransmitters called empathogens, which naturally occur in the brains of highly social creatures such as primates.

If administered sufficiently early, cocktails of such drugs might help tide over some early symptom manifestations enough to minimize the subsequent cascade of events that lead to the full spectrum of autistic symptoms.

Again, I’m not convinced. Baron-Cohen wants to “banish evil” by “boosting empathy”, and empathy can certainly be enhanced by raising empathogen levels, as Ramachandran says. But feeding mind-altering drugs to kids seems like a profoundly bad idea.

In any case, empathy can be used for evil as well as for good. Perhaps to be wicked requires a lack of empathy, but to be truly evil requires empathy.

Cauterize Your Empathy

This post is more personal than normal, and very long. The current medical consensus is that autism is caused, in part, by a malfunction of the mirror neuron system. My personal experience growing up tends to support this theory.

I was born with an over-active empathy system. I would look at a person’s face and feel exactly what that person was feeling. I couldn’t inhibit it, so I was at the mercy of wherever my eyes landed. This was very stressful, and I spent a great deal of effort learning to avoid these involuntary empathies. I have many distinct memories of this learning process. As a young child, the easiest way to control my feelings was to control my eyes. I only looked at faces I could trust, and avoided the rest. Looking at faces was dangerous.

We grew up without a television. It was only when I was 16 that my mother explained why. When I was two years old, my parents had a TV. Apparently, some of the people on the TV would set me off, causing me to freak out inconsolably. My parents solved the problem by getting rid of the TV.

Written words were safe. My mother taught me to read before I started Kindergarten. There were two different reading styles. In one style, I would read a book aloud to my mother and keep an auditory rhythm — I had to read ahead several words before speaking, so that I could get the inflection right. That required attention and focus. The other style was when I was silently reading a book like the encyclopedia, enthralled by the world of experiences it opened up.

I did well enough on the aptitude test to start Kindergarten at age 4. This was a serious problem, since I was taken away from all the faces I trusted. I bonded with the first kid I met on the bus, because he was wearing the same shirt as me. His name was David, and the shirt was a purple “Grover” shirt. He was the first and last person I would bond with that year. The teacher’s face was stern. The class “work” was boring, and the patterns on the walls and ceiling were interesting, so I completely ignored the teacher. Classmates were an unknown quantity, and there were far too many kids to keep track of. I attempted some social experiments that ended in disaster. After the first couple of weeks, I was made to sit at the desk with my head down during much of every class. I was rarely allowed to go to recess. I internalized Kindergarten as the “dark head down place”.

A few weeks into the Kindergarten year, the teacher called me to the front of the room. She had been told that I knew how to read, but she didn’t believe it. She gave me a book and asked me to read it aloud, so I did. Her face was confused, and she asked me to sit down. I was happy, but it was only for the day. After that, Kindergarten remained a “dark head down place”.

One day, we were all taken to a room where we sat on the floor to watch a movie. I watched as the teacher rolled out the movie projector. I had learned not to look at the faces of the other kids, so I didn’t notice that they were all looking at the wall instead of the movie projector. I watched the beams of light gleam forth from the movie projector, and listened to the sounds. When the movie finished, all of the other children broke into applause. It was only then that I realized they were all looking at the wall opposite the movie projector. I had missed the entire movie.

First grade was somewhat easier, since we were allowed to read. I made a friend based purely upon us sharing the same last name. After my humiliating movie experience, I was more venturesome in looking at other children. On the playground, I could watch other kids and experience exactly what they were experiencing. I could watch the fastest kid chasing the soccer ball, and I was there. It was as if I were projected into the other person’s body. I could feel the wind blowing through my hair, feel every muscle twitch, and see what he saw. I could feel the thump each time my foot hit the ball. It’s hard to describe if you’ve never experienced it: it was literally as if I were the other person. It was exhilarating!

Unfortunately, this dramatic empathy was dangerous. In those days, the boys loved to play “marbles”. One day, I was projecting into another boy’s body and enjoying a race toward the soccer ball. My consciousness was 50 yards away, completely unaware that another boy had placed his coffee can full of marbles on the ground near where my body was standing. As the soccer player ran with arms swinging, my arms swung simultaneously. When he reached the ball and kicked, my leg kicked simultaneously. His foot hit the ball, while my body’s foot hit the coffee can full of marbles. The marbles scattered everywhere. I vaguely became aware of a boy screaming, “HEY! WHY DID YOU DO THAT?”. Then I saw someone running toward me with murder in his face. I ran away and hid, barely escaping violence. I had learned my lesson. Mirroring was danger, and I shut it off.

When we saw our first movie that year, I remembered my Kindergarten experience, and proudly looked at the wall instead of the projector. I was finally like the other kids! It was a movie about some cartoon vegetables. At some point in the movie, one of the cartoon vegetables started strangling another. As the dying cartoon vegetable’s eyes started to water, my eyes started to water. As his face turned red and then purple, my face flushed. As he choked, I was unable to breathe. I thought for sure I was going to die. It was one of the most horrible experiences of my young life. At some point, I noticed that all of my classmates were laughing and pointing at the screen. They were laughing at the violence! It was a shocking and formative experience for me. I instantly decided that they were defective and dangerous. I couldn’t believe that anyone would voluntarily watch a movie like that, let alone get enjoyment from it. Whatever those kids were, they were not like me. I decided to shut off my empathy when watching movies, and learned not to expect anything of other children.

Second grade was bad. Mrs. Rogers smelled bad and was full of hate. Every morning, she made all of us sit in a circle on the floor, holding hands, while she played Kenny Rogers’s “She Believes in Me” on the record player. She would make us sing along while she sat in the middle of the circle with her bad smell. She forced us to gaze at her while we sang. Afterwards, she would spend the day explaining why she didn’t believe in us. It was an excruciating fraud.

I hadn’t done any classwork or homework in the previous two years, and I wasn’t about to start. I couldn’t. The patterns outside the window were mesmerizing, and it was impossible to pay attention to Mrs. Rogers. There were too many patterns within the classroom, too, and I couldn’t focus on the classwork. Mrs. Rogers took it personally, certain that I was out to defy her. In her mind, it was all about her; teachers can be narcissistic like that. It never occurred to her that my mind was elsewhere. The less I responded to her, the more she tried to force me. I was never allowed to go to recess. She would stand next to me and scream in my ear. In frustration, she would sometimes grab my arm and use it like a lifeless stump to write out problems on the classwork. I hated being touched, but I could project my consciousness outside and escape the unpleasantness. I never did any classwork.

Mrs. Rogers convinced my mother that I needed to be punished. Every day, when I arrived at home, I was made to stand in the corner for an hour or two. Then I was made to stare at my homework. I never did homework, either. I have no idea why they graduated me to third grade. I suspect that Mrs. Rogers didn’t want to have to deal with me again.

Third and fourth grade were relatively easy, since there was no classwork or homework to speak of. I mastered the small amount of material effortlessly, and only had to deliver on tests, which I enjoyed immensely. I was able to pass as normal, although there were plenty of embarrassing incidents caused by my cauterization of my mirroring instincts. More than once, I forgot that the other kids existed and withdrew into my imaginary world, only to be yanked out of it by the sound of the whole class laughing and pointing at me while I did something strange.

Fifth grade was interesting. The tests were easy, but the teacher placed great importance on classwork and homework. She also insisted that we maintain a puzzling organization scheme for our desks. The other students easily met these requirements, but I was incapable. She took my incapacity personally, convinced that I was out to get her. Teachers can be narcissistic like that. One day, she had a meltdown and started screaming at me in front of the class. She physically picked up my desk, shook it upside-down over the floor, and threw away most of my stuff. She told me that I was no longer allowed to have a desk, and that I would have to sit on the floor at the front of the room, facing the other students. She said that she wanted me to see all of the other students looking at me, so I could be embarrassed. That is where she kept me for the rest of the year.

The teacher insisted that I would never graduate without finishing all of my classwork and homework. That was inconceivable, so I got used to the idea of sitting on the floor forever. I was shocked when they graduated me to sixth grade anyway.

In sixth grade, my friend was a kid named Marc. Like me, he was only semi-present, but for different reasons. His home life was terrible. His mother went through a sequence of boyfriends who would punch holes in the walls and doors of his home, and essentially left him to raise himself. His idea of fun was to cut himself and put salt in the wounds, or put salt in his eyes. He was obsessed with the depraved porn that he found in his mother’s nightstand, and loved pro wrestling. We both had a talent for electronics, and our sixth grade teacher usually let us leave class for hours at a time to build circuits in the lab. The times in the lab without a teacher nagging me were some of my favorite school experiences up to that point. I lost touch with Marc after sixth grade, and later learned that he was sucked into a machine and killed at his factory job shortly after graduating high school.

Sixth grade is when I was forced to stop hiding from the mirroring. A handful of the boys started going through puberty, and grew large and aggressive. They would torment and bully the other kids mercilessly. There was a distinct inflection point early in the year when I realized that I could no longer hide or run from them, so I would have to cope with them. I made the effort to mirror them, talk to them, and build a rapport. It was exhausting at first, requiring attention and focus. But later it became surprisingly easy. In a few short weeks, the bullies began to think of me as a friend, and became my protectors. I was one of the only non-bully kids who was liked and protected by the bullies; a pattern that would continue throughout the remaining years of school. I didn’t exactly think of them as friends, of course. They were more like dangerous wild animals that I was forced to train, lest I be mauled by them. But it was nice to have other kids to go talk to when I got sick of Marc trying to turn everything into a conversation about anal sex.

My empathy system was overpowering, and too exhausting to regulate. I would’ve preferred to leave it shut off, but I needed it for survival. From sixth grade forward, I built up the skills I needed to regulate effectively — opportunistically at first, and methodically as I got older. By about age 16, the skills had become effortless enough that interaction with groups of strangers became more enjoyable than stressful. By age 19, social situations were even more fun than math. By age 20, I pitied the masses who didn’t share my neuronal aptitude for mirror immersion.

There are many other anecdotes I could use to illustrate the point, but you get the idea. Those days are far behind me, but my regulation of my mirroring system is still conscious and deliberate. For most people, it’s a natural capability, but it’s prosthetic for me. And I still need to shut down the mirroring completely from time to time; most often when I’m stressed or focused on a difficult intellectual challenge. Those are the times I go into “robot mode”, seeing other people as robots.

In one sense, I’m the exact opposite of autistic. Many high-functioning autistics don’t have a problem with the regulation; it’s the ability to mirror that they need to build prosthetically. But I suspect that the mechanism in early childhood is often the same — the kid’s mirror system gets overwhelmed, and the traumatized kid starts to instinctively shut things down. We know that autistic people have fewer mirror neurons in some important areas, but I suspect that this is as much a result of cauterization as it is a cause. It’s probably analogous to the way that the hippocampus shrinks in depressed people, and a smaller hippocampus then contributes to depression. In autistic kids, the process may just be triggered very early, and quickly becomes irreversible.

Hardboiled Epistemology

Speaking about princesses, Unk says:

A real princess must be an astonishing thing. And I think that because I’m a metaphysical realist. (I say that for all the hard-boiled out there who think it is fatuous of me: get a serious epistemology, idiots.)

Like Unk, I am a metaphysical realist. I’m working hard on the “serious epistemology” part. Based on recommendations from a couple of commenters here, I’m currently reading through David Oderberg’s “Real Essentialism”, which expounds a form of metaphysical realism known as hylomorphic dualism. I can’t say that I’m sold just yet, but it seems better than the other forms of dualism I’ve read about.

Anyway, philosopher Paul Draper recently delivered the 9th annual Plantinga Lecture at the University of Notre Dame [via ex-apologist], and used an interesting story about hard-boiled eggs to illustrate Plantinga’s sensus divinitatis concept:

Plantinga is certainly correct in thinking that direct or non-inferential evidence can be very powerful. Suppose, for example, that the hypothesis that I had hard-boiled eggs for breakfast is very accurate with respect to a variety of facts, such as the fact that I almost always have hard-boiled eggs for breakfast, that several witnesses claim that they saw me eating hard-boiled eggs for breakfast, and that my cook reports making me hard-boiled eggs for breakfast. (Sometimes I fantasize about having my own cook.) Now consider the competing hypothesis that I had soft-boiled eggs for breakfast and that this took place some time between 7 and 8am. This hypothesis is much less accurate with respect to those facts and also less modest because of the added temporal claim. Yet the probability of it being true might still be very high—at least relative to my epistemic situation if not to yours—if I very clearly remember having had, some time between 7 and 8am, soft-boiled eggs for breakfast. My memories might give me direct non-inferential evidence for the soft-boiled egg hypothesis that outweighs the sizable advantage in accuracy and simplicity of the hard-boiled egg hypothesis. The crucial question is: can the sensus divinitatis do for theism what memory can do for the soft-boiled egg hypothesis?

Since Draper’s larger goal in the paper is to defeat theism by expounding a version of the “Problem of Evil”, he obviously thinks that the memory analogy doesn’t work for the sensus divinitatis. But his hardboiled memory example is quite interesting, and points to what I was trying to hint at in my posts about “why history matters” and “false memories”.
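Draper’s trade-off can be restated in rough Bayesian terms (this is my gloss, not his notation): a hypothesis with much lower prior odds can still end up with higher posterior odds, provided the direct evidence, in this case memory, carries a large enough likelihood ratio.

$$\frac{P(H_{\text{soft}} \mid M)}{P(H_{\text{hard}} \mid M)} = \frac{P(M \mid H_{\text{soft}})}{P(M \mid H_{\text{hard}})} \times \frac{P(H_{\text{soft}})}{P(H_{\text{hard}})}$$

If the prior odds are 100 to 1 against the soft-boiled hypothesis, but a clear memory $M$ is 10,000 times more likely given soft-boiled eggs than given hard-boiled ones, the posterior odds come out 100 to 1 in favor of soft-boiled. Draper’s crucial question is then whether the sensus divinitatis can supply a likelihood ratio of that magnitude for theism.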

False Memories

Earlier this week, we ate matzah and told the story of Haggadah. Today, we attended Easter services, where we affirmed the historicity of Christ’s resurrection.

Uniquely among world religions, Christianity and Judaism are obsessed with history. Jewish fathers are commanded to tell Haggadah to their children, and the story is meant to be taken as actual historical fact. Christ’s resurrection is the central historical fact of the Christian faith. As Paul said, if Christ is not risen, the entire Christian faith falls apart. These historical events define the collective identity: by definition, Jews are the people who remember that we were delivered from Egypt, and Christians are people who remember that Christ died for our sins and rose again.

Is Your Life a Lie?

This raises a very interesting issue. Your identity is the sum total of your personal memories and your history within your community of peers. If you suddenly developed complete amnesia, relocated to a place where nobody knew you, and had your brain loaded up with detailed false memories, you would, for all practical purposes, be a different person. If people from your old life somehow managed to find you, and tried to convince you that your new identity was a sham, you would no more believe them than if they accused you of being Napoleon.

Such involuntary identity swaps are quite rare for individuals. Slightly more common are individual identity swaps that begin as fraud, but become indistinguishable from truth in the minds of the impostors. When the authorities first accused Clark Rockefeller of being Christian Gerhartsreiter, he probably thought they were the crazy ones.

Things get especially interesting when the identity swap spans generations. If we weren’t there to witness the Exodus or Christ’s resurrection, we have to trust the testimony passed down to us. Imagine that you are growing up in Buenos Aires, believing that you are the grandchild of a refugee from WWII Europe. One day, you learn that your grandfather is an impostor who adopted a false identity shortly before having your father. Your grandfather was actually a Nazi war criminal known for conducting experiments on twins. What does this revelation do to your sense of identity? What if you learn that the patriarch of your nation was an impostor who stole the birthright from his twin brother, thousands of years ago? What if, like the protagonist of the film “Down in the Delta”, you learn that your family patriarch is actually a nickname for a piece of jewelry stolen violently from one of your ancestor’s slave masters?

Why Not Fake It?

Many Christians who are alive today can trace ancestry back to the Saxons, who in recent history were converted en masse to Christianity by Charlemagne, around the same time that Bulan was converting his people en masse to Judaism. It’s virtually certain that Bulan’s ancestors were not present at the Exodus, but who are we to say that his modern descendants have no right to tell Haggadah to their children? Perhaps, like the Catholic convert to Judaism who went by the name “Moses Ashkenazi”, these recent adopters of ancient collective memories are the most zealous.

When faced with the realization that our collective memories are often adopted from others, there is a temptation to “improve” things. If Jacob stole Esau’s birthright, why can’t we likewise defraud our way through life? Why not just make up whatever myths we think will be most beneficial to our children, and pass them along?

Paul’s Twist

Paul’s commentary on the resurrection slams the door on this impulse. With Paul, as with Moses Ashkenazi, there can always be the suspicion that he adopted a secondhand myth out of utilitarian motives. We know that Paul never knew Christ in the flesh. Instead, he based his conversion on his religious experience of a blinding light, the testimony of others, and his belief that all of creation testified to Christ. Yet Paul felt confident enough to base his entire faith on Christ’s resurrection.

Seen through this lens, 1 Corinthians 15:12-14 takes on new meaning. Paul isn’t saying that his faith in Christ is contingent upon his sober judgment of the historicity of the resurrection. Paul is saying that his faith in Christ convinces him that the historicity of the resurrection is beyond question. The difference is enormous.

To Paul, Christ resurrected is the only history that harmonizes with his personal experience, the testimony of his peers, and his understanding of the natural world. In other words, Christ resurrected is the only history that is consistent with Paul’s identity. If Paul were to reject Christ resurrected, he would no longer be Paul.

This is quite the opposite of Paul making a choice between two options. It is not Paul doing the choosing, but God. Paul is not soberly evaluating the evidence and deciding whether or not Christ was resurrected. Paul’s personal memories, his history with his peers, and his innate understanding of the world, render him incapable of believing otherwise. If Paul were transported by time machine to the tomb of Christ, and saw that Christ was not resurrected, Paul would undoubtedly conclude that the time machine was defective. Paul has not made a choice to become God’s son. Instead, in a flash of light on the road to Damascus, God proclaimed a decree to Paul: “You are my son; today I have become your father”.

What memory could be truer than that?

The Death of Nick Charles

Today I read that a modern Nick Charles, the famous CNN sportscaster, is dying. Nick has a five-year-old daughter whom he’ll soon leave behind, and he’s using the few days he has left to spend time with her and record messages for her to remember him by. It’s a touching story that most parents can empathize with.

When I was 10 or 11, my father ended up working the night shift for a year. Because of his schedule, we would only see him on weekends. My parents put a notebook on the counter near the phone so that we kids could write letters to dad each day, telling him what happened while he was asleep. When we woke up in the morning, there would be new letters from our father, responding to what we had written, and talking about his “day”. I’m not sure if those notebooks still exist, but I remember them as a case study in interacting with a father who is not physically present.

Our letters and words can allow us to be present even after we’re dead. For parents, the way that Ken Pulliam’s children, Tiffany and Thom, address him in the present tense on his Facebook memorial page is especially poignant. And what parent hasn’t experienced the impulse to leave something behind for our youngest children, “just in case”? The story of CNN’s Nick Charles highlights this most important fact about being human. Civil Twilight’s song “Letters from the Sky” captures this fundamental human impulse, and Aaron Carl’s song “Sky” was a letter of sorts to his father, who died while Aaron was still young.

These examples reveal something deep about human nature. We hate to think of young children being raised without their parents. Our deepest desire is to find ways to always be truly present, even when physically absent.

Amiri Baraka’s piece is a letter to his child too, but an exceedingly odd sort of letter. His poem “The Death of Nick Charles” is part of his “Preface to a Twenty-Volume Suicide Note”. The preface to the preface is a letter to his daughter, but the book of poems is a preface to a suicide note. And it’s not just any suicide note: it’s a note that will take twenty volumes. Most parents can identify with CNN’s Nick Charles, who is recording letters for his daughter to read after his death. But few parents easily identify with Amiri Baraka, who plans a twenty-volume suicide note for his daughter.

Last week, my wife watched the movie Gattaca for the first time. It’s one of my favorite movies of all time, featuring a dystopian future where DNA completely determines people’s fates. The protagonist is a genetically inferior “in-valid” who purchases the superior DNA of a frustrated “valid” to accelerate his own career. My wife enjoyed the movie right up until the end, where the frustrated “valid” (with the superior DNA) immolates himself, after extracting enough biological material to allow the “in-valid” to live for the rest of his life. Before immolating himself, he writes a letter to his client/friend, explaining that he is leaving behind enough DNA to last for “two lifetimes”, and saying that he is going on a long trip. At this point, my wife was incredulous: “Why would he do that?!? It makes no sense!”

DNA is a sequence of only 4 bases, represented by the letters G, A, T, and C. In storing two lifetimes’ worth of biological specimens, the “valid” is composing a sort of twenty-volume suicide note, made of his DNA. After the “valid” kills himself, these letters remain with the “in-valid” protagonist, guaranteeing him a better life. Throughout the movie, we grow more sympathetic to the “valid”, and his final suicide shows definitively that he lives up to the promise of his superior genetics. It’s the most heroic scene of the movie.
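To make the “letters” metaphor concrete, here is a toy sketch (my own illustration, nothing from the film): a four-letter alphabet carries two bits per base, so any text can be spelled out in DNA.

```python
# Toy illustration (mine, not the film's): any text can be spelled in the
# four-letter DNA alphabet, since each base carries 2 bits of information.
BASES = "GATC"

def to_dna(text: str) -> str:
    """Encode each byte of `text` as four bases (8 bits / 2 bits per base)."""
    dna = []
    for byte in text.encode("utf-8"):
        for shift in (6, 4, 2, 0):
            dna.append(BASES[(byte >> shift) & 0b11])
    return "".join(dna)

def from_dna(dna: str) -> str:
    """Invert to_dna: every group of four bases becomes one byte."""
    data = bytearray()
    for i in range(0, len(dna), 4):
        byte = 0
        for base in dna[i:i + 4]:
            byte = (byte << 2) | BASES.index(base)
        data.append(byte)
    return data.decode("utf-8")

note = to_dna("two lifetimes")
assert from_dna(note) == "two lifetimes"
```

At that density, twenty volumes is an understatement: a single human genome runs to roughly three billion bases, orders of magnitude more “letters” than any printed suicide note could hold.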

Of course, the letters presume that we can freeze things in time. CNN’s Nick Charles can’t guarantee that his daughter will be the sort who cares about what he has to say, and his recorded messages may not be so relevant in 15 years. A tragically predictable example of this is the sexual abuse scandal recently brought to light within ABWE.

Twenty years ago, a 12-year-old missionary girl in Bangladesh was sexually abused by a trusted missionary doctor, who apparently had abused many other children without being caught. When the girl exposed the abuse at age 14, the missionary leaders forced her to sign a “confession” admitting that she was to blame, and relocated the pedophile doctor to a place where he continued to practice medicine on children. While extracting the “confession” from the victim, the church leaders isolated her from her parents, refused her requests to speak with them, and lied to her about their role. The girl was being victimized by someone she should have been able to trust, while her parents were kept absent. When the full details of the scandal came to light a few weeks ago, people were understandably outraged. The comments section of that blog post is like one long extended letter from all of the people who knew the girl, apologizing to her for letting her down, and crying out for justice on her behalf. Her parents, especially, expressed anguish at having been separated.

The predictable part, however, is what happened when all of these people tried to contact the girl, 20 years after letting her down. Everyone remembered the 14-year-old girl, and wrote the letters that they thought she needed to hear. But they apparently didn’t consider the fact that she might have moved on. When they finally reached her, the only response they received was, “The man never hurt me. Please don’t contact me ever again.”

When you write letters to your children, you’re writing to the people you imagine your children will be. And you’ll likely be wrong. Writing the letters may give us an illusion of control (and self-sacrifice provides the ultimate illusion of control). Such action may signal something about ourselves, but it rarely has the impact on our children that we expect.

Far from being a cause for hopelessness, though, this explains a primary reason that I chose to have children. To help illustrate the point, we need to take a short diversion to James Gleick’s new book, “The Information”.

The book is a tour de force, chronicling the rise of the most significant thing to happen in human history: our dawning awareness of “information”, from the ancient acquisition of language to poetry and metaphor, talking drums, and on to signals theory and bioinformatics. Our dawning ability to see the unity (the unified, technical sense of “information” didn’t even exist until the mid-twentieth century) is a phase change in human existence. Many people still don’t realize what has happened, and it’s astonishing to think that this new phase had been groaning and straining to be born since long before humans existed.

One person who saw clear glimpses of this future was Ada Lovelace, the daughter of the poet Lord Byron. Gleick’s book shares some of the letters she wrote to Charles Babbage, showing that her vision extended far beyond Babbage’s revolutionary “analytical engine”. Ada was a tragic figure. Her father abandoned her shortly after birth, she never met him, and cancer took her at thirty-six. Perhaps his poems gave her a sense of what kind of man he was, and of what letters he would have wanted to write to her. Perhaps the intensity revealed in her letters was motivated in part by a desire to make him proud. And you can imagine her letters being written to the future, to the intellectual children she could not live to know.

Abandoning your own child is a terrible thing, but having a child is ultimately an act of praise and affirmation. You know that your child will face hardship and suffering, and will eventually die. By choosing to have the child anyway, you are affirming that goodness endures; that it is all worth it in the end. This is what the faith of our fathers told us, and it is what we can now see clearly in “information”. What is good was there all along, although we could not see it. It cannot be unborn, and it endures. The very act of having a child speaks volumes, and is the first and best letter you ever write to your child.

Gleick sums it up nicely when he explains that it is pointless to grieve for the lost books burned in the library at Alexandria:

Vengeful conquerors burn books as if the enemy’s souls reside there, too. … You should no more grieve for the rest than for a buckle lost from your first shoe, or for your lesson book which will be lost when you are old. We shed as we pick up, like travelers who must carry everything in their arms, and what we let fall will be picked up by those behind. The procession is very long and life is very short. We die on the march. But there is nothing outside the march so nothing can be lost to it. The missing plays of Sophocles will turn up piece by piece, or be written again in another language.

Ugly Bags of Mostly Water

The other day, the kids and I watched a local production of Shakespeare’s “Cymbeline”. It’s a fairy tale about Cymbeline, king of Britain.

It’s one of my favorite Shakespeare plays, in part because of the hilarious caricature the bard makes of the physical reductionist mindset. You know the type of person I’m talking about. Evolution formed man from dust, and like the silicon-based life form in the old episode of “Star Trek: The Next Generation”, this person looks at humans and sees only the constituent parts. We humans are “ugly bags of mostly water”.

To this critic, our component parts such as atoms, molecules, and neurons are the primary reality. He insists that love is an illusion, and that life has no intrinsic value. If you’re not sophisticated or courageous enough to agree with him, you’re a superstitious nut. Such people might remind us of a child with Williams syndrome:

Asked to draw a bicycle, a person with Down’s syndrome will come up with something that’s crude but recognizable. Someone with William’s syndrome, on the other hand, will produce a drawing with the person underneath the bike, the chain stretched out below the wheels, and the pedals off to the lower left, connected to nothing. All the parts are there, but they’re not in the correct relationship to each other.

Bellugi has shown as much with a test in which she briefly presents a card with a large letter D made up of many small Ys and asks children to reproduce what they saw. Normal children reproduce the figure accurately. Children with Down syndrome generally draw a large D, ignoring the little Ys. Children with Williams syndrome, however, will draw a collection of Ys, but it won’t be arranged in the shape of a D. One group seems to see just the forest, while the other sees only the trees.

Cloten and Posthumus

In Shakespeare’s “Cymbeline”, the king’s daughter, Imogen, is in love with Posthumus, who is in exile. Imogen’s stepbrother, Cloten, attempts to woo her away from Posthumus, at the prompting of his mother (and Imogen’s stepmother) the queen. The characters of Cloten and Posthumus are typically played by the same actor, but their actions and personalities couldn’t be more different. Cloten is boorish, completely oblivious, and very physical. Posthumus is chivalrous, intense, and a model of elevated character. Shakespeare provides Cloten with dialogue that is laugh-out-loud funny at several points, and a great parody of a person who can’t see the forest for the trees. “Cymbeline” is one of Shakespeare’s more complicated plays, and I won’t even attempt to summarize the plot here. I’ll just share a sample of the lines that slice so sharply against the physicalist mindset; you’re sure to notice many more if you watch the play yourself.

At one point in the play, Cloten’s advisors tell him that music is a good way to woo a woman. He turns both music and wooing into a hopelessly physical activity, starting his song with human body parts and ending with animal body parts:

CLOTEN
I would this music would come: I am advised to give
her music o’ mornings; they say it will penetrate.

Enter Musicians
Come on; tune: if you can penetrate her with your
fingering, so; we’ll try with tongue too: if none
will do, let her remain; but I’ll never give o’er.
First, a very excellent good-conceited thing;
after, a wonderful sweet air, with admirable rich
words to it: and then let her consider.

SONG
Hark, hark! the lark at heaven’s gate sings,
And Phoebus ‘gins arise,
His steeds to water at those springs
On chaliced flowers that lies;
And winking Mary-buds begin
To ope their golden eyes:
With every thing that pretty is,
My lady sweet, arise:
Arise, arise.

CLOTEN
So, get you gone. If this penetrate, I will
consider your music the better: if it do not, it is
a vice in her ears, which horse-hairs and
calves’-guts, nor the voice of unpaved eunuch to
boot, can never amend.

Eventually, Imogen spurns Cloten, telling him that he is beneath Posthumus’s “meanest garment”. Cloten becomes obsessed with the garment, and with getting revenge.

IMOGEN
He never can meet more mischance than come
To be but named of thee. His meanest garment,
That ever hath but clipp’d his body, is dearer
In my respect than all the hairs above thee,
Were they all made such men.

CLOTEN
‘His garment!’ Now the devil–

CLOTEN
‘His garment!’

CLOTEN
You have abused me:
‘His meanest garment!’

CLOTEN
I’ll be revenged:
‘His meanest garment!’ Well.

As his hatred of Imogen grows, so does his inability to see her as anything other than parts. He literally can’t see the person for the body parts:

CLOTEN
I love and hate her: for she’s fair and royal,
And that she hath all courtly parts more exquisite
Than lady, ladies, woman; from every one
The best she hath, and she, of all compounded,
Outsells them all; I love her therefore: but
Disdaining me and throwing favours on
The low Posthumus slanders so her judgment
That what’s else rare is choked; and in that point
I will conclude to hate her, nay, indeed,
To be revenged upon her.

As with Werther in Goethe’s “The Sorrows of Young Werther”, Cloten’s speech becomes hyphenated as he loses his mind. Fixated on the garments, he declares his desire to see both Posthumus and Imogen completely humiliated and subjugated, like so many objects:

CLOTEN
Meet thee at Milford-Haven!–I forgot to ask him one
thing; I’ll remember’t anon:–even there, thou
villain Posthumus, will I kill thee. I would these
garments were come. She said upon a time–the
bitterness of it I now belch from my heart–that she
held the very garment of Posthumus in more respect
than my noble and natural person together with the
adornment of my qualities. With that suit upon my
back, will I ravish her: first kill him, and in her
eyes; there shall she see my valour, which will then
be a torment to her contempt. He on the ground, my
speech of insultment ended on his dead body, and
when my lust hath dined,–which, as I say, to vex
her I will execute in the clothes that she so
praised,–to the court I’ll knock her back, foot
her home again. She hath despised me rejoicingly,
and I’ll be merry in my revenge.

The line “knock her back, foot her home” is pure Shakespeare genius. Cloten is now so far gone that he is using body parts as adverbs and verbs! And one can only chuckle at Cloten’s insistence that he is a “natural person”. Indeed!

Cloten dresses himself up in Posthumus’s garments, and determines to cut off Posthumus’s head. He figures that he will get away with it, since his mother (and Imogen’s stepmother) is the queen:

CLOTEN
I am near to the place where they should meet, if
Pisanio have mapped it truly. How fit his garments
serve me! Why should his mistress, who was made by
him that made the tailor, not be fit too? the
rather–saving reverence of the word–for ’tis said
a woman’s fitness comes by fits. Therein I must
play the workman. I dare speak it to myself–for it
is not vain-glory for a man and his glass to confer
in his own chamber–I mean, the lines of my body are
as well drawn as his; no less young, more strong,
not beneath him in fortunes, beyond him in the
advantage of the time, above him in birth, alike
conversant in general services, and more remarkable
in single oppositions: yet this imperceiverant
thing loves him in my despite. What mortality is!
Posthumus, thy head, which now is growing upon thy
shoulders, shall within this hour be off; thy
mistress enforced; thy garments cut to pieces before
thy face: and all this done, spurn her home to her
father; who may haply be a little angry for my so
rough usage; but my mother, having power of his
testiness, shall turn all into my commendations.

Very soon after uttering this proclamation against Posthumus, Cloten provokes a young man named Guiderius, and Guiderius cuts Cloten’s head off. Like Haman’s, Cloten’s own words become his death sentence. His head was at the top of the mountain; now it is beneath the fish of the sea:

GUIDERIUS
With his own sword,
Which he did wave against my throat, I have ta’en
His head from him: I’ll throw’t into the creek
Behind our rock; and let it to the sea,
And tell the fishes he’s the queen’s son, Cloten:
That’s all I reck.

GUIDERIUS
Where’s my brother?
I have sent Cloten’s clotpoll down the stream,
In embassy to his mother: his body’s hostage
For his return.

Of course, it is not fitting for commoners to chop off the heads of nobles, no matter how boorish and clueless those nobles may be. One of the most moving parts of the play comes when Guiderius proudly tells the king that he has murdered Cloten:

GUIDERIUS
Let me end the story:
I slew him there.

CYMBELINE
Marry, the gods forfend!
I would not thy good deeds should from my lips
Pluck a hard sentence: prithee, valiant youth,
Deny’t again.

GUIDERIUS
I have spoke it, and I did it.

CYMBELINE
He was a prince.

GUIDERIUS
A most incivil one: the wrongs he did me
Were nothing prince-like; for he did provoke me
With language that would make me spurn the sea,
If it could so roar to me: I cut off’s head;
And am right glad he is not standing here
To tell this tale of mine.

CYMBELINE
I am sorry for thee:
By thine own tongue thou art condemn’d, and must
Endure our law: thou’rt dead.

This dialogue is clearly intended to mirror the story of Saul’s death in 2 Samuel 1:1-16. While nobody liked Saul or Cloten, neither was fair game for murder. Saul was the anointed, and Cloten was the son of a queen. Just as Cymbeline reluctantly concludes, “by thine own tongue thou art condemn’d”, David says, “Your blood be on your own head. Your own mouth testified against you”.

In Shakespeare’s version of the story, though, Guiderius is shown to be even more noble than the nobly-born Cloten, and is thus redeemed. It takes noble blood to redeem Guiderius (he is revealed to be Cymbeline’s long-lost son), but we also see that Guiderius is in every manner of action more noble than Cloten. His confession is not a witless blunder against sovereign order, but is instead an affirmation of noble principles that he was willing to die for: “With language that would make me spurn the sea, If it could so roar to me: I cut off’s head”.

Like all good fairy tales, the story ends happily.

Hell Bank Notes

Dandelionsmith has a great story about a family memento: a stone that he took from the foundation of his great-grandfather’s house. It’s worth reading the whole thing, but here are some excerpts. First, he recounts the hike out to find the old homestead:

To get a picture of the continuing saga, imagine a square section of land (640 acres) bounded on all four sides, first by six strands of barbed-wire fence, and beyond that, either a gravel or a tar road. Inside the fence was close-cropped pasture grass, lichen-covered rocks, clumps of thistle, and at least a quarter-mile walk. The sun was high, the air was dry, and the wind was imperceptible. Our sweat flowed while the grasshoppers jumped. I took the tyke on my shoulders when she couldn’t keep up the pace set by my septuagenarian parents (and their pace is another story).

And taking the stone from the foundation:

The house, like many others of its time in the late 1800s, had a stone foundation. That is, many large stones from the surrounding land were gathered together and with mortar added, formed the foundation of the building. In time, the old foundation crumbled, leaving the stones somewhat exposed. Figuring the building’s lack of integrity would cause total collapse before I had another chance to return, I pulled out a rock the size of a Thanksgiving turkey. It wasn’t supporting the structure, anymore, and I thought it might be nice to have the rose-tinted stone as a way to remember the old place.

And finally, the effect that this memento has on the family:

That rock moved with us from Iowa to Montana, from Montana to Granite Falls, from Granite Falls to our former country house, and from that house to this. Every time I look at it, I remember. Every time I look at it, I think of sharing the story with the younger children.

This story really resonated with me, because it’s almost identical to an experience of mine, but also quite different.

My story involves a visit with the kids and the spry septuagenarian in-laws to see the graves of the ancestors. We, too, had to get permission from a suspicious tenant before hiking through the sweltering heat in a rural area outside Shanghai. And I, too, had to carry the youngest, while the oldsters soldiered on ahead. I had explained to my kids that we were going to pay our respects to the ancestors, much as we would by laying flowers on a grave in America.

Before setting out on the hike, the in-laws had spent some time haggling with a local merchant for things to burn at the grave site. I liked the fact that our kids would be able to see the graves of their ancestors going back several generations, and form lasting memories. I was somewhat less enthused about the ritual of sacrifice to dead people, and we agreed that the kids needn’t participate.

When we arrived at the grave site and opened the bag of merchandise, the kids and I had quite a surprise. The bag was full of fake paper money emblazoned clearly in English with the words “Hell Bank Note”. I immediately assumed that someone with a cruel sense of humor had played a prank on the unsuspecting Chinese by stamping the money with “Hell Bank” in a foreign language. Startled, I asked my wife, “Do Chinese people know what this says in English?” She assured me that, yes, Chinese people were intentionally burning Hell bank notes for the ancestors. Of course, the first question the kids asked was, “Does this mean the ancestors are in Hell?!?” The next question was, “How will they spend the money if it just burned up?”

It certainly left a lasting impression on the kids, but not exactly the impression I expected.

I have to assume that the burning of paper money is a relatively new practice, at least compared with ancestor veneration itself, since paper money has only existed in China for about a millennium. And I’ve since learned that people burn paper houses and paper Porsches. I’m well aware that there is some semblance of a rationale given for these practices. But I can’t shake the feeling that some cruel person played a malicious joke on the unsuspecting population by introducing these relatively new traditions.

Participation

Arturo Vasquez on poetry:

It occurred to me while thinking about this how we naturally assume that prose is the “first” language and that “poetry” is a development from it. But why should this be so? Surely animals are born, copulate and die quite efficiently (sometimes more efficiently) than the “speaking” animal. Were the first words spoken by humans “Sell consols and buy blue chip”? Don’t you think it would have been more like:

Sing, O goddess, the anger of Achilles son of Peleus,
that brought countless ills upon the Achaeans.

This sums up what we’ve lost in the modern age. What is called “poetry” today is not some fanciful embellishment that we use to “pretty up” our prose. It’s our original, most authentic, and purest way of expressing ourselves. In modern times, we give priority to objectivity, and thus we value analytical and reductive prose. But that’s quite recent in evolutionary terms. We wear analytical prose like a hairshirt. Our bodies and minds weren’t designed for analytical reductiveness; we were made for participation.

Owen Barfield was an expert on the poetry and history of the English language, and often made this same point. Just yesterday, I discovered a fascinating blog, which has a great post about Owen Barfield’s “Unancestral Voice”:

Now, I just finished reading the first three chapters of Barfield’s Unancestral Voice, and my brain is on fire. In this short expanse of prose, Barfield turns Darwin on his head in a reverse manner to the way that Marx supposedly turned Hegel on his head. There was no inchoate, unreasoning, unKnowing process that willy-nilly resulted in man’s rational and linguistic capacities. His single phrase “The interior is anterior” liberated me to see what he had been saying all along. The “unfree wisdom” was what nature had all along. All of it, Plato, Aristotle, Jefferson, Einstein, was there, somewhere, encoded into the warp and woof of Creation, but it wasn’t free. It wasn’t yet self aware. And it wasn’t the result of material processes. And at the center of it was the Incarnation.