Noooooooo!

Ted Gioia:

You can create a unifying vision. You can build something that’s fair and transparent and gets people excited about music again. […]

Taylor Swift, you are the one person who can make this happen. I believe this is your destiny.

That makes me very, very sad. (It also makes me this.) Mostly because I think if T. Swift is the only hope for the music industry, then there is no hope for the music industry. Not that I’m qualified to disagree with Ted Gioia on anything, but… There’s way too much credit given here to “taking on the system”; all things Taylor Swift are very much part of the system. Generating cult-like interest in a tour of shitty pop songs is not the same thing as “generating interest in live music,” and deserves no congratulation as such. And as someone who takes the Widow’s Offering as the true standard for generosity, multi-multi-multi-millionaires handing out bonuses will never stop being unimpressive to me.

(To quote the great Raylan Givens, “Every longtime fugitive I’ve ever run down expects me to congratulate them for not doing what no one is supposed to be doing anyhow.” Mutatis mutandis and whatnot.)

I was happily ignorant of all this for a long time, mostly because I couldn’t have told you even one song she sang, or even recognized her music on the radio. (This is a place of peace I plan to return to, if at all possible.) But then I came to Montana last February, and after about the 40th of her songs played at work, and after about the 40th 60-year-old doctor saying “You know, her music is pretty good,” I thought I must be losing my freaking mind. Maybe Mainers have a built-in immunity to the influence of crowds, or maybe I’m just blind and oblivious, but I just didn’t see any of this before. Now I see it everywhere, and between the Trumpers and the Swifties out here, I expect the universe to implode at any minute.

In fact, I’ll go a step further. Show me that Taylor Swift fans can be talked away from the punch bowl, and I’ll believe there’s hope for the Republican Party — no, I’ll believe there’s hope for both political parties. Apples and oranges? Maybe. Maybe.

But at the end of the day, I can’t tell you much about meaningful differences between many of the current cultural obsessions — can’t meaningfully differentiate between the Swiftie phenomenon and the recent teen obsession with Prime energy drinks or the re-obsession with Crocs.

Honestly, if Taylor Swift can marshal her powers for the forces of good, then more power to her. I’m not planting a flag or claiming a hill to die on here. I could certainly be wrong, and like many things in life, I’m happy enough to be wrong. Call it a rolling of the eyes from the peanut gallery or something. But I’m simple (I’ll always take Wendell Berry over the next Big Idea), and the whole “use the momentum of bad ideas” strategy is exhausting.

That being said, there is a great takeaway line from the article, one from a tune Gioia has been singing for a long time:

It’s hardly an exaggeration to say that ten thousand musicians live on peanuts so Spotify can cut a deal with Joe Rogan.

I’ve never had a Spotify account. (My music streaming sins were committed with Amazon, to my everlasting shame.) But, while I don’t think that Swiftogeddon will be anyone’s answer to anything — much less a force for a “unifying vision” — refusing Spotify’s poisoned apple is a simple song I’m happy to keep on singing.


Update:

For what it’s worth, I regret saying “shitty.” That was unkind, and certainly undermines what I wanted to say, which I think has little to do with “aesthetics.” Though I do think that arguments around aesthetics are unavoidable (see this post — different subject, same idea.)

Every alternative suggestion I have is based on small things for small people to do. (I have Curly Howard’s voice in my head on a regular basis: “A simple job for simple people!”) Pay for the music you listen to (i.e. buy albums) and support local music venues. It isn’t revolutionary, but at this point, even these basic things seem to constitute reform. 

I loved being a small part of One Longfellow Square back home, for instance. And they remain my favorite live music venue in the world. And I’m looking forward to doing more of that when I get back there. For streaming, I’ve switched to Bandcamp. We’ve even talked about what it would look like to go back to using CDs and DVDs so that our son has a more streaming-free, physical experience of music and movies — and thus a better experience of support for those things — in childhood. And frankly, we’d benefit just as much ourselves.

I get that almost nothing I support is very new or ingenious or groundbreaking. I get that the impact seems so small as to be negligible. And I get that this has been a problem that has plagued lower-case-c conservatives of technoculture (if that’s a word), from Marshall McLuhan to Wendell Berry to your local librarian. I’m just convinced that, rather than change the “strategy” of what we know to be good, the questions we face are always the same: How do we maintain our lives and support our culture in good faith, and how can we be more convincing to more people that quality work — good, gritty work — is the way?

All of the writers and thinkers that I love have had at least a relatively small following, and even fewer real supporters. If they were looking for some measurable standard by which to gauge their success — by which to be “realistic” — I’m sure they would’ve quit long ago. But the fact that they weren’t is an essential part of the reason I have found them to be trustworthy in the first place. 

For the record, I think that Ted Gioia has taken a higher ground than I in addressing Taylor Swift the individual. While my disagreements stand, what I cringe at is the Taylor Swift phenomenon, the “brand,” which, when a star gets as big as she is, is as real as anything else. But if T. Swift brought her acoustic guitar down to li’l ol’ One Longfellow Square, I would absolutely support her doing so.

***Please don’t misinterpret the fact that I am talking right now as genuine interest in Taylor Swift and attempt to discuss her career with me further. End of speech.

healthcare reality check

Freddie deBoer:

But we don’t have a free market healthcare system or anything like it. Most people intuitively believe that everyone should have access to healthcare, even if they deeply disagree on how best to provision it; at the very least, few people would countenance children dying of preventable illnesses simply because their parents are poor. And the crucial thing to understand is that once you extend any societal commitment to provide care, you’re making government intervention into the system inevitable. That’s why various arguments against regulation and entitlements are so wrongheaded, because they suggest a system untainted by government influence where no such system can exist. The default American system of employer-provided health insurance has always left millions uncovered, chained people to jobs they would like to leave behind, and done nothing to guarantee access after retirement. […]

I am well aware that I’m not going to change anyone’s mind here. What I’m trying to do today is to demonstrate that, first, as soon as we make the moral determination that everyone should have at least some access to medical care regardless of ability to pay, any “free market” issues go out the window and government involvement becomes necessary and inevitable. Second, the point is that getting to this point didn’t take any rabid socialist sentiments or anti-capitalist assumptions. You can get here purely through a pragmatic consideration of the underlying reality. Medicine just is not like other human goods; the basics of capitalism don’t work when people simply cannot choose to go without an expensive service, and if we agree that they shouldn’t have to, then we’re left to comprehend how much simpler, more efficient, and more humane our system could be, if we committed to providing care to everyone via the only organization in the country large and rich enough to accomplish that, the United States government.

the intractably human

Matthew Mullins, reflecting on what made Frederick Buechner such a deeply empathic writer:

The theologian, in a commendable attempt to make the ways of God known to humans, drifts away from the death of his dear friend, or her mother, or your mother’s dear friend, and sets out to understand death as one abstract element in the larger rubric of God’s world.

What is lost in this effort to fit death into the rational matrix of God’s world is the particularity of human experience that is integral to human understanding. […]

The poet does not seek to translate experience into the language of certainty, nor to rationalize it into a coherent and reliable system. Rather, the poet attempts to recreate the experience itself in language that enables readers to inhabit the very same experience, or a similar one perhaps. The poet sets out to use language that will allow readers to imagine a world in which they themselves can see, hear, and feel what’s happened, and to grapple with what it means to care for a sick child, or to get stuck in a thunderstorm, or to wake up from a dream. Literature doesn’t try to tell you what to think about the things that happen to you; it tries to make things happen to you.

Oh, I like this very much.

I don’t remember when it started exactly, but somewhere around 2015, I started loosening my grip when it came to “theology,” at least as I had usually understood it. I was tired and feeling (always mildly and often heavily) hypocritical and inconsistent. I don’t know what originally changed. I know there were events from 2017 and on that were very significant for me. But the “trouble” started before then and I’ve never quite been able to put a finger on it.

Some things I do remember are things I read. They are the kinds of things that, when you read them for the first time, you feel as if you’ve been preparing for it. You’re not standing in the dark and suddenly the lights come on; it’s more like watching a sunrise and then, one second more, there it is. Or maybe it is like feeling for a light switch in the dark. You don’t know exactly where or when you’ll find it, but you’re feeling around the walls for a switch and when you do find it and the lights come on you know exactly where you are.

Here’s how Seamus Heaney describes what I think is this same feeling. Referring to Borges’s introduction to his own first book of poems:

Borges is talking about the fluid, exhilarating moment which lies at the heart of any memorable reading, the undisappointed joy of finding that everything holds up and answers the desire that it awakens. At such moments, the delight of having all one’s faculties simultaneously provoked and gratified is like gaining an upper hand over all that is contingent and (as Borges says) ‘inconsequential.’ There is a sensation both of arrival and of prospect, so that one does indeed seem to ‘recover a past’ and ‘prefigure a future’, and thereby to complete the circle of one’s being.

To complete the circle of one’s being — and not only once, and never perfectly, but continuously to do so. This is what I mean by being prepared for it. The fact is, the best reading a person will ever do isn’t the kind that corrects all of his or her mistakes, but that steers toward completion those thousand internal works which, whether we know it or not, are just waiting for the right word to come along.

I remember, for example, reading Gilbert Meilaender’s book on C. S. Lewis, The Taste for the Other, and the description of what he calls Lewis’s “reality principle,” that we are made for relationship with God and each other and the world. Given the inherent messiness, sacrifice, self-denial, discovery, and everything else that goes with it, the opposite of “reality” is the purchasing of “a tidy view of life at the cost of the world.” Lewis believed that, rather than systematizing the world, “the theologian must give himself, as it were, to the amazing multiplicity of experience and wrestle with the data of a pilgrim existence.”

So while I can’t say exactly why it started to happen, one way to describe what did happen is that I started losing whatever tidy picture of the world I had had. Which meant that I lost the need to explain things that I didn’t and shouldn’t know how to explain. I started losing the need for answers, and therefore the (usually unstated) need for perfection. And what followed was greater, more natural human sincerity. Genuineness, in other words.

Here’s Czeslaw Milosz, describing something very similar:

When I was, as they say, in harmony with God and the world, I felt I was false, as if pretending to be somebody else. I recovered my identity when I found myself again in the skin of a sinner and nonbeliever. This repeated itself in my life several times. For, undoubtedly, I liked the image of myself as a decent man, but, immediately after I put that mask on, my conscience whispered that I was deceiving others and myself.

The notion of sacrum is necessary but impossible without experiencing sin. I am dirty, I am a sinner, I am unworthy, and not even because of my behavior but because of the evil sitting in me. And only when I conceded that it was not for me to reach so high have I felt that I was genuine.

There is a lot to be said about this feeling. It matters to me that Milosz doesn’t say that he’s realized there’s nothing actually wrong with him. Somehow he realizes that he’s been unnecessarily hard on himself without being entirely wrong about himself. Imperfection is simply his condition. I think that sometimes when we notice that imperfection, we are driven to undo it. Or, if not to undo it, perhaps by systematizing it and telling ourselves and others that we’ve understood it all, we can get a similar feeling. We don’t tell ourselves that we’ve undone our condition per se; we’re not that openly arrogant. But our (supposed) extensive grasp of the situation lets us feel just about as good as if we did somehow get around it.

The opposite of getting around it is jumping into it. “Always,” Lewis wrote, “one must throw oneself into the wave.”

Kay Ryan said of Emily Dickinson that “She sends very hot things through the cooling coils of her poems and plays with them in her bare hands. For of course poems must include hot things; if all the hot things are removed the result cannot be poetry since it is the job of poetry to remain open to the whole catastrophe.”

“Open to the whole catastrophe.” Is that not a better image of theology — an image of better theology — than the “systematic” one that has dominated the craft for so long?

Of course, the only possible way to end this is with a poem.

Richard Wilbur:

SOMEONE TALKING TO HIMSELF

Love is the greatest mercy,
A volley of the sun
That lashes all with shade,
That first day be mended;
And yet, so soon undone,
It is the lover’s curse
Till time be comprehended
And the flawed heart unmade.
What can I do but move
From folly to defeat,
And call that sorrow sweet
That teaches us to see
The final face of love
In what we cannot be?

the self-identifying con

Mary Harrington:

So where I found a tiny subculture of those who felt at odds with sex stereotypes and their bodies, Keira Bell was greeted by 15 years’ worth of mass acculturation to this avatar-first understanding of ‘identity’. She was embraced by an international, well-networked digital community, dedicated to promoting the idea that our ‘selves’ are self-created and independent of our bodies, and backed up by serious lobbying money and a well-developed medical infrastructure. […]

But in turn the traces and scars offer a clue to a more grounded and human understanding of self than the digital one. The scars, wrinkles and other traces now building up on my gradually ageing physiology are ‘me’ in a sense that’s far more profound than Sebastian ever was. And seen thus, ‘who you truly are’ isn’t something that comes from within, to be greeted with awed affirmation by a hushed and waiting world. Rather, it’s something that emerges, over time, in conversation with that world.

“the tyranny of pure analysis”

Ronald Dworkin:

[T]he whole enterprise is based on an error. Millions of careers rest on the false belief that by analyzing human phenomena from the outside, and by gathering more and more knowledge through research, we can get an accurate representation of reality to substitute for knowing these phenomena from the inside, through intuition. […]

Analysis has generated thousands of empirical concepts that large numbers of people are believed to share. Examples include “rational decision making,” “wellness,” “whiteness,” and “addiction,” to name just a few. Much of our economy is built around these words in the form of services sold or models constructed, while millions of people are employed to perform research around these concepts or simply offer services in their name. Yet much of this is based on an illusion. The concepts may represent certain aspects of people, but they are not parts of people, as people’s minds cannot really be broken down into parts.

The philosopher Henri Bergson illustrated the futility of relying solely on the analytical method when he described breaking down a poem into letters, and then, without knowing the poem’s meaning, trying to reconstitute the poem through the letters alone. It can’t be done, he said, because the letters are not “parts” of the poem; they are merely symbolic elements used to express the poem’s meaning. Rather than fragments of meaning, the letters are merely fragments of symbols. Applying analysis to the poem’s letters without any intuition of the poem’s meaning yields a ridiculous outcome.

Reconstituting the totality of a person knowing only the “parts” of his or her mind is equally nonsensical. What we think of as parts are just fragments of feelings, thoughts, or sensations that run through the mind and have been given names, but which cannot be assembled to estimate the meaning of any person’s life. To understand that we need intuition.

“Burning Man is a Capitalist Lie”

Mary Harrington:

[F]or denizens of Black Rock City, there’s an outside to the experience of hardship and scarcity. The Google multimillionaires who helicopter into Nevada for a week of self-expression and gift economy against the (usually) arid backdrop of a dusty lakebed enact a crystallised essence of the American civilisation’s founding myth of abundance manifested ex nihilo and brought into being through resourcefulness and creativity. But in truth they’re play-acting at the ideal, having pre-resourced that resourcefulness and creativity via a much more cut-throat reality of material competition in which there are, unlike in Black Rock City, winners and losers.

And unlike their fellow-countrymen in the “flyover states” — the losers, in fact, in the real economy that enables the Burning Man fantasy one — most of Black Rock City’s citizens have the option at any time to pull the ripcord, and exit desert survivalism and gift economies for an air-conditioned condo in some of the world’s most expensive postcodes. Unlike those who inhabit that scarcity all the time, they can enjoy the generosity and camaraderie that comes with scarce resources, safe in the knowledge that they have largely foreclosed the risk of genuine material suffering or interpersonal violence that so often accompanies real scarcity.

My own [Burning Man experience], and the flyover-state tour that preceded it, happened before widespread fentanyl abuse blighted the American interior. The period since my visit has also seen the Great Crash, and widening income inequality. It’s a safe bet that in the intervening period the contrast has only grown starker, between those in the Land of the Free who can afford to play at trying to flourish in a world of scarcity, and those for whom that’s just everyday life.

inescapably gnostic

Anne Snyder:

I understand that every generation wants to find its “thing,” its scene, a new level of probing the boundaries of life’s possibilities to set it apart from what the previous generation accepted as reality. And there’s no denying that Gen Z has done away with some of the more straitjacketing stereotypes that Christians should have done away with long ago. But the solution that our culture is offering, the so-called positive vision of valuing souls as the drivers of our bodies, which are mere instruments on which to express one’s freedom and felt identity, is inescapably gnostic. It should not go unnoticed that this ideology has gained rapid entrée into the courts of polite consensus precisely when we’ve never been less physically present to each other, to ourselves, or to the earth. When seven hours of the average person’s day is spent on a screen, with a limited if emotionally powerful toolbox of words and images, it’s no wonder that a disembodied reality finds so much purchase among us. Except that this fantasy is not constrained to the screen: its technologies are not merely digital but surgical and chemical, doing violence to bodies, to psyches, to families, and to public squares.

naive and healthy

I found this simple line from Anne Snyder insightful:

I entered the real world naive, but healthy.

Is it not the case today that many seem hellbent on sending kids out into the world “informed” but shockingly unhealthy?

There are many things that have drawn me to Hannah Arendt over the years, but one that I often think about is this story from Elisabeth Young-Bruehl’s 1982 biography, Hannah Arendt: For Love of the World:

Arendt chastised American progressive education for artificially depriving children of their protected, prepolitical time and space, the school; for destroying the natural authority teachers should have over children; and for enjoining children to behave like little adults with opinions of their own. Adults must not, she urged, forgo their responsibilities for children as children, they must not refuse to children a sheltered period for maturation, for being at home in the world. “Our hope always hangs on the new which every generation [by virtue of natality] brings; but precisely because we base our only hope on this, we destroy everything if we try to control the new [so] that we, the old, can dictate how it will look. Exactly for the sake of what is new and revolutionary in every child, education must be conservative.” Hannah Arendt was very strict about this principle, and she maintained it in her own political action. Some years later, when a branch of the Student Mobilization Committee to End the War in Vietnam contacted her for a donation, she agreed, but then she changed her mind after reading their pamphlet: “When we talked over the phone,” she informed the committee’s fund raiser, “I was not aware that you intend to involve high school students, and I regret to tell you that I will not give a penny for this purpose, because I disagree with the advisability of mobilizing children in political matters.” Her rule of thumb was “from eighteen to eighty,” and she was flexible only at the upper limit.

As was so often the case with Hannah Arendt, her plea for conservatism was the vehicle for a revolutionary impulse. So-called revolutionaries, who try to insure the longevity of their revolution through education, produce indoctrinated, unspontaneous young: “To [forcefully] prepare a new generation for a new world can only mean that one wishes to strike from the newcomers’ hands their own chances at the new.” Educators should introduce children to the world, give them the tools for understanding it accurately and impartially, so that the children can, when they mature, act in the world intelligently.

I’m glad I went back and read this again, because I think about it all the time. For one thing, it simply emphasizes what we all should know intuitively: that children should be given a broad span of time to mature while feeling safe and confident. I remember years ago my sister showing me a video on Facebook of a friend of hers, a teacher, marching little 5-year-olds around in front of a classroom with picket signs of some kind. (I don’t recall if this was before or after reading Arendt’s biography.)

Now, I have no doubt that this teacher believed she was merely involving the children in something meaningful, instilling in them a strong sense of civic duty and whatnot. But Arendt’s response to this behavior toward children is twofold. 1) Rather than instilling a strong sense of civic duty in them, it more likely will instill a strong sense of anxiety, about one or any number of issues, and possibly do so from an age before which they have no memory. And 2) the tragedy is deeper and the irony thicker because, by trying to control the next generation’s revolution, you smother that revolution, you rob them of the chance to offer what is new or revolutionary in themselves.

Opp thoughts


While we were down in Phoenix last month, Meghan and I made it to the movie theater, for only the second time in our three years together. Of course, it was to see Christopher Nolan’s Oppenheimer.

One of the most important (and obvious) metrics for measuring the success of a film is the engrossing factor. If a movie is able to pull you into itself for the duration (three hours, in this case), then no matter what else, the movie is, perhaps in the most significant way, successful. By this measure, Oppenheimer was absolutely successful. I was completely engrossed. And I’m sure it was successful for all the reasons that knowledgeable movie critics would give. As far as I can tell, for instance, this may be one of the most historically accurate movies ever made. That achievement is quite interesting in itself. (Although, I’m less certain of how accurately it portrays Oppenheimer’s by-all-accounts complicated personality. The Oppenheimer of Oppenheimer is something of an enigma, but not a particularly complicated one.)

But I am not a movie critic and, outside of conversations over beer, I have no desire to be one. Neither am I a(n) historian. While I’m sure I was engrossed for all the same reasons that over 90% of critics and audience members were, I was also caught up in the film for more personal reasons.

The big moment, the apex of the movie, was not particularly enjoyable for me. I have no problem admitting (and neither does Meghan) that when it comes to movies, I am the crier in the relationship. Those moments are probably quite predictable most of the time. But who gets choked up over Trinity, over a recreation of the first successful nuclear bomb test? Me, that’s who.

Oh, I’m sure the combination of music and silence and all the artistic buildup to that moment in the film did its work on me. But I also know what I was feeling, and that I have felt it before. I put up an essay from a history class that is about as much as I have ever said about that feeling. Here’s how it starts:

When I was eighteen years old, I remember feeling the chills of inspiration as my Air Force commander stood before us — in a church on Sunday no less — to remind my fellow airmen and me of the glory of our profession. “Make no mistake about it,” he said, “we are here to break things and kill people.” Energized by his words, we were young, motivated, and stupid. In recent months, I have reflected often on that moment, on what I thought and felt then. I spent most of my summer this year [2017] working at a field hospital in Mosul, Iraq, where the familiar sound of bombs is not relinquished to a bitter past or confined to foreign soil, where violence in its most aggressive forms and destructive consequences is an ever-present reality. After seeing the other side of that “glory” — the broken homes and dead victims — I have felt increasingly restless with the thought of war.

… with the thought of war or any instrument whose chief purpose is war or with anything that reminds me of it.

I used to love fireworks, deeply, especially the sound and the boom you feel in your chest. Fun, excitement, awe, inspiration, brilliance — I felt only good and empowering things. Now, I’d rather be buried in the ground with a straw to breathe through until the celebration is over. I grew up shooting guns about as regularly as anything else, and I loved those, too. But when my friend took a few of us to the shooting range for his birthday in late 2017, it was all I could do to avoid a panic attack. As I write that, it sounds even to me like a bit of an exaggeration, but if it is, it is only a slight one. The fact is, I did not enjoy any of it for one second and couldn’t wait to get out of there. And that was just such a complete and sudden one-eighty for me. I just don’t like any of it anymore, and I think it’s absurd that I ever did.

(There’s a scene in Marilynne Robinson’s Gilead where John Ames’s father digs up a gun he had buried, smashes it to pieces on a stump with a borrowed maul, and throws the separate pieces of it in a river. “I got the impression,” Ames says, “he wished they didn’t exist at all, that he wouldn’t really have been content to drop them in the ocean, that he’d have set about to retrieve them again from any depth at all if he’d thought of a way to make them vanish entirely.” I feel exactly the same way.)

And yet, a very large number of us can’t seem to get over how cool it all is. And most of the time I can’t really blame anyone. The fact is, that change in me didn’t happen because I pulled my heart up by its bootstraps. There is an inherent separation between the makers of the instruments of modern warfare and the lives they leave buried and bleeding in the rubble. And it plays on another inherent separation in humankind altogether, one that can only be bridged by experience and by empathy.

“Who could convey this understanding across the barriers of his own human experience?” asked Aleksandr Solzhenitsyn. “Who could impress upon a sluggish and obstinate human being someone else’s far off sorrows or joys, who could give him an insight into magnitudes of events and into delusions which he has never himself experienced?”

One of the things that has struck me the hardest is that, even as a member of the military, I didn’t experience this, had no thought about the hundreds of sorties I supported, where they went or what happened when they got there. But that’s normal, isn’t it? It’s sad and it’s horrifying, but isn’t it normal? It’s certainly been normal with the history of nuclear bombs.

Why is it, for instance, that every single adult that I can remember while I was growing up had the same exact write-off response to the history of Hiroshima and Nagasaki? How could we justify it? No debate. No difficulty. No deep regret over vaporizing and crushing and radiating civilians. In my experience, having this conversation with the previous generation is impossible. The narrative is too deeply embedded for any reexamination: It had to be done. The Japanese would never have surrendered. The Bomb saved lives. And so on. Bottom line, it was un-American — and, somehow, un-Christian — to think or say otherwise.

Perfectly normal.

I’ve lost track now of where I was going with this, or maybe I’ve already said what I wanted to say. When the bomb went off on the screen in that theater, I felt the same way I feel when they go off in real life. I didn’t see scientific achievement; I saw mass murder and violence. I saw children on stretchers, in ICUs, and in graves. 

Here’s what Rand Richards Cooper wrote in Commonweal after seeing the film:

After watching Oppenheimer, I streamed The Day After Trinity. (“Trinity” refers to Oppenheimer’s name for the bomb test site, inspired by a Donne poem, and the “day after” refers to yet another hearing, in 1965, at which Oppenheimer was asked about talks on halting the spread of nukes, and responded, “It’s twenty years too late. It should have been done the day after Trinity.”) It may seem paradoxical to suggest that a documentary more acutely conveys the tragedy of Los Alamos than a feature film does. Yet for me at least, it did. In the decades since the Manhattan Project, many commentators seeking to capture the dreadful awe that accompanied the advent of the atomic bomb have invoked Oppenheimer’s quotation from the Bhagavad Gita—“Now I am become death, the destroyer of worlds”—and Nolan leans heavily on it, using it not once but twice. The documentary pursues the horror more subtly, in a banality-of-evil way. It contains a small but terrible moment, when the Manhattan Project physicist Robert Serber displays a section of a wall removed from a classroom in Nagasaki, bearing the outline of a window sash imprinted on it photographically by the blast. “You see the angle here?” Serber says, holding it up. “That shows you that the bomb went off at exactly the height it was supposed to.” And Serber can’t quite suppress a smile—quickly followed by a look of sickly confusion. All these years later, he still feels pride.

That look does more to evoke the scientists’ moral disarray than does the pose of abject contrition in which the last third of Nolan’s film freezes Robert Oppenheimer. Serber’s smile reveals candor about the thrills of scientific discovery, even as his sickened look betrays an awareness of what resulted when those thrills were channeled into the priorities of what Eisenhower himself would call the military-industrial complex. What does it mean—for science and its practitioners, for civilization itself—when mass death becomes, well, a project?

Neil Postman made a similar point in 1985:

[T]o the modern mind … the truth in economics is believed to be best discovered and expressed in numbers. Perhaps it is. I will not argue the point. I mean only to call attention to the fact that there is a certain measure of arbitrariness in the forms that truth-telling may take. We must remember that Galileo merely said that the language of nature is written in mathematics. He did not say everything is. And even the truth about nature need not be expressed in mathematics. For most of human history, the language of nature has been the language of myth and ritual. These forms, one might add, had the virtues of leaving nature unthreatened and of encouraging the belief that human beings are part of it.

Postman’s next line echoes in my head on a regular basis.

It hardly befits a people who stand ready to blow up the planet to praise themselves too vigorously for having found the true way to talk about nature.

And yet we do praise ourselves regularly for just how so-damn-smart we’ve gotten, don’t we? One of the things that struck me the most watching Oppenheimer was how surprisingly boyish the whole enterprise seemed. I’ve had the tendency my whole life to think of many of these historical scientists as men of greatness. But what I saw in that theater — and maybe this is the most historically accurate thing of all — were only boys playing with chemistry sets.

Maybe this is simply one of the tragedies of growing up, realizing that there are no adults in the room. Or very few, anyway. In this case, just big kids and their chemistry sets. “You see,” the little boy said with a proud smile, “it went off at exactly the height it was supposed to.”

Satyagraha: A Brief Christian Perspective

An essay I wrote for a history class in 2017


When I was eighteen years old, I remember feeling the chills of inspiration as my Air Force commander stood before us — in a church on Sunday no less — to remind my fellow airmen and me of the glory of our profession. “Make no mistake about it,” he said, “we are here to break things and kill people.” Energized by his words, we were young, motivated, and stupid. In recent months, I have reflected often on that moment, on what I thought and felt then. I spent most of my summer this year [2017] working at a field hospital in Mosul, Iraq, where the familiar sound of bombs is not relinquished to a bitter past or confined to foreign soil, where violence in its most aggressive forms and destructive consequences is an ever-present reality. After seeing the other side of that “glory” — the broken homes and dead victims — I have felt increasingly restless with the thought of war. And as a Christian, I have been compelled to ponder afresh words very different from the ones my commander spoke, and words that are vastly more common to a church setting: Blessed are the peacemakers, for they shall be called the children of God (Matt. 5:9).

If you asked a random stranger today who he or she thought has been the most influential promoter of peace in the world, there is a good chance the answer would be Mahatma Gandhi. With the early 20th century in the background, his fame is not difficult to understand. Gandhi’s popularized commitment to nonviolence stands in such contradistinction to the aura of his time — a century suffused with as much violence as the world has ever known — that the merit of his methods seems practically self-evident, a light shining in the dark. And that is, I think, the way Gandhi saw it as well, and why he named his version of nonviolent resistance satyagraha (roughly translated as “truth force”), both to differentiate it from the philosophies from which he borrowed and to promote it as the ultimate force for good. Satyagraha was, hence, not just an alternative to war and violence but a complete replacement, able (alone) to accomplish what violence never could.

In the West, there is probably no greater example not merely of the failure but of the complete inability of violence to resolve conflict or hatred than the American Civil War. As the Civil Rights era struggled to complete the unfinished business of the postbellum years, it was clear that war had not solved the underlying issues of slavery and racism in America. In 1868, Father Paul Joseph Munz wrote, “The North can free the slaves with force, but it cannot civilize them and deliver them from contempt and mistreatment.” While the first victory had been won by force of arms, the ultimate victory would require another approach. To fight contempt, we need a very different force than the one war offers. For this task, only a nonviolent force will do, and to this end Martin Luther King, Jr. employed Gandhi’s methods with great success.

With the virtue of Gandhi’s method in mind, King said he believed that “more than anybody else in the modern world, [Gandhi] caught the spirit of Jesus Christ and lived it more completely in his life.” As I think about that statement, I have to wonder why I’m so inclined to disagree. Jesus did teach, after all, in the Sermon on the Mount, to turn the other cheek rather than resist an evil person (Matt. 5:39). In a moment of introspection, I’m tempted to think that the part of me that thoughtlessly cheered at the notion of destroying homes and lives is the same stubbornness and pride that keeps me from embracing a more passive political stance. Perhaps that is still part of it, but it has been fourteen years since I heard my commander give that speech, and today I am not the least bit moved by it. Yet, even stimulated by a fresh hatred for violence, I still find something in satyagraha distasteful. In fact, the very same thing that makes me hate violence is also what drives me to reject Gandhi’s approach. In all its historical detail, I’m sure the problem would be more complex, but as I see it, it is Gandhi’s approach to truth that causes the most significant problem.

Let me return to Iraq for a moment. I had been there for about a week when she showed up — a beautiful eleven-year-old girl, barely clinging to life. Though I cannot use her real name, I’ll call her Jaleesa. I remember setting up for the surgery, not knowing her face or who she was, knowing something only in the abstract: a transfer was coming in and we needed to do an emergency laparotomy for an abdominal blast wound. Within fifteen minutes, she was in the room. As the surgeons began operating, one of the national doctors realized that she had been his patient only a day or two before, on the other side of the Tigris River. He had performed an emergency surgery after she was badly wounded by shrapnel from a mortar round. Not believing she would live long, he stitched her up as best he could and left. Yet here she was again, about 10 miles east, still fighting to live.

Jaleesa stayed with us for a couple weeks, receiving several operations and spending much of the time with a hideous-looking ABRA system holding and stretching her belly together. Eventually she was transferred to a hospital in Erbil, where she died about a week later. It is both wonderful and shameful for me to admit that I don’t think I have ever loved anyone as much as I loved that little girl. In the many hours I spent with her, in the operating room or sitting next to her bed, I grew to hate the thought of what a bomb can do. And whether war is right or wrong, necessary or not, I grew to hate that part of me that was ever inspired by such things. How many little girls like her had been casualties of what I so thoughtlessly cheered for? (Think about that the next time those fighter jets fly over your favorite football stadium.) At this point, however, rather than turning to a theory of nonviolence, this is where I would put Gandhi to the test, and where I inevitably find his theories wanting.

Imagine for a moment that Gandhi was there in Mosul that day, watching from a distance. An ISIS fighter is about to drop a shell into the barrel of a mortar, the arc aimed at Gandhi’s own eleven-year-old daughter. With a sniper rifle in his hands, if Gandhi pulls the trigger, the ISIS fighter dies; if he does nothing, his daughter spends the next several weeks in pain, and dies. It’s not a very original scenario, I know, but it remains an important one. What do we think Gandhi would do if he were there? We can be fairly certain, because he told every mother and father in Britain to do the same thing when he encouraged them to “allow yourself, man, woman and child, to be slaughtered” by the Nazis rather than resist them. I don’t doubt that it is an interesting philosophical and historical thing to ponder — what the world would look like if the Allied Powers had refused to fight — but it remains impossible for me to consider it noble, if it isn’t in fact criminal, for Gandhi to choose nonviolence when given the opportunity to save Jaleesa — to choose a principle over a child.

As I said before, I think the fundamental problem lies in Gandhi’s view of truth, or, more specifically, in his view of God. Bhikhu Parekh wrote that Gandhi did not ultimately believe God was truth, but that truth was God. For Gandhi, God was either reduced to or overshadowed by a principle: “His cosmic spirit was therefore not a creator but a principle of order, a supreme intelligence infusing and regulating the universe from within.” In other words, Gandhi believed that ultimate reality was an impersonal force. And therein lies the problem: what is truth without love, and what is love if it is not personal? Committing myself to a universal principle of nonviolence does not provide me with a noble answer to the problems of violence. Instead, it conveniently removes me from the personal nature of those problems. Rather than being moved by personal love, I am forced to ask what an impersonal force requires of me. This, I think, is why Gandhi could tell every British parent to let the Nazis kill their children (and why he could so arrogantly advise the Jews to adopt his method — as though resistance to the British was somehow on par with resistance to Germany).

Gandhi borrowed much from the Christian faith, and he quoted from the Bible often, but, as Parekh points out, his studies “did not involve understanding religious traditions in their own terms.” Aside from the fact that this approach itself is inherently untruthful, it means that he misunderstood the fundamental point of Christianity. Jesus did not simply point to the truth, as Gandhi would have liked to believe; he claimed to be the truth. For the Christian, ultimate reality, virtue, knowledge — these are all found in a person. To overcome doubt in times of trouble, the author of the letter to the Hebrews encouraged his readers by reminding them where to fix their attention — not on a principle of virtue or hope, but on Jesus, “the author and perfecter of faith.” For the Christian, the question is not what principle to adhere to, but how best to love and protect life in honor of the Life to which they continually look. Even love itself is not a principle that exists on its own, but something found in the character of the person who reveals God to us. The truth is always and in every way personal because we find and know it most clearly in a person, not in a force. For Gandhi, the answer lay in the principle, which (ideally) allows only one option. For the person guided by personal love, however, whether violence is justified will always remain an open question, with different situations leading to different answers. That means that someone could very well be compelled to use violence in the name of love and truth — an idea rooted in the personal nature of morality, which Gandhi seemed intent on eliminating.

It’s important to note that I’m not talking about repaying evil for evil. It was not uncommon for us to treat members of ISIS at the hospital, and it has not escaped my mind that one of them may very well have been the one to kill Jaleesa, or any of the other victims we treated or never got the chance to treat. Trying to understand how violence may justifiably be used against a man in one situation does not imply that one may not try to save his life in another. These situations are difficult, and they require careful attention, not quotable maxims. In Iraq, the duty was not to maintain strict adherence to a principle of nonviolence, it was to love and care for the victims of war — in this case, to love and care for a person named Jaleesa. In her entire life, before or after her injury, she was never in need of a principle, or a “supreme intelligence.” Once she entered the room, the abstract procedure became personal. The truth was no longer that we needed to perform a surgery, but that we were compelled by love to save the life of a little girl. She did not need a principle to comfort her; she needed love and protection that was personal.

Fourteen years have passed since that speech, and I can still feel the chills as I think about my commander’s words, but now I feel them for very different reasons. Many questions remain, but at this stage in life, I think I can say that I long for an end to violence as much as anyone. For Christians, especially in a world that seems to take the “authority” and “power” of violence as a given, Gandhi’s method of satyagraha still stands as a modern reminder that Christ showed a better way. Contrary to King, however, I think for anyone who wishes to understand that way, Gandhi’s satyagraha will be a disappointment. And I think it should be disappointing, for the immense personality of Jesus Christ cannot be summed up by the words “Thou shalt not kill.” And everything that it means to walk among the world as a peacemaking child of God cannot possibly be simplified to a principle or an impersonal force.