A few unorthodox observations

I am no longer interested in establishing who is right and who is wrong, or guidelines for determining who is in and who is out. I am only interested in getting closer to the Truth—or, if you prefer, learning how to live in increasing harmony with God’s economy.

This is one reason why I am fascinated by the growing controversy over Pope Francis. I have no dog in the fight over the future of the Roman Catholic church, and probably couldn’t be much further away from Francis on matters of ecclesiology. But I think he is correct in pointing out Christendom’s ongoing failures, and the resulting discomfort in some circles indicates that he is turning over some interesting rocks.

No one has changed my mind more about how to read the Bible than Andrew Perriman. Somehow he has persuaded me that the Bible speaks much less comprehensively to Christians than the conventional wisdom tells us—and at the same time it has deeper, richer meaning to offer, if only we will read its stories on their own terms instead of universalizing its lessons. How he managed to take away all the old comforts, and yet leave me more comfortable, is a magic act I haven’t yet figured out.

A bit more than a year after rediscovering Dallas Willard, I’m more convinced than ever that our proper goal is not behaving properly, but becoming the kind of person to whom proper behavior is second nature. I’ve known this for much more than a year, but Willard is the one who crystallized the thought and filled in the details for me.

I suppose this idea is neglected or outright rejected because the alternative—telling people how to behave, and expecting/exhorting them to such behavior by dint of sheer willpower—is what keeps teachers and pastors employed. If instead we only allowed/expected them to help us become fully Christian, and judged their performance by the results—well, who could pass such a test? I’m willing to raise my children this way, and to be judged based on the results in their lives. No one I know seems to be interested in putting me (or anyone else) to such a test in their own lives—and perhaps that’s just as well.

One nice thing about being an anarchist, Christian or otherwise, is that consistency allows you to believe that others are wrong but insists that you not impose your own correctness on them. Limiting, but also liberating.

An ongoing “war on the weak”

Way too much axe-grinding for my taste in this short essay, but I think the concept of a “war on the weak” is good, one that can help a lot in sorting through some superficially mysterious trends of the past couple of hundred years. (emphasis added)

For the last few decades, cultural leaders have been waging a war on the weak. Their goal is to dismantle traditional norms and rules for family life. Well-educated adepts know how to use today’s multicultural patois to navigate in our brave new world of officially mandated gender blindness. […] To a great extent, our progressive culture strips ordinary people of almost all settled roles, other than economic ones. This heightens the existential pain of the already harsh economic realities of our globalized economy, which can be very punitive to the poorly educated. Two generations ago, a working class man was often poor or nearly poor, but he could be respected in his neighborhood as a provider for his family, father to his children, law-abiding citizen, coach of a Little League team, and usher in church. The culture that made such a life possible has disintegrated, partly due to large-scale trends in our post-industrial society, but also because of a sustained and ongoing ideological assault on the basic norms for family and community.

I know some of these men from two generations ago, courtesy of my father and his church. Although they are frustrated with what they see going on around them (particularly among their kids and grandkids), they are mostly free from angst regarding their own lives. I’m less acquainted with working class men of my own generation or the one following—personally, at least. But I can buy the idea that educated adepts have steadily dismantled the social mechanisms which used to both bind and support us, for the sake of pursuing their own ideas of freedom. As Aldous Huxley wrote in Ends and Means,

I had motive for not wanting the world to have a meaning; consequently assumed that it had none, and was able without any difficulty to find satisfying reasons for this assumption. The philosopher who finds no meaning in the world is not concerned exclusively with a problem in pure metaphysics, he is also concerned to prove that there is no valid reason why he personally should not do as he wants to do, or why his friends should not seize political power and govern in the way that they find most advantageous to themselves. … For myself, the philosophy of meaninglessness was essentially an instrument of liberation, sexual and political.

Well, convenient for some, not so much for others.

Learning (and internalizing) a new approach

I complained in my previous post that discussions of character building often err in one of two ways, either focusing on manageable changes whose benefits are vague and indirect, or on changes whose benefits would be specific and direct if they could be achieved, but failing to explain exactly how they can be achieved. Right after I finished the post I started in on a project which has been on my stack for awhile, and as I proceeded it slowly dawned on me that it was an excellent illustration of how to walk a third path. Unfortunately, the project is a programming project, and to get to my point I will need to lay some groundwork in an area my readers may not be familiar with. I’ll try to be gentle and do my best to not burden you with unneeded details. I think the effort it takes to understand the example will yield rewards, but I also completely sympathize with any reader who chooses to wait until I have a more relatable example with which to make the point.

When writing a program of any scale, the traditional approach has been to decide what tasks the program needs to perform, then start building pieces that accomplish the various tasks, and once the pieces seem to be capable of performing their tasks to assemble them into a whole which a user can use to get their work done. This means, for one thing, that a lot of writing must happen before one has even the smallest bit of practical capability, i.e. a program that does anything at all.

Thirty years ago, though, I saw someone approach the problem in a totally different way. I was working in the research labs at Texas Instruments, and as part of a joint project with MIT I was “donated” to Prof. Steve Ward, then the head of their Laboratory for Computer Science. Debbie and I moved to the Boston area, and I spent the next two years being directed by Steve on various programming projects.

Though he didn’t program much anymore, Steve had a reputation around the lab for being an excellent intuitive programmer. One lab member told me (approvingly) that he was known for being able to “debug a blank screen.” I had no idea what that meant until one day when we were both in his office discussing a project, deciding that we needed a small utility program that shouldn’t take more than a day or two to write. He suddenly turned to his computer, opened a text editor window, wrote a program that contained exactly one function with no content, i.e. it did exactly nothing, ran the program, saw that it did nothing, and said “Well, that didn’t work right!” He then proceeded to “debug” the program into existence, making small changes that inched the behavior of the program toward the behavior he wanted, checking each step of the way to see that his incremental change did exactly what he intended, no more and no less. After an hour or so he had a program which not only did what was needed, and only what was needed, but “did it right”. I was amazed, even more so over the coming days when I built on his initial work and came to appreciate how robust and elegant his code was. And I absorbed part of his ethic in my own work. But only informally, and whenever the chips were down, i.e. deadlines loomed, I always reverted to my old ways. I knew I would be much better off in the long run if I could fully embrace Steve’s approach, and was pained each time I suffered consequences I knew his approach would have spared me. But none of that was enough to convert me.
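
For readers who don’t program, here is a toy sketch in Python of what that rhythm looks like at the very start; the details are my own reconstruction, not Steve’s actual code.

```python
# Step 0: the entire program -- one function with no content at all.
# Run it, watch it do nothing, and declare "Well, that didn't work right!"
def main():
    pass

# Step 1 (a small, verified increment): echo standard input unchanged.
# Step 2: filter the echoed lines down to the ones the utility cares about.
# ...and so on, one tiny change at a time, confirming after each change
# that the program does exactly what was intended, no more and no less.

if __name__ == "__main__":
    main()
```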

Meanwhile, Steve’s approach (not original with him) has become more widely known, more understood, and more formalized, even earning a name: Test-Driven Development (TDD). Programmers have long known that they benefit greatly by subjecting their code to automated testing. A body of tests is developed which specifies, in its own way, how the program is supposed to behave, and as the program grows and evolves the tests can be run again and again to ensure that no matter what changes the programmers have made, the program still behaves the way the tests say it should. The major weakness in automated testing, as you can imagine, is that only the running of the tests is automatic—the tests themselves still need to be written, by programmers who find writing tests boring and tedious and of little immediate benefit. So the comprehensiveness of test suites, when they exist at all, tends to be scattershot.
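
For readers who have never seen one, an automated test is just code that checks other code. A minimal example using Python’s standard unittest module (the slugify function is made up purely for illustration) might look like this:

```python
import unittest

def slugify(title):
    """The code under test: turn a page title into a URL slug."""
    return "-".join(title.lower().split())

class SlugifyTests(unittest.TestCase):
    # Each test pins down one small piece of how the program should behave.
    def test_spaces_become_hyphens(self):
        self.assertEqual(slugify("Hello World"), "hello-world")

    def test_uppercase_is_lowered(self):
        self.assertEqual(slugify("ToDo List"), "todo-list")

if __name__ == "__main__":
    unittest.main()
```

Re-run the file after every change to the program, and anything the tests cover that has broken announces itself immediately.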

Those who preach the gospel of Test-Driven Development, though, are promoting a very different approach to creating a program, more or less turning the old model on its head—write the tests first, then create a program which can pass them. You can see right away that, if nothing else, this approach will not only get the tests written but will produce a battery of them that comprehensively tests the program in question. And if you squint a bit, you can see how practicing such a discipline requires a very different rhythm and outlook from the traditional approach.
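
A sketch of that inverted rhythm, again using a made-up example rather than anything from a real project: the test is written first and fails, and only then is the code written to satisfy it.

```python
import unittest
from datetime import date, timedelta

class AddDaysTests(unittest.TestCase):
    # Written first.  With no add_days() defined yet, running this fails
    # ("red"), and the failure message says exactly what to build next.
    def test_adds_whole_days(self):
        self.assertEqual(add_days(date(2015, 1, 1), 30), date(2015, 1, 31))

# Written second: the smallest implementation that turns the test "green".
def add_days(start, days):
    return start + timedelta(days=days)

if __name__ == "__main__":
    unittest.main()
```

The program only ever grows in response to a test that demands the growth, which is where the comprehensive battery of tests comes from.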

Being quite aware of the benefits, I still never wrote tests in tandem with writing code, only long after the fact, and only when the organization required it. If the project was solely my own, I might resolve to give it a try, but any effort I made quickly petered out in the face of just wanting to get the work done. And for the years 2001-2012 the question was more or less moot, since what little computer work I did didn’t involve coding. But then I saw that my new job, managing a worldwide network of bluegrass music teachers, would be much easier if only I had a certain sort of software program, one which—as far as I could discover—no one had yet taken the trouble to write. I had idly kept up with developments in programming, but now I took the opportunity to re-immerse myself in that world. I dusted off my second-favorite programming language, learned my way around a website-building framework which used it, and started sketching out the program I needed.

I also stumbled across an amazingly good book, Harry Percival’s Test-Driven Development with Python. I can’t imagine a better introduction to the discipline, warm and engaging, clearly explaining the how and why, containing an extended tutorial centered around the creation of a useful and realistic website for creating and managing to-do lists. I read the book, caught TDD fever, and knew that I would use its techniques to create my new program. And then time pressures built up, and my good intentions were jettisoned in order to get the job done, using the old approach which for better or worse I had thoroughly internalized.

Normally that would be the end of the story, a tale of one more Good Idea abandoned in the breach. I managed to get the first version of my program sufficiently functional to be put into everyday use—and it worked pretty well, and indeed made our back office operation way more efficient and error-free. In fact it was so useful that I wanted to add some additional capabilities I hadn’t initially thought of, and so I did that. And as I did, the program often broke in unexpected (sometimes embarrassing) ways, because I had forgotten details of how the internals actually worked. And of course nearly every one of those flaws would have been detected right away if only I had written tests for my program—and not necessarily the comprehensive testing that TDD requires, but any tests at all, feeble as they might have felt at the time. By the way, feeble is an important part of the puzzle, at least as it exists in the mind of the practitioner—I was sold on testing, had a clear understanding of what a comprehensive test suite would do and the blessings it would bestow—but I also knew how much work it would be to create a comprehensive test suite, and how feeble my initial efforts would be when measured against that monumental standard. In other words, my understanding of the ultimate goal, and even my zeal to reach it, were working against me—anything I would actually be able to do as I set out on this path would be so small and ineffective in comparison, that I was more comfortable daydreaming about the destination than setting out on the journey.

So when a week-long lull came along, I sat down and dutifully worked through Percival’s tutorial, getting hands-on experience with TDD (even if it was only second-hand), chasing down his references, immersing myself as best I could in the project. The idea was that I would get a better feel for the kind of tests that TDD leads one to create, then go back to my program and do my best to create such tests after the fact. Imperfect, of course, since the tests should really come first, with the program evolved to pass them. But since any tests would be better than none, it would be a way to get at least a limited and imperfect test suite in place.

And still I wasn’t able to make progress on the real problem. Even though I had absorbed the TDD principles, and even practiced them through the hands-on tutorial, it was all of very little help when I thought about how to go about creating even a single test for my already written program. Since Percival’s tutorial was aimed at the initial creation of a program, the “But … how” gap was just too wide for me to bridge in trying to apply it to an already created program.

One possible solution, which I thought about seeking (though I don’t know if it actually exists), is to find a similar treatment of the problem I actually faced, creating a test suite for an already existing program. But although there are likely such treatments, I doubt that any of them are excellent—otherwise I would have heard of them already (the software community is quick to promote an excellent canonical example). And more important, I didn’t really want to learn how to retrofit tests; I wanted to become a practicing TDD disciple. Creating the tests for my existing program was only intended to be a step on the path, and if it turned out not to be practical I didn’t want to get sidetracked.

Fortunately I have some time in my schedule to take on a discretionary project, so I decided I would repeat the process that Percival had taken me through, but this time with a program of my own choosing, similar enough that I could adapt the steps of his tutorial but different enough that I would not be able to unthinkingly fall back on the details he had worked out in advance. I searched around for a good idea, a program that wasn’t too ambitious but would still be useful to me in day-to-day work. Not an easy task! But fortunately as I was exploring some of the capabilities my to-do list manager provides (Todoist, highly recommended), I ran across a program some other fellow had written to make Todoist do certain very useful tricks, and as I looked at his source code I found myself thinking, “I would do all this differently … and better, at least for my purposes”. So my personalized tutorial had now become the task of building such a program.

And still I found myself blocked. But this time in a good way. It wasn’t immediately obvious to me how to take the first couple of baby steps Percival had provided and adapt them—my program was just different enough to make adapting them less than trivial. But why should they be trivial? I resigned myself to the fact that there was going to be some work involved here. But at least the work was clearly defined and not open-ended. It took me a day or so of fairly hard thinking to lay the groundwork and take that first step (which is so trivial in its scope that a programmer might laugh at spending any time at all worrying over it). And it finally became clear how to take the step as TDD required. And once I had actually done it—installed the needed software, created the file folders, edited the first file, and so on—I had a feeling that I was actually on my way, plus a clearer notion of how to take the second step.
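
For the curious, that first step amounted to something with roughly the shape below. The file and script names here are placeholders of my own, not the real project’s, but the scope is accurate: a test so small that a seasoned programmer might laugh at it.

```python
# functional_tests.py -- the laughably small first step, more or less.
import subprocess
import sys
import unittest

class SmokeTest(unittest.TestCase):
    def test_the_program_runs_at_all(self):
        # Run before todo_tricks.py exists, this fails -- and that failure
        # is the first thing the fledgling program is "debugged" into fixing.
        result = subprocess.run(
            [sys.executable, "todo_tricks.py", "--help"],
            capture_output=True, text=True)
        self.assertEqual(result.returncode, 0)

if __name__ == "__main__":
    unittest.main()
```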

To invoke an idea my boss (the music teacher) is fond of quoting, I am currently traveling the “ugly part of the learning curve”. There are many ways of viewing the learning curve (and you’ll see a lot of bogus examples out there), but this is the one I’m referring to.

(Image: a learning curve—time invested plotted against proficiency gained.)

The X-axis represents the amount of time invested in learning the task, and the Y-axis one’s proficiency at the task. The earliest portion, say between 0.0 and 2.0 in the diagram, is the ugliest part—lots of effort expended, very little payback in terms of proficiency gained. But there’s no other path to the latter part of the curve. And, good news—between 2.0 and 4.0 the increase in proficiency is finally detectable, and between 4.0 and 8.0 it actually becomes dramatic, major leaps in proficiency costing relatively small amounts of effort. There are good solid reasons behind this, some of them intuitive (e.g. we build on what we’ve done, so the more we’ve done the easier the subsequent building).
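
To put rough numbers on that shape, here is a toy curve in which proficiency doubles with every unit of time invested, scaled so that 8.0 represents full proficiency; the formula is purely illustrative, not a model of anything.

```python
# Purely illustrative: proficiency doubles with each unit of invested time,
# scaled so that t = 8.0 corresponds to 100% of eventual proficiency.
def proficiency(t):
    return 100 * 2 ** t / 2 ** 8

for t in [0.0, 2.0, 4.0, 6.0, 8.0]:
    print(f"time invested {t:3.1f}: {proficiency(t):5.1f}% of eventual proficiency")
```

The jump from 0.0 to 2.0 is barely visible; the jump from 4.0 to 8.0 is nearly the whole curve.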

So the initial lesson is: if you can find a way to get started on the learning curve and stick with your efforts during the early ugly section, you will be amply rewarded as time goes on. And you have probably heard this before, in some form or other—just stick with it, it gets easier, the rewards will be worth it. This often comes off as fairly trite cheerleading, for good reason. When someone is stuck in a practice and not making obvious tangible progress, it is possible that they are still in the early ugly section of the curve. But it ain’t necessarily so—they could also be headed down the wrong path, one which will never repay effort. Or they could be engaged in something where proficiency is not involved (I suspect prayer is one of these things).

Given these uncertainties, I wouldn’t conclude that the idea of the learning curve is too vague or trite to be of use in pursuing a practice. What I would suggest is that there is a certain subset of practices which for certain people will exhibit a learning curve, and for those we should be willing to put up with initial difficulties for the sake of future payback. How to know whether a certain practice will exhibit a learning curve for oneself? Advice from an expert coach or mentor can be very helpful, but in the absence of that I’d recommend low-risk experimentation. Identify some small, easy steps that seem to head in the right direction, take them, and (keeping expectations low) see what happens. The results might not be dramatic—but is there any positive result at all? If so, this should warrant a bit more experimentation, a few more similar steps to build reassurance, or one step that is slightly more ambitious. After that, do you find yourself somewhat more comfortable with the practice, or more adept at detecting the effects, or having a clearer understanding of how you could have been slightly more effective in your efforts, or with a clearer view of how to proceed into new areas? These are all hints that a learning curve exists and is available to a person with your makeup.

Applying this to my tale of software testing: all my understanding and zeal for the Test-Driven Development approach were not enough to make me a TDD practitioner, or set me on a path to becoming one. I had to find a path which was viable for me, given my background and abilities, one where each step was small enough to be manageable yet in-the-right-direction enough to produce a tangible result, one that would keep me at the work of becoming a practitioner—in fact, allowing me to understand that I actually was a practitioner at that point, though a very inexperienced and unskilled one. But experience and skills are things that can be gained over time, through effort.

And now (finally!) applying this to my complaint about teachings on the Christian disciplines falling short in the “But … how?” department. “Read your Bible, pray every day” are achievable goals, but practiced for their own sake I don’t know that they will result in much more than a greater knowledge of the Bible and … well, I’m not sure that excellence in prayer is actually a suitable goal, any more than achieving excellence in driving or breathing. Similarly with faithful church attendance, or giving of our time/talents/resources, or even obeying God’s commandments—these things can have beneficial effects as part of greater efforts in certain directions, but on their own they can be performed with little or no effect on the practitioner. They can increase one’s understanding and stoke one’s zeal, but—as with most of my encounters with TDD philosophizing—they will not by themselves make you a practitioner, and are not even likely to set you on the path to becoming one.

Similarly with lofty goals that are effectively destinations easy to envision, admire, and yearn for but impossible to chart a path towards. We all agree that we should love God and others selflessly, live godly lives, esteem others more highly than ourselves, love our enemies, bless those who persecute us, and on and on. But if we select one of those as a goal—say, selflessness, or godliness—our imaginations tend to fail us when we try to envision any practice beyond “be less selfish” or “think about God more”—we are unable to map out a practice that will slowly, steadily, and inevitably make us into people who are truly selfless, or godly. And so we content ourselves with establishing the standard, and discussing it endlessly among ourselves, and exhorting one another towards it (without giving explicit directions), and feeling bad when we contemplate how far we fall short of it. But rarely do we seek out and set off on a path we think will lead us there—and even more rarely do we stick with it through the difficult early stages.

This is why I’m intrigued by the idea of focusing on kindness as a goal. Most everyone understands what kindness is—they can usually identify a given action as kind, unkind, or neutral, and are generally willing to do so (as opposed to right or wrong or good or sinful). Most everyone is attracted by kindness and repelled by unkindness. Most everyone sees that kindness is operative at all levels, in small actions and grand ones, easy actions and difficult ones. And most everyone has encountered kindly people, and found them attractive. Kindliness is not the whole of Christian character, but it is certainly a vital part of it and an outworking of many other Christian virtues; a mature Christian should certainly exhibit it, and a Christian who doesn’t exhibit it is flawed in some fundamental way.

One thing I think most everyone no longer believes, though, is that it is possible to become a kindly person, to grow in kindliness through deliberate practice. I disagree with that, and I think in fact that a clear path can be charted where the early steps are not only small and manageable but pay definite, tangible (though small) rewards, enough to motivate the pilgrim to follow a path which will slowly, steadily, and inevitably turn them into a kindly person. I also think that anyone who walks the path of kindliness, no matter where they begin, will slowly, steadily, and inevitably be drawn closer to God—even an unbeliever. And I think it is possible to chart such a clear and manageable path that we could honestly say to any modern: Would you like to become a kindly person? Come with me! (I think such paths were available to premoderns, but times really have changed and the path must be charted anew for these post-Christendom times.)

If you’ve stuck with me this far and find any of this intriguing, I’d like to hear your reactions. It doesn’t have to be kindness, but: do you think we can improve any specific aspect of our Christian character through deliberate long-term practice? and, if so, what keeps us from doing it (assuming that you agree with me that it hardly happens, and is certainly not part of normal church practice)?


I’ve been following online discussion of the Benedict Option, a hot topic in some circles. Briefly, it is a recognition that Christendom is over, together with a rallying cry to spend less time trying to impose a Christian vision on society by fiat and more on living out the Christian vision in our own lives. There’s much more to it than that, to the point of incoherence, but that’s the angle which interests me. I’m intrigued and occasionally heartened to see such an idea catch the attention of highly distracted people—there must be some power lurking within it—but I’m equally discouraged to watch it devolve into a common type of self-improvement thinking, along the lines of “If only the world could be structured in such-and-such a way … why, it would carry all of us along into becoming the people we ought to be, and with hardly any effort on our part!”

My reaction is usually a frustrated “Yes, those are all great things—so why not just incorporate some of them into your ongoing lives?” And in days gone by I would have just chalked it up to a general inclination to indulge oneself in comforting if-only delusions: “if only the world were different life would be better, and I’d be a better person … but ….” I’ve spent a lot of time fighting that inclination in myself, and especially in the beginning it was an important part of setting out on and sticking to a different path. But as I’ve walked the path, I’ve discovered that it isn’t a particularly heroic thing to do. The steps are fairly clear, fairly easy, possible to take in small and steady increments, and provide immediate, tangible rewards (although sometimes you have to learn to recognize them).

Over forty years as an adult I’ve tested, refined, and internalized a certain outlook, and the rewards are now abundant and lasting. I have no interest at all in systematizing what I know, especially since one big thing I know is that system has the power to turn anything good into something pernicious. And I’m happy to continue exploring my outlook in the company of family and close friends, blessing them as well as myself with its rewards, and hopefully imparting some of my thinking to them along the way, for them to consider. I’ve given up the idea that there might be One Way good for all, at least one I’m capable of conceiving and (even harder) articulating clearly. But I do think I’ve learned things that might be helpful to some people in some circumstances, and I’m always looking for ways to make them available to those who might benefit.

Every time I write about this I take pains with how I express it, to be as clear as possible that I am not talking about obligations but only opportunities. If I go on about how to craft a character that is more humble or patient or considerate or other-focused, it is not because I think it one’s duty or that it will somehow disappoint God if one doesn’t. I’m done with the weirdly comforting call-and-response pattern that Christians and their teachers have fallen into—“You are bad, bad people to have let God down again/Yes, we are bad, bad people to have let God down again.” Anymore I only see these things as available paths to a better, fuller life, and all I care to do is point out that not only can anyone avail themselves, but any level of effort will be rewarded and serve as a spur to further effort.

It’s not that these things aren’t recognized and discussed—quite the opposite—but for me at least the discussion almost always fails a critical test: “But … how?” It’s not hard to find teachers who correctly identify the need of the hour, shortcomings which are especially pertinent in our modern setting. And though they may be vaguer about it, they can usually cast a reasonably accurate vision for how things would be better if we addressed and corrected those shortcomings. But … how? This is where I find almost nothing but platitudes which can’t be acted upon. It is all well and good to tell me about patience, and my lack of it, and even how much better life would be for me and my acquaintances if I developed more of it. But … how?

I want to focus on the “how”—not telling you or anyone else how to proceed, but simply chronicling how I have proceeded in certain important areas, presenting myself not as a model but only an example, a concrete instance of specific efforts made by one person and their results for him. The point being only that it is possible to proceed—at the least, someone once gave it a try, and had some success. The specific efforts which worked for me may not work for you. But figuring out what to do—and then doing it—is an approach that I think is almost universally applicable, and I’m looking for ways to better encourage others to pursue it.

So, why do current treatments of this problem fall so short when it comes to the “But … how?” test? For me they tend to fall into one of two categories: either they focus on something which is achievable by normal people but will not lead to direct tangible results (e.g. read your Bible, pray every day), or on something which would provide direct tangible results if it were achievable, but achieving the goal is emphasized over taking steps toward the goal (be patient, be humble, be considerate, prefer others to yourself). Even the writer I’ve found most helpful on this score, Dallas Willard, falls short in this second way—although I don’t blame him, because he was only one man and the work he was able to accomplish in this area far outweighs this shortcoming. And I’ve managed to close the gap Willard leaves under my own power, working with stray bits and pieces he left in his wake and using his outlook to evaluate and fine-tune ideas I’ve found elsewhere.

Believe me, I understand how difficult it is to select an approach which hits the sweet spot, being both within the reach of folks with average skills and drive, and capable of producing the immediate tangible results that will reassure the practitioners that the approach is correct and motivate them to continue on. A few have worked for me over the years. Twenty years ago I was inspired to pray for humility, and was thereafter blessed (?) with ample opportunity to practice humbling myself, as well as to gain a deeper understanding of the role humility plays in a well-functioning community. Unfortunately, the rewards tend to be long in coming, and so exercising humility can feel more like being dutiful towards some abstract conception of Good Personhood than a practice that yields blessings in proportion to invested effort. Exercising patience and compassion, and curbing my tongue, have also been good and have yielded more immediate benefits, but the scope is narrow and the benefits tend to be self-centered.

I’m wondering now if kindness might not be an effective concept for focusing character development. This occurred to me after watching the four-hour HBO miniseries Olive Kitteridge this week. I wanted to watch it because of good reviews and Frances McDormand, who I like a lot as an actress. It was harrowing to watch, mostly because McDormand is uncompromising in her portrayal of … uh … a difficult person, let’s say. And I’m sure there are greater depths than I am capable of plumbing—I’m not very good with fiction. But what struck me about the character Olive was that she was completely lacking in kindness, a flaw that ruined her life but also one she seemed barely aware of. Worse, she was married to Henry, an extremely kind man—who often suffered her contempt and anger because of his kindness—although she didn’t understand (until far too late) that it was his kindness that infuriated her.

To me the story was an excellent illustration (through Olive) of the damage that unkindness can inflict, both short- and long-term, on acquaintances and community and family and one’s own self. And (through Henry) of the ability of even small kindnesses well placed to redeem ugly and difficult situations. Kindness can be practiced at many scales, and paying a kindness will quite often yield a reward right away, in personal satisfaction and in the joy of others and in improved circumstances. It’s the sort of practice that can be tested easily, that one can gradually immerse oneself in as trust and understanding grows.

Best, it’s one of the few virtues remaining that is viewed positively by the world at large. Random acts of kindness, and all. While poking around I came across this commencement speech by the writer George Saunders, which hints at how deeply we respond to the idea as humans (emphasis in original):

So here’s something I know to be true, although it’s a little corny, and I don’t quite know what to do with it:

What I regret most in my life are failures of kindness.

Those moments when another human being was there, in front of me, suffering, and I responded . . . sensibly. Reservedly. Mildly.

Or, to look at it from the other end of the telescope: Who, in your life, do you remember most fondly, with the most undeniable feelings of warmth?

Those who were kindest to you, I bet.

It’s a little facile, maybe, and certainly hard to implement, but I’d say, as a goal in life, you could do worse than: Try to be kinder.

Now, it would be possible to object here: What an inadequate goal! Kindness on its own will hardly get you into heaven. And that’s exactly the objection raised in this response to Saunders’s speech from Jen Pollock Michel in Christianity Today, under the title “The Misguided Theology of Kindness”:

A thousand times and more I have hung myself on the accusation of selfishness, living with the burden of be kind, advice that would subtly seek to obligate me to the whole of humanity and will to find me guilty whenever I cannot appease their demands.

How we progress from “try to be kinder” to “obligate me to the whole of humanity” isn’t clear—some sort of slippery slope, no doubt. But fortunately there was One who resisted this sort of pernicious guilt-tripping:

But never in the New Testament is Jesus hailed as the paragon of unselfishness. As we see throughout the gospels, Jesus did not heal every person. Nor did he grant every request. In fact, our Lord routinely escaped the clamoring crowds to pray, to sleep, and to spend intimate time with his disciples. When an oppressed people cried out for him to become their political deliverer, he resisted their pleas.

Well, this is certainly arguable, but the more important point here is where Michel takes this line of thought as she ends her essay:

This could have been perceived as selfish. Some may have even considered it cruel. But Jesus remained fixed on pleasing his Father. “I have come to do your will, O God,” (Heb. 10:7).

We are better off, not with George Saunders’s advice, but with the wisdom of King Solomon, who, at the end of his life of study, concluded this about living life well: “Fear God and keep his commandments.” Honor your Creator first—and kindness to his creatures will follow.

Michel’s response is an excellent example of a tendency that dismays me, disdaining a modest but perfectly workable practice—try to be kinder—because it is so far inferior to a higher goal—fear God and keep his commandments—even though the higher goal fails the “But … how?” test and turns quickly into pious wishful thinking.

(As for “Honor your Creator first—and kindness to his creatures will follow”, I’d say the most famous counterexample to this was the rich young ruler, who Jesus did not contradict when he claimed to have always feared God and kept His commandments, but instead called him out in a way that indicates he fell a bit short in the kindness department. Meanwhile, I know plenty of people, myself included, whose modest, imperfect, but steadfast practice of kindness has glorified God and led to greater intimacy with Him.)

So I’m looking at kindness as a possible rallying concept, a practice that requires minimal initial commitment but has the power to draw the practitioner deeper, one that might even serve as a gateway drug for those whose faith is minimal or even nonexistent. More as it develops.

Options, options everywhere

Occasionally I’ve run across an item that seems to express an idea perfectly—yet the idea is buried so deep that I can’t quite pin it down. More than thirty years ago, in the back pages of Texas Monthly magazine, I saw a small ad (for an outlet mall, of all places) which pictured a fashion model striding down a runway, with the caption “Be the Star of Your Own Movie”.

I hadn’t thought much at that point about culture or history or modernity, but even then I knew that it resonated because it embodied a sort of narcissism that was not only in full flower at that point (this being a few years past Tom Wolfe’s Me Decade), but was also something new on the scene. As prideful as I could be at times, I had never really envisioned my unfolding life as a movie about me. And I was certain that my folks would be baffled by such thinking.

Although I had a label for it, I still wondered: Why now? Why everywhere, all at once? Was it really a cultural shift, and if so what caused/enabled it? During the time since then the trend seems only to have broadened and deepened, manifesting itself everywhere, even having its way with folks my parents’ age. And although those questions haven’t exactly driven my own studies, I’ve always had them in the back of my mind as I’ve learned more about where we are and how we got here. At this point I have a much clearer understanding of the sources and trajectories involved, but have yet to put the puzzle together.

This week I ran across some key pieces. I’m re-reading a book by Thomas de Zengotita, called Mediated: How the Media Shapes Your World and the Way You Live in It. It’s pretty good—I first picked it up on Thursday afternoon and am already halfway through my second reading. Here is the very first paragraph.

Ask yourself this: did members of the Greatest Generation spend a lot of time talking about where they were and what they did and how they felt when they first heard the news from Pearl Harbor? People certainly remembered the moment, and a few anecdotes got passed around—but did a whole folk genre spontaneously emerge? Did everyone feel compelled to craft a little narrative, starring me, an oft-repeated and inevitably embellished story-for-the-ages reporting on my personal experience of the Event? Or did they just assume that Pearl Harbor and its consequences were what mattered, and talk about that.

This captures something I’ve seen over and over, and wondered at. Increasingly over the years I’ve been watching people, or listening to them talk, or reading something they wrote, and a frustrated thought forms and builds in me: why should anyone care what you think about that? Note that I only react this way when there is no obvious reason to value the other person’s thoughts on the topic, due to having studied it or having been directly involved with the event or even just because they are generally wise or experienced. When I found myself reacting like this sporadically, I chalked it up to having run across someone who was unusually lacking in self-awareness. Then it began happening more often, and I suspected it was a generational thing. Now it happens to me all the time, and I have to acknowledge that I’m the one who was left behind when the zeitgeist shifted, and whether on balance it’s good or bad I need to avoid the trap of “Why, back in my day …” and instead figure out how to respond to this new thinking with integrity and grace.

Looking back through my old blog posts, I see I’ve addressed one aspect of this, the idea of newsjacking, where one injects oneself into a trending story, in hopes of riding its coattails. Quoting myself here:

I think it points to a basic but unnamed impulse that has flourished along with the internet. For a long time I’ve noticed it primarily in comment threads on blogs, where people often don’t interact with the content of a post but simply use it as a springboard to talk about themselves. When I called the technique anything at all, I would call it (somewhat meanly) “That reminds me of … ME!” It is not the same as hijacking a thread, which involves redirecting the entire discussion somewhere the original post didn’t go. It is smaller and self-contained, a way of injecting oneself into a discussion without actually needing to address the topic at hand. Or, to be mean again, a natural result when the commenter finds their own experiences and opinions more worthy of comment than those of the blogger.

I go on to cite Paul Ford, who claims that this impulse, properly recognized, is key to understanding how people behave on the internet. Ford says:

“Why wasn’t I consulted,” which I abbreviate as WWIC, is the fundamental question of the web. It is the rule from which other rules are derived. Humans have a fundamental need to be consulted, engaged, to exercise their knowledge (and thus power), and no other medium that came before has been able to tap into that as effectively.

My only quibble with this statement is the “humans have a fundamental need” part. Have folks always felt that their opinion counted, no matter the topic, simply because it was their opinion? The fact that my frustration has increased, eventually to peg the needle, suggests otherwise. The Pearl Harbor anecdote above suggests otherwise. And lots of my reading over the years suggests otherwise, but I won’t try to make the case here, only ask that you entertain the possibility.

So, what changed? de Zengotita says it is the triumph of media, or more specifically, representational flattery—we no longer experience the world directly, but instead as representations served up for our delectation, as if we were of central importance. To give his reader a feel for the difference, he offers this thought experiment:

Say your car breaks down in the middle of nowhere—the middle of Saskatchewan, say. You have no radio, no cell phone, nothing to read, no gear to fiddle with. You just have to wait. Pretty soon you notice how everything around you just happens to be there. And it just happens to be there in this very precise but unfamiliar way. You are so not used to this. Every tuft of weed, the scattered pebbles, the lapsing fence, the cracks in the asphalt, the buzz of insects in the field, the flow of cloud against the sky, everything is very specifically exactly the way it is—and none of it is for you. Nothing here was designed to affect you. It isn’t arranged so you can experience it, you didn’t plan to experience it, there isn’t any screen, there isn’t any display, there isn’t any entrance, no brochure, nothing special to look at, no dramatic scenery or wildlife, no tour guide, no campsites, no benches, no paths, no viewing platforms with natural-historical information posted under slanted Plexiglas lectern things—whatever is there is just there, and so are you. And your options are limited. You begin to get a sense of your real place in the great scheme of things.

Very small.

Some people find this profoundly comforting. Wittgenstein, for example.

So that’s a baseline for comparison. What it teaches us is this: in a mediated world, the opposite of real isn’t phony or illusional or fictional—it’s optional. Idiomatically, we recognize this when we say “The reality is …,” meaning something that has to be dealt with, something that isn’t an option. We are most free of mediation, we are most real, when we are at the disposal of accident and necessity. That’s when we are not being addressed. That’s when we go without the flattery intrinsic to representation.

The key new notion in this meditation is options. In fact, a few pages later de Zengotita states that he is bringing just two new notions to the table: representational flattery and the role of proliferating options in buffering us from reality. The rest of the book devotes itself to illustrating how those two notions manifest their effects in different areas of life—helicopter parenting, prolonged adolescence, celebrity, politics, the busyness of modern life, our uneasy relationship with nature, and so on.

A third notion—which isn’t presented as succinctly as the first two, maybe because he doesn’t see it as original—is that we are all performers now. Just as the world is represented to us via the media, we must represent ourselves to the world in a fashion unknown to our grandparents. Rather than simply growing into the role a community provides for us/imposes on us, we now need to sort through the myriad options available in order to construct a self, a process that is often agonizing, not just due to the pain of learning but also the constant risk of “buyer’s remorse”—what if I choose badly?

It’s this third notion that resonates most strongly with me, and I wish vaguely that de Zengotita had a clearer statement to make about it—but then again I admire his determination to simply explore the landscape without offering prescriptions, to do his best to make the reader feel aspects of the water we all swim in and leave it at that. So I’ll do my best to absorb his thinking, then make my own applications. (You can get a taste of his thinking in this four-minute video clip, where he claims that the “ability” of his grandfather to be naturally authentic is now lost to us, lost in a way that can’t be recovered, and we need to find a different path to authenticity.)

This part interests me because of my recent focus on character formation. Regardless of its historical sources, de Zengotita claims that it qualifies as a recent shift because the breezes of the past have lately become a hurricane we must now deal with. It is no good to yearn for the earlier times when character was a gift the community bestowed on us, and even more foolish to try to restore such an environment. And as he points out, many of the options have their good points, at least individually considered. So (this is my prescription, not his) perhaps we would be better off learning how to choose wisely among the options.

In becoming a musical performer I learned that technique is often confused with phoniness—people think that “true” musicians somehow just let the music come out. But, as becomes obvious after just a little thought, a lot of mechanical groundwork needs to be laid before a musician can reliably, effectively produce the notes which express the music behind them.

And even if you get that, it’s easy to confuse the groundwork with the music. Vladimir Horowitz once said that he never worried about being challenged by the endless stream of technically more proficient whiz kids who came after him, because they would practice, practice, practice—and when they got on stage they would practice some more. Similarly, when I was learning to sing, it took much time and effort for me to hear what I was doing well enough to develop the technical skills I needed to produce the sound I wanted—and then more time and effort to get my mind off the technical production of the sound, to trust in my skills enough to use them for some other purpose than simply using them. Was I somehow a phony singer for not having been born so able, but instead setting out at age 50 to learn how to be one?

And while we’re discussing phoniness and performance, allow me to hearken back to an old post on the subject, when Chris and I were exploring the concept of stage presence:

At one of our earliest performances, an open mic program on stage at Natural Tunnel State Park, a fellow stopped by who had some experience performing, and asked to do a few songs. He also asked me and another fellow, a guitar player, to accompany him. When the guitar player began playing a break, the new guy watched him with a big smile, nodded his head in time, then turned to me with the same big smile as if to say, “Isn’t that great?” To me, sitting next to him, it felt weird and artificial, easy to mistake for insincerity. But it wasn’t insincere at all; the guy was in fact enjoying the break, and saying to me “Isn’t that great?” It was just that in order to communicate that from the stage to a bunch of people thirty feet away, he had to do some things I wasn’t used to doing or seeing up close.

The life lesson I learned from that exploration: it isn’t enough to, say, appreciate someone else’s effort—you must communicate your appreciation accurately. The same with admiration, or disappointment, or anger, or pleasure, or the rest. The writers I respect the least are the ones who blame their readers for misunderstanding them. Similarly with well-intended folks who fail to act on those intentions. Similarly with folks who think it is sufficient to feel a certain way towards others, leaving them the job of divining those feelings.

So, must we resign ourselves to a life of feeling phony, of deliberately performing for others in order to actually convey what we intend to convey? Well, yes and no. I think we (in the WEIRD world, anyway) no longer inhabit a context which naturally shapes us into an authentic character, and there’s no going back. Our character is something we need to assemble, and our best hope is to learn how to do that properly, to acquire the knowledge and skills and disciplines that will equip us for the task of transforming ourselves into proper human beings.

That’s the yes part. The no part is this: once we’ve thoroughly learned what we need to know, once we’ve practiced it all over and over—at that point we actually become the human we set out to construct, with characteristics which were carefully chosen and nurtured but are now second nature to us. At which point we can stop thinking about the performance, and simply perform. We can stop thinking about how to love, and simply love.

For awhile now my life verse has been 1 Thess 4:11: “Aspire to live quietly, and to mind your own affairs, and to work with your hands, as we instructed you.” But I’m toying with the idea of changing it to Matthew 5:16: “In the same way, let your light shine before others, so that they may see your good works and give glory to your Father who is in heaven.” I suspect that in these dark days the greatest gift we can give someone is to help them to understand that a godly life is a live option—not only with our lips, but in our lives. I suspect that more folks might give it a try if they see others successful in their efforts.

The insufficiency of self-awareness

This caught my eye:

But my favorite thing about “BoJack Horseman” is how badly BoJack wants to think of himself as—and even, if he’s desperate enough, wants to be—a good person. Just tell me I’m good is the constant undertow of his motivation. He doesn’t want to be cool or happy. He wants to be a good person, in spite of all the genuinely awful things he’s done. He’s ashamed of himself, sure. But he tries to disguise his failures as successes, as cocktail-party anecdotes and, if necessary, as lessons learned.

He has this exchange with Diane, which runs exactly parallel to the character vs. actions bit from “Mistress America” (BoJack knows the zeitgeist!):

BoJack: But do you think I’m a good person, deep down?

Diane: …I don’t know if I believe in ‘deep down.’

“BoJack” is a pretty scathing portrayal of the insufficiency of self-awareness. BoJack knows what his problems are and states them frequently and with often-hilarious bluntness, and it doesn’t help. As a different family entertainment once taught us, knowing is half the battle—but it turns out not to be the half where the battle is won.

I’ve watched both seasons of BoJack Horseman (and enjoyed them immensely), but you don’t need to in order to understand the point. Simply acknowledging your weaknesses doesn’t justify them, much less put you on the path to correcting them.

Has this always been a thing, or is it something recent? I’ve sat through countless sermons which fit into a standard call-and-response pattern—”You guys don’t pray/read your Bible/volunteer/give enough”, “True, true, we don’t pray/read our Bibles/volunteer/give enough”. And the transaction is then complete. Never have I heard one which starts, “So, after last week’s exhortations are you praying/reading your Bibles/volunteering/giving more?”, and I have to imagine it’s because the answer is obvious—and, in fact, to ask the question is to misunderstand the true purpose of the exchange.

As the writer says, “knowing is half the battle—but it turns out not to be the half where the battle is won.” To complete the thought: doing is the half where the battle is won (or lost). Doubling down on self-awareness is just a way of putting off engagement.

Paul Ford on losing weight … and gaining it back

I like Paul Ford’s writing because it is simple and straightforward and quite honest. I don’t follow him in my feeds but I probably should, since I always enjoy his pieces when I stumble across them. He wrote one about the fundamental question of the internet (“Why wasn’t I consulted?”) which I cite often. I haven’t read his 38,000-word opus on code because I already know what code is, but I may at some point.

Ford has just published a short piece on how he lost 100 pounds (down from 400 to 300), kept it off for awhile, then gained it all back. He was deliberate about losing it, carefully counting calories in and calories out, and says he even enjoyed the process. But somehow he lost interest, and the weight returned—and he can’t find a way back to that state of mind which enabled him to lose it.

I sympathize strongly with Ford’s predicament. I have lost large amounts of weight multiple times in my life, only to gain it back again. And each time the weight loss was in a way technically engineered, usually by a diet program (NutriSystem). It worked every time, and once I honestly decided to start I didn’t have a problem sticking with it. But the weight always came back.

Being in the final stages of one more long stretch of weight loss, I’m the last to start crowing that this time will be the charm. But I have noticed some differences this time around, differences which give me hope that I can achieve a healthy weight and maintain it indefinitely.

The most important one is that I seem to have lost interest in food as a source of intense gratification. I still enjoy the food I eat—quite a bit, in fact—but I don’t crave anything, in particular not the things I’ve excluded from my current menu. My memory isn’t the greatest, but I’m pretty sure that in the past I looked forward to being done with my dieting, so that I could go back to normal eating, or treat myself on occasion, or take a break from denying myself, and so on.

That’s not how I look at things right now. There is no particular treat I am looking forward to. I haven’t looked for excuses to vary my routine. And neither has my routine been rigid—when the summer tomatoes came in, I put bread back on the menu so that I could eat sandwiches heavily laden with them, maybe with cheese or tuna, other things I hadn’t been eating. But even then I figured out how much of that would be reasonable to eat—and that’s what I ate, with much pleasure. The tomatoes are done, so bread will probably fall further into the background, trotted out on those occasional days when I’m too indifferent to the usual fare.

When contemplating my favorite foods, my mantra has been, “It’s good—but it’s not that great.” And not as some form of hypnotic thinking. I summon up memories of the taste—for some reason my taste memory has become vivid—and realize that, as much as I would enjoy one of my Aunt’s tacos, or a steak, or a rich dessert, or a piece of fresh bread thickly spread with butter, I don’t crave it anymore—good, but not that great. Perhaps this is how normal people relate to food, I don’t know.

The other helpful difference is that I am rarely hungry, and when I am I don’t find it unpleasant. I seem to have sorted through the behaviors that I would “mistake” for hunger—a desire to be distracted, usually—and dealt with them for what they are. Which leaves actual hunger itself, and here I was helped by something Leo Babauta wrote, namely that mild hunger is not so bad and definitely won’t kill you. So when I started on my diet I decided to leave out breakfast, not just to save the calories (although that has been quite a help) but so that every day I would be mildly hungry until lunchtime. Which gave me a regular opportunity to contemplate hunger, and how I had dealt with it over the years. What I realized was that I would often eat at the slightest indication of hunger—or even sooner, in order to prevent even the mildest hunger pangs from occurring. Once I realized that, and that mild hunger was barely a distraction, it was easy to give up snacking and settle into a fairly rigid menu. Which, for the record, is almost always a large salad for lunch (lettuce, cucumber, mushrooms, tomatoes, with olive oil and vinegar), an apple and a banana in the afternoon, and something roughly on the order of a baked chicken thigh and rice for dinner, adding up roughly to 1500 calories.

And one thing I learned which was helpful in sticking to the diet was to identify the things I truly did crave and then arrange the menu to accommodate them. For example, the chicken thigh is always skin-on and bone-in. A strict calorie-counter might object to the extra calories, but to me the fat and extra flavor are satisfying in a way that far outweighs them. I seem to have a fat/umami tooth—the apple and banana are more than enough to cover my need for sweetness, but I nearly swoon over the olive oil in the salad dressing, the fat in the chicken skin, the meatiness of the mushrooms and soy sauce I use liberally. As a result I don’t feel deprived at all—those elements make my menu luxurious to me.

As I mentioned, my past experience is more than enough to keep me from proclaiming victory. But I do have new hope, because my current diet is not a temporary program but the way I eat now. As I get closer to the end the weight loss has slowed. Although I originally set a rough target in pounds, I recently switched over to thinking about getting rid of excess fat, so the number on the scale no longer quantifies my goal but just tells me I’m still headed in the right direction. So I have stopped (or tried to stop, anyway) worrying about the rate of change in pounds, instead simply exercising patience, knowing that taking in fewer calories than I burn will eventually get me to where I want to be.

And finally, though even a year on this routine is too early to tell, I think that if someone informed me that I would have to stay at my current level of intake for the rest of my life, I’d be OK with that. Whatever novelty, stimulation, entertainment, or gratification I used to get from food I seem to be getting elsewhere. The role of food in my life hasn’t been reduced to simple fuel—I really do enjoy what food I do eat, and make an effort to ensure that what is on my plate is good and wholesome and enjoyable. But it is no longer the jumble of cravings it used to be.