The Babadook

This low-budget Australian horror film has gotten excellent reviews (98% positive according to Rotten Tomatoes), and I was already looking forward to seeing it on Netflix in April. But after reading this review in The Baffler, I can’t wait.

The Babadook follows young widow Amelia and her troubled tot Sam as they discover a disturbing book during a nightly bedtime reading and unwittingly unleash its sinister central figure on their quiet lives. As it turns out, the key to the creature’s undoing is merely to recognize it and rebuke it—something that Amelia eventually discovers. But this revelation comes after a slow-burning, suspenseful battle of wills, prolonged by a centrally important fact about Amelia that critics have largely overlooked: she is working class. […]

All the dramatic action among the film’s adult humans proceeds to flow from this core disjuncture of class. The school administrators condescend to her in icy-precise professional language, telling her, in so many words, that she has failed to correctly parent her son. And in the next scene, Amelia sits next to her sister, a smooth-haired woman in black nylons and a blazer, and barely listens as her upwardly mobile sibling natters on about installation art pieces. Miffed by this indifference to her class ascent, Amelia’s sister abruptly calls off the joint birthday party they had planned for their children. This leads to a round of pointedly class-based recriminations, all upbraiding Amelia for not “properly” celebrating her son’s birthday.

I’m all for fulfilling the vision of a classless society—as long as it is entirely lower-class. Jesus promised relief to such folks, but not by exalting them to a higher rank. He also issued dire warnings to those in the higher ranks, and did not indicate they could redeem themselves by welcoming those under them into their fold.

Unfortunately, American Christianity has chosen a different path to the classless society, eliminating distinctions by ignoring and occasionally denying them, akin to the law in the Anatole France aphorism:

The law, in its majestic equality, forbids the rich as well as the poor to sleep under bridges, to beg in the streets, and to steal bread.

I don’t go along with the reviewer’s suggested remedies:

Beyond the claustral terrors of a classic horror-fantasy, The Babadook leaves us with a surprisingly far-reaching epilogue: the film leads us to imagine the kind of programs that could make life as a working-class parent more leisurely and secure, like child allowances, paid maternity leave, and all the sundry baby benefits that are commonplace in European social democracies. Ladies who lunch seem capable of providing only bitter censure, and good politics, with concrete material assistance, will have to be in place before the rest of us can gather the few pearls of wisdom their parenting fashions offer. This is the real horror story of The Babadook: our culture is at a loss to make the hopeful epilogue to Amelia’s story match up with the kind of social isolation that spurred on her brief descent into madness and terror.

But I do agree with the reviewer’s observation that “our culture is at a loss” to deal with Amelia’s difficulties. And I’d go further in saying that our culture—Christian culture in particular—is at a loss to speak of them at all. So I’m all in favor of films and film reviewers who do what they can to raise the question.

Shoot, don’t talk

I love The Incredibles. One of its admirable qualities is a self-aware treatment of storytelling conventions. I don’t know if the film introduced the term monologuing, but it is a welcome addition to my vocabulary, perfectly describing a tendency which afflicts not just superheroes but most of modern society—talking about what you’re going to do, usually to the exclusion of (or as a substitute for) actually doing it.

I also love The Good, The Bad, and The Ugly. One scene in particular provides a favorite family quote:

Tuco’s adversary catches him in the bathtub, but of course he starts monologuing. Tuco listens for a bit, then shoots him a few times. After the final shot, he tells the dead man, “When you have to shoot, shoot, don’t talk!”

Whenever I am tempted to announce my intentions—which happens less and less, but still happens—I always stop and ask myself if I’m about to start monologuing. And that is usually enough to re-strengthen my resolve to let my actions speak for themselves. But not always.

Exactly two months ago I announced that regular posting would resume, and today I announce that it will cease—not altogether, just the regular part. This doesn’t require an announcement, of course. I could have just done what I plan to do without comment, harming no one and gaining the helpfully humbling reminder that my commitment to at-least-daily posting was important only to me. But I decided that commenting on the change would give me the opportunity to say a few things I might not otherwise say once I return to a lighter posting regime.

First, I think my commitment has done its work. It pushed me to write 64 posts, some of them taking my writing into unfamiliar territory. And it showed me that I can pretty reliably write a 1500-2000 word rough draft in 1-2 hours. But it also showed me that the rate was more or less an upper limit, that more practice wasn’t going to make the words flow more freely. This is because the writing is now secondary to the thinking—I can get the words down easily enough once I know what I want to say, but assembling my thoughts into something coherent takes time. And it’s time I don’t really have available right now, so the commitment to spend 1-2 hours per day writing was becoming a burden. Time for a change.

Second, I think the next challenge should be to tackle bigger, more complex topics, ideas that will take 5-10 times as many words to cover. This is not something I can do by just allowing 5-10 times as long to do it. I don’t know how to do it, really, or if I can do it at all. My blog posts are almost always first drafts, and it will take new skills to create 7,500-20,000 word pieces that are worth reading. So I think I need to invest my deep-thinking time in developing those skills, and I don’t expect the work will be worth exposing to public view (although that may change). Although this blog won’t go intentionally silent, the postings will be less frequent and limited to things I can write quickly.

Third, this two-month experiment has been a good example of a principle I’ve embraced for a long time—I’m sure my kids are tired of hearing about it—but don’t really have a good name for. The idea: for any effort you are engaged in, strive to figure out as quickly and painlessly as possible whether you should continue. More than thirty years ago I told a co-worker whose project was dead in the water to just do something simple right now. He did, and it got his project moving again. A few weeks later, when I stopped by his cubicle, I noticed he had pinned a piece of paper to his bulletin board which said, in big letters, “SOMETHING SIMPLE RIGHT NOW”. Leo Babauta of Zen Habits wrote a post about it called Fail Faster at Habits. Two months ago I wanted to improve my writing, and so I decided I should just write something public every day until that no longer seemed helpful. It was a small enough step that I was actually able to take it, it taught me some things, and it quickly pointed me in a different direction.

Fourth, if I were to continue the recent stream of blog posts, one of the upcoming posts would have been entitled “Exemplify, don’t exhort”. Lately I’ve seen some criticisms of the aphorism commonly attributed to St. Francis: “Preach the Gospel at all times. Use words if necessary.” The criticism always comes from professional talkers. Here is one, from Glenn Stanton in a post on The Gospel Coalition website:

It is always attributed to St. Francis of Assisi—founder of the Franciscan Order—and is intended to say that proclaiming the Gospel by example is more virtuous than actually proclaiming with voice. It is a quote that has often rankled me because it seems to create a useless dichotomy between speech and action. Besides, the spirit behind it can be a little arrogant, intimating that those who “practice the Gospel” are more faithful to the faith than those who preach it.

The more I look at this, the more baffled I am. Isn’t it in fact more virtuous to live out the Gospel than to simply give it lip service? Aren’t we more faithful when we practice the Gospel than when we preach it? Isn’t the dichotomy in fact very useful, reminding us that it is much, much easier to say than to do?

I bring this up here partly because “exemplify, don’t exhort” makes a nice parallel with “shoot, don’t talk.” But mostly because a 1500-2000 word post is not adequate to cover the topic thoroughly. And although I may not actually be capable of covering the topic at proper length, I can’t know that without giving it a try, and that requires a different approach.

Finally, taking this opportunity to talk about my own talking gives me a chance to say something about why I write, which may not be obvious from the writing itself. I don’t write in order to tell people what to think—far from it. What little is settled in my own thinking has come at the cost of significant time and effort and occasional embarrassment, so I’m long past the point where I could ever insist that others should join me in my thinking.

I’m even past thinking that others should approach the puzzles of life in the same way, whether it leads them to agree with me or not. I’ve done this with my children, but only as a stopgap. In many areas they are too young or inexperienced to be expected to approach an issue cold, and in those cases I’m glad to offer my own thinking, along with observations on how I reached those conclusions—but the offering is made strictly as a convenience to them, and an opportunity for them to learn. Not much makes me happier than watching them come to a carefully considered conclusion that is different than my own.

So I don’t write in order to tell people what to think, or how to think, or even how to go about developing their own way of thinking. I only write to chronicle my own efforts to think, whether successful or failed, in the hopes that it might encourage others to take up the task of thinking for themselves. For those who choose not to, I wish them well—thinking for yourself is good, but it is not vital. For those who don’t realize they aren’t thinking for themselves, I hope that by chronicling lines of thinking they are unfamiliar with I might help them recognize where their wisdom is received—but only by offering a contrast, not by direct confrontation. And for those who are working at thinking for themselves, I hope to offer a bit of help by writing down what has worked for me, and how—which techniques have been helpful, which have let me down, which untried techniques look promising.

Dieting and all that

This article gives a pretty good overview of the current insane attitudes towards weight and weight loss. I like how it manages to negotiate the space without being guided by a hidden agenda. I thought at first that it might end with a clarion call for fat acceptance, but it acknowledges problems with that as well.

Even though the writer makes clear how complex the issues are—and how much money certain people stand to make by insisting they are simple—she doesn’t even touch on the issue of how the modern diet has changed people’s weight for the worse, or how much of the modern economy depends on convincing us that we are obliged to pursue peak experiences with food. That’s not a criticism of the article, since she is focused only on whether dieting works and whether it is worth the current emphasis on it. I only bring it up to point out that, as complex as her narrow take on eating is, the reality is much more so.

She ends with an example of a chronic dieter who eventually went the acceptance route but with a twist, switching her emphasis from restrained eating to competent eating:

About 10 years ago, Ellyn Satter, a dietitian and therapist in Madison, Wisconsin, developed a concept she calls eating competence, which encourages internal self-regulation about what and how much to eat rather than relying on calorie counts or lists of “good” and “bad” foods. Competent eaters, says Satter, enjoy food; they’re not afraid of it. […]

Not that abiding by competent eating, which fits the Health at Every Size paradigm, is easy; Robin Flamm would tell you that. When her clothes started to feel a little tighter, she panicked. Her first impulse was to head back to Weight Watchers. Instead, she says, she asked herself if she was eating mindfully, if she was exercising in a way that gave her pleasure, if she, maybe, needed to buy new clothes. “It’s really hard to let go of results,” she says. “It’s like free falling. And even though there’s no safety net ever, really, this time it’s knowing there’s no safety net.”

As I’m losing weight this time around, I am fairly confident that I will be able to find an acceptable weight—and stay there—because I am working not to view this as a temporary stretch of pain and deprivation as a prelude to my old uncontrolled eating, but as a matter of getting my eating under control—forever. I am losing weight due to the only factor that leads to weight loss, namely a calorie deficit. And I’m OK with that!

I suppose I could be fooling myself—it’s only been 6 months, after all—but right now I feel like I could continue this menu forever. I’m also working to be OK with a more daunting fact, namely that the averages say I’m in a 1000 calorie per day deficit, but the scales say it’s only 500 calories. If true, then it’s not like I can add a lot to the menu once I reach a weight where I’d like to stay.
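For what it’s worth, the scale-side estimate comes from the usual rule of thumb that a pound of body fat represents roughly 3,500 calories. A minimal sketch of the arithmetic (the numbers here are hypothetical round figures, not my actual ones):

```python
# Rough rule of thumb: one pound of body fat is about 3,500 kcal.
# This is an approximation, not a physiological law.
KCAL_PER_POUND = 3500

def implied_daily_deficit(pounds_lost: float, days: int) -> float:
    """Average daily calorie deficit implied by the weight actually lost."""
    return pounds_lost * KCAL_PER_POUND / days

# Hypothetical: losing a pound a week works out to about 500 kcal/day,
# half of what a 1,000 kcal/day intake-based estimate would predict.
print(implied_daily_deficit(26, 182))  # 26 lb over ~6 months -> 500.0
```

Which is exactly the sort of gap I’m describing: the intake estimate and the scale can disagree by a factor of two, and the scale wins.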

But that’s fine too, I’m not dreaming of eating all the good things I’ve been denying myself once my diet is done, I’m instead reviewing that list and learning that, in truth, I don’t miss them all that much. Passing up a piece of cake on my birthday wasn’t hard. Eating fried chicken for once (the birthday meal of choice) was nice, but I had no desire to gorge myself, even with the excuse of a special occasion. Saturday I’ll be taking Debbie to a highly recommended Indian buffet in Lexington for our 30th anniversary. I dearly love Indian food, and may not run a calorie deficit on that day—or I might, and in any case I don’t expect to come away from the buffet stuffed.

One day she was craving a hamburger, a food she wouldn’t typically have eaten. But that day, she ate a hamburger and fries for lunch. “And I was done. End of story,” she says, with a hint of wonder in her voice. No cravings, no obsessing over calories, no weeklong binge-and-restrict, no “feeling fat” and staying away from exercise. She ate a hamburger and fries, and nothing terrible happened. “I just wish more people would get it,” she says.

This is pretty accurate. In fact, the birthday menu was originally grilled hamburgers and homemade fries, except that it snowed 20 inches the day before. I picked it partly because it’s a family favorite and I’m not the only one who eats on my birthday, and partly to remind myself that I shouldn’t be a slave to anything, even a diet menu. So far I haven’t had any cravings, for hamburgers or fried chicken or Mexican or BBQ or any of my other beloved favorites. But due to circumstances I’ve eaten a couple from that list—and survived. I ate them without concern, enjoyed myself as I did, and was perfectly happy to return to my restricted menu afterwards.

Of all the topics on which I share personal experience, this is the one where I’m most reluctant, because I don’t think my experience is easy to generalize from. From what little I do understand about how food and our relationship with it works, I know that change in this area is extremely difficult and even harder to maintain over the long term. Worse, I am also aware of how little I actually understand. Worst of all, I don’t really understand why critical aspects of my attitude seem to be different this time around, and I certainly don’t give my conscious efforts the credit for that.

So I would never give advice any stronger than “I tried this and it seemed to work for me; you might want to ponder that.” So weak that it’s hardly worth writing up! But on the off chance it might prove useful to someone, I will go ahead and write it up.

Bad thinkers

This article, Bad Thinkers, argues that believing weird stuff is often due not to bad information, but to flawed intellectual character:

I want to argue for something which is controversial, although I believe that it is also intuitive and commonsensical. My claim is this: Oliver believes what he does because that is the kind of thinker he is or, to put it more bluntly, because there is something wrong with how he thinks. The problem with conspiracy theorists is not, as the US legal scholar Cass Sunstein argues, that they have little relevant information. The key to what they end up believing is how they interpret and respond to the vast quantities of relevant information at their disposal. I want to suggest that this is fundamentally a question of the way they are. Oliver isn’t mad (or at least, he needn’t be). Nevertheless, his beliefs about 9/11 are the result of the peculiarities of his intellectual constitution – in a word, of his intellectual character.

Although the writer’s example centers around 9/11 conspiracy theories, his case does not, so if you are more sympathetic to that particular theory than the writer you can choose something else—the vast right-wing conspiracy, maybe, or (something the writer himself is guilty of) an unexamined faith in the redeeming power of education.

I like the idea of intellectual virtues and vices, and I think the items he puts on each list belong there.

Gullibility, carelessness and closed-mindedness are examples of what the US philosopher Linda Zagzebski, in her book Virtues of the Mind (1996), has called ‘intellectual vices’. Others include negligence, idleness, rigidity, obtuseness, prejudice, lack of thoroughness, and insensitivity to detail. Intellectual character traits are habits or styles of thinking.

To describe Oliver as gullible or careless is to say something about his intellectual style or mind-set – for example, about how he goes about trying to find out things about events such as 9/11. Intellectual character traits that aid effective and responsible enquiry are intellectual virtues, whereas intellectual vices are intellectual character traits that impede effective and responsible inquiry. Humility, caution and carefulness are among the intellectual virtues Oliver plainly lacks, and that is why his attempts to get to the bottom of 9/11 are so flawed.

[…] Our intellectual vices are balanced by our intellectual virtues, by intellectual character traits such as open-mindedness, curiosity and rigour.

The nicest example of intellectual character fostering questionable beliefs comes from a psychologist’s investigation of the idea that basketball players can have a “hot hand”—an idea that turns out to be easily disproven.

Gilovich used detailed statistical analysis to demonstrate that the hot hand doesn’t exist – performance on a given shot is independent of performance on previous shots. The question is, why do so many basketball coaches, players and fans believe in it anyway? Gilovich’s cognitive explanation is that belief in the hot hand is due to our faulty intuitions about chance sequences; as a species, we’re bad at recognising what genuinely random sequences look like.

And yet when Gilovich sent his results to a bunch of basketball coaches, what happened next is extremely revealing. One responded: ‘Who is this guy? So he makes a study. I couldn’t care less.’ This seems like a perfect illustration of intellectual vices in operation. The dismissive reaction manifested a range of vices, including closed-mindedness and prejudice.
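The faulty-intuition point is easy to see for yourself. Here is a minimal simulation (my own sketch, not Gilovich’s analysis) of a hypothetical 50% shooter—no hot hand exists by construction, yet long streaks of makes show up anyway:

```python
import random

def longest_streak(shots):
    """Length of the longest run of consecutive makes."""
    best = run = 0
    for made in shots:
        run = run + 1 if made else 0
        best = max(best, run)
    return best

# 100 independent 50/50 shots -- pure chance, no hot hand.
shots = [random.random() < 0.5 for _ in range(100)]
print(longest_streak(shots))
```

Runs of five or six consecutive makes are entirely typical in a hundred fair coin flips, and that is exactly the kind of sequence our intuition misreads as a streak shooter heating up.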

This, in a nutshell, is the danger of seeking out like-minded community, where the goal is largely to avoid direct encounters with challenging viewpoints, preferring to hoot and holler at them from a distance, taking strength not from the soundness of your thinking but from the uncritical encouragement of your fellow enthusiasts.

Bring the right questions

I once heard Navigators missionary Jim Petersen say he read through the Bible continuously, using Matthew Henry’s reading plan—but only as a way of remembering where he left off. Sometimes he would read a lot, sometimes just a bit, depending on where he was (often waiting for a plane) and where his head was at. Sometimes he would go for a while without reading, but not for long.

What struck me, though, was that he didn’t do it as a discipline for its own sake, but as a means to finding answers. He always had one or more questions in mind as he read, and as a result would always find new things along the well-worn paths.

I’ve never developed the habit of regular Bible reading. I’ve read the Bible plenty, but it has usually come in spurts, and for years now I’ve known the territory well enough that I often find myself looking up one or more relevant passages as I think through a problem. (I like Bible Gateway for this.) And as I review passages or read what comes before or after I stumble across new things, which I either pursue or file away for later thought.

Although it doesn’t rise to the level of a full-fledged complaint, my quibble with the usual exhortations to “spend time in the Word” is that Scripture reading is presented as an end in itself, like relaxing or eating well or going for a walk. (Which is kind of funny, because these days exhortations to relax or eat well or go for a walk are usually cast as means to some other end!) And perhaps spending time in the Word has its own inherent benefits—but if so, I only experience them peripherally. I turn to the Bible because it can answer my questions, and can improve the quality of my questions as well. And I think that understanding the Bible in this way, as a resource to be used, leads one to approach Scripture differently, in many cases more profitably.

All this is prelude to a fairly modest note I want to make, which doesn’t even have to do with the Bible. I only bring that up as an illustration of a more general thing, namely that I try to approach the river of information in which we all swim in exactly the same way—I have questions, and I am always on the lookout for bits and pieces floating by that might eventually form part of an answer, or at least a better question. (It’s probably also worth mentioning that over the years I’ve gotten fairly good at determining on first glance that a particular bit will never be helpful, thereby allowing me to contentedly ignore a vast amount of stuff clamoring for attention because it is Important—or so it claims.)

One of my long-standing questions: where does the power of Christian witness reside? Or, perhaps, what enables unbelievers to resist and reject the Good News? One of the places I look for answers is periods of history where the Good News was apparently irresistible. The first one, of course, was the three hundred year stretch following Pentecost where the brotherhood grew from 120 believers to 33 million. Here’s a table charting the growth:


As Stark notes, the rate of growth works out to a pretty consistent 43% each decade. Or, put another way, each disciple making one new disciple every twenty years. Now, I don’t know of any proposed program of evangelization that suggests we expect to work with someone for twenty years on average before they experience conversion. One of the more deliberate discipleship-based modern programs, Robert Coleman’s Master Plan of Evangelism, suggests that 6 months is enough to make a convert, plus six more to equip him sufficiently to be a convert-making disciple on his own. But this is totally out of sync with the most celebrated period of Christian expansion—at the Coleman rate, it would have been accomplished in less than 20 years. So something else seems to be going on.
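The twenty-year figure follows directly from the growth rate; a quick check of the arithmetic, assuming only the 43%-per-decade rate quoted above:

```python
import math

# At 43% growth per decade, how long does the movement take to double?
# Solve (1 + r)^(t/10) = 2 for t:  t = 10 * ln(2) / ln(1 + r)
rate_per_decade = 0.43
doubling_years = 10 * math.log(2) / math.log(1 + rate_per_decade)
print(round(doubling_years, 1))  # 19.4
```

Doubling roughly every twenty years is just another way of saying that, on average, each disciple makes one new disciple in that span.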

Yesterday I stumbled across an article about John Wesley, and after a brief scan decided it was worth sending to my Kindle for later reading. As I scanned, I noticed this claim:

From 1776 to 1850 American Methodism grew like a weed. In 1776, Methodists accounted for 2.5 percent of religious adherents in the colonies, the second smallest of the major denominations of that time. By 1850, Methodists comprised 34.2 percent of religious adherents in the United States, which was 14 percent more than the next largest group.

I was already intrigued by Wesley’s method of discipling believers in small groups—as well as the fact that, two hundred years later, this approach is hardly remembered, even by Methodists. And now I have a pointer to another intriguing episode of explosive growth—and not just explosive but sustained, enough so to eventually capture one-third of a pretty religious population. Was it simply the Good News, presented Methodist style, which led to this? Or was it something about Wesley’s method?

Ellie Kemper

This article is unusual—in a good way!—and so I thought I’d highlight it. (When I see a link to Vulture on a topic that interests me I always follow it, and am never disappointed.)

Before watching Unbreakable Kimmy Schmidt on Netflix, I had no idea who Kemper was, and reading this article I see why that is—she hasn’t had an especially high profile before now. I didn’t expect to like the show, but the buzz and the half-hour runtime led me to give it a try. I think the show is terrific, with an edgy urban humor I like in small doses together with an incredible sunniness that makes the package delightful. And Kemper is a big part of that, with her Lucille-Ball-like over-the-top expressiveness.

What’s good about the article is that it shows how Kemper became who she is today, kicking off with a short homemade horror film with a very young Ellie, continuing with sample clips from 1994, 1998, 2002, 2006, 2009, and so on that illustrate her development as an actress while also tracking her career, adding just enough text to put the clips into context. So in the end, the best part of the article is Kemper herself, who supplies most of the content through online videos, nicely organized and lightly annotated.

Warning: several of the clips are highly adult in subject matter, so click cautiously—but I think the titles make it clear which ones to avoid.

Is commitment teachable?

Seth Godin clarifies something for me:

[Teaching technique is] a waste because the fact is, most people can learn to be good at something, if they only choose to be, if they choose to make the leap and put in the effort and deal with the failure and the frustration and the grind.

But most people don’t want to commit until after they’ve discovered that they can be good at something. So they say, "teach me, while I stand here on one foot, teach me while I gossip with my friends via text, teach me while I wander off to other things. And, sure, if the teaching sticks, then I’ll commit."

I’ve seen what Godin describes, and I still don’t understand it completely. It’s so obvious that acquiring a skill will involve a long, sustained stretch of failed effort—failed because you don’t yet have the skill. And making the effort is no guarantee you will ever master the skill. But it’s the only path. And there are plenty of secondary benefits that will come from making the attempt, whether or not the effort eventually succeeds. What else but commitment can power you through such a disheartening stretch?

For a while I thought the culprit here was a general refusal to fail. And I think that is part of it. One current bit of advice for accomplishing something is to publicize the goal, and then use the potential shame of public failure as motivation to stick with the program. I’ve tried this occasionally in the past, but it fails me the moment I realize that no one really cares whether I succeed or fail. Plus I’m too comfortable with failure, private and public.

As part of my job I’ve watched good teachers engage adult students who need help, only to see those students do anything in their power to deflect that help. Different folks use different strategies, but it boils down to: don’t get too close. Tell me what you have to say, and then let me consider it, preferably after you’ve gone away. The best teachers will get closer than the student wants, then confront the student directly with their deficiency, then tell the student how to correct it—and then wait until the student makes some attempt, right there, to do what the teacher said. Often it’s this last step at which students balk. It’s humiliating enough to be told where you fall short, worse to be given the obvious (usually simple) solution, but worst of all to have to acknowledge it all by doing what the teacher says right then and there. Until that final step we are still able to “stand up on the inside”, but doing what the teacher says forces us to sit down both outside and inside.

(I’m referring here to an anecdote I first heard from James Dobson, telling of a toddler who got into a war of wills with his mother about standing up in his highchair. At the end he plops down, saying—with his glare, if not in actual words—“I may be sitting down on the outside, but I’m standing up on the inside.”)

So I thought the problem was a refusal to fail, which is essentially a refusal to become a student. Godin suggests that the problem is less severe—that folks are willing to sign up for studenthood and endure a stretch of failure if they can somehow be reassured that success would be the end result. This is heartening if true. I have no idea whether it is actually true.

There are few areas where excellence is within reach of everyone, but surely one is godly living. History testifies to that, and until community evaporated, one’s everyday life testified to it through the many neighbors who were unremarkable except in their godliness. Perhaps being surrounded by that cloud of witnesses was enough to generate commitment in a young (or even older) believer, who had a long way to go but also serious reassurance that the effort would end in success.

But I have the sense that today we no longer have such reassurance—and in fact don’t even know what success might look like when embodied. It’s one of the few hypotheses I have for why so few people I know seem to be engaged in the tedious work of discipleship, occasionally discouraging but never mysterious. It’s the only task Jesus has set for us, and the rewards are significant both at the end and along the way. What could keep us from it except a refusal to endure failure, or at least a fear that we can never succeed?