Dieting and all that

This article gives a pretty good overview of the current insane attitudes towards weight and weight loss. I like how it manages to negotiate the space without being guided by a hidden agenda. I thought at first that it might end with a clarion call for fat acceptance, but it acknowledges problems with that as well.

Even though the writer makes clear how complex the issues are—and how much money certain people stand to make by insisting they are simple—she doesn’t even touch on the issue of how the modern diet has changed people’s weight for the worse, or how much of the modern economy depends on convincing us that we are obliged to pursue peak experiences with food. That’s not a criticism of the article, since she is focused only on whether dieting works and whether it is worth the current emphasis on it. I only bring it up to point out that, as complex as her narrow take on eating is, the reality is much more so.

She ends with an example of a chronic dieter who eventually went the acceptance route but with a twist, switching her emphasis from restrained eating to competent eating:

About 10 years ago, Ellyn Satter, a dietitian and therapist in Madison, Wisconsin, developed a concept she calls eating competence, which encourages internal self-regulation about what and how much to eat rather than relying on calorie counts or lists of “good” and “bad” foods. Competent eaters, says Satter, enjoy food; they’re not afraid of it. […]

Not that abiding by competent eating, which fits the Health at Every Size paradigm, is easy; Robin Flamm would tell you that. When her clothes started to feel a little tighter, she panicked. Her first impulse was to head back to Weight Watchers. Instead, she says, she asked herself if she was eating mindfully, if she was exercising in a way that gave her pleasure, if she, maybe, needed to buy new clothes. “It’s really hard to let go of results,” she says. “It’s like free falling. And even though there’s no safety net ever, really, this time it’s knowing there’s no safety net.”

As I’m losing weight this time around, I am fairly confident that I will be able to find an acceptable weight—and stay there—because I am working not to view this as a temporary stretch of pain and deprivation in prelude to my old uncontrolled eating, but as a matter of getting my eating under control—forever. I am losing weight due to the only factor that leads to weight loss, namely a calorie deficit. And I’m OK with that!

I suppose I could be fooling myself—it’s only been 6 months, after all—but right now I feel like I could continue this menu forever. I’m also working to be OK with a more daunting fact, namely that the averages say I’m in a 1000 calorie per day deficit, but the scales say it’s only 500 calories. If true, then it’s not like I can add a lot to the menu once I reach a weight where I’d like to stay.
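For what it’s worth, the gap between the two deficit estimates can be reasoned about with the common ~3500-calories-per-pound rule of thumb (itself only a rough approximation). The numbers below are hypothetical, chosen only to show how a scale reading implies an average deficit:

```python
# Back-of-the-envelope check: what daily deficit does observed weight
# loss actually imply? Assumes the rough 3500 kcal-per-pound rule.
KCAL_PER_POUND = 3500

def implied_daily_deficit(pounds_lost, days):
    """Average daily calorie deficit implied by weight lost over a period."""
    return pounds_lost * KCAL_PER_POUND / days

# Hypothetical: losing ~26 lb over 6 months (~182 days) implies a deficit
# of about 500 kcal/day, even if food and exercise logs suggest 1000.
print(round(implied_daily_deficit(26, 182)))  # → 500
```

The point of the sketch is just that the scale is the more trustworthy instrument: whatever the logs say, the implied deficit follows from the weight actually lost.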

But that’s fine too. I’m not dreaming of eating all the good things I’ve been denying myself once my diet is done; instead I’m reviewing that list and learning that, in truth, I don’t miss them all that much. Passing up a piece of cake on my birthday wasn’t hard. Eating fried chicken for once (the birthday meal of choice) was nice, but I had no desire to gorge myself, even with the excuse of a special occasion. Saturday I’ll be taking Debbie to a highly recommended Indian buffet in Lexington for our 30th anniversary. I dearly love Indian food, and may not run a calorie deficit on that day—or I might, and in any case I don’t expect to come away from the buffet stuffed.

One day she was craving a hamburger, a food she wouldn’t typically have eaten. But that day, she ate a hamburger and fries for lunch. “And I was done. End of story,” she says, with a hint of wonder in her voice. No cravings, no obsessing over calories, no weeklong binge-and-restrict, no “feeling fat” and staying away from exercise. She ate a hamburger and fries, and nothing terrible happened. “I just wish more people would get it,” she says.

This is pretty accurate. In fact, the birthday menu was originally grilled hamburgers and homemade fries, except that it snowed 20 inches the day before. I picked it partly because it’s a family favorite and I’m not the only one who eats on my birthday, and partly to remind myself that I shouldn’t be a slave to anything, even a diet menu. So far I haven’t had any cravings, for hamburgers or fried chicken or Mexican or BBQ or any of my other beloved favorites. But due to circumstances I’ve eaten a couple from that list—and survived. I ate them without concern, enjoyed myself as I did, and was perfectly happy to return to my restricted menu afterwards.

Of all the topics on which I share personal experience, this is the one where I’m most reluctant, because I don’t think my experience is easy to generalize from. From what little I do understand about how food and our relationship with it works, I know that change in this area is extremely difficult and even harder to maintain over the long term. Worse, I am also aware of how little I actually understand. Worst of all, I don’t really understand why critical aspects of my attitude seem to be different this time around, and I certainly don’t give my conscious efforts the credit for that.

So I would never give advice any stronger than “I tried this and it seemed to work for me, you might want to ponder that.”  So weak that it’s hardly worth writing up! But on the off chance it might prove useful to someone, I will go ahead and write it up.

Bad thinkers

This article, Bad Thinkers, argues that believing weird stuff is often due not to bad information, but to flawed intellectual character:

I want to argue for something which is controversial, although I believe that it is also intuitive and commonsensical. My claim is this: Oliver believes what he does because that is the kind of thinker he is or, to put it more bluntly, because there is something wrong with how he thinks. The problem with conspiracy theorists is not, as the US legal scholar Cass Sunstein argues, that they have little relevant information. The key to what they end up believing is how they interpret and respond to the vast quantities of relevant information at their disposal. I want to suggest that this is fundamentally a question of the way they are. Oliver isn’t mad (or at least, he needn’t be). Nevertheless, his beliefs about 9/11 are the result of the peculiarities of his intellectual constitution – in a word, of his intellectual character.

Although the writer’s example centers on 9/11 conspiracy theories, his case does not, so if you are more sympathetic to that particular theory than the writer is, you can choose something else—the vast right-wing conspiracy, maybe, or (something the writer himself is guilty of) an unexamined faith in the redeeming power of education.

I like the idea of intellectual virtues and vices, and I think the items he puts on each list belong there.

Gullibility, carelessness and closed-mindedness are examples of what the US philosopher Linda Zagzebski, in her book Virtues of the Mind (1996), has called ‘intellectual vices’. Others include negligence, idleness, rigidity, obtuseness, prejudice, lack of thoroughness, and insensitivity to detail. Intellectual character traits are habits or styles of thinking.

To describe Oliver as gullible or careless is to say something about his intellectual style or mind-set – for example, about how he goes about trying to find out things about events such as 9/11. Intellectual character traits that aid effective and responsible enquiry are intellectual virtues, whereas intellectual vices are intellectual character traits that impede effective and responsible inquiry. Humility, caution and carefulness are among the intellectual virtues Oliver plainly lacks, and that is why his attempts to get to the bottom of 9/11 are so flawed.

[…] Our intellectual vices are balanced by our intellectual virtues, by intellectual character traits such as open-mindedness, curiosity and rigour.

The nicest example of intellectual character fostering questionable beliefs comes from a psychologist’s investigation of the idea that basketball players can have a “hot hand,” something that is easily disproven.

Gilovich used detailed statistical analysis to demonstrate that the hot hand doesn’t exist – performance on a given shot is independent of performance on previous shots. The question is, why do so many basketball coaches, players and fans believe in it anyway? Gilovich’s cognitive explanation is that belief in the hot hand is due to our faulty intuitions about chance sequences; as a species, we’re bad at recognising what genuinely random sequences look like.

And yet when Gilovich sent his results to a bunch of basketball coaches, what happened next is extremely revealing. One responded: ‘Who is this guy? So he makes a study. I couldn’t care less.’ This seems like a perfect illustration of intellectual vices in operation. The dismissive reaction manifested a range of vices, including closed-mindedness and prejudice.

This, in a nutshell, is the danger of seeking out like-minded community, where the goal is largely to avoid direct encounters with challenging viewpoints, preferring to hoot and holler at them from a distance, taking strength not from the soundness of your thinking but from the uncritical encouragement of your fellow enthusiasts.

Bring the right questions

I once heard Navigators missionary Jim Petersen say he read through the Bible continuously, using Matthew Henry’s reading plan—but only as a way of remembering where he left off. Sometimes he would read a lot, sometimes just a bit, depending on where he was (often waiting for a plane) and where his head was at. Sometimes he would go for a while without reading, but not for long.

What struck me, though, was that he didn’t do it as a discipline for its own sake, but as a means to finding answers. He always had one or more questions in mind as he read, and as a result would always find new things along the well-worn paths.

I’ve never developed the habit of regular Bible reading. I’ve read the Bible plenty, but it has usually come in spurts, and for years now I’ve known the territory well enough that I often find myself looking up one or more relevant passages as I think through a problem. (I like Bible Gateway for this.) And as I review passages or read what comes before or after I stumble across new things, which I either pursue or file away for later thought.

Although it doesn’t rise to the level of a full-fledged complaint, my quibble with the usual exhortations to “spend time in the Word” is that Scripture reading is presented as an end in itself, like relaxing or eating well or going for a walk. (Which is kind of funny, because these days exhortations to relax or eat well or go for a walk are usually cast as means to some other end!) And perhaps spending time in the Word has its own inherent benefits—but if so, I only experience them peripherally. I turn to the Bible because it can answer my questions, and can improve the quality of my questions as well. And I think that understanding the Bible in this way, as a resource to be used, leads one to approach Scripture differently, in many cases more profitably.

All this is prelude to a fairly modest note I want to make, which doesn’t even have to do with the Bible. I only bring that up as an illustration of a more general thing, namely that I try to approach the river of information in which we all swim in exactly the same way—I have questions, and I am always on the lookout for bits and pieces floating by that might eventually form part of an answer, or at least a better question. (It’s probably also worth mentioning that over the years I’ve gotten fairly good at determining on first glance that a particular bit will never be helpful, thereby allowing me to contentedly ignore a vast amount of stuff clamoring for attention because it is Important—or so it claims.)

One of my long-standing questions: where does the power of Christian witness reside? Or, perhaps, what enables unbelievers to resist and reject the Good News? One of the places I look for answers is periods of history where the Good News was apparently irresistible. The first one, of course, was the three-hundred-year stretch following Pentecost where the brotherhood grew from 120 believers to 33 million. Here’s a table charting the growth:

[Table: decade-by-decade growth of the early church, per Rodney Stark]

As Stark notes, the rate of growth works out to a pretty consistent 43% each decade. Or, put another way, each disciple makes one new disciple every twenty years. Now, I don’t know of any proposed program of evangelization that suggests we expect to work with someone for twenty years on average before they experience conversion. One of the more deliberate discipleship-based modern programs, Robert Coleman’s Master Plan of Evangelism, suggests that six months is enough to make a convert, plus six more to equip him sufficiently to be a convert-making disciple on his own. But this is totally out of sync with the most celebrated period of Christian expansion—at the Coleman rate, it would have been accomplished in less than 20 years. So something else seems to be going on.
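The arithmetic here can be sanity-checked in a few lines (a sketch only; the 43%-per-decade figure is as quoted, and the Coleman pace is idealized as the community doubling every year):

```python
import math

# At 43% growth per decade, how long until the community doubles?
rate_per_decade = 0.43
doubling_decades = math.log(2) / math.log(1 + rate_per_decade)
print(round(doubling_decades * 10))  # → 19 (years): roughly one new disciple per disciple per twenty years

# At the Coleman pace (each disciple reproduces in a year, so the
# community doubles yearly), how long from 120 believers to 33 million?
years_to_33m = math.log2(33_000_000 / 120)
print(math.ceil(years_to_33m))  # → 19 (years): "less than 20"
```

The contrast is the point: a yearly doubling would have compressed three centuries of growth into two decades, so whatever drove the early expansion, it was not a crash program of rapid conversion.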

Yesterday I stumbled across an article about John Wesley, and after a brief scan decided it was worth sending to my Kindle for later reading. As I scanned, I noticed this claim:

From 1776 to 1850 American Methodism grew like a weed. In 1776, Methodists accounted for 2.5 percent of religious adherents in the colonies, the second smallest of the major denominations of that time. By 1850, Methodists comprised 34.2 percent of religious adherents in the United States, which was 14 percent more than the next largest group.

I was already intrigued by Wesley’s method of discipling believers in small groups—as well as the fact that, two hundred years later, this approach is hardly remembered, even by Methodists. And now I have a pointer to another intriguing episode of explosive growth—and not just explosive but sustained, enough so to eventually capture one-third of a pretty religious population. Was it simply the Good News, presented Methodist style, which led to this? Or was it something about Wesley’s method?

Ellie Kemper

This article is unusual—in a good way!—and so I thought I’d highlight it. (When I see a link to Vulture on a topic that interests me I always follow it, and am never disappointed.)

Before watching Unbreakable Kimmy Schmidt on Netflix, I had no idea who Kemper was, and reading this article I see why that is—she hasn’t had an especially high profile before now. I didn’t expect to like the show, but the buzz and the half-hour runtime led me to give it a try. I think the show is terrific, with an edgy urban humor I like in small doses together with an incredible sunniness that makes the package delightful. And Kemper is a big part of that, with her Lucille-Ball-like over-the-top expressiveness.

What’s good about the article is that it shows how Kemper became who she is today, kicking off with a short homemade horror film with a very young Ellie, continuing with sample clips from 1994, 1998, 2002, 2006, 2009, and so on that illustrate her development as an actress while also tracking her career, adding just enough text to put the clips into context. So in the end, the best part of the article is Kemper herself, who supplies most of the content through online videos, nicely organized and lightly annotated.

Warning: several of the clips are highly adult in subject matter, so click cautiously—but I think the titles make it clear which ones to avoid.

Is commitment teachable?

Seth Godin clarifies something for me:

[Teaching technique is] a waste because the fact is, most people can learn to be good at something, if they only choose to be, if they choose to make the leap and put in the effort and deal with the failure and the frustration and the grind.

But most people don’t want to commit until after they’ve discovered that they can be good at something. So they say, "teach me, while I stand here on one foot, teach me while I gossip with my friends via text, teach me while I wander off to other things. And, sure, if the teaching sticks, then I’ll commit."

I’ve seen what Godin describes, and I still don’t understand it completely. It’s so obvious that acquiring a skill will involve a long, sustained stretch of failed effort—failed because you don’t yet have the skill. And making the effort is no guarantee you will ever master the skill. But it’s the only path. And there are plenty of secondary benefits that will come from making the attempt, whether or not the effort eventually succeeds. What else but commitment can power you through such a disheartening stretch?

For a while I thought the culprit here was a general refusal to fail. And I think that is part of it. One current bit of advice for accomplishing something is to publicize the goal, and then use the potential shame of public failure as motivation to stick with the program. I’ve tried this occasionally in the past, but it fails me at the moment I realize that no one really cares whether I succeed or fail. Plus I’m too comfortable with failure, private and public.

As part of my job I’ve watched good teachers engage adult students who need help, only to see those students do anything in their power to deflect that help. Different folks use different strategies, but it boils down to: don’t get too close. Tell me what you have to say, and then let me consider it, preferably after you’ve gone away. The best teachers will get closer than the student wants, then confront the student directly with their deficiency, then tell the student how to correct it—and then wait until the student makes some attempt, right there, to do what the teacher said. Often it’s this last step at which students balk. It’s humiliating enough to be told where you fall short, worse to be given the obvious (usually simple) solution, but worst of all to have to acknowledge it all by doing what the teacher says right then and there. Until that final step we are still able to “stand up on the inside”, but doing what the teacher says forces us to sit down both outside and inside.

(I’m referring here to an anecdote I first heard from James Dobson, telling of a toddler who got into a war of wills with his mother about standing up in his highchair. At the end he plops down, saying—with his glare, if not in actual words—“I may be sitting down on the outside, but I’m standing up on the inside.”)

So I thought the problem was a refusal to fail, which is essentially a refusal to become a student. Godin suggests that the problem is less severe—that folks are willing to sign up for studenthood and endure a stretch of failure if they can somehow be reassured that success would be the end result. This is heartening if true. I have no idea whether it is actually true.

There are few areas where excellence is within reach of everyone, but surely one is godly living. History testifies to that, and until community evaporated, one’s everyday life testified to it through the many neighbors who were unremarkable except in their godliness. Perhaps being surrounded by that cloud of witnesses was enough to generate commitment in a young (or even older) believer, who had a long way to go but also serious reassurance that the effort would end in success.

But I have the sense that today we no longer have such reassurance—and in fact don’t even know what success might look like when embodied. It’s one of the few hypotheses I have for why so few people I know seem to be engaged in the tedious work of discipleship, occasionally discouraging but never mysterious. It’s the only task Jesus has set for us, and the rewards are significant both at the end and along the way. What could keep us from it except a refusal to endure failure, or at least a fear that we can never succeed?

Village farming

I’m far from an expert on either farming or villages, but I think that Gene Logsdon is correct when he says that the village represents the apex of human civilization:

As far as I can find in history and archeology, as the hunting and gathering age gradually evolved into settled communities, farming was very much a village affair, not an individual family undertaking.  People congregated into groups for mutual protection and for sharing the work load. Their garden farms were clustered around the outskirts of their villages. Among the many advantages, there were plenty of children and dogs running around, scaring wild animals away from the crops.

Traditionally in Europe and especially Asia where even today the average size of farms is under five acres in some areas, farmers lived in villages and went out to their acres during the day.  Immigrants who lived this integrated village farming life in Austria have told me how much more comfortable and enjoyable life was compared to what they found in America. In their homeland, farmers often worked in groups in the fields and then returned to town in the evenings, to community, and on porches, street corners,  and in taverns, they talked to each other, shared ideas and events, tended to see both farm field and urban shop as one community united in work and play. In America they felt lonely on American farms. […]

I’m sure it sounds ridiculous, but I like to think that the village represents the apex of human civilization. Village life is more secure and comfortable than the lonely ramparts of the outer countryside or the crowded nonentity of the big city. The world is littered with the ruins of great cities. The way to keep a nation vital and human is to keep it as a collection of villages spread out over the landscape. This new age of local garden farming is a way to do this. It is causing the return of the village as the center of human endeavor. People are coming together for that most basic need of all: good food. They are realizing that humans have a lot more in common than geographical, political, economic, and religious differences would imply. As they flock into farm markets, why, my goodness, they realize they can actually like people of different ideologies.

We’ve lived in the city, and we’ve lived in the country. Both lack the community that the village promises, at least as described by Logsdon. I’m agnostic as to whether the current interest in urban farming can take us back there, but I am persuaded that neither the city nor the country will be able to.

Just pick something

When Debbie and I were engaged, her family began to include me in their outings. Any trip to a restaurant was preceded by a long, inconclusive discussion of possible places to go. This was something unusual for me. I suppose there weren’t many choices open to me while growing up, but even when that changed my approach tended to be: just pick something. Considering the options was something I never enjoyed.

I mentioned the paradox of choice last week. Today I ran across an article which addresses the same issue. The best part of the article is the opening graphic.

It’s a simple point, but the simple ones bear repeating: the increment between the good choice you can make right away and the better one you might possibly make after considering it further is often outweighed by the cost of considering—cost in time, in anguish, and in delaying the benefit of having the thing.

As I drove home, I realized how refreshingly different this experience had been. My normal pattern when it comes to spending money is to look at every option. I read reviews, study alternatives and ultimately spend way too much time finding a way to save a bit of money.

I go through this process even when I have already decided I’ll make the purchase. I’m also very aware that the time I spend overanalyzing prices will cost me way more money, in the form of opportunity costs and cognitive drain, than I could ever hope to save.

On top of all that, it’s painful.

I did the opposite with the boot heaters. I decided that if my wife was going to enjoy skiing, she needed (yes, needed) boot heaters. I had the money, so I bought them.

It was such a relief to make a decision without weighing 50 options, reading hundreds of testimonials or calling friends for recommendations. I knew what I wanted, and previous experience with the local shop gave me the confidence to trust I had made a good decision.

Not the best decision, just a good one. Contentment has its rewards.