Occasionally I’ve run across an item that seems to express an idea perfectly—yet the idea is buried so deep that I can’t quite pin it down. More than thirty years ago, in the back pages of Texas Monthly magazine, I saw a small ad (for an outlet mall, of all places) which pictured a fashion model striding down a runway, with the caption “Be the Star of Your Own Movie”.
I hadn’t thought much at that point about culture or history or modernity, but even then I knew that it resonated because it embodied a sort of narcissism that was not only in full flower at that point (this being a few years past Tom Wolfe’s Me Decade), but was also something new on the scene. As prideful as I could be at times, I had never really envisioned my unfolding life as a movie about me. And I was certain that my folks would be baffled by such thinking.
Although I had a label for it, I still wondered: Why now? Why everywhere, all at once? Was it really a cultural shift, and if so what caused/enabled it? In the years since, the trend seems only to have broadened and deepened, manifesting itself everywhere, even having its way with folks my parents’ age. And although those questions haven’t exactly driven my own studies, I’ve always had them in the back of my mind as I’ve learned more about where we are and how we got here. At this point I have a much clearer understanding of the sources and trajectories involved, but have yet to put the puzzle together.
This week I ran across some key pieces. I’m re-reading a book by Thomas de Zengotita, called Mediated: How the Media Shapes Your World and the Way You Live in It. It’s pretty good—I first picked it up on Thursday afternoon and am already halfway through my second reading. Here is the very first paragraph.
Ask yourself this: did members of the Greatest Generation spend a lot of time talking about where they were and what they did and how they felt when they first heard the news from Pearl Harbor? People certainly remembered the moment, and a few anecdotes got passed around—but did a whole folk genre spontaneously emerge? Did everyone feel compelled to craft a little narrative, starring me, an oft-repeated and inevitably embellished story-for-the-ages reporting on my personal experience of the Event? Or did they just assume that Pearl Harbor and its consequences were what mattered, and talk about that?
This captures something I’ve seen over and over, and wondered at. Increasingly over the years, as I’ve watched people, listened to them talk, or read something they wrote, a frustrated thought has formed and built in me: why should anyone care what you think about that? Note that I only react this way when there is no obvious reason to value the other person’s thoughts on the topic, whether from having studied it, having been directly involved with the event, or even just being generally wise or experienced. When I found myself reacting like this sporadically, I chalked it up to having run across someone unusually lacking in self-awareness. Then it began happening more often, and I suspected it was a generational thing. Now it happens to me all the time, and I have to acknowledge that I’m the one who was left behind when the zeitgeist shifted. Whether on balance the shift is good or bad, I need to avoid the trap of “Why, back in my day …” and instead figure out how to respond to this new thinking with integrity and grace.
Looking back through my old blog posts, I see I’ve addressed one aspect of this, the idea of newsjacking, where one injects oneself into a trending story, in hopes of riding its coattails. Quoting myself here:
I think it points to a basic but unnamed impulse that has flourished along with the internet. For a long time I’ve noticed it primarily in comment threads on blogs, where people often don’t interact with the content of a post but simply use it as a springboard to talk about themselves. When I called the technique anything at all, I would call it (somewhat meanly) “That reminds me of … ME!” It is not the same as hijacking a thread, which involves redirecting the entire discussion somewhere the original post didn’t go. It is smaller and self-contained, a way of injecting oneself into a discussion without actually needing to address the topic at hand. Or, to be mean again, a natural result when the commenter finds their own experiences and opinions more worthy of comment than those of the blogger.
I go on to cite Paul Ford, who claims that this impulse, properly recognized, is key to understanding how people behave on the internet. Ford says:
“Why wasn’t I consulted,” which I abbreviate as WWIC, is the fundamental question of the web. It is the rule from which other rules are derived. Humans have a fundamental need to be consulted, engaged, to exercise their knowledge (and thus power), and no other medium that came before has been able to tap into that as effectively.
My only quibble with this statement is the “humans have a fundamental need” part. Have folks always felt that their opinion counted, no matter the topic, simply because it was their opinion? The fact that my frustration has increased, eventually to peg the needle, suggests otherwise. The Pearl Harbor anecdote above suggests otherwise. And lots of my reading over the years suggests otherwise, but I won’t try to make the case here, only ask that you entertain the possibility.
So, what changed? de Zengotita says it is the triumph of media, or more specifically, representational flattery—we no longer experience the world directly, but instead as representations served up for our delectation, as if we were of central importance. To give his reader a feel for the difference, he offers this thought experiment:
Say your car breaks down in the middle of nowhere—the middle of Saskatchewan, say. You have no radio, no cell phone, nothing to read, no gear to fiddle with. You just have to wait. Pretty soon you notice how everything around you just happens to be there. And it just happens to be there in this very precise but unfamiliar way. You are so not used to this. Every tuft of weed, the scattered pebbles, the lapsing fence, the cracks in the asphalt, the buzz of insects in the field, the flow of cloud against the sky, everything is very specifically exactly the way it is—and none of it is for you. Nothing here was designed to affect you. It isn’t arranged so you can experience it, you didn’t plan to experience it, there isn’t any screen, there isn’t any display, there isn’t any entrance, no brochure, nothing special to look at, no dramatic scenery or wildlife, no tour guide, no campsites, no benches, no paths, no viewing platforms with natural-historical information posted under slanted Plexiglas lectern things—whatever is there is just there, and so are you. And your options are limited. You begin to get a sense of your real place in the great scheme of things.
Some people find this profoundly comforting. Wittgenstein, for example.
So that’s a baseline for comparison. What it teaches us is this: in a mediated world, the opposite of real isn’t phony or illusional or fictional—it’s optional. Idiomatically, we recognize this when we say “The reality is …,” meaning something that has to be dealt with, something that isn’t an option. We are most free of mediation, we are most real, when we are at the disposal of accident and necessity. That’s when we are not being addressed. That’s when we go without the flattery intrinsic to representation.
The key new notion in this meditation is options. In fact, a few pages later de Zengotita states that he is bringing just two new notions to the table, representational flattery and the role of proliferating options in buffering us from reality. The rest of the book devotes itself to illustrating how those two notions manifest their effects in different areas of life—helicopter parenting, prolonged adolescence, celebrity, politics, the busyness of modern life, our uneasy relationship with nature, and so on.
A third notion—which isn’t presented as succinctly as the first two, maybe because he doesn’t see it as original—is that we are all performers now. Just as the world is represented to us via the media, we must represent ourselves to the world in a fashion unknown to our grandparents. Rather than simply growing into the role a community provides for us/imposes on us, we now need to sort through the myriad options available in order to construct a self, a process that is often agonizing, not just because of the pain of learning but also because of the constant risk of “buyer’s remorse”—what if I choose badly?
It’s this third notion that resonates most strongly with me, and I vaguely wish de Zengotita had a clearer statement to make about it—but then again I admire his determination to simply explore the landscape without offering prescriptions, to do his best to make the reader feel aspects of the water we all swim in and leave it at that. So I’ll do my best to absorb his thinking, then make my own applications. (You can get a taste of his thinking in this four-minute video clip, where he claims that his grandfather’s “ability” to be naturally authentic is now lost to us, lost in a way that can’t be recovered, and we need to find a different path to authenticity.)
This part interests me because of my recent focus on character formation. Whatever its historical sources, de Zengotita claims the change qualifies as a recent shift because the breezes of the past have lately become a hurricane we must now deal with. It is no good to yearn for the earlier times when character was a gift the community bestowed on us, and even more foolish to try to restore such an environment. And as he points out, many of the options have their good points, at least individually considered. So (this is my prescription, not his) perhaps we would be better off learning how to choose wisely among the options.
In becoming a musical performer I learned that technique is often confused with phoniness—people think that “true” musicians somehow just let the music come out. But, as becomes obvious after just a little thought, a lot of mechanical groundwork needs to be laid before a musician can reliably, effectively produce the notes which express the music behind them.
And even if you get that, it’s easy to confuse the groundwork with the music. Vladimir Horowitz once said that he never worried about being challenged by the endless stream of technically more proficient whiz kids who came after him, because they would practice, practice, practice—and when they got on stage they would practice some more. Similarly, when I was learning to sing, it took much time and effort for me to hear what I was doing well enough to develop the technical skills I needed to produce the sound I wanted—and then more time and effort to get my mind off the technical production of the sound, to trust in my skills enough to use them for some other purpose than simply using them. Was I somehow a phony singer because I wasn’t born with that ability, but instead set out at age 50 to acquire it?
And while we’re discussing phoniness and performance, allow me to hearken back to an old post on the subject, when Chris and I were exploring the concept of stage presence:
At one of our earliest performances, an open mic program on stage at Natural Tunnel State Park, a fellow stopped by who had some experience performing, and asked to do a few songs. He also asked me and another fellow, a guitar player, to accompany him. When the guitarist began playing a break, the new guy watched him with a big smile, nodded his head in time, then turned to me with the same big smile as if to say, “Isn’t that great?” To me, sitting next to him, it felt weird and artificial, easy to mistake for insincerity. But it wasn’t insincere at all; the guy was in fact enjoying the break, and saying to me “Isn’t that great?” It was just that in order to communicate that from the stage to a bunch of people thirty feet away, he had to do some things I wasn’t used to doing or seeing up close.
The life lesson I learned from that exploration: it isn’t enough to, say, appreciate someone else’s effort—you must communicate your appreciation accurately. The same with admiration, or disappointment, or anger, or pleasure, or the rest. The writers I respect the least are the ones who blame their readers for misunderstanding them. Similarly with well-intended folks who fail to act on those intentions. Similarly with folks who think it is sufficient to feel a certain way towards others, leaving them the job of divining those feelings.
So, must we resign ourselves to a life of feeling phony, of deliberately performing for others in order to actually convey what we intend to convey? Well, yes and no. I think we (in the WEIRD world, anyway) no longer inhabit a context which naturally shapes us into an authentic character, and there’s no going back. Our character is something we need to assemble, and our best hope is to learn how to do that properly, to acquire the knowledge and skills and disciplines that will equip us for the task of transforming ourselves into proper human beings.
That’s the yes part. The no part is this: once we’ve thoroughly learned what we need to know, once we’ve practiced it all over and over—at that point we actually become the human we set out to construct, with characteristics which were carefully chosen and nurtured but are now second nature to us. At which point we can stop thinking about the performance, and simply perform. We can stop thinking about how to love, and simply love.
For a while now my life verse has been 1 Thess 4:11: “Aspire to live quietly, and to mind your own affairs, and to work with your hands, as we instructed you.” But I’m toying with the idea of changing it to Matthew 5:16: “In the same way, let your light shine before others, so that they may see your good works and give glory to your Father who is in heaven.” I suspect that in these dark days the greatest gift we can give someone is to help them understand that a godly life is a live option—not only with our lips, but in our lives. I suspect that more folks might give it a try if they see others successful in their efforts.