Too Much Meta!

“What is meta,” you may ask, “and how is there too much of it?”  Those are excellent questions.  In order to answer them, I’ll need to give a little background on just what it is I’m talking about.  “Meta” comes from the Greek preposition μετά, which simply means “after” or “beyond”, among other things.  It can also serve as a prefix that attaches this basic meaning to a root word.  For example, “metamorphosis” pairs meta– with a derivative of μορφή (morphē), “form” or “shape”, giving the meaning “beyond the [original] form”.  Thus, in a metamorphosis, something (such as a caterpillar) goes beyond the form it has into another form (such as a butterfly).

A subtle shift in this straightforward meaning began with the works of Aristotle, and rather inadvertently, at that.  Aristotle’s books on various topics derived from what we would now call lecture notes for the talks he gave at the school he founded, the Lyceum.  These were either written by Aristotle himself or taken down by his students.  After his death, these notes were collated and arranged by topic.  The book dealing with the workings of the natural world was called the Physics, from the Greek φυσικά (physika), which simply means “having to do with nature”.  The name stuck, and we still call the study of mass, energy, motion, and such “physics”.  The book that was placed next in the sequence after the Physics dealt with abstract topics on the nature of being, what we can know and how we can know it, causality, and so on.  Whoever arranged the texts very pragmatically called this text τὰ μετὰ τὰ φυσικά (ta meta ta physika), literally, “the things coming after the Physics”.  In other words, it was the next book after the one on physics, so its title was essentially After Physics!

This was shortened by the Romans, who translated Aristotle into Latin, to Metaphysica, which we Anglicize as Metaphysics.  From early on, the tendency was to interpret “meta”–“beyond”–as meaning not “beyond” in the sense of “the next book in the sequence”, which was its original connotation, but “beyond” in the sense of “transcending”.  Thus “metaphysics” was understood to mean “that which goes beyond ordinary physics” or “that which transcends nature”.  This has been the standard connotation of “metaphysics” ever since; and this connotation has determined the use of “meta” in other contexts, as well.

From “beyond” in the sense of “transcending”, meta has come to have a slightly more specific meaning of “self-referential” or “recursive”.  For example:  When my daughter was very little, we used to watch various shows for little kids on the satellite channels we got.  Many of them would have a brief intro aimed at parents explaining the skills that kids were supposed to learn from the show–reading, counting, and so on.  One show–I don’t remember which it was–listed among these skills “metacognition”.  My wife asked me what that meant, and I said, “It means thinking about thinking.”  “Cognition” means “thinking”, and “meta-” makes the root refer back to itself–thus “thinking about thinking”.  By the same token, “metalogic” is logic applied to how logic itself works.  What I want to look at in this post is “metafiction”.

“Metafiction” is not quite “fiction about fiction” but more “self-aware fiction”.  That is, it is fiction that draws attention to itself as fiction.  As I’ve noted in previous posts, one of the traditional characteristics of fiction is that it assumes and encourages the willing suspension of disbelief.  That is, though the audience knows that it’s “just a story”–perhaps even an improbable or impossible story, in the case of science fiction or fantasy–they make believe that the events depicted really happened.  The expectation is that the author will thus try to make the story as believable as possible.  This is true even of a science fiction or fantasy story.  Elves don’t exist; but when the author lays down ground rules for what they’re like, he must stick to those rules.  He can’t give the elves pointed ears sometimes and round ears at other times, or contradict himself as to whether they can work magic, for example.  This would break the spell, so to speak, and make it harder for the audience to suspend its disbelief.

Metafiction loudly and proudly flouts these conventions and explicitly proclaims itself to be artificial.  A good example at this very moment is the Deadpool franchise, in which the eponymous character talks directly to the audience and refers explicitly to the movie he’s in as a movie.  In television, this is commonly referred to as “breaking the fourth wall”.  In other words, in a typical TV show, the action takes place in a room.  We only ever see three walls to the room, though, because the “fourth wall” is where the camera shooting the show is located.  We willingly suspend our disbelief and imagine we’re in the room seeing the story play out.  When a character literally looks at the camera and speaks directly to the audience, though, he is “breaking” that fourth wall.  The 80’s comedy-action series Moonlighting was noted as one of the first television shows to do this on a regular basis.  Long before that, Looney Tunes characters spoke to the audience regularly.  Metafiction can also be more subtle than this.  For example, in the first Christopher Reeve Superman movie, there is a scene in which, when something happens requiring Superman, Clark Kent runs frantically to find a place to change costumes.  He comes up short at a 1970’s-style payphone:

[Image: Clark Kent stopping short at the open, 1970’s-style payphone kiosk]

He stares in perplexity for a moment, and dashes off into an alley.  This, of course, is a reference to the comics, in which Clark often ran into one of the old-fashioned phone booths that were full-body and opaque:

[Image: Clark Kent changing in one of the old-fashioned, full-body, opaque phone booths from the comics]

The site from which both of these images are taken points out, interestingly, that the Clark-Kent-running-into-a-phone-booth trope wasn’t actually that common in the comics.  Nevertheless, this is still an example of metafiction.  Though Clark doesn’t address the audience, the teeny phone stand, which is totally inadequate for a costume change, refers back to the older trope, with which audiences would be familiar.  The audience immediately gets the joke and laughs.  Though Clark never breaks the fourth wall, the positioning of the phone booth calls attention to the trope and its subversion, and thus calls attention to the fact that this is indeed a movie just as clearly as if Clark had said something.

Over the years, the metafictional has become more common in film, TV, and literature.  Breaking the fourth wall is still the exception, but playing with tropes, as in the Superman example, is all over the place.  Metafiction in general is often associated with the postmodern in literature, film, and TV, although examples go back much further than the last century.  In any case, the device has become almost standard-issue in contemporary media.  It is so common, in fact, that “meta” has become shorthand in pop culture discussions for “metafiction”.  If, for example, you say, “That show was so meta,” you mean it was very much self-referential–it constantly had elements of metafiction of the sort we’ve discussed.  The reason for the increasing amount of meta in pop culture is a shift in audience expectations.  We’ve all become much savvier about pop culture, and we are all becoming experts of a sort.

Let me unpack that.  If you’re shopping for a recliner and are sitting in various such chairs in a furniture store, you don’t think about the process of chairmaking.  You don’t admire the joinery work in the wooden parts, or the smooth finish; you don’t think about what exact approach the builder took to upholstery; you don’t speculate as to the exact setup of gears and slides that allow the chair to recline.  You think, “This chair feels great!” or “This chair is really uncomfortable!”  You might critique the chair in slightly more specific terms–“The foot pad is too far out for my feet to hit it right,” or “It doesn’t lean back as far as I’d like it to,” or “This would take up too much space in my room.”  Still, one need not know a thing about chairmaking, or be able to make a chair oneself, to know if one likes the chair–no complicated analysis needed.

When I was a kid, this is what it was like with most things, including movies, TV, and literature–in short, entertainment.  It was also like that with food (I’ll explain how that fits in soon).  If you went to a movie and your friend asked you about it, you’d say, “It was great–I liked it!” or “It stank–I didn’t like it!”  If you were asked to elaborate, you might say, “It was really cool,” or “It was funny” (if it was a comedy), or “I really liked Joe Schmoe in the lead role,” or alternately, “It was really stupid,” or “It didn’t make sense–I didn’t understand it,” or “It was really cheesy.”  Not much analysis, really.  You didn’t talk about camera angles or special effects, you didn’t talk about scripts or themes or tropes, you didn’t consider the acting techniques of the actors–you just liked the movie or you didn’t.  To make a deep analysis of it would have been seen as weird, pretentious, and making a big deal out of something that was “just a movie (TV show, book, comic book, etc.)”.  You’d behave pretty much the same if asked about a new restaurant.  You might say the food was not warm enough or too salty; or you might say it was very tasty; but that’s about the extent of it.  Anything beyond that, and the response would be, “Who are you, Julia Child?”

Now, a few of us bought books about the films we cherished in our little geeky hearts, read about them, read the serious movie reviewers (the only class of people at that time who said anything more about a movie than that it was good or bad), and so on.  Our generation–late Baby Boom, early Generation X–was the first to do so, since these things didn’t really exist previously.  Such resources didn’t begin to become common until the 70’s and 80’s, in fact.  Before the 60’s, most people in show biz treated it as exactly that–show biz, a business like anything else.  You got into the movies by going out to a movie lot and asking for a job, or getting discovered at Schwab’s, or buying equipment and starting your own studio.  Deep meanings and themes were for eggheads–you had movies to make!  Gradually, in the post-World War II era, this paradigm began to change.  Starting in France and spreading elsewhere, cultural commentators began to take movies, and pop culture in general, seriously.  This began in academic circles, but trickled down to movie critics.  They began to see their vocation as not just telling viewers whether they ought to spend money to go see The Rocky Horror Picture Show or whatever, but teaching the audience about movies and moviemaking.  In short, they wanted to show viewers why a movie was great or schlock.  Many reviewers became, in essence, public intellectuals, winning the Pulitzer Prize, in the case of Roger Ebert, and being esteemed as serious journalists, not mere chroniclers of pop culture.  Thus, our attitude towards movies shifted, and reviewers, in the newspapers and in the popular books they began to publish, were in the lead in shifting these attitudes.

Meanwhile, the generation of filmmakers coming up through the ranks in the 60’s was a new breed.  Film schools had come into being, and instead of the rough-and-tumble of learning the movie business by actually going out and making movies, young people began enrolling in film schools.  There they studied the classic movies of old and analyzed camera angles, directorial techniques, acting, and so forth.  They learned all about the nuts and bolts of cinema in an academic setting.  This was spreading to actors, too–acting school became a thing, and teachers such as Lee Strasberg with his “Method” were teaching actors to analyze their acting techniques in the same way that directors were analyzing filmmaking techniques.  As these directors–Steven Spielberg, George Lucas, Martin Scorsese, and others–and actors–Marlon Brando, Dustin Hoffman, Robert De Niro, and others–came into their own, they brought with them a whole new perspective on movies as not mere entertainment, but something worthy of serious reflection and analysis.  Movies were no longer just “show biz”, but a serious endeavor.

Similar forces were at work in pop culture more generally.  Through the 80’s and 90’s, comics came to be seen as a legitimate art form; television shifted gradually from being the “vast wasteland” to a venue for prestige projects; genre literature, previously seen as a ghetto fit only for the lowbrow, came to be viewed with new respect; and so on.  Meanwhile, there were increasingly large numbers of books and other media on the meaning of film, TV, comics, etc.  As we devoured them by the stack, we were all turning into critics ourselves.  No longer did we just think a movie was good or bad.  We commented sagely on the camera angles, noted the references to classic films of the 40’s, noticed the tropes the writer and director used, dug for the deeper themes, praised or critiqued the artwork (in the case of comics), and so on.  We were all experts now.  With the adoption of the Internet hitting critical mass around the turn of the century, and the explosion of blogs, websites devoted to every conceivable aspect of pop culture, and later YouTubers of all stripes, Rotten Tomatoes and other aggregator sites (to say nothing of the review-it-yourself ethos of Amazon, eBay, and the like), analysis of film and pop culture was fully available to the masses.  We are a country full of film, literature, and pop-culture critics.

This is why metafiction has become mainstream.  Complicated inside jokes, self-referential asides, and calling attention to tropes would have gone over the heads of most of the audience in previous times.  People didn’t go to the movies to analyze them, but to be entertained.  Now, with everyone being a Roger Ebert or a Pauline Kael, everyone is looking for tropes and themes and such.  With VHS and later DVD’s, the audience could stop, rewind, fast-forward, and freeze-frame to analyze every last aspect of a film (hence the advent of Easter eggs).  No stone–or trope–is left unturned these days.  We are well on the way to peak meta.  The title of this post, though, is “Too Much Meta”; so that implies I think peak meta is a bad thing.  Why?

I’ll start by saying the current mediascape is not an unalloyed bad thing.  People understand more about pop culture than ever before, and are thus able to appreciate and enjoy it more deeply and on more levels than in the past.  This increased understanding has certainly been grist for many interesting fan discussions, both in real life and via the Internet.  Social media have even allowed back-and-forth discussions between fans and the artists and creative people who produce pop culture in the first place.  This has often resulted in fruitful dialogue and feedback–no doubt future historians of 21st Century culture will point to the Internet, social media, and increased connectivity as a significant factor in the art and literature of our age.  All of these, as far as they go, are good things.  And yet….

This is where I will revert to my temperamental Buddhism.  The Fourth Noble Truth of Buddhism lays out the Noble Eightfold Path.  The seventh division of the Noble Eightfold Path is Right Mindfulness (Sanskrit:  samyak-smṛti; Pali:  sammā-sati).  The original terms translated as “mindfulness” literally mean “remembrance” or “recollection”.  The essential concept is to be aware of phenomena and sense perceptions as they actually are.  In other words, the idea is that one is to be fully aware of what one sees, hears, feels, smells, tastes, or touches, without judging or categorizing these impressions.  One merely observes one’s experiences, not labeling them “good” or “bad”, “pleasant” or “unpleasant”, without analysis or thought.  The mind, in a sense, is supposed to be a mirror of the world.  As I said back here,

The common [Buddhist] metaphor for [human nature] is that the mind is like “a mirror bright”.  A mirror is not affected by dirt or dust that collects on it–its mirror-nature is unchanged.  Wipe off the dust, spray on some Windex and clean it up, and the mirror reflects as well as ever.  Likewise, the Mahayana view is that we’re “already” enlightened, the pure Buddha-nature at the core of our being.  This Buddha-nature is perfect as it is; but we can’t realize it’s there or access it because of the layers of delusion, emotion, and neuroses that cover it.  Spirituality is gradually cleaning all this psychological and spiritual “dust” off of the “mirror bright” of our Buddha-nature.  Once that’s done, we experience enlightenment.  Rather than being self-centered, our mind reflects back the cosmos.

Thus, ideally, at least, we should try not to interpret the world as we want it to be, or would like it to be, but as far as possible, to see it as it actually is.  To put it another way, the idea is that when we experience something through any of our various senses, we should accept it–be it good or bad, pleasant or unpleasant–for what it is, without categorizing it, judging it, or emotionally reacting to it (i.e. holding onto it if it’s pleasant, or rejecting it if it’s unpleasant).

Now that may seem like an exalted goal, and something beyond our ability most of the time.  Certainly, I’m not saying that every meal we eat, every movie we watch, everything we experience, must be done slowly, carefully, and with full attention, as if it were a spiritual exercise.  That would hardly be possible.  I’m also not saying there’s no place for analysis.  It’s all right to be able to explain why a movie was good, bad, or ugly.  My point isn’t that we have to be mindful all the time; rather, it’s more that we’re hardly mindful at all in contemporary culture.

A rather anti-contemplative ethos is already ingrained in us.  We pop microwaveable food into the oven for quick breakfasts before we run out of the house to go to work; we gobble rapid fast-food lunches; we have music on at home, at work, and on the commute as a background that we don’t even pay attention to; at home and in the waiting rooms at doctors’ offices and service garages there are televisions blaring nonstop.  We are constantly immersed in stimuli until we become numb and oblivious to them all.  When we do pay attention, though, we don’t really experience our own experiences.  That is, we are not watching the movie to enjoy it, but analyzing the tropes, the camera angles, the special effects, and so on.  Similarly with other things:  We are watching movies and TV, listening to music, and eating food as critics, not as ordinary people.

As I’ve said, there’s a place for this.  For more and more of us, however, this is becoming the default mode.  We no longer watch or read for enjoyment, or eat for pleasure.  It’s all about analysis and technique.  We are not experiencing directly, without judging or analyzing; we’re judging and analyzing with almost no experience at all.  This, I think, reduces our enjoyment and appreciation.  Instead of reveling in and enjoying the experience, we are analyzing it to death.

I notice this, for example, with my fifteen-year-old daughter.  She watches YouTube movie and pop-culture channels like Nostalgia Critic and the Game Theory franchise, and is very knowledgeable about all kinds of movies and TV shows she’s never actually seen, and books she’s never actually read.  When she does see a movie or show, and afterwards my wife and I ask her how she liked it, she will often talk about the tropes she noticed, or critique the plot or story, and so on.  Basically, she comes off like a fifteen-year-old Roger Ebert.  I often wonder if she gets the same amount of simple enjoyment from watching a movie or TV show that I and my generation did at her age, free of all the analytical tendencies.

Of course, the media themselves increasingly reward this tendency, becoming ever more meta, realizing that the audience is looking for just that kind of thing.  It is almost obligatory to make clever metafictional nods to the audience and to provide them with tons of fodder for ever more analysis (for example, Google “order to watch MCU movies” for a mind-numbingly detailed explanation of the exact sequence in which to watch the Marvel Cinematic Universe movies, explanations of why the suggested order is correct, and even a discussion of the relative dates of the internal chronology).  I can hardly imagine someone doing that with the wildly inconsistent James Bond series back in the day, or expecting anyone to take the fruits of his labor seriously!

I’m adamantly old-fashioned on this.  Not long ago a friend and I were talking about current movies, and he noted that he didn’t go out to see movies much anymore.  I agreed that I usually went out only for Marvel or Star Wars movies, and a few others.  He said that he preferred to watch movies on DVD, anyway, since he could pause and rewind.  At that point I had to disagree emphatically.  I didn’t really elaborate much in that conversation; but my reasoning is another aspect of what I’ve been discussing here.  I want to watch a movie as is, no pauses or rewinds or fast-forwards, from beginning to end.  I want to experience it with as little analysis as possible, and to enjoy it for what it is.  Later I might think about it, analyze it, or even blog about it.  Later I might seek meaning or theme in it.  When I’m watching it, though, I want only to watch it.  It’s like the Zen aphorism, “When you eat, just eat.”  In this case, when you watch a movie, just watch a movie.  Likewise with watching a TV show or reading a book.

I mentioned food a little while back and have alluded to it here.  I think the foodie culture, inspired by the rise of celebrity chefs, the Food Network, and other such outlets, has paralleled the changes in media that I’ve discussed.  In the culinary field, as in pop culture, everybody’s an expert who analyzes everything to death.  God forbid one should just make and enjoy a good meal!

Really, I think all of this is a sign of cultural decadence.  We have way too much bandwidth in the 24/7 culture of modern times, and since the amount of worthwhile content strictly obeys Sturgeon’s Law, the time gets filled up with remakes, reboots, franchises, and analysis, analysis, analysis.  Rather than producing a lot less content and putting more effort into making what we do produce good, we make content, analyze the content, analyze the analysis of the content, make spoofs of the content and the analysis, and make content that winks knowingly at us as it analyzes itself.  All of this, of course, is much easier than, you know, making quality content.  We’re so caught up in the meta, though, that we hardly seem to notice.  This also takes us ever farther from the pure experience, as we no longer feel what we feel, but what someone said that someone else, in his opinion, thought we ought to feel, based on what yet another person said.

This has admittedly been a bit of a rambling and impressionistic post.  I hope, however, that I have gotten my basic point across.  I suspect that all of this will eventually collapse under its own weight, so to speak.  Maybe not, though.  I am sometimes at a loss in thinking of how writers of the future (if anybody is still writing) will look back on our cultural moment.  Then again, given technological issues, it’s unclear how much of it will even survive.  In any case, maybe it’s just me being a crabby middle-aged man; but I still avoid almost everything on YouTube (particularly pop-culture analysis) and restrict my diet of pop-culture websites.  When I watch a movie or read a book, I try to do exactly that.  As Yoda might put it, “Do or do not–there is no analysis!”
