Creative decay, courtesy our corporate overlords

by David Atkins

Kurt Andersen of Vanity Fair penned a great article this month on the comparative lack of cultural innovation over the last 20 years. Andersen notes that in architecture, art, fashion, music, and other aspects of popular and consumer culture, there is remarkably little difference between 20 years ago and today. By contrast, think of the enormous differences between 1992 and 1972, or between 1972 and 1952, or 1952 and 1932, or 1932 and 1912. Technology has changed significantly, of course, but styles haven't:

Think about it. Picture it. Rewind any other 20-year chunk of 20th-century time. There’s no chance you would mistake a photograph or movie of Americans or an American city from 1972—giant sideburns, collars, and bell-bottoms, leisure suits and cigarettes, AMC Javelins and Matadors and Gremlins alongside Dodge Demons, Swingers, Plymouth Dusters, and Scamps—with images from 1992. Time-travel back another 20 years, before rock ’n’ roll and the Pill and Vietnam, when both sexes wore hats and cars were big and bulbous with late-moderne fenders and fins—again, unmistakably different, 1952 from 1972. You can keep doing it and see that the characteristic surfaces and sounds of each historical moment are absolutely distinct from those of 20 years earlier or later: the clothes, the hair, the cars, the advertising—all of it. It’s even true of the 19th century: practically no respectable American man wore a beard before the 1850s, for instance, but beards were almost obligatory in the 1870s, and then disappeared again by 1900. The modern sensibility has been defined by brief stylistic shelf lives, our minds trained to register the recent past as old-fashioned.

Go deeper and you see that just 20 years also made all the difference in serious cultural output. New York’s amazing new buildings of the 1930s (the Chrysler, the Empire State) look nothing like the amazing new buildings of the 1910s (Grand Central, Woolworth) or of the 1950s (the Seagram, U.N. headquarters). Anyone can instantly identify a 50s movie (On the Waterfront, The Bridge on the River Kwai) versus one from 20 years before (Grand Hotel, It Happened One Night) or 20 years after (Klute, A Clockwork Orange), or tell the difference between hit songs from 1992 (Sir Mix-a-Lot) and 1972 (Neil Young) and 1952 (Patti Page) and 1932 (Duke Ellington). When high-end literature was being redefined by James Joyce and Virginia Woolf, F. Scott Fitzgerald and Ernest Hemingway, great novels from just 20 years earlier—Henry James’s The Ambassadors, Edith Wharton’s The House of Mirth—seemed like relics of another age. And 20 years after Hemingway published his war novel For Whom the Bell Tolls a new war novel, Catch-22, made it seem preposterously antique.

Now try to spot the big, obvious, defining differences between 2012 and 1992. Movies and literature and music have never changed less over a 20-year period. Lady Gaga has replaced Madonna, Adele has replaced Mariah Carey—both distinctions without a real difference—and Jay-Z and Wilco are still Jay-Z and Wilco. Except for certain details (no Google searches, no e-mail, no cell phones), ambitious fiction from 20 years ago (Douglas Coupland’s Generation X, Neal Stephenson’s Snow Crash, Martin Amis’s Time’s Arrow) is in no way dated, and the sensibility and style of Joan Didion’s books from even 20 years before that seem plausibly circa-2012.

Andersen has plenty more evidence where that came from in his lengthy three-page piece. Suffice it to say he makes a fairly compelling case. The key question is: why?

Andersen speculates on a number of reasons, not all of which I find convincing. But one reason occurred to me immediately while reading the piece, and Andersen does eventually address it: culture and media are increasingly dominated by shareholder-beholden, risk-averse conglomerates that have too much to lose by taking significant creative initiative:

Part of the explanation, as I’ve said, is that, in this thrilling but disconcerting time of technological and other disruptions, people are comforted by a world that at least still looks the way it did in the past. But the other part of the explanation is economic: like any lucrative capitalist sector, our massively scaled-up new style industry naturally seeks stability and predictability. Rapid and radical shifts in taste make it more expensive to do business and can even threaten the existence of an enterprise. One reason automobile styling has changed so little these last two decades is because the industry has been struggling to survive, which made the perpetual big annual styling changes of the Golden Age a reducible business expense. Today, Starbucks doesn’t want to have to renovate its thousands of stores every few years. If blue jeans became unfashionable tomorrow, Old Navy would be in trouble. And so on. Capitalism may depend on perpetual creative destruction, but the last thing anybody wants is their business to be the one creatively destroyed. Now that multi-billion-dollar enterprises have become style businesses and style businesses have become multi-billion-dollar enterprises, a massive damper has been placed on the general impetus for innovation and change.

This isn't exactly news to anyone who goes to the movie theater. Producers churn out sequels, prequels, remakes, and reboots galore because there's a built-in audience and established branding for them. Doing new things and telling new stories are dangerous and potentially expensive endeavors.

It's even worse in videogames, exceptions like Portal notwithstanding. Cracked had a great article on this subject last year:

But this isn't about any lack of creativity among game developers, artists, writers or anyone else. It's about money, and the fact that the market has trapped games in a fucking creative coffin (and developers will tell you the same). Everybody complains about sequels and reboots in Hollywood, but holy shit, it's nothing compared to what we have in gaming right now.

For instance, each of the Big Three game console makers took the stage at E3 to show off their biggest games of the upcoming year. Microsoft led off with the aforementioned Modern Warfare 3, which is really Call of Duty 8 (game makers like to switch up the sequel titles so the digits don't get ridiculous). Next was Tomb Raider 10 (rebooted as Tomb Raider). Then we had Mass Effect 3, and Ghost Recon 11 (titled Ghost Recon: Future Soldier). This was followed by Gears of War 3, Forza 4 and Fable 4 (called Fable: The Journey).

Next were two new games, both based on existing brands and both for toddlers (Disneyland Adventure -- a Kinect enabled game that will let your toddler tour Disneyland without you having to spring for a ticket -- and a Sesame Street game starring Elmo).

Then, finally, we reached the big announcement at the end (they always save cliffhanger "megaton" announcements for last, Steve Jobs-style) and they came out to announce that they were introducing "the beginning of a new trilogy." Yes! Something fucking new!

Then this came up on the screen: Halo 4. Confused? So was the audience. By "new trilogy" they actually meant that there would be three more Halo games. Did I mention that Halo 4 is actually Halo 7? Which means they intend to put out at least nine Halo games before they're done? Oh, wait, they also announced they were doing a gritty reboot of the decade-old Halo to make it an even 10.

Sony came up next and announced a sequel, another sequel and then a reboot. After that it went sequel, sequel, special edition of a sequel, new FPS, sequel, new FPS, sequel, special edition of a sequel, new game based on an existing property (Star Trek), sequel, sequel and sequel. Then they introduced a new system (the PS Vita) and showed it off with four sequels.

Nintendo's list went: sequel, sequel, sequel, sequel, sequel, sequel, sequel, sequel, sequel and (hold on, let me double check here) a sequel. And you already know what those were, even if you haven't played a video game in 15 years: Mario Kart, Mario World, Luigi, Zelda, Kirby, etc. Then they showed off their new system (the Wii U) with a demo reel promising that some day it would allow us to play sequels like Arkham Asylum 2, Darksiders II and Ninja Gaiden 3.

Think about the situation with Hollywood -- movies are expensive as hell, so studios are scared to death of taking creative risks and thus we get a new Transformers movie every two years. But now take that and multiply it times five, and you have the situation with video games. Literally. A video game costs five times as much as a movie ticket, and therefore customers are five times as cautious about experimenting with unfamiliar games that might wind up being shit. Game publishers respond accordingly.

And yes, we gamers are ultimately to blame. We don't even perceive how incredibly narrow our range of choices has gotten. For instance, every single gaming forum on the Internet right now is hosting at least one passionate discussion about which is better, Modern Warfare 3 or Battlefield 3. [Emphasis added]

The videogame industry is particularly problematic in this regard, but the problem spans the whole culture, with similar symptoms across the board. The malaise of large-scale corporate domination of our economy isn't just political and economic. It's cultural, too: a slow death by conformity and creative strangulation, disguised as cool, individual expression through ironic nostalgia and the commodification of discontent.

