Friday, September 30, 2011

Robert's Rules of Emoticon Order



It is an interesting thing to watch the tired old Palestinian fighters, who have spent their whole lives competing with the Israelis for control of the same soil, present their petition for statehood to the United Nations. In part because the idea that such a process even exists is so science-fictionally cool: it represents the possibility of creating new states. Maps do change, as we have all seen in our lives, and the possibilities for how much they could change are theoretically boundless. The criteria are pretty simple: you need some real estate over which you exercise internal and external sovereignty, a permanent population, a government, and the capacity to enter into relations with other sovereign states. Simple enough, but for the unspoken part about the other people who might think it's *their* sovereignty to exercise.



Imagine, if you will, a near-future Texas—say fifty years out—whose demographics have radically changed, such that the only Perry that could ever be elected to statewide office would be a Perez. It is not hard to imagine such a sub-state of the United States deciding it wanted to return to being a sovereign state of its own. And accomplishing such an act isn't really about the legalities under the flaccid regime of international law, the law of an imaginary sovereign with no real ability to enforce its edicts, but about the military ability to keep out occupying armies and the political ability to secure diplomatic recognition. Just ask the Confederates—the political theory underpinning our legal systems has never really articulated a coherent legal code defining when and how new states can be created within existing ones. Which doesn't stop a whole lot of free-thinking iconoclasts from trying.


[Pic: President Kevin Baugh of the Republic of Molossia, aka a piece of land outside Reno, Nevada.]

Of course Abbas wants to get after the issue now, in the fall of the Arab Spring, as the incipient leaders of post-revolution territories debate their visions for a 21st century Arab state. But to do so also seems very anachronistic, when we are in a historical moment that reveals the culture of the Jewish diaspora as a much more relevant model for the organization of peoples than a piece of land with a fence around it and some ruling fathers running the rancho. Isn't the real power of the modern Israeli state based on the pre-Westphalian power of the transnational, inter-state network of supporters, who want the state because it articulates the existence of the network in the only terms that were understood by the post-colonial, post-WWII rulers of the world atlas?

In this century, the network is a more compelling model for the polity than the nation state.



The signs are all there in the outstanding roundup by Nicholas Kulish at The New York Times of the post-democracy movements emerging around the world—"As Scorn for Vote Grows, Protests Surge Around Globe."

Surprise: the generations raised in cyberculture don't take the truths of constitutional democracy as self-evident.

Increasingly, citizens of all ages, but particularly the young, are rejecting conventional structures like parties and trade unions in favor of a less hierarchical, more participatory system modeled in many ways on the culture of the Web.

In that sense, the protest movements in democracies are not altogether unlike those that have rocked authoritarian governments this year, toppling longtime leaders in Tunisia, Egypt and Libya. Protesters have created their own political space online that is chilly, sometimes openly hostile, toward traditional institutions of the elite.

The critical mass of wiki and mapping tools, video and social networking sites, the communal news wire of Twitter and the ease of donations afforded by sites like PayPal makes coalitions of like-minded individuals instantly viable.

“You’re looking at a generation of 20- and 30-year-olds who are used to self-organizing,” said Yochai Benkler, a director of the Berkman Center for Internet and Society at Harvard University. “They believe life can be more participatory, more decentralized, less dependent on the traditional models of organization, either in the state or the big company. Those were the dominant ways of doing things in the industrial economy, and they aren’t anymore.”

Yonatan Levi, 26, called the tent cities that sprang up in Israel “a beautiful anarchy.” There were leaderless discussion circles like Internet chat rooms, governed, he said, by “emoticon” hand gestures like crossed forearms to signal disagreement with the latest speaker, hands held up and wiggling in the air for agreement — the same hand signs used in public assemblies in Spain. There were free lessons and food, based on the Internet conviction that everything should be available without charge.

Someone had to step in, Mr. Levi said, because “the political system has abandoned its citizens.”

The rising disillusionment comes 20 years after what was celebrated as democratic capitalism’s final victory over communism and dictatorship.

In the wake of the Soviet Union’s collapse in 1991, a consensus emerged that liberal economics combined with democratic institutions represented the only path forward. That consensus, championed by scholars like Francis Fukuyama in his book “The End of History and the Last Man,” has been shaken if not broken by a seemingly endless succession of crises — the Asian financial collapse of 1997, the Internet bubble that burst in 2000, the subprime crisis of 2007-8 and the continuing European and American debt crisis — and the seeming inability of policy makers to deal with them or cushion their people from the shocks.

Frustrated voters are not agitating for a dictator to take over. But they say they do not know where to turn at a time when political choices of the cold war era seem hollow. “Even when capitalism fell into its worst crisis since the 1920s there was no viable alternative vision,” said the British left-wing author Owen Jones.




Will the law students of the future learn Robert's Rules of Emoticons? It seems very likely to me. As suggested in last month's post, "In the Panopticon, no one can hear your reboot," it is hard to dispute that contemporary networking technologies present more compelling tools for the construction of direct democracy than have ever existed. The under-40s all over the world who are natives of the realm of those technologies are naturally forming their own political networks using those tools. And these imminent polities may violate all the geopolitical conventions of land, language, and ethnicity.

Geopolitics isn't going away, but it is going to have its work cut out for it dealing with the emerging 21st century cyberpolitics.

What will the United Nations Security Council do about sovereign polities that assert themselves in the ethereal space of the network, even controlling resources and behaviors through the systems of the network, without needing to wall in any segments of the physical world?

What will happen when a virtual world secedes from the jurisdiction of the governments of the physical world?

What happens when a virtual polity decides to assert dominion over the physical world?



This mode seems to be the first really viable alternative to conventional political choice, and to the idea of democratic representation, to emerge in a long time. The NYT piece tries to place it within the existing dualistic right/left paradigm, but that kind of watered-down Hegelian dialectic doesn't really have any place in the network. A network parliament would be a polyphony. A network parliament, in all likelihood, wouldn't be a parliament—it would be the People.

In a time when modern Greece is crumbling as a sovereign republic, is it too utopian to imagine the planet as a virtual Athens, governed by a network-enabled direct democracy? It is certainly a scary idea for the power elites of the world, the rulers of all the contemporary republics quietly scornful of popular opinion while relentlessly pandering to it and manipulating it in their own political and financial interests. And the American Founders would have no trouble scaring us with the idea of how horrific it could be to live in a society ruled by an Internet mob.

Lots to worry about in how to construct effective operating systems for that sort of polity, but the truth that seems self-evident to me is that we need to start tackling those tasks in earnest, because it's already starting to happen.

Tuesday, September 27, 2011

Frankenstein's Moon

One of the cool things about my day job is that sometimes it intersects with my genre leanings in a big way. Take Frankenstein's Moon as a case in point. This is how I spent much of last week: distilling a full-blown Sky & Telescope article down to a media-release-sized writeup, balancing readability with accuracy. Not always an easy task, especially when there's a lot of research and technical nuance involved. Practice helps, though. In the past, we've worked on similar projects: connecting Edvard Munch's painting The Scream with Krakatoa, settling a date conflict regarding Caesar's invasion of Britain, offering a convincing new date for the ancient battle of Marathon and solving Walt Whitman's meteor mystery, among many others. Fun stuff, that!

The current Frankenstein piece seems to be capturing popular attention as well. Already it has resulted in a nice article in The Guardian, which has been reprinted in quite a few British newspapers. Another article, written by Jim Forsythe at WOAI in San Antonio, has been picked up by Reuters and shown up all over the world, including on MSNBC. So yeah, we've got lots of Frankenstein to enjoy here at the end of September.

One cool bit of conflation didn't make it into the media release, but is touched upon in the full article. Allow me to slip into Jess Nevins mode for a moment (although Jess would likely scoff that this is common knowledge) to explain. During the original "ghost story" challenge mentioned below, Mary Shelley was the only participant to actually finish a written piece begun at that time. Lord Byron began one, but soon lost interest and abandoned it. John Polidori, however, took up that fragment some time later and was inspired to write The Vampyre, published in 1819. The story was an immediate success, in part, no doubt, because the publisher credited it as written by Lord Byron (Polidori and Byron fought for some years to get the attribution corrected in subsequent printings). The Vampyre was the first fiction to cast the legendary bloodsuckers as an aristocratic menace, and spawned a popular trend of 19th century vampire fiction which culminated with Bram Stoker's enduring Dracula in 1897. Which means the two most famous horror icons of 20th century pop culture--Dracula and Frankenstein's monster--can both trace their lineage back to that 1816 gathering at Villa Diodati overlooking Lake Geneva.

Frankenstein’s moon: Astronomers vindicate account of masterwork

Victor Frankenstein’s infamous monster led a brief, tragic existence, blazing a trail of death and destruction that prompted mobs of angry villagers to take up torches and pitchforks against him on the silver screen. Never once during his rampage, however, did the monster question the honesty of his ultimate creator, author Mary Wollstonecraft Shelley.

That bit of horror was left to the scholars.

Now, a team of astronomers from Texas State University-San Marcos has applied its unique brand of celestial sleuthing to a long-simmering controversy surrounding the events that inspired Shelley to write her legendary novel Frankenstein. Their results shed new light on the question of whether or not Shelley’s account of the episode is merely a romantic fiction.

Percy Bysshe Shelley (played by Douglas Walton) and Lord Byron (played by Gavin Gordon) listen as Mary Wollstonecraft Shelley (played by Elsa Lanchester) tells her tale of horror. [Bride of Frankenstein]

Texas State physics faculty members Donald Olson and Russell Doescher, English professor Marilynn S. Olson and Honors Program students Ava G. Pope and Kelly D. Schnarr publish their findings in the November 2011 edition of Sky & Telescope magazine, on newsstands now.

“Shelley gave a very detailed account of that summer in the introduction to an early edition of Frankenstein, but was she telling the truth?” Olson said. “Was she honest when she told her story of that summer and how she came up with the idea, and the sequence of events?”

A Dark and Stormy Night

The story begins, literally, in June 1816 at Villa Diodati overlooking Switzerland’s Lake Geneva. Here, on a dark and stormy night, Shelley—merely 18 at the time—attended a gathering with her future husband, Percy Bysshe Shelley, her stepsister Claire Clairmont, Lord Byron and John Polidori. To pass the time, the group read a volume of ghost stories aloud, at which point Byron posed a challenge in which each member of the group would attempt to write such a tale.

Villa Diodati sits on a steep slope overlooking Lake Geneva. Relatively clear views prevail to the west, but the view of the eastern sky is partially blocked by the hill. A rainbow greeted the Texas State researchers upon their arrival at Lake Geneva. [Photo by Russell Doescher]

“The chronology that’s in most books says Byron suggested they come up with ghost stories on June 16, and by June 17 she’s writing a scary story,” Olson said. “But Shelley has a very definite memory of several days passing where she couldn’t come up with an idea. If this chronology is correct, then she embellished and maybe fabricated her account of how it all happened.

“There’s another, different version of the chronology in which Byron makes his suggestion on June 16, and Shelley didn’t come up with her idea until June 22, which gives a gap of five or six days for conceiving a story,” he said. “But our calculations show that can’t be right, because there wouldn’t be any moonlight on the night that she says the moon was shining.”

Moonlight is the key. In Shelley’s account, she was unable to come up with a suitable idea until another late-night conversation--a philosophical discussion of the nature of life--that continued past the witching hour (midnight). When she finally went to bed, she experienced a terrifying waking dream in which a man attempted to bring life to a cadaverous figure via the engines of science. Shelley awoke from the horrific vision to find moonlight streaming in through her window, and by the next day was hard at work on her story.

Doubting Shelley

Although the original gathering and ghost story challenge issued by Byron is well-documented, academic scholars and researchers have questioned the accuracy of Mary Shelley’s version of events, to the extent of labeling it an outright fabrication. The traditionally accepted date for the ghost story challenge is June 16, based on an entry from Polidori’s diary, which indicates the entire party had gathered at Villa Diodati that night. In Polidori’s entry for June 17, however, he reports “The ghost-stories are begun by all but me.”

Russell Doescher and Ava Pope take measurements in the garden of Villa Diodati. [Photo by Marilynn Olson]

Critics have used those diary entries to argue Shelley didn’t agonize over her story for days before beginning it, but rather started within a span of hours. Others have suggested Shelley fabricated a romanticized version for the preface of the 1831 edition of Frankenstein solely to sell more books. Key, however, is the fact that none of Polidori’s entries make reference to Byron’s ghost story proposal.

“There is no explicit mention of a date for the ghost story suggestion in any of the primary sources–the letters, the documents, the diaries, things like that,” Olson said. “Nobody knows that date, despite the assumption that it happened on the 16th.”

Frankenstein’s moon

Surviving letters and journals establish that Byron and Polidori arrived at Villa Diodati on June 10, narrowing the possible dates for the evening of Byron’s ghost story proposition to a June 10-16 window. To further refine the dates, Shelley’s reference to moonlight on the night of her inspirational dream provided an astronomical clue for the Texas State researchers. To determine on which nights in June 1816 bright moonlight could have shone through Shelley’s window after midnight, the team traveled in August 2010 to Switzerland, where Villa Diodati still stands above Lake Geneva.

Ava Pope, Kelly Schnarr and Donald Olson on the steep slope just below Villa Diodati. [Photo by Roger Sinnott]

The research team made extensive topographic measurements of the terrain and Villa Diodati, then combed through weather records from June of 1816. The Texas State researchers then calculated that a bright, gibbous moon would have cleared the hillside to shine into Shelley’s bedroom window just before 2 a.m. on June 16. This calculated time is in agreement with Shelley’s witching hour reference. Furthermore, a Polidori diary entry backs up Shelley’s claim of a late-night philosophical “conversation about principles” of life taking place June 15.

Had there been no moonlight visible that night, the astronomical analysis would have indicated fabrication on her part. Instead, evidence supports Byron’s ghost story suggestion taking place June 10-13 and Shelley’s waking dream occurring between 2 a.m. and 3 a.m. on June 16, 1816.

“Mary Shelley wrote about moonlight shining through her window, and for 15 years I wondered if we could recreate that night,” Olson said. “We did recreate it. We see no reason to doubt her account, based on what we see in the primary sources and using the astronomical clue.”

For additional information, visit the Sky & Telescope web gallery at www.skyandtelescope.com/Frankenstein.
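For the armchair astronomers: stripped of the topographic surveying, the heart of this kind of check is computing the moon's altitude, azimuth, and phase as seen from Cologny on each night of that June 10-16 window. Below is a minimal sketch of how one might run such a calculation with the open-source Skyfield library; the coordinates, the chosen hour, and the ephemeris file are my own rough assumptions for illustration, not the Texas State team's actual code or numbers.

```python
# Rough sketch: moon altitude/azimuth/phase over Villa Diodati in June 1816.
# Assumptions: approximate villa coordinates, ~2 a.m. local time converted
# loosely to UT, and the long-span JPL ephemeris de440.bsp (de421 does not
# reach back to 1816). Not the published paper's method or data.
from skyfield.api import load, wgs84
from skyfield import almanac

ts = load.timescale()
eph = load('de440.bsp')              # large download; covers 1550-2650
earth, moon = eph['earth'], eph['moon']

# Approximate location of Villa Diodati, Cologny, above Lake Geneva.
villa = earth + wgs84.latlon(46.217, 6.185, elevation_m=400)

# About 2 a.m. local mean time on June 16, 1816; Geneva ran roughly
# 25 minutes ahead of Greenwich, so use about 01:35 UT.
t = ts.utc(1816, 6, 16, 1, 35)

alt, az, _ = villa.at(t).observe(moon).apparent().altaz()
phase = almanac.moon_phase(eph, t).degrees   # 0 = new moon, 180 = full

print(f"Moon altitude {alt.degrees:.1f} deg, azimuth {az.degrees:.1f} deg, "
      f"phase angle {phase:.0f} deg")

# Reproducing the argument would mean looping this over each night of
# June 10-16 and comparing the computed altitude against the measured
# elevation of the hillside along the bedroom window's sightline.
```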

Friday, September 23, 2011

Hijacking Flight 117 to the Nostalgia Factory



Fridays are all about escape. All you need to do is flip through the weekend arts section of the newspaper, a menu of evasions of the real. This week it's another upbeat baseball movie from the ubiquitous Brad Pitt; three more variations on corporate life as apolitical action thriller (the commuter version in Drive starring Ryan Gosling, the retro remix version in the Jason Statham/Robert De Niro/Clive Owen reinvention of The Killer Elite (how I wish someone really could channel Peckinpah for our post-GWOT culture), and the teen wolf wet dream version in Abduction starring Taylor Lautner (they're not your real parents!)); and best of all, the ever-grunting über-Spartan Gerard Butler in Machine Gun Preacher (aka What Would Jesus Shoot?).



On television, the escape is beyond an alternate present, into an alternate past. The success of Mad Men has shown Hollywood that, in a world where the present is apocalyptic and the future no longer exists, the past is the place we will go to happily watch commercials—indeed, the shows are all commercials for an imagined version of the past, when they're not anachronized versions of our favorite old commercials. Pan Am, The Playboy Club, Boardwalk Empire, even Game of Thrones...we like our product placement to occur in beautifully curated cathode ray nostalgia bubbles. As Alessandra Stanley says in her review of Pan Am, "When the present isn’t very promising, and the future seems tapered and uncertain, the past acquires an enviable luster."



Pan Am even imagines a time when there were cute revolutionaries in our midst imagining a better world: "Christina Ricci plays Maggie, a closet beatnik who wears the Pan Am uniform to see the world but at home listens to jazz and studies Marx and Hegel."



What a perfect semiotic response to the state of things in the world after 9/11 (itself an evolved derivative of the Lockerbie bombing): imagining oneself eternally flying the airline that represented the dream of a shiny corporate everyday interplanetary 2001. Especially if you revisit the decade that just passed, in Mark Danner's amazing piece in this week's New York Review of Books—"After September 11: Our State of Exception." Danner conveys the catalyzing power of that historical change, when wars between states became as relevant as a vintage game of Risk and the duty of the sentinel became to protect the monolithic state from elusive and conceptually intangible networks:

[M]ake no mistake, the critical decisions laying the basis for the state of exception were made in a state of anxiety and fear. How could they not have been? After September 11, as Richard Clarke put it simply, “we panicked.” Terrorism, downgraded as a threat by the incoming Bush administration, now became the single all-consuming obsession of a government suddenly on a “war footing.”

Every day the President and other senior officials received the “threat matrix,” a document that could be dozens of pages long listing “every threat directed at the United States” that had been sucked up during the last twenty-four hours by the vast electronic and human vacuum cleaner of information that was US intelligence: warnings of catastrophic weapons, conventional attacks, planned attacks on allies, plots of every description and level of seriousness. “You simply could not sit where I did,” George Tenet later wrote of the threat matrix, “and be anything other than scared to death about what it portended.”

One official compared reading the matrix every day—in an example of the ironic “mirroring” one finds everywhere in this story—to “being stuck in a room listening to loud Led Zeppelin music,” which leads to “sensory overload” and makes one “paranoid.” He compared the task of defending the country to playing goalie in a game in which the goalie must stop every shot and in which all the opposing players, and the boundary lines, and the field, are invisible.

All this bespeaks not only an all-encompassing anxiety about information—about the lack of map rooms displaying the movements of armies, the maddening absence of visible, identifiable threats, the unremitting angst of making what could be life-and-death judgments based on the reading and interpreting of inscrutable signs—but also, I think, guilt over what had been allowed to happen, together with the deep-seated need to banish that guilt, to start again, cleansed and immaculate. Thus “the War on Terror”—a new policy for a new era, during which the guardians of the nation’s security could boast a perfect record: no attacks on American soil. The attacks of September 11 would be banished to a “before time” when the “legalistic” Clinton rules applied, before “the gloves came off.” The successful attack could thus be blamed on the mistaken beliefs of another time, another administration. The apocalyptic portal of September 11 made everything new, wiping out all guilt and blame.





No wonder my son, raised in the Cheney/Obama decade, gravitates toward the vinyl records of the 1970s. 9/11 succeeded in flipping the switch that turns our media culture into a giant fear-based psyop on ourselves. If I unleash my Jack Bauer action figure inside the screen of Pan Am, will he fall in love and settle down in the peaceful interregnum that never existed during the long war of the twentieth century? Maybe his lover will be Christina Ricci's New Left stoner, and she and Jack will partner up to foment a revolutionary atemporal mashup in the mediascape of the early 21st century. Better futures are there for those unafraid to leap into the Nietzschean uncertainty of tomorrow.



Isn't it better to punch through the exit door than assume crash positions? Don't believe what the flight attendants tell you: it is always your right to go into the cockpit.

Friday, September 16, 2011

Capitalist tools



I was struck the other day, upon reading this opinion piece within the peachy electronic pages of the Financial Times from UBS senior staffer George Magnus, by what an unusual clarion call it was to find within the four corners of a business newspaper:

Financial bust bequeaths a crisis of capitalism
By George Magnus

Financial markets have had a torrid summer of breaking news about slowing global growth, fears over a new western economic contraction and the unresolved bond market and banking crisis in the eurozone.

But these sources of angst have triggered turbulence before, and will continue to do so. Our economic predicament is not a temporary or traditional condition.

Put simply, the economic model that drove the long boom from the 1980s to 2008, has broken down.

Considering the scale of the bust, and the system malfunctions in advanced economies that have been exposed, I would argue that the 2008/09 financial crisis has bequeathed a once-in-a-generation crisis of capitalism, the footprints of which can be found in widespread challenges to the political order, and not just in developed economies.

Markets may actually have twigged this, with equity indices volatile but unable to attain pre-crisis peaks, and bond markets turning very Japanese. But it is not fashionable to say so, not least in policy circles.


The basic theme is not that unusual—plenty of economic doom scenarios can be found throughout the mainstream media these days. What is so unusual is the way the diagnosis is expressed in Marxist terms—"crisis of capitalism" being part of the core lexicon of Capital, a term whose use reveals training in those texts as part of one's toolkit for understanding the contemporary world. To openly state that we are experiencing a "once-in-a-generation crisis of capitalism" is to summon all the Hegelian world historical eschatology of Marx—the idea that there are underlying forces in tension, that will ultimately lead to an endpoint of the current period, and some (hopefully better) period on the other side of the long apocalyptic night.



It's like the Kali Yuga of political economy.

I find the use of this lexicon very refreshing. Not just because of the gravity it invokes, but also because it reveals a healthy intellectual diversity that is largely unknown in the U.S. While anyone who studied economics or politics in the UK (at least before 1990) would have learned a healthy dose of Marx, you would be unlikely to easily find it in any American curriculum other than perhaps European intellectual history, or a dismissive sidebar in your introduction to classical microeconomics.

And you would never see a reference to a "crisis of capitalism" in an American newspaper (especially not a business newspaper), because you would so rarely even see the word "capitalism" used in an American newspaper.

We just talk about "business." Frequent use of the term "capitalism" to describe our political economic system would suggest, heretically, that there might be an alternative system.


[Pic courtesy of the amazing site of The Wanderer.]

Our culture is founded on religious freedom, as we see evidenced every day in the reality-confounding faith-based factual pronouncements of political leaders from both the Coke and Pepsi factions of our two-party system. But there really is no ideological freedom outside of the religious community. We have all the Utopias you can eat, and tolerate them with smiles—so long as they keep their fresh-baked modes of thinking and living confined to the congregation (which may be a secular variation, like the hippie communes of the 60s, or theologically science fictional but culturally successful religious movements like Mormonism). But we do not have an alternative political economy to capitalist Constitutionalism. The Federalist Papers are the real American Talmud, and there is no alternative to the seven Articles of the Founders.

American culture was extremely effective in the twentieth century in suppressing any authentic alternatives, violently excising any European-style revolutionary fervor beginning around the time of World War I. Who knew the first car bomb was a horse-drawn wagon detonated on Wall Street by an Italian-American anarchist who managed to leave a crater at the corner of Wall and Broad in protest of the imprisonment and deportation of other alienated immigrant anarchists? (See Buda's Wagon: A Brief History of the Car Bomb, by Mike Davis.) Socialist thought was appropriated by FDR to navigate the culture through the Depression, but mainly to empower the state to create the military industrial war machine born in WWII that is the core engine of 21st century American capitalism. By the end of the Clinton administration, ideological difference was largely illusory, and the end of the End of History with 9/11 really did nothing to change that.



So the only place you are likely to find the word "capitalist" in the American mainstream is in a winking necktie from Forbes magazine.

This weekend, while Austin amps up the somatic consumerist revel of ACL Fest, in which our teens are taught to express the illusion of ideological diversity through their choice of bands, the folks from Adbusters are trying to occupy Wall Street. Of course, they do not have an actual ideological agenda—they want an American Tahrir movement, but they seem to barely know what they are protesting (beautifully presented but politically impotent grievances with the soul-crushing ennui of our advertising-based mental environment), let alone what the end goal is, as evidenced by their email to the multitude last night revealing that they can't even decide on their one demand. At least they're trying to break out of the haze.



What they may not perceive is the extent to which the network technologies incubated by the military-industrial complex are now bulldozing the monolithic institutions of the post-Westphalian world in a way no revolutionary cadre could ever imagine. The future is looking to be one of network-based polities and mass-customization, in which the advertising is derived from what's already in your head. These are some very real Hegelian world historical forces rolling over the political economic landscape, and we don't even yet have an "ism" to locate them in our cultural taxonomy.

Monday, September 12, 2011

9/12/11

Yesterday the Soaring Club of Houston couldn’t fly because of Temporary Flight Restrictions having to do with a wildfire to the east of the Field. The last time I remember it being clear, bright weekend weather when we couldn’t fly our sailplanes was the days after 9/11/01, when US aviation was grounded. It’s deja vu with the perspective of ten intervening years.

Sunday’s Houston Chronicle editorial pages include a column by Kathleen Parker in which she says, “We stumble at last upon a purpose for columnists – to say that which no one else dares.” This in a column in which she posits that 9/11 caused America to go temporarily insane; that today’s political dysfunction took root in the soil of Ground Zero. Well, in observing the American mindset today, I’ve had to conclude that you can’t understand it without invoking psychopathology, or religion, and in particular, religion and psychopathology intersecting like a Venn diagram of doom.

Earlier this week Thomas Friedman dared too. He said, “. . . rather than use 9/11 to summon us to nation-building at home, Bush used it as an excuse to party — to double down on a radical tax-cutting agenda for the rich that not only did not spur rising living standards for most Americans but has now left us with a huge ball and chain around our ankle. And later, rather than asking each of us to contribute something to the war, he outsourced it to one-half of one-percent of the American people. . . . We used the cold war to reach the moon and spawn new industries. We used 9/11 to create better body scanners and more T.S.A. agents. It will be remembered as one of the greatest lost opportunities of any presidency — ever.” The entire Friedman column is worthwhile reading.

Friday, September 9, 2011

The Day the Narrative Died



On the front page of this morning's business section, cranky Bernanke (one wonders what he's measuring between thumb and forefinger in that photo) has a message for American consumers: lighten the f up!

Fed Chief Describes Consumers as Too Bleak
By Binyamin Appelbaum
Published: September 8, 2011

WASHINGTON — Ben S. Bernanke, the Federal Reserve chairman, offered a new twist on a familiar subject Thursday, revisiting the question of why growth continues to fall short of hopes and expectations.

Mr. Bernanke, speaking at a luncheon in Minneapolis, offered the standard explanations, including the absence of home construction and the deep and lingering pain inflicted by financial crises. He warned again that reductions in government spending amount to reductions in short-term growth.

Then he said something new: Consumers are depressed beyond reason or expectation.

Oh, sure, there are reasons to be depressed, and the Fed chairman rattled them off: “The persistently high level of unemployment, slow gains in wages for those who remain employed, falling house prices, and debt burdens that remain high.”

However, Mr. Bernanke continued, “Even taking into account the many financial pressures that they face, households seem exceptionally cautious.”

Consumers, in other words, are behaving as if the economy is even worse than it actually is.




Who can blame them? The psychology of the American Zeitgeist has been shattered by the failure of all of our national narratives. No wonder the hot new movie out this weekend is Steven Soderbergh's appropriation of the paranoid '70s don't-trust-the-Man thriller, updated for the Tea Party age. Bernanke's mystifying multitude can see what he perhaps cannot: that the inability of the plotline of the nation to reel out in accordance with the oft-repeated master plan signals changes far more profound than we can really yet fathom.

Consider what those consumers have witnessed in the past decade:

- The crushing destruction of the blissful utopian prosperity (at least for the class of Americans Bernanke is talking about—the ones who spent extra money on stuff they don't need) of the technology-induced Long Boom, and the end of the End of History.

- A presidential election that no one won, until the power factory made up a reason to break the statistical tie. Between two candidates, it should be noted, who were members of the (un)American aristocracy: politicians who are electable because they inherited the title.

- A Charlton Heston and George Kennedy disaster movie scenario played out with a reality Orson Welles could never imagine on the Tuesday morning news, complete with exploding skyscrapers and heroic men in uniform, in which the basic everyday technologies of our commercial lives were employed by medieval necromancers hiding in caves in exotic locations to begin the destruction of our meta-narrative. The new era was ushered in by investment bankers learning to fly.



- A framing of the response to this event with the all-American Western movie paradigm of hunting down the bad guy. This narrative played out perfectly for the first three months, with grainy telephoto images of ass-kicking Grizzly Adams Special Forces dudes outfitted with the Full Metal Apache version of the Outside Magazine gear of the year catalog, riding horseback with the good Indians across the Afghan plain. We had the big denouement on the snowy approach to the Blofeld hideaway of the bad guy (see below). And then? Ten years without an ending.


[Matt Bors gets it dead-on pretty much every time.]

- Instead of the tidy ending that would end the movie with the cowboy riding off into the sunset, we got Apocalypse Forever. A decade of grim governance through geopolitical fear, personified by the cyborg Veep, and all the well-mannered lawyers clinically constructing extra-Constitutional space where no one can hear you scream. You had to work hard to find the real images of the pain that our war machine unleashed on the globe, but you could see it seeping out all over the pop cultural glaze of our ubiquitous screens, as dissected in our 2007 post on "The Warporn Eucharist."


[For an unhealthy survey of the GWOT Zeitgeist, check out these morale patches at MilSpecMonkey: "Pork Eating Crusader"? "What Would Jesus Shoot?" "What Would Jack Bauer Do?" (via Bruce "Gothic High Tech" Sterling)]

- And let's not forget about the unsolved mystery of the anthrax in the mail.

- Because the lost reel of the Taliban Toyota cowboy movie wasn't enough, we repeated pretty much the same paradigm in Iraq, as neocon War College scenarios turned out to be a lot bloodier, and more resistant to definitive ends, when they played out in the real world.

- In The Big Uneasy, the disaster movie paradigm really failed, as the guys with the ladders and the helicopters and the megaphones managed only to watch a complete Hobbesian meltdown that exposed all of the unhealed cultural lesions of American race and class. (Yes, they did just convict those cops for wrongfully shooting the people on the bridge.)

- The beginning of the end of privacy, in which all secrets—corporate, government, and personal—are live on the network, ushering in a panopticon society in which we are all conducting surveillance on each other all the time.

- A diminishing of the relevance of the sovereign nation state, revealed in the insecure efforts to reinforce its very existence with the fantasy of border walls, the inability of governments large and small to service their own debt, and the dubious viability of governance by a monolithic "sovereign" in a world that is really run by networks.

- The diminishing viability of the long-term employment contract, a destabilizing force that is probably one of the main cultural factors behind the Tea Party movement—networked Capital's transformation of the American middle class into a mass of disenfranchised grey collar lumpenproles.

- A global economic meltdown far worse than the bubble burst that began the decade, one where you can see the raw fear in the confounded eyes of the Masters of the Universe, as scenarios play out that violate the Talmudic laws of neoclassical economics, crisis scenarios only Karl Marx could have imagined. The American narrative really doesn't work when you take away the theme at the heart of it all: growth.



What Bernanke's really complaining about? Americans got the message of the past decade, and they don't believe the movie anymore. They no longer buy into the basic narrative that growth should be generated by borrowing against future income. It's the Great Deleveraging. Is that really such a bad narrative to abandon? Wasn't Rocky Balboa a lot more likable when he stopped working for the payday lenders?

It is no accident that this decade in which our narratives failed us began with the confluence of the *real* end of the long war of the twentieth century and the dissemination of networked computing into the very fabric of society. The network is the common thread in the mass destabilization of the past decade, the tool that is disassembling the reality we inherited.

What did we really lose in the past decade? The Future. Our narratives failed because network culture does not tolerate the orderly linearity of "narratives."

The end of the Future doesn't mean The End. It means an anarchist's dream of liberated territory, waiting to be explored, spelunked, and recreated in the image of the emerging world. It won't be easy, but it will definitely be interesting, and it should be fun. And you can bet that it won't have a lot to do with Ben Bernanke.

Friday, September 2, 2011

Michele Bachmann is a Mexican?



Today's Washington Post reports from the campaign trail that the people of New Hampshire, Minnesota, and other wintry climes are going totally loco about immigration, hounding the candidates at every stop for not being tough enough on the issue. And no, they are not talking about the border those states actually share, with Canada.

The piece notes how the issue is posing challenges for our Texas Gov. Rick Perry, who, like his predecessor, has a record on the issue that confounds the Midwestern xenophobes who might otherwise be drawn to his opportunistically jingoistic politics—having called the idea of a border fence "idiocy," and having promoted paths for the children of "illegal immigrants" to get great public educations. It is a peculiar politico-cultural moment when Ann Coulter can complain, however coded, that Rick Perry isn't white enough.

Michele Bachmann has no such problem:

Bachmann promised to build a fence along “every mile, every yard, every foot, every inch” of the nation’s southern border, to “have the back” of enforcement agents, and to put an end to the provision of federal benefits to illegal immigrants.

But she really got her audience going with a series of lamentations about the border that places her to the right of her opponents.

“On the southern border, we are dealing with a narco-terrorist state today in Mexico,” Bachmann said. “Because 70 percent of narcotics are coming to the United States are coming from Mexico. Mexico is in a very different place right now. We are seeing criminals, felons, drugs, we’re seeing contagious diseases coming into our country. What is wrong with our government that it isn’t stopping this from coming into the nation?”


Contagious diseases? Michele Bachmann really is scarier than Sarah Palin, because she encodes a much more powerful American archetype than Palin's frontier mama who field-strips her own elk—Bachmann echoes the Mad Men-era white housewife, practitioner of hygiene as the domestic theology that distinguishes the people of the sanitized linoleum from people who get dirty.

June Cleaver was a white supremacist.



And Charlton Heston is a Mexican.

Watching "Inspector Vargas" and his gringa wife Janet Leigh walk across the border in Orson Welles' long opening tracking shot from Touch of Evil, you see the essence of border angst shown, not told—anxiety about miscegenation and menace crossing freely through our trusting and essentially insecure borders, ready to explode.



Like the contagious diseases Nurse Bachmann would protect us from, the real threats represented by the idea of the southern border are invisible. As noted earlier on this blog, you can see the architecture of our anxieties clearly reflected in the research procurement program of HSARPA, the Homeland Security version of DARPA, which is trying to develop the technological components of a "virtual" border fence that sounds more like a sci-fi forcefield than a castle wall—a "barrier" comprised of speculative fields of surveillance and electronic interdiction.

An imaginary barrier, constructed with semiotics designed to sustain the belief that it exists, is the perfect way to maintain the fictions that sustain our political economy—like the idea that there really is a viable distinct sovereignty called the United States of America, rather than a networked world of blurred borders hurtling inevitably and chaotically towards new global systems, or the idea that there really is such a thing as "white people," rather than a single species of naked apes playing out in infinite permutations.

How do you sustain the idea that there is such a thing as *an American*, and *a Mexican*, and that you are supposed to be able to, you know, tell the difference? By the social construction of the idea that such a distinction exists. And the more illusory the idea of the difference is revealed to be, the more its proponents struggle to represent it with real world fortifications.



The Jeffersonian tree of liberty has a mythic potency. How sad to watch the culture ramp up for a political season in which that idea of the right of revolt is appropriated to express the narcissistic frustrations of the children from Gilligan's Island—the American lumpenproles who find themselves disenfranchised by the productivity-sucking forces of global capital, subjugated into the gray collar class of The Matrix. One can only hope that the lunatic anachronisms of Bachmann's race-based demagoguery and demonization of the phantom Mexican Other will prove ultimately unsuccessful in a twenty-first century milieu.



The antidote? Maybe go catch a Labor Day Weekend screening of "Saving Private Perez" (Salvando al Soldado Perez), the recent Mexican hit film satire of Saving Private Ryan, in which a narco and his band of hermanos travel to Iraq to rescue his American soldier brother from his jihadi kidnappers. That sounds a lot more like the 21st century than the Tea Party's hate-based Pleasantville. And the real contagion Michele Bachmann needs to be worried about is that her June Cleaver version of reality is under full assault when indigenous Mexican humor and perceptions of American identity start getting relayed via Hollywood into the minds of the mall rats of Wayzata.