JCRT Live

Web Name: JCRT Live

WebSite: http://jcrt.typepad.com

ID: 97586

Keywords:

JCRT, Live

Description:

By Carl Raschke

The Egyptian “revolution,” which was not really a revolution in the Western historical sense, is now a fait accompli. But what was it, really? What does it mean in the long haul? The events of the last two weeks strike me as a post-modern, perhaps even post-Muslim version of the events of May 1968, which French intellectuals today still recall as l’événement, “the event,” or the events that supposedly changed everything. At the time they did, but socially and politically they had very little short-term or long-term effect. The events of 1968 included both the Prague Spring, a very early herald of the collapse of Communism two decades later, and the peak fury of anti-Vietnam protests that led then-President Lyndon Johnson to decline to seek re-election.

There was one common, undeniable factor in both the Western uprisings of the late Sixties and the sudden uprisings in the Muslim world in 2011 - an enormous, maturing, disaffected youth population. The common, almost stale refrain in the Western media these days has been that the events in Egypt were almost exclusively about “freedom,” “democracy,” and the lack of economic opportunity. In a large sense that observation is true, but trivially true. The so-called youthquake in America and Western Europe more than 40 years ago shared many of the same slogans, and drew the same kind of explanations from the punditerati. To a lesser extent, similar forces and factors propelled the final push to bring down the Marxist states of post-war Europe in the fateful years 1989-91. But in all three cases what struck observers was the sudden propagation of a wave phenomenon started by a seemingly random incident (as when an earthquake on the ocean floor triggers a tsunami) that quickly caught global attention and inspired mass mobilizations of those who were fed up with whatever it was that was onerous and long-standing. Hence, there is not much difference on that score between Rosa Parks, who set off the civil rights protests, and Mohamed Bouazizi, the Tunisian street vendor whose self-immolation sparked the current revolution. But these now historically inscribed phenomena compare with each other only superficially. History, as we remember it, always happens suddenly and unexpectedly - and often is over quickly. Organized youthful disaffection can conjure up thrilling global theater, but what makes real revolutions, as historians constantly remind us, are well-thought-out transformative and quasi-utopian ideologies that have enough appeal within a select circle to inspire the long-term, absolutely dedicated, and often ruthless commitment that more often than not produces the opposite of what it promises. Whether we’re talking about the Jacobins in 1789, the Bolsheviks in 1917, or Ayatollah Khomeini’s Shiite Islamist circle in 1978 - the prototypical revolutions described and analyzed in the history books - the historical impact ultimately has come from the unflagging efforts of a determined vanguard. Just as an exceptionally warm day in January does not an early spring make, a dramatic tumbling of a long-standing social, political, or economic system does not a revolution make.

For example, if it had not been for the visionary and dogged strategic leadership of Martin Luther King and the Southern Christian Leadership Conference starting in the mid-1950s, there would not have been a civil rights “revolution.” Conversely, there are essentially democratic but leaderless revolutions, short on ideology, that flare up, then largely fade into memory and are quickly forgotten.
Prominent examples are the temporary overthrow of the war-weakened French state by the Paris Commune in 1871, or the so-called Orange Revolution in Ukraine of 2004-05, which within five years dissolved back into the kind of oligarchical control that has been the dominant theme in Eastern Europe for centuries and with which Egyptians are quite familiar. The Egyptian revolution is seriously in danger of succumbing to this latter pattern.

But what does seem significant in the long run about the events in Egypt (and North Africa for that matter) is the marginalization of the Islamist voice, which so many academic experts have almost herd-like extolled (often in fulsome terms) as a slow-rolling, inexorable wave rolling over the Middle East in the last ten years. Despite their long-proven conspiratorial prowess when it comes to leveraging chaos to impose semi-totalitarian Islamic principles on the populace since the late Seventies, such interests and factions were almost inconsequential. There is fear that if the situation is not stabilized soon, they will fill the vacuum once again. But something else “unthought” (as Martin Heidegger would call it) seems to be happening below the klieg lights. What is happening seems almost to be a sudden reversal of trends that began and gradually gathered momentum in both the West and the Middle East from the late 1970s onward. What has reversed is the slow takeover of the political process by religious ideologies and utopian schemes of using stripped-down, highly modernized, fundamentalist world views to overhaul society. The sudden and mass defection of youth in America from the religious right toward the close of the Bush era was one local indicator. The uselessness of imams and mullahs in the new Middle Eastern youthquake is an even more telling illustration.

That does not mean that the Middle East is finally going secular. Nor are we witnessing anything like an implantation of classical ideas of European liberal democracy. Ironically, Slavoj Žižek’s recent pronouncement on Al Jazeera that what the Egyptian jacquerie really symbolizes is a new sense of revolutionary universalism in the Muslim world, driven by a pushback against tyranny, may be the most trenchant observation so far. Even though the current reactionary Islamist regime in Iran cited the events along the Nile as a vindication of their own “revolution,” the renewed mobilization in recent days of that nation’s Green Movement, based on the example of what happened in Egypt, has far more weight to it. This new “universalism,” to cite the phrase first coined by Kant at the time of the French Revolution, is probably, at least in the Muslim context, a religion without religion. But all religions without religion are driving forces that share ethical and political affinities with their historical predecessors and cannot be divorced from them.
Egypt thus probably will cut significantly into the commercial relevance of today’s generations of Islamic and religious studies scholars (many of whom have secret Islamist sympathies) who have become accustomed to telling us how Muslims are really different from us, and that in good multi-culturalist genuflexion we need to acknowledge our sins of insensitivity to that fact.

But it will probably also undercut the authority of the secular theologians, who have held on since the 1960s and constantly remind us that it all comes down to our own autonomous and liberal political (or perhaps genteel neo-neo-Marxist) responsibility to change the world somehow (or in some form), even though Gott ist tot.

What we may be witnessing intellectually, even if Egypt goes the way of Ukraine, is a new, universalist, syn-religious but not necessarily “religionless” faithfulness to the force of the future - torch-borne by youth, as always happens - that focuses on some kind of concrete vision of a globalized, cosmopolitan passion for justice. Kant may have been more prophetic than we realize. And it is no coincidence that Derrida picked up this theme twenty years ago in his own meditations on the return of religion. The return of religion was never intended by Derrida to mean the return of “religiosity,” or the return of the forms of historical religion. If we follow Derrida back ten years earlier, it was meant as a return of the universalized force of faithfulness to what he himself called the impossible. But the impossible has happened.

Carl Raschke is senior editor of the JCRT and Professor of Religious Studies at the University of Denver.

By Colbey Emmerson Reid

Zombieland, Ruben Fleischer’s 2009 contribution to the cinema of zombie apocalypse, resembles many recent contributions to the genre in its stylization of the undead into a political allegory. The film’s opening voiceover describes the collapse of America into Zombieland, a construct of post-nationhood in which American citizens and governments have perished in the chaos caused by a virulent strain of mad cow disease that turns people into sponge-brained monsters. Fleischer’s zombies are a cautionary tale against the American lifestyle, in which the over-consumption of fast-food hamburgers causes epidemic hyperglycemia as the country is overrun by people who are grouchy, indiscriminately ravenous, and too fat to run away from danger. The reinvention of America falls not to lawmakers, since cops on their lunch break are among the first to fall prey to the disease, but to four oddballs whose circumstantial rendering as post-American even before the apocalypse equips them to survive in their country’s wasteland. Besides being physically fit, they’re all loners divested of a conventional sense of family and belonging. Two of the characters are sisters who seem to be runaways, one has lost his son, and the other has parents whom he suspects don’t want him. Their survival is juxtaposed against vignettes of Americans dying in the midst of families and communities: weddings, backyard barbeques, suburban car pools, public restrooms. Mothers who sentimentally hesitate to blow the brains out of their frothy-mouthed children are at particular risk, illustrating the serious necessity of eschewing traditional family values during a zombie apocalypse.
The protagonists, recognizing that their mutual predisposition to alienation has been keeping them alive, sustain that condition once they all join forces by using their destinations as names so they don’t get too attached to one another. Thus “Tallahassee,” “Columbus,” “Wichita,” and “Little Rock” are all Americans first articulated through conventional standards of national identity, describing themselves as places to which they are tied by the habits and associations of the past. But ultimately each place name becomes detached from a fixed location as the person affiliated with it realizes that the place no longer exists as anything but a private fantasy of returning to the way things used to be. Thus the cities, taking the form of people, wander and move as they construct a fundamentally mobile post-nation. The ensuing narrative explains what “America” might be when it no longer exists as we know it, with Fleischer arguing that the country isn’t its geography but the relationships fostered between its people. The characters, who realize that they love each other more than the places and people whom they were expected by the logic of authenticity to affiliate themselves with in the past, reconfigure into a postmodern family that works as a metaphor for global citizenship: four nomads forging a unit of emotional rather than tribal ties as they quest not to rediscover their respective spaces of origin but rather to dream up new destinations to inhabit together in a totally undetermined future. Where they all go matters very little in the borderless America they inhabit because every place is identical; all distinctiveness must now belong to the individual citizens of Zombieland, who are revealed to possess quite colorful personalities as the film progresses. For example, Tallahassee, literally eye-catching in his animal-print jackets, is a human Texas with cowboy boots, a trunk full of machine guns, and a hankering for Twinkies. Rather than destroying him, the zombie apocalypse brings out the unique talent his mother always told him he would find: “Who’d have thought it’d be killing zombies?” he muses, before dismantling a pack of undead in impressively creative ways.

The very denotation of Zombieland thus changes in the film, from a globalization nightmare depicting a horrifically dismantled nation into a space of amusement. The literal amusement park, called Pacific Playland, on which our heroes ultimately converge in the mistaken belief that the park will provide a zombie-free place to return to the innocence of pre-apocalyptic America, turns out to be riddled with zombies, a sign that there is no going back. The four characters thus move forward, turning Pacific Playland into Zombieland by troping zombie-annihilation as various amusement park games: Tallahassee holes up in a shooting booth and slaughters the undead with effortless exuberance, and Columbus hits a zombie clown head with a hammer in a carnivalesque test of strength with the endangered Wichita’s safety as the prize. As Zombieland transitions from a space of desolation into a space of enjoyment, the film shifts genres from horror to romantic comedy. The moment of grave danger, Playland surrounded by masses of zombies, resolves into happy carnage as Columbus wins Wichita’s heart and Wichita finally tells Columbus her real name.
Tallahassee gets his Twinkie, and Little Rock finds her childhood, screaming with faux-fear on a ride that hoists her up high and then drops her—into a pack of hungry cannibals that never can quite get to her, as Tallahassee has taught her just to relax and shoot their brains. Zombieland is currently the highest-grossing film at the box office, and it’s easy to see why: it reconfigures a globalization dystopia into the kind of affectively-forged utopia that would make Martha Nussbaum proud, explaining to Americans what they are now that they aren’t the United States of America in any of the same old ways, and how to take pleasure as opposed to terror from that status.

Fleischer’s use of the zombie narrative as a political allegory is not a new conceit, and in fact I would argue that the zombie subgenre is the primary contemporary locus of democracy mythology, particularly producing narratives of crises in democracy. Tallahassee’s mantra, also the tag line in Zombieland posters, is “nut up or shut up,” and this could be the tag line for all zombie movies insofar as they are about the necessity of taking revolutionary action in a crisis of suppression that is, if not literally orchestrated by the government, a sign of the state’s ineptitude in guarding against it. In the zombie genre action is taken through armament, but figuratively the reference to silence, or “shutting up,” as the alternative to “nutting up” posits guns as a metonymy for the “voice” that is granted democratic citizens through enfranchisement. Etymologically the zombie derives from West African voodoo folklore and describes a person magically stripped of an independent will and therefore controlled by a sorcerer; in the Afro-Caribbean context zombies were slaves, stripped of will insofar as they were divested of citizenship. In more contemporary forms the association with colonization disappears, but the notion of the mass suppression or willing abdication of democratic agency remains. Consider Edgar Wright’s Shaun of the Dead (Great Britain 2004), for instance, where zombie infection depicts western democracy as plagued by a wasting illness, with the ordinary mindlessness of contemporary life blossoming from the banal into the horrific. Facets of the pre-apocalypse produce the apocalypse: working the same dead-end job day after day, success at which demands a high tolerance for stupor and herculean bouts of inactivity; riding the bus staring vacantly into space, returning home to stare vacantly at the television; subsisting in passionless relationships produced by a society whose proper function demands nothing more than acquiescence to boredom. The foundational gag of the film is the absurdly long time that it takes for Shaun to notice that he’s surrounded by zombies because they so closely resemble non-zombies, including Shaun himself. A parody of the title character at the beginning of the movie includes shots of Shaun waking up, stumbling out of bed, yawning—and looking exactly like the slow, stumbling, moaning undead already conquering the British Isles. Shaun’s task, once he realizes what’s happened, is not only to evade the zombies but to keep from becoming one (which is to say, remaining one) himself. He therefore is driven to become a “man of action,” as the film tropes democratic agency as action heroism (“nutting up”) and records Shaun’s awakening as a great community organizer able to inspire other survivors towards revolt.
Shaun fails, however, to awaken anyone but himself, his girlfriend, and one of their friends, all of whom he finally leads to safety in an armored van driven by government special forces. These three live happily ever after, while the tamed zombie population simply continues the inert lives of ticket and toll collectors that they always led. “Shaun” is a tale of democratic revolution in which the status quo is maintained by those who refuse mobilization, with Britain divided into new social classes distinguishing those who know, are awake, and join in a participatory democracy from those relegated by their own obliviousness to sleepy servitude.

In Bruce McDonald’s similarly politicized Pontypool (Canada 2008), a small Ontario town is plagued by wild packs of flesh-eating locals whose first symptom of illness is the compulsive repetition of whatever words and phrases they spoke at the moment of infection. The film is narrated by radio talk-show hosts who discover that language is dangerous, and not just figuratively. Certain English words host a virus that infects and kills anyone who speaks them, and since they don’t know which ones to avoid, the protagonists speak (bad) French. The detachment of signifiers from signification yields particularly eerie zombies, rendering something ordinary—“dead language”—unfailingly creepy. Pontypool, the first post-structuralist zombie movie, critiques the mindless use of language and contrasts it with the strenuously invigorated rhetoric employed by a pair of daring radio hosts who eventually figure out the source of the virus and do their best to reinvigorate language in order to survive. They broadcast a tour de force of “living language” to the desperate village population until the government tells them to shut up. When they won’t, they’re branded terrorists and bombed, thereby darkly concluding McDonald’s allegory of zombification through the consumption of mystified ideological jargon. Though the radio hosts’ refusal to “shut up” doesn’t end happily, the allegory of democracy as ailing through its citizens’ failure to exercise voice, to exert a popular will distinct from that imposed by the state, could not be more baldly drawn. Notably, Pontypool takes place in Canada, where “nutting up” isn’t an option. It’s hard to imagine an American zombie movie where the military are the only ones with guns, and indeed in Zombieland Tallahassee and Columbus discover a Hummer filled with machine guns and a pair of disembodied hands gripping the steering wheel. Tallahassee’s cry—“Thank god for rednecks!”—makes it clear that protecting the Second Amendment would pay off in a zombie apocalypse. Norwegian zombies resemble American, British, and Canadian zombies in their association with mindlessness that can only be vanquished by a democratic man of action who is posited as the heroic antithesis to political torpor. Tommy Wirkola’s 2009 Død Snø, for instance, takes the natural next step in the association of zombies with fascist tendencies—their proclivity to enact mass violence once the capacity for independent thought and therefore resistance to political “sorcery” is removed—by literally featuring Nazi zombies chasing German medical students while they are on spring break in the Alps. The Nazi zombie sub-subgenre was popular in the 1970s, to wit Amazon’s extensive Undead Reich!
DVD collection, so Wirkola’s invention isn’t the Nazi zombie so much as the retro Nazi zombie, a category of zombie that caters to contemporary political scenarios that international configurations of the political Left like to compare to fascism. Hannah Arendt, in her discussion of the Eichmann trial in Jerusalem, linked fascism to the “banality of evil,” that chronic intellectual apathy which can become cataclysmic in mass cultural forms. It is a small leap from Arendt’s implication that fascists are like zombies to the sub-subgenre’s reconfiguration of Nazis as zombies. In Dead Snow, contemporary German medical students are attacked by (and therefore themselves become) Nazi zombies because they disturb hidden Nazi treasure and try to keep it for themselves, not realizing until it is too late that the dead fascists have been reanimated because they see the contemporary students as their own likeness. Hence Wirkola’s thinly veiled critique of the contemporary German professional class, which enjoys a moral distance from the fascist past while harboring its “blood money,” a metonymy for the paintings and other German cultural treasures featured in the recent media as stolen from persecuted Jews during the Second World War. Wirkola further implies a relationship between economic stability and bygone atrocities in the figure of a frolicking German bourgeoisie who brings snowmobiles and other signs of prosperity to the haunted Alpine graveyard. Here Dead Snow is analogous to Zack Snyder’s remake of Dawn of the Dead (U.S.A. 2004), where survivors barricade themselves against a zombie epidemic in a Milwaukee shopping mall. They are ultimately unsuccessful at saving themselves there, thus demonstrating Americans’ misguided efforts to tourniquet the plague of apathy with the artificial stimulations of consumption. In both Snow and Dawn, the man of action attempts to forge a resistance against the zombies and appears at first to succeed. But these films, two of the many twenty-first-century zombie films in which everyone dies by the end, illustrate the ideological pervasiveness of capitalism in that neither the German medical students nor the assorted group of refugees in the mall can find a space of true resistance to the zombies. All spaces of resistance are depicted as infected by capitalism: the German students become “men of action” without realizing—with deadly obliviousness—that they have been fighting over the same thing animating the Nazi zombies; the mall refugees also become action heroes to fight their way to what they hope is a deserted and uninfected island only to discover that there is no oasis so remote that the ideology they’re metaphorically fighting can’t get to it. Indeed, the notion of oasis—on a series of uncharted islands as in Lucio Fulci’s Zombi II (Italy 1979), or at Pacific Playland—is a pivotal part of zombie films’ registry of globalization as a crisis for democracy. In a zombie apocalypse, there is no oasis because the world has become too small to sustain spaces ungoverned by somebody. The various Zombielands of the films I’ve discussed all signify citizenships dismantled by apoliticism, and seem to argue for the necessity of reinvigorating western democracy through heroically-posited post-Enlightenment ideals of agency and rationality disentangled from the post-Industrial trappings of consumption, which offer an illusion of reinvigoration that actually only leads to another form of “reanimation,” a cipher for the living dead.
Zombieland’s Columbus, for example, is a nerdy college student who used to spend his time studying but becomes a rifle-toting adventurer with useful analytical abilities. Throughout the film he jots down rules for zombie survival garnered through his practice of the scientific method, rules like doing lots of cardio, killing zombies twice to make sure they’re really dead, always checking the back seat, and avoiding bathrooms. Columbus’s alienation from popular culture and therefore from its modes of consumption is represented first by his romantic isolation and social awkwardness (his ongoing fantasy of getting close enough to a girl to brush her hair off her face), and then by his nerdy cluelessness about how to be hip, as when Tallahassee pours him a shot of whiskey and Columbus tosses the booze out the window before “drinking” the shot. Several critics have complained that Jesse Eisenberg, who plays Columbus, is too wooden in delivering his lines, but I would argue that his stilted passionlessness is essential to the allegory; Columbus is an anachronism, Benjamin Franklin wandering the highways, gas stations, and strip malls of the twenty-first century.

Lest zombie democracy, in its valorization of the man of action, appear to be not only as rationalistic but also as sexist as those embodied in the societies of our eighteenth-century past, we should note that contemporary zombie films dramatize the growing prevalence of female politicians as the protectors of a democratic tradition. In Zombieland, Wichita and Little Rock are a teenage girl and her little sister, and they repeatedly rob and take hostage the two male characters. Wichita’s costume of black boots, tight jeans, dark eye makeup, and wild hair recalls the figure of Alice in Resident Evil, a 2002 American zombie movie featuring Milla Jovovich as the resident ass-kicker, whose many sequels posit women’s bodies as the next evolutionary step and the source of humankind’s salvation through adaptation. I suppose one could argue that the real heroes of Resident are the teenage boys who play Biohazard, self-identifying with Alice in the video game on which the film is based. But even so, their ability to become virtual men of action depends upon identification with a female character, and specifically the capable female body of the game and movie’s heroine. In Andrew Currie’s Fido (Canada 2006), women conquer zombies not by killing them but by emasculating them in a 1950s-esque parable of domesticity. The emasculation is not misogynistic; in Fido, it is male aggression that has to be tamed by housewife Helen Robinson and her little boy, Timmy; ultimately, they replace their violently angry husband and father with a gentle zombie named Fido and live happily ever after. In many contemporary zombie films a woman or effeminate male is the last left alive (I would count Columbus as one of the latter), rendering the democratic action hero less a restrictively gendered fantasy than an explicitly western one. Though the genre is devoted to the creative production of exquisite gore, the Japanese have been relatively silent on the subject (too eastern), as have the Chinese (not interested in democracy), though at least one Top Ten Zombie Movie list online cites two Japanese zombie films made in 2000 that I haven’t seen.
Specifically, it is the western woman, biologically a symbol not of viral replication but of complex organic reproduction and its correlating complex possibilities for action, who is at the center of films ranging from Resident Evil to Danny Boyle’s 28 Days Later (UK 2002). The title of the latter refers to the lifecycle of the virus infecting all but a few immune survivors, but also to the female menstrual cycle, and it is no accident that the film features the attempted rape of two female survivors who are in this case grotesquely posited, in their capacity to become mothers of an immune race, as the future of humanity. The women (or more accurately, girl and woman) in 28 Days escape from their would-be rapists, and in all of the original endings planned by the director (if not the ending presented at the film’s debut, in which the lover of one of the women also lives) they are the only survivors. Since 28 Days does not portray a worldwide infection but only a quarantined England, the survival of the women without any men does not indicate an end but a new beginning.

The woman of action is the ultimate antidote to the crumbling democracies across the spectrum of Zombielands not merely because of her body, but because of her mind. Female zombie warriors are female intellectuals, who enact on and with their bodies the forms of unconventional thinking that tend to elude male action heroes in the zombie genre. Fleischer, for instance, posits Columbus as departing from his brainy past during the zombie apocalypse, becoming physically capable (both sexually and athletically) under the tutelage of his mentor, Tallahassee. Wichita and Little Rock, however, trump physical strength by outwitting their more “powerful” male cohorts. When they first meet, neither of the women has a weapon. Wichita begs helplessly for their assistance, taking them to a weeping Little Rock who pretends to have been bitten and infected by a zombie. She asks Tallahassee to shoot her, and he agrees—but at the last moment Wichita asks if she can be the one to kill her sister. Tallahassee hands her his gun, and the sisters rob the men of their guns and vehicle. To demonstrate the absence of brains in all of Tallahassee and Columbus’s brawn, the two men fall prey to the hoaxers again when the women’s stolen car runs out of gas several miles down the road. Thus the woman of action figure delineated in narratives of resistance to the totalitarian authority of the zombie-state emphasizes the importance not only of revolution (troped as a masculine form of resistance) but of innovative thinking (troped as female). Female characters represent the ultimate antithesis to zombies’ physical and mental anaesthesia in their mobilization of the mind, the creative function of which is frequently (as in “Resident Evil,” where Alice is not only some sort of über-black belt but a genius) projected onto the body in the form of extraordinary physical prowess. Even the female antidote to zombie apocalypse is problematic, however, insofar as the genre questions the extent to which having an “independent mind” (the hallmark of the western woman) is ever possible. Patricia Chu’s deceptively bland-sounding Race, Nationalism and the State in British and American Modernism (Cambridge UP, 2006) is a well-disguised example of zombie scholarship that also positions the genre within narratives of citizenship and nation-building.
Chu’s first chapter contends that the first zombie movie, Bela Lugosi’s White Zombie (USA 1932), was a cautionary parable about the insidiousness of the forms of agency promised by democratic enfranchisement in the era of expanding state administrations; she uses the film as a theoretical framework with which to identify other aesthetic and philosophical versions of the same dilemma, which is to say that she considers zombies the fundamental building block on which twentieth-century civilization is built. In White Zombie, which takes place in Haiti, enslavement can happen in all of the obvious ways, as when colonizing nations oppress individuals by divesting them of the status and rights of citizenship. But it can also take place in insidiously “just” ways, as when governments configure “choice,” “freedom,” and “having a voice” as garnered through citizens’ willing obeisance to the state. Chu offers a context in which we might question the plausibility of the “woman of action” figure in her reading of the marriage scene in White Zombie, in which a beautiful young woman is hypnotized by a rich plantation owner who has been generating wealth for himself by using voodoo to turn Haitians into zombie-slaves. As Chu points out, the woman’s role in the wedding, performed in a state of zombie-like hypnosis, is indistinguishable from her role in a normal wedding. Her “choice” and “voice” are over-determined as acquiescence and repetition of the decisions and words orchestrated by government, church, and social institutions. White Zombie, Chu argues, was the first in a series of modernist aesthetic testimonies to anxiety about agency as it is generated through enfranchisement. Her reading demonstrates the lack of differentiation between zombie and democratic agent, the notion that the man of action as posited by Enlightenment political theorists never has existed, not only for recognizably disenfranchised groups but for white men. The zombie is a western white male nightmare of democracy’s failure to deliver on its promises, its inability to proffer agency and independence as anything other than submission. The alternative to being a zombie, insofar as it is attained through democracy’s heroic ideal, is a mirage. Zombieland, which concludes, like a classic comedy, with a synecdoche for a wedding (Columbus kisses Wichita), gestures towards the wedding scene of White Zombie, illustrating that the hope of extricating oneself from zombiedom is artificial since one form of submission is merely substituted for another. In Zombieland the state no longer exists, but Columbus and Wichita mime its existence in the absence of another model for agency. The analogy of weddings to zombie-citizenship is also the premise of Pride and Prejudice and Zombies (Grahame-Smith, 2009), where the famous opening line that “it is a truth universally acknowledged, that a single man in possession of a good fortune must be in want of a wife” becomes “it is a truth universally acknowledged that a zombie in possession of brains must be in want of more brains.” In each case, weddings become a theater in which to enact the ineptitude not only of masculine revolutionary action but of female rational agency to posit an antidote to the zombie-state. Henry James, referring to the valorization of action embraced in the pragmatist construction of democracy delineated by William James and John Dewey, critiqued in the 1909 preface to his plotless sprawl of a novel, The Golden Bowl, the conversion of America to a “religion of doing” (James’ emphasis).
The emergence and proliferation of zombie narratives in the twentieth and twenty-first centuries suggest a crisis of faith in the ontological possibilities proffered by democracy. In the zombie myth, at the level of plot (the narrative locus of action) there is little possibility for physical or mental opposition to the encroaching zombie apocalypse. However, as Rick Popko and Dan West’s journey into the grotesque, retarDEAD (USA 2008), suggests, zombie mythology may offer another avenue through which to conceptualize democracy within the figure of the zombie itself. In Popko and West, a mad scientist transforms students into zombies at the Butte County Institute of Special Education by injecting them with a hyper-intelligence serum. Thus the zombie is presented as a site of radical experimentation, the features of its own brain and body yielding the hidden antidote to the anxiety of mass anesthesia posed by the institutionalization of democracy insofar as the retarDEAD signify a perversely living mind through the figure of the corpse. In Part 2, an “Apology for the Undead,” I will therefore explicate the potentiality for an alternative construction of democracy that is theorized within zombie aesthetics, the narrative unconscious of the genre.

Colbey Emmerson Reid is Assistant Professor of Modern Literature in the Department of English and Humanities at York College of Pennsylvania. She can be reached at creid@ycp.edu.

Eric Gans likes to speak of his “originary hypothesis” regarding the origin of language as a “new way of thinking.” In my view, a new way of thinking means thinking in a new idiom, with a new vocabulary and grammar—an idiom of inquiry. Gans’ originary hypothesis completes the “linguistic turn” of 20th-century thought—the intuition guiding the dismantling of metaphysics by 20th-century thinkers was that language doesn’t represent some external and independent reality; on the contrary, language, or more generally, signs, is constitutive of anything we can call a human reality. What Gans’ hypothesis does is explain why language is constitutive: because it was through the sign that our immediate ancestors transcended the mimetic rivalry that perpetually threatened their existence by discovering/inventing a way of deferring violence. Without Gans’ hypothesis, the linguistic turn remains hostage to victimary thought, which, following and slightly inflecting Gans’ talk at the annual Generative Anthropology conference on June 20, I would define as the insistence that claims to metaphysical hierarchies are really disguises legitimating social and political hierarchies. If logocentrism is really phallocentrism, Eurocentrism, etc., then the critique of metaphysics is essentially a volley in a partisan political battle, rather than an attempt to disclose a more originary presentation of human being than metaphysics allows.

My own entrance into the new scene of thinking opened up by Gans’ hypothesis, as outlined in my previous posts, is through the belief that once we have clarified the constitutive power of the sign, we should be thinking in and not merely of the originary sign. More simply, our idiom should be a thoroughly semiotic one: originary grammar. Gans speaks of his hypothesis as a “minimal” one, to which Occam’s Razor has been applied (the reader can see any of the introductions to Gans’ books, or my own work The Originary Hypothesis: A Minimal Proposal for Humanistic Inquiry, or, indeed, my first post on this blog, for an argument for the minimality of the hypothesis).
A minimal mode of thinking, then, would be one which uses only vocabulary derived from and applied to originary accounts of language in order to speak about a reality we now know to be thoroughly mediated by semiosis. A new way of thinking generates idioms of inquiry, and the originary hypothesis makes it possible to generate such idioms out of the exploration of linguistic relationships. To take just one simple example, all the discussions in postmodern thought regarding “power relations” can be reduced to the simple question of when, where and how imperatives “work,” and when, where, and in what ways they don’t. Thinking in language does require the assumption of a minimally conceived extra-semiotic element, however: otherwise, all we can do is string out a series of vaguely connected descriptions of language. Only an account of language as emergent and constitutive, as an event, which tells us why there is language rather than none and what language is for, can enable an ordered inquiry in language. The originary hypothesis provides us with such an account, by positing an originary event which defines language as the deferral of violence through representation. Every word, every sentence, every sign, then, defers, or contributes to the deferral of, some mode of potentially catastrophic violence, as that possible violence appears to some sign user within some configuration of relations which in turn overlaps with other configurations. Through our intuition of the sign aimed at deferring violence we apprehend the scene upon which the threat of violence is gathering, and we can work our way through the itinerary of the signs constitutive of that scene. And, of course, signs don’t always work, and even when they do, they do so only partially—we can defer the most imminent and seemingly devastating eruptions of violence, but not always, and we can certainly never abolish violence as such.

The constitutive difference of language is that between imitation and iteration. Imitation is following the rules guiding someone else’s action—you don’t need to “understand” the rules, i.e., be capable of formulating them explicitly, in order to do so. Indeed, as we all know, you can’t really formulate all the rules, because there would then be another set of rules for formulating rules, which would itself need to be formulated and couldn’t be present in your formulation, and so on. Some understanding always remains tacit. But understanding this paradox intrinsic to rule following and sign using aids us in defining imitation: imitation is the attempt to abolish that gap between tacit and explicit, invisible and visible. Imitation is the attempt to map the model in one’s own activity. Iteration, then, is what happens when the gap becomes visible in one’s mimetic efforts. Such visibility is more commonly known as error; but error presupposes a norm that is produced simultaneously with that error. The rule one follows reaches its limits in the emergence of norm/error, and the rule of the model must be revised if it is to be followed. Iteration applies the rules put forth by the model to the model itself. On the scene posited by the originary hypothesis, the first sign is the aborted gesture of appropriation: the hand reaching out hesitates on seeing all the other hands reaching—imitation has failed through its success, since the simultaneous effort to procure the object ensures not only that no one will do so but that the world of objects will disappear altogether.
It is some “sense” of all this that is involved in the aborted gesture. That gesture, then, applies, and is subsequently seen to apply by those who imitate it in turn, the rules of the grasping activity to that activity itself—and in doing so discloses its limits and converts that act into another. We are always imitating and iterating; the two modes of activity are separated by an infinitesimal boundary, and significance lies on that boundary. What guides us in semiosis is some intuition regarding the sustenance of the scene we are on; that is, the desire, compulsion really, for presence. We are always complementing, completing, resisting, instituting some sign another has put forth, always in accord with something that seems to be missing, some piece of the scene without which the scene will not coalesce or some perceived deformation of or excess to the scene that must be remedied or curtailed. To put it in very simple terms, we want to keep things going before they close in, collapsing the scene. Each new kind of speech act emerges in this way: the imperative out of the “inappropriate” ostensive; the interrogative out of an imperative weakened by possible failure, in a scene where compliance is uncertain; the declarative out of a negative ostensive repeating and negating the interrogative; the verb out of an imperative attached to the negative ostensive redirecting attention to another location when the negative ostensive fails.

Any sign, then, is some articulation of these speech acts in some supplementary and constitutive relation to a scene. In proceeding to construct a semiotic and linguistically specific idiom of inquiry, I would propose that we think about our activities as so many modes of obedience to imperatives. Only imperatives can set action in motion: ostensives are self-enclosed acts, creating a center of attention, while we can only act on declaratives insofar as we read them as imperatives—if someone says “the door is open” I can leave or close it, depending upon what I take that sentence to be telling me to do; a sentence like “the time has come” is activating some imperative, however indeterminate. So, to get started, we can speak of thinking as obeying the imperative to suspend imperatives; in this way a presence is sustained in which the stream of imperatives reality generates continually become signs pointing to their possible origins and outcomes, leading us ultimately to the most elementary commands, the most originary of which turns out to be the imperative to suspend imperatives in the face of the ostensive sign. The creation of moral maxims entails obedience to the command to map imperatives onto indicatives; the shaping of ethical habits is obedience to the command to suit imperatives to ostensives. When we moralize, we want the imperatives we follow or issue to be backed by the currency of indicatives—as the imperatives in the Decalogue need the backing of “I am the Lord thy God…” This is because morality is a system of imperatives, and a system of imperatives requires something other than an imperative to provide articulation—indicatives embed the imperatives in a shared reality, perhaps, especially, a commanded reality. Insofar as we distinguish ethics from morality, meanwhile, it is in shaping morality to fit concrete situations—we speak of professional ethics, not professional morality.
Ethics must have a moral backing, such as “treat all individuals fairly and equally,” but the ethics itself involves determining what counts as “unfair” or “unequal” in a given situation—what do we recognize—point to—as a violation of the injunction against unfairness?

Let’s proceed further. Desiring is obedience to the command from the object to model yourself on its possession; resenting obeys the command from the object to keep watch over it (which also means watching over everyone else’s watch) once one’s access has been barred. We can see how desire and resentment become generative by considering all the ways imperatives can be mistaken. I can’t know how the object would have me possess it (even the object doesn’t know), and so my relation to the object is generated by my applying the rules of the object’s command to the object as I construct it. Regarding resentment, I have no idea what kind of access and distribution the object considers appropriate, so my resentment must also be donated to the Object-as-center as it repels all of our contentions regarding the right way to slice it. And here, indeed, we can work our way towards an originary morality along with our originary description of morality. Gans proposes an originary resentment directed towards the center, the Other which bars me from possession of the object—I am reading this resentment as obedience to the object’s command to superintend it, and this imitation of the newly repellent object must first secure the protection of the center from others, and my insistence collides with the similar insistence of all my fellows, threatening the repetition of the originary crisis; at this point the only way of sustaining the scene is to donate my resentment to the center along with others, thereby (re)installing the bar. This new convergence could only be arrested by obtaining a new command to give the center the power to prevent us from annihilating one another in its name. The consequent moral imperative, then, is as follows: when your resentment, if imitated by others, would cause imperatives issued by the center to cancel each other out, you must donate your resentment to the center (which resentment involves all of us watching each other through the eyes of the center, which is to say through the eyes of the others watching me, so that we all become signs for each other). Invent and embody a rule that can account for your resentment along with everyone else’s—or, to get started, at least a few other people’s.

And, finally, let’s return to the dyad I proposed as constitutive, as simultaneously inside and outside the linguistic space: imitation and iteration. We can now bring even these terms, and the constitutive paradox their relation represents, within that linguistic space as well: imitation obeys the imperative to seek imperatives in the presence of the other, to treat the other as an inexhaustible fount of imperatives. Rules, indeed, are nothing more than the articulation of imperatives drawn from the presence of the other. Iterating, then, is issuing a command in turn to the other made present, a command to issue a new command that would help sustain the space opened by my obedience to the previous one; such a command must authorize me to issue my own commands to those (including myself) who have drawn imperatives from my act and disrupted the space it established.
To the sign others take as a command to appropriate, to rush to the center my presence as source of commands seems to indicate, I must now add a new command, making a series, and therefore a rule, a relation between two sources of commands, such that any new command must take the norm of the series. Eugen Rosenstock-Huessy, a precursor in what I am calling originary grammar, already contended back in the 1930s that the crisis of our time is a vacuum of imperatives, and I believe this is still true. It may be a constitutive crisis of modernity, that space of misreading, deliberate and unknowing, generous and malicious, and universal application of the Christian revelation. The Christian revelation forbids scapegoating and commands us to stand with the victim, and all public and private life in societies which have left transcendent freedom behind for freedom in the world can be seen as the search for victims to rush to in solidarity and victimizers to charge. The scope for legitimate commands must shrink as the search becomes ever more successful. Those who hope utterly to free modernity from its reliance upon transcendence have so far tried to embed operative imperatives in indicatives—how else could one explain such idealized representations of modernity as those of, say, Jürgen Habermas, who hopes that the “better argument” can command; or those human rights activists who believe imperatives might flow from a more extensive body of international law and more detailed and widely disseminated descriptions of its violations? Supplementing the absence of operative imperatives on today’s global scene would require events sacralizing the inexhaustibility of the sign, deliberate iterations. I hope to elaborate in a future post.

Adam Katz teaches writing at Quinnipiac University in Connecticut. He is the editor of The Originary Hypothesis: A Minimal Proposal for Humanistic Inquiry and an editor of Anthropoetics, the on-line journal of Generative Anthropology.

I remember that on the morning Bernie Madoff was to be sentenced, it seemed I was waiting along with everyone else to find out whether the “disgraced financier” would get twelve years or one hundred and fifty years in federal prison, a.k.a. “Camp Cupcake.” By now, the details are well known. Madoff, according to the authorities, took in money from one group to pay out nominal “dividends” to another group without ever having made an investment; that is, without investing except in his own lavish lifestyle. The classic Ponzi investment scheme is a “shell game” that works well until everyone wants to cash in, which is what happened, more or less, when the economy tanked late last year.

The lesson of Madoff, I would like to contend, is not that we should be on guard against rip-off artists. Most people knew that long before Madoff’s well-publicized confidence game. The lesson is this: the entire economy is more or less a networked Ponzi scheme. Bundling high- and low-risk mortgages and selling them as excellent, insured investments meant that when the housing market collapsed everything would collapse with it, since nothing had the value it was presumed to have had. The “make believe” quality of Madoff’s rip-off and the mortgage sector’s over-valuation (not to mention the easy credit across the board) became illustrations of an economic “perfect storm” that we are still sailing through today. Madoff is just a small example of how a larger system can be easily “gamed” without anyone in charge knowing or, more to the point, without anyone in “charge” caring.
With the fallout from the worst economic disaster in decades, my question is, “What about higher education?” In my last post on “Business Model University” I argued that the problem was not that business failed to appreciate the more nuanced and finer aspects of college and university life; we’ve always known that it doesn’t (remember Oscar Wilde’s famous line that money people know the “price of everything and the value of nothing”). No, the real problem, as I described it, was that business failed at business. Business had all the right answers, correct? It could run the world! It could run higher education! In fact, as the new CEO of GM has said in so many words, “I don’t need to know cars. I know business.” Well, this is the same stupid confidence in make-believe “science” that has led to the crisis that we are currently in and bound to repeat. I think that this point is very important for the simple reason that higher education has been “making believe” with budgets and finance for years. Colleges and universities, mostly, are non-profit institutions. This status allows them to function as corporations without having to deal with the inconvenience of “taxation.” So, major universities can run “professional” sports programs that take in millions of dollars, and everyone makes believe that it isn’t happening. In many cases, the sports “money” is shielded from the college or university budget at large. This is nothing new. What is new, however, is the way in which other sectors of the corporate university have separated from the core function of the institution. Last year, The Chronicle of Higher Education ran a story about how colleges and universities worked with credit card companies on campus to lure students into “teaser rate” cards, with the institution getting a kickback. At one level, I suppose it is okay to treat students as “fans.” Sell them hats and sweatshirts to raise a little cash; however, when institutions make a radical shift and begin thinking of students as “customers” or “potential customers,” this, as Bill Readings noted in The University in Ruins, is the beginning of the end. Instead of the university as a modest enterprise, charging tuition and fees to cover operating costs, the “university-corporation” drives toward greater and greater “profit.” With the concept of profit comes the concept of liability, a.k.a. the faculty. So, what would a university or college look like if it maximized its profits and drastically reduced its liabilities? Well, it would look like Bernie Madoff University.

Bernie Madoff University would have to look “good.” Just as Madoff had an upscale office and account statements and employees, Bernie Madoff University would need buildings, sports arenas, ample parking, comfortable dorms with 24-hour cafés, and coffee kiosks every few hundred yards. It wouldn’t hurt to have a bell or chimes that marked the hour across a sprawling campus. Comfort, convenience, and pleasure would be the motto (preferably in English so everyone would understand it). And now, the most important part. Bernie Madoff University would need one famous writer, one famous scientist, and one famous business person. The remainder of the so-called faculty would be on- and off-site “adjuncts” or “teaching associates” who would work at a per-student rate—a time-and-materials arrangement. This would be a Ponzi scheme at its highest level—an ultimate something for nothing. The irony is that if Madoff had run a university he would have been a hero.
His appearance-over-substance strategy could have been the model for all college and university presidents; or, I should say, college and university presidents could have unashamedly embraced his model . . . outright. The problem with Bernie Madoff is that he ripped off real people for real money, giving them only a good feeling in return. Had this been in the context of higher education, all would have been well. Take the money and tell them how good they feel; he never would have had to pay out one cent in dividends. It may not be too late, however. If Madoff were to get parole after two years or so, despite his actual 150-year sentence, he could still do wonders for American higher education. A book, a few appearances on Morning Joe where he curses Obama’s “socialism,” and a charity event in the Hamptons could put him back on top. All it would take would be one college or university to hire him! It could happen. If it doesn’t, we’ll all have to wait a few years until everyone else catches on to his brilliance. By then, we’ll certainly have made greater progress toward comfort, convenience, and pleasure.

Victor E. Taylor is Associate Professor of Comparative Literature, Philosophy, and Religious Studies in the English and Humanities Department at York College of Pennsylvania.

By Adam Katz

The indicative sentence could be said by anyone; imperative and interrogative sentences are defined by the speaker and listener. To use Peircean terms, imperatives and interrogatives have a first and a second, while the indicative is uttered by a third. The speaker of the indicative sentence is therefore outside of the direct interaction involved in the interrogative, imperative and, most fundamentally, the ostensive. If you say “he needs help,” you are clearly outside of the help-seeking; for that matter, if you say “I need help”—as opposed to “help me”—you are setting yourself, as speaker, at least somewhat apart from yourself as needing help. The response called for by your indicative sentence is something along the lines of “what’s wrong?” or “what can I do?”—that is, an interrogative, rather than a direct proffer of aid. You are engaged in a discussion over your situation. Your indicative statement calls for a response—that is, it commands a response, or embeds an imperative: inquire into my condition. How meaningful the sentence is depends upon how meaningful that inquiry would be—that is, what new ostensive signs it would yield. The same is true if we increase the distance between the need for help and the sentence—say, “K needs help.” Here, a different chain of interrogatives and imperatives constitutes the space of inquiry opened by the sentence, including questions like “how do you know” or “who says” that K needs help? (Questions that would hardly arise with a sentence like “I need help”). The inquiry here might take new paths—perhaps the credibility of whoever claims K needs help becomes the central question, leading one or another speaker to put forward a sentence embedding the demand for some ostensive sign that would credit whoever claims to speak for K. The inquiry, or the space of questioning and answering articulated in the indicative sentence, is itself a deferral of some demand, an impossible demand, or a convergence of conflicting demands, at some distance from the speakers: the question is the softened edge of the demand.
Some potential dispute regarding the subject or name organizing the sentence must lie at the origin of its utterance: perhaps K is right in front of us needing help, in which case “K needs help” is only slightly removed from urgent questions like “what do you want from me?” and “why are you just standing there?” and these questions are themselves separated by the thinnest of boundaries from imperatives like “do something!” and “help him!” and these latter, in turn, from interjections or exclamations (look at that!; Oh my God!)—in this case, the indicative sentence could represent either a momentary lull in the mounting emergency or a panicky non-response (the subsequent sentences should weigh down on one or the other possibility). Or, K is far away, beyond our capacity to aid him, and any dispute about K and his condition may be equally distant, in which case the mention of that condition stands in for some other set of disputes regarding competing demands and our conversation only makes sense, is not monotonous droning, if (this is my hypothesis) the conversation could be put at stake in one of those proximate disputes and if its continuance is therefore framing and deferring them. A good sentence, then—esthetically and ethically good—is one that holds the imperatives at bay but keeps them within sight, and, even more, keeps the space of questioning sufficiently expansive to shape the sentences along a range of actual and potential questions from requests for simple information to inquiries regarding the shape of the sentence itself. With an originary understanding of language, we need not venture beyond the grammar of signification itself for an esthetics, an ethics and a politics. Signs make sense along two axes: first, their iterability; second, as norm. To the extent that a sign is composed in accord with rules that a competent sign-user could discern and iterate, it has reached the threshold of signification. To the extent that a sign is convertible with other signs, can measure and be measured by them, so as to open up a field of semblances, it has likewise met that threshold. For sentences, as I suggested in my previous post, that threshold has been reached when a name becomes a source of imperatives (K’s needing help has transformed K into a source of imperatives—respond to K, inquire into K’s condition, spread the word about K, etc.); this happens when an impossible imperative finds a name to order. We erect the name in between us to defer our dispute by ordering the name to remain in that place: this is iteration. In thus “situating” the name, we attribute, as I also suggested last time, the imperatives the name is obeying, to the following: a particular agent; “reality” (the very reality created by the sentence, which generates a range of potential imperative-ostensive articulations, and which orders us around so often); and the name itself (as conferring a name confers at least a minimal freedom and capacity for self-constitution). This is norming. A verb is some articulation of compliance with commands coming from these three sources—our obligations to others, our sense of a limiting reality, and the space of freedom we constitute by issuing commands to ourselves before knowing what obeying them would look like. How one articulates those commands in an incoming sentence dictates what one takes to be an appropriate response to the sentence, one that will sustain the continuous present the sentence supports. A good sentence is both iterable and norming. 
We can fold ethical, epistemological, esthetic and political claims into accounts of how iterable and norming sentences are, which is another way of inquiring into how sentence-y they are. Goodness, knowledge, beauty, and freedom are all products of disciplinary spaces—that is, they result from commanding these names to show through semblances and to provide us with commands, in turn, that will enable us to confer these names and their descendants upon objects, events and actions yet to come. It is sentences, enacting the disciplinary space constituting these arenas, which would be the mode of measuring and registering these event/signs. Imperatives issuing from another, from reality, and from the name itself, respectively, would be incommensurable, but out of what other material than incommensurables can commensurabilities be constituted? When we make sense of a sentence, including the one we are speaking, we affirm some such commensurability ostensively, the way we recognize an imperative has been fulfilled, by integrating the sentence into the course of living, treating it as a model for appropriating reality, as a fount of new imperatives.

Indicative culture would be a culture interested in citing and creating such planes of commensurability: attracting, ordering and transcending the strongest imperatives flowing from our diverse resentments. To match an indicative culture I would propose a marginalist politics, which seeks out that composition apart from which everything stays the same. We homogenize and commensurate the world through our habits, and through our habits we render ourselves idiosyncratic before the world. Let’s say I go out and get the newspaper every morning and then I come back for my morning coffee. What would define this as “habit” will naturally vary: in some cases, “morning” is good enough, in some cases only “at 6:45” will cover it; is it always exactly the same amount of coffee, or is the habit defined in terms of “however much coffee I feel like” that morning? However the habit is composed, the world is commanded to come together in a particular way through it, and signs cross the threshold over into meaning in terms of what sustains, what can be gathered into, and what interrupts the habit. This mode of analysis can suit any level of individual and social complexity: I read, teach and compose sentences in habitual ways, I respond to praise and criticism and confront compelling claims that disrupt my thinking likewise; habits are contagious and, like all contagious vehicles, mutate constantly, from neighbor to neighbor, teacher to student, across a place of work, among viewers of a popular TV show, etc. What interrupts my habit is what constitutes that habit before another and before myself as other, and raises the question of how each habit will read the other in terms of itself, itself in terms of the other, and with what remainder. When something interrupts my habit, I must re-compose it: in one sense, this is an adjustment at the margins; in another sense it is a creation ex nihilo. The store at the corner from which I buy my paper goes out of business, and so I have to walk another block; the paper itself goes out of business and I need to get my news from a favored internet site—maybe I need to buy a computer.
Either way, the rupture in my habits can be healed, and the apparent “size” of the rupture won’t tell us much about what it will take to restore the habit: maybe the store that went out of business was owned by my best friend who just died, and speaking with him for a couple of minutes every morning was an integral part of my habit, and maybe I take swimmingly to the Internet. Either way, these are marginal adjustments: new signs must fill the rupture and I can assign values to each of the candidates, based upon the system of value already in place. But this revaluing also seeps through the entire system, and I am ultimately doing everything differently. And we have lots of habits and habits for articulating the various habits that normalize the world for us and make us idiosyncratic to ourselves. Politics is where we get into the habit of having our habits interrupt each other in a regular manner: “regular,” in the sense of common and sustained, but also in the sense of rule-based. When acting politically, we put forth our habits at their most interruptible at that point where the other’s also seems so. Where we both seem to be following the rule and yet applying it in incommensurable ways, there is where either of us might try to have our rule encircle the other. I find some way of following my application that disenables yours; and, in turn, I re-regulate your habit in terms of my application. This, of course, involves a way of talking about what we are doing: naming our practices so as to command us to follow my application; disobeying, in my discourse, the command your naming of your own and my practices would put forth. What we are looking for is where marginal shifts involve new compositions of the system—not necessarily revolutionary change, although sometimes that, but just as likely systemic relabelings of the “same” practices and institutions. Not necessarily all at once, but implicitly, perhaps putting in place a new command that will take years, even generations, to fulfill. The best spaces for such moves tend to be on the boundaries between imperative and indicative, executive and judicial, where a new set of imperatives and the habits supporting them are incommensurable with the existing indicative regime, or where those who habitually work with indicatives in detachment from imperatives seek to influence the imperative regime. For example: a politician who represented me would demand that President Obama and any other publicly responsible figure who believes that the interrogation techniques used upon captured combatants between 2002 and 2006 constituted “torture” and were therefore illegal do the following: not only must you seek to prosecute everyone whom you believe broke the law, but you must apologize to and pay reparations to all victims of that “torture,” including Khalid Sheikh Mohammed himself. Are we not a nation of laws? Mr. Mohammed has not been tried, much less convicted—in the eyes of the law he is as innocent as anyone else. Apologizing and paying reparations to Mohammed seems to me only a small and completely logical step forward from prosecuting the practitioners and lawyerly “enablers” of “torture”—and yet in crossing this boundary the habits of adherents to what I call the “human rights world picture” converge, suddenly, with the habits of those promoting massacres without limits.
“International human rights law” is, one might say, a set of imperatives seeking out the name who will ensure compliance with them—but they will never find it because the habits of international lawyers and human rights activists find no points of contact with the habits of those who enforce the law and might effectively oppose the will of tyrants. I would like to make it a political habit to expose this misfit, because believing in the efficacy of “international human rights” leads to the habit of composing sentences with lots of quasi-imperatives scattered aimlessly around (everyone should do, think and say all kinds of things which they would never actually think, do or say, and even if they did, it wouldn’t have the consequences it “should” have anyway). And exposing this misfit would, in turn, re-name the Nuremberg precedent as a “victor’s justice” nevertheless applied with enough impartiality to attain universality, so as to generate imperatives whose bearers might forge needed points of contact—the Nuremberg precedent might sufficiently justify and usefully circumscribe at least some wars, such that warriors might be happy to share the results with the lawyers and activists. To the extent that asymmetrical war waged by the stronger aims at protecting the victims of those who claim to be our victims, the ruthlessness of the warrior culture can be preserved and modified by innovative legal forms. The law can cover the spaces opened and exploited by those who fight outside of the inherited norms of warmaking only by norming, at the margins, the diverse and improvisational methods of counter-insurgency. The lawyers and activists have to study the habits of the soldier, rather than the reverse. This is not the direction in which we are currently headed.

Adam Katz teaches writing at Quinnipiac University in Connecticut. He is the editor of The Originary Hypothesis: A Minimal Proposal for Humanistic Inquiry and an editor of Anthropoetics, the on-line journal of Generative Anthropology.

By Carl Raschke

I’ve been sifting through the text of President Barack Obama’s speech in Cairo calling for a “new beginning” in relations between Muslims and the West. I’ve been looking for those portions of the text that would truly seem “inaugural,” if I may be permitted a little piece of “pomo-speak.” I have to say that I am impressed, and that there is almost nothing in the address that comes across as hackneyed, platitudinous, or downright fulsome, as one would expect of any political speech, even though this one is political to the core. Is it inaugural? Yes, truly, and the reason has to do with far more than what the White House itself is saying.

There were concerns voiced by Obama’s critics in the run-up to his historic talk before a sometimes approving, sometimes demurring (especially when he criticized violence against Israeli citizens in Palestine) audience of largely Egyptian students. The critics predicted he would spend a lot of time apologizing for America and its historical sins. He didn’t. They also warned that he might end up pandering to his Islamic audience. He didn’t do that either. Obama defied in this instance the effort of many of his critics to brand him as one who kowtows to enemies simply by making nice. Every nuance of his carefully crafted rhetoric was strategically designed to affirm many classic American foreign policy objectives while scotching any hint of we-stand-for-right-and-truth bravado or you-have-a-right-to-be-mad-at-us grovelling.
In essence, it was an effort to sound in his own way Kennedyesque by carefully articulating many familiar and venerable strains of democratic internationalism and idealism while repudiating the post-9/11 “clash of civilizations” idea. If Muslims are so much like us, while implicitly sharing our peaceful goals, there’s no reason we can’t all get along, Obama seemed to be saying.

The Los Angeles Times summed it up: “Obama’s style has been to cast himself as ready to lead the nation past the entrenched battles of the Clinton and Bush years and to ask Americans to look beyond old fault lines and accept a new politics of pragmatism and compromise.” He was attempting to do the same with Muslims. But if one takes a look at the wording of the actual speech, a more grandiose vision seems to emerge. The key statements occur in the fifth paragraph. “I’ve come here to Cairo to seek a new beginning between the United States and Muslims around the world, one based on mutual interest and mutual respect, and one based upon the truth that America and Islam are not exclusive and need not be in competition. Instead, they overlap, and share common principles -- principles of justice and progress; tolerance and the dignity of all human beings.” Obama’s “new beginning,” therefore, was to enunciate the lofty principles of the European Enlightenment, out of which America was birthed, and to invite present-day Muslims of all stripes in all nations to re-define themselves within that shining episteme. It was perhaps a rare meld of Kennedyism minus the “pay any price” polemics as well as Wilsonianism minus any kind of “make the world safe for democracy” rant. It even smacked of Kantian cosmopolitanism, summoning the rational-minded from all cultures and all faiths to step forward as one toward the global city on the hill. Obama of course didn’t mention Immanuel Kant, nor the Kantian categorical imperative on which all cosmopolitanism through the principle of a universal rational faith must be forged. But he came as close as any politician could come in describing the true “religion without religion” of the eighteenth-century Aufklärung. “There’s one rule that lies at the heart of every religion -- that we do unto others as we would have them do unto us. This truth transcends nations and peoples.” Obama, in effect, challenged the West as much as the Muslim world to forsake the postmodernism of the last forty years with its emphasis on the philosophy as well as the politics of cultural, religious, and personal identity. In a sweeping gesture that could only be made by someone with his wealth of symbolic capital for today’s international leadership - African roots, racial “hybridity,” American president, Harvard-educated apostle of global inclusivism - Obama called on the most identitarian of peoples to embrace a world vision that is far more European than Muslim. Although Obama, unlike previous American presidents, acknowledged the painful legacy of a colonialism that “denied rights and opportunities to many Muslims,” he challenged those resentful of that legacy to let go of it and embrace a set of values and ideals that post-colonialists frequently blame for the legacy in the first place. The historical causal nexus linking the Enlightenment, the industrial revolution, and colonialism has been fairly well explored in the scholarship of the past century.
But there is another factor that is almost routinely overlooked - the events of 1683 that not only sowed the seeds for the rise of post-Reformation and industrial Europe, but also for what has been called the great Muslim humiliation before modern Western military might that has shaped the last three and a half centuries. We are talking about the Battle of Vienna between the allied armies of Central Europe and the Ottoman Empire, seat of the last caliphate and champion of the dar al-Islam (“house of Islam”), that commenced on September 11, 1683 and concluded a day later. Like the legendary Battle of Tours in 732 A.D., which halted the first century of Islamic expansion and laid the groundwork for the rise of Charlemagne and the very idea of Europe, the Battle of Vienna set in motion the events that have configured the present conflicts. A respected Muslim colleague of mine some years back confirmed the historical significance of the date September 11 for many in his community, a date that was not in any way randomly selected by Osama bin Laden for his strategic attack on America by hitting New York’s Twin Towers. It was a commemoration by certain Muslims with long memories of what Westerners have either forgotten, or are for the most part ignorant of.

A Muslim chronicler wrote of the defeat at Vienna: “this was a calamitous defeat, so great that there has never been its like since the first appearance of the Ottoman state.” That calamitous turning of the tables, according to later historians, both fatefully weakened Ottoman power at its apogee and emboldened the nation-states of Europe, including Russia, to put deliberate and relentless military pressure on the once-mighty Ottomans until they became a ghost of themselves and collapsed in 1918 after their last-gasp attempt to save themselves by first seeking to adopt Western ways and finally allying with the Central Powers of Germany and Austria-Hungary. In the introduction to his blow-by-blow account of the Battle of Vienna entitled Vienna Anno 1683, Austrian historian Johannes Sachslehner writes that the year 1683 has become a profound mirror of the history of early modern Europe. It also tipped the balance in the minds of Europeans between the ultimate importance of faith and religion versus materialistic, technical, and military thinking in the definition of a people’s historical goals. In other words, it sparked the Enlightenment.

The “clash of civilizations” (regardless of whether it is understood in the way Samuel Huntington originally presented the thesis), which Obama acknowledged in passing as an historical reality, turns out to be something much deeper than any renewed Enlightenment-driven, pomo-Kantian, globalist internationalism can overcome. Kant himself was all about advancing the gospel of reason, including a nod even to the necessity of military conflict, in his own quasi-messianic anticipation of the coming of a commonwealth of the world’s rational-minded. “Today Europe, tomorrow the world” is the implicit message of his famous essay “Idea for a Universal History from a Cosmopolitan Point of View.” This “universal cosmopolitan condition,” according to Kant, is the last to be achieved. It can only be achieved, as the Enlightenment itself was achieved, after the exhaustion of endless religious particularist conflicts. “The greatest problem for the human race, to the solution of which Nature drives man, is the achievement of a universal civic society which administers law among men,” Kant wrote in 1784.
That was virtually Obama’s message in Cairo. However, such an inauguration today poses many deep, theoretical dilemmas that take us well beyond such familiar (Enlightenment) concerns as intolerance, historical victimization, and the persistence of old, combative habits. The dilemmas - perhaps we should use the proper philosophical term and say aporias - were not only trenchantly recognized, but deconstructed in context by Jacques Derrida in his famously opaque, but decidedly epochal 1995 essay entitled “Faith and Knowledge: The Two Sources of Religion at the Limits of Reason Alone.” Derrida takes up where Kant left off. Derrida recognizes the singularity of the religious as that which both responds to (“religion is the response”) and internalizes, according to what he calls its “logic of autoimmunity,” the Kantian exceptionality of radical evil. Kant argued late in life throughout his historical essays that this singularity, or exceptionality, seems to proscribe the dream of Enlightenment. Yet, Kant insisted in an argument that anticipates Hegel’s own “cunning of reason” (List der Vernunft) that the refractory exceptionality of radical evil, manifest in seemingly endless human conflict, propels the calculus of rational expediency and the quest for peace among sovereign states, and in the end (yes, the Enlightenment did have its own curious “eschatology”) the universal commonwealth of all rational beings. Again, something similar seems to be the gist of Obama’s foreign policy. Derrida, however, has his own prophetic insight into the impossibility of a dialectical resolution of history, of any List. The Enlightenment, like the 18th-century concept of religion on which the notion of the universal right of religious freedom is based, is a Graeco-Roman (and by derivation European) artifact. It is the cornerstone of what he much earlier named our “white mythology.” More difficult to think than the Kantian idea for a universal history from a cosmopolitan point of view is the return of religion, which is what 9/11 was all about. The extensions of the Enlightenment are the ideas of economic and scientific progress, the ideology by which colonialism was always justified, as well as what Derrida terms “globo-Latinization,” hence globalization in the neo-liberal sense. Religion has an entirely different genealogy. Its genealogy is not the globo-Latin, but the “world” in Jean-Luc Nancy’s sense of that which is shared intimately as a Mitsein within the space of particularity, that is, beyond any kind of conceptual or universal representation. This Mitsein is constituted by what Levinas called the “otherwise than being” of revelation itself. It is not subject to any Kantian axiomatic of “reason alone.” As the Qur’an itself says in the sura known as “The Spirits”: “knowing the unseen: God does not reveal the unseen mystery divine to anyone at all, except a Messenger with whom God is pleased; and God sends forth observers before and behind him.” The singularity of the religious, the sacred testimony, resists fiercely all globo-Latinization, according to Derrida. It is not radical evil - though it can become evil - so much as radical exceptionality.

Radical exceptionality is exactly what the world is up against, and any summons for a return of Enlightenment to counter the return of religion is prone to disaster. As Derrida says in “Faith and Knowledge,” the coming of this exceptionality, constituted not only as a return but as an “event,” ought to puncture every horizon of expectation. 9/11 was such a puncture. There are more to come.
The Enlightenment brooks no aporias, which was Kant’s project. But history - and the God of history if one wants to theologize about it - is a stern teacher, even for a rudderless West that desperately desires a Renaissance of its once-glorious secular imperium, that wants a new Enlightenment.

Carl Raschke is Professor of Religious Studies at the University of Denver.

By Carl Raschke

I have awoken just about every morning for the last six weeks with the sense that this year is going to turn out to be an historically decisive one in terms of world events - like 1968, 1989, or even 2001. All I can say to justify this sense is that I have it. Decisive years are different from specific events that are normally identified, or correlated, with such a time frame. Their eventfulness cannot usually be ascribed to one great thing, or event (in the singular), that happened and is remembered - the uprising in Paris, the fall of the Berlin Wall, what we now call “9/11,” etc. Philosophically, I draw my understanding of the event from Alain Badiou, who develops it from Deleuze. “The event is not actually internal to the analytic of the multiple,” Badiou writes on p. 178 of Being and Event (trans. Oliver Feltham, Continuum Publishing, 2005), that is, in our case, to the sequential generation of the moments of recent history. “Even though it can always be localized within presentation, it is not, as such, presented, nor is it presentable.” Badiou distinguishes between a fact (i.e., of “history”) and an event. Unlike the trace, which pertains to the text, or khora, which belongs to the empty spacings of logos in its coursings, spectrality is what stalks history. It is akin to Badiou’s event site. It is not a generatrix (as would khora), but a hidden and persistent parameter of what takes place. Yet it remains clandestine. The specter of Marx about which Derrida wrote in the early 1990s is neither joined (gefügt an) to the present nor available (verfügbar) to present cognition. Its time dimension is obscure. That is why it remains a specter. A fact has a simple, or singular, location within history, but no historicity. But the historicity of the event, for Badiou, is different from Heidegger’s historicity. The difference does not merely lie in Badiou’s efforts to contextualize the theory of the event in mathematics, in set theory. Events, according to Badiou, are localizable, yet not presentable. What does that really mean? Events are singularities that belong to the form-multiple of historicity. Events in this way occur always within what Badiou terms an evental site. An evental site is “an entirely abnormal multiple,” that is, a multiple such that none of its elements are presented in the situation. Only the site is presented. Badiou gives as an illustration a case of a concrete family, all of whose members were clandestine or non-declared, and which presents itself (manifests itself publicly) uniquely in the group form of family outings (p. 175). Badiou may have in mind, of course, the conspiratorial, Maoist cadres to which he once belonged, but that is really beside the point. The tension is not, as in Heidegger, between the revealing and concealing of that which is, but between our tendency to demand empirical substantiation of what we consider to be the case and our recognition - ever since at least the advent of quantum physics - that what we observe may not be what is there, or was there, because it has now been transformed through observation.
By now one should properly deduce that I am not warming up for one more tiresome pontification on how significant Obama’s presidency supposedly is. Obama’s election is only a fact of history - and a significant one, in terms of the order of multiples, to say the least. But there is no eventfulness to it. In fact, very little other than the obvious has really happened. I am saying, boldly, that we are coursing through an event site of which the significations have to remain hidden (according to Badiou’s theory itself). I’ve never been able to prove that Badiou all along has been reading Bultmann’s theology of several generations ago about the Christ event that is historical, though unintelligible to history itself. But these associations are not merely aleatory. It is not accidental that Badiou’s well-received book on St. Paul really complements Bultmann, or that Badiou himself is a source of growing fascination among a newer generation of postmodern academic theologians (though they all struggle to follow him half the time, as they once did with Derrida). Badiou is probably more instructive for latter-day “Bultmannians,” since he has unshackled himself from Heidegger, which Bultmann couldn’t. But I digress. What has compelled me to reflect of late along these lines is not a closer reading of Badiou, but a re-reading of the later Derrida, in particular his own eventful book Specters of Marx. Composed ostensibly in its historical situatedness as a rejoinder to the soaring popularity of Francis Fukuyama’s dawn chant (in his bestseller The End of History and the Last Man) to the rising sun of neo-liberalism upon the occasion of Communism’s worldwide collapse, Specters of Marx of course inaugurates an alternate messianism to Fukuyama’s: the messianism of the “democracy to come.” Now that Fukuyama’s world vision has itself gone up in flames, Derrida’s naturally seems more prescient, even though few have figured out what the latter was really going after. Derrida’s “messianism without a messiah” is not a readily usable political tool of analysis, even among the trendiest genus of current, postmodern “democrats,” and the slowly fading glow of messianic Obamaism will not make much difference in the long run either. After spending the past two weeks in conversations with my students in the advanced Derrida seminar I am currently teaching at the University of Denver, I can only say that Derrida’s so-called political writings are not about messianism so much as they are about “specters,” as the title of the book makes clear. I would assert boldly that the three “ages,” or stages, of Derrida’s philosophical development - early, middle, and late, as conventional nomenclature would have it - can be marked (deconstructively, to be sure) respectively by each of the following terms: “trace,” “khora,” and “specter.” Now, on a blog, I will not even attempt some arcane and elaborate justification, with proper proof-texting, of this contention. But I will say that the evolution of these terms can be fleshed out with respect to the larger themes, or philosophical issues, that Derrida took up - again respectively - in the unfolding of his prolific writing career. The trace calls our attention to the way in which the singular “presence-ness,” the material haecceity (as Deleuze would phrase it), of the written inscription resets the entire question of Being, which Heideggerian fundamental ontology sought to overcome.
The khora indicates the fertile and pregnant space of the mystical moment into which the trace vanishes - hence postmodern negative theology and the whole trail of neo-Derridean, religionless prayers and tears. The specter, however, is what haunts us at a particular time, and will not go away. The specter does not vanish, like the trace, but returns. It is a ghost, a revenant. Yet it is not something, simply like a ghost, that once lived and now cannot be stuffed away in mere historical memory. The Marxian specter, at least so far as Derrida meant it, never really lived to begin with. In that sense it is more closely akin to Badiou’s event site. It is not a generatrix (like khora), but a persistent parameter that remains hidden within the flux of multiples. It is a secret of what takes place. In 1993 Derrida perhaps had an uncanny intuition that Fukuyama’s realized apocalypse of absolute spirit in the form of global capitalism was a fantasy rather than an affirmation. On the other hand, his sense of the messianic-democratic may also, read in hindsight, prove itself to be something of a fantasy as well - a French one, possibly, of which every recherché experience on Bastille Day offers some vague inkling. If subsequent history provides guidance, the title of the book could just as well have been Specters of Mohammed. The messianic spectrality of Derrida’s political writings is more haunting than politics itself. In 1993 Derrida had not totally recognized, or begun to come to terms with, the specter of his own Judaism anyway. Such spectrality draws us closer, I believe, to what Badiou was thinking about events.

The phenomenon of Derrida itself can be seen perhaps as an event-site which remains invisible against the tangible presentation of the intellectual and cultural history of the last thirty years. We have been attentive to something clumsily marked as “the postmodern,” but we have been oblivious to what might be spectrally stalking us. Particularly in America, we have reframed rather tired old debates in terms of the alleged influence of this “specter,” but we have failed to appreciate its own power as that which is still to come (avenir).

In the last decade postmodern thought has largely degenerated, like a CNN or Fox News segment, into a parade before the camera of established “experts,” or luminaries, coming from opposite sides on the same-old-same-old and taking strategic shots at each other. The current scholarly celebration of the conversation between Žižek and Milbank in The Monstrosity of Christ is a case in point. I quote from the book blurb by the publisher: “In this corner, philosopher Slavoj Žižek, a militant atheist who represents the critical-materialist stance against religion’s illusions; in the other corner, ‘Radical Orthodox’ theologian John Milbank, an influential and provocative thinker who argues that theology is the only foundation upon which knowledge, politics, and ethics can stand... Žižek and Milbank go head to head for three rounds, employing an impressive arsenal of moves to advance their positions and press their respective advantages. By the closing bell, they have not only proven themselves worthy adversaries, they have shown that faith and reason are not simply and intractably opposed.” For something like this conversation to receive so much attention fifteen years ago would have been unthinkable. I haven’t read the book yet, and I’m sure it’s illuminating in many ways, but come on! I’m reminded of the BBC debates between Father Copleston and Bertrand Russell in the 1960s.
Erudite atheist versus erudite theologue. Only the names and the styles of argument have changed. Plus ça change, plus c’est la même chose. Since when was postmodernism about deciding on what foundation to rest classical arguments? I thought foundationalism had been left in the dust. When what was once avant-garde becomes merely a cool kind of retro, you know you’re in a rut. But, as Hegel said - actually rather cryptically - about the painting of grays in grays, something may be happening here, but you don’t know what it is. As Derrida points out, specters appear through conveyance as much from the future as from the past. This bidirectionality, or bi-vectoring, of their appearance, which is key to what Deleuze characterizes as an event in keeping with the “logic of sense,” renders any representation of what is actually happening impossible. Specters of something that remains so hidden by what has gone on, even in the now foregone age of Derrida, that people - as Marx said of the regimes of Old Europe - try either to ignore it, hunt it down, or stamp it out before it manifests. Since blogs are supposed to be more chatty and down-to-earth than any philosophical disquisition, if not perhaps running commentary on what is current and “newsworthy,” I hope you will pardon me for refusing to name whatever I sense. I could say I seem to sense this or that, but specters are always ambiguous and ambivalent. This blog was inaugurated last fall, just about the time that the world economy seemed to be careening toward collapse and a new political order seemed to be sweeping in on a heady tide from the future. If what was occurring during that brief time interval could have been sensed as a messianic moment (after all, the messianic irruption is always preceded by great tribulation), something else is wafting in on the breeze. I call your attention to the latest editorial in London’s Financial Times. The time may be “out of joint,” as Shakespeare says and as Derrida cites as the condition for the spectral manifestation of the messianic. But the way things “appear,” according to the editorial, has become increasingly opaque and unintelligible. The Times writer quotes another famous line of Shakespeare - a lot of sound and fury signifying nothing. One is reminded instead of Nietzsche’s specter at the opening of The Will to Power. Nihilism stands at the door. Whence comes this strangest of guests, the most unsettling of any revenant?

Not a “democracy to come,” but something simultaneously even more glorious and severe. What is to come must first overcome. Nietzsche was transfixed with a sense of the overcoming of humanity, the overman. But after a false messianic dawn has passed, a new eventfulness is brewing. What is more glorious than the specter Derrida names the impossible. Specters of the eschata.

Carl Raschke is Professor of Religious Studies at the University of Denver.

By Adam Katz

“Everything stays the same but composition,” says Gertrude Stein. So, what is “everything,” and what is “composition”? (For that matter, what is the “same”?) Everything is all that falls below the threshold of our attentiveness, what remains as background, noise, the field of semblances, subsumed within habit. Composition is the raising and lowering of that threshold. In her “Reflection on the Atomic Bomb” Stein asserted that “What is the use, if they are really as destructive as all that there is nothing left and if there is nothing there nobody to be interested and nothing to be interested about.
If they are not as destructive as all that then they are just a little more or less destructive than other things and that means that in spite of all destruction there are always lots left on this earth to be interested or to be willing and the thing that destroys is just one of the things that concerns the people inventing it or the people starting it off, but really nobody else can do anything about it so you have to just live along like always, so you see the atomic [bomb] is not at all interesting…” The atomic bomb is the same as everything else that destroys—if unleashed, it will kill 10 million, maybe 100 million, maybe more, but that’s just some amount more than is killed by a murderer or garden-variety terrorist. If we go about living our lives when we hear about the latter, we will do the same when we hear about the former. Our lives may become different, but what do you mean by that—at what point does more violence and destruction mark a qualitative transformation compared to the previous “level” of violence and destruction, and at what point does the bundle of activities comprising our daily habits, which is anyway always undergoing silent revolutions, become a different way of living? There are still lots of interesting things and lots of ways of being interested in them and the rest we can’t do anything about.

What ties us to the world are ostensive and imperative signs: our desires and rivalries produce rifts, within each of us and between all of us; in the midst of these desires and rivalries for centrality we stumble upon the materials displaced by the nihilistic tenor of our urgencies and find ways to piece them together—we attend from some material (say, some previously overlooked habit) which is thereby converted into a sign and to the object we were pursuing, interrupting our pursuit, creating shared distance. Even more, we compose those materials, attending from and to each of its “parts” in turn, conferring upon it a formal and transcendent reality. The object we were pursuing is now framed, and accessible only via established rules and rituals.

These signs are named, and we command the names to remain in place until the names command us to model our activities on the process of their own composition. Our discourse, our sentences, do little more than transcribe these commands, but what is interesting is that we get them wrong. Error is co-constitutive with norm: we imitate the small details when it is the broader intention we must be limning; we become “big-picture guys” when the devil is in the details. And so habit becomes the idiosyncratic composition of the center, as we establish commerce between all of our names, establish conformity across the field of imperatives we obey, and are then driven into new desires and rivalries by strange names and commands that are nothing more than the malapropisms of our habits. So, what, exactly, do we think will happen with the global economy? Unemployment will go up—by how much? 2 points? 4 points? 10 points? Will our credit cards no longer work? If we move our money out of the volatile stock market, will our insured bank deposits then vaporize? Will agriculture cease; will there no longer be anyone to transport goods to market?
I wouldn’t discount any of these possibilities—although I will note that I almost never hear any descent into such specifics in all the panic talk (which, I also note, at times suddenly morphs into speculation regarding whether we will start coming out of the recession this fall or next spring—in short, nobody knows anything). Habits won’t cease, though, and just as we can raise the threshold above which we notice difference, we can lower the threshold—more likely than government finding a way to restore corporate health or put people to work is people establishing new economic and social networks on the margins of their intersecting habits. And we will thereby rediscover the laws of complementarity: if the market crashes, maybe that is just a return to the true value of the commodities circulating through it; if official money ceases to measure anything reliably, people will find other measures; if the rules seeking to prevent in advance insecurities, violence and error start to paralyze creative activity, people will seek out new trade-offs between these various goods; if regulators are, as they likely will be, as unable to predict values 2, 3, 10 years down the road as the participants in the market themselves, then transparency will be the only check on inordinate risk and will itself become among the highest of values. Perhaps strategies—of the kind one would expect all good postmodernists to applaud—of fleeing established centers, which become chokeholds with increasing rapidity, and establishing novel ones, will proliferate and not so much “resist” domination as seek to render it incoherent.

I confess I am less sanguine about the Global Intifada. The Global Intifada might best be seen as the embodiment of Aimé Césaire’s remark, prescient, insightful and vicious all at once, that the West only cared about the Holocaust because the victims were white. It is true—the mass industrialized slaughter of Europe’s Jews became the foundational event of the postmodern, victimary era, because it—in the light of the new possibilities revealed by the atomic bomb—disclosed the possibility for universal destruction at the heart of the rivalries among the Western powers that culminated in the immolation. It is true that in the event itself, the colonized world, those behind the “color line,” were a side show at best. Césaire’s remark also revealed, though, that the future of the event lay in the rise to centrality of this side show—that the ethical effect of the virtually universally shared horror at the scene of Auschwitz would be to place under the severest scrutiny, even if not all at once, every invidious distinction, even the most implicit, between one category of humans and another; and every claim to expert neutrality, scientific objectivity and procedural probity, which had all after all just been put to work in discovering, justifying, and implementing the most invidious of imaginable distinctions. And it is vicious in the way it sets the terms for this overturning of margin and center: in the end, the Holocaust must be taken away from the Jews, and what better way to do so than to represent the Jews as the new Nazis?

Barack Obama, in his inaugural address, announced that the world is changing and we must change with it. A sentence both leaden and brutal.
He must know how, exactly, the world is changing (he has commanded the world to come together as a model and has mapped out the imperatives we must follow in adhering to that model) and is either objectively describing the laws of reality (of which he is the mere executor) which will force us to change accordingly or letting us know that he is determined, as voluntaristic subject, to force us to change. The world is always changing, and each of us is always changing at some angle to those of the world, and the world is no more than all these angles. The Progressivist imperative, reiterated here in warmed-over fashion by Obama, has always been a nihilistic metaphysics—now it has become the defense attorney of a burgeoning death cult. The dictum composed by the “imperialist” bogeyman Churchill is far more valuable than Obama’s hideous bromides: democracy is the worst system, except for all the others. We will always be dissatisfied with democracy; that dissatisfaction will periodically reach such a pitch that we are tempted to throw it all away for some other system—formerly, more authentic and unmediated; now, less wasteful, smarter, more inclusive and shock-resistant, until we acquire the static hysteria of the blackmail victim who is not quite sure that he’ll never run out of ransom money. And we will always come to realize that precisely this set of dissatisfactions and the way it sets and sustains each of us amidst and among all the rest is democracy. Or at least we can always be coming to believe that such is the case. Gertrude Stein also said that she liked having habits but didn’t like others talking about her habits—this by way of explaining why she wasn’t a utopian. Having habits, loving one’s habits, riding one’s habits, slavishly following one’s habits, finding extensions of one’s habits in the world and getting into the habit of finding in the world providential interference with and cause for revising one’s habits—this is not a bad definition of freedom. Having other people get into the habit of explaining one’s habits, cataloguing them, diagnosing and reordering them in accord with some template—that is not a bad definition of tyranny. Deeper than liberalism and democracy is the imperative order—the realm in which commands are spontaneously issued and obeyed, where the proximity of emergency is more real than the ever-lowering threshold of victimage. Habit is deeply rooted in the imperative order—it is an idiosyncratic method of preparing for emergency by keeping sharp the distinction between what must be kept close at hand and what can be let go. Like habit, the imperative order is most effective when unnoticed: that is, when security is ensured unobtrusively. And it is therefore easily forgotten or demonized until it is essential. At some point the dominant men in the community must have willingly devoted themselves to defending the weak against other men like themselves, and this initiated the process whereby they came to subordinate their own imperative order to the declarative order of principles (“all men are created equal”). Only then is freedom possible—only when those willing to risk their own lives to ensure that participants in exchange can complete their exchange unmolested outnumber those for whom exchange is fraud or easy pickings do we have freedom.
The perpetual composition of habits is sensitive to such conversions; indeed, it may be that the transformation of rituals into habits relies upon this kind of conversion—Stein insisted that verbs were interesting because there are so many ways they can be mistaken; in my last post I argued that sentences transform names into recipients and sources of imperatives, and now I can say that the connection between verbs and imperatives lies in the fact that it is first of all imperatives that can so easily and interestingly be mistaken—and so sentences are in essence the collaborative process of converting those errors which arrest our collisions into the material of norms. The continuance and constant adjustment on exposure to reality of our own habits depend upon the covenants among those who seek mutual insurance for the errors consequent upon their imperatives. And so the vast field of centrifugal, eccentric habits depends upon and flows back into those social sites based upon explicit, publicly shared habits. Having the courage of our habits will enable us to affirm reality in the errors of our self-issued imperatives, the ones we forget and call habits, and that provide us with a source of revelation in anything that suddenly lies outside of our habit as part of our composition. The convergence between novel compositions and ferocity harnessed and directed toward those who would hold civilization hostage by targeting its most vulnerable members and interstices—this is the answer to the Global Intifada. It’s not the answer we seem ready to provide right now, but it is encoded in the habits and composition of freedom—freedom, nothing more than no one, including you, knowing what you are going to do next, what you are about to tell yourself to do and what will then count, for you, as having done whatever you have come to be told by yourself.

Politics is for protecting the dominion of habit and helping it become self-reflexive and open to novel compositions. The inertia of the other’s habits needs to be converted into the materials of one’s own composition. We might think about such a politics as a series of assignments we give to each other, assignments that would take the general form of, “by all means keep doing whatever you are doing but just take into account this one thing I’m doing—and I will begin by doing what I’m doing and taking into account one thing of yours.” These should be the rules of political exchange—anyone who’s not ready to go first in some proposed reciprocity should be boycotted. Convert the courage of your habits into the habit of encouraging others to compose with your habits.

Adam Katz teaches writing at Quinnipiac University in Connecticut. He is the editor of The Originary Hypothesis: A Minimal Proposal for Humanistic Inquiry and an editor of Anthropoetics, the on-line journal of Generative Anthropology.

By Colbey Emmerson Reid

Neil Marshall’s 2006 horror film The Descent is about six female high-risk adventurers who get lost in an uncharted Appalachian cave system inhabited by cannibalistic monsters. These turn out to be an undiscovered species of southerners produced through inbreeding, perfectly adapted to live alone in solitary pockets of the mountains until it’s time to go hunting. You know, just like the real ones. Naturally, the entryway to the cave collapses, the spelunkers are separated, and the women’s only hope of survival is to find another exit while evading the creatures.
The women who try to outwit the monsters with their more highly evolved brains, like the English teacher and zoology doctoral candidate, who determines that her opponents hunt by sound and can be eluded by silence, are the first to go. The only way to survive against the so-called “crawlers” is by becoming as feral as they, which is to say, by hacking them to pieces with climbing tools before they rip your throat out. The “descent” of the title thus alludes to Darwin’s The Descent of Man, in which the biologist explains that animals differ in gradations from human beings rather than qualitatively. The women in the cave discover that their distinction from the extant cave men and women diminishes the longer they survive, thus illustrating a downward drag within the principle of natural selection. For instance, one of the women, Sarah, cowers in a hole in the wall for a little while, watching the crawlers feed on one of her friends. When they’re distracted by the sounds of the other lost women who are shouting for each other in the dark, Sarah emerges to try to help a second friend, Beth, who’s wounded, immobile, and clearly the next course. Sarah has a good cry and considers crawling back into her nook, but in the end instinct takes over. She bashes Beth’s brains out with a rock so she’ll be dead when the crawlers get to her, and then bludgeons her way through a crowd of monsters until she falls into a sinkhole. When Sarah emerges, gasping for air, she pulls herself out of the hole, kills another couple of crawlers, and lifts her weapon high in her best conquering hero pose. The blood and mud on her face make her look like a highland warrior, an archetypal figure for victory won through violence—ironically, though, the Scottish herdsmen who cultivated a reputation for hair-trigger tempers and merciless reprisal in order to protect their sheep from thieves are also the ancestors of Appalachian mountain people. Sarah’s competition for survival draws out her likeness to what she is trying to escape. In one scene, she butchers a family of crawlers offered by the film as effigies of Sarah, her husband, and her child. The evolutionary nightmare is only half the story. The other half is a psychological thriller focused on the aftereffects of the deaths of Sarah’s husband and daughter, impaled in a head-on collision with a pole-laden car on the way home from a white-water rafting adventure. After navigating the rapids while her family watches, Sarah notices that her husband seems distracted and Beth notices that he’s very attentive to one of the other women, Juno. The car crash occurs shortly afterwards; Sarah’s husband looks away from the road when she asks him why he’s been distant, and the next thing Sarah knows she’s waking up in a hospital, looking for her child, and then sprinting down a corridor as the lights behind her blacken. At the end of the hall, she collides with Beth and the news that her family is dead. The descent into the cave one year later parallels the corridor in the hospital, the former echoing the latter and framing the “adventure” as an effort to escape a psychological descent into darkness. Sarah struggles with hallucinations of her daughter that paralyze her; we watch her leave the present, her face going blank and her body limp as her mind becomes absorbed in time. Crawling through a tight pipe later on, Sarah has a panic attack, and Beth’s words as she’s trying to calm her gasping friend say it all: “the worst thing that could happen has already happened.
There’s nothing else to be afraid of.” Of course, she doesn’t know about the inbred southerners yet—but it’s clear that Marshall means to carve two stories out of one, to trump the trumped-up creature feature with a real-live horror, the kind of horror anyone might experience and be ruined by. In a genre that always offers survival as the ultimate prize—in this sense, all horror films are the invention of evolutionary biology—Marshall wants us to wonder whether staying alive is enough. By introducing an ordinary trauma into the Darwinian parable, Marshall thus produces the kind of “monstrous misreading of Darwin” that Elizabeth Grosz (The Nick of Time: Politics, Evolution, and the Untimely, Duke University Press, 2004) has attributed to Nietzsche, who contended that Darwin’s construction of evolution produced an environmental fatalism designed to make nothing more than “the weak…the herd…the servile…the low…the mediocre”—in short, “the boring” (100-101). Nietzsche represents himself as the “anti-Darwin” (101), a champion of the exceptional, the unrepeatable. Against the vicissitudes of natural selection he posits his own principle, the will to power, defined as a longing—indeed, a demand—not simply for adaptation but “exaptation” (101): the excess of survival. In The Descent, while the other adventurers are stock characters whose fitness is tested by circumstances, Sarah conquers the crawlers because her past provisions her with a will to seek something beyond the merely human as it is defined by biology and produced by nature. The will to power, which is a form of playfulness with the vast waste, or inutility, that Nietzsche discovers everywhere within the system to which Darwin ascribed scarcity and efficiency, wants more than simple subsistence and reproduction. It seeks profusion, luxury, abundance, an elevation of the organism above itself. What Sarah must discover in the cave, an allegory for her mind, is whether she will be rendered mentally and emotionally dead by her past: will she just get by, or will she find her way to joy again? It is not Sarah, though, but Juno who first exerts the will to power, and who invites Sarah to do so too. She reminds the group of their motto: “if there’s no risk, what’s the point?” and ensures that their adventure will be dangerous by filing the wrong cave with the forestry service and lying to the others about the difficulty of the cave they’ll be exploring. Juno’s extension of the invitation to risk is designed to help Sarah as well as herself. She apologizes to Sarah for leaving so quickly after her family’s death the year before, telling Beth, who berates her for it, that “we all lost something” in the accident. Though it sounds like Juno is rehearsing the cliché of her lost sense of immortality, in fact we know, because we’re shown the tender glances she exchanges with Sarah’s husband right before he dies, that Juno is speaking literally: she lost him. Juno wants the latest adventure to be an occasion for rejuvenation. She tells Sarah, once they’re trapped in the cave, that she wants them to discover something new and name the cave: “Maybe your name,” she offers—and Sarah challenges, “Or yours.” This is the classic scenario of the will to power, which either shapes or is shaped by matter. Sarah misunderstands. She accuses Juno of endangering the group by taking them on an ego trip, but she’s got it wrong. Neither the adventure nor the will to power is about ego. They’re about wagering the self against an expansion: they’re about transcendence.
For Juno, the appearance of the crawlers is a fortunate disaster. They raise the stakes, as the will to power desires the greater tension that will drive it toward greater accomplishment. Juno’s goal is to make it out alive with Sarah so that both women can be reborn, in part because the rebirth of each is dependent on her alliance with the other. When Juno discovers markings on the cave walls suggesting an alternate exit, Sarah is lost, and Juno tells the other women that she’s not leaving without Sarah. Her refusal to leave without her rival, whom she treats only as an ally, testifies to her desire to extract herself from the conventional rivalry narrative. Juno wears a charm around her neck engraved with the words “love each day,” a testimony to the renewed jubilation she seeks to uncover for herself and her friend by refusing the conventional narrative of envy and competition between women.

Indeed, Marshall suggests the alternative paradigm through an implied lesbian relationship between Juno and a younger woman euphemistically referred to as her “protégée.” In the context of Juno’s Roman name, a relationship that might otherwise seem maternal is eroticized. Juno is determined to offer Sarah this alternative as well. But Sarah recognizes “love each day” as something her husband used to say. When she and Juno meet again after wandering alone through the cave, everyone else is dead. They follow the markings Juno found on the wall, butchering crawlers together with easy grace. It appears as though they will triumph together and emerge reinvigorated by their survival—which Marshall has begun to trope as trivial; it’s just not that difficult for them.

At the last moment, however, Sarah confronts Juno with the engraved charm necklace Beth snatched from Juno’s neck before falling into the pit with Sarah. Sarah knows that Juno left Beth for dead, and that she’d hoped the knowledge of Sarah’s husband’s unfaithfulness would die with her. Juno is trying to rewrite history in order to produce a more bearable future, a future that will not end in descent, the living death of those buried alive by an emotional catastrophe. She doesn’t want to be merely alive. But Sarah is caught in the throes of ressentiment, an inability to “digest the past and be rid of it” (116). For her, the past keeps coming on strong; it “returns to haunt the present,” literally, in Sarah’s recurring fantasy of her daughter’s birthday party, and thereby overshadows all futures.

Marshall suggests that it is not environmental fatalism that ruins Sarah but her own inability to become “untimely” by removing herself from the constraints of a present too fully circumscribed within the past. Her alternative is to extract something from the past which will reshape the present, and it’s easy for the audience to see how Sarah’s history could explain to her the plausibility of difference, a “tension with the present which [could] move [her] to a future in which the present can no longer recognize itself” (117). We—like Beth—have noticed that Sarah’s marriage was falling apart anyway, that in losing it she wasn’t losing anything. Sarah’s anger with Juno is all about revenge for a loss that should now be beside the point. Juno seems to have understood this; she turns to her friend to help recover from the loss of something she too never had. Nevertheless, Sarah turns her weapon on Juno in a fury; the only new story she can put together is one in which her daughter dies because her husband is distracted by Juno.
She hacks into Juno’s leg with a scythe and leaves her crippled and weaponless to a fresh onslaught of crawlers. Sarah makes it out of the cave, and Marshall plays up what will become the film’s great irony by accompanying Sarah’s emergence from the ground with a high, spiraling camera and majestic orchestral music. She claws her way up a tunnel lined with bones, pushes her hand through a thin covering of dirt and moss, and hoists her body from the earth in a scene the audience recognizes from vampire and zombie movies. This should be our first clue that all is not as well as it seems, since in horror movies that which crawls from the ground is only ever “undead.” When Sarah’s head pops out of the ground she draws a deep breath, a baby’s first breath on emerging from the womb: she thinks she’s born again, having slain her demons (avenged her child’s death), which she believes—mistakenly—to be Juno.

Back at the car, Sarah careens out of the woods, pulls over to the roadside, and is nearly hit by a truck. This is the first hint that the past isn’t vanquished, that she’s in the throes of the demonic repetition of that other post-adventure head-on collision. Sarah recognizes this, but she still thinks she’s won. She thrusts her head out the window and vomits water like a drowned woman resuscitated. Slowly, with relief, she draws her head back into the car. She’s ready to drive away from the cave now; it’s over. She looks to her right and finds—the bloodied ghost of Juno in the passenger seat. The screen goes black: “the darkness” (the film’s original title) that Sarah’s been running from since the hospital corridor is upon her.

The ending, revised for American audiences who wanted a happy ending from the British version, in which Sarah dies in the cave, questions whether mere survival can constitute happiness (the competing resolutions illustrate the difference between American and continental philosophical constructions of happiness). Sarah gets out of the cave alive, but she’s trapped in the madness of a future that only repeats the past with a different face attached to it. Having traded hallucinations of her dead daughter for hallucinations of her dead friend, unable to think her way out of the misery of eternal repetition, she is caught in the hell of a Darwinian descent in which beings become only that which has been selected by their history. Sarah has been reborn exactly the same, and the real horror of “The Descent” is not about crawlers or even being trapped alive and forgotten in a cave beneath the earth. It’s about the inability to create a new self through rebirth, the bitter disappointment of a self that becomes merely itself again. Sarah’s hope for something else has been left in the cave with Juno, her leg chopped off. But while Sarah’s survival is hardly a happy ending, her friend, who submits to the strike rather than fighting back—as she is clearly equipped to do—has realized the Nietzschean promise of the Overman by not merely accepting but willing the accident that happens to her, thereby achieving the nobility of “a kind of happy self-annihilation” (102). The Overman is a more-than-human being which ascends not by evading, forgetting, or even remembering the past but by willing its eternal return. For Nietzsche, the eternal return is the extraordinary acceptance of fated events as willed events, an invitation to them to happen again, no matter how mundane, humiliating, emotionally withering, or physically destructive.
While Juno could have elected to defend herself against Sarah the way she did against the crawlers, this would have been to fall back into the clichéd narrative of erotic rivalry between women and deny herself the new being she’s been pursuing in the cave. “Becoming master involves a fresh interpretation, an adaptation through which any previous ‘meaning’ or ‘purpose’ are necessarily obscured or even obliterated…it is not too much to say that even…death…is among the conditions of an actual progressus”; indeed, “the magnitude of an ‘advance’ can even be measured by the mass of things that has to be sacrificed to it” (Nietzsche, quoted in Grosz, 108-109).

Marshall seems to think that the original ending, in which Sarah kills her friend and then purposely succumbs to the crawlers, would have been “happier,” as it would have reflected the Overman’s will for the eternal return of her loss. As it is, all that can return is more of the same—The Descent 2 is due to be released in 2009, with Sarah haunted by hallucinations of an event she neither remembers nor understands, forced back into the cave that nearly killed her, doomed to the horrible repetition not only of her trauma but also—should she survive again—of the disappointing rebirth of an identical subject. Marshall offers a cautionary fairy tale against being born again into a demonic repetition, and proposes conventionally “sinful” behaviors (lying, murder, lesbianism, suicide) as lining a path towards the paradigm shift that constitutes a genuine resurrection story.

Colbey Emmerson Reid is Assistant Professor of Modern Literature in the Department of English and Humanities at York College of Pennsylvania. She can be reached at creid@ycp.edu.

R. G. Collingwood exposed a basic principle of Western metaphysics when he noted that “[t]he logician’s proposition seemed to me a kind of ghostly double of the grammarian’s sentence… Grammar recognizes a form of discourse called the sentence, and among sentences…one kind which express[es] statements. In grammatical phraseology, these are indicative sentences; and logicians have almost always tried to conceive the ‘unit of thought’, or that which is either true or false, as a kind of logical ‘soul’ whose linguistic ‘body’ is the indicative sentence.” Eric Gans’ originary hypothesis enables us to pursue further the implications of this observation regarding the reduction of claims about reality to true or false propositions modeled upon the indicative sentence. Gans defines metaphysics as that form of thought that presupposes the primacy of the declarative sentence. Metaphysics thereby obscures the primacy of the ostensive sign, and the secondariness of the imperative. The purpose of this obfuscation in ancient and modern Enlightenments is to neutralize the power of the ostensive: as the originary sign, the ostensive defers violence and constitutes the community around a central, sacred object; however, once contending sacralities struggle to occupy the same space, the power of the ostensive becomes a source of violence. Metaphysics attempts to do the work done by ritual in the more compact community—it defers violence in a world where market interaction breaks the bonds of ritual by placing the representation of a reality that transcends all specific desires and demands at the ethical center of society. It is in his first book on the originary hypothesis, The Origin of Language (1980), that Gans traces the emergence of the declarative sentence from the originary scene.
As I pointed out in my previous post, the originary sign “saturates” the scene which it constitutes—it would best be “translated” as the Name-of-God, but a more “literal” translation would be something more along the lines of God-whom-you-must-not-encroach-upon-on-pain-of-immediate-cataclysm-stop-right-there-don’t-dare-take-that-next-step-you-are-considering-right-now… As long as the ostensive sign is the only sign, it would generate such scenes, resolving imminent conflicts, enhancing the cohesion and “inter-operability” of the nascent community. Its iterability and usefulness, though, would keep lowering the threshold of mimetic rivalry at which it could be introduced—more simply, it would become increasingly mundane as the singularity of its designation of the center is modified by its more variegated deployments. I am, then, suggesting the emergence of something like a “vocabulary,” of a system of signs alongside the transcendence of the originary sign. Somewhere along the way a human being issues what Gans calls the “inappropriate ostensive,” designating an object that isn’t there. When the inappropriate ostensive is met by a retrieval of the named object, it becomes the imperative sign. Since the threshold for putting forth a sign has been lowered, so that signs become meaningful in non-critical situations, imperatives can alternate with ostensives in minimally conflictual ways—indeed, imperatives can only be meaningful with an ostensive at their “end.” If I demand something of you, not only is my demand not fulfilled, but I have no evidence that you have understood my demand, or even that I have made it correctly, until you produce whatever it is I am demanding. In this world of ostensives and what we can call ostensive-imperative articulations, the uses of signs are still limited by the proximity of some object. Objects demanded, acts commanded, must be produced and carried out within some very determinate temporal frame—otherwise, the linguistic acts will “expire.” The creation of a “reality” that transcends the immediate proximity of some object that might be shared, contended over, requested, requires a different linguistic form. For Gans (still in The Origin of Language) the origin of that new linguistic form, the declarative sentence, is to be found in what he calls the “negative ostensive”: an imperative goes unfulfilled (it can’t be fulfilled; it is refused—could there be any means of making this distinction at this point?), but remains unappeased. Conflict arises, and at least a mini-crisis looms. Neither “interlocutor” wishes to push further, but neither can simply retreat. The imperative is “softened” into an interrogative (the imperative repeated in a less peremptory way); a question creates space for “dialogue” that a command or demand excludes. The word remains the same; only the “tone” and “posture” of the “interrogator” change. And the word remains the same when the “answer” is put forth: the same word, “inflected” differently, instead of the object. Once this linguistic exchange is completed successfully, we have the first sentence: topic + comment, (the object) (not here). A space for the declarative sentence has now been opened up—if the “claim” that the object is not here “makes sense” (appeases whoever makes the demand), then it would make sense that it could be elsewhere, anywhere else, and other “predications” become possible.
At some point (and here my own hypothesizing takes over from what I hope has been an accurate account of Gans’ argument in The Origin of Language) the ostensive is conjoined with a fresh imperative, issued by the recipient of the original one, as there will be questions that require a path, however mediated, back to the object if they are to be “answered” (and not revert back into more menacing or importunate imperatives). Here we would have a clear separation and articulation of parts of speech (Name-of-Object—Place-Name/Direction), and once imperatives can be placed alongside ostensive naming, the name is subjected to imperatives in general and thereby embedded in a sustainable reality, beyond anyone’s grasp. Here, we would have a noun and a verb. It helps, I believe, if we view this ostensive-imperative linguistic articulation as a sequence of “inappropriate” usages on a succession of scenes. Since our very distinction between different speech-act forms is artificial at this point, the first imperative could easily have been put forth (“intended”) more as a kind of “adjective”—it would become a verb, and become “transitive,” once acted upon and hence extended beyond its site of articulation. In turn, such an imperative issued by the recipient of a demand back to his interlocutor (following an insufficient negative ostensive) gets transferred to the object; once transferred to the object, the imperative could be coming from anywhere (imperatives have already been exchanged with God in the ritual scene). The sentence is itself, in a sense, an imperative directed toward the object—all these vectors of command enmesh the name-of-the-object in reality, out of anyone’s reach, turning it into a source of imperatives itself. Now we would have genuinely autonomous objects moving through what I would propose calling a “field of semblances.” If a “semblance” is something that is simultaneously sign and object (object when we—to use Michael Polanyi’s terms—attend to it, possessively; sign when we attend from it to something else), reality itself is a field of semblances created by us through the use of sentences, which give objects their own life and thereby require that we devise formal mediations (compromises and covenants with objects that add new layers of complexity to our reciprocal compromises and covenants with each other) in order to arrange for their reliable availability, as signs and objects. Sentences, moreover, exist and sustain themselves in a swirling pool of imperatives and interrogatives, and are best made sense of as deferring and incorporating imperatives (through the mediation of questions). Even more, they themselves generate imperatives, upon whose acknowledgement their intelligibility is contingent, directed toward the field of semblances: “understanding” a sentence would involve obeying or resisting, iterating and complementing in words and deeds the ways it orders (strongly suggests/politely requests—anyway, one’s understanding is mediated by the softening of the imperative into a question) you to disperse the field of semblances around the name. So, to put it a little idiosyncratically, a sentence is a name turned by order from marker of dangerous convergence to event. A noun and a verb. Imperatives come from all over, but let’s reduce them to three possibilities: from another name; from reality; from the name itself.
I’ll put forward the following hypothesis: in any sentence (the exceptions will almost always be verbs that are tied to questioning), we can translate the verb into an imperative coming from one of those three origins. Or, more precisely, we can assign some range of probability for imputing it to some distribution of those origins. The composer of the sentence thereby preserves the name and insists you compose another sentence anticipating and forefending some possible convergence upon it. It is then in the subsequent sentence that the distribution of possibilities in the previous one is established, as that subsequent sentence begins with the embedded request to authenticate ostensively the imperative embedded in the verb of the previous one.

Originary grammar is the mode of thinking within sentences, tracing the paths from ostensives (what was settled being put out of order) through imperatives (put it back in order!), the stalling of imperatives and their softening into interrogatives (what’s the best way of ordering?), and into the declarative sentence (here’s a range of possible orderings) and back into imperatives (tradition/reality/conscience dictates that…) that, fulfilled and ostensively “authenticated,” settle things just enough to maintain a tolerable threshold for the emergence of new ostensives. Thinking is itself obedience to the imperative (self-issued? on compulsion from reality, rightly perceived? divinely imposed? Perhaps the quality of thought is at stake) to suspend obedience to all imperatives so that the various possible circulations of those imperatives can be iterated in sentences, sentences that keep deferring the imperatives embedded in previous sentences. The most perfect issue of thinking is the maxim, simultaneously general in its implications, pragmatic in its applications, and paradoxical in its operations: the maxim is an imperative issued by the name to itself to single out an imperative from reality to obey. For example: thinking is obedience to the imperative to suspend all imperatives.

Metaphysics—like Hebraic and then Christian monotheism—emerges as an attempt to transcend scapegoating. The Hebrew community, Socrates, and Jesus all deliberately attract the desires and resentments of all—the revelation all these intellectual/spiritual movements share is that scapegoating, rather than saving the community, destroys it. From a more historical perspective, we could say that this is the case past a certain level of social development, when the imperatives of scapegoating conflict with the necessary openness between communities connected through markets or empires. But metaphysics never effectively extracts itself from the procedures of scapegoating, and while the revelations of the monotheistic faiths are inexhaustible, that of metaphysics has been drying up for centuries. If metaphysics is the mode of thought based upon the primacy of the declarative sentence, I would further refine that definition to say that metaphysics is the mode of thought that sees the declarative sentence solely as a conduit for the imperatives issued by reality. Metaphysics confirms human nature from the standpoint of human mind; it replaces scapegoating with realism, which lets reality select the victim and establishes intermediary institutions to ensure natural selection. This was once progress—it delayed and somewhat neutralized socially legitimated violence, and helped prevent the further degeneration of that violence into out-and-out human sacrifice.
Even now, with human sacrifice on the rise, in particular in the practice of suicide bombing, we should be cautious in assaulting metaphysical modes of thought everywhere we find them. But in the meantime we cannot obey the realist imperative to set aside all concern with other ways of writing sentences, with the generation of idiosyncratic idioms through which self-issued imperatives of names and things make audible orders and orderings operating below standard thresholds. Next time, I will examine the syntax of contemporary victimary metaphysics and propose an originary thinking of error as a way of working the margins of the current tsunami of the Global Intifada and the global financial crisis.

Adam Katz teaches writing at Quinnipiac University in Connecticut. He is the editor of The Originary Hypothesis: A Minimal Proposal for Humanistic Inquiry and an editor of Anthropoetics, the on-line journal of Generative Anthropology.

JCRT Live is a blog for editors, contributors, and other invited participants in conversations sponsored by the Journal for Cultural and Religious Theory.
