This is the very slightly emended text of the talk given at the third annual Generative Anthropology Summer Conference at Ottawa on June 19-21, 2009. While apologizing for its length and discursiveness, I wanted to commemorate this extraordinary conference, put together by the devoted labors of Ian Dennis and Amir Khan. The conference theme was transcendence.
* * *
To speak about transcendence, one has to determine what it is that is transcended. From the perspective of GA, what is transcended in transcendence, like what is deferred in Derrida’s deferral, is violence, the breakdown of communal order. Just as deferring something does not eliminate it as a future possibility, so it is with transcendence. The elaborate systems of human culture by means of which we seek to transcend this breakdown can never achieve final success.
The different topographies of deferral and transcendence reflect their respective relationships to the genesis of the sign. The aborted gesture of appropriation that becomes the first sign defers violence by turning away from its appetitive aim, and thereby from potential conflict, to become a consciously self-contained act, an intentional rite that designates its focus, the deferred appetitive object, as not merely outside itself but outside its universe of possible action. This is the difference that, along with deferral, makes up Derrida’s différance. But it is a metaphysical reflex to situate this difference within a paradigm of signs; any sign, whether its paradigm contain one or a million members, marks its difference from the world to which it refers. Deferral is nonetheless only provisional. In the event described in the originary hypothesis, the representation of the central object is followed by its division in the sparagmos, where each obtains an equal portion of what cannot be appropriated as a whole. The nascent human community defers appetitive satisfaction in order to organize it, to assure its symmetry; the aborted gesture does not abandon appropriation; it only postpones it.
Transcendence, by contrast, implies a permanent relationship to what is transcended. This does not mean that transcendence abolishes violence; the transcendental world of representation does not abolish the world of appetite, but exists alongside it, “above” it, as it were, and there is no guarantee that it can hold in check at every moment the appetites of this world, which contact with the world of representation exacerbates as much as inhibits.
What makes the notion of transcendence useful is that it suggests, as deferral or even différance does not, the existence of a separate, transcendent universe, and thereby more appropriately takes into account the significance of the human as, to our knowledge, the only worldly creature to be acquainted with this universe. As the term also suggests, we do not inhabit this universe, as do the divine beings that we sometimes postulate as its necessary inhabitants. To sum up, the central object of deferral will shortly perish; the central object of transcendence is eternally divine.
(Transcendence cannot be guaranteed by memory alone; the disengagement of the transcendent from the concrete that we find in the Hebrew prohibition on “graven images” and in the Hebrew God’s taking as his name a declarative sentence depends on a process of historical revelation. One area that GA should explore more thoroughly is the phenomenon of inscription implicit in the category of “institutional” representation that I opposed to formal representation in The Origin of Language. Derrida’s intuition that writing precedes speech can be understood as the affirmation that the ostensive sign cannot point only at a mortal object, but implies a subsistent inscription of the place—the scene—in which the event took place.)
The permanence of transcendence can never provide more than the deferral of human conflict. The danger inherent in the notion of transcendence is that it suggests the existence of a stable duality between the worldly and the transcendent, whereas our access to the transcendent universe of representation guarantees only the eternal instability of mimetic desire.
Today I would like to deal with the most salient factor of this instability, which I do not think has received the attention it deserves from those of us who are attempting to rethink the human in terms of the “new way of thinking” that is GA.
The heart of the hypothetical originary event is the exchange of the sign among the participants; each repeats the sign to his fellows, emphasizing both his own renunciation of the appropriative act and, at the same time, the newly “sacred” status of the central object that is so desirable as to be beyond desire. Our accessibility to this sentiment of sacrality, which religion finds it necessary to cultivate (and which the secular can, although it may prefer not to, imagine vicariously), is what allows us to intuit the originary scene as our origin. A worldly object, say, a dead animal, is endowed with “supernatural” properties as a simple consequence of the intensity of human desire—or is it this intensity that permits us to experience the transcendent, to become in Heidegger’s Wagnerian term the “shepherds of being”? The advantage of GA is that, however hypothetical our scene, it is a concrete worldly experience that does not transport us to a metaphysical universe of ungrounded abstractions.
The exchange of the sign takes place in the topography of a circle with the sacred object in the center. The symmetry of the human participants contrasts with the central position of the object, and it is this contrast that will henceforth embody transcendence. Whether or not one conceives it as a wholly human invention/discovery, the transcendent world of representation is above and in any case not of the human. The symmetrical circle of humans exchanging a sign that represents what they cannot otherwise possess provides what I call the moral model of human interaction. We might say that this symmetry is already implicit at the very beginning of the scene in the interdiction that falls upon the former alpha animal and prevents him from appropriating the central object, but only the exchange of the sign provides a praxis of moral interaction.
The moral model provides a basis for various familiar affirmations of moral reciprocity, from the Golden Rule through Kant’s Categorical Imperative to the symmetry of John Rawls’s “original position” guaranteed by the “veil of ignorance.” This model also informs the symmetry between our active and passive use of language; to understand the sign is to be able to (re)produce it oneself. Language may embody hierarchy, but the essence of linguistic interaction is symmetrical.
From the earliest times down to 10,000-odd years ago, all human communities maintained this reciprocal, egalitarian organization. Then the Neolithic Revolution brought agriculture and the possibility of accumulated wealth in the form of a preservable surplus of production over consumption. Some years ago, in The End of Culture, inspired by Marshall Sahlins’ description of Pacific island societies concentrated around a big-man, I attempted a model of how this might have led to the demise of the egalitarian social order. I was struck by Sahlins’ observation that the big-man worked harder and had less to eat than anyone else in the village. The point was that in a culture of circulating ritual distribution such as the “totem” societies referred to by Durkheim, the big-man, by working harder than anyone else (and no doubt possessing advantages such as a more productive family workforce and more fertile land), had usurped the ritual center for himself. Why work hard to supply material for the feast when there is someone who will work harder for the privilege of taking over this responsibility? Although Sahlins says nearly nothing about ritual, it seemed obvious that the big-man, although not yet a king, was already a kind of high priest, furnishing and presiding over the sacrifices to the gods that accompany the sharing of the ritual feast, responsibility for which had previously rotated among the various clans.
I have since found no reason to reject this model of what Rousseau called the “origin of inequality.” But in order for GA to deal with the phenomenon of inequality in its own terms it must situate it, via what we call “originary analysis,” in the originary event itself. Even if the event created a symmetrical community, there must have been an asymmetrical moment in the scene from which future asymmetries would eventually spring. It was Adam Katz’s introduction a few years ago of the Peircean notion of firstness into the description of the originary event that revealed its potential for inequality. If originary symmetry were truly the ultimate human condition, it could never have been disturbed. One thing of which the originary hypothesis reminds us is that humanity’s cultural or “spiritual” achievement begins and remains in the service of our material existence.
Adam observed that the emission of the sign should not be imagined as instantaneously unanimous; someone would be first to understand the aborted gesture of appropriation not merely as an instinctive act of recoil but as a sign that could be communicated to his fellows. In the originary event, the firstness of the first users of the sign was successfully propagated and symmetry was established—we may imagine earlier moments when language failed to emerge because the sign was not properly interpreted, with perhaps unfortunate consequences for its unsuccessful users. But the originary manifestation of firstness reminds us that human culture does not create a “collective consciousness,” that each individual adheres to it on his own terms, and that although our potential for mimetic rivalry is deferred in the originary scene, it cannot be permanently squelched.
I would propose that in our desire to understand the cultural products of hierarchical society in originary terms, we may not have paid the transition from symmetry to hierarchy sufficient theoretical respect. In particular, we should not let our duty to defend “the normal” from the excesses of postmodern victimary thinking prevent us from giving the anthropological basis for such thinking its proper weight. Attention to the historical manifestations of firstness should allow us to do this fully in the spirit of GA.
We may say, then, that since the advent of hierarchical society, the human has been defined by two tensions productive of pragmatic paradox, configurations that permit of no stable position. The primary tension, the one that defines us as human and that GA has explored at length, is that between the worldly and the transcendent, the mortal being and the immortal sign. When we say that we know we are going to die, this seems at first (from the perspective of “common sense”) to be a straightforward statement about the world. But the very possibility of making this or any other statement requires that we use words that are not themselves subject to mortality. We can only think our mortality in “immortal” terms. The paradox of a mortal possessing “immortal” thoughts gives rise to the idea of a soul that survives our bodies. This same paradox applies to the perishable being that occupies the center of the originary scene. The significance it bears is not itself mortal. Thus God may be defined (to quote a recent Chronicle) as the meaning of the originary sign, concerning which no distinction between referent and signified is possible. This is the originary, anthropological sense of the “ontological proof,” customarily expressed in terms of God’s “perfection.” To include existence as a predicate within God’s perfection—an inclusion rejected by Kant, the greatest of metaphysicians—is merely a roundabout way of acknowledging that the originary sign, while designating—pointing at—a real object, can only have meant, in order for it to subsist as a sign, the subsistent sacred being of which that object was merely a temporary embodiment. (Here again, I think we need to reflect more on the inscription that preserves the trace of this subsistence.)
In contrast to the alpha animal whose dominance is overthrown in the originary event, the first to use the sign gains no material advantage from his behavior. Since he inaugurates a mode of communication in which all can participate, his asymmetric act facilitates the establishment of symmetry. This provides a paradigm applicable to all examples of firstness: that of deferred reciprocity. Marcel Mauss’ model of gift exchange is an extension of this kind of delayed symmetry. Mauss discovered that the chief mode of exchange in tribal societies is not through transactions in the modern sense, but through “gifts” that generate temporary asymmetries that their recipients are required to pass on to other members of the group. The complexity of the structures that gift exchange makes possible is perhaps best illustrated by the “cross-cousin” marriage patterns described by Lévi-Strauss in his Structures élémentaires de la parenté. It is a source of wonderment to reflect that Mauss’ model remains applicable to our social life to this day; we still exchange birthday and Christmas gifts, or dinner invitations, in a recognizably Maussian fashion. But we no longer conduct our economy in this manner.
The Maussian system and its extensions constantly generate asymmetries, but these asymmetries are constantly resolved. The advent of hierarchical society applies the category of firstness in a qualitatively more radical manner. This gives rise to a second, subordinate but very real tension, which can be described most simply as the tension between ethics and morality. The divergence of a given ethical system from the moral model of reciprocity is paradoxical: the system’s sacred laws derive their authority from their transcendent status with respect to the individual members of the society (in the Durkheimian sense that the sacred expresses the interests of the society as a whole in contrast with those of its individual members), but the very language in which these laws are formulated embodies the originary symmetry without which no human society could exist. Thus master and slave alike understand, and implicitly communicate to each other, their knowledge of the mastery of the one and the slavery of the other. Which is to say that slavery is by nature the deferral of freedom, as has been borne out by history in all but the most backward parts of the world.
Postwar, postmodern thought is characterized by an exclusive focus on this secondary paradox. The Holocaust paradigm of Nazi and Jew is a reductio ad absurdum of hierarchical society that delegitimizes any unequal relationship based on ascriptive or essentially unchangeable categories. I emphasize these categories because the entire weight of victimary pressure and the “White Guilt” it arouses in non-victims is based on group membership. The only kind of victimage that is relevant politically and ideologically requires that the victim define him or herself as a member of a group. Victimage is always, so to speak, a hate crime.
The first stage of postwar victimary thinking dealt with the ethical paradox in a manner that seemed wholly beneficial, as an Enlightenment exercise in demystification. Granting equal rights to Blacks in the American South or abolishing colonial regimes reaffirmed with broader scope that “all men are created equal.” The moral model of human reciprocity is blatantly violated when one group but not another can be served by a school or a luncheonette. But it soon became apparent that de facto as well as de jure group inequalities could be considered morally unjustified, regardless of the professed or real intentions of the actors. We are still debating the proper means to deal with such inequalities.
It would be foolish indeed to consider victimary thinking simply as a theory that GA can supersede or correct. But if we recognize its anthropological basis in what I have been calling the ethical paradox, we can not only better understand it but also point out where it risks losing touch with the anthropological foundation on which its very moral sentiment rests.
Because victimary thinking understands human conflict as the result of the division of society into privileged and disadvantaged subgroups, it tends to see the cessation of this form of conflict as the end of human conflict in general. The anthropology implicit in this vision can be traced back to Rousseau, who associated social organization with the originary inequality of private property (“ceci est à moi,” “this is mine”), which loosely reflects the Neolithic invention of agriculture. We all know that Rousseau considered la société commencée (“begun” society), which he found in accounts of American Indian societies, to be happier and more “noble” than European society as he knew it. What is less remarked upon is that Rousseau considered tribal, egalitarian society to be devoid of genuine social organization. There was no need of ritual to promote “solidarity” or sacred language to express the values of the group over those of individuals. Indeed, there was no need for an originary event to generate language in the first place. Mimetic conflict over a given object would be prevented by the assurance of acquiring a similar object through withdrawal, so there would be no aborted gesture of appropriation, no sign, and no transcendence. The strange glissement (slippage) between the state of nature and “begun” society in the Second Discourse reflects the paradoxical status of a sinless originary state fated by its own success (and the resulting increase in population density) to evolve into a sinful one.
Thus, even before the reconceptualization of the social order in terms of class conflict that emerges in the French Revolution and attains its highest expression in Marxism, Rousseau limits the necessary existence of human conflict to hierarchical society, as opposed to hunter-gatherer societies, which he presents as only beginning to emerge from the state of nature.
But the roots of what we now call victimary thinking stretch back well beyond Rousseau, to the very beginnings of Western civilization. The underlying human ontology that Rousseau was the first to put into recognizably anthropological terms is already present in the biblical association of the Fall of Man with Adam’s obligation to till the soil. Indeed, the interaction that determines the Fall is interpreted by God in much the same way as I have described the emergence of the big-man: as a usurpation of the sacred center. God’s central grievance against Adam is that, having eaten of the tree of knowledge, he threatens to become a god himself, usurping the central role whose permanence belongs to God (or the gods) alone. “And God said, ‘Behold, the man is become as one of us, to know good and evil: and now, lest he put forth his hand and take also of the tree of life, and eat, and live for ever.’ Therefore God sent him forth from the garden of Eden, to till the ground from whence he was taken” (Genesis 3:22-23).
The association of the Fall of Man with the advent of agriculture extends victimary thinking in its earliest form back to the religious origin of Western Civilization. Biblical religion, culminating in the insistence in both the Jewish and Christian traditions on the primacy of morality over ethics, cannot be understood independently of the ethical paradox of hierarchical society.
I remarked some years ago that perhaps the most fundamental parallel between the Greek and Hebrew components of Western culture is that between Platonic metaphysics, which reduces language to the proposition or declarative sentence, giving concepts an independent existence in a world of Ideas independent of their worldly manifestations, and the founding monotheistic revelation to Moses on Mount Horeb in Exodus 3, where God gives his name as a declarative sentence, “I am what/that I am.” Let me point out another parallel situated on a more primitive level in both traditions. Reading the Fall as an expression of the secondary ethical paradox is the Hebrew side of this parallel. The Greek side can be found in various writings of the pre-Socratics, some of which I have examined in Chronicles 372, 373, and 374. Heraclitus’ line “War/conflict (polemos) is the father and king of all things; some he shows as gods, others as men; some he makes slaves, others free” (fragment 53) belongs to it. But the most intriguing example is no doubt the famous single sentence of Anaximander quoted by Simplicius, the object of a well-known essay by Heidegger:
“Whence things have their origin, there they must also pass away according to necessity; for they must pay penalty and be judged for their injustice, according to the ordinance of time.”
Just as the Mosaic revelation of God’s nature is both ontologically prior and chronologically posterior to the story of the Fall, so Plato’s metaphysical ethics offers a perspective from which the ethical paradox raised by the pre-Socratics can be deferred. The biblical world is ruled by a God with neither a name nor an image that can be called on to bend his laws; the Platonic world is governed by an Idea of the Good that transcends the selfish interest of any individual. Understanding both metaphysics and monotheism as two complementary ways of dealing with the ethical paradox at the origins of the Western tradition allows us better to see postwar victimary thought as a radical product of this tradition, traumatized by the Holocaust into forgetting the fundamental hierarchy that precedes all others, that between the human and the transcendental.
Until the postmodern era, Western culture had always maintained the subordination of the ethical paradox to the fundamental paradox of the human. Tragedy, and the literary modes that derive from it, are structured by this subordination. We should reflect on Aristotle’s requirement that the tragic protagonist be “better than in real life,” which must be understood in terms of status as well as character. Girard is justified in interpreting tragedy as a form of “scapegoating” in which the protagonist is obliged to take on himself the blame for the mimetic conflicts that plague his city—as Oedipus himself, ironically enough, does explicitly both at the beginning and the end of Sophocles’ play. But I think we can better situate the Girardian phenomenon of “scapegoating” or “emissary murder” in our anthropology if we take it not as the model of the origin of the human, but, like Adam’s sin, as a model of the emergence of hierarchy, of the big-man who usurps the divine center and incurs the resentment of his fellows.
The “end of tragedy,” which so troubled the early romantics, corresponds to the beginning of the denial of the priority of the primary human paradox over the secondary one. When the human comes to be defined exclusively in terms of the oppression of one group by another, the problematizing of esthetic resolution becomes a symptom of this critical, albeit historically productive, anthropological misunderstanding. But it is the very excess of victimary thinking in the postmodern era that has provided the impetus for the return to the primacy of the transcendent, understood this time from a minimally anthropological perspective. In a word, victimary thinking was a necessary precursor of Girard’s mimetic theory, and subsequently of GA.
This general observation about postmodernity, needless to say, is not a substitute for either ethical judgment or anthropological analysis. I make it here in the hope that it may help clarify the former and encourage the latter.