One might think it unnecessary to return to a subject I have covered repeatedly, but I believe that the constant rethinking of what I conceive as the key moment of hominization is a good thing. Limiting our focus to that moment obliges us to seek in it the root of all that proceeded from it, not as though it “generated” what followed, but in that it inaugurated the mechanism that made it all possible.
Ideally, this is something I wish I could have done from the first. If I could do it all over, rather than beginning (in The Origin of Language [TOOL], 1981) by trying to derive the utterance forms of mature language from the originary ostensive gesture/utterance, then attempting to relate to it the forms of literary culture (in The End of Culture, 1985), I would have found it more useful to examine in depth the phenomenal reality of the aborted gesture of appropriation becoming a sign by focusing more closely on the interactions among the participants in the originary scene. For it is the introduction of this mediating element among the participants that inaugurates the unique human phenomenon of conversation.
But of course this was impossible; it is in fact my conception of the originary event itself, rather than its linguistic consequences, that has most greatly evolved since TOOL. Such dialectical processes cannot be planned in advance—that is what makes them dialectical.
As to why academic students of language have not dealt with such matters, it is simply that they have not concerned themselves with the necessity of language. Animals have inhibitions that can be learned through practice, as Pavlov famously demonstrated. But they don’t have signs that intentionally communicate these inhibitions, that is, that make them into interdictions shared by the community. This process, as I have tried to show, depends on a sense of the sacred that precedes any recognizable notion of God; this sense attaches to an interdiction that, communicated through the sign, is understood to be shared by all the members of the group. Indeed, in its minimal sense, this interdiction is what the sacred is. Whether it depends on a preexisting sacred being is by definition beyond our ability to determine, since as Jesus put it, “my kingdom is not of this world,” whereas we are. In purely human, Ockhamist terms, God is not a necessary element of the sacred.
All this is very simple; the insight consists in understanding the origin of language and of the rest of human culture in the need for an explicit capacity for deferral of action in the face of increasingly powerful appetitive mimesis, an emergent need that required a new level of consciousness: an ability to represent the appetitive object, not simply within the individual brain, but in terms sharable by the community, that is, by a sign that communicates what cognitive scientists call “joint shared attention.”
Very simply, the core of GA is that this is the punctual difference between us and our fellow creatures. Even if we can teach them some elementary usages of language, it is clear that their brains, or perhaps better put, their souls, are not “wired” for even elementary language, and this is so because, as I have often repeated, it is not necessary for them.
The complexities of our intellectual lives, like the mysteries of “Being,” are rooted in these concrete elements of our experience, which the mystifications practiced by “profound” thinkers should not allow to be hidden. What I admire in Derrida is that once we put aside his préciosité (and his leftist politics), his understanding of the human can indeed be translated into simple anthropological terms.
Thus, as Andrew McKenna realized in Violence and Difference: Girard, Derrida, and Deconstruction (Illinois, 1991), Derrida the mystifier is appropriately paired with Girard, who never mystifies, but whose fundamental anthropology, where Derrida's did not, misidentifies the specificity of the human, which is not mimesis itself, but the additional layer of meaning that the excess of proto-human mimesis forced nascent humanity to add: language, culture—the world of the sign.
I certainly didn’t invent the idea that “necessity is the mother of invention,” but one wonders why it has been so difficult to persuade others that this common-sense truth applies in exemplary fashion to language, which is as far as we know the most radical invention in the universe after life itself—in a sense, even more radical, in that it leads to the invention of ideal entities that have no worldly reality: words, phonemes… whose referential realms, fetishized as realities, became the subjects of the first systematic—Platonic—philosophy. The constant refrain of those who discuss the subject at all is that language emerged simply as a “natural” outgrowth of increased intelligence; as one well-known theorist put it, we began having ideas and we evolved language in order to express them!
Put in the simplest terms, the origin of language—as Raymond Tallis, author of The Hand (Edinburgh 2003), would no doubt agree—is the discovery of pointing.
For animals, even chimps, don’t point. They may gesture at something, but they are not by this intentionally communicating to their fellows their interest in the pointed-to object, even if they do so mimetically (or “indexically”). The joint shared attention of pointing is available only to humans, and is indeed already linguistic. And it is easy to see, or rather, difficult not to see, the originary pointing gesture as what I called long ago an aborted gesture of appropriation.
This choice of the referent of the original ostensive reflects what we can call its significance, its worldly importance, which evidently derives from an appetitive drive that is in this case frustrated—or in Derridean terms, différé, deferred, and thereby also differentiated within a world of potential signifiés. This makes possible the constitution of language as a system of an indefinite number of such signifiés, and the syntactic structure of language, whose origins I outlined in TOOL, arises as a natural extension of the use of language to model the relations and qualities of the objects referred to.
The time during which language is used—originally, we presume, wholly in the context of a public, sacred, ritual scene—is of a different quality from the everyday time of interactions with others and with objects: it requires, precisely, the deferral of worldly action and the establishment of a state of contemplation, which may eventually exist concurrently with other activities, but which requires a minimum of independent attention if the communication is to be successful. This state of linguistic communication, the sharing of signs, is the kernel of all cultural activity, notably of ritual/religious communion. We conceive of the originary event as a proto-rite, one “discovered” and subsequently repeated according to a predetermined pattern. The exchange of the sign in the originary event must be understood as deferring the appropriation of the central object of desire that is the raison d’être of the event, the serial distribution system employed by prehuman primates having become henceforth impossible. (See, e.g., Chronicle 740 et seq.).
The equivalence of the sacred and the significant, or what is worthy of being signified—attended to through the deferral of appetitive relations—simplifies our notion of the sacred without trivializing it. Time is precious, and to devote time to signifying something beyond simply attempting to make use of it is to defer one’s worldly relationship to it and to everything else.
As language users we of course take this for granted. The purpose of generative anthropology is to explain the necessity of this deferral in terms of the human need to avoid mimetic conflict and its concomitant violence by inaugurating the scenic contemplation of an object of communal appetite, which is by that very deferral transformed into desire, appetite mediated by the mimetic concurrence of the community. This notion of the scenic is homologous to Sartre’s “geometric” description of the human self-conscious mind or pour-soi as an internal scene containing a néant that separates it from its object, or to Heidegger’s less explicit notion of the human as Dasein, being-there, that is, facing, as in a theater, the object that one (tacitly) contemplates rather than (yet) acting upon.
Having heard these ideas for the nth time, one may be tempted to say, like the protagonist of an 18th-century novel after his or her first sexual experience, “Is that all there is?” But, precisely because of their apparent banality, they must be repeated until the reader realizes that the mark of the appropriate point of departure is not its subtlety but rather its simplicity; it is the simplest starting point that puts the least burden on the realities that must be understood in its terms.
It is in this spirit that Hegel begins his Logik with Sein and Nichts. Except that there is no experiential reality behind Sein and Nichts; they are already intellectual categories that imply a precedent effort to think what is simplest, whereas if we begin instead with the significant/sacred, that is, what in our experience matters, what we are without prior reflection willing to spend our time and energy on, then all the rest falls into place. All living creatures have analogous concerns; but what all but humans do not have, because they do not need it, is a system of representation that defers action with respect to what matters, that allows it simply to matter, to be shared as something we must contemplate, and indicate to our fellows to contemplate, before we begin to make use of it.
Sein or Being is best understood as the worldly manifestation of words, or signifieds. But there is really no way to use the term without confusion. If the green-ness or square-ness of something is reasonably unambiguous, the is-ness of something is entirely contingent on what it is being opposed to: lizards, for example, have is-ness but unicorns do not. But how do we know if God has is-ness? Or the number 5? Much as I love Hegel’s dialectical virtuosity, to get from Being to Nothingness to… the End of History is a tour de force, not an objectively rigorous process.
The point is not to debunk philosophy but to clear away the underbrush that has obscured its essential components. Girard may have overestimated the centrality of scapegoating or “emissary murder,” notably in reference to the earliest human communities that presumably had no human ruler occupying a central position, but his metapsychology was nonetheless rooted in reality: desire is indeed mimetic as distinguished from appetite, and Aristotle had already recognized that humans were “more mimetic” than any other creatures.
From there, it is but one small step to recognize human culture, beginning with sacrality/significance, as an apotropaic reaction to this excess of mimeticity—the conversion of the aborted gesture of appropriation into a sign. From there, we are indeed ready to trace the evolution of the human scene and its components throughout history, if not to the latter’s “end.” I have taken a few stabs at this, but the application of the originary hypothesis to the totality of humanity’s prehistoric and historic activities will require painstaking research and long reflection on the part of many.
It is a telling coincidence that I finished the first draft of this Chronicle on the day of the Israeli elections, in which Benjamin Netanyahu has emerged as the apparent winner, succeeding a weak centrist government defined largely by its principals’ personal hostility to him. If I were an Israeli politician, perhaps I wouldn’t like Bibi either. But as I understand world history in the present moment, Israel is a key element, and the Abraham Accords a great achievement. I see Netanyahu’s return to power at a moment when we can expect the US to begin moving away from its current Woke childishness as a demonstration that the liberal-democratic world still remains capable of asserting itself—hopefully not too late—in opposition to the anti-world of resentful despotism.
The Jews are, whether they like it or not, the people whose history is originarily relevant to the West, and thereby to modern humanity in general. Whence the substantive basis of the Evangelical belief that the Jews are destined to return to their homeland in Israel/Palestine, an eschatological ideal which, as Walter Russell Mead demonstrates in his recent The Arc of a Covenant: The United States, Israel, and the Fate of the Jewish People (Knopf, 2022), has from the days of the Pilgrims maintained a strong presence in American Protestantism. As Mead realizes, this is not a mere fantasy but a religiously-garbed genuine insight into the future of Western and world civilization.
Just as antisemitism, which was left for dead after the Holocaust, has strongly renewed itself by taking the Palestinians for the victimized “indigenous” population of Israel, so the uncontested establishment of a Jewish state is a prerequisite for a truly Hegelian synthesis to resolve the division among the three Abrahamic religions that dominate the West. In this context, the end of the diasporic existence of the Jews corresponds to a genuine possibility of reconciliation that—and that is the genius of the Abraham Accords—includes in its extension both Christianity and Islam.
Admittedly this is still a visionary work in progress, but its realization is perhaps less distant than might appear, and the return of Netanyahu to power in Israel is a major step in the right direction.
The Jews are not “the chosen people” in some invidious sense; they are simply the first people to have conceived a genuine monotheism and, consequently, a fulfillable vision of world unity: One God, one humanity. Just as, as Adam Katz suggested, in the originary event there must have been a single proto-human who first realized that the aborted gesture of appropriation could and should be understood as a sign. But just as the very success of his insight, dependent on its spread to the totality of the participants, made it inevitable that this “first man’s” inaugural understanding would forever remain unrecognized, so the Jews would no doubt happily give up their chosenness could they put off the curse of their firstness.
Yet since, in the historical world, they cannot, they will have to wait to live happily with it until their neighbors become willing to accept, as did (Saint) John Paul II, their status as elder brothers. And I imagine that by that time, whether or not my name has remained associated with it, the originary hypothesis, or something quite similar, will have become a simple matter of common sense.