Le Figaro, France’s conservative daily, published on May 1 an article by literary critic Marin de Viry deploring the interruption of the Cérémonie des Molières, the French equivalent of the Tony Awards, by two actresses protesting the recent controversial government decision to bypass the National Assembly and enact a retirement reform that would increase the basic retirement age from (please don’t be shocked) 62 to 64.
This interruption was surely far from the thuggery of the recent fracas at Stanford Law School, where Judge Kyle Duncan, invited to speak on a legal subject, was prevented from delivering his talk and roundly insulted by student hecklers, then weakly defended by the Dean of DEI, whose intervention failed to condemn them. But what struck me in Viry’s piece was its description of the interruption as an attack on what he called, using a term familiar to GA, the scene, an act he qualified as ob-scene, borrowing a quasi-pun from the late Jean Baudrillard. He thus reminded us that the root meaning of that term designates what violates a scene, understood as a locus of sacred activities that command universal attention.
Here is a translation of a key passage from the article:
The obscenity was confounding the scene of political negotiation with that of the award presentation. Of course, it has become almost basic to today’s mentality, especially in the entertainment world, to want to break codes, blow up conventions, not tolerate not interrupting, to insist on disrupting the smooth operation of institutions, leading to the triumph of the imperious desire to impose one’s presence and have everything else grind to a halt in order to make oneself noticed. It is as if it has become normal to show up in a ball gown at a swimming pool, or in a bathing suit at a formal dinner. No institution, no event is exempt any longer from such brutal hijackings, like those of airplanes in the past.
Viry repeats the term obscénité, expressing a sense of sacrilege at the act of confounding the scene of the awards with that of political protest—the French term confondre designating not mere confusion but improper equivalence, treating the profane and the sacred in the same fashion.
What struck me in Viry’s reaction to what we would consider a minor disturbance, one that at American ceremonies of this kind is by now more expected than shocking, is that he puts his finger on the specific “obscenity” of the antinomian behavior that increasingly characterizes the contemporary implementation of what I have called “the epistemology of resentment.”
Every point of his denunciation presents the offending behavior as an attack on the sanctity of scenes that society had heretofore treated with reverence. To “not tolerate not interrupting” (ne pas supporter de ne pas interrompre) can virtually be translated as “to leave nothing sacred.” No doubt the epistemology of resentment is at the origin of these sentiments, but it is important to see them as attacking not merely the sources of power but the very fiber of the social order. Our societies are presumably defined by the “rule of law,” but as Burke already observed in the early years of the French Revolution, and as the recent legal troubles of both Trump and Netanyahu demonstrate—in contrast with the impunity of the Biden and Clinton families—even when the formal rules are followed, these legal scenes are clearly manifestations of political power rather than quests for justice. Once the unwritten rules of communal activities are no longer obeyed, the courts become either impotent or mere instruments of the mob and its masters.
In a broader context, the desacralization of our public scenes reveals itself in declining attendance at church services as well as in the skipping of marriage, increasingly seen even by stable couples as an irrelevant formality. Millions will have watched Charles’ coronation last weekend to witness what increasingly strikes us as a relic rather than a living example of continuity.
How does one restore sacrality where it has decayed? The most obvious answer is the one recommended by religious sources: an act of faith. And more than a few have found a new life purpose in recent times by either returning to a faith abandoned or adopting a new one. GA can explain such acts, but it cannot substitute for them. Yet the question does arise as to whether GA’s foregrounding of the scenic, and of the différance that makes it possible, can serve, in the absence of a transcendent faith, as the foundation of a worldly ethic.
That GA and religious faith have much in common is reflected in the simple fact that they are both tabooed in what presents itself today as rational discourse. One might think that scientists and serious scholars would be willing to differentiate between the discursive mode of Genesis and that of the originary hypothesis. Yet both tend to be treated as myths lacking in “falsifiability” in Popper’s sense.
Even when it is made clear that the hypothesis’ reference to “eventfulness” does not imply that a single moment divided the past from the present and transformed proto-humans into humans, and that the hypothesis is therefore not contradicted by the obvious fact that the emergence of humanness, eventfulness, and scenicity, along with that of deferral and différance, must have taken place in stages, and no doubt at different times among different groups of hominins—there still remains the matter of the Rubicon that must be crossed, even if one step at a time rather than in a single leap, to get from animal communication to human language and the rest of the culture that accompanies it. Like the change of state from water to ice, the origin of language involves a series of many small changes, but the end result is qualitative rather than quantitative; and in addition, the quality of humanness, unlike any other found in nature, involves the creation of a whole new system of intraspecific interaction.
GA’s originary hypothesis is a minimal model of what was necessary to permit the first scene, the first sign, the first event and their memorability and sacrality. However much “real life” must have been sloppier and less schematic, we could not be what we are without the invention of the scene, the sign, the sacred, the event. The problem is that such qualitative changes are excluded from the realm of scientific discourse as normally understood. The universe has evolved considerably since the big bang, but in scientific terms, it has done so without any of these features, because they are specifically associated with the human, and the specifically human has remained off limits to the scientific theory of evolution.
We witness many indications of this. One of these, as I have noted several times in these Chronicles, is the insistence on presupposing the existence of life elsewhere in the universe. Needless to say, I have no authority to deny that this is possible. But for years I have remarked that virtually every account of the exploration of celestial bodies other than stars and black holes speaks of them as potential habitats for life. In some articles about exoplanets, or even about bodies in our own solar system such as the moons of Saturn or our neighbor planet Mars, the word “life” is repeated constantly, as if to make up for the fact that no trace of life has yet been found in any of these places. What is notable about this is the tacit a priori rejection of the idea that life may quite possibly not exist elsewhere in the universe. Let alone “intelligent life,” whose absence becomes increasingly difficult to explain away as our means of exploration become more powerful: it certainly stands to reason that if we could evolve to this point in a mere few billion years, other creatures on one of the trillions of planets elsewhere in the universe might well have done the same a bit earlier and left for us traces of the ETI activity that we are so anxious to observe.
I think there is more here than the neutral scientific exploration of an open possibility. There is a clear if unspoken dogma in the scientific mindset in this case, one that goes beyond simple uniformitarianism. No doubt if we could become sign-users, so could “other life forms,” but we have not the faintest idea of the probability of any such life forms so evolving, particularly given the lack of evidence that any life forms exist elsewhere. What I find irksome in these constant references to extraterrestrial life is their implicit conviction that such life must exist, if not on Mars then on Ganymede, or on some recently discovered exoplanet that appears to have liquid water. Even then, the SETI question would not be answered, but at least it could reasonably be asked. At present, however, we are at the very least obliged to consider ourselves an extremely rare phenomenon, either because any similar species have evolved to the point of blowing themselves up before ensuring that traces of their existence would remain visible elsewhere in the universe, or, on the simple zero-hypothesis, because there haven’t been any.
GA has neither the need nor the ability to make predictions about extraterrestrial phenomena. But it does suggest that, as far as we can tell, humans are “special,” and although this is something we all must recognize, the suggestion of a qualitative difference between our minds and those of other primates remains, once more, properly unthinkable, given that it denies the uniformitarian principle that insists that all differences between forms of life are “relative.” Our proposal of a behavioral explanation for our difference from our fellow hominids is scandalous because, instead of attributing this development to the fortunes of our DNA (the “language gene”), it affirms that human self-consciousness emerged because it was necessary to our survival, whose problems were certainly due to our DNA-improved brains but reached the point where a collective solution was needed, one that transcended DNA modification and began to drive it.
The idea that humans have invented/discovered a new way to deal with survival, one that depends on inter-communal communication not inaugurated within the genome, doesn’t seem scandalous to me, but the assumption that it does scandalize the scientific community explains why, in their eyes, our hypothesis is not a hypothesis at all but an unfalsifiable myth of origin. Yet the hypothesis neither postulates the existence of supernatural beings nor implies a preexisting level of social organization incompatible with prehuman society.
What it does imply, however, is, to cite once more Roy Rappaport’s 1999 Ritual and Religion in the Making of Humanity, that language emerged coevally with the sacred—understood as a purely human phenomenon not implying the existence of supernatural beings. Here again, the emergence of the key characteristics of human culture—the scene, the sign, the event—is inconceivable in the absence of what I have called the sense of the sacred, which corresponds to the essence of what had earlier been called the soul, the conscience, or the superego.
From this perspective, the sacred, rather than being defined by the postulation of supernatural powers and/or beings, is wholly explicable as a product of human interaction. Although the sense of the sacred has historically aroused beliefs in the existence of such powers and/or beings, it is independent of them and can and should be so examined.
Whence the importance of emphasizing the underlying identity between the sacred and the significant, between the attribution of a signifier and the attribution of supernatural status, where the common component of signifying is independent of one’s belief in the existence of the supernatural. Thus the Jewish refusal to name the divinity should be understood as a historical discovery concerning the ontology of any conceivable extra-worldly being. As I have noted, a word is itself a “supernatural” entity in the sense that it has no worldly reality as a thing. Unlike any element of our experienced reality, a word is a form, a possibility of speech or other modes of signing, one that presupposes the existence of a human community that shares its language.
Let me conclude with the observation that I view the taboo that has until now precluded widespread interest in GA as a sign, not of its irrelevance, but rather of its apparent infringement on the sacred ground hitherto trodden exclusively by the faithful and therefore off limits to scientific thinking. But the only faith required by GA is the confidence that the exploration of the consequences of the originary hypothesis neither infringes on religious thinking nor reflects the inauguration of one more crackpot “belief system.” The hypothesis is not meant to be caressed as an idol; it is the point of departure of a new mode of anthropological thought.
After over forty years, I continue to believe that all prior reflections on human culture would be enriched and transformed by incorporating this new perspective. And the researches that such a perspective has the potential to stimulate, which my own work only very partially anticipates, hold the promise of revolutionizing our understanding of every dimension of human activity. It is this faith alone that generative anthropology asks of us.