Although generative anthropology is based on a scenic hypothesis rather than empirical observation and confirmation, its phenomenological nature brings it closer to natural science than to philosophy. As Plato’s Socratic dialogues make clear, the basis of philosophy is faith in language as the ultimate repository of human (self-)knowledge. The substance of these early dialogues is the attempt to define words whose status as Ideas is grounded in a vaguely conceived transcendental universe, where they exist unencumbered by the timebound associations that obscure their “ultimate” sense. Heidegger’s critique of Plato’s doctrine of Ideas is a surenchère, an outbidding: Plato has not gone far enough into etymology and the fundamental intuition of Being to find not merely words but sememes, the “originary” roots of language—but this so-called originarity remains linked to a Saussurean understanding of language as a system of differences.

The root of humanity, however, is not to be discovered in the lexicon of language, but in the act of “languaging.” What originary language teaches us is not the Ur-meaning of the words we use but the scenic structure of linguistic communication and its roots in originary human behavior. GA offers a plausible scenario of human origin that explains our culture as the solution to the problems posed by our mimetic intelligence.

Accordingly, our explanation of the distinction between morality and ethics does not depend on the etymology, real or speculative, of the words “morality” and “ethics,” but on the distinction between the scenes that supply their models of human behavior.

The “moral model,” the scenic basis for the term “morality” as I use it—this basis being the real object of our discussion—is the reciprocal exchange of the originary sign on a scene centered on an object of common desire whose very desirability makes it sacred, interdicted. The reciprocal symmetry of the participants around a sacred center may be observed in religious and similar rituals to this day.

The positive or negative reference to this model (moral, immoral) may then be applied to any analogous situation in which the participants, whether two or millions, stand on an equal footing with respect to an entity to be divided among them. This object may be negative; if a fine is assessed on a community, the moral model would imply that all should be assessed equally.

Ethics, which we can define as the system of values conducive to the welfare of the human community, however delimited, is distinguished from morality by the absence of any similar model. What is ethical is what is judged best for a given community, and every community is different. Thus we may speak figuratively of a human moral “instinct,” but beyond the most elementary forms of interaction in which reciprocity is normally required (e.g., “thou shalt not kill”), there is no similar model to which we can appeal for ethical laws.


Resentment has always been with us, yet so far, at least, love has been able to win out. Nonetheless, the triumph of love over resentment in WWII, followed by Western victory in the ensuing Cold War, has brought the West, three decades on, to what may well be its most serious spiritual crisis, a crisis perhaps most cruelly felt in the USA as the civilization’s Last Great Hope. The common understanding of the West’s Judeo-Christian foundations has become increasingly caricatural. Wholly opaque to its own irony, “Hate the Haters,” the Ur-slogan of Wokeness, is put forth as the ultimate avatar of Christian love.

In the absence of an appeal to the Almighty, there is only the Hegelian certitude that “all that is real is rational,” in the sense that we must be able to discover the causal process that led to the Zeitgeist of any given moment—and the greater its apparent absurdity, the greater must be the need that it fulfills.

I have been suggesting that the driver of the current malaise has nothing in particular to do with racism, but that racism, that is, ascriptive discrimination, has become a sacrificial substitute for meritocracy, discrimination by measured ability. No system of evaluation arouses more resentment (“sense of injustice”) than one in which transparent and relevant criteria are measured objectively. If one is going to lose, it is much less painful to be able to say that the game was fixed—as had been the case in the caste systems of the past, and even largely in the class system of the industrial market economy.

All the virtue-signaling displays of checking one’s “white privilege” are means of avoiding having to justify the growing dichotomy, regardless of race, between professional Belmont and working-class Fishtown. How much easier it is for the successful to express contempt for the residents of the “red states” of “flyover country” (some of whom may even occasionally use the N-word), while affirming their solidarity with poor George Floyd. Such persons are to be reproached, not with their success, but with the arrogant self-righteousness that Jesus condemned in the Pharisees.

Yet condemnation of the woke solution to the moral problems of the digital era is hollow in the absence of a better one. Nor can we ignore the growing possibility that our insistent refusal to defend our system is in fact a step toward cultural suicide, ultimately in the hope of striking a better deal with our successors.


Over the past year or two, I have increasingly referred to the sacred as an anthropological reality, and more recently, to the human soul. I would insist that my use of these terms does not reflect a turn to fideism or mysticism, but rather an attempt to make precise what it is that distinguishes human culture not merely from analogous activities among animals but from the kind of “consciousness” generated in machines by the practice of Artificial Intelligence (AI).

My use of “soul” began with Chronicle 616, which appeared in April 2019 in response to a talk by former Humanities Dean Herbert Morris, to whom the text is dedicated. It contains the following paragraph:

GA began as a theory about the origin of language, but I have always made clear that formal and institutional representation are linked from the start; that language itself is not in the first place about transmitting information, but about demonstrating, from the sharing of the first linguistic sign, our human solidarity. Hence we can say that the ultimate purpose of the originary hypothesis is to provide a minimal anthropological model for the attitudes and behaviors that the term soul encompasses. That a group of university scholars can discuss this term for over an hour with so little focus on its agapic function strikes me as a sign that, in the absence of a publicly-shared religious discourse—for the individual religious commitments of any of the participants must remain mute in a context where there is no presumption that they are shared—GA is our last best chance to provide conditions analogous to the religious context in which the concept of the soul originated.

More recently, I have focused on another “spiritual” term, the sacred. In an exchange of correspondence with (Anglican) Bishop Pierre Whalon, in order to express the kinship of GA with expressly religious views of the human, I used the term “sense of the sacred” as a neutral expression of sacrality that brackets its extra-worldly status. What is important about the sacred is less whether only God could bring it to our world than what reactions it stimulates, and in particular, how these reactions preserve us from the mimetic self-destruction that the human has from the beginning defined itself against.

Creating an anthropological description of soul and sacred without reference to a transcendental being may indeed be the central achievement of the originary hypothesis. Thanks to our originary phenomenology, human uniqueness is defined in necessarily spiritual (rather than neurological) terms, yet without reference to extra-worldly realities. The key characteristic of the sacred “will” that interdicts the individual appropriation of the scene’s central object is that it has no objectal source. Its presence in each mind as the source of the deferral of appropriation is minimally an effect of the fear of arousing the ire of those who share our common desire, and the sign allows this deferral to be reciprocally communicated.

No fundamental element of human uniqueness is absent from this formulation, including the fact that the results of its future evolution are in no way contained within it. The scenic space opened up by deferral is the laboratory of human thought, in which we progressively discover the world and ourselves, and as is the case with all laboratories, these discoveries are unpredictable consequences of its free use.


As The Origin of Language already implied, although without emphasizing the moral/ethical separation implicit in it, the evolution from the elementary utterance forms of ostensive and imperative to the “objective” form of the declarative not only permitted the transmission of information beyond the presence or absence of objects and actions, but set the human on the path of metaphysics: the use of language to record and transmit truths, to become aware of its own internal logic, including that of mathematics, and eventually to permit philosophers, and later scientists, to create logical edifices, first of words, and subsequently of things and theoretical entities. Ostensives, like our moral sense, are quasi-instinctive; declaratives express thoughts.

If we all share a “moral instinct” that makes us resent a lack of reciprocity, and that can easily be projected onto others such that their treatment by third parties arouses our indignation, we have seen that there is no analogous “ethical instinct.” Elementary hunter-gatherer societies lack a notion of social status that transcends moral reciprocity; this is illustrated by their insistence on the equal distribution of food and other necessities of life—as well as by the frequency in these societies of interpersonal violence, including the casting of hostile spells, by those whose sense of moral equality has, justifiably or not, been violated.

Social hierarchy—to follow Marshall Sahlins’s example of the “big-man” in Stone Age Economics, whom he describes as eating less although producing more than his fellows—emerges from the differentiation of social utility rather than from the oppression of the weak by the strong, which is precisely what language and culture came into being to prevent. It has been the key social purpose of religion in differentiated societies, however consensual or despotic, to provide their members with ethical notions of justice that override the simple moral sense of reciprocity.

But, as the French revolutionaries discovered, justifying a given situation in ethical and even religious terms cannot prevent its being denounced, rightly or wrongly, as immoral. The religion of wokeness seeks to universalize moral reciprocity by extending it over time, so that the present and future must serve to compensate past inequalities, equality giving way to “equity.”

Tout compte fait, wokeness is perhaps best understood as carrying to its extreme the deconstruction of metaphysics that had preoccupied the French Theory generation and its post-Hegelian predecessors. From the suggestion of Dostoevsky’s Underground Man that 2+2=5 is beautiful in its way to the dismissal of mathematics as “white” and the refusal to tell schoolchildren when their calculations are incorrect, we have descended the intellectual scale without changing the ultimate objective. Propositions, particularly mathematical ones, can be objectively judged for correctness, unlike the originary ostensive, which is “correct” whatever it points to. Teachers might do well to use the example of wokeness as a way of acquainting their pupils with the originary hypothesis.

When the hitherto accepted religio-ethical institutions fail to inspire a society’s members with an effective sense of solidarity, cooperation can be maintained only by tyranny. In the days of Aristotle’s Politics or of Montesquieu’s Esprit des lois, such governments were seen as unstable, one tyrant being easily replaceable by another, ultimately motivating a return to more stable, consensual forms.

This has not, however, been the general case in recent times. Many contemporary tyrannies have endured over several generations, and when they are overthrown, as often happens in “failed states,” the result is virtually never anything like liberal democracy. On the contrary, democracies far more frequently evolve into démocratures, regimes that keep democratic forms while functioning as dictatorships. I cannot claim to have made a thorough study of the question, but it seems clear that the default today, in contrast with the Western triumphalism of Francis Fukuyama’s “end of history,” is rather the replacement of liberal democracy with dictatorship.

The more successful liberal democracies continue to exist, but having squandered a good deal of their firepower and persuasiveness in what now appear as ill-conceived attempts to induce former colonial territories to emulate their political forms, they now find themselves facing a Chinese totalitarianism that has been able to profit from the market system while enhancing its dictatorial nature, as well as a meddlesome post-Soviet Russia and a militant Iran associated with persistent Islamist terrorism—which supplies via the “Palestinian struggle” the pretext for the current rebirth of Western antisemitism.

These developments are made all the more ominous by the fact that, so far, the overall effect of these challenges on the West and on the US in particular has been just the opposite of a patriotic revival. The recent evidence that the American military, long exemplary in its strict racial equality, is moving to emphasize racial and gender “equity” and critical race theory (CRT) over military preparedness reflects its descent into the same swamp as the rest of our globalist elite. Rather than mobilizing our forces, we have reduced our defense budget and greatly weakened our navy. Insisting that the Chinese military is not well trained or combat-experienced, or that China faces severe demographic problems down the road, is irrelevant to the vital question of whether our visible fecklessness will tempt Beijing in the near term to belligerent acts, notably an attack on Taiwan, which would risk putting an end to the postwar world as we know it.


Given the digital age’s requirement of maximizing each individual’s capacity for symbol-manipulation, and the clearly deleterious effect this development has had on democratic polities, it is unfortunately not inconceivable that strict central control may be the only way to maintain a reasonably efficient system in our era.

Americans would be very wrong to understand wokeness as dependent on the US’s “peculiar institution” of black slavery. France, whose strict meritocracy—and relative freedom from racism—had been a major source of pride since the days of Napoleon, has reached the point where the Baccalauréat, limited a couple of generations ago to the top 10 or 20% of lycée students, is now nearly as much of a rubber stamp as an American high school diploma. The formerly stringent requirements in various fields from classics to French literature to history are no longer any better met than their American equivalents. And the land of Descartes and Pascal now ranks near the bottom of international competitions in mathematics.

Some of this is due to inter-ethnic tensions involving the large numbers of unassimilated African and Middle-Eastern Muslims among the younger generation, but the recoil from the objective measurement of competence and the growth of discrimination positive (affirmative action) offer too close a parallel to the American situation to allow any doubt about the fundamental cause. The postwar flattening of class differences—the years 1945-75 were spoken of in France as les Trente Glorieuses—has now turned against itself, and the rift described in Charles Murray’s Coming Apart is as characteristic of (white) France as of (white) America.

The thesis that only an authoritarian regime that makes no excuses for those who refuse to follow the rules can maintain a functional social order in our age is confirmed a bit more each day by the growing disorder in both our cities and rural areas. Given the size of China’s population and its history as the multi-millennial “Middle Kingdom,” humiliated for a century at the hands of the West, its quest for world hegemony has yet to arouse an appropriately serious response.


As Richard van Oort once wisely pointed out, generative anthropology is not a religion. But neither is it a denial of religion. We continue to develop our way of thinking in the faith that it can help restore the West’s depleted confidence in being guided by the most insightful and potentially universal mode of human self-understanding.