By forty, you learn to live with yourself; at sixty, you have to bet on yourself, on your own unique set of limitations. You may be able to push them back a little, but you’re surely not going to “transcend” them. Thus I have no choice but to wager that someone like myself–someone too impatient with data to concentrate his energies on any particular subject-matter, notably the details of others’ lives (my last “biographical” book dates from 1974)–can contribute significantly to human thought.

Over twenty years ago I proposed that the origin of language must be conceived as an event because it is the origin of the event-ness that defines human culture. Nowhere in the ever-increasing volume of language origins research has this idea been refuted: it is simply not taken into account. I am betting that the very unanimity of this neglect shows that it is I, rather than everybody else, who am right.

If the human is indeed defined by a system of exchange that serves to defer violence, there is no difficulty explaining why, in the current context, this definition is not generally adopted. Academic life is elaborately organized to eliminate direct competition: rivals in one’s field are elsewhere, local colleagues are in other fields. Above all, there is no hierarchy: all are specialists, and no one’s work encompasses anyone else’s. Such a system cannot, almost by definition, tolerate an anthropology that insists on the transcendence of nature by the scene of culture focused on the sacred center. The center is a vulnerable place, all the more so when its occupant is no longer crucified but scorned as uncool.

The originary hypothesis defines humanity by what most critically concerns it. Whether in the Kansas City Star or the New York Times, the biggest headlines are atop stories of violence. Persons with whom we have only humanity in common interest us to the point of obsession when they become characters in a tale of human violence. Recently the Star commemorated the twentieth anniversary of the death of a notorious petty criminal who had for years single-handedly terrorized several North Missouri counties, stealing hogs, burning down houses, threatening witnesses, and, finally, shooting a man in the neck at close range–for which he was convicted of second-degree assault and released on bond pending appeal. The community had had enough; in a rural area where firearms abound, a few individuals dealt with the problem. The identity of the murderers, if the term is appropriate, has never been revealed. This story illustrates the immense danger posed to the social order by even a single person who refuses to participate in the common deferral of violence. Denial that this deferral is fundamental to being human is only another form of deferral.

Scientific discourse takes human mortality into account only as subject-matter; medical science is no more humanistic than geology. Conversely, cultural discourse defers through representation our return to physical nature. Human beings whose awareness of death is revealed to them by the culture they share cannot afford to wait forever to understand their origin. They need an impatient anthropology that provides a minimal core of self-knowledge invulnerable to the storms of empirical data.

Let me give a brief counter-example. Recently I read Andrew Carstairs-McCarthy’s The Origins of Complex Language (Oxford, 1999), which develops with great skill and tenacity an intricate argument requiring mastery of linguistic philosophy and linguistics as well as a good working knowledge of neuroscience and primatology. The author presents the specificity of human language as a consequence of the descent of the larynx, itself initiated not by the need to enunciate phonemes but as an anatomical consequence of erect posture. Our adoption of a form of language that features sentences possessing a truth value rather than simple noun phrases (NPs) is described as a consequence of the dual rhythm imposed on language by vowel-consonant articulation, which, in turn, results from the articulatory mode favored by our descended larynx. But even if the suggested correlation between the rhythms of phonetic and syntactic articulation is valid, it cannot explain the origin of language because it cannot explain the crucial correlation between the emergence of language and that of the human social order, inevitably established on a sacred foundation. It is sad to see so much intelligence expended in the service of the naïve and hackneyed antihumanist polemic with which the author concludes:

The idea that something so immense in its effects as human language could have its beginnings in physiological adjustments consequent on bipedalism strikes deeply at the assumptions that human uniqueness must be based on something rather grand and profound, such as access to unique kinds of knowledge and self-awareness. But, once we get used to the idea, I think we will come to realize that it is only our pride that is hurt. Certain historical accidents have indeed supplied us with a uniquely sophisticated mechanism for communication and for the mental representation of the world; but, apart from that, we are just one species among many. (231)

What does it really mean to describe humanity as “just one species among many”? If this is a critique of religious anthropology, the author should explain how our “uniquely sophisticated mechanism for communication” is related to religion. If not, he should tell us what would count as falsifying his description. We are treated to a moralizing sermon in the guise of a dismissal of “our” moral prejudices, which are really those of the great unwashed, since the intelligentsia has been obsessed with chastising our “speciesist” pride throughout the postwar era.

This is a case where empirical research might have benefited from the framework supplied by this impatient old amateur. Ostensives are not “NPs,” but complete utterances in their own right. The passage from ostensive to declarative is not simply cognitive but ethical; it is the passage from interdicting the sacred object through the sign to formulating this interdiction explicitly in signs. In other words, what makes humans a different kind of species is not language but ethics, which language uniquely articulates. However well or poorly our own ethical value compares with that of other species, ours is the only species that assigns such values; it is this fact, rather than the values assigned, that is relevant to Carstairs-McCarthy’s inquiry.

Dear reader, if you’ve heard all this from me before, don’t be impatient: who knows how much longer I’ll be around? But if you find value in this kind of anthropological humanism, you owe it not to me but to yourself to defend it. Social science insists with quasi-religious fervor that language is either a simple extension of animal communication or an “instinct” analogous to those of animals. The idea of human uniqueness is dismissed as a contemporary equivalent of the belief that the sun revolves around the Earth. But denial cannot save us from the ethical question this uniqueness poses. The unique attributes of humanity are what make us dangerous to each other and, accessorily, to the other inhabitants of the planet. If we would protect the Other, we must first ensure that we are protected from Each Other; this Hobbesian need is what defines us as human in the first place.

I am betting that, however long people deny these truths, sooner or later, they will come to acknowledge and live with them, as the basis of what Kant called “perpetual peace.” I may not see that day, nor, no doubt, will you, dear reader; but you would do well to bestir yourself if you would like your children or your grandchildren to do so.