I had nearly finished a Chronicle deploring the West’s lack of sympathy for Israel’s need to respond to the Hamas pogrom of 10/7 when, on the day following the end of President Trump’s 60-day deadline for Iran to engage in serious negotiations to end its nuclear program, Israel decided, with the accord of the US, that it could not wait any longer to attack the personnel implementing this program as well as its key installations, with results so far highly encouraging. But as it would be premature to draw any conclusions while the fighting continues, we will instead pursue our meta-reflections about the origin of language.
We must begin from the fundamental notion that language is not merely a signaling system, but a uniquely human mode of communication, one that, pace Chomsky, cannot be understood as principally a means of thinking. Indeed, we should note that the current revolution in AI has been brought about through the abandonment of logical/mathematical reasoning in favor of the imitation of empirical human linguistic practices as the only really useful guide to the construction of “authentic” sentences and conversations. Whence the apparent difficulty posed, ironically enough, by logical elements such as “not” in sentences (see https://www.newscientist.com/article/2480579-ai-doesnt-know-no-and-thats-a-huge-problem-for-medical-bots/), where the referents of such logical modifiers cannot be deduced merely from their proximity to words in the sentences of the corpus at the source of the AI engine.
And I am sure all users of AI systems based on Large Language Models have noted that they make logical mistakes of the most elementary nature. For example, tell the AI that you are seeking a three-letter word to fit in a crossword puzzle, and half the time it will give you a two- or four-letter word, a mistake beneath the intelligence of a literate six-year-old.
I mention this simply to point out that whereas in mathematical calculations we have no need to reflect about “thinking” as distinct from mechanical calculation, what we do with language cannot so easily be mechanized—indeed, in principle, perhaps not at all, although the machine imitation may well be sufficient for most purposes.
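The contrast with mechanical calculation can be made concrete. The very constraint that the language model so often misses is trivially checkable by machine; a minimal sketch (the word list is, of course, an invented example):

```python
# Enforcing the crossword constraint mechanically: a few lines suffice
# to do what the LLM's imitation of linguistic practice often fails at.
candidates = ["cat", "stone", "era", "at", "sun", "idea"]

def three_letter_words(words):
    """Return only the words consisting of exactly three letters."""
    return [w for w in words if len(w) == 3]

print(three_letter_words(candidates))  # ['cat', 'era', 'sun']
```

The point is not that such checks are deep, but that they belong to calculation, whereas the model's fluency comes from imitating linguistic usage, in which such constraints are not reliably encoded.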
I half regret having entitled my first GA book The Origin of Language (UC Press, 1981; henceforth TOOL), since this suggests a historical solution to the question, as though I claimed to have found archaeological evidence of the very moment in which human language came into being. And it is no doubt because of the absence of concrete evidence of this Ur-moment that the book’s originary hypothesis has provoked so little reaction in academic circles. As a professor of linguistics said to me dismissively some years ago, “You can’t prove it.” As if the absence of empirical evidence of a kind that would be virtually impossible to produce invalidated the search, not for the chronologically first use of human language, but for a minimal hypothesis of its emergence. And by extension, as if, in the absence of such evidence, no useful purpose could be served by seeking a minimal formulation of the difference-différance brought into the world by the invention of language, whatever the details of its spread from its original place(s) of discovery to the rest of our species.
For the difference between human language, along with the rest of the culture that emerged in company with it, and the purely physiological developments that along with their behavioral extensions can explain the evolution of the various capacities of living creatures, is that language carries with it the potential, unpredictable from its earliest manifestations, to create parallel worlds of representations, with analogies in what we call the “arts” as well as in a limitless number of collective behaviors, from religious rituals to political structures, all mediated by the scenic interaction of its users. The (human) scene, even more than its signs, is the fundamental creation of language. This is clear from studies of animal behavior which show that the configuration of a scene is impossible for them, as witness the entirely one-on-one transmission by the “explorer” honeybee of the instructions for locating its newly discovered source of pollen to the rest of the hive.
That the notion of human scenicity has remained unexplored in relation to language and culture generally is explicable as a consequence of the general view of language as a technique, one largely limited to humans, but which scientists are most willing to attribute to other animals on the evidence of what appear to be complex and sophisticated signaling systems, as though the lack of anything even minimally comparable to human culture among even the highest primates did not make clear that an essential component that language uniquely possesses, precisely what I call the scenic, is missing from non-human systems of communication. In TOOL I situated the declarative sentence, which marks the “maturity” of human language, at the end of an evolving series of syntactic forms leading from the ostensive to the imperative and finally to the declarative as the response to a “failed” imperative, that is, as a linguistic response to a worldly demand that creates for the first time a genuine linguistic dialogue, as opposed to what had previously remained in practical terms a signaling system. This placement allows us to understand the role of the declarative on the human scene in a way that treating it as the incarnate expression of a “truth” does not. It is not to downplay the importance of logic to observe that it must emerge from the practical functions of language, and that the possibility of this emergence depends on the scenicity that creates among humans a new form of mutual presence.
And in turn, the recently observed power of the screenic to interfere with the interactive element of the scene by allowing the individual to create a solipsistic imaginary universe that he can people with his own choice of characters is clearly a major factor in today’s loosening of the social bonds among members of human communities, a factor surely connected to, among other disquieting developments, the sub-replacement birthrates of industrial societies. All the more reason to focus our attention on the scenicity that has since the origin of our species held human communities together.
To conclude this brief reflection, I cannot help returning to what was for the postwar generation a key document of human self-understanding: Sartre’s L’être et le néant, or Being and Nothingness, first published in 1943 under the German occupation. The book’s key anthropological insight is the opposition it draws between the pre-human en-soi (in-itself) and the human pour-soi (for-itself). These terms are translations of Hegel’s an sich and für sich, but Sartre gives them an anthropological twist, one that, as I discovered in preparing the 2024 GASC in Tokyo, was the result of his contact in Paris in 1928 with the Japanese philosopher Shūzō Kuki, whose understanding of “nothingness” was strongly influenced by Buddhism. The contrast between the nonhuman world of the en-soi, which in Sartre’s imagination has no “free space,” so that everything is “jammed together,” and the pour-soi, where there is a space of freedom between the mind and its objects, is a fascinating subject of reflection. This physical imagery, very much akin to that used in Buddhist texts to describe the openness of what I would simply call the scene of the language-enabled mind, presents the uniquely human freedom that makes us capable of conscious reflection, in contrast to the limitation of other creatures to reflexive reaction.
What is most fascinating of all is that Sartre, while attributing to the open space of the pour-soi the freedom that makes us human, never made the connection between this openness and the scene of language. His vision of human subjectivity is so to speak naturalistic, without any debt to human culture. It is as though freedom cannot be made dependent on anything other than itself, with the consequence that L’être et le néant unfortunately remains a work of philosophy in the narrow sense, even while its specific analyses, often fascinating in their detail, emphasize the freedom that humans alone possess by virtue of the scenicity that, as we have seen, is the direct product of the deferral by which humans have learned to avoid conflict.
And this deferral in turn depends—to evoke a quality utterly foreign to Sartrian thought—on our sense of the originarily sacred nature of our separation from the object of contemplation in the mode that Sartre describes as the pour-soi, which poses a desired object on a scene accessible only via a sign that communicates to others its sacrality, its necessarily deferred accessibility. It is indeed difficult to imagine the hostility with which Sartre would no doubt have reacted to this anthropological analysis of his most important philosophical work.