The recent Thinking the Human conference at Stanford, my paper for which was published as Chronicle 402, gave me a new idea for positioning the originary hypothesis with respect to the sciences that I would like to discuss here.
I delivered the paper as part of a rather arbitrarily constituted panel of two, whose other member was a Stanford anthropologist, whom I shall call T, who has done extensive field work among evangelicals and other “low church” Christians. The burden of T’s presentation was to report how prayer, the gift for which is closely correlated with the psychological trait known as absorption, the ability to concentrate wholly on one thing to the exclusion of all else, is associated among these believers with experiences of divine presence. These experiences are generally brief, typically involving a few words heard in the mind with no obvious source and usually recommending some course of action.
T displayed not the least condescension in discussing beliefs that Dawkins and other professional atheists treat with hostility and contempt. Her approach to the claims of her subjects that they heard God speaking to them was to take their statements at face value within their own belief systems rather than to judge them according to “objective” criteria. As she put it, she took no ontological position with respect to these claims, let alone to the root causes of their possibility.
Since I spoke after her, I made the point that although our talks had few specifics in common (the most notable being the mention of C. S. Lewis’ Mere Christianity), the originary hypothesis, for which the originary sign is best interpreted as the “name-of-God,” understood language in much the same spirit as her “ontology-neutral” presentation of her subjects’ experiences of the divine.
Nonetheless, after my talk, T, clearly scandalized by my approach, asked me why I was indifferent to empirical data, since of course there could be no originary event of language, no “first word.” To which I replied that GA was a humanistic rather than a scientific theory, a “new way of thinking” that was meant to provide a model of the human rather than a set of hypotheses to be tested by paleontology. While this defense is true as far as it goes, I think it can be strengthened and made more persuasive.
I expended much energy at the conference in a not altogether successful effort to remind the scientists present that (1) humans are characterized most simply and fundamentally by language rather than any of the various “cognitive” functions that accompany it, (2) the origin of language and that of the sacred cannot be separated, and (3) sharing food is the most fundamental form of human sociality. Of these fundamental anthropological truths, not even the second is controversial in itself, but in the milieu of empirical science, the idea of combining them as a joint justification for an originary hypothesis is simply unthinkable. My experience convinced me that our “new way of thinking” will not be given a hearing by scientists unless it can be presented in such a way as to make unambiguously clear the difference between its claims and those that are empirically falsifiable. T’s own research suggested a model for this new mode of presentation.
Although social scientists do not consider the originary hypothesis a true hypothesis in their sense since it is not testable by empirical means, they readily accept that such things as Ignatius Loyola’s imaginative techniques can improve the effectiveness of prayer in evoking concrete images of God and, in the case of T’s subjects, in prompting the non-hallucinatory “commands” that they take to be “from God.” (One of T’s more interesting results was that a group of subjects who listened to Loyola-like discourses designed to stimulate the religious imagination gained greatly in sensitivity to divine presence, in contrast with a control group that was given traditional religious texts.) Bracketing ontological claims concerning the reality of the referent of these experiences allowed T to express her results in empirical terms. Yet the possibility of such experience was simply taken for granted. How is it that people have the sensation of speaking with God? Is the mere fact that people claim such experiences a sufficient explanation of their possibility? It would seem all the more crucial to explain this possibility, given that we have no way of determining the status in reality of the personal divinity to whom these experiences are attributed.
This suggested to me that we might do well to explain the originary hypothesis, as I have at various times described it, as a heuristic for facilitating our understanding of the human, in somewhat the same way as Loyola’s spiritual exercises facilitate our ability to become aware of God’s presence. For when the nonbeliever claims to evaluate the effectiveness of Loyola’s procedures without making any ontological claim as to the existence of God, or even as to the anthropological source of the idea of God, we can argue in defense of our heuristic that as students of the human we cannot not concern ourselves with the origin of language and religion, and that however they originated, all evidence suggests that they were originally linked—a position defended by the late Roy Rappaport, whom I have often mentioned in these Chronicles.
Putting its explanatory scenario aside, GA’s own neutral attitude toward the ontology of the divine is identical to that of anthropology as T conceives it. T insists, precisely, that the attitude of prayer is neither hallucinatory nor fictional. Interpreting a word as “from God” is not like claiming to have experienced alien abduction. But if bracketing the “existence” of God does not prevent T from discussing, and in every practical sense understanding, the believer’s experience, this suggests that the very notion of God is such that its reference and its meaning or signified are indistinguishable. T’s reduction of religious experience to the simple claim of the presence of the sacred, independently of any other supernatural element, shares with GA the quality of minimalism. GA’s hypothetical scenario may consequently be offered as a heuristic permitting us to understand the possibility of this minimal sacred experience. This does not require us to abandon the ontological claim that the event defined by the emission of the first sign is unique, at least within a given human group. Whatever the exact scenario involved, at some point people must have begun to have the experiences of divine presence that evolved into those T’s subjects report.
Although GA’s heuristic model is minimal in its own terms, social scientists feel compelled to slash at it with Ockham’s razor because it postulates a cut in time determined by a singular event, which thereupon takes on something of the import of the big bang. Yet there is no more parsimonious way to explain the phenomena of human culture in their scenic unity. Human cultural/representational activity is not understandable without the notion of an event taking place on a scene. An event, any event, is a cut in time, a historical present that divides a past from a future. This analysis of time, associated with Heidegger’s and Sartre’s existentialism, can be traced back to the romantics and to Augustine, and beyond them to the universal human intuition of eventfulness that one finds in every cultural text, religious or secular.
I think the ontological weight of the firstness of the event can be greatly diminished if in describing the participants’ experience we emphasize less the fact that it is the first event than the fact that it is an event at all. To put it a bit differently, the heuristic model of the originary event is not first defined as unique and then understood as the source of the fundamental configuration of human culture. It is the event-heuristic that is primary, its uniqueness, secondary. Once we understand that uniqueness is a characteristic of every event, it becomes clear that it is a fallacy to deny the uniqueness of human origin by alleging the improbability of a singular originary event. The difference between the gradualism of biological evolution, however “punctuated,” and the eventfulness of the human is inherent in the human event as such, which contains its own internal self-consciousness of irreversibility in the fact that it is understood as always already represented. The “heuristic” model of the hypothetical event is simply eventfulness itself, and regardless of the accuracy of the scenario we use to flesh it out, the cut it makes in human history is that of every human event.
The idea that the human can be defined by a list of traits each one of which might theoretically have emerged at a different time and in a different context is a matter not of ontology but of methodology. It is the anthropological equivalent of performing a laboratory experiment in which everything is kept constant except for the single parameter whose ultimate effect on the organism or ecology one is seeking to determine. The fact that in the case at hand no such experiment is possible provides no reason to abandon this mindset, since even in the laboratory of the mind, holding constant all parameters except one makes thinking clearer in principle. Save that—and this is what social scientists refuse to accept—the principle does not apply when the parameter in question is conceivable only as an effect or a corollary of the human phenomenon of shared representation through signs. Language, religion, and such related phenomena as shared intentionality and “theory of mind” are arguably interdependent and associated in their emergence with a communal, scenic context; to separate these traits in their origin and thereby to treat them as qualities of isolated individual minds makes it impossible to explain this interdependence. To then take this impossibility as a refutation of the hypothesis of interdependence would be a case of circular reasoning incompatible with the scientific method.
Beyond language and other forms of representation, a feature of the originary event that deserves greater emphasis than it normally receives is the sharing of food. Readers of Girard might think that the typical ceremony in human societies is some form of scapegoating, whereas our most frequent “scenic” activity, universal in all human communities from hunter-gatherers to the most advanced industrial economies, is eating in common. Rare indeed is the family that doesn’t get together on holidays such as Thanksgiving or Christmas for a collective feast. The minimal human configuration is the scene, with a collective human periphery surrounding and designating/signifying a sacred center containing an edible object whose “equal” division will supply “equally” shared food—the product of the new mode of distribution initiated by the collective sacred as a solution to the mimetic crisis occasioned by the breakdown of the alpha-beta pecking-order hierarchy. The third criterion should be given equal billing with the other two, because only a new mode of alimentary distribution can guarantee the superior adaptive utility of the scene/sign/sacred configuration over the old alpha-beta hierarchy. To say the new configuration “prevents mimetic violence” would be meaningless in the absence of a new mode of distribution, since peaceful alimentary distribution is the main burden of such hierarchies.
If this discussion demonstrates anything, it is that it matters how we define the human. For it is this definition rather than any “objective” scientific criterion that determines the salience of the event-nature of the human. If we define the human in purely biological terms, whether as homo sapiens alone or as including other species such as the Neanderthals, not only can no singular event determine the passage between the nonhuman and the human, but the very category of event by which GA defines the human can have no part whatsoever in this determination. Conversely, if we define the human in cultural terms, by the presence or absence of (shared) representation, then we cannot exclude from our definition the notion of the event and its singular status.
It is by choosing the intuitive evidence of eventfulness over empirical methodology that our new way of thinking parts company with the sciences. It distinguishes itself as well from philosophy-metaphysics, whose various implementations of this intuition always take for granted the existence of a language capable of describing, or, to put it more fashionably, inscribing, eventfulness in a set of propositions whose logical connections can then be analyzed.
GA has no pretension of doing without propositional language, even when (as in some recent research by Adam Katz) it seeks the ostensive basis for declarative utterances. But it is a fallacy to assume that this necessity condemns us to either the social-science or the philosophical version of the “prison house of language.” Our internment there is voluntary and will last only so long as we refuse to accept the ontological novelty of the human and of human language. That the uniquely human trait of shared representation is indeed a new form of being accords with our common-sense intuition that, since we are wholly dependent on this trait in constructing our models of the universe, we cannot refuse to recognize its singular status.
Perhaps the methodology of empirical science demands this denial; that is a question that can be answered only in practice. But it is certainly not a demand of logic or of common sense. On the contrary, the unexpected persistence of religion even among those who have experienced the disenchantment of the world might serve as a hint to those who study the human that these believers may know something the researchers do not. Thus I would suggest that the results of Professor T’s study, so respectful of its subjects’ beliefs, hold a lesson that, properly understood, has the potential to burst the methodological bonds within which traditional anthropology would contain it.