I have spent the past few weeks proofreading, and rereading, past Chronicles, looking for formatting and other errors, and have now gone through three quarters of them. Corrections are thankfully few; I imagine there will be more problems when I start going over the Anthropoetics articles, with footnotes and hyperlinks. There’s really no substitute for doing this myself.
Next Tuesday I will be 77, so please indulge me in some subjective remarks.
I must admit I can’t help being impressed with the variety and overall quality of these little essays. They deal intelligently with a large number of issues and refer to quite a few other thinkers; there are also a good number of discussions of films and other esthetic issues.
But rereading them only makes it more evident that, “popular,” that is, non-professional, culture being driven by the need to discharge resentment, one’s ideas do not attain any degree of “virality” on the Internet or anywhere else unless they enlist the resentment of a well-defined clientele. That in the past few months, stimulated largely by Adam Katz’s GABlogs, there have been signs of an interest in GA from the more intellectual domains of the alt-right essentially reflects the fact that this group, unlike either the majority of “progressives” among our intelligentsia or even the conventionally conservative remainder, finds food for its resentments in GA.
GA has had on university and mainstream publicity departments the effect that DDT once had on mosquitoes. It is a fact that I cite here with no intent of vainglory that there are precious few, if any, living humanists whose work has been the basis for an electronic journal that has appeared regularly for over twenty years, as well as a dozen consecutive yearly meetings of an association (the Generative Anthropology Society and Conference). Yet neither UCLA nor any other institution has ever devoted a single line of publicity to these activities, let alone to me personally.
Not that I take this personally. It is simply that there is nothing in the university world, in the media world, and certainly not in the more diffuse world of victimary resentments, that could find its nourishment in these ideas, however clearly I believe my polemics with other thinkers demonstrate the qualitative superiority of GA to other ways of thinking (see, for example, Chronicles 567, 525, 519, 490, and 444; you can search the names of other thinkers at http://anthropoetics.ucla.edu/apsearch/).
The old academic world that I knew as a student, and that endured through most of the 1970s, was incomparably less politicized than ours, which is to say, far less directly resentment-driven. In those relatively halcyon times, when there was not yet an oversupply of PhDs, the “academy” saw itself as an unproblematically elite operation, an “old boy” system—which also included a few women, like my medieval professor and later Indiana University colleague Anna Hatcher—but which had above all a sense of its own cohesion without a need for the unseemly struggle for visibility that has since become the norm. Just to give a counterexample, in 2010 or so, the last “8-year review” of my department at UCLA before I retired, prepared by a local committee that included one or two outside reviewers, did not even mention my name, although I had published about as many scholarly works as all my colleagues together. But since I no longer attended MLA meetings and wrote about other things than French literature, I was no longer considered part of the “profession.”
I would love to be able to say that my nonetheless relatively successful academic career should serve as an example to those concerned more with developing their ideas, in particular those of GA, than with the kind of glory available to members of the MLA. But the economics of the profession are far from what they were when I came to UCLA in 1969. Today someone with my qualifications, and my negative “diversity” value, would have little hope of landing a job in a major university. Nor can I honestly present GA as a key to academic success.
Whereas in the sciences, beneath all the politics and grantsmanship, there is a solid objective core that permits scholars of demonstrated ability to explore byways that may just turn out to be productive, the social sciences, and particularly the humanities, have no such grounding in external reality. In the period that was ending just when I began my graduate studies, literary scholarship claimed to address a well-defined, limited set of tasks: compiling critical editions, writing “literary history,” providing straightforward commentaries on literary works grounded in scholarly knowledge of their historical context. Those were the days when senior professors, as I believe they still do in France, would tell their degree candidates that il y a une thèse à faire on a certain topic: an obscure manuscript, a posthumous publication, or to be particularly daring, a new interpretation of a well-known work, based not on personal insight or “critical theory,” but on some newly discovered historical data.
I’m sure I would not have been happy under this old system, but it had the advantage of a serene objectivity; reputations were made by fulfilling a relatively transparent set of criteria, not by making oneself visible at conferences. In a sense, I had the best of both worlds; I began my career near the start of the era of la nouvelle critique, but it was not yet the time of “French theory”; there was still room to be original without being fashionable.
In the past few decades, and increasingly in the last few years, the humanities and social sciences have been overwhelmed by victimary politics, both in their choice of subject matter and in the increasingly narrow focus of their analytic perspectives. The one new area that strikes me as promising is what they call at UCLA and elsewhere the “digital humanities,” the use of computing power to do such things as reconstruct models of ruined buildings on the basis of fragmentary evidence, detect word-patterns of different kinds in masses of text, or mine Twitter to study reactions to films or books. No doubt such “big data”-driven analysis is indeed a way of the future, one also that blessedly allows the researcher to avoid the need to foreground political judgments about the works he studies. But GA’s “originary thinking” has other ambitions.
I tend to agree with those who say that these Chronicles—which I hate to hear called “blogs”—are really my most important work. They add up to well over a million words, and cover most of the terrain of GA, often more subtly and in greater detail than any of my books. Indeed, my last few books, with a few exceptions—my biography of Carole Landis, my online sketch of “the Girardian roots of GA”—have been compiled from revised versions of the Chronicles, and the new edition of The Origin of Language, which should be published shortly, was composed in Chronicle-sized chapters and published in this series. Like Derrida, if not Girard, I am more comfortable as an essayist than as a book writer. These essays, read by a few dozen people, yet available to countless millions, embody the paradox of a set of ideas too “minimal” to be accepted or even noticed by the contemporary intelligentsia.
How many of my readers, when confronted by the originary hypothesis, have reacted by conjuring up a rival hypothesis of their own? After all, the scenario is wholly speculative. And once one has done this, instead of realizing that the details of the event are relatively unimportant in contrast with the principle that there must be an event, the typical reaction is: if I can think up an alternative scenario in ten minutes, then there’s nothing special about this one, and that just demonstrates that the whole business is not really serious. Paleoanthropologists work for years digging up bone fragments, neuroscientists spend years exploring brain synapses, and this guy just comes along and invents a theory out of whole cloth and expects everyone to go along with it… And yet my theory plausibly explains the origin of human language and culture and religion, and the others do not.
Readers of my Girardian Origins of GA will recall that in 1978, when Girard invited me to Hopkins as Visiting Professor in hopes that I would join the faculty there, I decided, for my graduate seminar, instead of giving a normal course on literature and/or literary theory—such as the one on Derrida and Girard that I had given the previous year at UCLA—to spend the semester expounding my theory of esthetic paradox as developed in my Essais d’esthétique paradoxale, which had just appeared, without otherwise supplying a reading list. In retrospect, this was a crazy thing to do. Yet I had been Chairman of my department at UCLA for three years; I was not fundamentally irresponsible.
In retrospect, my behavior seems less a simple conduite d’échec than the manifestation of a kind of unconscious wisdom: my career was surely far more productive at UCLA, where I was not in competition with anyone, than it would have been in the hothouse atmosphere of the East Coast, even before it heated up still more a few years later with the spread of deconstruction and “French theory.”
And however irrational my gesture may have been, it was also an act of faith: if I was to fulfill my destiny, I must follow my star and not play it safe. As it turned out, my theory of esthetic paradox was, although fundamentally sound, of limited scope. But as a result of that visit and the reflections that it inspired, I produced The Origin of Language and “discovered” GA.
That my writings have not been widely read or discussed is not a source of joy, but from the perspective that resentment is our default attitude toward others “in our field,” it has been a blessing in disguise. I imagine that if the authors of the books I criticized had ever thought to read my criticisms, they would have responded in kind, and I would have been forced to waste time on polemics that, as they say, would have generated more heat than light.
As it is, I have felt free to present my ideas without fear of offending anyone, yet knowing that they are available to all. Given that I was fortunate enough to enter the profession at a time when good positions were available and, whatever my follies, now have a perfectly adequate income for my old age, this relatively serene existence has been far better suited to me than the mantle of “celebrity,” even in the limited sense in which it falls on the shoulders of intellectuals.
Blaise Pascal, best known in the United States for his “triangle” of binary coefficients, was, among other things, a pioneer of the theory of probability. The central religio-philosophical concept of his Pensées was that of le pari, the wager, which as a good Jansenist, he attempts to persuade the libertin or skeptic to make on the existence of a benevolent God, even if, as such a person might think, the chances of God’s existence are vanishingly small. For, as Pascal puts it, if we lose, we lose only our finite life, which will end in any case, but if we win, the payoff is infinite. (I assume that at the time he wrote this, no one had yet thought up the St. Petersburg paradox, which gives the gambler a spuriously “infinite” expectation.)
The paradoxical esthetic on which I wagered my appointment at Johns Hopkins was not a good bet; but it led by a mysterious path to the formulation of the originary hypothesis on which I have wagered the remainder of my career, materially secure in my tenured position at UCLA.
I am happy to say that rereading these Chronicles written over 23 years gives me the feeling that I’m winning my bet. If the intellectual world is not able to accept these ideas today, even if it never accepts them, I have no doubt that if humanistic thought is to have a future, it will be forced to retrace the path I have taken.
Above all, I am grateful to the handful of brave spirits who have accompanied me on this journey. These scholars are making their own contributions to generative anthropology, which I have every hope they will carry forward into the next generation and beyond.