The Long View 2006-10-03: Religious Studies; Humans versus Homo Sapiens

St. Thomas Aquinas – An altarpiece in Ascoli Piceno, Italy, by Carlo Crivelli (15th century)

By Carlo Crivelli - Via The Yorck Project (2002) 10.000 Meisterwerke der Malerei (DVD-ROM), distributed by DIRECTMEDIA Publishing GmbH. ISBN: 3936122202., Public Domain, https://commons.wikimedia.org/w/index.php?curid=149804

This is perhaps the most profound thing John J. Reilly ever said:

Human, homo sapiens, and person are not just different things, but different kinds of things. A human is an essence (if you don't believe in essences, then you don't believe in humans; maybe that's Peter Singer's problem); a homo sapiens is a kind of monkey; and a person is a phenomenon. Perhaps I read too much science fiction, but it is not at all clear to me that every human must necessarily be a homo sapiens. (As for the converse, C.S. Lewis occasionally toyed with the possibility that not every homo sapiens need be human; so have I, though I'd rather not pursue the matter.) As for "person," I think this kind of argument conflates the primary meaning of "person," which is an entity, conscious or otherwise, that you can regard as a "thou," with the notion of "person" as an entity able to respond in law, either directly or through an agent.

I ponder this all the time, and the critical distinction he makes here just gets better with age. Clear terms enable clear thinking. 


Religious Studies; Humans versus Homo Sapiens

 

Your degree in Religious Studies is not useless, if we consider the Jobs for the Boys implicit in this assessment by that Spengler at Asia Times:

Theological illiteracy is epidemic in the neo-conservative camp. The American Enterprise Institute's Iran expert, former US Central Intelligence Agency officer Reuel Marc Gerecht, thinks that "Islam is akin to biblical Judaism in accentuating the unnuanced, transcendent awe of God". Gerecht is ge-wrong. Worst of all is Norman Podhoretz of Commentary magazine, who insists that Islam takes even a stricter approach to idolatry than Judaism....These are the blunders of secular intellectuals who approach religion from the outside. Because the neo-conservatives propose to democratize the Middle East, they also must insist that Islam can be twisted into the pretzel that they prefer.

In fact, the foreign policy establishment was trying to get up to speed on religious issues even before 9/11. This effort requires academic expertise, however, and "Religious Studies" is too often an exercise in stultifying multiculturalism or, worse, a scarcely disguised form of Tradition. Many of the people who are already in the religion consultancy business are part of the problem: don't even ask what Spengler thinks of Juan Cole.

One of Mark Steyn's suggestions for combatting the ideological dimension of the jihad (as I note in this excessively long review of America Alone) is the creation of a "civil corps" to refute Islamist ideas and propose alternatives. Might I suggest that the only reality such a measure could have would be something like Christians praying for Muslims during Ramadan:

DALLAS, September 29 (UPI) — A global coalition of evangelical Christians is urging prayer for Muslims during their holy month of Ramadan.

But the idea has met some resistance from Muslims — and even some Christians, Associated Baptist Press reported Friday.

I am not altogether happy with the idea of evangelization as a national security strategy, but it could come to that.

* * *

Speaking of religion and politics, and particularly with regard to the Democratic Party's religion deficit, Robert P. George raises these objections at First Things:

Over at the Mirror of Justice website, law professor Eduardo Peñalver keeps reasserting his arguments for why Catholics and other pro-lifers can and should support Democrats—even those who uphold abortion. But Professor Peñalver’s arguments do not improve with age or repetition....But let us get to the heart of the matter in dispute. Either Eduardo Peñalver believes that human embryos are human beings or he does not....[T]he answer to [that question] is clear. The evidence, attested to unanimously by the major embryological texts used in contemporary anatomy and medicine, is overwhelming. From the zygote stage forward there is a complete, distinct, individual member of the species Homo sapiens...

[I]s dignity something we possess only by virtue of our acquisition or realization of certain qualities (immediately exercisable capacities) that human beings in certain stages and conditions possess (or exhibit) and others do not, and that some possess in greater measure than others, e.g., self-awareness, consciousness, rationality? If the latter, then not all human beings are “persons” with rights....

I and many others have advanced philosophical arguments against the idea that some human beings are “nonpersons.” I will not repeat the arguments here. I will say only that among the weakest arguments for denying that embryonic human beings are persons is the one that seems to have impressed Professor Peñalver: namely, the argument that purports to infer from the high rate of natural embryo loss (including failure to implant) that human embryos lack the dignity and rights of human beings at later developmental stages. No one knows what the rate actually is, in part because what is lost in some cases is, due to failures of fertilization, not actually an embryo. But the rate doesn’t matter. For nothing follows from natural death rates about the moral status of the human individuals who die.

I'm prolife, too, but I think George's arguments are glitchy. Human, homo sapiens, and person are not just different things, but different kinds of things. A human is an essence (if you don't believe in essences, then you don't believe in humans; maybe that's Peter Singer's problem); a homo sapiens is a kind of monkey; and a person is a phenomenon. Perhaps I read too much science fiction, but it is not at all clear to me that every human must necessarily be a homo sapiens. (As for the converse, C.S. Lewis occasionally toyed with the possibility that not every homo sapiens need be human; so have I, though I'd rather not pursue the matter.) As for "person," I think this kind of argument conflates the primary meaning of "person," which is an entity, conscious or otherwise, that you can regard as a "thou," with the notion of "person" as an entity able to respond in law, either directly or through an agent.

And actually, the conjecture that most concepti might be duds does bear on the matter. If, say, 75% of humans are organisms of a dozen cells that live for just a few days, that fact might not affect their dignity, but it does affect the dignity of the small minority of humans who survive to become adults. Do we really want to define humanness in such a way that intelligence, percipience, or compassion become irrelevant? Then there is also this: if those certainly human concepti could be rescued, wouldn't we be morally obligated to try?

Let me suggest that a human is a little like a quantum particle that strikes a target behind more than one aperture: you can tell where it has been only after it has arrived. Certainly every adult alive today was once a conceptus, and then an embryo, and so on: every stage of this development shared the same essence, and so had the same dignity. That's quite different from saying that every conceptus is a human being; the most we can say is that every conceptus might be. That is quite enough reason not to interfere with it.

A final point on this matter: the human-life question will turn out to be epiphenomenal to the end of the abortion era. Contraception, abortion, and homosexuality were all features of a human-rights package that was designed, at least in part, to lower the birthrate. The intellectual and cultural climate on this issue is changing very rapidly. The interesting thing is that, whereas the courts that created these rights tended to avoid the suggestion that they were really implementing a population-control program, the courts now seem open to explicitly pro-natalist arguments.

You need an argument for your appellate brief that does not smack of theology or natural law? Here you have it.

Copyright © 2006 by John J. Reilly

Why post old articles?

Who was John J. Reilly?

All of John's posts here

An archive of John's site

The Long View: The Interior Castle

This luminous review of St. Teresa's The Interior Castle finally brings me full circle, back to the mysteries of human experience, and the unity of mystical experience across religions. 


The Interior Castle
By St. Teresa of Avila

Translated by: 
The Monks of Stanbrook, 1911
Spanish Original:
Published Circa 1583
Barnes & Noble, 2005
227 Pages, US$9.95
ISBN: 978-0-7607-7024-5

 

If someone asks you, "What do you want from life?" all sorts of answers may occur to you. Ancient tradition suggests, however, that you should ask for something like this:

[T]he spiritual marriage with our Lord, where the soul always remains in its center with its God. Union may be symbolized by two wax candles, the tips of which touch each other so closely that there is but one light; or again, the wick, the wax, and the light become one, but that one candle can again be separated from the other and the two candles remain distinct; or the wick may be withdrawn from the wax. But spiritual marriage is like rain falling from heaven into a river or stream, becoming one and the same liquid, so that river and rainwater cannot be divided; or it resembles a streamlet flowing into the ocean, which cannot afterwards be disunited from it. This marriage may also be likened to a room into which a bright light enters through two windows--though divided when it enters, the light becomes one and the same.

The spiritual marriage is an event that occurs in the Seventh Mansions of the seven-region structure of the soul described in this book by Teresa Sánchez de Cepeda y Ahumada (1515-1582), the reforming Carmelite nun. She was later named a saint and a Doctor of the Church; she is best known as St. Teresa of Avila. The state she described is the best thing that can happen to a living human being.

The contemplative tradition of prayer in which Teresa is such an eminent figure prescinds from most late-modern discussions about the reality and nature of the divine. God is not a proposition to be proven; or even an object of faith, at least after the first stage of prayer as self-initiated meditation. Rather, God is known through direct experience, an experience that is prior to any philosophical or scientific glosses that students of contemplation might apply to it. In that sense, contemplative prayer is an existentialist enterprise, rather like Heidegger's study of conscience as the voice of Being. The difference is that modern existentialism appeals to immediate experience on the assumption that experience will always behave itself. In the world of the contemplative, experience does not behave itself at all.

Be that as it may, any class of phenomena that are predominantly mental is going to raise at least some suspicion of insanity, fraud or mistake. Teresa reminds us more than once that she suffers from headaches, and that she sometimes hears a sound like rushing waters. There were points in her spiritual life, she makes clear, when she was simply ill. Critical of her own experience, she offers readers frank cautions about the psychological pathologies to which the nuns of her Order are subject. ("Melancholia" is not a modern diagnosis, but it seems at least as useful as later terms have proven to be.) She has a quite lively sense of the power of wishful thinking. She evidently knows mere silliness when she sees it. She also warns that even the most dramatic psychological event can be a diabolical deception, or may simply have no deep significance at all.

Readers of her book will soon appreciate how disciplined her treatment of contemplation is. They will also appreciate that quite a lot of this discipline is external.

Throughout her career, Teresa's activities were impeded because she was a woman in a society where women had limited legal personality, and, in any case, were not expected to have serious intellectual interests. Teresa was the daughter of a converso family, which also made her an object of suspicion in 16th-century Spain. More important, she and her colleague in the male wing of the Carmelite Order, Saint John of the Cross, were continuing to cultivate a tradition of late medieval spirituality that the Spanish hierarchy of her day strongly suspected, not without reason, of having contributed to the start of the Reformation. Teresa was periodically suspected of being one of the alumbrados, a mystical movement whose beliefs shaded into antinomianism.

For a variety of reasons, then, Teresa had protracted problems with the Inquisition and her own superiors. In fact, in 1577, when this book was written, her access to religious texts, and even to her own earlier works, was restricted; when she makes a Biblical quotation, she warns that she may have misremembered it because she cannot look it up. Nonetheless, it says something for her general mental health that she proved to be a formidable bureaucratic infighter. She managed to keep her major works in circulation, and she co-founded the Discalced Carmelites, a branch of the 12th-century Carmelite Order that remains an important institution in the 21st century.

Teresa's uncongenial historical circumstances created fewer restraints than the system of confession and spiritual direction that can be found in some form in any religious order, but that is especially important to contemplatives. That system is not an unwanted intrusion, but an integral part of the discipline she describes. She repeatedly urges her readers, who she assumed would at first be her fellow Carmelites, to keep their confessors informed about their spiritual experiences, and their prioress about their social and psychological ones (sometimes, the best next step in one's prayer life is a vacation, or at least a change of assignments). Of course, Teresa was aware that she knew more about the theory and practice of advanced spirituality than some of her spiritual directors did. The book is sprinkled with passages like this:

The time which has been spent in reading or writing on this subject will not have been lost if it has taught us these two truths; for though learned, clever men know them perfectly, women's wits are dull and need help in every way. Perhaps this is why our Lord has suggested these comparisons to me; may He give us grace to profit by them!

Leaving aside the question of which two truths were at issue, there are several ways to view this passage. Maybe it is a simple expression of humility. Maybe it is a way of deflecting possible criticism from suspicious prelates. There is also some reason to suppose that Teresa was the snarkiest Doctor of the Church since Augustine.

* * *

We should note that nowhere does Teresa suggest that the contemplative path is necessary for salvation, or even peculiarly helpful for it. Neither does she make special claims for her model of the soul as a castle like a translucent crystal. Nonetheless, for those who found the analogy helpful, she suggested that seekers wishing to advance in the knowledge and experience of God could think of themselves as moving through a concentric system of six rings of rooms or mansions ("moradas") toward a seventh, central set, where God is most perfectly present. Each of these rings of mansions presents its own challenges in terms of personal reformation and the type of prayer that is possible there; also, in each successive ring God affects the seeker in a more dramatic and overwhelming way. After the inner sections, particularly after the Fourth Mansions, God is clearly controlling the advance, but grace of some kind is needed for every step, including the original decision to enter the Castle.

Outside the castle is a dark landscape, where poor sinners are preyed upon by "reptiles," which may be demons, or the temptations, or the sinners' own ill will. Entering on the spiritual life, the penitent comes to the First Mansions. There, with some suffering, he gains self-knowledge. This painful process is necessary, though these mansions are a relatively crepuscular region, where the assaults of the reptiles are still common. The Second Mansions are similarly dark and dangerous, but there the aspiring soul will first learn how to pray. In the Third Mansions there is less danger from the cruder assaults of evil. It is the region of ordinary virtue; continuance in a state of grace becomes easier. Though we are not told this explicitly, one might gather from the text that these are the Mansions where the faithful in secular life might ordinarily expect to spend their lives.

In any case, even in these first three sets of Mansions, one meets some of the subtle dangers of the spiritual life. Teresa counsels her readers on dealing with aridity and distraction in prayer, and about indiscreet zeal, the temptation to judge and criticize persons who seem less pious than oneself. The denizens of the Third Mansions in particular are tempted to think their lives are saintly because they are irreproachable; such people can actually benefit from the humility that comes with misfortune.

In the first three Mansions, the aspirant soul may sometimes be aware of special manifestations of divine grace, and of peace in prayer. As a rule, though, the divine is experienced only through the ordinary means of preaching and the sacraments, and through the natural satisfaction in a job well done (if you are a contemplative nun, the distinction between liturgy and labor tends to disappear). The Fourth Mansions, however, are the point where "consolations" normally begin to play a large part in the spiritual life. There are moments of the "expansion of the heart" that are outside the normal range of emotions; and indeed, in some manifestations, outside the range of nature.

There is a science of mystical experience. The Interior Castle is one of the key sources of its data; so are Teresa's earlier works, including the Life and The Way of Perfection. Rather than try to summarize the increasingly complex treatment of the inner mansions, let us here simply paraphrase the editor's Note 113 to The Interior Castle, even though it uses some of Teresa's terminology that does not occur in this particular book:

The first three Mansions of the Interior Castle correspond with the first water, or the prayer of Meditation. The Fourth Mansion, or the prayer of Quiet, corresponds with the second water. The Fifth Mansion, or the prayer of Union, corresponds with the third water. The sixth mansion, where the prayer of ecstasy is described, corresponds with the fourth water.

As for the Seventh Mansions, this review begins with a description of the spiritual marriage that occurs there.

The present text assumes that the reader is familiar with these modes of prayer and how they are performed. Meditation, for instance, seems to mean principally the sustained contemplation of the incidents in the life of Christ or of the Passion; the Rosary is a prayer of this type. In the other forms of prayer, some voluntary recollection or other act may be necessary, but the higher forms are events in which the will of the aspirant plays a smaller and smaller role. In any case, this book is less concerned with how to pray than with how to handle prayer's effects.

* * *

The theological subtext of The Interior Castle is Thomistic. Teresa was not herself trained in systematic theology, however, and even by her own account she garbled some points. This text has editorial notes and an interpolated chapter to clarify these points. Thus, the editors amplify with a venerable Scholastic gloss her distinction between the prayer of Union, which occurs in the Fifth Mansions, and the Marriage that occurs in the Seventh Mansions. The prayer of Union, the monks suggest, involves the accidents of the soul (its senses and cognitive functions), while the Marriage involves a change of its substance. This change is a transformation that identifies the soul with the divine to the degree that Teresa has a vision in which Jesus says to her "that henceforth she was to care for His affairs as though they were her own and He would care for hers." In the spiritual marriage, a human life becomes Christ's life. The editors do not make quite so bold as to call this transubstantiation.

Note that this was an "interior vision." Teresa describes "imaginary visions," which occur when people see images as if they were physical objects. She does not say such things are impossible, but that they do not belong to her experience. She also describes raptures, in which the spirit feels itself to leave the body (she is professedly agnostic about whether this is actually the case). She also describes "jubilees," which can involve more than one person, and which sound a bit like charismatic behavior. Until she gets to the dramatic (and apparently somewhat dangerous) ecstasies of the Sixth Mansions, she herself is far more comfortable with "intellectual vision," in which knowledge is infused directly into the intellect, without the intervention of the senses. This can involve a direct awareness of an object or person, including the physical appearance. Indeed, one of the greatest consolations in the more advanced Mansions is the repeated and even habitual awareness of the divine presence.

A reader with even a cursory familiarity with the literature of mysticism will find resonances in this work. This reviewer was surprised to discover how much of this book's advice about prayer and the dangers of the advanced spiritual life is echoed in C.S. Lewis's most popular work, The Screwtape Letters. Lewis was familiar with the literature of mysticism, of course, but that is unlikely to be the whole explanation. Serious spirituality is an empirical enterprise; people who have experienced its effects will recognize them in the accounts of others who have experienced them.

This does not mean that all the writers say the same things about the same experiences, or even that it is certain that the experiences are the same. For instance, in The Interior Castle, Teresa speaks of a point where a word, an idea, any small thing will cause an eruption of the divine presence. The divine sends out a flurry of sparks, any one of which could cause the soul to ignite. This sounds a bit like the climax of the anonymous English work, The Cloud of Unknowing, from two centuries earlier. In that book, the prepared soul sends out, at unpredictable intervals, shafts of aspiration that pierce the Godhead. Similar to The Interior Castle, yes: but are these moments identical?

There are certainly points where Teresa takes care to distinguish her views from those of other writers. There are some texts that suggest there comes a stage in the seeker's journey when the whole object of attention is God without qualification; the earlier meditations on Christ and His Passion were necessary, but are no longer relevant to the final stages. That is the view of The Cloud of Unknowing, which demands a preparation of perfect faith and purity of life, but moves to a point where everything, including even the benefits conveyed by God, is neglected in favor of the love of God. Teresa says that this is not her experience; she never ceases to focus on Jesus and the Cross. She never forgets the Saints, who at this level become felt companions rather than merely recipients of prayers for intercession. The Interior Castle presents a world that is less arid and alien than other expressions of advanced spirituality, particularly those of the 20th century.

Finally, we may note that the seven-part structure of the Castle makes the journey through it into a history of seven ages, which inevitably calls to mind some of the models of time based on the structure of the week. The spiritual marriage of the Seventh Mansions calls to mind the Millennium, an idea that might have a literal personal application even if it does not have a historical one. More speculatively, one of Teresa's best-known metaphors, that of the caterpillar that spins a cocoon and later dies to be reborn as a butterfly, might have an application not just to the aspiring soul, but also to the Incarnation. The cocoon begins to be spun in the Fifth Mansions, after a long history of preparation. This is not unlike the idea that the Incarnation is the center of history, structurally if not necessarily in terms of the duration of the time periods to either side.

Even if Teresa had any thoughts along these lines herself, she does not mention them in The Interior Castle. They are the sort of notion that made the Inquisition cranky, for one thing. For another, speculation was not Teresa's vocation. She wrote about only what she knew.

Copyright © 2011 by John J. Reilly


The Long View: Neocons, Theocons and the Cycles of American History

Neocons go back to James Q. Wilson or possibly the Coleman Report. They represented the wing of American liberalism appalled by the Sixties and the radical Left of that era.


Neocons, Theocons and the Cycles of American History

 

“...[T]he full story of [the generation that came of age around 1900] cannot possibly be told ....by recalling the steel-willed leaders of the 1940s...[T]he full story must include very different images--of youthful indulgence, coming-of-age fury, rising-adult introspection, and midlife pomposity and intolerance. What finally emerged late in life, the austere and resolute persona, was largely self-created by a generation determined (in Edith Wharton’s phrase) ‘to build up, little by little, bit by bit, the precious things we’d smashed to atoms without knowing it.’”

--from “Generations,” by William Strauss and Neil Howe, page 237

 

In November of 1996, the Manhattan-based magazine First Things published the first installment of a symposium entitled “The End of Democracy?: The Judicial Usurpation of Politics.” It was occasioned by the possibility that the Supreme Court might uphold two lower court decisions that had found a constitutional right to physician-assisted suicide. Among the original contributors, Robert Bork and Charles Colson were the best known, though Russell Hittinger’s essay, “A Crisis of Legitimacy,” perhaps best addressed the specific issues suggested by the symposium’s title. Considering that First Things is a respectable monthly read mostly by clergy and conservative academics, the Introduction by the editors was breathtaking. They posed the matter thus:

“The question here explored, in full awareness of its far-reaching consequences, is whether we have reached or are reaching the point where conscientious citizens can no longer give moral assent to the existing regime.”

Yikes.

Before going any further, let me first make some personal admissions. I write occasionally for First Things. I was not part of the symposium, but I wrote an essay that appeared in the July/August 1996 issue of Culture Wars dealing with much the same topic. The piece was entitled “How to Prevent a Civil War.” My argument was not so different from Robert Bork’s contribution to the symposium, in which he suggested various mechanisms for limiting the scope of constitutional judicial review. I too used the term “regime” to describe the current jurisprudential system, though I picked up the usage not from the right, but from Michael Lind’s “The Next American Nation.” I too think that contemporary constitutional theory is damned and doomed. If I differ from the symposium’s participants, it is only in believing that the current jurisprudential regime is not just wicked but rotten, and that it will collapse under very little pressure in a fashion not at all dissimilar to Soviet Communism. I am thus not a wholly impartial observer.

Objectivity notwithstanding, the reaction in the weeks that followed the symposium was manifestly explosive. Several prominent members of First Things’ own editorial advisory board resigned. Just about all the conservative magazines chimed in. The Weekly Standard, for instance, ran a piece called “The Anti-American Temptation,” which accused the editors of First Things of running off the rails of ordinary politics, in a way analogous to the “pick up the gun” radicals of the 1960s. The New Republic (not a particularly conservative magazine) ran a piece by Jacob Heilbrunn entitled “Neocon v. Theocon: The New Fault Line on the Right,” which is worth considering in some detail.

Heilbrunn’s thesis is that the neoconservatives (the neocons) are mostly New York-based Jewish intellectuals who broke with leftist politics in the 1970s. They remade conservatism by articulating serious intellectual critiques of liberalism and the welfare state. When the conservative revival began about 25 years ago, the concerns of cultural conservatives were not much represented among this group. Therefore, they were not much represented in government or the academy, despite the fact it was cultural conservatives, mostly evangelicals and ethnic Catholics, who provided the growing electoral muscle of the Republican Party. Latterly, however, the neocons have been joined by a new breed of conservative intellectual, for whom Heilbrunn has coined the nifty term “theocon.” The theocons, by his account, are predominantly Catholic, and unlike their Jewish colleagues have a tendency to frame political questions with a theological twist. The theocons, in fact, are seeking to restructure American society in accordance with Thomistic natural law. Their efforts are intellectually sophisticated, far more so than anything conservative populists from George Wallace to Pat Buchanan have been able to formulate. However, according to Heilbrunn, “Thomism is an ideology to which only the faithful can subscribe. It is not so much anti-American as un-American.”

Well, so much for John Courtney Murray and the decades-long attempt to establish the compatibility of Thomism with the American enterprise. For that matter, so much for the more recent debate about the natural-law assumptions of the Founding Fathers. The only kind of natural law Heilbrunn seems to feel is appropriate for American political discourse is found in the post-Kantian theories of Leo Strauss, who did indeed influence many neoconservatives.

I for one find Heilbrunn’s assessment more odd than offensive. Whatever else you may think about Thomism, it is difficult to think of it as a subversive political ideology. Images rise up of a Senate Subcommittee on Neo-Scholastic Activities. Could its jurisdiction be challenged on the ground that subcommittees offend against Occam’s Razor? C-Span is not ready for this.

For that matter, it is misleading to characterize First Things as a hotbed of Thomism. The editor in chief, Fr. Richard John Neuhaus, is indeed a Catholic priest, but before that he was a Lutheran pastor. Much of his social thinking is informed by the Lutheran model of the “orders of creation,” which is analogous to natural law but by design non-theological. The magazine’s editor, James Nuechterlein, remains a Lutheran and delivers himself of a no-popery declaration every few months to make sure that no one forgets. The managing editor, Matthew Berke, is Jewish. The contributors to the magazine are all over the lot in terms of denominational affiliation. First Things is perhaps most noted for its “Evangelicals and Catholics Together” initiative, announced in its May 1994 issue, which went far toward providing a common roof for all cultural conservatives. St. Thomas is indeed much quoted and praised in the pages of First Things, but then it defines itself as a “Monthly Journal of Religion and Public Life.” One thing it is not is a Catholic magazine, much less an organ of creeping international Thomism.

Of course, there is no lack of prominent proponents of natural law on the national scene, many of whom are Thomists. The most prominent, no doubt, is Justice Antonin Scalia, who often makes himself unpopular with his Supreme Court colleagues by critiquing their more incoherent decisions from the bench. There is former presidential candidate Alan Keyes, a brilliant speaker who would have transformed the 1996 election campaign if he had been featured at the Republican convention. (Keyes, by the way, is a former student of Allan Bloom, who was in turn a student of the influential Leo Strauss. In Keyes’ mind, at least, Aquinas proved more persuasive.)

On the other hand, the ranks of Thomists do not include people such as Robert Bork, whose objections to judicial activism arise from a historically-based interpretation of the powers of the courts. “Theocon” might not be a bad term for describing many cultural conservatives. It might not even be a bad term for describing me. However, it is misleading to suggest that all or even most theocons are Thomists, or that opposition to the current state of constitutional law is a crank enthusiasm of religious sectarians, Catholic or otherwise. (For that matter, with all due respect to the Podhoretz and Kristol clans, neoconservatism is not a Jewish monopoly, even if you confine the term to subscribers of little magazines.)

Granted that Heilbrunn’s criticisms are misdirected, nevertheless it seems to me that all sides to this debate, neocons, theocons and the liberals who mock them, are overlooking some important things about it. What we are seeing now is a drama that has been played out more than once before in American history, when the chaos created by a radical episode was repaired a generation later by much the same people who caused the commotion in the first place. We have all heard that the 1990s are the 1960s turned upside down. In the neocon-theocon flap, perhaps we see an instance of 1960s style turned against the institutionalized vestiges of 1960s substance.

The short explanation for the radical tone of the First Things symposium is that the Supreme Court does bad work in important areas of the law and will not admit its mistakes. It does not help that in such ill-reasoned decisions as Planned Parenthood v. Casey, for instance, we find such language as, “If the Court’s legitimacy should be undermined, then so would the country be in its very ability to see itself through its constitutional ideals.” What nonsense. The country can see its constitutional ideals in the constitution. The court’s “legitimacy” (perhaps Justice O’Connor meant “credibility”?) stands or falls by the court’s competence, the lack of which has been the problem.

This explains the exasperation, but why does the exasperation take the form of a bunch of parsons and college professors making noises like students circa 1968 threatening to storm the math building? Partly it’s because the parsons and college professors came through the 1960s themselves, though they were for the most part too old to be students at the time. The style of some generations, as Strauss and Howe argue in the book cited at the beginning of this article, dominates cultural and political life for decades. The substance may change, but the manner is tenacious. Fr. Neuhaus, for instance, once famously marched into Henry Kissinger’s office with other prominent opponents of the Vietnam War and read him the Riot Act. The First Things symposium is not quite as dramatic, but the spirit is the same.

These remarks apply even more to neocons than they do to theocons. The neoconservatives became neoconservatives, after all, because they were appalled by the extremism of the language and behavior of the far left of 20 or 30 years ago. The theocons of today, or at least the ones at First Things, have few violent tendencies, but once again the neocons are put off by language that seems to suggest that questions of civil order are at issue.

The difference this time around is that the “radicals” have a better chance of winning. The radicals of the 1960s had no prospect of success. On the other side of the victory of, say, the Weathermen there was a world of re-education camps and political dictatorship that few Americans could imagine. Of course, the Kids of the 1960s have “won” in the sense of outliving their elders. One of them is actually in the White House as I write this. However, he got there by abandoning some of his youthful beliefs and dissimulating about the rest.

The task of today’s conservatives is the relatively modest proposition of repairing the damage many of them did themselves 20 or 30 years ago. On the other side of the victory of today’s cultural conservatives, there is a world sort of like the Eisenhower Administration but without racial discrimination. Many people might not like this outcome, but it is not hard to visualize and few people find it actually repulsive. Thus, we may be in for a larger than average historical irony. The very attitudes and rhetorical style that did so much to institutionalize the ‘60s in our law and popular culture may also be among the chief instruments by which that era is finally dismantled.

End

 

Copyright © 1997 by John J. Reilly

This article originally appeared in the February 1997 issue of Culture Wars magazine. An edited version was included in the book:

 

The End of Democracy?
"The Judicial Usurpation of Politics"

The Celebrated First Things Debate with Arguments Pro and Con and “The Anatomy of a Controversy” by Richard John Neuhaus

 

The publisher is The Spence Publishing Company (Dallas, Texas). Their telephone number is 1-888-SPENPUB.


The Long View 2005-05-05: British Elections; Multiple Horrors; Theodicy

I have a pending book review on Tim Powers' Earthquake Weather where I will claim that Powers wrote a theodicy.


British Elections; Multiple Horrors; Theodicy

 

By the time you read this, we will probably already know whether a Labour government has been returned in the UK, and more specifically, whether Tony Blair is still prime minister. This is the last of the Five Elections, which tell us whether at least the opening phase of the Terror War can be successfully concluded. The elections in question were those in Australia, Afghanistan, the US, Iraq, and the UK. If any of them seriously miscarry, then the jihadi calculation that they could win an asymmetrical war of attrition directed at the morale of their opponents would have been substantially vindicated. A Labour defeat later today, for instance, would almost certainly encourage proposals to treat with Al Qaeda, and to create secure areas for jihadis in Iraq. That now seems unlikely, but at this writing it could still happen.

* * *

Something even less likely to happen is that the British electorate will return a substantial number of members to Parliament from the Official Monster Raving Loony Party. Their platform has many pledges that demonstrate a commendable willingness to think outside the box:

We will issue a 99p coin to save on change.

We pledge to reduce class sizes by making the pupils sit closer to one another...

Anyone caught breaking the law will be made to mend it.

Immigration: everyone wanting to come and live in the UK will be made welcome, so long as they are over the age of 85 and accompanied by both parents.

All foxes will be issued with sheep’s clothing.

All food shall be clearly labelled 'Recommended for Oral Use'.

I know this party is an old joke, but it still works for me.

* * *

To some people, blogging might seem to be no more than an excuse to write at length to no point or purpose, and with a fine disregard for accuracy. However, recent research by a Dr. Perelman at M.I.T. suggests that this activity might be perfect practice for getting into a good college, since SAT Essay Test Rewards Length and Ignores Errors:

An essay on the Civil War, given a perfect six, describes the nation being changed forever by the "firing of two shots at Fort Sumter in late 1862." (Actually, it was in early 1861, and, according to "Battle Cry of Freedom" by James M. McPherson, it was "33 hours of bombardment by 4,000 shot and shells.")...SAT graders are told to read an essay just once and spend two to three minutes per essay, and Dr. Perelman is now adept at rapid-fire SAT grading. This reporter held up a sample essay far enough away so it could not be read, and he was still able to guess the correct grade by its bulk and shape. "That's a 4," he said. "It looks like a 4."

Has anyone considered that the least corrupting admission system might be based on honest graft?

* * *

Speaking of the rewards of writing filler prose, the current Onion demonstrates the key to writing thrillers:

Unspeakable Horror Happens in Area Town
'Oh God, No!' Say Onlookers

MURPHY, ID. Indescribable tragedy struck the quiet foothill town of Murphy Monday, leaving authorities and citizens dumbstruck by the nameless horror that descended on their community.

"Oh God," said Wilma Freas, standing at the edge of Main Street overlooking the lumberyard. "Those poor people!"

Added Freas: "And the children..."

Just go on like that for 80,000 words and you'll be fine. Spend as little time as possible describing the monster; it's going to look like a guy in a rubber suit anyway.

* * *

Real-life horrors, in contrast, must be described. When they involve injury to large numbers of innocent people, they raise questions of theodicy, of how a good God could permit suffering in the world. The Times of Malta recently discussed an updated version of one of the Scholastic arguments:

The tsunami was caused by an underwater earthquake. Science suggests, however, that if humans are to exist, earthquakes are a necessary evil. If the earth were a geologically dead planet with no tectonic activity such as earthquakes and volcanic eruptions, aeons of erosion would long ago have made it a dully flat and inhospitable place, in all likelihood completely covered with water and we certainly would not be around.

Earthquakes, therefore, appear to be a good example of what Aquinas meant when he said, in a passage cited by Mgr Gauci, that "God can never want moral evil, but, accidentally, he can want physical evil. This last because of some good tied up to that evil" - in this case, the existence of human beings.

What we are talking about here, of course, are the Anthropic Coincidences. Any universe that has sunlight in it is also going to have the possibility of hydrogen bombs. The interesting thing is that only a very narrow range of physical constants will produce any highly structured universe at all. In other words, any world that contains biology is also going to contain danger; they are both generated by the same numbers.

So why couldn't God make a universe in which this linkage did not apply? Because a perfect physical creation would be a contradiction: physical things are by definition vulnerable in some respects. Why could not God create a contradiction? Because contradictory things lack the power to exist.

John Leslie's argument is perhaps a little neater, and it does not leave you struggling with fine points of modal and extensional logic. He pointed out that any human generation that reproduces itself makes the same decision that God made when He created the human race: people know that some of their descendants are going to be miserable some of the time.

Tough, but fair.

* * *

Here's a coincidence for you: I was reading Ian McEwan's novel, Saturday, and came to a place where the protagonist wonders, amidst his many other meditations, whether it would be possible to judge Prime Minister Blair's veracity about WMDs in Iraq by studying the images of the Prime Minister's face in the multiple television screens in a shop window. The name Paul Ekman comes up, in connection with Ekman's research into facial expression and cognition. The coincidence is that I had never heard of this research, or of the name Ekman, until a few days ago, when I saw both mentioned in Malcolm Gladwell's book, Blink, a review of which I have here.

Both these books were given to me by the same people, but I doubt that face-research is something they were trying to draw my attention to. In this conjunction, I do not see a synchronous event. When the same fishy ideas turn up in randomly selected novels and non-fiction, however, that is always a bad sign: we could be in for a deluge of hype about face-reading.

Again, we can only ask: could a good God permit such things?

Copyright © 2005 by John J. Reilly


The Long View: Is Mathematics Constitutional?

A recent popular [well, as popular as a massive book full of equations can be] exposition of mathematical Platonism is Roger Penrose's The Road to Reality. It even has practice problems in it with devoted communities of amateurs trading tips on how to solve them. Mathematical Platonism, or something much like it, really is something like the default position of many mathematicians and physicists.

Since I ended up an engineer, perhaps it isn't really surprising that I always found the moderate realism of Aristotle and Aquinas more appealing. 

There is a good quote in this short essay that I've used to good effect:

"Because the whole point of science is to explain the universe without invoking the supernatural, the failure to explain rationally the 'unreasonable effectiveness of mathematics,' as the physicist Eugene Wigner once put it, is something of a scandal, an enormous gap in human understanding."
I, for one, was a little taken aback by the proposition that science had any "point" other than to describe the physical world as it actually is, but let that pass.

Philosophy of science is a field in fine shape, but many fans of science try to use it as a cudgel upon religious believers. Insofar as that attempt is mostly ignorant of both science and philosophy, it isn't particularly illuminating.


Is Mathematics Constitutional?

 

The New York Times remains our paper of record, even in matters of metaphysics. For proof, you need only consult the article by George Johnson that appeared in the Science Section on February 16, 1998, entitled: "Useful Invention or Absolute Truth: What Is Math?" The piece was occasioned by a flurry of recent books challenging mathematical Platonism. This is the belief, shared by most mathematicians and many physicists, that mathematical ideas are "discovered" rather than constructed by the mathematicians who articulate them. Consider the following sentence:

"Because the whole point of science is to explain the universe without invoking the supernatural, the failure to explain rationally the 'unreasonable effectiveness of mathematics,' as the physicist Eugene Wigner once put it, is something of a scandal, an enormous gap in human understanding."

I, for one, was a little taken aback by the proposition that science had any "point" other than to describe the physical world as it actually is, but let that pass. The immediate philosophical peril to the world of the Times is more narrow. That is, it is hard to be a thoroughgoing secular materialist if you have to acknowledge that there are aspects of reality that cannot be explained as either products of blind chance or of human invention. Supreme Court Justice Anthony Kennedy has even suggested that systems of ethics claiming an extra-human origin are per se unconstitutional. Judging by some of the arguments against mathematical Platonism presented by the Times piece, however, we may soon see Establishment Clause challenges to federal aid for mathematical education.

The best-known of the books that try to de-Platonize mathematics is "The Number Sense: How the Mind Creates Mathematics," by the cognitive scientist Stanislas Dehaene. His argument is that the rudiments of mathematics are hardwired into the human brain, and thus that mathematics is foundationally a product of neurology. The evidence is various. There are studies of accident victims suggesting there may be a specific area of the brain concerned with counting, as well as stimulus-response studies showing that some animals can be trained to distinguish small-number sequences. (Remember the rabbits in "Watership Down," who had the same name for all numbers from five to infinity?) Relying on even more subtle arguments is a recent article by George Lakoff and Rafael E. Núñez, "Mathematical Reasoning: Analogies, Metaphors and Images." [BE: the actual article is titled The Metaphorical Structure of Mathematics: Sketching Out Cognitive Foundations for a Mind-Based Mathematics] The authors suggest that numbers are simply extrapolated from the structure of the body and mathematical operations from movement. (The article is part of an upcoming book to be called "The Mathematical Body.")

I have not read these works, so it is entirely possible I am missing something. Still, it seems to me that there are two major problems with analyses of this sort. First, if the proposition is that mathematical entities are metaphysical universals that are reflected in the physical world, it is no argument against this proposition to point to specific physical instances of them. In other words, if numbers are everywhere, then it stands to reason that they would be inherent in the structure of the brain and body, too.

If Dr. Dehaene has really found a "math-box" in the head, has he found a fantasy-gland or an organ of perception? The Times article paraphrases him as saying that numbers are "artifacts of the way the brain parses the world...like colors. Red apples are not inherently red. They reflect light at wavelengths that the brain...interprets as red." The distinction between things that are "really red" and those that "just look like red" has always escaped me, even in languages with different verbs for adjectival predicates and the copula. Doesn't a perfectly objective spectral signature identify any red object? In order to avoid writing the Monty Python skit that arguments about perception usually become, let me just note here that the experience of qualia (such as "redness") has nothing to do with the cognitive understanding of number. Like the numbers distinguishing the wavelengths of colors, for instance.

There is a more basic objection to the physicalistic reductionism at work here, however. Consider what it would mean if it worked. Suppose that proofs were presented so compelling as to convince any honest person that mathematics was indeed nothing more than an extrapolation of the structure of the nervous system, or of the fingers on the hand, or of the spacing of heartbeats. We would then have a situation where we would have to explain the "unreasonable effectiveness" of the human neocortex, or even the universal explanatory power of the human anatomy. This would be anthropocentrism come home to roost. You could, I suppose, argue that we only imagine that the human neurological activity called mathematics lets us explain everything; the reality is that we only know about the things that our brains let us explain. Well, maybe, but then that suggests that there are other things that we don't know about because our brains are not hardwired to explain them. Maybe those are the things that are really red?

There are indeed problems with mathematical Platonism, the chief of which is that it is hard to see how the physical world could interact with the non-sensuous ideal forms. (John Barrow's delightful "Pi in the Sky" will take interested readers on a fair-minded tour of the philosophy and intellectual history of this perennial question.) The most workable solution is probably the "moderate Realism" of Aquinas. He held that, yes, there are universals, but that we can know about them only through the senses. This seems reasonable enough. In fact, this epistemological optimism is probably the reason science developed in the West in the first place. There may even be a place for Dr. Dehaene's math-box in all this, if its function is regarded as perceiving numbers rather than making them up. What there can be no place for is the bigotry of those who believe that science exists only to support certain metaphysical prejudices.

Copyright © 1998 by John J. Reilly


The Long View: Higher Superstition: The Academic Left and Its Quarrels with Science

The kind of thing that John Reilly laments in this book review is alive and well. If you want a taste of it, check out New Real Peer Review on Twitter, which simply reprints abstracts of actual, peer-reviewed articles. A favorite genre is the autoethnography. Go look for yourself, no summary can do it justice.

I will disagree with John about one thing: race and sex matter a lot for many medical treatments. For example, the drug marketed under the trade name Ambien, generically called zolpidem, has much worse side effects in women than in men, and it takes women longer to metabolize it.

This effect was memorably referred to as Ambien Walrus. I find this pretty funny, but I delight in the sufferings of others.

You can't ignore this stuff if you want to do medicine right. People ignore it for a variety of reasons, but you'll get a better result if you don't.


Higher Superstition: The Academic Left and Its Quarrels with Science
by Paul R. Gross and Norman Levitt
The Johns Hopkins University Press, 1994
314 pp, $25.95
ISBN 0-8018-4766-4

 

The Enemies You Deserve

 

If you are looking for an exposé of how political correctness in recent years has undermined medical research, corrupted the teaching of mathematics and generally blackened the name of science in America, this book will give you all the horror stories you might possibly want. There have been rather a lot of indictments of the academic left, of course, but this is one of the better ones. However, the book is most interesting for reasons other than those its co-authors intended. To use the same degree of bluntness that they use against the "science studies" being carried on by today's literary critics, what we have here is an expression of bewilderment by a pair of secular fundamentalists who find themselves faced with an intellectual crisis for which their philosophy provides no solution.

Paul Gross is a biologist, now at the University of Virginia but formerly director of the Woods Hole Marine Biological Laboratory, and Norman Levitt is professor of mathematics at Rutgers University in New Jersey. They repeatedly insist, no doubt truthfully, that they have no particular interest in politics and that they are not programmatic conservatives. What does worry them is the increasing number of faculty colleagues in the liberal arts who take it as an article of faith that the results of empirical scientific research are biased in favor of patriarchy or capitalism or white people. The people who have this sort of opinion they call "the academic left," a catchall category that includes deconstructionists, feminists, multiculturalists and radical environmentalists.

The authors have a good ear for invective, such as this happy formula: "...academic left refers to a stratum of the residual intelligentsia surviving the recession of its demotic base." There has always been something rather futile about the radicalization of the academy, and in some ways the movement is already in retreat. The ideas of the academic left are based in large part on Marxist notions that were originally designed for purposes of revolutionary agitation. Revolutionary socialist politics has not proven to have the popular appeal one might have hoped, however. Marxism has therefore been largely replaced among intellectuals by that protean phenomenon, postmodernism. Although postmodernism incorporates large helpings of Freudianism and the more credulous kind of cultural anthropology, it remains a fundamentally "left" phenomenon, in the sense of maintaining an implacable hostility to market economics and traditional social structures. However, postmodernists have perforce lowered their goal from storming the Winter Palace to inculcating the "hermeneutics of suspicion" in undergraduates. The results of these efforts were sufficiently annoying to incite Gross and Levitt to write this book.

Postmodernists presume that reality is inaccessible, or at least incommunicable, because of the inherent unreliability of language. Science to postmodernists is only one of any number of possible "discourses," no one of which is fundamentally truer than any other. This is because there are no foundations to thought, which is everywhere radically determined by the interests and history of the thinker. Those who claim to establish truth by experiment are either lying or self-deluded. The slogan "scientific truth is a matter of social authority" has become dogma to many academic interest groups, who have been exerting themselves to substitute their authority for that of the practicing scientists.

The French philosophical school known as deconstructionism provided the first taste of postmodern skepticism in the American academy during the 1970s. It still provides much of its vocabulary. However, self-described deconstructionists are getting rare. Paul de Man and Martin Heidegger, two of the school's progenitors, were shown in recent years to have been fascists without qualification at certain points in their careers, thus tainting the whole school. On the other hand, while deconstruction has perhaps seen better days, feminism is as strong as ever. Thus, undergraduates in women's studies courses are routinely introduced to the notion that, for instance, Newton's "Principia" is a rape manual. Even odder is the movement to create a feminist mathematics. The authors discuss at length an article along these lines entitled "Towards a Feminist Algebra." The authors of that piece don't seem much concerned with algebra per se; what exercises them is the use of sexist word problems in algebra texts, particularly those that seem to promote heterosexuality. The single greatest practical damage done by feminists so far, however, is in medical research, where human test groups for new treatments must now often be "inclusive" of men and women (and also of certain racial minorities). To get statistically significant results for a test group, you can't just mirror the population in the sample; you need a sample above a mathematically determined size for each group that interests you. In reality, experience has shown that race and gender rarely make a difference in tests of new medical treatments, but politically correct regulations threaten to increase the size of medical studies by a factor of five or ten.
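To make the sizing arithmetic concrete, here is a minimal sketch, not from the book, using the textbook normal-approximation formula for a two-arm comparison; the function name `n_per_arm` is my own, and scipy is assumed to be available:

```python
# A sketch of why per-group significance requirements multiply study size.
# Standard normal-approximation formula for a two-arm comparison:
#   n per arm ~= 2 * (z_{1-alpha/2} + z_{1-beta})^2 / d^2
# where d is the standardized effect size (Cohen's d).
from scipy.stats import norm

def n_per_arm(d, alpha=0.05, power=0.80):
    z_alpha = norm.ppf(1 - alpha / 2)  # critical value for a two-sided test
    z_beta = norm.ppf(power)           # quantile for the desired power
    return 2 * (z_alpha + z_beta) ** 2 / d ** 2

# One pooled comparison, modest effect size:
print(round(n_per_arm(0.3)))  # about 175 subjects per arm

# Powering the same comparison separately in each of five demographic
# subgroups repeats that requirement five times over:
print(5 * 2 * round(n_per_arm(0.3)))  # roughly 1,750 subjects in total
```

The multiplication, not the formula itself, is the point: each subgroup that must be independently powered adds a full study's worth of subjects.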

Environmentalism has become a species of apocalyptic for people on the academic left. It is not really clear what environmentalism is doing in the postmodern stew at all, since environmentalists tend to look on nature as the source of the kind of fundamental values which postmodernism says do not exist. The answer, perhaps, is that the vision of ecological catastrophe provides a way for the mighty to be cast down from their thrones in a historical situation where social revolution appears to be vastly improbable. Environmentalists seem to be actually disappointed if some preliminary research results suggesting an environmental danger turn out to be wrong. This happens often enough, notably in cancer research, where suspected carcinogens routinely turn out to be innocuous. However, on the environmental circuit, good news is unreportable. The current world is damned, the environmentalists claim, and nothing but the overthrow of capitalism, or patriarchy, or humanism (meaning in this case the invidious bias in favor of humans over other animals) can bring relief. Only catastrophe can bring about this overthrow, and environmentalists who are not scientists look for it eagerly.

The basic notion behind the postmodern treatment of science is social constructivism, the notion that our knowledge of the world is just as much a social product as our music or our myths, and is similarly open to criticism. The authors have no problem with the fact that cultural conditions can affect what kind of questions scientists will seek to address or what kind of explanation will seem plausible to a researcher. What they object to is the "strong form" of social constructivism, which holds that our knowledge is simply a representation of nature. The "truth" of this representation cannot be ascertained by reference to the natural world, since any experimental result will also be a representation. Constructivists therefore say that we can understand the elements of a scientific theory only by reference to the social condition and personal histories of the scientists involved. This, as the authors correctly note, is batty.

The lengths to which the principle of constructivism has been extended are nearly unbelievable. Take AIDS, for instance, which has itself almost become a postmodernist subspecialty. The tone in the postmodernist literature dealing with the disease echoes the dictum of AIDS activist Larry Kramer: "...I think a good case can also be made that the AIDS pandemic is the fault of the heterosexual white majority." Some people, particularly in black studies departments, take "constructed" quite literally, in the sense that the AIDS virus was created in a laboratory as an instrument of genocide. Kramer's notion is more modest: he suggests that the extreme homosexual promiscuity which did so much to spread the disease in the New York and San Francisco of the late 1960s and early 1970s was forced upon the gay community by its ghettoization. This is an odd argument, but not so odd as the assumption that you can talk about the origins of an epidemic without discussing the infectious agent that causes it. The upshot is that AIDS is considered to be a product of "semiological discourse," a system of social conventions. It can be defeated, not through standard medical research, but through the creation of a new language, one that does not stigmatize certain groups and behaviors. (Dr. Peter Duesberg's purely behavioral explanation of AIDS, though it has the attractions of scientific heresy, gets only a cautious reception because of its implied criticism of homosexual sex.) The postmodern academy actually seems to have a certain investment in a cure for AIDS not being found, since the apparent helplessness of science in this area is taken as a license to give equal authority to "other ways of knowing" and other ways of healing, particularly of the New Age variety.

The postmodernist critics of science usually ply their trade by studiously ignoring what scientists themselves actually think about. The anthropologist Bruno Latour, for instance, has made a name for himself by subjecting scientists to the kind of observation usually reserved for members of primitive tribes. Once he was commissioned by the French government to do a post-mortem on their Aramis project. This was to be a radically new, computerized subway system in which small trams would travel on a vastly complicated track-and-switch system along routes improvised for the passengers of each car. The idea was that passengers would type their proposed destination into a computer terminal when they entered a subway station. They would then be assigned a car with other people going to compatible destinations. The project turned into a ten-year boondoggle and was eventually cancelled. The French government hired Latour to find out what went wrong. Now, the basic conceptual problem with the system is obvious: the French engineers had to come up with a way to handle the "traveling salesman" problem, the classic problem of finding the shortest way to connect a set of points. This seemingly simple question has no neat solution, and the search for approximate answers keeps the designers of telephone switching systems and railroad traffic managers awake nights. Latour did not even mention it. He did, however, do a subtle semiological analysis of the aesthetic design of the tram cars.
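The traveling-salesman point can be made concrete. The following sketch, mine rather than Latour's or the authors', shows the gap between the exact answer, which requires checking factorially many tours, and the cheap nearest-neighbor heuristic that real systems fall back on:

```python
# Exact vs. approximate solutions to a tiny traveling-salesman instance.
import math
from itertools import permutations

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def tour_length(tour):
    return sum(dist(tour[i], tour[i + 1]) for i in range(len(tour) - 1))

def nearest_neighbor(points):
    # Greedy approximation: always visit the closest unvisited point.
    unvisited = list(points[1:])
    tour = [points[0]]
    while unvisited:
        nxt = min(unvisited, key=lambda p: dist(tour[-1], p))
        unvisited.remove(nxt)
        tour.append(nxt)
    return tour

points = [(0, 0), (3, 1), (1, 4), (5, 2), (2, 2), (4, 5)]

# Brute force is feasible only for tiny inputs, since the number of
# candidate tours grows factorially with the number of points.
best = min(permutations(points), key=tour_length)
print(tour_length(best), tour_length(nearest_neighbor(points)))
```

Six points can be brute-forced in 720 comparisons; a few dozen cannot be brute-forced at all, which is why engineers lose sleep over approximations.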

Postmodernists regard themselves as omniscient and omnicompetent, fully qualified to put any intellectual discipline in the world in its place. They have this confidence because of the mistaken belief that science has refuted itself, thus leaving the field clear for other ways of understanding the world. They love chaos theory, for instance, having absorbed the hazy notion that it makes the universe unpredictable. Chaos theory in fact is simply a partial solution to the problem of describing turbulence. Indeed, chaos theory is something of a victory for mathematical platonism, since it shows that some very exotic mathematical objects have great descriptive power. The implications of chaos theory are rather the opposite of chaos in the popular sense, but this idea shows little sign of penetrating the nation's literature departments. The same goes for features of quantum mechanics, notably the uncertainty principle. Quantum mechanics actually makes the world a far more manageable place. Among other things, it is the basis of electronics. To read the postmodernists, however, you would think that it makes physicists flutter about their laboratories in an agony of ontological confusion because quantum theory phrases the answers to some questions probabilistically.
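As a concrete illustration of that point (mine, not the book's), the logistic map is the standard toy example: a one-line deterministic rule that is unpredictable in practice only because nearby starting points diverge.

```python
# The logistic map x -> r*x*(1-x) is fully deterministic, yet at r = 4
# it is chaotic: two almost identical starting values diverge quickly.
def iterate(x, r=4.0, steps=25):
    for _ in range(steps):
        x = r * x * (1 - x)
    return x

print(iterate(0.2000000))  # same input always yields the same output
print(iterate(0.2000001))  # a 1e-7 nudge yields a very different output
```

Nothing here is lawless; the "exotic mathematical object" is precisely the structure that describes how the divergence unfolds.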

On a more esoteric level, we have the strange cult of Kurt Goedel's incompleteness theorem, first propounded in the 1930s. Now Goedel's Theorem is one of the great treasures of 20th century mathematics. There are several ways to put it, one of which is that logical systems beyond a certain level of complexity can generate correctly expressed statements whose truth value cannot be determined. Some versions of the "Liar Paradox" illustrate this quality of undecidability. It is easy to get the point slightly wrong. (Even the authors' statement of it is a tad misleading. According to them, the theorem "says that no finite system of axioms can completely characterize even a seemingly 'natural' mathematical object..." It should be made clear that some logical systems, notably Euclid's geometry, are quite complete, so that every properly expressed Euclidean statement can be proved either true or false.) Simply false, however, is the postmodernist conviction that Goedel's Theorem proved that all language is fundamentally self-contradictory and inconsistent. Postmodernists find the idea attractive, however, because they believe that it frees them from the chains of logic, and undermines the claims of scientists to have reached conclusions dictated by logic.
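Since the precise statement is what the cult gets wrong, one standard modern formulation runs as follows (my paraphrase, not a quotation from the book; Q here is Robinson arithmetic):

```latex
\text{If } T \text{ is a consistent, effectively axiomatized theory that interprets } Q,
\text{ then there is a sentence } G_T \text{ such that}
\quad T \nvdash G_T \quad \text{and} \quad T \nvdash \lnot G_T.
```

Note how much the hypotheses do: the theorem says nothing about language in general, only about formal systems strong enough to encode arithmetic.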

Postmodernism, say the authors, is the deliberate negation of the Enlightenment project, which they hold to be the construction of a sound body of knowledge about the world. The academic left generally believes that the reality of the Enlightenment has been the construction of a thought-world designed to oppress women and people of color in the interests of white patriarchal capitalism. Or possibly capitalist patriarchy. Anyhow, fashion has it that the Enlightenment was a bad idea. Now that modernity is about to end, say the postmodernists, the idea is being refuted on every hand. Actually, it seems to many people of various ideological persuasions that the end of modernity is indeed probably not too far off: no era lasts forever, after all. However, it is also reasonably clear that postmodernism is not on the far side of the modern era. Postmodernism is simply late modernity. Whatever follows modernity is very unlikely to have much to do with the sentiments of today's academic left.

Granted that the radical academy does not have much of a future, still the authors cannot find a really satisfying explanation for why the natural sciences have been subject to special reprobation and outrage in recent years. In the charmingly titled penultimate chapter, "Why Do the People Imagine a Vain Thing?", they run through the obvious explanations. It does not take much imagination to see that today's academic leftist is often a refugee from the 1960s. Political correctness is in large part the whimsical antinomianism of the Counterculture translated into humorless middle age. Then, of course, there is the revenge factor. In the heyday of Logical Positivism from the end of World War II to the middle 1960s, physical scientists tended to look down on the liberal arts. In the eyes of that austere philosophy, any statement which was not based either on observation or induction was literally "nonsense," a category that therefore covered every non-science department from theology to accounting. The patronizing attitude of scientists was not made more bearable by the unquestioning generosity of the subsidies provided by government to science in those years. The resentment caused by this state of affairs still rankled when the current crop of academic leftists were graduate and undergraduate students. Now they see the chance to cut science down to size.

While there is something to this assessment, the fact is that the academic left has a point. Logical Positivism and postmodernism are both essentially forms of linguistic skepticism. Both alike are products of the rejection of metaphysics, the key theme in Western philosophy since Kant. The hope of the logical positivist philosophers of the 1920s and 30s was to save just enough of the machinery of abstract thought so that scientists could work. Science is not skeptical in the sense that Nietzsche was skeptical, or the later Sophists. It takes quite a lot of faith in the world and the power of the mind to do science. And in fact, the authors note that Logical Positivism, with a little help from the philosophy of Karl Popper, remains the philosophical framework of working scientists to this day. The problem, however, is that Logical Positivism leaves science as a little bubble of coherence in a sea of "nonsense," of thoughts and ideas that cannot be directly related to measurable physical events.

Logical Positivism has many inherent problems as a philosophy (the chief of which is that its propositions cannot themselves be derived from sense experience), but one ability that even its strongest adherents cannot claim for it is the capacity to answer a consistent skepticism. In their defense of science, the authors are reduced to pounding the table (or, after the fashion of Dr. Johnson's refutation of Berkeley's Idealist philosophy, kicking the stone). Thus, it is a "brutal" fact that science makes reliable predictions about physical events, that antibiotics cure infections while New Age crystals will not, that the advisability of nuclear power is a question of engineering and not of moral rectitude. Well, sure. But why? "Because" is not an answer. Without some way to relate the reliability of science to the rest of reality, the scientific community will be living in an acid bath of skepticism and superstition.

The authors tell us that the scientific methodology of the 17th century "almost unwittingly set aside the metaphysical assumptions of a dozen centuries...[that] Newton or Leibnitz sought...to affirm some version of this divine order...is almost beside the point...Open-endedness is the vital principle at stake here...Unless we are unlucky, this will always be the case." In reality, of course, it surpasses the wit of any thinker to set aside the metaphysical assumptions of a dozen centuries, or even entirely of his own generation. The scientists of the early Enlightenment did indeed scrap a great deal of Aristotle's physics. Metaphysically, however, they were fundamentally conservative: they settled on one strand of the philosophical heritage of the West and resisted discussing the matter further.

As Alfred Whitehead realized early in this century, science is based on a stripped-down version of scholasticism, the kind that says (a) truth can be reached using reason but (b) only through reasoning about experience provided by the senses. This should not be surprising. Cultures have their insistences. Analogous ideas keep popping up in different forms throughout a civilization's history. When the Senate debates funding for parochial schools, it is carrying on the traditional conflict between church and state that has run through Western history since the Investiture Controversy in medieval Germany. In the same way, certain assumptions about the knowability and rationality of the world run right through Western history. The Enlightenment was not unique in remembering these things. Its uniqueness lay in what it was willing to forget.

It would be folly to dismiss so great a pulse of human history as the Enlightenment with a single characterization, either for good or ill. Everything good and everything bad that we know about either appeared in that wave or was transformed by it. Its force is not yet wholly spent. However, one important thing about the Enlightenment is that it has always been a movement of critique. It is an opposition to the powers that be, whether the crown, or the ancient intellectual authorities, or God. The authors of "Higher Superstition" tell us that the academic left hopes to overthrow the Enlightenment, while the authors cast themselves as the Enlightenment's defenders. The authors are correct in seeing the academic left as silly people, who do not know what they are about. The authors are mistaken too, however. The fact is that the academic left are as truly the children of the Enlightenment as ever the scientists are. Science was once an indispensable ally in the leveling of ancient hierarchies of thought and society, but today it presents itself to postmodern academics simply as the only target left standing. Is it any wonder that these heirs of the Enlightenment should hope to bring down the last Bastille?

This article originally appeared in the November 1995 issue of Culture Wars magazine.

Copyright © 1996 by John J. Reilly

Why post old articles?

Who was John J. Reilly?

All of John's posts here

An archive of John's site

Organelle Transplant

Yesterday on Twitter, Razib Khan remarked that he hadn't realized pro-life Christians relate genetics to souls.

Since I wasn't party to the conversation, I have no idea what was said. I have heard things like this, however, and it made me go hmmm....

I decided to respond in a blog post, since Twitter sucks for anything moderately complicated.

The bigger context for this is a proposal to treat mitochondrial disease that was approved by the Human Fertilisation and Embryology Authority in the United Kingdom. In what seems like an attempt to annoy the maximum number of people possible, this procedure is usually described as a 'three-parent baby'. While there is a germ of truth in this description, you could also call it an organelle transplant, since the intent is to replace defective mitochondria with working ones.

The germ of truth is this: the replacement mitochondria should breed true, because the technique referenced in the article, pronuclear transfer, removes the male and female pronuclei from one fertilized egg [the one with defective mitochondria in the cytoplasm] and moves them to another fertilized egg [whose pronuclei have been removed] with different mitochondria. These new mitochondria are in fact from a third person, and are genetically distinct from the other woman's.

I use organ transplant as a reference point, because a donated organ also contains DNA different from the recipient's. The key difference here is that your donor liver's DNA cannot be passed down to future descendants.

So why does anyone care? People care because 1) pro-life Christians are generally essentialists, meaning that essences or forms [in the Platonic or Aristotelian sense] define what things are, and 2) popular science accounts of genes or DNA usually describe these things as our 'essence' [in the loose popular sense of the word]. Thus our genes probably seem real important to some folks, and tampering with them is tantamount to playing God. I think this is a misunderstanding, albeit a predictable one.

I don't think the fact of getting DNA from a different source matters at all in its own right. One reason is much the same one Razib talks about in his tweet:

Some of our genes are indeed from viruses and stuff. There is a theory that mitochondria were once separate organisms that became symbionts. A lot of genes are common to all life on Earth. Strictly considered, a gene is just a way of encoding information about proteins. Any gene that works in some fashion is a real gene, although some clearly work better than others.
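To make the "encoding" claim literal, here is a toy sketch of my own, with a deliberately truncated codon table, of how a DNA sequence specifies a chain of amino acids:

```python
# Toy illustration of a gene as an encoding: DNA is read three bases
# (one codon) at a time, and each codon names an amino acid.
# Only a handful of the 64 codons are included here.
CODON_TABLE = {
    "ATG": "Met",   # methionine; also the usual start codon
    "GGC": "Gly",   # glycine
    "AAA": "Lys",   # lysine
    "TGG": "Trp",   # tryptophan
    "TAA": "STOP",  # one of the three stop codons
}

def translate(dna):
    amino_acids = []
    for i in range(0, len(dna) - 2, 3):
        aa = CODON_TABLE.get(dna[i:i + 3], "?")
        if aa == "STOP":
            break
        amino_acids.append(aa)
    return "-".join(amino_acids)

print(translate("ATGGGCAAATGGTAA"))  # Met-Gly-Lys-Trp
```

On this view, where the sequence physically came from, whether parent, virus, or donor egg, is irrelevant to what it encodes.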

The second reason is that I think a lot of pro-life Christians have made a philosophical mistake in conflating the terms we use to talk about people. John Reilly said it, and I just stole it:

A human is an essence (if you don't believe in essences you don't believe in human beings); a homo sapiens is a kind of monkey; and a person is a phenomenon. Perhaps I read too much science fiction, but it is not at all clear to me that every human must necessarily be a homo sapiens. As for "person," the primary meaning, an entity, conscious or otherwise, that you can regard as a "thou," is conflated with the notion of person as an entity able to respond in law, either directly or through an agent.

Human ≠ homo sapiens. It just ain't. Popular science accounts are correct insofar as homo sapiens is a biological concept that can be usefully defined using genes. Human is a philosophical concept, moreover one that depends on a specific context to really be cogent. I think that at the very least Neanderthals were humans too, and possibly other hominins. Hell, if we were consistent, pygmies might be considered a separate species from homo sapiens, because they split off from other humans 300,000 years ago, which is before the currently defined date of the origin of anatomically modern humans.

I have my doubts about the current theories, but that doesn't matter. Human is a status that is in principle independent of lineage. In practice it isn't, but that is different from saying the two are identical.

Now, what about this mitochondrial replacement therapy? I'm still opposed. The reason has nothing to do with genes. In my philosophical tradition, there are three criteria an act must meet to be considered good:

  1. Right act
  2. Right end
  3. Right circumstances

The techniques in the Wikipedia article all involve IVF, which means creating embryos using harvested eggs and sperm, a process with a pretty horrible success rate [10-20%]. That in itself isn't damning, but the way in which unused embryos are discarded [that essentialism again] and the way in which sperm and eggs are collected are objectionable in their own right. Only criterion 2) is met: preventing disease is a very good thing, especially if you can help reduce future occurrences. Anyone who doesn't share my premises about human embryos [if you don't believe in essences, you don't believe in humans] will likely not agree with my objections to IVF, although I do note that even people who are in theory in favor of it tend to find it icky and horrible when they see it.

The Long View: Dante's World Government

This is an absolutely beautiful exposition of the idea that a universal state is the best for the flourishing of man. I'll let the words speak for themselves.


Dante's World Government:
De Monarchia in the 21st Century

 

By John J. Reilly

“In writing the introduction to a work of political philosophy there is a temptation to attribute more importance to the work in question than it can properly claim. With Dante's Monarchy this temptation scarcely arises; for many have dismissed the treatise as a dream, the vision of an idealist out of touch with political realities who was yearning for an Empire that had passed away.”

So wrote Donald Nicholl in his introduction to the English translation (Noonday Press, 1954) that I used for this essay. There is a sense in which his assessment remains true 49 years later. It has been a long time since many people had much enthusiasm for the Holy Roman Empire, which was the particular instance of universal polity that Dante was defending. The paucity of translations of De Monarchia into English might also be taken as evidence of lasting irrelevance. (The Latin original is, oddly enough, available online, at no charge.) Some things have changed in the past half-century, however. The prospect of new forms of transnational governance is often discussed these days, either as a promise or a threat. Moreover, the dream-like abstraction of Dante's arguments may allow for modern re-interpretation in a way that would not be possible to a more concrete and historically grounded analysis. It is very unlikely that De Monarchia will someday be hailed as a guide to restructuring the international system. Nonetheless, in intellectual history, there are some issues that never really go away. In this book, Dante gives us an early formulation of some perennial ideas.

Even the most Platonic political theory has some history behind it, of course. Dante Alighieri (1265-1321) was born into Florence's Guelph party, which was the faction that generally supported the papacy against the Holy Roman Empire. (The imperial party was the Ghibellines.) Briefly a member of Florence's governing council, he was exiled in 1301, when the Guelph faction that was backed by France took control of the city. The French were there because Charles of Valois had entered Italy at the pope's invitation to restore order to the peninsula. The next year, Pope Boniface VIII issued the famous bull, Unam Sanctam, which advanced the broadest claims to the supremacy of the church over temporal authority, particularly over the empire. De Monarchia may be considered an answer to those claims; or maybe better, their dialectical opposite.

The date of De Monarchia's composition is disputed, though it was probably finished in the second decade of the 14th century. Its arguments in favor of the autonomy of the empire are not greatly different from the political theory of the Convivio, which Dante abandoned unfinished about 1308, and The Divine Comedy, which he completed shortly before his death. It probably was not finished before the arrival in Italy of the new emperor, Henry VII, in 1310. He, too, came to restore order, this time with the blessing of Clement V, the French pope who initiated the removal of the papacy to Avignon that would last until 1377. These events turned Dante into what he described as a “party of one.”

De Monarchia asks three questions: Is the secular monarchy necessary? Did the Roman people receive the monarchy by right? Does the monarch receive his authority directly from God, or through the intermediation of some minister of God? These terms require a little explanation. By “monarchy,” Dante does not mean simply the rule of a single individual, though his argument does tend toward the Aristotelian proposition that legitimate monarchy is the most perfect form of government (in contrast to tyranny, which is monarchy's opposite and the worst form). The later Roman Republic was the “monarch” of the ancient world, in Dante's terminology. De Monarchia is really about the structure of the international system. As for the “Roman” element, Dante does not distinguish between the Republic and the Empire, or between ancient Rome and the medieval empire.

So, then, to take Dante's first question: Is the secular monarchy necessary?

Remarkably, Dante derives the necessity of monarchy from an argument that is almost Hegelian. Universal government is necessary, because it is the way to universal peace; universal peace is necessary, because it is the only way the human race can attain its end, or purpose; this end is actualization of the “possible intellect,” which is possessed by the human species as a whole.

The possible intellect got Dante into a lot of posthumous trouble; it was one of the reasons De Monarchia stayed on the Index of Forbidden Books from the 16th to the 19th centuries. The notion comes from the 12th-century Iberian Islamic philosopher, Averroes (Abu al-Walid Muhammad ibn Ahmad ibn Muhammad ibn Rushd), who deployed it in a way that argued against personal immortality in favor of a collective human soul. Dante himself thought no such thing, of course. His version rests on the scholastic commonplace that human beings are only partly intellectual beings (unlike angels, whose substance is intellect). Because of this defect, no single human being, however intelligent, could fully embody the intellectual capacity common to the species. That could be done only collectively and, since knowledge is cumulative, historically. The human species, if it is to achieve the state of intellectual perfection possible to it, requires a peaceful and therefore unified world.

Since the 19th century, we have been more inclined to expect the advancement of intellect to come from competition than from harmonious peace. To that, perhaps, a medieval would have argued that even a market of ideas requires rules to keep the market functioning. Certainly a dynamic world is not quite contrary to the medieval ideal of the tranquility of order.

Be that as it may, Dante insists that the ideal political order is a universal polity. The good inherent in the whole, he explains, exceeds the good inherent in the parts, though these parts may have an internal constitution that resembles the order of the whole. Thus, only a polity that encompasses the whole human species could really be perfect.

The universality of the universal monarch would not be expressed by promulgating the positive law for every district. Rather, the universal law would be a common law, which deals only with those things all men have in common. Neither would it mean that the several nations could not have their own princes and other magistrates. However, those rulers could rule justly only by virtue of their relationship to the universal monarch.

This is essentially the same argument that Julius Evola made in connection with his critique of 19th century imperialism. An empire in competition with other empires for national glory was mere violence, in his estimation. The distinction between “the empire” and “an empire” is also fundamental to the analysis of the postmodern world in Antonio Negri and Michael Hardt's book, Empire. They point out that the global system of governance has a moral basis that was lacking in the competitive empires of early modernity. The empires were imperialistic; though they might sometimes benefit their subjects, they were founded on ambition and greed. The “empire” of the late modern international order, in contrast, though it may cause endless disaster, is founded on the principle of eternal justice. The former were imperialistic; the latter is imperial.

All things being equal, the universal law would better be made by one agent, rather than by several, according to Dante. Human concord can be attained only by a concord of wills, which needs a human director. One may note that this reasoning would work almost as well as an argument to move beyond a law of nations enforced by nations to a world system with a genuine executive, if not necessarily a “monarch” in the conventional sense.

Dante, who spent the last two decades of his life in exile because of the chaos among the petty states of Italy, saw nothing odd in also asserting that the empire is necessary for human freedom. Freedom is the perfect condition of man, the state he was designed for. However, man is free only when his judgment may operate undeflected by the appetite. The monarch could create the institutional basis for a society in which the most people would be able to approach this condition. This is because only the monarch could himself be entirely free; having the greatest honor in the world, there would be nothing further for him to desire. Thus, being wholly disinterested, his reign would have no object other than the common good.

This reasoning might seem non-obvious to moderns, who are quick to point out that the road to Hell is paved with good intentions. Neither would there be general assent today to the proposition that satisfying all a man's desires would necessarily make him a good person. On the other hand, Dante's reasoning does bear a family resemblance to Francis Fukuyama's hypothesis that liberal democracy is the end of history because it satisfies all aspects of human nature. Moreover, there have been several recent arguments to the effect that something very like Dante's empire is necessary to human freedom, or at least to the highest level of human freedom that is possible in much of the world. So said Niall Ferguson, in yet another book named Empire, with respect to the British tradition. On a somewhat higher level of abstraction, that is also what Patrick Kennon says in Tribe and Empire.

The modern apologists for empire use reasoning that is not as different from Dante's as might at first appear. They say that the empire is the institution best suited to mitigate ethnic strife, because the empire is transnational and, like the monarch, disinterested. Further, Dante says that only the perfectly free monarch can impart a measure of freedom to the wider world because only he possesses this quality himself; similarly, only a liberal democratic empire could impart liberal democracy to societies that lack it.

Before proceeding to Dante's second question, this might be a good point to examine Dante's method. Readers will have gathered that, in fine scholastic style, he favors arguments “in the alternative.” Indeed, in this summary I have taken some liberties by integrating arguments that Dante leaves side by side. The internal logic of each argument is formal and partisan; unlike Thomas Aquinas, Dante does not trouble to state possible counterarguments systematically. These two paragraphs are typical of the whole:

“On the basis of this exposition we reason as follows: justice is most powerful in the world when located in a subject with a perfect will and most power; such is the Monarch alone; therefore justice is at its most potent in this world when located in the Monarch alone.

“This preparatory syllogism is of the second figure, with intrinsic negation, and takes the following form: all B is A; only C is A; therefore C only is B. That is: all B is A; nothing except C is B. The first proposition clearly holds, for reasons already given; the other follows by reference to the will and then to power.”
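In modern quantifier notation, the syllogism Dante describes comes out like this (my rendering, not the translator's), with A = "subject with a perfect will and most power," B = "place where justice is most powerful," and C = "the Monarch":

```latex
\forall x\,\bigl(B(x) \to A(x)\bigr),\qquad
\forall x\,\bigl(A(x) \to C(x)\bigr)
\;\;\vdash\;\;
\forall x\,\bigl(B(x) \to C(x)\bigr)
```

"Only C is A" is just the converse-sounding way of writing A → C, so the inference is a perfectly valid hypothetical syllogism; the weight of the argument rests entirely on the premises.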

This procedure tries to reach conclusions about the world by arguing from first principles. In effect, Dante formulates archetypes and then hunts for their incarnations. This type of metaphysical reasoning has fallen out of fashion, particularly in the social sciences; but it, too, is always with us. Modern physics is littered with examples of mathematical objects that had first been formulated as merely speculative exercises, but which later turned out to describe things in the real world. This is not so different from what Dante is doing: sifting through the products of history to find incarnations of the ideal forms.

This brings us to the second question: did the Roman people receive the monarchy by right?

Dante tells us that the history of the rise of the Roman Empire had seemed an inexplicable wonder to him. Then he realized that the Roman people did not acquire the monarchy of the world by ferocity, but through right, guided by providence. The progress of the Roman people was at many points attended by miracles, like the history of the Hebrews. Thus we see that God approved of the empire; Christ Himself chose to be born in the “fullness of time,” the peaceful age of Caesar Augustus.

Indeed, Christianity requires that the Roman Empire be legitimate. The central doctrine of Christianity is that Christ was punished for the sin of Adam. If the magistrate who sentenced Jesus was not an “appropriate judge,” then the suffering of Jesus was not a punishment, and we are not saved. Only the representative of the government of the whole world could have had the authority to inflict punishment on He Who suffered for the whole world.

Providence is not always expressed through the clearly miraculous. Sometimes God's hidden judgments are revealed by the outcome of duels, which in effect was what happened when the Romans defeated all others in the contest for world empire. The empire expressed the natural hierarchy among the peoples, of whom the Romans were the noblest. Even regarded simply as a matter of natural right, the citizens of the Roman Republic were working for the public good by creating a structure of universal peace. Nations, like individuals, should resort to force only as a last resort. However, whatever is acquired in a duel is acquired by right.

In the modern era, the idea that the historical process gradually expresses natural right is not rare: we see it from Hegel to Francis Fukuyama to Robert Wright. This is the intuition behind the dedication of transnationalists to the evolution of the network of supranational institutions and non-governmental organizations, which for them is now the seat of legitimacy in the world. Arguments even closer to Dante's have been made by macrohistorians who predict that the modern era will end in a universal state very like the Roman Empire. In any case, though the actors differ from theory to theory, the fundamentally providential structure of history remains.

Something that does change, of course, is the relationship of this providence to religion. One of the few specifics in which Hardt & Negri's empire differs from Dante's is that theirs is equated with the Kingdom of God. Possibly this was a mere rhetorical flourish on their part; they are also keen on the idea that the empire excludes the transcendent. Dante, in contrast, did insist on a transcendent foundation for the empire, but he strongly distinguished the empire from the Church, which is part of the Kingdom of God. This is the burden of his answer to the third question:

Does the monarch receive his authority directly from God, or through the intermediation of some minister of God?

In a rare display of tact, Dante said that those popes who asserted the empire owed its existence to the papacy were merely misguided by zeal. However, he says that the kings and princes who follow the popes' lead in this matter are not sons of the Church, but sons of the devil. He dismisses the claims of the class of ecclesiastical lawyers called the decretalists, because it is irrational to claim authority for the Church from its own legal rulings, when it is precisely the authority to make those rulings that is in question.

Much of the discussion about the relationship between Church and Empire is taken up with distinguishing the implications of a metaphor: the Church is the sun and the Empire is the moon. Dante accepts this then-common equation for the sake of argument. Just because the sun provides the moon with its light, he points out, that does not mean the existence or the operations of the moon are derived from the sun. Both sun and moon were created directly by God. The light the moon receives is more properly likened to divine grace, which makes everything appear different. In no way, however, is this illumination analogous to a grant of authority.

Dante assures us that God is the lord of all things, spiritual and temporal, and that the pope is His vicar. However, it does not follow from this that the pope is the lord of all things. Vicars do not have all the powers of their principals. The pope, for instance, does not have any special power over nature.

Dante also addresses the venerable allegory of the Two Swords. The proof-text is Luke 22:38, in which Peter offers Jesus two swords, and Jesus says they are enough. The lesson usually drawn from this exchange is that church and state are separate. Papalist propaganda, however, noted that the two swords remained in Peter's keeping, and so argued that the spiritual and the temporal power were both ultimately in the pope's keeping. Dante simply denies that the analogy is relevant, dwelling instead on the meaning of the verse in context.

No doubt the doctrine in question is not worth much, but one wonders how a poet could dismiss such an important metaphor. The analogy of the two swords runs right through Western history. When US senators debate whether public funds should be available to faith-based organizations, that is still the pope and the emperor arguing about who has the authority to invest the bishops of Germany. Unlike in other civilizations, church and state in the West are always distinguished, even in those periods when they closely supported each other. Even when the ecclesiastical power seems to have wholly lapsed, it is natural for academics and artists to claim the privileges and influence traditionally granted to priests.

Inevitably in any medieval discussion of the temporal power of the papacy, Dante addresses the Donation of Constantine. This legend, aided by some forged documents, had it that, in the fourth century, the Emperor Constantine had given the pope the authority to govern Italy and the western empire. Dante does not dispute the authenticity of the Donation, but he says that nothing more could have been involved than the transfer of a right of guardianship.

Why so? Because, as Dante tells us, whatever is contrary to the nature of a thing is not to be numbered among its powers. Now one of the essential features of the empire is its universality; it has the right of universal jurisdiction, even when it does not have the fact. To divide the empire by ceding sovereignty over a particular region would have been to destroy the empire as such. The powers of the emperor, which derive from the nature of the empire, could not have included such a grant. Moreover, the Church by its nature could not have received such a grant, since the Church cannot own property, but only the fruits of property. (This was, of course, the ideal of the radical Franciscans.)

The tranquility of order that the emperor protects is important for the salvation of all men. The emperor's authority is therefore providential, but the authority belongs to the office itself. The authority of the emperor could not have come from the Church, since the empire antedates the Church. Furthermore, since the emperor's authority comes directly from God, the Electors of the Holy Roman Empire do not really choose the emperor. Rather, they simply declare where the right to the office lies.

* * *

I have occasionally noted that the instrument of abdication and dissolution issued by the last Holy Roman Emperor in 1806 seems to contravene the provisions of the Golden Bull of 1356, which guaranteed the prerogatives of the electors. Thus, it is arguable that the emperor did not have the authority to dissolve the empire. However, even if that is a correct reading of the law (which I rather doubt), that is still not the kind of indissolubility that Dante was talking about. Even if the constitutions of the empire had contained explicit provisions for its dissolution, the empire still could not have been dissolved. Its existence is not contingent on politics; it is the one politically necessary being.

The political theory of the modern era was designed specifically to do away with this kind of thinking. There have been schemes for world order in that time. Some, like the Concert of Europe, were reasonably effective. However, even the most idealistic internationalists thought in terms of positive law, of flesh-and-blood legislators creating laws and treaties with visible texts. Only toward the end of the 20th century did we see a return of the insistence that a universal law must already exist in some sense; more important, we have seen a return of the willingness to act as if such a law existed. This is as true of the neoconservative establishment in the United States as it is of proponents of the International Criminal Court. Neither group is likely to get quite the world it expects, but their worldviews are not as far apart as they imagine.

The empire is like the doctrine of the Two Swords: it is among the insistences of the West, which take different forms at different times. Dante's Holy Roman Empire is long gone. So is Charles V's. So, one suspects, will be the United Nations in its current form. Even today, though, we see that men are beginning to repeat in modern form the reproof that Dante wrote to his own obdurate city during an imperial siege:

“Why are you stirred by this will o' the wisp to abandon the Holy Empire and, like builders of a second Babel, to embark on new forms of state so that the Florentine sovereignty should be co-ordinate with the Roman?” 

Copyright © 2003 by John J. Reilly

Why post old articles?

Who was John J. Reilly?

All of John's posts here

An archive of John's site

The De Monarchia
By Dante Alighieri

The Long View 2003-07-17: More on Chesterton & Harry Potter

G. K. Chesterton

G. K. Chesterton

It is a fun thought experiment to imagine what G. K. Chesterton would have said about the Iraq War of 2003. John was right to note that Chesterton was no kind of pacifist. After all, he wrote a poem commemorating the Battle of Lepanto. However, he also had a fierce love of the small and local, and a great distaste for the grandiose and puffed up. Thus, he loved England, and wasn't overly fond of the British Empire.

Chesterton was a supporter of the Great War, even though his younger brother Cecil was among its casualties. At this remote distance, that war seems like it was a really, really bad idea. On the other hand, he was also a fierce critic of the Boer War, which was a nasty little imperial war that richly deserved skewering. Ultimately, I'm not sure I know what Chesterton might have thought.

I do think that Chesterton would have shared John's horror of chaos and anarchy. And like John, I think Mad Max, rather than 1984, is the future we are more likely to face.


More on Chesterton & Harry Potter

Continuing in my reading of G.K. Chesterton's Autobiography for insight into the Anti-Terror Wars, I find that it would be misleading to equate his "anti-imperialism" with that of the critics of US policy today. His ideas on the subject would not at all please the folks at Antiwar. He was, in fact, far more belligerent than Samuel Huntington, who merely described the "clash of civilizations" without actually advocating it. Consider this passage:

[H. G. Wells] defends the only sort of war I thoroughly despise, the bullying of small states for their oil or gold; and he despises the only sort of war that I really defend, a war of civilizations and religions, to determine the moral destiny of mankind.

If Chesterton were alive today, he would have opposed the war in Iraq if he thought that it was primarily about oil. That was much the reason he opposed the Boer War, which really was chiefly about gaining complete control of southern Africa's gold and diamonds. If he were convinced that the Iraq campaign were part of a larger war against Islamicism, he would certainly have supported it. He would also have relished the way the war outraged the world's progressives. In his view, there "is only a thin sheet of paper between" internationalists and imperialists.

Does this mean that he could have been a happy contributor to The Weekly Standard, or even The Daily Telegraph? Possibly not. Certainly he was leery of the principle of preemption in foreign policy. He tells a parable about a householder who shoots a burglar, whom the householder discovered in his garden. Chesterton finds this use of force commendable, but he distinguishes the case from preemption:

If [the householder] had gone out to purify the world by shooting all possible burglars, it would not have been a defensive war. And it would not have been a defensible one.

There is no satisfaction in arguing with the dead: the silliest title in my library is A Challenge to C.S. Lewis, written more than a decade after Lewis died. Still, I might say to Chesterton's shade that he was right, but his example is beside the point. For the most part, the burglars in other people's gardens are no concern of the private householder. On the other hand, when you pay taxes to support a police force, you are in effect waging war against all possible burglars. Also, some societies have legal systems that rely on the self-organization of citizens. Ancient Iceland worked like that; so does the international system today. It is possible to be responsible for other people's burglars. The question is: who is the legitimate authority?

Chesterton was not pleased at the prospect of such an authority on a global scale, unless perhaps it were the Vatican. Ever the anti-pacifist, he was a keen supporter of the First World War from the beginning. He never wavered from that position, even after most of his friends had decided the war was a mistake or a crime. However, for him the war was about the defense of small nations, and the need to humble the pride of Prussia. He resisted rationales that were more, well, global:

But I am far from certain that a War to End War would have been just. I am far from certain that, even if anybody could prevent all protest or defiance under arms, offered by anybody anywhere under any provocation, it would not be an exceedingly wicked thing to do.

Anyone who has read H. G. Wells's The Shape of Things to Come (1933) knows exactly the wickedness that Chesterton had in mind. That book is essentially a retelling of The War of the Worlds, except that the conquerors are armed Fabians, and their victory is complete, permanent, and, in Wells's view, the best possible outcome. The problem with Chesterton's objection is that it is not an objection to world government, but to government.

Possibly because of the period in which I grew up, anarchy has always seemed to me the greater danger. When I was a small child, too young to read 1984, the H. G. Wells world of totalitarian regimentation no doubt seemed a pressing danger. By the time the year 1984 arrived, however, the real danger seemed to be the world of Mad Max. It still does.

* * *

A final note about Chesterton: For someone who did not purport to be a systematic thinker, GKC actually did a pretty good job of not contradicting himself. Nonetheless, he rarely expressed his basic insights in sustained argument. He would have been outraged at the comparison, no doubt, but his style was really not so different from Nietzsche's: heavy on the aphorisms and paradoxes, short on theory. Nietzsche favored aphorisms because he was skeptical about thought itself. Chesterton, in contrast, was a sort of Thomist: he was ideologically committed to the principle that abstract formulas can embody reality. His reluctance to make sustained theoretical argument seems to have grown out of his attachment to the particular, the local, the personal. One of the advantages to Thomism is that it provides assurance that the finite can indeed reflect the infinite.

* * *

Perhaps there has already been enough discussion of the deep significance of Harry Potter, but I could not resist following a link to this title: Harry Potter and the Future of Europe. The article, by Jeff Fountain, argues on the basis of personal experience that the Potter books really are a sign of the times:

While using techniques of magic and mythical creatures, Christian fantasy writers like Lewis and Tolkien develop their imaginary worlds within their own personal commitment to orthodox Christian belief in a sovereign God. Rowling does not share that commitment. Although she denies any personal belief in the magic her books portray, she still tells her readers, "It’s important to remember that we all have magic inside of us."

Unlike these Christian fantasies, Harry Potter is a post-Christian creation set within an occult cosmology. And his phenomenal popularity among young and old signals where our western culture seems to be headed.

This is very similar to the point about the resilience of the "Perennial Philosophy" that I raised some years ago in a review of Robertson Davies' book, The Cunning Man. In fact, back in 1986, Owen Thomas suggested in Christianity Today that the real competitor for Christianity in the West was never Marxism or materialism, but a sort of neoplatonism.

Was Quidditch all it took to get Plotinus to the best-seller slot on Amazon?

Copyright © 2003 by John J. Reilly

Why post old articles?

Who was John J. Reilly?

All of John's posts here

An archive of John's site

The Long View: The Glass Bead Game

I don't think this counts as a prediction, but there is an interesting parallel to subsequent events. John says:

“The Glass Bead Game” is about the education and career of one Joseph Knecht, whose surname means “serf” or “servant.” He rises through the elite schools of his society to the pinnacle of intellectual life, the position of Magister Ludi, the Master of the Game. Though Knecht's career as a scholar and a diplomat owes something to his native charisma, his life is the tale of how he masters and perfectly embodies the traditional role for which he has been trained. Then, having reached the summit, he walks away from the whole structure, making a resignation rather more shocking than a papal abdication would be. Hesse tries to show that this withdrawal was not a rejection of Knecht's upbringing, but its fulfillment.

After Joseph Ratzinger became Pope Benedict XVI, John said that Pope Benedict is exactly the kind of man who would enjoy playing the Glass Bead Game. As it turns out, he is also exactly the kind of man who enjoys resigning from the Glass Bead Game.


Magister Ludi: The Glass Bead Game

By Hermann Hesse

German Original “Das Glasperlenspiel” (1943)

English Translation by Richard and Clara Winston (1969)

520 Pages; Approximately US$18.00

ISBN: 080501246X

Editions are available from Henry Holt and Bantam.

This book earned Hermann Hesse his Nobel Prize for literature in 1946. World War II had just ended then, so the novel's depiction of a debellicized future for Europe no doubt had special appeal in the German-speaking world. “The Glass Bead Game” is not an arbitrary Utopia, however. (The renderings of the title for some editions are arbitrary, unfortunately; sometimes it's “Magister Ludi” or the English equivalent, “The Master of the Game.”) What we have here is an example of speculative fiction that applies a humane gloss to the model of history in Oswald Spengler's “Decline of the West.” The result belongs to that small set of speculative futures that are both surprising and plausible.

Hermann Hesse was born in Germany, in 1877, where he achieved early success as a journalist and novelist. During the First World War he was an example of that modern conundrum, the pacifist activist. His career took off in 1919, when he published “Demian” and moved permanently to Switzerland. At his death in 1962, he was thought of as an esoteric and even somewhat obscure writer. Immediately afterward, however, his books gained wide popularity as guides to the path of spiritual enlightenment.

Because of the assimilation of his work by the Counter Culture of the 1960s, Hesse is often remembered, whether fairly or not, as the novelist of the truculent intellectual adolescent. This reputation is reflected in the four novels for which he became best known in the English-speaking world: “The Glass Bead Game,” “Siddhartha,” “Demian,” and “Steppenwolf.” The first three are Bildungsromane: novels about education and growing up. In all three, the protagonists eventually transcend cultural norms. The fourth is about a midlife crisis, but “Steppenwolf” deals, on the surface at least, with sex, drugs and rock-and-roll (well, with jazz; it was published in 1927). Perhaps for that reason, it has often served as the smart teenager's answer to “The Catcher in the Rye.”

The Counter Culture has long since become the middle-aged establishment, but Hesse's books still renew their readership. For one thing, they are informed by a Jungian interpretation of Chinese and Indian mysticism, features which are all perennial favorites for several audiences. Hesse's Spenglerian view of history fell out of fashion after the middle 20th century, but that does not seem to have hindered the reception of his books. Quite the opposite, in fact: without familiarity with Spengler, “The Glass Bead Game” in particular seems even more original and mysterious. I might also mention that Hesse's German is very accessible, and it translates well into English.

The text does not say just when the story takes place. However, Hesse let it be known that the principal narrator is supposed to be writing around the beginning of the 25th century, about a person who had lived long enough ago for legends about him to spring up. The action, then, is probably in the early 2300s. We learn pieces of the historical background, which we will discuss below, but part of the book's purpose is to depict an era that is anti-historical, or post-historical. Indeed, the book is largely devoid of the sort of things that novels set in the future often emphasize. We are told there are radios, telephones, ground cars, and trains: so much for technology. It is clear that a “Century of Wars” lies in the past, but the usual term for what we call modernity is the Age of the Feuilleton, of trivial and occasional literature. There are different states, or at any rate countries, which have parallel cultural and educational institutions. We learn almost nothing about the state of the world, except that Europe is extremely peaceful and has been so for longer than living memory.

“The Glass Bead Game” is about the education and career of one Joseph Knecht, whose surname means “serf” or “servant.” He rises through the elite schools of his society to the pinnacle of intellectual life, the position of Magister Ludi, the Master of the Game. Though Knecht's career as a scholar and a diplomat owes something to his native charisma, his life is the tale of how he masters and perfectly embodies the traditional role for which he has been trained. Then, having reached the summit, he walks away from the whole structure, making a resignation rather more shocking than a papal abdication would be. Hesse tries to show that this withdrawal was not a rejection of Knecht's upbringing, but its fulfillment.

All this is expounded through long talks and little incident (there is one stinging memorandum). At school, Knecht is assigned to defend the educational system against a schoolmate named Plinio Designori (there are many Italian names in this book) who later facilitates his departure from his exalted status. Knecht receives precocious promotions. Knecht's mentor turns out to be a saint. Aside from Knecht's resignation, the most dramatic episode is a long stay in a Benedictine monastery. There Knecht is instructed in the neglected subject of world history by wise old Father Jacobus, who critics say is supposed to represent the great Swiss historian Jacob Burckhardt. (The most annoying character is a neurasthenic named Tegularius, who represents Friedrich Nietzsche.) There is just one, very minor, female character: Designori's wife. Knecht dies of heart failure on his first day as tutor to Designori's son, thereby giving the sulky good-for-nothing something to live up to. The conversations are really interesting.

One should note that the life of Joseph Knecht in “The Glass Bead Game” was planned as just one of a number of lives of the same man; Hesse had at first envisioned an anthology of incarnations, from the prehistoric past to the distant future. In the course of composition, however, the bulk of the book became a hagiography of the famous and infamous Magister Ludi. Just three other incarnations survive as appended stories, supposedly as examples of the school exercises the students of Knecht's time are assigned to develop the historical imagination. They deal with the lives of an ancient shaman, a Desert Father, and a hard-luck Indian raja.

Hesse renders the Glass Bead Game of the title absolutely believable by not describing it in detail. We are never told just what a match consists of. Some early prototype of the Game used actual glass beads. The great annual Game matches are followed as closely and widely as international soccer (the latter isn't mentioned in the book, by the way). Those matches use some unspecified projection equipment. Calligraphy enters into it. So does music. On the other hand, people can and do play by themselves.

The Game seems to be about spotting and extending homologies in the phenomena of nature and in cultural history. The notes of a musical scale, for instance, can conform to the arrangement of the elements in the Periodic Table, or the growth pattern of a plant can conform to the expansion and leveling off of an animal population. Organic growth in general can be shown to have something in common with the efflorescence and exhaustion of an artistic style. Aquinas called these commonalities “intelligible elements”; Leibniz actually tried to create a numerical language that could express them and even generate them. Hesse posits that some such project eventually succeeds and becomes institutionalized. The Game players seek to express all the phenomena of history and science in the Game language. An international authority oversees additions to the form and subject matter.
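
To see what such a homology might look like in modern terms, here is a minimal sketch of my own; nothing like it appears in the novel, and all parameter values are invented for illustration. A plant's growth and an animal population's rise both follow a logistic curve, and once each series is normalized, the shared shape is plain.

```python
# Toy illustration of a structural homology: two unrelated phenomena
# share the same underlying form, the logistic curve. The Game, as
# described above, formalizes and extends this kind of correspondence.
import math

def logistic(t, ceiling, rate, midpoint):
    """Slow start, rapid growth, leveling off."""
    return ceiling / (1 + math.exp(-rate * (t - midpoint)))

# Plant height in cm over days; rabbit population over months.
plant = [logistic(day, ceiling=120, rate=0.3, midpoint=30) for day in range(61)]
rabbits = [logistic(month, ceiling=5000, rate=0.9, midpoint=10) for month in range(21)]

# Normalize both to [0, 1]: despite unrelated units and time scales,
# each sequence rises along the same S-shaped trajectory.
normalize = lambda xs: [round(x / max(xs), 2) for x in xs]
print(normalize(plant)[::10])   # 0.0 ... 0.5 ... 1.0
print(normalize(rabbits)[::5])  # 0.0 ... 0.5 ... 1.0
```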

All in all, the Game sounds like a competitive jazz of literary and scientific allusions. It serves as an outlet among the finest minds for the creativity that in prior eras would have found expression in art. It is more than a mere trial of mental dexterity, however, because it contains a strong component of meditation. Indeed, many people pursue it as a path to spiritual enlightenment, as a way to perceive Being behind the shimmering veil of thought. The Game simply makes the veil visible, however; it is no business of the Game to suggest what may lie behind the veil.

The hints we get about the Game make it sound more than a little like the “I Ching,” the famous Chinese book of divination. However, Hesse goes out of his way to dispel any implication that they might be equivalent. One Sinicizing teacher of Knecht says this in reply to his suggestion that a Game might be based on the “I Ching”:

“Anyone can create a pretty little bamboo garden in the world. But I doubt the gardener would succeed in incorporating the world into his grove.”

As Magister Ludi, Joseph Knecht presides over the Game center at Waldzell, which is located in an unnamed German-speaking country. The Magister Ludi is just one of the dozen Magisters of the national Board of Educators, however. There is also a “Magister Mathematicae,” for instance, and a “Magister Musicae,” and so on. The primary duty of the Magisters is to oversee the teaching of their subjects in the elite schools. The Magisters as a group oversee Castalia, the “province” (it is never clear to what extent the characterization is geographical or administrative) of all disciplines.

The students in the system of elite schools, all boys, are recruited as children. They normally serve as teachers, researchers or Game players for life. The Order to which they belong, in fact, holds them to a life of comfortable poverty and bachelorhood; to judge by this book, that also means celibacy after their student years. The people call them “Mandarins,” with some reason. Unlike the Mandarins of traditional China, however, their power does not extend beyond pedagogy.

There are also ordinary schools, up through the university level, which prepare their students for the practical professions. Castalia provides many of the teachers for the public system, but the Magisters do not control it. Castalia is wholly dependent on public funding. The Magisters even spend a fair amount of time lobbying.

It is a measure of the distance that the West has traveled by the Age of Castalia that the 20th century idea of biography has become a historical curiosity. The narrator of Knecht's life puts it this way:

“[F]or the writers of those days who had a distinct taste for biography, the essence of a personality seems to have been deviance…We moderns, on the other hand, do not even speak of major personalities until we encounter men who have gone beyond all original and idiosyncratic qualities to achieve the greatest possible integration into the generalities.”

In the age of Castalia, the West has again become a Traditional society, in the special sense of “tradition” coined by René Guénon. Though he does not say this, Hesse seems to have tried to map out a trajectory for the West like that of China after the Sung Dynasty. After several centuries of dramatic growth, chaos, and experimentation, Chinese culture turned toward consolidation under the banner of Neo-Confucianism. Like the Glass Bead Game, that philosophy is as comprehensive as it is final. However, Hesse is at pains to emphasize that a comparable transition in the West need not produce an alien world, much less the fascist outcome that some followers of Tradition favor. In the Age of Castalia, there are still political parties, elections, and newspapers. Nonetheless, the creativity of the modern era is over, as well as its violence and instability.

The Age of Castalia understands the prior thousand years in this way. Two trends had been in play since the end of the Middle Ages. One was the liberation of thought from authority, particularly from the Church of Rome. The other was the “covert but passionate search” for legitimacy for this freedom, for a new and sufficient authority arising from reason itself. The result was disaster, followed by recovery:

“[T]hey were already on the verge of that dreadful devaluation of the Word which produced, at first in secret and within the narrowest circles, that ascetically heroic countermovement which soon afterward began to flow visibly and powerfully, and ushered in the new self-discipline and dignity of the human intellect.”

The reformation of the life of the mind began, clandestinely at first, even in the 20th century. This was done under the impetus of musicologists and of the loosely organized religious movement called the Journeyers to the East (a reference to Hesse's novel of similar name, published in 1932).

After the crisis of civilization, intellectual life became monastic. People understood that their culture was no longer creative, but they also understood that there were still worthy goals to pursue. There was still the work of pious preservation, of systematization and sympathetic critique. The liberal arts began to aspire to the rigor of engineering. The Glass Bead Game was just part of a general turn toward synthesis.

By Knecht's time, no one is much impressed by Enlightenment philosophy anymore. Kant is little known, while the High Scholastics are part of the regular curriculum. On the other hand, everyone is familiar with the music of the 18th century. That's the only reason they think the 18th century is important. As Spengler predicted, the controversies of the modern era have become literally incomprehensible. A Castalian refers to a long-defunct economist sect, for instance, that is probably supposed to be Marxism, but it's hard to tell; the ideology is just too alien to mean anything to him.

Knecht's society is by no means a theocracy, but neither is it secular in the modern sense. This is, no doubt, a nod to another of Spengler's prophecies: the “Second Religiousness.” The Vatican, based on its moral authority, is again a force in culture and in world politics. (Knecht spends that time at the monastery to help Castalia negotiate an agreement to send an ambassador to the Holy See.) Protestantism has died out. However, historians within the Church remember Protestantism rather fondly. As Father Jacobus puts it: “They were unable to preserve religion and the Church, but at times they displayed a great deal of courage and produced some exemplary men.”

One might think that all this hierarchy and authority would provoke a backlash, but no. The elite schools and the hierarchy of Castalia are tolerable precisely because society is not going anyplace. The qualities of a great musician, for instance, are said to be “enthusiasm, subordination, reverence, worshipful service.” Maybe only superior people have those qualities, but their superiority does not include the sort of genius that demands attention for its novelty. The Magisters do not conceive avant-garde ideas and expect people to follow. The hierarchy is the embodiment of a consensus by which the hierarchs themselves are the most strictly bound.

However, though history may have ended, time has not stopped. In preparation for his resignation, Knecht warns the other Magisters, “The world is once again about to shift its center of gravity.” Ominous but unnamed developments in the Orient threaten not just peace, but life and liberty. Serious rearmament could be just a generation or two away. When that happens, Castalia may seem an over-expensive luxury, unless its spirit can be communicated to society as a whole.

The Order is not impressed:

“In the view of the majority, the calm that descended upon our Continent must be ascribed partly to the general prostration following the bloodlettings of the terrible wars, but far more to the fact that the Occident has ceased to be the focal point of world history in which claims to hegemony are fought out.”

Oddly for men who must convince their government every year of the indispensability of their institution, the Magisters also have little patience with Knecht's argument that the calm and sanity of Castalia have themselves been a force for peace. Instead, the Magisters reply that Castalia, and indeed the life of the mind, are not historical factors:

“Rather, culture or mind, or soul, has its own independent history – a second, bloodless, and sanctified history – running parallel to what is generally called history.”

Knecht does not resign as Magister in order to sell war bonds. The short explanation for his departure from Castalia is that he had exhausted his own capabilities. There was nothing left for him but the “eternal recurrence” of routine. More important, though, was the characteristic way in which he fulfilled his destiny as a servant. Like St. Christopher, he possessed “a self-reliance which by no means debarred him or hampered him serving, but demanded of him that he serve the highest master.” Because of his introduction to history, he understands that the Glass Bead Game is not the final truth. The Game, too, will prove to be ephemeral:

“Yes, Castalia and the Glass Bead Game are wonderful things; they come close to being perfect. Only perhaps they are too much so, too beautiful. They are so beautiful that one can scarcely contemplate them without fearing for them.”

“The Glass Bead Game” is the story of the progressive “awakenings” in Knecht's life. He comes to realize that these gates through which he passes do not lead to any inner sanctum. Rather, they are awakenings to the reality of each new situation. The same can be said for the progress of the spirit in history. The lack of linearity, however, does not imply a lack of exigency:

“In history, too, moments of tribulation or great upheavals have their element of convincing necessity; they create a sense of irresistible immediacy and tension. Whatever the consequence of such upheavals, be it beauty and clarity or savagery and darkness, whatever happens will bear the semblance of grandeur, necessity and importance and will stand out as utterly different from everyday events.”

In some ways, “The Glass Bead Game” represents the road that Spengler did not take. At one point in the 1920s, Spengler replied to the charge that “The Decline of the West” advocated nothing but pessimism and despair with the assertion that he could fittingly have called the book “The Fulfillment of the West” or the “Perfection of the West.” His thesis, after all, was that the West may have exhausted its creative potential, but that modernity was the age in which it would fashion the final forms of Western Civilization in art, science, politics and religion. His model of history was quite consistent with a future that was humane, peaceful, and orderly. Sadly, he was distracted from pursuing this insight by Nietzsche's nihilism and the sour politics of the Conservative Revolution. More and more, he foresaw a Faustian future of disaster and tyranny.

In “The Glass Bead Game,” however, Hesse took the hint. The most intriguing story in the book deals with the final stage in the life of Knecht's old mentor, the Magister Musicae:

“He certainly does not seem to me to be close to his life's end, but his way of taking leave of the world is unique…[I]t is as if he has been on his way elsewhere for some time, and no longer lives entirely among us…
[H]is cheerfulness, his curious radiance…While his strength is diminishing, that serene cheerfulness is constantly increasing.”

Many legends later grew up about the Transfiguration of the Magister Musicae, we are told. The interesting point is that the episode seems to relate Spengler's prediction of the Second Religiousness to the palpable aura of eternity said to surround some living saints. Knecht remarks:

“Even though whole peoples and languages have attempted to fathom the depths of the universe in myths, cosmologies, and religions, their supreme, their ultimate attainment has been this cheerfulness.”

The old Magister, however, was not just any kind of saint, but a specifically Castalian saint. The sanctity he manifested was intrinsic to the Game, which is the final form of the spirit of the West:

“With us scholarship, which is the cult of truth, is chiefly allied also with the cult of the beautiful, and also with the practice of spiritual refreshment by meditation. Consequently it can never entirely lose its supreme cheerfulness.”

Good Spenglerians (among whom we must number Spengler himself) tend to imagine the final stage in the life of the West as a heroic last stand, perhaps lasting centuries but ending in defeat. Evil Spenglerians, not a trivial class, hope for conquest and domination. Hesse's book hints at the possibility that the same insights into historical morphology might be put to quite a different use. Is the world ready for holy Spenglerians? Maybe someday.

Copyright © 2002 by John J. Reilly


Host Cell Lines and Homo Sapiens

Greg Cochran has re-posted an interesting article about host cell lines.

A host cell line is a microorganism that was until fairly recently a part of some higher organism – roughly speaking, a contagious cancer. We know of one good example, transmissible venereal tumor, also known as canine venereal sarcoma or Sticker’s sarcoma, a contagious neoplasm of dogs. It is not contagious in the same sense as liver or cervical cancer, which are (usually) consequences of viral infections. In those cases, it is the virus that is infectious; here it is the cancer itself. Viable cells become engrafted onto mucous membranes and grow in the new host animal. Transmission is usually sexual, but licking or inhaling sometimes causes oral or nasal tumors. Chromosomal and genetic studies indicate that all cases of TVT share a common origin – all share a particular pattern of chromosomal rearrangement and carry characteristic insertions.

Greg goes on to speculate that the cell line derived from a cervical adenocarcinoma in Henrietta Lacks in 1951, HeLa, might be something like a host cell line. Descended from a homo sapiens, but a new species. Since I just happen to be reading Dune, this reminds me of the Bene Gesserit belief that not everyone who happens to be a homo sapiens counts as a human.

Which further reminds me of one of my favorite ideas of John Reilly's: humans, homo sapiens, and persons are different things. Discoveries like this just reinforce my conviction that John was right. However, I've found that right-thinking people seem obscurely scandalized when I repeat this. I think this wariness is probably a good thing, because de-humanizing people is usually the first step in justifying doing something bad to them. To say that a human being and a person are not logically identical is not the same thing as saying we should de-personalize some human beings. However, it does open that up as a possibility. Thus I am not surprised when people seem put off.

However, the fact that the distinction can be misused does not make those three things logically identical. They cannot be, because they are different kinds of things; it is a category mistake to identify them. John summarized it thus:

A human is an essence (if you don't believe in essences you don't believe in human beings); a homo sapiens is a kind of monkey; and a person is a phenomenon. Perhaps I read too much science fiction, but it is not at all clear to me that every human must necessarily be a homo sapiens. As for "person," the primary meaning, an entity, conscious or otherwise, that you can regard as a "thou," is often conflated with the notion of a person as an entity able to respond in law, either directly or through an agent.

I think that the human beings we know of are homo sapiens, and that homo sapiens are persons. I just think you have to make an argument that these things are true, rather than making an indefensible assumption about it.

The last distinction John makes in the quote above often trips people up. If you conflate the two senses of the word person, and then further identify that with human being, I can see how the idea might be offensive. The problem is that the conflation isn't true. If you can look past the controversies of contemporary American politics, the idea that a corporation can be a person has allowed institutions to flourish in the West, as opposed to tribes, nations, or dynasties, which are defined by common descent. An institution can continue through time once the founder has died, regardless of the familial relations of the people who compose it.

Societies that lack the ability to create groups with a common purpose that do not depend on ties of kinship are weaker than those that can. We shouldn't cast that aside blithely.

The Long View: Tragedy and Hope

The Magistra likes to watch period dramas on Netflix. This is a work of the same genre, although Quigley probably didn't know he was writing one at the time. This book is a fossil from that brief and peaceful period in the 1960s before the big cultural upheavals of 1968, a period that for most Americans really felt more like the 1950s, but more stylish.

That was the time when "liberal Catholic" meant something like Pope Francis, left-of-center politically, but wholly orthodox. It was also a high point of the self-confidence of the West as a whole, and of American liberalism in particular. One of the enthusiasms of this book is Operations Research, a formal science that emerged out of the Second World War. This is a discipline that does not bulk large in anyone's mind today, but it was the nanotech of its day, capable of solving any problem in principle. John notes that the belief in Operations Research methods was part of what made Robert McNamara think the war in Vietnam could be managed using metrics and scientific methodology. This famously did not go well, although I can think of at least one old OR man who might dispute that interpretation.

Quigley's book has achieved a kind of fame, at least amongst the conspiracy minded, due to his singling out of the Round Table groups founded by Cecil Rhodes. The most familiar of these groups today is the Council on Foreign Relations, but Rhodes also founded the Rhodes Scholarships with the same motivation as the Round Table groups; he wanted to build an international network of public-spirited men across the English-speaking world.

The Round Table groups were really just a spontaneous outgrowth of the first period of internationalization, when international trade began to rapidly outpace international institutions, and some mechanism was needed to fill the gap. Industrious Victorians like Rhodes were only too happy to come up with something clever to do just that. As international institutions caught up, the influence of the Round Table groups declined, if for no other reason than they were no longer the only players in the game.

Anglophilia and conspiracy theories are not the main interest of this book, however. Quigley summed up everything he thought was best about the Western world thus:

"The Outlook of the West is that broad middle way about which the fads and foibles of the West oscillate. It is what is implied by what the West says it believes, not at one moment but over the long succession of moments that form the history of the West. From that succession of moments it is clear that the West believes in diversity rather than in uniformity, in pluralism, rather than in monism, or dualism, in inclusion rather than exclusion, in liberty rather than in authority, in truth rather than in power, in conversion rather than in annihilation, in the individual rather than in the organization, in reconciliation rather than in triumph, in heterogeneity rather than in homogeneity, in relativisms rather than in absolutes, and in approximations rather than in final answers. The West believes that man and the universe are both complex and that the apparently discordant parts of each can be put into a reasonably workable arrangement with a little good will, patience, and experimentation. In man the West sees body, emotions, and reason as all equally real and necessary, and is prepared to entertain discussion about their relative interrelationships but is not prepared to listen for long to any intolerant insistence that any one of these has a final answer." [Page 1227]

While at first glance this might seem to be merely a summary of the early 1960s liberal consensus, its roots go far deeper. Quigley himself considered this to be an interpretation of Aquinas. That is defensible; there really are ideas like this in Aquinas's thinking. It is also not the only possible interpretation of Aquinas. But it does represent a durable line of thought in the history of the West.

This line of thought may or may not be true, but it has been with us a long time.


Tragedy and Hope:
A History of the World in Our Time
by Carroll Quigley
First Published 1966
The Macmillan Company
(Reprint GSG & Associates)
1,348 Pages, US$35.00
ISBN 0-945001-10-X


For reasons that are only partly the author's fault, "Tragedy and Hope" has become one of the key texts of conspiracy theory. Famous for its exposition of the workings of the Anglophile American establishment during the first half of the twentieth century, the book is reputed to have "named names" to such a degree that the hidden masters of the world tried to suppress the unabridged edition. It did not diminish the book's reputation that Carroll Quigley (1910-1977), a historian with the Foreign Service School at Georgetown University, made a deep impression on US-president-to-be Bill Clinton during Clinton's undergraduate years at that university. We have Mr. Clinton's own word on this, so it must be true.


If the hidden masters did try to suppress the book when it first appeared, they seem to have lost interest by now; the only problem I had buying this enormous volume was carrying the 15 pounds of it home. "Tragedy and Hope" has no notes, no bibliography, and a very inadequate index. As with the Bible, its sheer size has done something to ensure that it would be more cited than read. For what it is intended to be, a history of the world from about 1895 to 1964, the book is a failure. As Quigley acknowledges, there are insuperable problems of perspective in writing about one's own time. On the other hand, the book's prejudices are fascinating. It was written at the point in the 1960s just before the American liberal consensus began to unravel. Perhaps as important for Quigley, that was also the brief interval after the Second Vatican Council when "liberal Catholic" did not mean someone who rejected all dogma and tradition. Beyond its value as a period piece, however, the book occasionally transcends its time. Its remarks about the future, presumably a future more distant than our present, are close to becoming conventional wisdom today.


Quigley's frame of reference is roughly that of Arnold Toynbee: the West, including Europe, the United States, Latin America, and Australasia, has entered an Age of Crisis. Other civilizations, when faced with analogous crises, solved them by entering an Age of Universal Empire. Universal Empires, however, are morbid: they are stultifying at best and eventually collapse in any case. Quigley's objection is not to international institutions, or even to world government. What the West must do, according to Quigley, is end its Age of Crisis without creating a Universal Empire through military conquest. The problem with the 20th century, down to the 1960s, has been repeated attempts by persons and groups to achieve universal power by force or manipulation.


This analysis sounds much more interesting than it is. Quigley's tale is pretty much a vindication of President Franklin Roosevelt's administration (1933-1945). By Quigley's account, the failure to adopt the policies of those years earlier in the 20th century led to the disasters of the Depression and the Second World War, while the need of the decades that followed was to expand and perfect the Progressive tradition they embodied. Much of the reputation of this book among conspiracy theorists rests on its account of the world financial system of the 1920s, when the Bank of England no longer had the power to regulate the system, as it had before the First World War. The gap was filled by private institutions acting in collusion with the heads of the central banks, generally without oversight from the world's major governments. A combination of bad luck and stupidity made the system collapse at the end of the decade, so that currencies became inconvertible, trade froze, and force displaced commerce both domestically and internationally. It's not hard to make ordinary banking practices sound like the work of the devil, and in this book the devil's little helpers are Morgans, Rothschilds, and Barings.


One can take or leave Quigley's long, very long, expositions of economic theory. Many readers will be inclined to leave an argument that suggests the whole of history was preparation for the ultimate enlightenment contained in John Kenneth Galbraith's "The Affluent Society," which argued for Keynesian macroeconomics and a mildly redistributive social policy. (Quigley clearly alludes to that book, published in 1958, but does not cite it.) In any case, Quigley described speculative, international finance-capitalism as a feature of the past; he did not think it had any relevance to his own day.


What chiefly ensured Quigley's work a lasting place in the pantheon of paranoia, however, was his attempt to provide a social context for this activity. This paragraph appears at the end of a tirade against McCarthyism:


"This myth, like all fables, does in fact have a modicum of truth. There does exist, and has existed for a generation, an international Anglophile network which operates, to some extent, in the way the radical Right believes the Communists act. In fact, this network, which we may identify as the Round Table Groups, has no aversion to cooperating with the Communists, or any other groups, and frequently does so. I know of the operations of this network because I have studied it for twenty years and was permitted for two years, in the early 1960s, to examine its paper and secret records. I have no aversion to it or to most of its aims and have, for much of my life, been close to it and to many of its instruments. I have objected, both in the past and recently, to a few of its policies (notably to its belief that England was an Atlantic rather than a European Power and must be allied, or even federated, with the United States and must remain isolated from Europe), but in general my chief difference of opinion is that it wishes to remain unknown, and I believe its role in history is significant enough to be known." [Page 950]


"Anglophilia" sounds like a debilitating psychological ailment, with some reason. In its American manifestation, it suggests a preference for tweedy clothes, water sports that don't require surf, and nominal affiliation with the Anglican Communion. The syndrome has a copious literature, much of it concerned with prep schools, but here is all you need to know in this context. The ideology of Quigley's network can apparently be traced to 19th century Oxford, indeed specifically to All Souls College, back when John Ruskin was expounding a compound of Gothic Revival aesthetics, the glory of the British Empire, and the duty to uplift the downtrodden poor. These ideas seized the imagination of Cecil Rhodes during his years at Oxford. He hoped for a federation of the whole English-speaking world, and provided the money and impetus for institutions to link those countries. (Lord Alfred Milner provided the organizing talent.) The best known of these efforts are the Rhodes Scholarships for study at Oxford. (Bill Clinton is among the many well-know recipients.) They also included informal "Round Table Groups" in the Dominions and the US, which sponsored local Institutes of International Affairs. The US version is the Council on Foreign Relations.


While the people in these groups were very influential (that is why they were asked to join), Quigley makes clear that the Round Tables never had everything their own way, even in the administration of colonial Africa, where both Rhodes and Milner were especially interested. As with the finance capitalists, the Anglophile network was essentially a league of private persons trying to fill a gap in the international system. As public institutions were created to exercise the Round Tables' consultative and communications functions, the network itself became less important.


Quigley makes the increasing marginalization of the Anglophile network perfectly clear, and in fact he does not suggest that it was ever more than one factor among many at any point in the 20th century. Nonetheless, it is his failing as a historian to suggest that a causal nexus can be inferred whenever two actors in a historical event can be shown to have met. Consider this excerpt from a discussion of the history of Iran:


"By that time (summer, 1953) almost irresistible forces were building up against [Prime Minister] Mossadegh, since lack of Soviet interference give the West full freedom of action. The British, the AIOC, the world petroleum cartel, the American government, and the older Iranian elite led by the shah combined to crush Mossadegh. The chief effort came from the American supersecret intelligence agency (CIA) under the personal direction of its director, Allen W. Dulles, brother of the secretary of State. DulIes, as a former director of the Schroeder Bank in New York, was an old associate of Frank C Tiarks, a partner in the Schroeder Bank in London since 1902, and a director of the Bank of England in 1912-1945, as well as Lazard Brothers Bank, and the AIOC. It will be recalled that the Schroeder Bank in Cologne helped to arrange Hitler's accession to power as chancellor in January 1933." [Page 1059]


I don't quite know what this is supposed to mean; that pretty much the same people overthrew Prime Minister Mossadegh as brought us Hitler? I am reminded of nothing so much as Monty Python's parody of an Icelandic saga, about the deeds of "Hrothgar, son of Sigismund, brother of Grundir, mother of Fingal, who knew Hermann, the cousin of Bob." Maybe this is Quigley's idea of "thick" description. Certainly "Tragedy and Hope" is thick with it; it goes on for pages and pages.


"Tragedy and Hope" is a fossil, perfectly preserved, of the sophisticated liberalism of the Kennedy era. Quigley takes a partisan position in the debates about nuclear strategy that began in the 1950s. (He sat on several government commissions on scientific questions, including the one that recommended creating NASA. The book explains the physics of nuclear weapons in some detail; Quigley does not just name names, he names the weight of fissionable material necessary for a bomb.) Thus, he praises Oppenheimer and condemns Teller, deplores the cost-cutting strategy of "massive retaliation" embraced by the Eisenhower Administration and supports tactical nuclear devices suitable for conventional war. "Tragedy and Hope" has prose poems to "Operations Research," the application of quantitative analysis to military affairs, which he ranks with Keynsian economics as one of the pillars of modern civilization.


Though it is not entirely fair to criticize even a book such as this for failing to foresee the immediate future, still I cannot help but remark how many of these ideas were tested in the 1960s and found wanting. The number-crunching military philosophy that Quigley endorsed was essentially that of Robert McNamara's Pentagon; as much as anything else, it is what lost the Vietnam War for the United States. Quigley covers Vietnam up through the assassination of President Diem in 1963, but gives no greater prominence to the conflict there than to other Cold War trouble spots. This book is good evidence, if any more were needed, that even the Americans who knew most had not the tiniest idea what they were doing.


The problem with the Kennedy Enlightenment is not that elements of its conventional wisdom were wrong; that is true of all eras. The great flaw was its totalitarian streak. Quigley expresses this attitude perfectly:


"The chief problem of American political life for a long time has been how to make the two Congressional parties more national and more international. The argument that the two parties should represent opposed ideals and policies, one, perhaps of the Right, and the other of the Left, is a foolish idea acceptable only to doctrinaire and academic thinkers. Instead, the two parties should be almost identical so that the American people can 'throw the rascals out' at any election without leading to any profound or extensive shifts in policy. The policies that are vital and necessary for America are no longer subjects of significant disagreement, but are disputable only in terms of procedure, priority and method..." [Page 1248-1249]


Quigley was aware that there was a substantial number of persons in the nascent conservative movement who did not think that all issues had been settled yet, but he regards their opinions as not just erroneous but illegitimate. Quigley has fits of class analysis, so he tells us that the traditional middle class, considered as a cultural pattern rather than an economic group, was evaporating because of growing prosperity and feminization. (His description of contemporary students as promiscuous, unkempt and unpunctual suggests he had some inkling of just how annoying the Baby Boom generation was going to be.) The Right, however, was dominated by a parody, also destined to be ephemeral in Quigley's estimation, of the disappearing middle class. The Right was "petty bourgeois" (he actually uses the term), grasping, intolerant and careerist. They were ignorant, even the ones who tried to get into top colleges on the basis of good grades, since those grades were achieved by unimaginative drudgery rather than by any real engagement with the life of the mind. The Right even came from unfashionable places, principally the Southwest, where they made fortunes in dreadful extractive industries, like oil and mining. In Quigley's telling, the Right, particularly as manifested in the Republican Party, is merely ignorant: it must be combated, but need not be listened to.


Let us think less harshly of Bill Clinton hereafter, if these were the opinions he heard from the Wise and the Good of his youth.


The infuriating thing is that Quigley knows better. He was well aware of the totalitarian trajectory of the respectable consensus of his day, and he was not pleased by it. Consider this paragraph:


"Because this is the tradition of the West, the West is liberal. Most historians see liberalism as a political outlook and practice founded in the nineteenth century. But nineteenth-century liberalism was simply a temporary organizational manifestation of what has always been the underlying Western outlook. That organizational manifestation is now largely dead, killed as much by twentieth-century liberals as by conservatives or reactionaries...The liberal of about 1880 was anticlerical, antimilitarist, and antistate because these were, to his immediate experience, authoritarian forces that sought to prevent the operation of the Western way. ...But by 1900 or so, these dislikes and likes became ends in themselves. The liberal was prepared to force people to associate with those they could not bear, in the name of freedom of assembly, or he was, in the name of freedom of speech, prepared to force people to listen. His anticlericalism became an effort to prevent people from getting religion, and his antimilitarism took the form of opposing funds for legitimate defense. Most amazing, his earlier opposition to the use of private economic power to restrict individual freedoms took the form of an effort to increase the authority of the state against private economic power and wealth in themselves. Thus the liberal of 1880 and the liberal of 1940 had reversed themselves on the role and power of the state..." [Page 1231]


Quigley strongly suspected that, whatever else may happen to the West, democracy was likely to be a decreasingly important feature. In part, this was for a reason that would gladden the hearts of defenders of the Second Amendment of the US Constitution: the disarming of the citizenry, at least in comparison to the military. Universal male suffrage was partly a side effect of the dominance in the 19th century of the rifle-armed mass infantry. Firearms were cheap and great equalizers; governments could use such armies only with a high level of consent from the citizens who composed them. In the 20th century, however, the new weapons were beyond the means of private parties or groups, and they could be operated only by trained experts. In a way, the world came back to the era of knights and castles, when the bulk of the population figured in politics chiefly as silent taxpayers.


Quigley did recognize that the trends of the 20th century up to his day might not go on forever, and at this point the book becomes positively disconcerting. He saw no end to the standoff between the US and the Soviet Union, except to the extent that their economic and political systems might be expected to converge in an age increasingly dominated by experts. ("Convergence": now that's a buzzword that brings back memories.) On the other hand, he did think that the lesser countries of each bloc would be able to operate more independently of the US and the USSR, and even to relax internally. He makes remarks about the possibility of balkanization and decentralization that might almost have been made by Robert Kaplan and Thomas Friedman, who are perhaps best known for their recent writing about chaos and disintegration in the world after the Cold War. Like other people writing 40 years later, Quigley also suggests that, simultaneous with increasing disorder and complexity, new international institutions would also flourish, so that the nations of his day would lose authority to entities both greater and smaller than themselves.


"Tragedy and Hope" suggests that the future may look something like the Holy Roman Empire of the late medieval period. [Page 1287] In principle, the empire was a federal hierarchy of authorities, but the principle was scarcely visible in the tangle of republics, kingdoms, and bishoprics that composed it. The Imperial Diet was as multichambered as a conch-shell, while the executive functioned only on those rare occasions when the emperor, an elected official, managed to persuade the potentates of the empire that what he wanted to do was in their interest. Actually, Quigley did not have far to seek for this model. The early European Economic Community of his day already was starting to look like just such a horse designed by a committee. Its evolution into the European Union has not lessened the resemblance. Quigley seemed to expect a parallel evolution of institutions universally, through the UN system, for which a united Europe would stand as a model. He is not perfectly clear on this point, however. As is so often the case when people talk about transcending national sovereignty, it is not clear whether they are talking about the evolution of the West, or of the world, or of both.


To broach a final topic, one of the things that struck me about "Tragedy and Hope" was Quigley's lack of interest in intellectual history, except for science. His treatments of ideology tend to be cursory, misleading or wrong. Lack of interest is his privilege, of course, but to write a 1,300-page book about the first half of the twentieth century without liking ideology is like owning a candy store and not liking chocolate. The only point when the matter seems to fully engage his interest is when he is speculating about the ideology that might help the West to emerge from its Age of Crisis. What the West needs to do, he says, is to hold fast to its special intellectual virtues, which he summarizes like this:


"The Outlook of the West is that broad middle way about which the fads and foibles of the West oscillate. It is what is implied by what the West says it believes, not at one moment but over the long succession of moments that form the history of the West. From that succession of moments it is clear that the West believes in diversity rather than in uniformity, in pluralism, rather than in monism, or dualism, in inclusion rather than exclusion, in liberty rather than in authority, in truth rather than in power, in conversion rather than in annihilation, in the individual rather than in the organization, in reconciliation rather than in triumph, in heterogeneity rather than in homogeneity, in relativisms rather than in absolutes, and in approximations rather than in final answers. The West believes that man and the universe are both complex and that the apparently discordant parts of each can be put into a reasonably workable arrangement with a little good will, patience, and experimentation. In man the West sees body, emotions, and reason as all equally real and necessary, and is prepared to entertain discussion about their relative interrelationships but is not prepared to listen for long to any intolerant insistence that any one of these has a final answer." [Page 1227]


At first glance, this might seem to be just another instance of the Kennedy Enlightenment assuming that its own parochial ideas are all the ideas there are. Certainly this laundry list looks more than a little like John Dewey's pragmatism. Pragmatism has its virtues, but is hardly the thread that runs through all Western history. However, that is not where the summary comes from. On close examination, Quigley's "Way of the West" has more content than is characteristic of pragmatism, which is a philosophy about procedure. What we have here, as Quigley tells us himself, is a take on the philosophy of Thomas Aquinas.


Aquinas has been credited and blamed for many things. In the 20th century, he was called "the father of science" and "the first Whig." There really are features of his ideas that are friendly to empirical science and to limited government with the consent of the governed. On the other hand, if you need a detailed account of the physiology of demons, he is your man. A "liberal" Thomas is not the only possible Thomas, but such an interpretation would have appealed to a Catholic scholar like Quigley in the immediate aftermath of the Second Vatican Council, where the ideas of John Cardinal Newman on the development of doctrine seemed to carry all before them.


There is an obvious pattern in Quigley's ideas about the future. Consider the specifics: the end of mass warfare and mass democracy, the disintegration of the nation state into both a universal polity and local patriotisms, and a global intellectual synthesis that is willing to entertain any idea that is not contrary to faith and morals. (Aquinas was rather more honest about that last part than was Quigley.) What we have here is a vision of the High Middle Ages with International Style architecture. This vision may or may not reflect the future, but it certainly has a long history. Let us let Oswald Spengler have the last word; I suspect this is where the citation-shy Quigley got the idea in the first place:


"But neither in the creations of this piety nor in the form of the Roman Imperium is there anything primary and spontaneous. Nothing is built up, no idea unfolds itself - it is only as if a mist cleared off the land and revealed the old forms, uncertainly at first, but presently with increasing distinctness. The material of the Second Religiousness is simply that of the first, genuine, young religiousness - only otherwise experienced and expressed. It starts with Rationalism's fading out in helplessness, then the forms of the Springtime become visible, and finally the whole world of the primitive religion, which had receded before the grand forms of the early faith, returns to the foreground, powerful in the guise of the popular syncretism that is to be found in every Culture at this phase."


The Decline of the West, Volume II, page 311



The Last Superstition Book Review

This book review was accidentally taken down in a site update. This is the most popular book review I have ever written, so it seems worthwhile to revisit. I've long been a fan of Ed Feser, and I recommend his work. The tone of The Last Superstition has been off-putting to some, but Feser knew what he was doing. If you think Feser is bad, you should read the things his critics have said about him. At least Feser feels the need to prove his assertions. If that isn't your cup of tea, he has written plenty of books with a more academic tone. Philosophy of Mind is well done. I haven't yet read Aquinas, but I managed to acquire two copies already.


The Last Superstition
by Edward Feser
ISBN 978-1587314520; $19.00

Edward Feser's The Last Superstition is a polemical work. However, this should not be surprising, for two reasons. First, what Feser is dealing with amounts to not mere nonsense, but nonsense on stilts. Second, Feser once wrote an essay entitled "Can Philosophy be Polemical?", pondering whether it is appropriate to engage in polemical debate over philosophical questions. In this book, Feser answers that question in the affirmative. He freely admits in the preface, "If this seems to be an angry book, that is because it is." (TLS, x) Feser regards the creed of the New Atheists as dangerous both personally and socially, and his response is écrasez l'infâme.

The Last Superstition is the book I had been wanting, not because it is a tract against the New Atheism, but because it summarizes the best arguments for an Aristotelian-Thomist metaphysics in the face of modern objections. This metaphysics is presented as it developed historically, beginning with the pre-Socratics, on through Plato and Aristotle, to its full flowering among the Scholastics. Feser covers change, actuality and potency, form and matter, the four causes, arguments for the existence of God, and the rational foundations of morality.

By succinctly providing this history, Feser does a service to all those who have forgotten, or never truly knew, the main features of an Aristotelian philosophy. For Feser's most damning criticism of Richard Dawkins et al. is that they have simply not bothered to do their homework. By not collecting the relevant data, they have sinned against the spirit of the science in whose name they crusade. To publish a scientific paper without any evidence would be scandalous, but that is precisely the charge Feser makes against them. None of the New Atheists demonstrates any familiarity with the actual arguments of historical theist philosophers, except for those of Rev. William Paley, who functions as a convenient whipping boy.

By way of example, Feser quotes the admission of the philosopher Antony Flew in 2004 that he had come to believe in the existence of God despite a lifetime of argument to the contrary. Flew admitted that he had never actually considered the Aristotelian arguments for the existence of God, and was forced to admit their cogency upon doing so. Those whom Feser targets in The Last Superstition have not yet bothered to consult the texts. Feser documents this amply through quotations from the New Atheists' works.

The weakest part of Feser's argument is in the section on natural law. The difficulty is not that the best contemporary formulation is not presented. The difficulty is that contemporary natural law arguments use human, homo sapiens, and person univocally. These are not just different things; they are different kinds of things. To use the Scholastic terminology, each belongs to a different genus. However, this failure leaves Feser's main argument untouched, because Aristotle and Aquinas were alike able to discern rational foundations for morality without the benefit of a modern doctrine of natural rights that makes use of equivocal terms.

Feser's references are very good, providing further information for the many points which could be elaborated upon. Covering as much ground as this book does would be impossible without treating a great many complicated and subtle topics briefly. However, this is not to say that Feser does not adequately address his topic. He makes short work of the New Atheists due to the poverty of their arguments, and then briefly presents arguments that modernity is more comprehensible if one considers modern problems in light of broadly Aristotelian philosophy. In particular, many of the perennial questions of modern philosophy, such as the mind-body problem or the validity of inductive reasoning, become explainable with Aristotle's more robust account of causation. Feser's task is made easier here by the latent Aristotelianism lurking in every corner of Western Civilization. We do not notice our debt to Aristotle for the same reason that fish do not feel wet.

Edward Feser's The Last Superstition is a worthy introduction to the realist philosophical tradition, and is enlivened by Feser's sharp wit. Good for anyone who would like to know more about Aristotelian philosophy.


Free Will and the Science of Human Nature

Via hbd* chick, I came across JayMan's article, No, You Don't Have Free Will, and This is Why. JayMan is responding to an article by Roy Baumeister, Do You Really Have Free Will?. Baumeister has been featured on this blog before.

JayMan lauds Baumeister for avoiding any supernatural arguments in his article, but he criticizes Baumeister for confusing free will with agency,

that is, the ability to make decisions, especially those that involve human-level “self-control” and response to socially constructed rules...

I wouldn't identify free will with agency, but this isn't that bad of an argument. I will make an argument for the existence of free will on the terms presented in JayMan's blog, using agency as a tool. But first, let's look at JayMan's argument. I think it is a good argument, one that should be considered in detail.

Baumeister said this, and I think it is pretty good:

There is no need to insist that free will is some kind of magical violation of causality. Free will is just another kind of cause.

Right, free will is a kind of cause. It is definitely not an uncaused cause, or some sort of causeless action. All acts have causes. Unless you are willing to consider the second of the Five Ways to prove that God exists, however, we are excluding supernatural explanations here.

JayMan next criticizes Baumeister for looking for free will in complexity. I also wouldn't go looking there. People forget that chaos theory and other such scientific results are completely deterministic. Being complicated or hard to predict is not the same thing as being free.

Baumeister next makes an Aristotelian argument: plants lack locomotion, whereas animals have it. Animals need the ability to decide where to go based on sensory input, and thus make decisions. All correct, and a very old argument. JayMan correctly notes that the ability to make a decision doesn't require it to be a free decision. The sensory inputs could completely determine the outcome. I doubt that Aquinas or Aristotle would have disagreed with that.

JayMan next points out that in aggregate, many human behaviors are predictable, and that we know that behavioral traits are somewhat heritable. This is the best part of the whole thing. I am absolutely fascinated by this, and I like learning about the ways in which our minds work. A lot of what we do is shaped by our personalities, our education, our upbringing, our past experiences, and even our genes.

Yet, for all that, I'm still going to argue that the premises don't entail the conclusion, and that free will exists. I actually have the easier part of the argument. In order for JayMan to prevail, no freedom whatsoever is permissible in decision making; I simply need to find a single counter-example. I'm happy to agree that much of human behavior is determined by material causes. The philosophical tradition of which I am part agrees that we are material beings, and subject to material causes.

I agree that the process by which we handle sensory input is determined by material causes. Complex ones, but material nonetheless. The trouble comes in the process of simulating courses of action. We know that hypotheses are underdetermined by data, no matter how much there is. It is not possible for a computation, or simulation, or whatever physical process is going on, to reach a determinate conclusion in all cases. Heck, I only need it to be true once for this argument to work. If your mind comes up with more than one possible course of action that is equally compelling [this is an assumption on my part, but I think a reasonable one], you need a way to choose between these courses. This is precisely what is meant by "free will": the ability to freely choose between limited goods. The case gets more compelling when you consider that we never want only one thing. Being finite beings, we can't have everything we want. You can satisfy this genetic preference, or that one, but not both. The mere inability to choose seems a possible option, but it is clear most people manage to negotiate this impasse.

It seems likely that part of what the mind does is estimate probabilities: if you choose X instead of Y, you are more likely to get Z. This neatly explains our propensity to do things predictably without requiring us to eliminate free will. All the great masters meant by "free will" was that the logic in our minds is not powerful enough to force us to choose one of the options our minds present to us, because this judgment is contingent, and therefore not capable of being determinative.

Even if we were to assume that the brain must always choose the highest-probability option [and I think "always" would be hard to prove, and contrary to experience], it is not clear that a unique highest-probability option must exist. To build a philosophically deterministic argument on a foundation of probability seems unwise. In order to prevail with a purely probabilistic materialist determinism, you need to smuggle in something more certain to clinch the argument, which is where I think Baumeister was trying to go with emergent properties.
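
A hedged toy sketch of the tie problem (the options and probabilities are invented for illustration): a rule of "always take the highest-probability option" returns a set, not a decision, whenever the estimates tie, and something else must break the tie.

```python
# Estimated probabilities of getting what you want from each course of action.
# With a tie at the top, maximizing probability picks out no unique act.
estimates = {"take the job": 0.62, "stay put": 0.62, "go back to school": 0.35}

best = max(estimates.values())
winners = [act for act, p in estimates.items() if p == best]
print(winners)   # ['take the job', 'stay put']: the maximizer has not decided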

I am perfectly happy to argue there are some determinative forces in nature that push us, and other things, in certain directions, but some of these things are immaterial, and I said I wasn't going to go there. I think pursuing this line of argumentation does not end well for the committed materialist.


Decision Fatigue

An Art of Manliness article on the power of morning and evening routines linked to an article that discusses an important concept: decision fatigue. Since I have argued that willpower is a finite resource, I am not surprised. The NY Times article cites the work of Roy F. Baumeister, but another classic is the Stanford marshmallow experiment.

The adult version of this test goes like this:

A nearby department store was holding a going-out-of-business sale, so researchers from the lab went off to fill their car trunks with simple products — not exactly wedding-quality gifts, but sufficiently appealing to interest college students. When they came to the lab, the students were told they would get to keep one item at the end of the experiment, but first they had to make a series of choices. Would they prefer a pen or a candle? A vanilla-scented candle or an almond-scented one? A candle or a T-shirt? A black T-shirt or a red T-shirt? A control group, meanwhile — let’s call them the nondeciders — spent an equally long period contemplating all these same products without having to make any choices. They were asked just to give their opinion of each product and report how often they had used such a product in the last six months.

Afterward, all the participants were given one of the classic tests of self-control: holding your hand in ice water for as long as you can. The impulse is to pull your hand out, so self-discipline is needed to keep the hand underwater. The deciders gave up much faster; they lasted 28 seconds, less than half the 67-second average of the nondeciders. Making all those choices had apparently sapped their willpower, and it wasn’t an isolated effect. It was confirmed in other experiments testing students after they went through exercises like choosing courses from the college catalog.

This is a rough and ready test of conscientiousness, but it is worth remembering that conscientiousness is a very big bucket. There are lots of sub-traits that fall in this category. Here is a list of sub-traits from one test:


  • Self-Efficacy
  • Orderliness
  • Dutifulness
  • Achievement-Striving
  • Self-Discipline
  • Cautiousness 

The sub-traits have formal similarities, but they can actually be completely uncorrelated with one another. My own sub-trait scores on C correlate at almost exactly zero.

For all that, the marshmallow test is known to predict future success in life. Outcomes are harder to predict from the individual sub-traits of C than from the overall bucket, but the whole mess of them is generally helpful in life.
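
To see how uncorrelated sub-traits can still add up to a predictive bucket, here is a minimal simulation in Python (the structure and numbers are my assumptions, not data from any psychometric test):

```python
# Six mutually independent sub-traits whose sum still tracks an outcome
# better than any single sub-trait does on its own.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
subtraits = rng.standard_normal((n, 6))            # independent, so pairwise correlation ~0
composite = subtraits.sum(axis=1)                  # the overall "C" bucket
outcome = composite + 3 * rng.standard_normal(n)   # noisy life outcome driven by the bucket

print(np.corrcoef(subtraits[:, 0], subtraits[:, 1])[0, 1])  # ~0.0: sub-traits uncorrelated
print(np.corrcoef(subtraits[:, 0], outcome)[0, 1])          # ~0.26: weak single-trait signal
print(np.corrcoef(composite, outcome)[0, 1])                # ~0.63: stronger composite signal
```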

An interesting result from the work of Baumeister: eating restores willpower. 

The researchers set out to test something called the Mardi Gras theory — the notion that you could build up willpower by first indulging yourself in pleasure, the way Mardi Gras feasters do just before the rigors of Lent. In place of a Fat Tuesday breakfast, the chefs in the lab at Florida State whipped up lusciously thick milkshakes for a group of subjects who were resting in between two laboratory tasks requiring willpower. Sure enough, the delicious shakes seemed to strengthen willpower by helping people perform better than expected on the next task. So far, so good. But the experiment also included a control group of people who were fed a tasteless concoction of low-fat dairy glop. It provided them with no pleasure, yet it produced similar improvements in self-control. The Mardi Gras theory looked wrong. Besides tragically removing an excuse for romping down the streets of New Orleans, the result was embarrassing for the researchers. Matthew Gailliot, the graduate student who ran the study, stood looking down at his shoes as he told Baumeister about the fiasco.

Baumeister tried to be optimistic. Maybe the study wasn’t a failure. Something had happened, after all. Even the tasteless glop had done the job, but how? If it wasn’t the pleasure, could it be the calories? At first the idea seemed a bit daft. For decades, psychologists had been studying performance on mental tasks without worrying much about the results being affected by dairy-product consumption. They liked to envision the human mind as a computer, focusing on the way it processed information. In their eagerness to chart the human equivalent of the computer’s chips and circuits, most psychologists neglected one mundane but essential part of the machine: the power supply. The brain, like the rest of the body, derived energy from glucose, the simple sugar manufactured from all kinds of foods. To establish cause and effect, researchers at Baumeister’s lab tried refueling the brain in a series of experiments involving lemonade mixed either with sugar or with a diet sweetener. The sugary lemonade provided a burst of glucose, the effects of which could be observed right away in the lab; the sugarless variety tasted quite similar without providing the same burst of glucose. Again and again, the sugar restored willpower, but the artificial sweetener had no effect. The glucose would at least mitigate the ego depletion and sometimes completely reverse it. The restored willpower improved people’s self-control as well as the quality of their decisions: they resisted irrational bias when making choices, and when asked to make financial decisions, they were more likely to choose the better long-term strategy instead of going for a quick payoff.

Again, not too surprising for me. I've known for a long time that I lose my temper when I get hungry. People who have more C can suffer fools gladly longer than I can when hungry. Since mental energy is material, this is to be expected. There is a fun Newtonian twist to this. One of Baumeister's students didn't believe that glucose could really affect willpower. He found that overall energy usage in the brain didn't really change, no matter how much willpower the subject had. What he didn't expect was an equal and opposite shift in which areas of the brain receive energy when willpower is depleted; the balance is restored by eating.

The results of the experiment were announced in January, during Heatherton’s speech accepting the leadership of the Society for Personality and Social Psychology, the world’s largest group of social psychologists. In his presidential address at the annual meeting in San Antonio, Heatherton reported that administering glucose completely reversed the brain changes wrought by depletion — a finding, he said, that thoroughly surprised him. Heatherton’s results did much more than provide additional confirmation that glucose is a vital part of willpower; they helped solve the puzzle over how glucose could work without global changes in the brain’s total energy use. Apparently ego depletion causes activity to rise in some parts of the brain and to decline in others. Your brain does not stop working when glucose is low. It stops doing some things and starts doing others. It responds more strongly to immediate rewards and pays less attention to long-term prospects.

If you wish, you can apply the standard evolutionary biology mental shortcut at this point.

One of the virtues of the Aristotelian account of the virtues is that as you become more experienced in living a virtuous life, good choices become habits that no longer require thought. This frees up your mental energy for bigger and better things.

Grit

John D. Cook linked to an article on "grit" by Venkatesh Rao. This article really got me thinking. Ever since I discovered the utility of psychometrics for personality, I have spent a great deal of time pondering the relationship between the gifts we are given and what we do for ourselves.

Venkat's primary point in his post is that our modern economy doesn't align well with the academic disciplines the elite are educated in. He says people call him a generalist because he has a PhD in Aerospace Engineering and ended up in marketing. However, from his perspective, there was a straight line between those two points. Thus his physics metaphor of external and internal coordinate systems.

The trouble is, we still tend to think in that external coordinate system, and may spend years trying to make that aerospace education turn into an aerospace job when our true skills and interests may lie elsewhere. Katz's now infamous article, Don't Become a Scientist, addressed precisely this mismatch between the disciplinary expectations produced in grad school, and the actual behavior of the job market.

Venkat then turns to what he calls grit, and I would call conscientiousness. He correctly notes this is probably the best predictor of success, over IQ, over family connections, over just about anything. People who bust ass almost always do well.

One point where I would disagree with Venkat is this:

Grit is the enduring intrinsic quality that, for a brief period in recent history, was coincident with the pattern of behavior known as progressive disciplinary specialization.

I don't think this should be in the past tense. Progressive disciplinary specialization is becoming more and more associated with C and less and less with g. What we may be getting is less and less value for our money and effort, because disciplinary specialization [in science at least] often means working under your 50- or 60-something PI in relative anonymity as cheap but skilled labor.

This is a really good working definition of conscientiousness:

Grit has external connotations of extreme toughness, a high apparent threshold for pain, and an ability to keep picking yourself up after getting knocked down. From the outside, grit looks like the bloody-minded exercise of extreme will power. It looks like a super-power.

Venkat goes on to discuss how what can look like brutal hard work can actually be easy, depending on your skills and interests. Quite true. I think the big takeaway here is that building on your strengths can be more effective than trying to remedy your weaknesses. This is a subject of intense personal interest to me, because once I discovered that I have low conscientiousness, many of my frustrations became comprehensible.

Conscientiousness is a finite resource. As a Thomist, I am not surprised. The part of our mind that touches infinity is the intellect, the rational, reasoning, undying part of us. The rest of us is mediated through a thoroughly material, fallible, limited body. Willpower, like strength, can be depleted because it is material.

Once I knew this, I could understand why my reach continually exceeded my grasp. I like Venkat's point about flow and the results that can come from just keeping doggedly at something. But for me, doggedly keeping at something is very, very difficult. I just don't have a lot of capacity [potentia] for self-discipline. The revelation for me was realizing this is a stable personality trait. There are things I can do, for sure, but it is a limitation I will probably struggle against for my entire life. Since my conscientiousness is so low, I actually do need to exert continual will just to keep showing up.

Engineers sometimes talk about "finding the cliff". This means looking for the failure point so you know where your assumptions are still valid, and where they are not. I found the cliff in my own conscientiousness in college. I was a junior in a physics program, and I knew that I had the mental horsepower to do as well as anyone in the program. I seriously expected to be at or near the top in all my courses. My assumption of mental horsepower is probably accurate. What I was missing was an accurate assessment of my capacity for hard work. This was the point in college where I had to stop goofing off and seriously apply myself if I wanted top honors. I tried to do that. I pushed myself beyond my limits. [who can't give 110%?]

The price I paid was becoming sicker than I have ever been in my life. It was years before I really recovered. I fear that I treated my friends poorly during this time; I'm surprised they still talk to me. I was miserable. The worst part of it all was that in order to save myself, I had to give up. I'm being hard on myself here: I did just fine in college, but I had to seriously adjust my expectations [the soft bigotry of low expectations] about what I was capable of. This ran against the grain of everything my education had instilled in me, so I thought I was a failure.

Thus it was an incredible relief when I discovered that I had indeed fought the good fight, and finished the race. First place just wasn't for me. I did well with what I had been given.

Thus, while I like the insight with which Venkat advises us to take the path of least resistance, I cannot take him literally: for me, the path of least resistance involves a couch, videogames, and that computer-guy shape. I have a family to provide for, so I have to keep grinding it out. There are some weaknesses that can simply be avoided, to use the mountain metaphor. These are relative weaknesses, what are called contraries. To be decisive is the opposite [contrary] of carefully considering the options; both are strengths in their place. Being too lazy to show up to work, however, is a privation of the good of being a hard worker. That simply needs to be resisted with the tools we have at our disposal.

Further reading:

http://www.ribbonfarm.com/2011/08/19/the-calculus-of-grit/

http://psycnet.apa.org/index.cfm?fa=buy.optionToBuy&id=1993-40718-001

http://www.psychologytoday.com/articles/200510/the-winning-edge

http://www.johndcook.com/blog/2011/08/29/gritty-coordinate-systems/

http://www.wired.com/wiredscience/2011/03/what-is-success-true-grit/

http://www.tempobook.com/2011/08/17/daemons-and-the-mindful-learning-curve/

Aquinas and Neuroscience

Non-linear Brain Dynamics and Intention According to Aquinas, by Walter J. Freeman.

Abstract
We humans and other animals continuously construct and maintain our grasp of the world by using astonishingly small snippets of sensory information. Recent studies in nonlinear brain dynamics have shown how this occurs: brains imagine possible futures and seek and use sensory stimulation to select among them as guides for chosen actions. On the one hand the scientific explanation of the dynamics is inaccessible to most of us. On the other hand the philosophical foundation from which the sciences grew is accessible through the work of one of its originators, Thomas Aquinas. The core concept of intention in Aquinas is the inviolable unity of mind, brain and body.

All that we know we have constructed within ourselves from the unintelligible fragments of energy impacting our senses as we move our bodies through the world. This process of intention is transitive in the outward thrust of the body in search of desired future states; it is intransitive in the dynamic construction of predictions of the states in the sensory cortices by which we recognize success or failure in achievement. The process is phenomenologically experienced in the action-perception cycle. Enactment is through the serial creation of neurodynamic activity patterns in brains, by which the self of mind-brain-body comes to know the world first by shaping the self to an approximation of the sought-for input, and then by assimilating those shapes into knowledge and meaning.

This conception of the self as closed, autonomous, and self-organizing, devised over 700 years ago and shelved by Descartes, Leibniz and Spinoza 300 years ago, is now re-emerging in philosophy and re-establishes the meaning of intention in its original sense. The core Aquinian concept of the unity of brain, body and soul/mind, which had been abandoned by mechanists and replaced by Brentano and Husserl using the duality inherent in representationalism, has been revived by Heidegger and Merleau-Ponty, but in phenomenological terms that are opaque to neuroscientists. In my experience there is no extant philosophical system than that of Aquinas that better fits with the new findings in nonlinear brain dynamics. Therefore, a detailed reading and transcription of basic terms is warranted, comparing in both directions the significance of key words across 700 years from medieval metaphysics to 21st century brain dynamics.

My gratitude to the Social Pathologist, who pointed me to this paper inadvertently. Aquinas is my homie too.

Possibility in Nature

James Chastek has a post, Two meanings of "chance", that is very brief but has revealing comments, including this one:

There is a very long history of denying the reality of chance, and there is some force to it- the sciences not infrequently find causes for things thought to be by chance, like the generation of gnats, or hand washing and patient health.

Here is the important distinction between luck and chance. It wasn’t important here, but it is important in itself. Luck (good and bad) results from some ignorance or deficiency in the power of the agent, chance in nature proceeds from some deficiency in interior causes, primarily from matter.

Aristotle called chance a real cause, but cause per accidens. It is not a proper cause, but it is that which is responsible for something, and so satisfies some notion of cause. This is a tricky question, one that De Koninck dedicated much of his career to.

The modern educated common wisdom basically accepts the billiard-ball model of physics, and tends to assume a mechanistic view of nature built upon this model. The trouble, however, is that material things are not really perfect enough to be deterministic. There is an order of being argument lurking in the background here: only a perfect cause could make something happen the same way every time, whereas every material thing is less than perfect, and therefore cannot be part of a strictly deterministic chain of causation. Material causation is, however, close enough to perfect to work most of the time.

I suspect that some of the resistance to this idea is that chance is a kind of negation, as Chastek says in an earlier post, "a supposedly pure world of complete unintelligibility". Chance or possibility seen as a lack of knowledge thus undermines our ability to know the world, because it would seem that, as pure unintelligibility, chance events would mean that anything could happen. Thus, I think there is a fear that denying the deterministic account means denying the power of modern science.

However, in an Aristotelian account, chance can serve a purpose in nature. It is not really that anything could happen, but rather that things tend toward certain ends, even if they don't quite make it all the time. Thus you end up with distributions of measurements when you study natural things in the modern mathematical way. Sometimes there is a "true" number towards which the natural thing is tending, and the tightness of the distribution is related to the power of the cause. The actual power of modern science is to be able to say that the probability the value lies between this number and that one is X%.
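
As a concrete illustration of that last sentence (the measured quantity and its scatter are invented for the example), here is a small Python sketch: simulated measurements scatter around a "true" value the process tends toward, and the interval statement at the end is the kind of claim modern science can actually make.

```python
# Measurements scattered around a "true" value, summarized as an interval
# statement rather than a deterministic prediction of any single measurement.
import random
import statistics

random.seed(1)
true_value = 9.81                                   # what the process tends toward
measurements = [random.gauss(true_value, 0.05) for _ in range(100)]

mean = statistics.mean(measurements)
sem = statistics.stdev(measurements) / len(measurements) ** 0.5
low, high = mean - 1.96 * sem, mean + 1.96 * sem    # roughly a 95% confidence interval
print(f"the value lies between {low:.3f} and {high:.3f} with ~95% confidence")
```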

I think chance could perhaps be posited as one of the things that allows sufficient uniformity in nature for us to be able to predict things for the most part. It seems that if everything were in fact completely deterministic, every event would be unique and, in a sense, unpredictable for us, because one of our primary ways of knowing is inductive, working from particular events to what happens for the most part. Thus a universe with some looseness in its physical causes is actually more intelligible to us, rather than less.

Act and Potency

Edward Feser posted a lengthy reflection on Act and Potency. Act and potency are critical concepts in Aristotelian/Thomist philosophy, and will repay the attention given them. Act and potency start within what Aristotle called physics, the study of changeable or mobile being. Aristotle's physics was continuously studied for almost 2000 years, first by the Greeks, then the Romans, then the Muslim Aristotelians, the Scholastics, and then by the budding natural scientists of the Renaissance and Early Modern eras.

Aristotle resolved a dilemma in early Greek thought, which was in itself quite an accomplishment. Parmenides had argued against the cosmology of the Ionian philosophers by saying that true change was impossible. His argument is a masterpiece of logic. Being is. Non-being is not. For something to change means that what is not becomes what is. However, from nothing comes nothing. Thus change is impossible.

Perhaps this seems trite. However, Parmenides created such a formidable argument that both Plato and Aristotle, who are considered the two most eminent philosophers who ever lived, devoted considerable time to answering him. Aristotle's physics is a response to Parmenides. Aristotle claimed that Parmenides was pretty much right, but he added a layer of distinctions that makes the matter more comprehensible. It is definitely true that being cannot come to be from non-being. Nonetheless, things do change. Thus there must be both something that comes to be, and a subject of the change that persists through the change.

In order for something to change, it must have potency. It must be able to be something else in some fashion. To change into something else is act, because then potency has ceased, and new being has replaced it. However, a new potency arises, because this new being can yet be something else. Act and potency always go together.

From this relatively simple foundation, Aristotle eventually ascends to the contemplation of being itself, the study of metaphysics. St. Thomas went even further, and used these distinctions in his Five Ways. However, as Feser notes, modern philosophy has rejected or forgotten much of this, so Aristotle and Thomas are now hard to understand because these distinctions have been collapsed.


Cross-posted at Dead Philosophers Society