Linkfest 2017-04-28

How Western Civilization could collapse

Cyclical models of history are always popular.

Eye Makeup Used To Protect Children Can Poison Them Instead

Getting rid of this should be on the short list of changes that would improve life for everyone.

The deep imagery of coal mining in the 1970s shows a lifestyle of peril and persistence

I love photo essays like this.

Hope for preemies as artificial womb helps tiny lambs grow

This kind of technology is always a double-edged sword; you can use it to save premature babies, or you could use it to completely separate sex from reproduction.

America's Next Great Metropolis is Taking Shape in Texas

An older article about the San Antonio-Austin corridor.

How enemies became friends in this unique lesson of Vietnam

Soldiers often find they have more in common with their enemies than with the civilians at home.

50 Years Ago, This Was a Wasteland. He Changed Everything

A beautiful example of wealth well spent.

How Online Shopping Makes Suckers of Us All

Armies of PhDs with algorithms are competing to fleece us of our money.

From Vilified to Vindicated: the Story of Jacques Cinq-Mars

Sometimes it takes a very long time to be vindicated.

What is adaptation by natural selection? Perspectives of an experimental microbiologist

Our understanding of evolution owes much to E. coli.


10 questions for Adam K. Webb

A fascinating 2006 interview by Razib Khan. I largely agree with Webb's positions, and I want to read his books.

The synthesis of ancient and modern physics and politics.

It has been a long time since I stopped by James Chastek's Just Thomism, but this is the kind of post that keeps me coming back. This is exactly how I feel about the philosophy of Aristotle and St. Thomas Aquinas.

The Long View: Island of the Day Before

The accurate measurement of longitude was a driving force of much of science during the Age of Exploration. A hell of a lot of good research came out of nations competing to do this faster and better, in search of filthy lucre. This is one of the foundational elements of my cocktail party theory of progress in science.

This book review from 1996 is the ultimate source of my characterization of hard science fiction [as opposed to space opera] as a way of introducing the reader to some useful concept in story form. In addition to the importance of measuring longitude, the baseless, self-referential system of symbols that dominated the thought of the late Renaissance and the seventeenth century is on display here.


The Island of the Day Before

 

by Umberto Eco
Harcourt Brace & Company, 1995
(trans. 1995 by William Weaver)
$25.00, 515 pp.
ISBN: 0-15-100151-0

The Memory-Scow of Fr. Wanderdrossel, S.J.

No matter how complicated a novel's plot or how subtle its message, all reviews of novels should start by telling you what the book is about. This novel is perfectly simple. Boy grows up during the Thirty Years' War. Boy goes on quest in order to find a way to determine longitude. Boy finds Jesuit. Boy goes mad and drowns. Everything else is a digression. Which is the problem.

If you believe you live in a world where getting there is not just half the fun but the only fun you are likely to have, novels should be written as a garland of digressions. Doubtless there has to be some unifying thread of plot to keep the whole thing together, but the treasure chest the characters have been seeking must always turn out to be empty. Of course, it would not do to have the characters ever quite realize just how much they are wasting their time. The author and the reader can know that the world, or at any rate the story, is meaningless; the characters' job is to try to find meaning and to fail in the attempt.

Umberto Eco, professor of semiotics at the University of Bologna, has written this kind of novel more than once. The trick is to use the book as a lecture room in which to instruct the reader in the milieu of some historical period or social setting, but without waxing tediously didactic. This, of course, is the method of good "hard" science fiction, which leaves the reader usefully instructed in certain principles of physics or biology after reading a story that otherwise closely resembles a Western. Eco does this very well. In "The Name of the Rose," we learned a great deal about late medieval ecclesiastical politics in the course of a story that did not pretend to be anything more than a merry parody of a Sherlock Holmes adventure. In "Foucault's Pendulum," we became much the wiser about the subsidy publishing business while following what I for one think was a slightly superior occult conspiracy. (Of course, Eco's occult conspiracy was not as good as the one in Theodore Roszak's underappreciated novel, "Flicker," but you can't have everything.)

"The Island of the Day Before" is even more ambitious, since we are treated to nothing less than a tour of the episteme of the 17th century. If you believe Foucault (the twentieth century deconstructionist, not a man after whom any type of pendulum is named), the eighteenth century was a time of logical, schematic knowledge. As exemplified by Linnaeus's system of biological classification, the Enlightenment mind was a-historical, given to discerning timeless formal patterns. The episteme of the nineteenth century, in contrast, was evolutionary in its view of both the physical world and of society. Together, these two epistemes constitute the mind of modernity. As a way, perhaps, to discerning the characteristics of the postmodern era, Eco tries to give us a sense of the European mind on the eve of modernity, before the epistemes of the modern era overwhelmed other ways of understanding the world.

People in that age did not expect the world, or their own lives, to make much sense as a linear narrative. It was not that they were suspicious of such narratives; as with Eco's plots, there was always one handy to tie things together. Rather, their first instinct was to look for subtle connections between particular and particular. Their politics and their science, and not least their prose, were complex, obscure, allusive. They did not try to understand the world by extrapolating from first principles; rather, they lived in a world of signs and symbols. There was no high road to understanding. An education meant going from book to book, ancient and modern, in order to understand obscure allusions made by others, and as preparation to make a few of your own.

Most symbols were obviously of human manufacture. It was a great age for emblems and crests and heraldic devices, from which a suitably informed person might be able to deduce a great deal about the user's history and philosophy. Since they also thought that the natural world worked in much the same way, natural knowledge was a catalogue of the hidden sympathies between metals and birds and plants and planets. Medicine was an understanding of how the humors and parts of the body fit into this dense web of sympathies and pointers. I have long suspected that the Hermetic tradition fascinated Yeats because it provided a language in which things and not just words could rhyme. In the period in question, all sophisticated people thought like that.

All of this sounds as text-driven as recent schools of literary criticism, and maybe in practice it was. However, the big difference between late premodernity and early postmodernity was that the former was not incredulous of the possibility of certainty, of reliable foundations for thought and belief. Europe in the first half of the 17th century was clearly in a transitional state. Western Christendom had broken up politically and confessionally into divisions that could not yet acknowledge each other's legitimacy. Traditional Ptolemaic cosmology was no longer acceptable, but no alternative was available that was consistent with contemporary physics. Europe had become aware of the size of the planet and how alien many of the societies on it were, but as yet had no idea how to fit this new information into received ideas about history and providence. Of course, in another few decades, all these questions would be answered in what seemed to be a perfectly satisfactory manner. The connecting theme of this book, however, is a search for certainty that failed. The search was cartographical, the search for a method to determine longitude.

An extended discussion of this most interesting problem in the history of applied science is perhaps out of place here; readers who want a full account are referred to Dava Sobel's excellent recent book, "Longitude." Basically, the chief problem faced by early oceanic navigators was that, while it is not hard to tell how far north or south of the equator you are (the measure of latitude), there is no comparably simple way to tell how far east or west you are (the measure of longitude). You can determine latitude, for instance, by measuring how far the sun rises above the horizon at noon. It is a natural quantity, produced by the fact the earth is a sphere that spins on its axis. Longitude, however, is a relative, artificial concept. You must pick an arbitrary line that runs north and south all around the earth, through both poles, and then try to figure how far east or west of this line you are. (Today, of course, this line, called the prime meridian, is the meridian of Greenwich in England.) If you know the difference between your local time and the time at the baseline, you can easily determine how far east or west of the line you are, since every hour's difference means 15 degrees difference in longitude. If you have an astronomical observatory and the leisure to make certain very fine astronomical measurements, such as the relative position of the moon to the fixed stars, you can determine what time it is at the prime meridian. However, such measurements are hard to do aboard ship. It took until John Harrison's invention in 1761 of a spring-regulated clock, suitable for use aboard ship, to finally solve the problem. Until then, the ambiguity of longitude created a doubt about one's position in the world that seemed almost ontological, or so Eco would have us believe.
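
For readers who want the arithmetic spelled out, here is a minimal sketch of the longitude-from-time calculation described above. The function name and the sample figures are mine, purely for illustration: the Earth turns 360 degrees in 24 hours, so each hour of difference between local time and prime-meridian time corresponds to 15 degrees.

```python
# A minimal sketch of the arithmetic above (illustrative only; the function
# name and sample numbers are my own, not from the book or the review).
def longitude_from_time(local_hours: float, prime_meridian_hours: float) -> float:
    """Degrees of longitude east (+) or west (-) of the prime meridian."""
    # The Earth rotates 360 degrees in 24 hours: 15 degrees per hour.
    return (local_hours - prime_meridian_hours) * 15.0

# Example: it is local noon aboard ship while a clock keeping prime-meridian
# time reads 3:00 in the afternoon; the ship is 45 degrees west of that line.
print(longitude_from_time(12.0, 15.0))  # -45.0
```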

Supposedly, we know about the story in this novel because the author acquired the papers of one Roberto della Griva. Born in 1614, he was a member of a minor noble family of northern Italy, self-described vassals of the marquis of Monferrato. This memoir-romance-love letter collection was written while the author was cast away on an abandoned ship, whose whole company save one had been eaten by cannibals. The ship was anchored off an island in the south Pacific, located on a meridian which Roberto believed to be the natural prime meridian. (For reasons which still make a fair amount of sense, many people of the time thought the prime meridian should run through the Canary Islands.) He thus believed that he was on the west side of what today we would call the international date line, with the island on the east side. When he looked at the island, he was therefore looking at the day before. As I said, the book is simple.

As a child, Roberto conceived the notion that he had a wicked brother, kept secret by the family, to whom Roberto ascribes all his own bad actions. Roberto believes, with varying degrees of seriousness, that he goes through life being punished for his brother's misdeeds. This imaginary brother, named Ferrante, serves not so much to relieve Roberto of moral responsibility as to explain Roberto's bad luck. If something bad happens to Roberto, it is Ferrante's fault, one way or another. Roberto finds the putative existence of Ferrante ever less comforting with the passage of time.

Roberto's experience of the homicidal meaninglessness of life begins at age 16 at the siege of Casale, whose fortress is key to the frontier between France and Italy. Eco explains with great lucidity the dispute which caused the French and their Italian allies, including Roberto and his father, to defend the city against the Spanish and the Holy Roman Empire. Even after the explanation, the siege still makes little sense. None of the participants was acting irrationally; logic simply worsened the tangle. The siege eventually degenerates into a truce whereby the Spanish occupy the town and the French the citadel. Finally the whole thing is settled by negotiation. Roberto's father is killed early on, to no particular purpose. Roberto returns to his ancestral land only long enough to arrange for an income for himself, and then travels to France.

The early 1640s find him in Paris, at the moment of the transition between the regime of Cardinal Richelieu and that of Cardinal Mazarin. Roberto does not really have a philosophical mind, but he is interested in scientific and metaphysical questions, so he frequents salons attended by astronomers and philosophers. We thus learn a great deal about what the early 17th century thought about the plurality of worlds and the possibility of a vacuum. The young Pascal puts in an appearance, and one character gets a letter from an officer serving in Holland who, we are not told, is named Descartes. Roberto sees a successful application of a substance called the "powder of sympathy." This is used to treat wounds, not by application to the wound itself, but by application to the weapon that caused it. He becomes something of an expert on sympathetic medicine, with grave consequences for his future. He also becomes infatuated with one of the great ladies of the salons. He believes, through a fanciful interpretation of the available information, that she is equally infatuated with him. He starts writing her self-revealing letters, a practice he continues even when there is no way to deliver them. This habit eventually produced Eco's holographic manuscript.

These pleasant years in Paris are ended when Cardinal Mazarin dragoons Roberto for a machination. (Roberto blames Ferrante for the misunderstanding that puts Roberto into the Cardinal's power.) Mazarin, like the leaders of other maritime states of his time, was much interested in the longitude question. He was particularly concerned that the English might find a solution before France did. Learning that the English were about to conduct experiments using the principle of the powder of sympathy to transmit the time to ships at sea, the Cardinal blackmails Roberto into taking passage on the Amaryllis, a Dutch vessel on which the experiments would be made. Since Roberto is being sent to act as Mazarin's spy, the Cardinal gives him a good measure of sound advice about human nature and the ways of the world, such as one might expect from a contemporary of Baltasar Gracian, author of "The Art of Worldly Wisdom." (Actually, Mazarin's instructions also sound like Elrond's farewell address to the Fellowship of the Ring in the "Lord of the Rings," or at least what Elrond would have sounded like, had he been a pompous ass.)

Both the Dutch and the English being too stupid not to take paying passengers on a secret mission, Roberto has no trouble booking passage on the Amaryllis and sailing to the south seas. He also has no trouble finding out what the English are up to. A dog had been wounded with a sword and brought on board, where an appalling English physician kept the wound from closing. The sword remained in London. At set times in the day, the sword was heated, which was supposed to make the dog howl and whimper. Noting when the dog exhibited acute distress, its tormentors on the Amaryllis believed that they could tell exactly what the time was in London. Happily, the ship sank in a storm. Roberto was the only survivor. He could not swim, but he had sense enough to grab a plank.

Roberto washes up, not on a deserted island, but on a deserted ship. This is the Daphne, another Dutch ship, also obviously on some kind of scientific expedition. There is a roomful of all manner of timepieces. There are a garden and an aviary. There is an unending succession of storerooms filled with remarkable stuff. Indeed, one of these cubbyholes turns out to contain Father Caspar Wanderdrossel of the Society of Jesus. It should be mentioned that the first third or so of the book consists of Roberto's recollections incited by one or another of the chambers of the Daphne. To me, at least, this procedure is reminiscent of "The Memory Palace of Matteo Ricci," Jonathan Spence's biography of the great missionary to China. The book was much concerned with Ricci's science of mnemonics, which works by creating associations between facts you want to remember and an imaginary structure you know well. I could be wrong. Roberto had also been exploring the Daphne's seemingly endless stores of aqua vitae, so it is hard to say. ("Aqua vitae" is Latin for the Irish "uisce beatha," which of course is whiskey. Did 17th century Dutch ships carry barrels of whiskey? Rum maybe? The question is irrelevant, but in keeping with the spirit of the book.)

Fr. Wanderdrossel was himself looking for a way to determine longitude (hence the roomful of clocks), but only to help prove a larger thesis about the origin of the waters of the Great Flood. The priest believed that the excess water arose physically from submarine fissures in the antipodes, and then was magnified temporally by being passed from one day to another across the international date line. At least, this is what I think he said. Fr. Wanderdrossel had spent many years in Rome, but unfortunately he spent most of his time there speaking Latin to other Jesuits. His talk is therefore a jargon of Latin and German and such English (doubtless meant to represent the Italian of the original) as used to be found in the comic strip, the "Katzenjammer Kids." The result is unlovely, yet his dialogues with Roberto go on for pages and pages. However, the content of their discussions, which dealt in large part with the structure of the solar system, is very interesting. Roberto defended the Copernican system, whereas Fr. Wanderdrossel endorsed the more moderate hypothesis of Tycho Brahe, which had the sun and moon orbiting the Earth and the planets orbiting the sun. The remarkable thing is that, absent Newton's laws of motion, Tycho Brahe has the better of the argument.

For reasons that seemed sufficient at the time, the crew had abandoned Fr. Wanderdrossel and taken the Daphne's only boat to the island, where the cannibals got them. Despite the cannibals, Roberto and the priest look for a way to reach the island. The priest cannot swim either, so he tries to teach Roberto how to swim. Despairing of the young man's progress, he brings a remarkable machine out of storage, a sort of diving bell with an open bottom that would let the occupant walk to shore over the sea floor. Roberto lowers him over the side, and he is never seen again. Roberto considers the possibility that the floors of all the seas are covered with hidden Jesuits.

Roberto thereafter divides his time among swimming practice, the aqua vitae, and his increasingly fanciful writings. The latter come to deal almost exclusively with the wicked deeds of his brother, Ferrante. By and by, Roberto describes how Ferrante himself sets out in search of the prime meridian. When Ferrante reaches it, he uses its time-travelling capacity to sail back to the time of Christ, whom he kidnaps from the Garden of Gethsemane and imprisons on the Island of the Day Before. The Redemption never having taken place, the whole human race is damned. Roberto does devise a fitting end for Ferrante, however. Not long after, Roberto drowns, leaving his papers to astonish a later world.

There is nothing wrong in principle with a story that has no particular point, or whose point is that there is no point. Unfortunately, none of this was really as much fun as it should have been. A pessimist, or a professor of semiotics, might say the same about life. However, books are supposed to be better than life.

Copyright © 1996 by John J. Reilly

Why post old articles?

Who was John J. Reilly?

All of John's posts here

An archive of John's site

The Long View 2004-11-18: You Can't Make This Stuff Up, But People Do Anyway

Unfortunately, Bush Derangement Syndrome has proven to be a permanent feature of American politics since 2004, with the object changing every so often.


You Can't Make This Stuff Up, But People Do Anyway

 

The expression "Bush Derangement Syndrome" seems to have been coined by Charles Krauthammer last year. Conceived as a witticism, it referred to the fury that afflicts some critics of the Bush Administration, a fury with the peculiar property that those who have Bush Derangement Syndrome don't recognize it as anger. The term has become widespread: David Kaspar uses it to describe the German media's reaction to Bush's reelection.

But if Bush Derangement Syndrome was supposed to be a joke, what are we to make of Post Election Stress Trauma [PEST]?

Boca Raton News: Mental health officials in South Florida blasted Rush Limbaugh on Monday, saying the conservative talk show host’s offer of "free therapy" for traumatized John Kerry voters has made a mockery of a valid psychological problem..."Rush Limbaugh has a way of back-handedly slamming people," said Sheila Cooperman, a licensed clinician with the American Health Association (AHA) who listened Friday as Limbaugh offered to personally treat her patients....Cooperman, whose professional practice is based in Delray Beach, said the election-related symptoms she sees in the Kerry supporters more than [qualify] PEST as "a legitimate syndrome or disorder within the trauma spectrum," according to the American Psychiatric Association’s Diagnostic Statistical Manual of Mental Disorders.

I suspect PEST is a joke, too. I am afraid to check. However, it is true that some people really do view the recent election as a medical trauma for which they require treatment.

* * *

Whenever discussion of a public issue begins to use therapeutic terms, you normally find that someone is trying to get back on-message because pesky facts are interfering with their opinions. We saw quite a bit of that during the Nuclear Freeze Movement, when psychologists began to diagnose support for the Reagan Administration's Soviet policy as "psychic numbing." This had the convenient effect of relieving the opponents of that policy of the need to discuss the nature of the Soviet Union, much less of the need to actually know something about arms control. Psychology is not the only way to keep on-message, however. Translating an issue into Marxist terms used to be a sure-fire way to obviate the need to mention unpalatable realities. More recently, feminism has served the same function. I think that we are seeing an example of this in Theodore Dalrymple's piece in City Journal, "Why Theo Van Gogh Was Murdered":

 

But why kill Theo Van Gogh, of all the people who have expressed hostility to radical Islam? Perhaps it was mere chance, but more likely it resulted from his work’s exposure of a very raw nerve of Muslim identity in Western Europe: the abuse of women...Were it not for the abuse of women, Islam would go the way of the Church of England. ...Religious sanction for the oppression of women (whether theologically justified or not) is hence the main attraction of Islam to young men in an increasingly secular world.

So now we know. Islam does not make converts in every Western country because Westerners seek to ground a sober way of life in the transcendent, a desire that secular modernity cannot satisfy. They do it to abuse women. (Those converts who are women, presumably, do it to be abused.) In fact, one might surmise from Dalrymple's argument that the oppression of women is the only real attraction that any religion has.

There are no mitigating circumstances in the slaughter of Theo Van Gogh. There is an explanation, though, which is that he was Michael Moore without the tact (or the bodyguards). From what I can tell, Van Gogh does seem to have shared something of Dalrymple's contempt for the religious roots of human life. In Van Gogh's case, I suspect, willful ignorance of the dangers he faced made him vulnerable. Secularists who adopt Dalrymple's analysis will similarly be blinded to the nature and the enormity of the threat.

 

* * *

Meanwhile, not only are physicians taking money to cure parodic diseases, but parody publishing concepts are appearing in the light of day. Consider, for instance, this passage from an imaginary business-philosophy book that Walter Kirn described in his novel, Up in the Air:

 

In The Garage, I propose a bold new formula to replace the lurching pursuit of profit: "Sufficient Plenitude." Enough really can be enough, that is. Heresy? Not to students of the human body, who know that optimum health is not achieved by ever greater consumption, but by functioning within certain dynamic parameters of diet and exercise, work and leisure.

Very funny, but then what are we to make of this new publication?

 

Plenty hits the newsstands today and is scheduled to be published six times in 2005. It is aimed, the creators say, with no apparent comic intent, at the "environmental consumer" and promises "smart living for a complex world." The idea is that you don't have to be stodgy and self-flagellating to be green.

I am almost certain that Plenty is for real. Again, I am afraid to check.

 

* * *

On the subject of timidity, those of us who are too timid to simply confront the future have long been comforted by the opportunity to read about it ahead of time in the books of Strauss & Howe, with their beguiling cyclical generations model of American history. The problem they have faced since 9/11 is whether that event began the long-predicted generation of Crisis. They recently addressed the matter again:

 

As we wrote at the time, and as many readers have remarked, 9/11 came a bit early in the cycle--before Silent influence weakened sufficiently, before Boomers began entering old age with generational imperatives, before Gen Xers began entering midlife as societal anchors, before Millennials began coming of age and asserting themselves politically. In The Fourth Turning, we set 2005 as the time when that generational constellation would make a shift from the third to the fourth turning more likely...On domestic as well as foreign issues, America is now primed for a spark to catalyze the new mood far more fundamentally than 9/11 ever did outside the two attacked cities.

The difficulty is that, if 9/11 was the beginning of the Crisis, like the financial collapse of 1929, or like the Dred Scott decision of 1857, then the corresponding "regeneracy" event is just about due, like the beginning of the New Deal in 1933, or the inauguration of Abraham Lincoln in 1861. S&H cast doubt on whether the election of 2004 was a sufficient template for a new departure of that magnitude, so they wonder whether some incident still to come might serve to begin the Crisis.

In my opinion, the crisis that began on 9/11 was quite critical enough. As for the regeneracy, who knows what 2005 will bring?

 

* * *

Returning to merely historical history, John J. DiIulio Jr. argues in The Weekly Standard ("Wooing Purple America," November 17, 2004) that the Democratic Party has a dim future, unless it breaks its ties to the Cultural Left:

 

An old Philadelphia Democratic committeeman once put it this way: "I don't like [Moral Majority fundamentalist preacher] Jerry Falwell or [Grateful Dead drug-culture rocker] Jerry Garcia, but if I had to pick one Jerry to watch my grandkids, I'd sure pick Falwell."

Once again we see the old principle: give the people a choice between Us and Them, and the people will inevitably choose Them.

 

* * *

Finally, I have often complained in this space that the 21st century does not have all the technologies I had looked forward to, and it has other technologies that don't interest me, or are otherwise unsatisfactory. It was with some relief, therefore, that I saw this report about a good old-fashioned 21st-century system about to go into operation:

 

CHICAGO (CBS 2) Mayor Daley officially opened a new city operations center Tuesday that will include a dramatic increase in camera surveillance on Chicago’s streets...real time video and audio information from 2,000 cameras and microphones stationed around the city...The operations center will respond to anything from terrorist attacks to gas leaks...The new system also has the ability to instantly report the sound of gun shots within hearing distance of the microphones planned around the city.

The only problem is that the system is obsolete. Why doesn't the city just set up webcams that everyone can use?

Copyright © 2004 by John J. Reilly

Why post old articles?

Who was John J. Reilly?

All of John's posts here

An archive of John's site

The Long View: Millennium: A History of the Last Thousand Years

The consequences of population expansion due to the Industrial Revolution

Prior to 1500, the idea that Northwestern Europe and its diaspora would come to dominate the world would have seemed pretty strange. One can make a case that this is a temporary state of affairs. All of the ancient seats of civilization were in other places, and had been for a very long time.

I appreciate this book because it takes a genuinely worldwide look at history for the last thousand years, without attempting to boost the accomplishments of the author's favorite peoples, or minimize those of his hereditary enemies.

As an aside, I note that underdevelopment economics regarding Indian textiles doesn't seem to have changed much in the last twenty years.


Millennium:
A History of the Last Thousand Years
by Felipe Fernandez-Armesto
Charles Scribner's Sons, 1995
$35.00, 816 pp.
ISBN: 0-684-80361-5

This book is what multiculturalism would be like if multiculturalism were not a fraud. The author is an Oxford don and, apparently, a serious Catholic who undertook the appalling labor of trying to discern the large trends in the history of the world over the last thousand years. The authors of universal histories usually conclude them with some speculation about the future, so Fernandez-Armesto does too, glumly anticipating that reviewers will use up more space critiquing the final few pages of prediction than in assessing the hundreds of pages of history. He was certainly right, at least as far as this reviewer is concerned. However, even where one might think his analysis is flawed, still the whole of the book is a rare, non-tendentious attempt to make sense of it all.

According to the author, a world history only really became possible in the past thousand years, because the second Christian millennium was the time when the major regional civilizations of the world achieved substantial and continuous contact. Though the author gives considerable attention to events in Africa and the Americas, he is, for good reason, something of a Eurasian chauvinist. For most practical purposes, the history of the world since the year 1000 A.D. can be understood as the joint and several life stories of just four great cultures: China, Christendom, India and Islam. Fernandez-Armesto seems never to have heard of the solemn debate that took place forty years ago about whether the United States constituted a civilization distinct from Europe. In his scheme of things, the Americas and Western Europe are obviously part of something he calls "Atlantic Civilization." One of the recurrent themes of the book is that the era of the hegemony of this civilization does not go back as far as you might think, and that already it is probably about to be supplanted by a civilization of the Pacific, anchored on the East Asian shore.

As recently as the fifteenth century, all of the major civilizations of Eurasia were expanding, with the exception of India. In the early 1400s, the newly-established Ming dynasty mounted a series of naval expeditions into the Indian Ocean as far as the coast of East Africa, involving ships far larger than anything to be found in Europe and forces of nearly 30,000 men. Russia was at the beginning of the expansion that would win it the most durable of all European empires. (It still possesses Siberia.) Christendom East and West, however, was losing territory to the Ottoman Empire. The adventures of a few Western explorers on the coasts of Africa and, later, in the Americas, did not seem to matter much in the grand scheme of things at the time. Fernandez-Armesto suggests that the contemporary perception may have been correct. Although Western expansion in the Age of Exploration certainly laid the foundations for the temporary ascendancy over the rest of the world that occurred in the nineteenth century and the first half of the twentieth, the fact is that for most of this millennium the preponderant civilization in the world has been China. China had the "initiative" before, say, 1750, and is likely to regain it in the twenty-first century.

The author has a point. Before the mid-eighteenth century, for instance, most of the world's printed books were in Chinese, and that country's iron industry was by far the world's largest until well into England's Industrial Revolution. While the era of scientifically informed technological progress begun in the West has done an amazing job of standardizing ordinary human life all over the planet, we should remember that the most dramatic effects have only been felt for about two long lifetimes. For centuries before that, Chinese inventions such as gunpowder, paper money, magnetic compasses, civil service tests, printing and the abacus had been making more or less of an impact throughout Eurasia and Africa (with which China once had quite a lot of indirect trade). The point is not that the Western inventions were less important than the Chinese ones, but that China has demonstrated a greater ability to affect the rest of the world over long periods of time.

The great Western empires were, of course, the largest political units that have ever existed, but they did not last long and they impinged on other Eurasian civilizations only fairly recently. European nations began to dominate other societies of roughly their own size and complexity only in the mid-eighteenth century, when the British East India Company took possession of the already-disintegrating Mughal Empire in India. (As was the case with the Spanish in Mexico, the conquest was really a revolt of native powers led by small European armies, who did not do most of the fighting.) Even at that time, China was the largest it has ever been and was perfectly successful in its own neo-Confucian mold. It became vulnerable to Western pressure only when the Opium Wars began in the 1840s, when the Qing dynasty was in a state of decay. The Ottoman Empire was in slow retreat, but quite capable of defeating Western and Russian armies until late in the nineteenth century. In Africa, the great European empires began to be built only around 1870 and were almost all gone by about 1960. With a little luck and ingenuity, an African born in the last third of the nineteenth century could have lived through the whole thing. This flash of expansion and retreat looks to the author less like the inevitable "Rise of the West" (the title of William McNeill's 1963 universal history) than the sort of fleeting hegemony established by the Mongols in the thirteenth century. In two hundred years, that was pretty much all gone, too.

"Millennium" is not a multicultural diatribe against the West. The author early on in the book declares his support for the traditional liberal arts curriculum. More important, he makes some effort to try to understand what other cultures actually think, rather than to use them as convenient screens for the projection of progressive opinion. Still, in his effort to view the world from other than a Western perspective, he does tend to wash out the real differences that exist between civilizations at any given point in time. The West has always had certain peculiarities which really do go far go explain its great relative successes of the past two or three or five hundred years.

In an aside perhaps intended to emphasize the equivalence of civilized societies before the 18th century, Fernandez-Armesto notes that both Japan and Spain considered invading China in the sixteenth century. The Japanese actually made a start on the project with an unsuccessful attack on Korea, while the Spanish wisely abandoned the idea for logistical reasons. While this coincidence does illustrate that blue-water navies and expansionist intentions could be found on both sides of Eurasia at the same time, it seems to me at least that there is a certain asymmetry here. Spain was an underpopulated, intrinsically poor country that yet could seriously contemplate creating an empire on the other side of the planet (as indeed it did at roughly this time, in the Philippines). The Japanese, on the other hand, never to my knowledge gave any thought to making raids against the countries of Atlantic Europe, despite the fact the Japanese had been in vigorous contact with them for some time and had a predilection in that era for piratical adventure.

The West, meaning the civilization that arose in Atlantic and central Europe after the Dark Ages, really does seem to have a peculiar case of "applied curiosity." This goes beyond mere scientific knowledge. Many societies have known or guessed that the world is a sphere, for instance. The idea of sailing around it, however, does not seem to be natural, in the sense that it does not occur to intelligent people in every culture. Even Saint Augustine, at the end of Classical civilization, is on record as recoiling at the thought that there might be people in the antipodes, with their feet pointing towards his through the center of the sphere. Yet Fernandez-Armesto remarks on a Genoese sailing fleet that, as early as the thirteenth century, tried to do what Columbus tried to do two centuries later. The fleet was never heard of again, for the good reason that the enterprise was suicidal with existing technology, but it was a peculiarly Western thing to do. Even the great Indian Ocean expeditions led by the Ming admiral Cheng Ho in the fifteenth century were not missions of exploration. Chinese oceanic trade had long been familiar to the regions he visited. The expeditions came to known locations to establish tribute relations, something the Ming dynasty was also attempting to do at the same time across central and southern Asia as part of its program to establish its position in the world.

"Millennium" is blessedly free of models of history, whether Marxist, Spenglerian or Darwinist, yet the price the author pays for freedom from theory is a blindness to the long phases that civilizations do undoubtedly go through. Not everything is possible to every culture at every phase of its life, even if the people have the knowledge, the resources, and the incentive to do it. For instance, Fernandez-Armesto echoes the "underdevelopment" school of economic history when he notes, accurately, that the English dismantled the Indian textile industry when they took over the country, despite the fact it was at least as efficient as the early mechanical textile industry to be found in England at the same time. This act of imperial preference, the underdevelopment economists say, is why the Industrial Revolution did not occur in India and why India is still so poor. This conclusion (which the author does not push), is obvious nonsense. Why the industrial and scientific revolutions occurred when and where they did is likely to remain something of a mystery, but it is at least clear that Mughal India did not have the physics, or the system of commercial law, or the practical engineering to do what Georgian England was doing. India was simply not about to embark on an Industrial Revolution in the eighteenth century, either with or without the interference of the East India Company.

This same point can be made about China, where it relates directly to Fernandez-Armesto's anticipations of a coming civilization centered around the "world ocean" of the Pacific. As John King Fairbank notes in his magisterial study, "China: A New History," Chinese civilization in the eighteenth century was ending an adventure that had begun 700 years before, in the Sung Dynasty. The first two or three centuries of the second millennium, including the Mongol interlude, were the time when China made most of the technological and commercial advances that so affected the rest of the civilized world. (Fernandez-Armesto perceptively points out that those centuries were also a time when Chinese civilization encompassed several competitive states.) More than one historian has noted the similarity of this era to Western modernity itself, both in its creativity and its character as a long civilizational "civil war." However, this Chinese modernity ended when the Ming reunited China in the fourteenth century. That dynasty and the Manchu (or Qing) dynasty that followed were an era of cultural consolidation, measured territorial expansion, and a gradual loss of imagination.

The China that Lord Macartney met in 1793 on his unsuccessful mission to open diplomatic relations between China and Great Britain was incomparably mighty and wealthy by any standard, including its own history. However, it was a civilization that had exhausted its own cultural potential. Its decline in the next century was at first slow, and then catastrophic: by far the bloodiest war of the nineteenth century was the T'ai P'ing millenarian revolt of the 1850s and 60s. Since then, the country has seen many ups and downs. The empire ended in 1911 and was replaced by a series of regimes, each less satisfactory than the one before. Today, of course, the world economic system is transfixed by China's economic potential. The problem for China is that economic potential is not always the kind of potential that counts.

The author's remarks about the future of the West are perhaps more interesting than his ideas about the future of the East. While not himself a multiculturalist or deconstructionist, he seems to have absorbed a lot of the commonplaces that infest the academy these days. He believes, wholly inaccurately, that twentieth century science has been forced to abandon the search for objective truth. On the other hand, he does rightly note that the belief that objective certainty is no longer possible, either in science or in morality, has an enervating effect on the effectiveness of Western governments. While hardly a British chauvinist, he argues persuasively that the British Empire was done in by a loss of nerve at home, one that had less to do with Britain's relative decline in the world in the first half of the twentieth century than with a kind of auto-hypnosis. It took a fairish amount of moral self-confidence to bluff the Indians into submission, he suggests. When the English could no longer be certain of the inherent superiority of their civilization, they were no longer willing to maintain the bluff, and the empire dissolved.

Just as the nineteenth century empires evaporated in the harsh light of mid-twentieth century skepticism, so did the personal lives of many of the people who live in the former metropolitan countries. To some extent, this process was expedited by the social welfare policies of the Western states themselves. However, the author expects a revival of traditional family life in the future, for the simple reason that the state will be unwilling or unable to continue to provide the kind of social services that had been intended to replace family structures. Unfortunately, since the academy will be unable to articulate any common moral vision for western societies, public morality will become more and more incoherent. The author praises the Roman Catholic Church for providing the last system of moral certitude in the world (he is against abortion and capital punishment), but expresses doubt that the Church will be able to exercise this function for long. While allowing that John Paul II has so constituted the College of Cardinals that one may expect his successor to be orthodox, the author expects the Church to relax many of its positions in papacies to come.

Since skepticism and doubt cannot be maintained indefinitely, Fernandez-Armesto believes that we are living in a period of transition to new certainties. He rather expects that these will be terrible certainties, characterized by the "passionate conviction" that Yeats in "The Second Coming" ascribes to the worst kind of people. He predicts a bright future for fascism (he was raised in Franco's Spain, apparently), and like many people he believes that the death of communism has been exaggerated. For myself, it seems to me that he errs in looking forward to a sharp distinction between today's era of "liberal irony" and a totalitarian future. Radical ecology and the new forms of racism simply illustrate that when people are denied the possibility of certainty on the level of the spirit, they will seek it in the body, in biological and social history. Without a transcendent to appeal to, there is no answer to Heidegger.

On the whole, though, the future he anticipates is not a catastrophic one. The population crisis, for instance, he believes to be something of a chimera, as is the increasing hysteria in developed countries about their new demographic disadvantage with respect to the south. Experience shows that population growth rates tend to level off after a while, quite without government interference. The author anticipates, indeed seems to hope, that the Pacific will play a role in the history of the third millennium like that played by the Mediterranean in Western antiquity. Again, this does not seem to me to be altogether plausible, but if the new Mediterranean turns out to be a theater for a history as rich as that enacted in the old one, then maybe we will not have much to complain of.

The author does not look forward to universal peace, or a universal state, or to the collapse of civilization as we know it. In other words, the future is likely to turn out to be a little like life. One may hope that freedom will persist in such a world, without necessarily expecting the next millennium to be THE millennium. As the author puts it: "One of the drawbacks of freedom is that free choices are regularly made for the worst, ever since the setting of an unfortunate precedent in Eden. To expect people to improve under its influence is to demand unrealistic standards from freedom, and, ultimately, to undermine its appeal."

This article originally appeared in the December 1995 issue of Culture Wars magazine.

Copyright © 1996 by John J. Reilly

Why post old articles?

Who was John J. Reilly?

All of John's posts here

An archive of John's site

The Long View 2004-11-04: Anti-Tet

Indian Mutiny

The comparison John Reilly makes here [via Robert Kaplan] between Iraq and India is a particularly interesting one. You do not usually hear of the two countries compared in any meaningful way, but the distinction here is the differing responses of each people to extended foreign occupation. Violent uprisings were seen in both places. Eventually, the British left India, but they stayed far longer, and left a deeper impression on the people of India.


Anti-Tet

 

Sometimes, the histories that did not happen seem to burn through the text of the newspapers. Consider, for instance, this headline from today's New York Times, Rebels Routed in Falluja; Fighting Spreads Elsewhere in Iraq. The piece sums up the Battle of Falluja like this:

American commanders said 38 American servicemembers had been killed and 275 wounded in the Falluja assault, and the commanders estimated that 1,200 to 1,600 insurgents - about half the number thought to have been entrenched in Falluja - had been killed. But there was little evidence of dead insurgents in the streets and warrens where some of the most intense combat took place.

It's straight war reporting. The journalists report from a US perspective, but they do not simply rewrite military news bulletins. The gist of the report is that Falluja was a difficult operation, tactically successful, and part of a coherent strategy that can succeed.

Imagine now that John Kerry had won the recent election. It is unlikely that the headline would have mentioned "routing the enemy." Instead, the piece would have focused on the open-ended nature of the mop-up operation, and on the outbreaks of violence in other Iraqi cities. Indeed, there would probably have been more of the latter to report. Jihadis watch international television, too. They would have assessed the outcome of the election, not unreasonably, as a repudiation of the whole of George Bush's policies. Just as important, the officials of the nascent Iraqi government would have reassessed their own situation. Instead of desertions among the new Iraqi army and security units, we might have seen the legitimate administrations of cities and provinces going over to the insurgents.

There have been some comparisons of the Falluja campaign to the Tet Offensive during the Vietnam War, but it is the differences that are important. Starting on January 31 of 1968, the North Vietnamese and Vietcong launched attacks intended to spark a general uprising in South Vietnam. The campaign failed completely. The Vietcong never recovered. Nonetheless, the offensive visibly persuaded the US media that the war was unwinnable, and that became the subtext of all subsequent reporting of the war. Actually, the change of heart in the media might not have had all that much effect on the North Vietnamese: they simply did not follow the US media that closely. However, they could scarcely have missed the implication of President Lyndon Johnson's announcement on March 31 that he would not seek a second term.

Negotiations started soon thereafter. Negotiations, usually, are a good idea. Starting to talk then, however, demoralized the US military and demoralized the South Vietnamese. The talks did, however, cause the enemy to surmise that, whatever they were doing, they should keep it up. So they did. This was, pretty much, the pattern that John Kerry promised to repeat if he were elected.

Falluja was the anti-Tet. If 3.5 million Americans had voted otherwise, it would have been Tet II.

* * *

None of this is to say that everything in Iraq, or the Middle East, is going unambiguously well. Robert Kaplan supported the Iraq War, and still does, after a fashion. However, he now thinks that the time has come for the dreamy Wilsonians in the Bush Administration to reassess their goals in Iraq. He gave us this medley of historical allusions in yesterday's New York Times:

But rather than a replay of the Balkans in 1995 and 1999, Iraq has turned out like the Indian mutiny against the British in 1857 and 1858, when the attempts of Evangelical and Utilitarian reformers in London to modernize and Christianize India - to make it more like England - were met with a violent revolt against imperial rule. Delhi, Lucknow and other cities were besieged and captured, before being retaken by colonial forces.

The bloody debacle did not signal the end of the British Empire, which expanded for another century. But it did signal a transition: away from an ad hoc imperium fired by an intemperate lust to impose domestic values abroad, and toward a calmer, more pragmatic empire built on international trade and technology.

May I point out that this account has the history of the British Raj upside-down and backwards? When the East India Company had direct responsibility for the subcontinent, it did tolerate missionaries, but then it also tolerated most things. That was part of the problem. The British Empire in India only hit its stride after the Mutiny, when London assumed direct control, and attempted systematically to foster good government and a liberal civil society.

* * *

Speaking of Alternative History, readers may wonder why I have failed to do a review of Philip Roth's new novel, The Plot Against America. The fact is that I ordered the book through Amazon, but not from Amazon. Rather, overcome by parsimony, I ordered it from a third-party seller, for less than half list-price. The seller took the money, but if he ever had a copy of the book, he did not send it to me. Amazon, which collects the payments for these sellers, was nice enough about it. They promise me a refund, in the fullness of time. Nonetheless, hereafter I will regard all third-party booksellers with due caution.

* * *

I have actually been sick for the past few days (kidney stones: ouch!), and in my delirium, I viewed a VHS copy of Monster Island. This is an MTV Original Movie, though I think it might easily have been produced by the Disney people. The humor in the film is not always unintentional. However, I was chiefly struck by the film's geographical transpositions. A tropical island that had been a nuclear test site is located in the middle of the Atlantic (or possibly the Bermuda Triangle is supposed to be in the South Pacific, which would be awkward, considering the location of Bermuda). The wonderful thing was that this tropical island was covered with maples and beeches and freshwater lakes, such as one sees in Canada, where filming on location is cheap.

* * *

Meanwhile, those readers who really need to be frightened about the alleged ascendancy of the religious right in America can satisfy all their theophobic needs with this story:

ROCHESTER, NH -- Lawyers for a Farmington woman and her boyfriend who are charged with threatening to kill her three children in a church say their clients meant the children no harm. Nicole Mancini, 29, and John Thurber, 35, of Rochester, were arrested at St. Mary's Church on Wednesday after workers said they heard the woman say she wanted to sacrifice her sons on the altar. Each was arraigned on three misdemeanor counts of child endangerment on Friday.

"They were never tied to the altar, there was no blood, there were no constraints for sacrificial use," said Kimberly Shoen, Mancini's attorney. According to Linda Slamon, Thurber's attorney, Thurber said Mancini had been acting "irrationally" recently, and Thurber accompanied her to church to get her help.

One suspects the authorities overreacted.

Copyright © 2004 by John J. Reilly

Why post old articles?

Who was John J. Reilly?

All of John's posts here

An archive of John's site

The Long View: Beyond the End Times: The Rest of the Greatest Story Ever Told

The minutiae of eschatology matter quite a bit to many people. This book review was my introduction to preterism, the claim that all of the events prophesied in the New Testament have already happened. Like John, I don't think this really fits into a systematic Christian theology, but systematic theology has never really been a popular endeavor.


Beyond the End Times: The Rest of the Greatest Story Ever Told
by John Noe
Preterist Resources, 1999
300 Pages, US$ 17.95
ISBN 0-9621311-4-8

 

The idea that the eschatological prophecies of the Old Testament were fulfilled in the life and mission of Jesus is scarcely new. That is why the Old Testament is part of the Christian canon. However, most forms of Christianity have usually held that prophecy also points to events in the indefinite future, when the work of salvation will be completed along with history itself. The proposition that all the prophecies of the "end times" were fulfilled in their entirety in the first century is known as "preterism." In "Beyond the End Times," John Noe, a writer on business topics and president of the Prophecy Reformation Institute, makes a vigorous case for preterism. His intended audience is the theologically conservative evangelicals who, he feels with some reason, have been ill-served by the premillennialism that has dominated American evangelicalism for the last century and a half.

Quite a few members of the evangelical audience are likely to find this book more than a little disconcerting. All the familiar landmarks by which many believers have oriented themselves in contemporary history are systematically leveled. The founding of modern Israel becomes a political accident, with no significance for salvation history. There will be no Third Temple, no Antichrist and no Battle of Armageddon. There will be no Rapture of the Saints before the Tribulation, and no Tribulation. Indeed, there will be no Second Coming.

And, since the millenarian streak in evangelicalism is only a special case of the millenarianism that runs through American culture generally, evangelicals are not the only ones whose most cherished images of future horror are dismissed. Preterism, in Noe's understanding, requires the doctrine that the world will have no end. This means that neither the human race, nor the planet Earth, nor the universe itself will ever cease to exist. Noe makes a moderate critique of the more hysterical kinds of environmentalism. He also points out that, while a general nuclear war would be very terrible, it would not exterminate the human race. Perhaps most remarkably, this is the only book I have ever encountered which suggests that the second law of thermodynamics, at least as applied to cosmology, may be contrary to scripture.

Noe does not argue, as might a typical theological liberal, that the prophecies of the Old Testament were simply mythology or metaphor. Wherever possible (which is not everywhere), he prefers a literal interpretation of prophecy. Rather, he argues that the prophecies referred to concrete historical events that have already occurred. His particular care is to show the compatibility of Matthew 24:34 with history. That verse comes at the end of the so-called "Olivet Discourse," in which Jesus predicts great tribulation and the coming of the Son of Man. Then he says, "I tell you the truth, this generation will not pass away until all these things have happened." Noe argues that this need not be, as C.S. Lewis called it, "the most embarrassing verse in the Bible."

The basis for Noe's argument is the Book of Daniel and its prophecy in chapter 9 of "70 weeks of years." This prophecy was supposed to predict the history of the Jews after their return from the Babylonian Exile. Daniel is set in the sixth century BC, though most scholars prefer a date for its composition in the second century BC. Noe is willing to live with either date of composition. If the second-century date is accepted, then the book's "prediction" of the desecration of the Temple by a wicked tyrant was actually a contemporary account of the successful Maccabean revolt in 168 BC against the Seleucid king, Antiochus Epiphanes, who had in fact defiled the Temple. In contrast, the book's prediction of the coming of "Michael" to save the people really was a prediction, and did not occur. At any rate, it did not occur in the second century BC. Christian apologists, however, have long noted that, if you start the 70 "weeks" running from any of several plausible dates for the end of the Exile in the fifth century BC, then "Michael" is predicted to appear during what turned out to be the life of Jesus. Noe explicates the arithmetic in detail, capping it with the destruction of Jerusalem in AD 70. That event, he holds, was the finale, foretold in both the Old and New Testaments, of the "end times" that began with the career of Jesus.
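For readers who want to follow the arithmetic themselves, the usual reckoning runs roughly as follows. (The specific dates below are one common apologetic version, offered only as an illustration; they are not necessarily the figures Noe uses.) Seventy "weeks" of years is 70 × 7 = 490 years, of which 69 "weeks," or 483 years, run up to the coming of the "Anointed One" (Daniel 9:25-26). Counting 483 years forward from 457 BC, a traditional date for Artaxerxes' decree to restore Jerusalem, lands at about AD 27, since there is no year zero (483 − 457 + 1 = 27), which falls within the conventional dates for the public career of Jesus.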

Noe's system has unusual implications for scripture. To begin with, if this version of preterism is to work, then the books of the New Testament, the Book of Revelation most especially included, must all have been written before AD 70. The proposal of such an early date for Revelation is new to me, as it is to both traditional and modern biblical criticism, both of which were quite happy with a late first-century date. Furthermore, the "Babylon" which is destroyed in Revelation turns out to be Jerusalem itself, which is here characterized as apostate for its refusal to accept Jesus as the messiah in the generation after his life on earth. References traditionally thought to have been prophetic of the Antichrist, such as the ruler mentioned by Daniel who would end the sacrifice in the Temple after three-and-a-half years, turn out to refer to Jesus himself, whose crucifixion made the Temple and its liturgy obsolete.

Noe attempts to be faithful to the principles of both the "plain meaning" of the text and of "sola scriptura." The result is often a cautionary example of the degree to which these principles are incompatible. There are numerous passages in both Testaments to the effect that "the earth endures forever," and Noe insists on their literal truth. On the other hand, he says that passages which speak of a "new Heaven and a new Earth" after the "time of the end" actually refer to the New Covenant, which will follow the end of the Mosaic Covenant. The images of cosmic catastrophe in the apocalyptic texts of the New Testament, from the earth being shaken to the stars falling from the sky, are just that: images. He notes that they also occur in prophecies in the Old Testament that predicted punishments against specific peoples and cities. These prophecies actually came to pass, quite without a universal conflagration.

The same method is applied to predictions of the Second Coming. When Jesus spoke of the Son of Man "coming on the clouds" in a way that would be visible "to all the nations of the Earth," he was in fact predicting his return to exact a judgment that would be famous throughout all later time, but in a mode familiar from other chastisements that God had exacted in the Old Testament. This mode was the "sign" of his coming spoken of in Matthew 24:30.

Actually, the principle of sola scriptura notwithstanding, Noe seems to come close at times to opening up the biblical canon to include the "History of the Jewish War" by Flavius Josephus, the famous turncoat of the anti-Roman revolt of AD 66-70. At any rate, it is only through reference to such nonscriptural sources that we can clearly see how the fall of Jerusalem might be interpreted as the culmination of the time of the end, since nowhere in the Bible is that event referred to directly as an accomplished fact.

Noe's use of Josephus is frequently ingenious. For instance, he uses him to identify "the Antichrist," or at any rate, the worst of the class of persons to whom that title might be given. One of the passages frequently cited as referring to a future Antichrist is II Thessalonians 2, where St. Paul says that the Lord cannot come until the "Man of Sin" sets himself up in the Temple as God. Jesus refers to an "Abomination" in the Temple in Matthew 24:15, and both passages can reasonably be taken to echo Daniel 9:27. Consulting Josephus for the events of the Roman-Jewish War, Noe identifies the Abomination as the slaughter of the Temple priesthood by the insurgent John of Gischala, the son of Levi, whose intransigence and viciousness made impossible both negotiations with the Romans and the coordinated defense of the city. This person, Noe concludes, must have been the Man of Sin Paul was predicting 20 years earlier. Well, that's settled.

People who are at all familiar with biblical prophecy can easily think of many verses that would seem to tell against this outline of Noe's version of preterism. All I can suggest is that they read the book. Noe does get around to attempting an answer to most of the familiar proof texts, though not always in the principal discussion of the doctrines to which they are supposed to relate. (Something this book needs is an index, and particularly an index of scriptural citations.) Let me put aside the labor of a close analysis of scripture, however, to ask a larger question: Does this really work? Can Christian theology, even within its own frame of reference, really claim that biblical eschatology was completely fulfilled in the first century? Most important, would such a theology be of more than academic interest?

Again, all I can suggest is that Christian eschatology has always had a large element of immanence, an element that is present in greater or lesser degree throughout the New Testament. Even within the lifetime of Jesus, even in what he says of himself, it is clear that the Kingdom of God already exists. It is accessible through prayer and sacrament, a matter of personal experience that may affect history but that transcends it. The remarks of Jesus about the Son of Man coming in glory in that generation have traditionally been linked with the account of his Transfiguration before Peter, James and John (Matthew 17) and his mock-triumphal entry into Jerusalem.

The higher criticism implicitly endorses a "partial-preterist" point of view in the early church, by insisting that the synoptic gospels could not have been written before AD 70. The argument is that the fall of Jerusalem was experienced by contemporaries as an apocalyptic event, the foretelling of which was later put into the mouth of Jesus by the evangelists. As for the Gospel of John, normally considered the latest of the gospels, it deals with little except eschatology, yet is quite devoid of "apocalyptic" elements in the conventional sense: God is fully revealed in the life of Jesus, and salvation is complete.

There are problems with the modern dating of the synoptics: Luke may well be post-AD 70, but I have doubts about Mark and even Matthew. Still, there is little doubt that the first century church thought of the fall of Jerusalem as a confirmation of eschatological expectations that were already well established when it happened. But did they think of the catastrophe as a complete fulfillment of prophecy? There is a dearth of evidence that they did, plus a lot of evidence that they did not, including just about all the earliest post-biblical writers on the subject.

The fact is that the sack of Jerusalem by the Romans just was not big enough to be the fulfillment of the prophecies of the Book of Revelation, or even of the Olivet Discourse. To try to limit the "end times" to that single event smacks of the complacent surmise by Tacitus that the whole of Jewish messianic prophecy was fulfilled by the accession of Vespasian to the imperial title in Rome. Christians living through AD 70 may well have expected a quick, universal end to the order of things when Titus, the son of Vespasian, took Jerusalem and destroyed the Temple. As things happened, they did not get to witness the fall of Babylon the Great (which I think it is a bit perverse to associate with Jerusalem in any case), but they did understand that they were living through a "type" of the tribulation of which Jesus spoke. Four hundred years later, Christians did get to see Babylon the Great fall. Maybe they got to see it fall again in 1989. As John Henry Newman remarked, Revelation is a drama that is produced on an ever-increasing scale.

Aside from the theological issues, is there any hope that a system like preterism could ever become the basis of a living religion? Preterism presents many of the problems that Francis Fukuyama's famous "End of History" thesis presented at the end of the Cold War, when he proclaimed that the history of political theory had terminated with the triumph of liberal democracy. I share this thesis, with certain reservations, but it leaves you with the problem of what to do next. In preterism's case, we are left with the problem of what the 1,900+ years of Christian history were all about. Biblical prophecy provided a sort of plot for the story of history to follow, but preterism claims that the story ended in AD 70. Is God now to be found only in the text of the Bible and in religious practice, and not at all in history?

Noe clearly does not think so. In fact, he does not even think that revelation, broadly defined, is yet at an end. He has written on the many "theophanies" of Jesus in the Old Testament, and says he sees no reason why these cannot happen at any time. He even looks forward to the 21st century as the occasion for a new Reformation, this one concerned especially with the church's understanding of prophecy. While not precisely a Social Gospeler, he does deplore the deadening effect that millenarianism has on the participation of Christians in constructive politics and other social activities. That is something which he hopes the coming Prophecy Reformation will remedy, after people see that conventional evangelical premillennialism is a false idol.

I would not dismiss these expectations out of hand, but I suspect that preterism would have to grow in certain ways to realize them. Frankly, the model needs a future. If it cannot provide a universal eschaton, it must at least define some goals for the world short of eternity. The slightly unsettling thing about preterism is that it seems to leave itself almost absolute freedom in that regard. The Bible is capped by AD 70, and has nothing more to say about history. One cannot help but wonder whether something like preterism might be the necessary predicate for a "Third Testament."

Copyright © 2001 by John J. Reilly

Why post old articles?

Who was John J. Reilly?

All of John's posts here

An archive of John's site

The Long View: C. S. Lewis: An Alternative Obituary

A young C. S. Lewis

C. S. Lewis is now famous for having written the Chronicles of Narnia and a number of popular works of Christian apologetics, but as a young man he was involved in some dubious circles. This is an alternative account of his life had he not decided to seek the light.


This article is Alternative History. It is not true. Do not cite it except as fiction.

From the Obituaries of The New York Times, November 26, 1963

 

Argentine police officials today confirmed that the remains of Clive Staples Lewis were among those found in the ashes of a bungalow on the outskirts of Buenos Aires. The building burned to the ground on November 22, just as Mr. Lewis, a long-time international fugitive, was about to be apprehended by agents of the CIA and MI5. Allegations of his involvement with this week's tragic events in Dallas are continuing to stir worldwide controversy [See Page A1]. Mr. Lewis is believed to have committed suicide by self-immolation. The exact number of his companions and the cause of their deaths are still under investigation.

With the death of Mr. Lewis, the hunt for the major war criminals of the Second World War can be said to be over.

C.S. Lewis was born on November 29, 1898, to an ordinary professional-class household of Belfast in the north of Ireland. His father, Albert, was a successful police prosecutor. His mother, born Flora Hamilton, died while he and his only sibling, an older brother named Warren, were still young. (Warren Lewis, a career army officer, died of liver disease in 1936.) According to C.S. Lewis's own memoirs, he endured a singularly unhappy childhood in the British public (i.e., private) schools of the period. He was the object of repeated beatings by other boys, and his academic performance was marginal. The young Lewis took refuge in bizarre fantasies involving animals, and also began a fascination with the occult that would greatly affect his later career.

Lewis served as a junior officer in the British Army in the First World War, during which he was wounded. Like many other figures who would later become important on the Right, Lewis wrote positively of his military service. He remarked of his time in the trenches that "this is what Homer wrote of," though he dismissed the war as a whole as merely an occasion "to meet the great goddess Nonsense." It is certainly true that Lewis benefited from the experience. Although before the war Lewis had repeatedly failed to pass the admission test for Oxford, the requirement was waived for veterans and Lewis was able to attend.

Lewis's time at Oxford is the most shadowy of his life. Although his only major works during the 1920s were two semi-pornographic verse novels published under a pseudonym, he is acknowledged to have developed a fetching style that could have won him a conventional academic career. However, rumors of sado-masochistic relations with students and faculty soon put a question mark by his hopes for university advancement. Additionally, his active involvement with ritual magic during this period seems to have occasioned a conspicuous decline in his mental equilibrium.

Writing long afterward, Lewis reports, in all seriousness, that he attended a ceremony in which a participant was literally dragged down to Hell. For whatever reason, Lewis clearly became increasingly paranoid about the powers he believed he had invoked. "You must picture me," he wrote, "alone in that room in Magdalen, night after night, feeling, whenever my mind lifted even for a second from my work, the steady approach of Him whom I so earnestly desired not to meet." Some final crisis occurred in 1929 which left Lewis unable to function. He was dismissed from Oxford, and later spent some months in Belbury Mental Hospital, during which he wrote an account of his conversion to Typhonianism entitled "The Pilgrim's Regress" (1933).

After his release from the hospital, Lewis used his contacts in the occult underground to meet Oswald Mosley, soon joining what came to be called Mosley's "Inner Ring." Lewis was instrumental in organizing the publicity strategy for Mosley's British Union of Fascists. Indeed, Lewis regarded this period as the happiest of his life. As he put it, he wrote successful propaganda "with his tongue in his cheek and the printer's devil by the door, and no one able to call him a nonentity ever again." Lewis is also believed to have been the real author of Mosley's "Allegory of Love" (1936), a provocative book that applied Georges Sorel's ideas about the manipulation of political myth to a "revolution of elites" in a parliamentary democracy.

Although active in the peace movement throughout the later 1930s, Lewis volunteered for military service when Great Britain declared war on Germany in September 1939. Nonetheless, Lewis was interned in Blackmoor Prison with Mosley and other prominent Fascists in the early days of the war. Mosley and many of his colleagues were killed in the assault on the prison by German Special Forces during the invasion, leaving Lewis the highest-ranking British Fascist to survive. Lewis was Minister of Education (1940-43) and Home Secretary (1941-43) in the quisling government of Lloyd George. In the period of direct German rule under the Protectorate, he served as Deputy Director of the Nordic Institute for the Civilization of England (1943-44).

During the occupation, Lewis was chiefly responsible for the cultural policy of the new order, a position for which he insisted that plenary police powers were necessary. Among his most notorious policies were the persecution of all manifestations of historical religious orthodoxy, and his use of the Anglican Church to promote a neo-pagan cult of his own devising. Lewis's voice became well-known to short-wave radio listeners during the war years through his weekly talks on this "British Christianity." Lewis is best remembered in England, however, for his treatment of intellectuals believed to be hostile to the regime, many of whom had been interned in the month just after the invasion. His orders regarding the faculty of Magdalen College, "Beat them, bite them, throw them into pits with snakes and never let them see the sun again!," secured his death sentence in absentia during the War Crimes trials at Portsmouth in 1946. A selected anthology of the directives issuing from his office during the war, published as "The Screwtape Memoranda," became one of the chief primary sources for understanding the workings of totalitarian bureaucracies.

Lewis was not in London on "Prince Caspian's Day," so called for the famous codeword that triggered the British uprising. It was later learned that, moved by some intuition when communications were cut, Lewis fled secretly to the Republic of Ireland to await events. Remaining in Ireland after the liberation of Britain and the Continent, Lewis wrote an enormous thesis describing the Neo-Nazi empire which he believed was the inevitable future of western civilization. Privately published as Imperium in 1948 under the pseudonym "Ulick Varange," the book has functioned ever since as the "bible" of postwar international fascism.

In the 15 years between the publication of "Imperium" and his apparent death on November 22, Lewis is believed to have been a major figure in the international fascist underground, and particularly in the mysterious "Odessa" organization. Though Odessa is staunchly opposed to Communism, its tactical opposition to American influence in Europe has led it to cooperate with the Eastern Bloc security services. Lewis was known to have been operating in Latin America for some time, and American security officials had been hinting that an arrest could be imminent. None would confirm the rumors that Odessa cells operating in the western hemisphere had threatened retaliation if Lewis were taken.

"What can we say?" said the FBI's Assistant Director of Western Hemisphere Affairs, L. H. Oswald. "He was the wickedest man in the world."

Copyright © 2000 by John J. Reilly

Why post old articles?

Who was John J. Reilly?

All of John's posts here

An archive of John's site

The Long View: Name the Present, Name the Future

Integrated rococo carving, stucco and fresco at Zwiefalten

The identification of the present with nominalism ascendant is plausible, especially if you combine that with its handmaiden antinomianism.


Name the Present, Name the Future

 

The term "postmodern" is an unsatisfactory way to refer to the last few decades of the 20th century. (The era itself was not altogether satisfactory, either, though not for that reason.) Postmodern is a definition-by-negation, which is rarely a good idea: consider the sad example of those atheists who devote their lives to combating their nonexistent god. Moreover, there never really was much evidence that the period was moving beyond the modern era in any serious sense. In both its popular and elite forms, the "postmodern" spirit is largely a matter of living off the achievements of the modern age by making fun of them. Postmodernity is just modernity's late phase, rather like the rococo is to the baroque.

But then, what of the term "modern" itself? Strictly speaking, any era can (and does) call itself modern. When we speak of modernity, we usually have something more specific than "the present" in mind. Even so, the term is elastic. Modernity can mean the 20th century after the First World War, or the 19th and 20th centuries, or everything after Columbus. The macrohistorian William McNeill once plausibly suggested that the modern world system actually began in 11th-century China.

It makes most sense, I think, to consider that our modern world began with the French Revolution. The era is an episode within the Enlightenment, some of whose possibilities it realized and some of which it has forever precluded. Modernity has had a great deal in common with the Hellenistic Age of the Classical West and with the Warring States period in ancient China. It is a good bet that, like those epochs, it will last rather less than three centuries. Probably some watershed like 1789 lies in the 21st century, more likely in its second half than in its first. On the other side of it, history flows in another direction.

The future will look after its own nomenclature, but I for one find it hard to resist speculation about how the future will characterize our modernity. Even if we entertain the notion that there have been analogous periods in the past, still every such era must also be unique. "Warring States" would not be appropriate for the modern West, for instance, since the era has not been one of continual warfare, but of unusually long periods of tranquillity, punctuated by apocalyptic explosions. Hermann Hesse made a better suggestion in "The Glass Bead Game," where modernity is seen from the future as the "Age of Feuilletons." That is just strange enough to happen.

Certainly the name would have to evoke the tendency toward analysis and reduction that has characterized the West these last two centuries. The great movements in intellectual life, from philosophy to economics, have been toward atomization, even as sovereign states multiplied in accordance with the principle that every little language must have its own country. The modern era is really the Age of Nominalism. As for its postmodern coda, these decades are simply the stage when nominalism achieved its natural culmination in solipsism, of language speaking itself.

This brings us to the age to come. There is ample precedent for naming undiscovered countries. "Brazil" and "Australia," for example, were appearing on maps before the territories were discovered to which those names finally stuck. ("Brazil" was a Celtic paradise, and "Australia" was the generic name for a southern continent.) In naming the future, it seems fitting to proceed with a little help from Hegel. Historical epochs really do tend to react against the excesses of their predecessors, though that is never all that they do. If the Age of Nominalism is the thesis, then any medievalist can tell you that the obvious antithesis will be an Age of Realism.

Maybe already we see the advancing shadows of a future that is more interested in synthesis than in analysis. These adumbrations take various forms, from the proposals for a "final theory" of physics to the two-steps-forward, one-step-back progress toward world government. Perhaps we see a hint of the mind of the future in E. O. Wilson's ambitious, metaphysically naive, notion of "consilience," a universal structure of knowledge that would have a sociobiological backbone. More ambitious and not at all naive is the project outlined in John Paul II's "Fides et Ratio," which looks toward a harmonization of our understanding of all levels of reality, something not seen since the Thomistic synthesis. None of these projects is likely to have quite the results their proponents have in mind, but they may tell us something about the cultural climate of 2100.

 

End

 

An edited version of this piece appeared in the symposium, "What Can We Reasonably Expect?" (First Things, January 2000)

Copyright © 2000 by John J. Reilly

Why post old articles?

Who was John J. Reilly?

All of John's posts here

An archive of John's site

The Long View: A Doomsday Reader: Prophets, Predictors, and Hucksters of Salvation

Order of the Solar Temple imitating Catholic ritual

I've said before that Freud was a Fraud. One concept of Freud's that may have predictive validity, projection, is featured in this book review.


A Doomsday Reader: 
Prophets, Predictors, and Hucksters of Salvation
by Ted Daniels
New York University Press, 1999
253 Pages, $15.16
ISBN: 0-8147-1909-0

 

"Doomsday" can suggest a variety of things. Literally, the term means "Judgment Day," and in that sense it is familiar from Christian eschatology. However, without much stretching, the term is also an apt characterization for the role "the revolution" plays in the Marxist model of history. The fact that the present order of things is judged by "historical necessity," rather than by God, is inessential. In fact, this basic pattern of belief is familiar around the world and throughout history, though the mechanism that is supposed to bring about doomsday varies according to the local sense of the possible. The world we know is flawed, it will presently be destroyed, and it will be followed by a better one. This is the faith and the patience of the Saints, of Deep Ecologists, and of social revolutionaries alike.

"A Doomsday Reader" does not purport to cover the whole world, though it includes an overview of the role millennialism plays in the major world religions. Its stated goals are ambitious enough: to illustrate the modern role of "apocalypse in the West and its effects on our politics and our lives." The author is Ted Daniels, a folklorist at the University of Pennsylvania and the Director of the Millennium Watch Institute. The Institute is part of a network of organizations that have been monitoring what is loosely called "millennial fever" in the run-up to the year 2000. For instance, Dr. Daniels contemporaneously collected the popular rumors that began growing up about the Hale-Bopp Comet in late 1996. At the time, I wondered why he bothered. Then, in 1997, the Heaven's Gate cult committed mass suicide, motivated in part by these beliefs. Shows you what I know. (Nevertheless, I am mentioned in the Acknowledgments.)

The book consists of 11 brief "apocalyptic" texts of relatively recent vintage (none is earlier than the excerpt from "The Communist Manifesto"), prefaced by longer analytical essays that provide historical context. Many readers will find the final five chapters particularly useful. These deal with the major violent or suicidal groups of recent years whose beliefs incorporated a large apocalyptic element. Daniels does not attempt to devise a unified theory to explain the Branch Davidians, the Order of the Solar Temple, Aum Shinri Kyo, Heaven's Gate and the Montana Freemen. Nevertheless, he does make what may turn out to be very useful observations about the dynamics of such groups. For instance, he suggests that the reason the Freemen eventually surrendered to the authorities, while the other groups either killed themselves or tried to kill everybody else, was simply that the Freemen lacked a charismatic leader.

Daniels offers a general Freudian interpretation of the leaders of destructive apocalyptic groups that may persuade even non-Freudians. In this view, such leaders are narcissistic personalities who understand, at some level, that they are deficient. However, rather than locating the deficiency in themselves, they project it onto the world. Thus, rather than trying to heal themselves, they seek to heal the world. In extreme cases, rather than try to kill themselves, they will try to kill the world. When many people come to share such a person's projections, then you have an apocalyptic movement. (Usage varies in the terminology, but a movement is often said to be specifically "millenarian" if it seeks to help create a future age quite different from the world we know, and "millennial" if it seeks a future that is better than but continuous with the past. An "apocalyptic movement" might be any drive for fundamental change based on a "revelation" of some sort.)

"A Doomsday Reader" is of far more than historical interest. Though the number of readings is small, they and the groups that produced them are nevertheless typical of quite durable apocalyptic traditions. This is particularly the case with the global conspiracy theories that the media have released from the subcultural subcellar in recent years. One version of them, the Jewish international conspiracy, runs through both "The Protocols of the Learned Elders of Zion" and "The Turner Diaries," which were written 80 years apart and excerpted for this collection. It is particularly illuminating to read this material in conjunction with the history of the Order of the Solar Temple, which really was a secret society that aspired to exert its influence internationally. The fact that the Order was not a rousing success did nothing to dampen the conviction among conspiracy buffs of the potency of such groups. Some bad ideas just don't go away.

This is not to say that all the characteristics of apocalyptic thinking are without value. Daniels notes that apocalyptic is often an expression of the desire for vengeance, a forum where the high and mighty are brought to answer for their malefactions. For my money, at least, he does a bit of it himself, by naming some smug secular organizations as apocalyptic actors. He caps his discussion of the anti-human wing of the ecology movement with the text of the "World Scientists' Warning to Humanity," issued by the Union of Concerned Scientists in 1992. As Daniels observes, this document does not request, but requires, dramatic changes in every area of life, everywhere, if total ecological calamity is to be avoided. Indeed, this Warning has the distinction of being the bossiest text in an anthology that also includes an excerpt from "Mein Kampf." We need more of this willingness to tell the educated that, when they are looking for millenarians, they can often forget about looking in trailer-parks and just look in the mirror.

"A Doomsday Reader" does have bloopers which should have been picked up by the editor. To pick a few nits, "chiliasm" is not a Greek cognate of "millenarism," but merely its equivalent in meaning. The pyramids of Egypt were not "rediscovered" by Napoleon's armies "at the beginning of the eighteenth century," or indeed at any other time. Hegel was far more likely to have been a major influence on Comte than the other way around, since Hegel was 28 years older. More seriously, while the statement, "Augustine surrendered the world to evil," might be defended, the defense would need to engage the widely held belief that Augustine is the father of the idea of progress. Finally, though I recognize the point is really beyond the scope of the book, the fact that Chinese cultural history is little concerned with "eschatology," in the sense of the final end of history, does not mean that it lacks a conspicuous millenarian element.

Still, these are minor blemishes. "A Doomsday Reader" is a groundbreaking book. Dr. Daniels's special forte has been the mastery of the Internet as a medium for research into popular culture. The references in this book do not simply tell you about what is happening in various apocalyptic subcultures; they give you the tools to go online and watch it happening yourself. Additionally, this book could have important implications for public policy. Its close analysis of the successes and failures that the authorities have had in dealing with apocalyptic groups may help to prevent more disasters like those we have seen in the 1990s. While we may not always find other people's ideas about the imminence of the new age plausible, the fact that they think this world is about to end usually means they have some real complaints against it. We should pay more attention in the 21st century.

Copyright © 1999 by John J. Reilly

Why post old articles?

Who was John J. Reilly?

All of John's posts here

An archive of John's site

Thrawn Book Review

IMAGE CREDIT: LUCASFILM LTD.

Thrawn
by Timothy Zahn
Del Rey, 2017
$28.99; 427 pages
ISBN 9780425287071

This was everything I ever wanted out of a Star Wars novel. Timothy Zahn's Heir to the Empire Trilogy was my introduction to Star Wars novels. If this had to be the last one I ever read, I think I could be happy. Much like Rogue One, Timothy Zahn's Thrawn expertly painted in the gaps left by the original trilogy of movies, making an already great work of popular art even better.

This is so because Zahn managed to answer more questions than he posed. For example, the complete lack of aliens within the Imperial Navy gets a plausible explanation that fits in with all of the movies. Also, the process by which the Imperial Navy and Army came to be staffed and led by incompetent lackwits, despite having the resources of an entire galaxy to call on, is laid bare. With an intentionally open-ended story, this is a remarkable feat. Not least because the available narrative space for Zahn is, if anything, more limited than it was when he initially created Grand Admiral Thrawn.

In 1991, references to the Clone Wars were just that: references. Zahn ended up making choices to move his story along [without any objections at the time] that were simply abrogated when George Lucas wrote the prequels. The gap thus created was a large part of what doomed the previous iteration of Thrawn's origin story, Outbound Flight. Trying to shoehorn in the continuity of the prequels made that book plod along without any sparkle. Thus, Disney's decision to sweep away all of the previous books, comics, and videogames seems to have given Zahn an opportunity to reimagine Thrawn without being bound by even his own works.

In my opinion, this Thrawn instantiates who he was meant to be better than before. He is the best version of himself. Zahn commented on his Facebook page that this is the same Thrawn we saw before, just in a new light. Maybe so. But he sure feels a bit different to me. It has been twenty-five years since Zahn introduced us to the Grand Admiral, and Disney's decision plus the weight of twenty-five years of additional writing experience created an opportunity to make something new.

The vehicle by which we are introduced to Thrawn's early Imperial career is a political thriller [along with excerpts from Thrawn's journal]. I didn't expect that [I'm not sure what I expected] but I think it works. While we do get to see Thrawn's tactical and strategic virtuosity, the scope of his campaigns is limited by the scope of his responsibilities. What we do get to see are the political machinations that characterized the day-to-day business of the Empire.

We get to see Imperial politics mostly through the eyes of Arihnda Pryce [a tie-in to the on-going Star Wars Rebels series] and Eli Vanto, an Imperial ensign who speaks a common language with Thrawn and ends up trailing Thrawn throughout his career. Pryce is exactly the kind of person who prospered in the years following the end of the Clone Wars: amoral, ruthless, and calculating. Here we find the root cause of the Empire's rot. Pryce, while bright and capable, came to be Governor of Lothal because of whom she knew, and whom she had betrayed. A functioning bureaucracy requires a bit more probity than this.

Thrawn himself does not seem much bothered by the venality and incompetence of Imperial officers and politicians. Which strikes me as odd, and also as perfectly appropriate. Zahn made me feel that Thrawn was very alien. He just didn't want what I want, at least in the same way. The dust jacket for the book features this quote, which also looms large in an incident in Thrawn's Imperial career:

There are things in the universe that are simply and purely evil. A warrior does not seek to understand them, or to compromise with them. He seeks only to obliterate them.

The things that Thrawn finds abominable, and the things he finds excusable, are very different things from almost everyone around him. He clearly disliked the chaos of the late Republic, and liked the orderliness of the new Empire, despite its tyranny. I appreciated this: Thrawn really is an outsider, an alien from a culture with a completely different point of view.

Yet at the same time he felt very familiar. The analogue I find ready to hand is the classical Romans. The Thrawn we meet in the early Empire is good at the hard virtues. His courage and stoicism are undeniable. As is his lack of pity. He is honest to the point of bluntness. He lacks the soft virtues: kindness, gentleness, compassion. He never seeks wanton destruction, but suffering as such does not faze him. Disorder does.

Scipio Africanus

The portrait of Rome I have in mind is the one Chesterton painted in the first part of The Everlasting Man. Rome represented the best of the ancient world, but it was still very different from the Christian civilization that eventually replaced it. Just, but harsh. Uncompromising and stern. And very, very good at war. Most of us modern Westerners, if we ever met a 1st-century Roman commander, would also be taken aback by things he would find obvious and proper.

Much as with Scipio Africanus, Thrawn's political adversaries tend to find him a bit of a naïf at politics. It is left ambiguous whether Thrawn is really bad at politics. All the Imperials think he is, but Clausewitz said that war is a continuation of politics by other means. Insofar as Thrawn is quite adept at manipulating his opponents on the battlefield, the idea that he cannot do the same to politicians seems strange.

It is just possible that Thrawn isn't interested, or doesn't care, because that is the way his alien mind works. He could just have a blind spot there. It is also possible that he is playing a really long game. The story I'm thinking of is the one about John von Neumann: that he could have a normal conversation with absolutely anyone, from a 5-year-old to one of his peers in physics and mathematics. The idea is that he was so smart that he was just simulating what normal looked like to whomever he was talking to. This is like that, but with the added goal of manipulating and controlling the person you were talking to.

What that long game really is, we don't know. We know more than when we started, however, which is good enough for me. I loved this book, and I suspect many Star Wars fans will too. You might even like it if you aren't a fan. I've read a lot of Star Wars books, and I haven't liked most of them. This one is good, a thought-provoking exercise in order and justice through the mind of an enemy commander.

My other book reviews

 


The Long View 2004-11-04: Voting with their Flyer Miles; Integrity; Marketing

The continuing saga of John Reilly versus the HotLlama DVD player is pretty funny. This player doesn't seem to be commercially available anymore, but all of the Google search hits are for people complaining about it.


Voting with their Flyer Miles; Integrity; Marketing

 

Not only celebrities are threatening to leave the United States because they find it ideologically uncongenial. Ordinary upper-middle-class people are more or less advanced in their plans for comfortable exile.

"I can no longer in good conscience support a nation that believes it is OK to lie to start wars," she said. "I will not live in a country where dumb and dumber are my two choices for president. I'm taking my assets out of the country and moving to Central America, where ironically, I will have more freedom to live my life without interference from a corrupt government. My husband and I will leave within four months."

Unless this woman is moving to Costa Rica, her expectations for a corruption-free future are likely to be rudely disappointed. And if she is moving to Costa Rica, she is likely to find an expatriate American community that moved there in the 1990s to escape what they perceived as creeping socialism.

Still, things could be worse. The last time emigrant fever broke out in the United States was during the early years of the Depression, when hundreds of Americans accepted offers to lend their expertise to help build socialism in the Soviet Union. For the most part, these people disappeared during the Purges.

* * *

And if you do flee the stultifying confines of the Great Republic, you may find that its politics is not as idiosyncratic as you have been led to believe:

CANBERRA, Australia, Nov. 8 (UPI) -- The issue of abortion is becoming an increasingly hot topic in Australia, with the federal treasurer claiming it is a regional, and not a federal matter...The issue arose recently when the federal health minister, deputy prime minister and other senior coalition members of parliament called for a reduction in the number of abortions, particularly late terminations.

I don't know enough about Australian federalism to say what the principled pro-life position should be there. If a matter has usually been handled locally, people will often react badly if the matter is arbitrarily preempted by the national government. Certainly the pro-abortion faction in the US never made a bigger mistake than when they federalized the issue.

In this regard, we should note that the one really objectionable thing about John Ashcroft's Justice Department has been its studied refusal to allow its pro-life litigation to be affected in any way by considerations of mere constitutional principle:

Oregon Gov. Ted Kulongoski is criticizing outgoing Attorney General John Ashcroft for appealing a federal appeals court's decision preventing the federal government from declaring that federally-controlled drugs can't be used in assisted suicides because they don't constitute a medical purpose.

On this narrow question, the governor is right: the federal government cannot control the practice of medicine in this fashion. Why did Ashcroft continually bring cases like this?

* * *

On November 9, the PBS affiliate WNET aired The Persuaders, another exposé by Douglas Rushkoff of marketers and their wicked ways. The program emphasized that the problem of "clutter" is becoming critical: there is so much advertising that ordinary ads are becoming invisible. That is why advertisers are increasingly turning to "product placement," the strategy in which products are incorporated into entertainment. The program also had the first acknowledgment I have seen in a long while that the real target of marketers is their clients. Marketers are creative types who are more interested in exercising their talents than in selling goods and services; the real challenge lies in coaxing the client to pay the marketer to amuse himself. The expression "he who pays the piper calls the tune" is a marketing slogan devised by pipers.

Critiques of this sort have been with us for 50 years, and they still have some validity. Still, I wonder whether they are becoming anachronistic, at least with regard to some topics. "The Persuaders" addressed the question of political advertising, but without once addressing the fact that this was the year when the "broadcast" model of politics began to break down. No doubt it is shabby, as the program pointed out, that Republican strategists managed to replace the term "Estate Tax" with "Death Tax" during their campaign to repeal the tax on transfers of wealth at the time of death. But is that really more important than the successful revolt in the blogosphere against Dan Rather's Texas Air National Guard hoax?

* * *

Speaking of marketing issues, I would like to take back some of the harsh things I said about the Hotllama software company in my blog entry of October 25, in which I basically said that their DVD player was the sort of software you would have expected Lovecraft's monsters to write. The Hotllama customer service department found that entry, and emailed me a friendly note of explanation. Glitches happen, and it is too much to expect every application to work seamlessly with my increasingly archaic software. Still, I would like to highlight one point that Hotllama made in its note, in response to my complaint about the amount of personal information the DVD player asked for during installation:

But as with any software, installation is necessary, but you could have opted out of adding your email address, etc. but we did require your ZIP Code. We assure you that we treat any information we received as purely and totally anonymous, always have, and always will.

Is it possible that marketers do not know that no sane person responds to this sort of prompt accurately? Or do they really think that 25% of their customers are 100-year-old women who live in Alaska?

Copyright © 2004 by John J. Reilly

Why post old articles?

Who was John J. Reilly?

All of John's posts here

An archive of John's site

Linkfest 2017-04-21

Why is this so funny?

Depending on how you do the accounting, emergency rooms aren't the most expensive way to get care.

If you combine this essay from 2016 by Rusty Reno, editor of First Things, with this 2017 article by Steve Sailer, you can get a sense of just how weird American elite universities have gotten.

Another Rusty Reno / Steve Sailer pairing, this time on how corporate and political diversity initiatives are used to shore up the status quo.

Tyler Cowen points out that stats-wise, West Virginia isn't so bad. This is an interesting article on its own merits, but it also makes me wonder whether standard economic metrics are all they are cracked up to be.

Bryan Caplan points out that talking about IQ doesn't have to make you a monster, but in his experience it often does. Since I follow a lot of IQ/psychology/genetics researchers on Twitter, I got to see many of them questioning Caplan about this in real time.

This story is almost ten years old now, but I didn't know that the monasteries at Mt. Athos still run under their Byzantine grant.

A number of my favorites made this list: Gattaca, Screamers, and Event Horizon.

H. P. Lovecraft is a favorite author of mine. I think these are indeed good places to start.

This title is horribly misleading. This is really an article about intellectual property law, and how a clever strategy almost allowed Google to publish orphaned books.

 

The Long View 2004-11-04: Questions about Convertibility

With my recent discovery of a large number of John Reilly's book reviews and essays, I had been enjoying a respite from John's topical political commentary from thirteen years ago. Unfortunately, we need to get back at it.


Questions about Convertibility

 

As with most things in life, the important reactions to the reelection of George Bush were summed up long ago by Ambrose Bierce, in The Devil's Dictionary:

President, n. The leading figure in a small group of men of whom -- and of whom only -- it is positively known that immense numbers of their countrymen did not want any of them for president.

It would, of course, be churlish to fault the Democratic Establishment for declining to eat their ration of crow all at once. They are still in the denial phase, which is only to be expected. As students of millenarianism know, the failure of the Parousia to occur on a predicted date simply excites the people who had hoped in it to greater efforts to convert the unbelievers. This dynamic is evident in Thomas Frank's column in today's New York Times, "Why They Won":

To short-circuit the Republican appeals to blue-collar constituents, Democrats must confront the cultural populism of the wedge issues with genuine economic populism. They must dust off their own majoritarian militancy instead of suppressing it; sharpen the distinctions between the parties instead of minimizing them; emphasize the contradictions of culture-war populism instead of ignoring them; and speak forthrightly about who gains and who loses from conservative economic policy.

If something does not work, and you don't know what else to do, the natural impulse is to do it harder. That is what made the First World War what it was, and it seems a fair description of the (American) liberal strategy in the Culture War. At this point, one can only point out that the contradictions are not on the side of the Christian Realists. Thomas Frank in particular has promoted the thesis that the cultural and value issues are not real issues at all, but devices to deceive and bewilder the masses. The contradiction lies in the refusal of progressives to give even one inch on the abortion license or the normalization of homosexuality.

If the points are unimportant, then they should be conceded. If they cannot be conceded, then they must be important enough to figure prominently in public debate. I predict the points will be conceded, however much that outrages and alienates Left Reactionary elements. And then everything will change.

* * *

Many intemperate things have been said since Wednesday, when the results of the election became known. For sheer shock value, however, none exceeds Ann Coulter's blasphemy against Karl Rove:

If Rove is "the architect" -- as Bush called him in his acceptance speech -- then he is the architect of high TV ratings, not a Republican victory. By keeping the race so tight, Rove ensured that a race that should have been a runaway Bush victory would not be over until the wee hours of the morning...Seventy percent to 80 percent of Americans oppose gay marriage and partial-birth abortion. Far from appealing exclusively to a narrow Republican base..."Boy Genius" Rove decided Bush shouldn't even run radio ads on gay marriage,

And Boy Genius was right. The values issues were important in the 2004 election, but they were scarcely the only factors; one might mention the continuing low-grade world war, for instance. Just shy of a quarter of the electorate said they were voting chiefly on moral questions. Very good: but a presidential candidate who talked about nothing else would be rightly dismissed as a crank.

Regarding the gay marriage issue in particular, may I point out that the chief difficulty in combating it is that it is nonsense? One falls silent when the matter is raised, not for fear of being revealed as a bigot, but because the notion is incoherent. Arguing about it is like talking about the man who was not there. It's an embarrassment, not a controversy. The people don't want gay marriage refuted; they want it to go away.

* * *

It was with these thoughts in mind that I viewed the press conference that President Bush gave yesterday. The president said something that Bill Clinton was never brave enough to say, much less do:

"I earned political capital during the campaign, and now I intend to spend it."

Good for him, and for the most part I wish him well, but the president needs to remember that he was reelected to win the Terror War and the Culture War. The capital he has amassed is like grocery-store coupons: it can be spent only on certain things.

I am not very keen on the Administration's proposals to partially privatize Social Security. I bow to no one in my eagerness to reform the tax code, but I was distressed to hear that the president has not adopted the position that the way to reform the code is to design it to do nothing but raise revenues for the federal government. If the government must subsidize industries, then let it do so through rebates, which must be separately budgeted and authorized by Congress. As for the federal deficit, the thought of it makes me nearly frantic.

I am not alone in these reservations.

* * *

A good argument has been made (by Glenn Reynolds, probably) that what got the Republicans where they are today is the process of "disintermediation," which means the diminishing importance of the institutional gatekeepers of information. So far, at least as a political phenomenon, disintermediation has been most important in America. If Medienkritik gets his way, however, Germany will not be far behind:

In the United States, consumers have talk radio, Fox News and the blogosphere as an alternative information source to the left-leaning, "mainstream" media. In Germany, none of that exists. The deepest fear of the German media elite and the angry left is that such an alternative could emerge and compete with or even replace them...It will be the stated goal of Davids Medienkritik over the next four years and beyond to continue to offer such an alternative to the German media and to encourage and support others seeking to do so. WE WILL BREAK THIS MONOPOLY, we will provide an alternative, we will seek to bridge the widening transatlantic gap and not to deepen it. And we will do so with your help and support.

And as Germany goes, so goes Europe.

* * *

Finally, here is what that Other Spengler had to say about the election. As usual, every slap on the back from this fellow comes with the jab of a needle:

What brought 4 million more evangelical voters to the polling stations than in the previous presidential election?... It is the hard, grinding reality of American life in the liberal dystopia that makes the "moral issues" so important to voters. Partial-birth abortion and same-sex marriage became critical issues not because evangelical voters are bigots. On the contrary, parents become evangelicals precisely in order to draw a line between their families and the adversary culture. This far, and no more, a majority of Americans said on November 2 on the subject of social experimentation...Unlike the Europeans, whose demoralization has led to depopulation, Americans still are fighting against the forces of decay that threaten - but do not yet ensure - the ultimate fall of American power. That is the message of November 2.

And speaking of Spengler, would you all please buy my damn book, so you know what's going on?

Copyright © 2004 by John J. Reilly

Why post old articles?

Who was John J. Reilly?

All of John's posts here

An archive of John's site

The Long View: Fighting for Liberty and Virtue: Political and Cultural Wars in Eighteenth-Century America

21-year-old George Washington

Olasky's Fighting for Liberty and Virtue seems a bit like Fernandez-Morera's Myth of the Andalusian Paradise, a regrettably polemical look at an inherently interesting subject.


Fighting for Liberty and Virtue:
Political and Cultural Wars in Eighteenth-Century America

by Marvin Olasky
Crossway Books, 1995
$25.00, 316 pp.
ISBN 0-89107-848-7

In the Beginning Was the Future

"[Chairman Mao] went on to a comparison between his China as seen from [the Communist rebel base at] Yenan and the American Revolution as a foreign reporter might have seen George Washington at Valley Forge. ...Did George Washington have machinery? he asked. Did George Washington have electricity? No. The British had all those things and Washington did not, but Washington won because he had the people with him."

---From "In Search of History," by Theodore White, page 260.

 

While Chairman Mao was perhaps a little confused about the sequence of technological progress in the West, few people would quarrel with his general point that the American Revolution succeeded because of widespread popular support, despite the general material inferiority of the rebels to their British opponents. In this book, Dr. Olasky seems to be similarly confused about the strength of the parallels between the political climate of the Revolutionary era and that of our own time, though many of his points about the importance of non-conformist Protestantism in the politics of eighteenth century America are well taken. America is, in some sense, the same country that it was two centuries ago, so some hardy perennials of American history, such as tax protests and calls for local control, were discernible even in late colonial times. The fact is, however, that the late eighteenth century was not, as he would have us believe, a time of culture war between virtuous American Republicans and decadent British monarchists. One could argue, in fact, that he slights his own partisan interest by claiming that Presbyterians and Baptists played as strong a role in the Revolution as the Christian Right is playing today. Contemporary political Christianity is not as important as it was in the 1770s: it is much more important.

Marvin Olasky is a journalism professor at the University of Texas at Austin, editor of the Christian Weekly news magazine "World," and, among other things, a general editor of the Turning Point Christian Worldview Series, published by Crossway Books and the Fieldstead Institute. One of his recent books, "The Tragedy of American Compassion," is a historical critique of welfare policy that received the special approbation of Speaker Gingrich. "Fighting for Liberty and Virtue" similarly uses history to advance a contemporary agenda. His thesis is that the American Revolution was made possible by a coalition of those interested in small government (particularly as manifested by low taxes) and those interested in holy government. The book will, no doubt, provide rhetorical ammunition with which cultural conservatives can defend themselves against the charge of injecting theological values into today's politics that would have been alien to America's founders. (There is even an Appendix with the helpful title, "Sound Bites from the 1780s for the 1990s.") The difference between polemic and scholarship is that the latter is careful to provide contrary evidence, and to state opposing views fairly. "Fighting for Liberty and Virtue" is clearly an example of the former.

The people of the Revolutionary era, we are told, favored every possible device to keep government "close to the people." For instance, as Olasky notes, colonial legislators generally served without pay, except for expenses. He contrasts this with representation in the British parliament, which he characterizes as "potentially enriching." Actually, there was nothing "potential" about the financial benefits that flowed to loyal faction members in the legislatures on both sides of the Atlantic in those years. However, he might also have mentioned that service in parliament during the eighteenth century was also without pay, except for members of the cabinet, and that it continued to be an unpaid honor until 1911. The Liberal Party government then provided a small salary, so that people who were not independently wealthy or in the employ of some interest could afford to serve. The Founding Fathers made this reform in the Constitution we have today, but then Olasky suggests, as we will see, that the Constitution was a ghastly mistake.

The book's most peculiar thesis is that the moral depravity of the ruling class of Georgian England doomed the Empire in the Revolutionary War. That eighteenth century British aristocrats were often very naughty is not in dispute. The point is easy to prove from the contemporary literature: we are talking about a time and place that produced Swift and Sam Johnson and Hogarth. And of course, the stories themselves are entertaining. Thus, we are regaled with tales of how Lord Cornbury, appointed to be royal governor of New York in the early 1700s because he was a cousin of Queen Anne, used to flounce about the ramparts of Manhattan in a woman's dress. We hear about the eighteenth century "Hell Fire" clubs (there was apparently more than one), and the odd parties that the Earl of Sandwich, who as First Lord of the Admiralty was perhaps most responsible for the neglect of the fleet that permitted the French to provide vital aid to the American insurgents, used to host at his renovated Cistercian abbey. (The incidents involving the baboon were particularly deplorable.) George Sackville, Secretary of State for North America during the Revolution and primarily responsible for grand strategy, seems to have been uncommonly fond of certain young officers, thus suggesting that even then "gays in the military" were a morale problem.

In sum, we are reminded that eighteenth century British government was rife with bribery, that business enterprises were frequently scams (as the South Sea Bubble illustrates), and that military officers embezzled and mismanaged supplies. Olasky does not claim that the American leaders were without stain. Only Samuel Adams and Patrick Henry get his unmitigated approval, whereas Benjamin Franklin, the old goat, comes in for special denunciation. (Indeed, we are told that Franklin only "changed sides" to support the colonies a year before the Revolution began.) Curiously absent from this rather Confucian tally of virtue and vice in office is the figure of George III. By all accounts, he was hardworking and well-meaning. He was so faithful to his Queen Charlotte (they had 15 children) that his less monogamous courtiers thought it odd. After the American Revolution he suffered from bouts of insanity, which doctors now believe to have been caused solely by body chemistry. (During his incapacitation, his relative virtues were made all the more evident by the appalling behavior of his son, the Prince Regent.) The story of George III bears less resemblance to that of Belshazzar than to that of Nebuchadnezzar, or even of Job.

It is simply false to say that the Revolution constituted a "culture war" between England and the United States. The leaders of both sides were a mixture of deists, agnostics, libertines and Christians of ordinary piety. The populations they led were not so different from their leaders. The British government, unlike the liberal establishment of today, had no new morality of their own devising which they hoped to impose on the colonies. The British leadership seems not to have had any ideological policy at all, beyond the maintenance of parliamentary supremacy throughout the Empire. The "culture war" of the Enlightenment started when the Revolution in France sought the end of class distinctions, traditional morality, and the expurgation of Christianity. In that war, Britain and the young United States were pretty much on the same side. (The United States also had its Jacobin "Left," of course, which agitated for direct support of revolutionary France.) Though the culture war that began with the French Revolution has suffered various mutations, that is the conflict which continues to this day.

Eighteenth century Americans were certainly amenable to the idea that the British Empire was a hopelessly corrupt institution on its way to a resounding collapse. When Edward Gibbon's "The Decline and Fall of the Roman Empire" began to appear in 1776 (the whole work was published over a period of about ten years), it soon became part of the favorite reading of America's leaders. It was cited as a mirror of the contemporary condition of Great Britain, and, at the Constitutional Convention, as a prophecy of what could happen to the United States if the participants did not do a good job. While its application as prophecy remains to be seen, as a diagnosis of Britain it was clearly wide of the mark. The British polity in the second half of the eighteenth century was a society striving to get a grip on itself. This, really, was what started the American Revolution. Parliament was trying to introduce some rudimentary fiscal and regulatory order into its haphazard Empire. The problem in America was that these reforms ran afoul of well-established traditions of self-government, with what results we know. However, in the decades immediately following the loss of the American colonies, decadent Britain went on to defeat Napoleon, reform its domestic life in the reign of a queen whose name became a byword for pious rectitude, and create the largest empire in the history of the world. Eighteenth century Britain was a disordered society. However, its disorder did not spring from moral exhaustion.

Olasky's chief focus in the book is the role of religion in eighteenth century politics, and the material he presents is worth reading. He quotes at length from the tracts and sermons of the time, sources which probably reflect the feelings and ideas of ordinary people, even of ordinary educated people, far better than do the writings of people like Franklin or Tom Paine. He is quite correct in noting that the role of religion in American history has often been shortchanged by historians. As G. K. Chesterton remarked, America is a nation with the soul of a church. This is why American history is more than usually baffling. It is almost certainly true that the Great Awakening, that strange movement of the national spirit that erupted in the 1730s and 40s, was a necessary predicate to the Revolution forty years later. Olasky quotes a Hessian officer as saying that the Revolution was essentially a revolt of Scotch-Irish Presbyterians, and certainly religious non-conformists looked on the Revolution as a way to free themselves from the government that had persecuted them or their ancestors. To the English after their victory in the Seven Years War (the French and Indian War in America), it seemed perfectly reasonable to consider establishing an Anglican bishop in America; after all, Anglicanism was the established church in several of the colonies. To American Presbyterians and Baptists, however, the very idea smacked of the crypto-papist tyranny of the Stuart dynasty.

The Stuarts had been deposed in the Glorious Revolution of 1688, an event that gets curiously scant attention from Olasky, though it was certainly much on the minds of people in the English-speaking world in the century that followed. To the Awakened, as Olasky calls those touched by the Great Awakening, that prior Revolution was a milestone in the history of Christianity. Because of it, the personal conscience of Englishmen was freed from the dictates of an established creed. To the Enlightened, which included such people as Franklin and Adams and Hamilton, it was the foundation of consensual, rational, limited government. To both groups, it provided a model for insurrection that could be put into action, if some issue arose that engaged both strands of American culture. Olasky notes some pre-Revolutionary controversies that seemed to do this. Notable among them was the "Parsons' Cause." In that case, Patrick Henry defended against a suit by clergy of the established Anglican Church in Virginia in which they sought to be paid in money rather than tobacco. (He won in that the jury awarded the parsons only a penny in damages.) This was one of those incidents, according to Olasky, that made it possible for the Awakened with their concern for holy government to join forces with the Enlightened, whose idea of good government included the need for low taxes. Persons wishing to draw comparisons with the present state of the Republican Party are invited to do so.

The problem is that, if you draw such a comparison, the apparent power of eighteenth century piety will suffer from it. The concerns about church establishment and holy government which Olasky (and others) have documented are absent from the Declaration of Independence. God is much mentioned in that document, of course, but nowhere does it suggest that He is much exercised by the prospect of an American bishop or by the cross-dressing ways of certain royal governors. Even though the recent British reorganization of neighboring Quebec is alluded to, the document does not complain about the re-establishment of Roman Catholicism as the official church there, a step that is well-known to have excited outrage from American Protestants. Olasky explains these omissions by noting that the Declaration was a compromise between the Enlightened and the Awakened. Besides, its primary drafter was that notorious free-thinker, Thomas Jefferson. Still, in a compromise you are supposed to get at least part of what you want. If we judge by the output of the Continental Congress, however, it would appear that Olasky's Awakened were not so much accommodated as co-opted.

This brings us to the question of the merit, and even the legitimacy, of the current United States Constitution. There is a long tradition in American historiography, most famously propounded by the Marxist historian Charles Beard, which holds that the Revolution itself was a popular rebellion, but the Constitutional Convention of 1787 was a sort of counter-revolution, an anti-democratic cabal of Masons and proto-capitalists intent on wresting power from the populist state governments. Olasky is of similar mind, with the addition that the Convention retreated not just from the principles of small government, but from the desire for holy government. A powerful central government, it seems, is an occasion of sin.

This assessment of the origins of the Constitution is counterfactual. If anything, it was the Continental Congress that smacked of conspiracy. It was composed, after all, of the representatives of provisional revolutionary governments, themselves with shaky legality and questionable popular support, who met to declare themselves the effective government of the continent. The Declaration it issued looked for its ultimate ratification not in ratifying assemblies, but on the battlefield. The Constitutional Convention of 1787, in contrast, was composed of the representatives of legal states. They assembled simply to draft a document, a constitution they claimed no power or right to impose themselves.

In further contrast to the Congress of 1776, the Convention discussed in detail all those issues of religious establishment and freedom of conscience which so deeply troubled pious Americans of the time. Unlike the Declaration of Independence, the Constitution does not mention God. On the other hand, it actually does make some provision for liberty of conscience by forbidding the imposition of a religious test for public office. (It also allows for "affirmations" rather than "oaths," in deference to Quaker sensibility.) The Bill of Rights, soon to be promulgated by the First Congress, would provide more protections. It is doubtless true, as Olasky suggests, that the Supreme Court today has radically misinterpreted the First Amendment as a protection from religion, rather than as a prohibition of a state church. However, the flaw there is not the Constitution, but the prejudices of the modern judiciary.

What makes Olasky's argument somewhat grotesque is the assumption that small government and holy government normally go together. If by "small government" you mean low tax, low service states with lengthy constitutions that make it almost impossible for the state to do anything at all, then there has been no lack of such regimes throughout American history, particularly in the Deep South. They have been notable chiefly for the corruption of their public officials and the poverty of their people. They have also been fantastically anti-democratic. Their relative isolation from federal oversight until the desegregation era permitted many a governor or party leader to entrench his squalid little tyranny with small danger of ever having to face a fair election. Much the same thing happened in the cities of the Northeast in the late nineteenth and early twentieth centuries. Taxes were higher, perhaps, but the urban Democratic political machines were as "close to the people" as any post-Revolutionary anti-federalist might have wished. Probably rather more so.

The Founding Fathers, who for all their faults set a standard of intelligent civic virtue unique in human history, discussed the merits of large and small government at length in the Constitutional Convention. They concluded, correctly, that small states were at least as prone to tyranny and corruption as large ones. They also saw that the attempt to turn the question of the powers of the federal government (indeed, the question of whether there should be a United States at all) into a choice of "big government, small government" was a red herring. The real danger was anarchy and civil war, with tyranny to follow. This is one parallel with the past that today's cultural conservatives would do well to remember.

(This article originally appeared in Fidelity magazine.)

Copyright © 1996 by John J. Reilly

Why post old articles?

Who was John J. Reilly?

All of John's posts here

An archive of John's site

The Long View: Samuel Johnson and His Dictionary of Doom

Samuel Johnson may be one of the most influential figures in English orthography.


Samuel Johnson and His Dictionary of Doom

Perhaps the most common defense of the traditional orthography of English is that the spelling is supposed to reflect the etymologies of the words, and so gives useful clues to their meanings. This argument is, of course, a red herring. The orthographies of all the modern European languages take etymology into account; only in English is this an excuse to allow spellings to become so phonetically ambiguous that standard dictionaries must provide a pronunciation key for each word. There are many reasons why this condition has been allowed to persist. Among the most important is the support it received from Samuel Johnson's Dictionary of the English Language, which appeared in 1755. This great work is credited with standardizing the spelling of English for the first time, but at the cost of phonetic incoherence.

Dr. Johnson set out his principles of lexicography in the Dictionary's Preface, which, happily, is available from the Gutenberg Project. In that fascinating essay, he demonstrated a proper understanding of the use of etymology, which any reform of the writing system of a language with an ancient and extensive literature would have to employ:

Such defects [as the divergence of loan words from their roots] are not errours in orthography, but spots of barbarity impressed so deep in the English language, that criticism can never wash them away: these, therefore, must be permitted to remain untouched; but many words have likewise been altered by accident, or depraved by ignorance, as the pronunciation of the vulgar has been weakly followed; and some still continue to be variously written, as authours differ in their care or skill: of these it was proper to enquire the true orthography, which I have always considered as depending on their derivation, and have therefore referred them to their original languages: thus I write enchant, enchantment, enchanter, after the French and incantation after the Latin; thus entire is chosen rather than intire, because it passed to us not from the Latin integer, but from the French entier.

It is too much for any speaker of a major language to expect that its orthography will perfectly mirror his pronunciation; it is enough if every spelling yields a possible pronunciation. For other European languages, the standardization of orthography has gone hand in hand with a process of modifying the historical spellings to satisfy that criterion. Dr. Johnson, however, enunciated a contrary principle, to the continuing cost of English-speakers ever since:

In this part of the work, where caprice has long wantoned without controul, and vanity sought praise by petty reformation, I have endeavoured to proceed with a scholar's reverence for antiquity, and a grammarian's regard to the genius of our tongue. I have attempted few alterations, and among those few, perhaps the greater part is from the modern to the ancient practice; and I hope I may be allowed to recommend to those, whose thoughts have been perhaps employed too anxiously on verbal singularities, not to disturb, upon narrow views, or for minute propriety, the orthography of their fathers. It has been asserted, that for the law to be KNOWN, is of more importance than to be RIGHT. Change, says Hooker, is not made without inconvenience, even from worse to better. There is in constancy and stability a general and lasting advantage, which will always overbalance the slow improvements of gradual correction. Much less ought our written language to comply with the corruptions of oral utterance, or copy that which every variation of time or place makes different from itself, and imitate those changes, which will again be changed, while imitation is employed in observing them.

One might say, in the eminent lexicographer's defense, that no one was paying him to reform English spelling. The Dictionary was supposed to record contemporary good usage. That it did, and had Johnson tried to legislate a new orthography for English, he would have had few readers. However, one cannot help but imagine how different the last quarter of a millennium would have been if, in that same Preface, he had noted the unnatural and unnecessary divide between written and spoken English, and called on those who cared for the language to close the gap.

Copyright © 1997 by John J. Reilly

Why post old articles?

Who was John J. Reilly?

All of John's posts here

An archive of John's site

The Long View: Why We Need a Philosophy of History

The driver of world history. No, I'm really not kidding.

Francis Fukuyama's legacy will surely be his essay, "The End of History", or the book version thereof. In the narrow sense that liberal democracy does indeed seem to reflect a completion, or an exhaustion, of the Western political tradition, I think Fukuyama's thesis can still be broadly defended.

This was a recurring theme of John's; he called it The Perfection of the West. This title is John Reilly's gloss on the cyclical historical theories of Spengler and Toynbee. Spengler famously titled his version The Decline of the West, but John noted that Spengler himself said he could just as easily have called it the Completion or the Perfection of the West.

The idea here is that progress [in just about any fashion you want to define that] is not linear. It goes through periods of growth that result in an efflorescence of novelty, followed by long periods of stasis. However, the periods of stasis are really just as important as the periods of growth, because the times when it seems like nothing is changing are when the advances of the previous period of growth are turned into permanent features of civilization.

The periods of stasis are a winnowing, separating the wheat from the chaff. If you take the long view, you can use past experience to filter current enthusiasm through a version of that winnowing:

One of the best reasons to study philosophy is so that you know enough not to worry too much about the world historical implications of things like Prozac. Lots of drugs, notably alcohol, also produce a sense of accomplishment and self-esteem sufficient to deaden the struggle for recognition. Prozac will have to be very widely prescribed indeed before it has as much effect on the state of human consciousness as Heineken beer.

If you add in human genetics, you can probably understand the modern world very well indeed.


Why We Need a Philosophy of History

 

In the summer of 1989, Francis Fukuyama published an essay in "The National Interest" entitled "The End of History." Appearing in one of the great revolutionary years of modern history, the essay provided a Hegelian interpretation of the collapse of Eastern European Marxism and the apparent universal vindication of liberal democracy. The essay (later expanded into a book, "The End of History and the Last Man") became famous, but not because so many people leapt to embrace its thesis. The title invites attack, especially attacks that do not engage the fairly narrow meaning that "history" has in Hegelian philosophy.

On this tenth anniversary of "The End of History," Fukuyama is at it again with another essay in "The National Interest," this one entitled "Second Thoughts." To put it briefly, he says that his 1989 essay was correct on its own terms, but that those terms were wrong. He continues to assert that liberal democracy is the only possible philosophy of society that satisfies both the economic and the "spirited" sides of human nature, the latter being that aspect of the personality which craves recognition as a moral agent. Thus, liberal democracy truly is the terminus of the long struggle between "master and slave" that constitutes political history in the Hegelian sense. Fukuyama now says, however, that this terminus is not really final, because science is still progressing.

While Hegel knew that different aspects of human nature were manifest in different historical eras, still he assumed that this nature was in some sense constant. A constant human nature implied the possibility of some form of society that would optimally satisfy all its aspects. In 1989, Fukuyama announced that we at last had such a society, or at least a situation where the principles for such a society were universally acknowledged. Societies prior to liberal democracy were inherently unstable, because they could not provide for the physical needs of their members adequately, and because they were so structured as to invite struggles for personal recognition. Liberal democracy is the first society that can no longer be disturbed by these factors, but it is nonetheless mortal. Human nature may have been constant in the past, but it will probably not be in the future. Modern science is on the verge of making fundamental changes in the physical and psychological nature of the species.

Society would change dramatically, for instance, if people could be made immortal, a goal that Fukuyama says is at least conceivable in light of some recent findings in the genetics of aging. Less speculative is the use of psychoactive drugs, such as Prozac and Ritalin. These are already used on large numbers of school children, mostly boys, to control newly discovered "behavior disorders." It is not hard to imagine a world in which the struggle for recognition, or for anything else for that matter, is contained by the use of chemicals rather than by liberal economic and political institutions. This is how "soma" was used in Aldous Huxley's "Brave New World," a novel that also illustrated how reproductive technology could be used to maintain an inherently stable caste system.

For myself, I have to say that I never had much problem with the conclusion of Fukuyama's original essay, if it is understood as a statement about intellectual history. There is a sense in which Western classical music "ended" in the 19th century, just as political philosophy is supposed to have ended with Hegel. (Of course, it took until 1989 for all the alternatives to liberal democracy to be disposed of in practice, but then people persisted in composing new kinds of music after Brahms, too.) The relationship of a "final" theory of society to the actual practice of politics and economics was less clear to me. For instance, it is possible that "democracy" could persist as a venerated fossil in a world where hardly anyone bothered to vote and government was largely the business of a small corps of judges and bureaucrats, or for that matter of plutocrats and soldiers. "The End of History" in this sense means not the achievement of a state of perfection, but the admission of a failure of imagination. Thus, while I too did not quite accept Fukuyama's original thesis, I found it a valuable exercise.

The level of pure philosophical analysis found in the earlier essay, very rare in today's public life, is missing from "Second Thoughts." One of the best reasons to study philosophy is so that you know enough not to worry too much about the world historical implications of things like Prozac. Lots of drugs, notably alcohol, also produce a sense of accomplishment and self-esteem sufficient to deaden the struggle for recognition. Prozac will have to be very widely prescribed indeed before it has as much effect on the state of human consciousness as Heineken beer. The really interesting point raised by neuropharmacology is the credulity with which its claims are received. These are, in reality, based on materialist superstitions about the mind that contemporary philosophy is often unable or unwilling to combat.

Genetic and reproductive technology might seem to be a more serious issue, but I wonder whether it really presents important systemic implications. Human cloning, when it occurs, will be a misguided enterprise, but it is not going to change the nature of life as we know it. If the human genome were tampered with in such a way as to create a wholly new kind of intelligent animal, that might indeed end human history. However, as E.O. Wilson notes in one of the responses that accompany Fukuyama's article, making a new animal on purpose is very hard. Since one gene sequence is often involved in a number of somatic and behavioral expressions, you cannot change the biological characteristics of an organism to fit arbitrary specifications. As for immortality in higher organisms, if it were possible, it would occur somewhere in nature.

Francis Fukuyama was interviewed by John Horgan for the book, "The End of Science," so it is a good bet that he has at least heard the phrase. It is a little mysterious why the subject is not mentioned in "Second Thoughts." We do indeed live in a world of brilliant basic research, particularly in cosmology, and of astonishing breakthroughs in engineering, not the least of which concern genetics. Still, what we also see today, perhaps, is the beginning of a failure of the imagination that is not so different from that which began in political theory in the 19th century. Fundamentally new ideas in the physical sciences are surprisingly hard to come by. There is still a great deal of development to be done with the chief established theories, particularly in biology, and the limits of technology are very far away in most areas. However, it is not at all clear that science really has much further to go, in the sense of revealing really new things about the physical world. We may well be entering an age of synthesis rather than of exploration.

It is possible that we are not at or near the end of history, even in the narrow sense of the completion of a set of long-running trends in intellectual life and economics. It may even be foolish to speculate about such things. Still, I count myself among those who cannot help making the attempt. In this pursuit, different people find different philosophical approaches helpful. In fact, different people seem to mean different things by "philosophy." In the context of history, what philosophy means to me is viewing contemporary enthusiasms skeptically.

 

End

Copyright © 1998 by John J. Reilly

Why post old articles?

Who was John J. Reilly?

All of John's posts here

An archive of John's site

The Long View: Son of Rosemary

I like the late 90s idea that devotees of Ayn Rand might prove to be unusually resistant to the false religion of the Antichrist, because of how sweetly naive it is. Rand built up a formidable cult of personality around herself, one probably limited only by her intentional eschewing of religious elements. Thank God.

I have some inkling of this, because I too felt the siren call of Rand's individualist philosophy as a teenager. The scholarship programs aimed at high school students that encourage them to read The Fountainhead or Atlas Shrugged are persuasive genius. Intelligent high school students are the perfect targets for this kind of thing. Some small percentage are probably hooked forever.

As a teenager, I read everything I could find by and about Rand. And then I discovered how weird she really was. The best story [recounted by Greg Cochran in his recent interview] is how her adulterous lover Nathaniel Branden decided to end the affair they had been carrying on and marry a normal woman. In response, Rand required all remaining members of her inner circle [including future Federal Reserve chairman Alan Greenspan] to denounce Branden, and forsake all future association with him.

That incident, above all else, helped me see how batty it all was. I also fondly remember my parents sweetly pooh-poohing this bosh.

Which is just as well. I think the Objectivists are about as likely to end the world as anyone.


Son of Rosemary
by Ira Levin
Penguin Books, 1997
255 pages, $22.95
ISBN: 0-525-94374-9

 

Bloodfest at Tiffany's

 

One of the rules of supernatural fiction seems to be that the devil gets the best lines but the Antichrist sounds like an unpersuasive used-car salesman. This pattern holds in "Son of Rosemary," Ira Levin's long-delayed sequel to his well-known 1967 novel, "Rosemary's Baby." ("Son of Rosemary" is dedicated to Mia Farrow, who starred in the film version of the earlier book.) Mr. Levin at least has an excuse. He is perhaps best known as the author of the play "Deathtrap," the longest-running thriller in Broadway history, so it is not surprising that "Son of Rosemary" is really a murder mystery that runs on the dialogue. (The title of this review is taken from a tabloid headline in the story.) Though of course there is some action and other descriptive writing to illuminate the situation, still most of the burden of arousing our suspicions falls on the Antichrist himself. As much as his mother loves him, she thinks he sounds just too good to be true. The only problem with this technique is that an intimate family drama is not really the appropriate setting for a murder mystery whose victim is the entire human race.

As doubtless the whole world knows, "Rosemary's Baby" dealt with the birth of the Antichrist in a noted New York City apartment house that bore a more than passing resemblance to the Dakota. This building darkly and famously overlooks Central Park in Manhattan, and its reputation has grown still darker since the assassination of resident John Lennon in its lobby in 1980. In the sequel, we learn that Rosemary Reilly divorced her loathsome husband Guy, who had sold her body to the building's coven for insemination by Satan. The coven put her into a coma when the resulting child was six years old and she was secretly planning to flee with him. (The fact she stayed in the building six years is another illustration of how hard it is to find a decent apartment in the city.) Rosemary comes out of the coma 27 years later, just as the last member of the coven, a retired dentist, is run over by a taxi. She then goes about discovering what her little demon-eyed tike has been up to in the interim.

By 1999, of course, Andy is 33 years old, the same as Jesus at the time of the crucifixion. The difference is that, unlike Jesus at that age, he is the most popular man in the world. It is hard to say why this is the case, exactly. He goes around negotiating international peace agreements and encouraging people to be nice to each other, apparently to some effect, but he lives the life of the sort of media mogul whose natural environment is Manhattan Island south of 90th Street. Still, for whatever reason, most of the people in the world wear lapel buttons that say "I Love Andy" ("Love" is represented by a heart-shaped symbol). Soon they start wearing "I Love Rosemary" buttons, too. He does not ask much of his admirers. All that he requests is that everyone in the world light a candle at midnight, Greenwich Mean Time, on New Year's Eve, 1999. Exactly at 12:00 a.m. A harmless gesture. Surely.

When Rosemary comes out of her coma, she is not unreasonably dubbed "Rip Van Rosie" by the media. The interesting thing, though, is how little explanation the 1990s seem to require. Aside from personal computers and the end of the Cold War, there is not much that is really new. (One cannot help but reflect that, had this novel been written 10 or 15 years ago, it would have dealt at length with how much New York had worsened.) Certainly Rosemary's politics seem well-preserved from the late 1960s. Andy the Antichrist is in cahoots with certain easily recognizable conservative Republicans and members of the Religious Right ("Rob Patterson," for one), who want him to endorse a slightly goofy millionaire publisher for president in the presidential race of 2000. (Ah, if only they knew!) Even more remarkable than the Antichrist's friends are his enemies, who seem to consist mostly of the followers of Ayn Rand. Known generically as "P.A."s (Paranoid Atheists), they are the only people in the world who do not buy Andy's talkshow piety. The main problem they pose, however, is not that they threaten his personality cult, but that they might not light their candles with everyone else.

"Rosemary's Baby," or at any rate its popular success, is often cited as evidence for an anti-natalist streak in popular culture that is supposed to have appeared at about the time of its publication. Certainly in the United States those were the years when the Baby Boom ended, so it is not unreasonable to suggest that people might have been more open to a story that did not view the birth of a baby as an unalloyed blessed event. (Levin's 1976 novel, "The Boys from Brazil," was a high-tech version of the same theme.) Be this as it may, there are certainly none of the conventional anti-natalist motifs in "Son of Rosemary." There is no huffing and puffing about overpopulation, for one thing, though that theme is hardly unknown in eschatological fiction. There is no occasion to mention kids as a career drag, and certainly there are none of the gruesome descriptions of morning sickness that figured so prominently in "Rosemary's Baby." Of course, the whole human race is exterminated, so you could say the book illustrates the effect of a really strict population control program, but somehow I don't think that is the point.

Something else that is not the point is universal eschatology. Although the Antichrist (and of course the Anti-Mary) are the central characters, "Son of Rosemary" really has nothing to do with late 20th century beliefs about the Last Days, or for that matter the endtime beliefs of any time or place that I am aware of. In both this novel and the earlier one, we are dealing not with apocalyptic, but with the world of ritual magic. Though this sort of thing does have its demotic side, the Levin books follow the literary tradition that places it among the educated and well-to-do. Its ceremonies must fit into private apartments (however high-ceilinged), and its conspiracies are little vendettas. You cannot profitably fit an apocalypse onto a stage so small. We see the world end on television and in that spectacular view of the Park.

Still, "Son of Rosemary" is a genial book, considering the subject, and it will please people who remember the earlier novel when it was new. My memory played tricks with me as I read "Son of Rosemary." At first, I did not recall having read "Rosemary's Baby" at all; I thought that I remembered the story just from the movie. Gradually, though, I realized that I recalled information that could not have been on film, so I probably did read it while I was in grammar school. The little details are lovingly recalled in the new book. The tannis root. The scrabble. And then, of course, there is the wicked anagram, ROAST MULES. One word. No, I won't tell you.

Copyright © 1997 by John J. Reilly

 

Why post old articles?

Who was John J. Reilly?

All of John's posts here

An archive of John's site

The Long View 2004-10-28: The Last Scandal; Good Usage; Little People

Homo floresiensis from ATOR (Arc-Team Open Research)

It is a little unclear where the small hominins on Flores Island came from, but the speculation is fascinating.


The Last Scandal; Good Usage; Little People

 

There are four things to keep in mind about the Al Qaqaa Explosives Scandal:

(1) The site was interfered with before the first US units arrived in the area, so there is no way to tell when the explosives were moved;

(2) As a defense of the Administration, point (1) is irrelevant; the Coalition should have determined the status of all IAEA sites from the beginning, even if it could not secure them;

(3) The story seems to be making the electorate's gorge rise; Bush's poll numbers have actually firmed up since Kerry started to talk about it;

(4) Next time, could we please invade a country with prettier place names?

* * *

Speaking of language, Geoffrey K. Pullum at Language Log has some remarks about the evolution of the generic "they" in English:

But the fact is that singular they is becoming completely standard, at least among younger Americans, whenever the antecedent is of a sort that could in some contexts refer to either sex. I heard a radio piece about pregnant high-schoolers in which a girl said something like I think if someone in my class was pregnant I would be sympathetic to them. In such cases it's not the inability to assign sex to the referent that drives the selection of singular they, it's the mere fact of the antecedent being quantified or headed by a noun like person that can in other contexts be used of either sex.

If it was good enough for Chaucer, it should be good enough for us.

* * *

And here is a further point of usage: what do you call a Westerner who makes common cause with Islamofascists to discredit his domestic political opponents? Consider using the term "Catilinarian," after L. Sergius Catilina, the scuzzy politician of the late Roman Republic. After losing several elections for the consulship to Cicero in the 60s BC, he tried to ally his urban supporters with a Gaulish tribe to overthrow the state. Cicero, of course, was an insufferable windbag, and since we know about Catiline (sometimes spelled "Cataline" in English) mostly through what Cicero had to say about him, he may not have been quite the demon we remember. Still, he was certainly a bad enough fellow that we may use his name for invective.

* * *

Someone else with a cavalier attitude toward Classical allusions is that Other Spengler, the one who writes for the Asia Times. Speaking in praise of the principle of preemptive military action, he recently produced this exercise in alternative history:

If Kaiser Wilhelm II had had the nerve to declare war on France during the 1905 Morocco Crisis, Count Alfred von Schlieffen's invasion plan would have crushed the French within weeks. Russia's Romanov dynasty, humiliated by its defeat in the Russo-Japanese War and beset by popular revolt, likely would have fallen under more benign circumstances than prevailed in 1917. England had not decided upon an alliance with the Franco-Russian coalition in 1905. The naval arms race between Germany and England, a major source of tension, was yet to emerge. War in 1905 would have left Wilhelmine Germany the sole hegemon in Europe, with no prospective challenger for some time to come.

I don't think you can run an international system on that basis, but it may be the only way to run a postnational one.

* * *

One of the many interesting points about the discovery of Homo floresiensis is how often the term "hobbit" occurs in the press reports:

Not only did anthropologists find the skeletal remains of a hobbit-sized, 30-year-old adult female, in this fairy-tale-like discovery they also uncovered in the same limestone cave the remains of a Komodo dragon, stone tools and dwarf elephants ... Subsequent finds of other similarly sized, 3-foot-tall humans with brains the size of grapefruits in a cave on the Indonesian island of Flores suggest these 18,000-year-old specimens weren't a quirk of an ancient hominin, but part of an entire species of miniature people whose existence overlapped with that of modern Homo sapiens.

I have often wondered what would have happened to the hobbits, if The Lord of the Rings were the real past. Nothing good, I fear. It is sad to think of Samwise's remote descendants being harried into increasingly marginal savagery. On the other hand, the florensiens used to hunt Komodo dragons. As with hobbits, it may have been wise for any big people in the neighborhood to stay on their good side.

It is not clear when the florensiens became extinct. They may have been destroyed in a volcanic eruption about 12k years ago. They may have blended into modern populations, though that is questionable: the florensiens were descended directly from Homo erectus; they were not eccentric Homo sapiens. Inevitably, we are told that they may have survived into historic times, since modern people on Flores have stories about the little people who used to live in the caves.

The same argument has been made for the faery folk of northwestern Europe: maybe there was a race of small aborigines whose memory was preserved in folklore. Perhaps, but the fact is that people in Europe still see the damn things. Such apparitions could have other explanations.

Copyright © 2004 by John J. Reilly

Why post old articles?

Who was John J. Reilly?

All of John's posts here

An archive of John's site

Linkfest 2017-04-14

Brazil, like many Latin American countries, has a color spectrum instead of a color line, the result of not having anything like a one-drop rule defining who is black and who is not. This has interesting implications if you also want to have a binary white/black affirmative action program.

Noah Smith looks at the failures of macroeconomic models.

This is the kind of thing Razib Khan calls being a 'star-man', the result of genetic success. I am a bit non-plussed by the assertion in the article that Lindbergh was being untrue to his eugenic principles by fathering children with women who had difficulty walking due to a childhood illness. Susceptibility to infectious disease has some genetic component, but it is largely random, and so often has little impact on genetic fitness. I wouldn't be surprised if this kind of thing was more obvious to Lindbergh. On the other hand, maybe he was just a horndog.

The Library of Congress has a list of books that helped shape American culture. This is a pretty good list, and it seems about right to me. It is also much, much funnier if you read the list annotated with intersectional Pokemon points by Steve Sailer. Intersectionality is largely about status, which is also about class, which proponents would like you not to think about.

Joel Kotkin looks at the disenfranchisement and poverty of rural California.

A recent look at the research on whether videogames cause violence. [short answer, still no.]

A very clever bit of work in making a localizable font for displaying characters in Chinese, Japanese, and Korean languages.

Michael Anton AKA Publius Decius Mus makes an argument for a Trumpian foreign policy [one that arguably better instantiates Trump's campaign rhetoric than his actual behavior as President].

You need to be a well-educated Westerner to be surprised by this. Almost everyone else in the world is massively ethnocentric and cares only about people like themselves. A notable exception was Nelson Mandela, fellow recipient of the Nobel Peace Prize, who blew people up and spent years in prison for it, yet negotiated a political compromise that preserved the power of whites in South Africa.

Tyler Cowen riffs on Shashi Tharoor's book Inglorious Empire: What the British Did to India. Some of the claims of Tharoor's book are a little odd, such as the claim that deindustrialization was a British policy in eighteenth-century India. I'm not sure traditional artisans count as "industry".

The Long View: Omens of Millennium: The Gnosis of Angels, Dreams, and Resurrection

Ross Douthat pointed out today that atheism, as such, isn't particularly rational. For most of recorded history, gnosticism has been intellectuals' preferred alternative to classical monotheism or paganism. The argument that God is evil is a far stronger one than the argument that God doesn't exist.

Also, this paragraph:

Gnosticism

The short answer to this view is that apocalypse and gnosis usually go together. Certainly they did in Zoroastrianism, the apparent source of much of Judeo-Christian apocalyptic. It is common in religious systems for eschatology to be expressed on both the personal and the universal level. In other words, the fate of the world and the fate of individual human souls tend to follow parallel patterns, and Gnostic theology is no different. Manicheanism, for instance, had a particularly elaborate cosmology describing how the divine substance was trapped in the world of matter, forming the secret core of human souls. The hope Manicheanism offered was that someday this divine essence will all be finally released in a terminal conflagration. Details vary among Gnostic systems, but they generally hold that the creation of the world shattered God. History and the world will end when the fragments are reassembled. Often this takes the form of the reintegration of the Primal Adam, the cosmic giant whose fragments are our souls. While this aspect of gnosis can also be taken metaphorically, the fact is that Gnostic millenarianism has not been at all rare in history.

is the best summary of End of Evangelion I've ever seen. Far better than this psychoanalytic take [Freud was a fraud, dammit].


Omens of Millennium: The Gnosis of Angels, Dreams, and Resurrection
By Harold Bloom
Riverhead Books (G.P.Putnam's Sons), 1996
$24.95, pp. 255
ISBN: 1-57322-045-0

 

Getting Over the End of the World

 

Harold Bloom, perhaps, needs no introduction. A professor at both Yale and New York University, he is primarily a Shakespearean scholar who in recent years has taken an interest in religious questions in general and American religion in particular. This book is a personal spiritual meditation. Though quite devoid of index, footnotes or bibliography, it is well-informed, and the author is good about citing his sources. In fact, the book has something of the appeal of G.K. Chesterton’s historical works: the author relies on a modest selection of books with which many of his readers are probably familiar, so the argument is not intimidating. Reading it, you will learn a great deal about Sufism, Kabbalah and those aspects of popular culture that seem to be influenced by the impending turn of the millennium. You will, however, learn less about millennial anticipation than you might have hoped. The lack is not an oversight: apocalypse is a kind of spirituality that holds little appeal for Bloom. While this preference is of course his privilege, it does mean that, like the mainline churches which prefer to take these things metaphorically, his understanding of the spiritual state of today’s millennial America has a major blind spot.

Bloom's subject is his experience of "gnosis," the secret knowledge that is at once self-knowledge and cosmic revelation. The book's method is a review of different kinds of gnosis. Bloom has much to say about "Gnosticism" properly so-called, which was the religion of heretical Christians and Jews in the early centuries of the Christian era. (It would be churlish to put "heretical" in quotations marks here. The word, after all, was coined with the Gnostics in mind.) He is also concerned with contemporary popular spiritual enthusiasms. We hear a lot about the fascination with angels, dreams, near-death experiences and intimations of the end of the age that take up so much shelf-space in bookstores these days. Bloom is at pains to show that these sentimental phenomena in fact are part of a long Gnostic tradition that has engaged some of the finest minds of every age.

This aspect of the book is perhaps something of a patriotic exercise, since Bloom reached the conclusion in his study, "The American Religion," that America is a fundamentally Gnostic country, whose most characteristic religious product is the Church of Latter Day Saints. Bloom's conclusions struck many people familiar with the professed theologies of America's major denominations as a trifle eccentric, but he was scarcely the first commentator to claim that the people in the pews actually believe something quite different from what their ministers learned at the seminary. Besides, Tolstoy thought much the same thing as Bloom about the place of the Mormons in American culture, so who will debate the point?

Bloom is perfectly justified in complaining that the angels in particular have been shamefully misrepresented in America today. In the popular literature of angels, they appear as a species of superhero. They are friendly folks just like you and me, except they are gifted with extraordinary powers to make themselves helpful, especially to people in life-threatening situations. Angels in art have been as cute as puppies for so long that the popular mind has wholly lost contact with the terrifying entities of Ezechiel's vision. Bloom seeks to reacquaint us with these images, particularly as they have survived in Kabbalah and in Sufi speculation. He is much concerned with Metatron, the Angel of America, variously thought to be the Enoch of Genesis and the secret soul of the Mormon prophet Joseph Smith. His treatment of Metatron never quite rises to that of Neil Gaiman and Terry Pratchett in their novel, "Good Omens," who describe him as, "The Voice of God. But not the 'voice' of God. A[n] entity in its own right. Rather like a presidential spokesman." Nevertheless, it is good to see some hint of the true depths of angelic theology made available to the general public.

While “Omens of Millennium” is not without its entertaining aspects for people who do not regularly follow New Age phenomena, Bloom does seek to promote a serious spiritual agenda. The central insight of gnosis (at least if you believe Hans Jonas, as Bloom does without reservation) is the alienage of man from this world. We are strangers to both matter and history. Bloom despairs of theodicy. Considered with an objective secular eye, the world is at best a theater of the absurd and at worst a torture chamber. If there is a god responsible for this world, then that god is a monster or a fool. And in fact, for just shy of two millennia, Gnostics of various persuasions have said that the god of conventional religion was just such an incompetent creator. The consolation of gnosis is that there is a perfect reality beyond the reality of the senses, and a God unsullied by the creation of the world we know. The fiery angels, the prophetic dreams, the visions of an afterlife that make up much of the occult corpus are images of that true reality. They move in a middle realm, connecting the temporal and the eternal, ready to guide human beings desperate enough to seek the secret knowledge that gives mastery over them.

The people take these images literally. They believe they will not die, or that the resurrection is an event that will take place in the future. They believe that spiritual entities wholly distinct from themselves love them and care for them. They wait, sometimes with anxiety and sometimes with hope, for the transformation of this world. The Gnostic elite, in contrast, knows that these things are symbols. They understand that there is something in themselves that was never created, and so can never die. They can learn to use the images of the mid-world to approach these fundamental things, but without investing them with an independent reality. They need neither hope nor faith: they know, and their salvation is already achieved.

All of this sounds wonderfully austere. It allows for an aesthetic spirituality that avoids the twin perils of dead-between-the-ears materialism and vulgar supernaturalism. It is, one supposes, this sort of sensibility that accounts for the popularity of chant as elevator music. Neither is this spirituality without formidable literary exponents. Robertson Davies, for instance, suffused his fiction for decades with a genial Gnostic glow, marred only occasionally by a flash of contempt for the “peanut god” of the masses. Of even greater interest to Bloom, perhaps, would be the fiction of John Crowley. His recent novel, “Love and Sleep,” is entitled with the esoteric terms for the forces by which the truly divine is imprisoned in the world of matter. The story even treats in large part of Shakespeare and Elizabethan England, Bloom’s special province. If gnosis as such still seems to have a relatively small audience, this could be reasonably ascribed to its very nature as a philosophy for a spiritual elite. The problem with Bloom’s particular take on gnosticism, however, is that it is not only alien to sentimental popular religion, it is also alien to the esoteric forms gnosis has taken throughout history.

Bloom believes that gnosis appears when apocalyptic fails. This is what he believes happened in Judaism around the time of Jesus. By that point, Palestine had been bubbling with literal millenarianism for two centuries. Generation after generation looked for the imminent divine chastisement of Israel's enemies and the establishment of a messianic kingdom. This universal regime would endure for an age of the world that, thanks to the Book of Revelation, finally came to be called "the millennium." The dead would rise, the poor would be comforted, and the wicked would be infallibly punished. It was the stubborn refusal of these things to happen that prompted the strong spirits of those days to consider whether they may not have been looking for these things on the wrong level of reality. They were not arbitrary fantasies; they spoke to the heart in a way that mere history could not. Rather, they were images of realities beyond what this dark world could ever support. This was true also of the image of the apocalypse, in which this world comes to the end it so richly deserves. Apocalypse properly understood is not prophecy, but an assessment that puts this world in its place. More important, it pointed to the greater reality that lay eternally beyond the world. Bloom hints that this process of ontological etherealization is in fact the explanation for Christianity itself, since he suspects that Jesus himself was a Gnostic whose subtle teachings were grossly misinterpreted by the irascible apostle Paul.

The short answer to this view is that apocalypse and gnosis usually go together. Certainly they did in Zoroastrianism, the apparent source of much of Judeo-Christian apocalyptic. It is common in religious systems for eschatology to be expressed on both the personal and the universal level. In other words, the fate of the world and the fate of individual human souls tend to follow parallel patterns, and Gnostic theology is no different. Manicheanism, for instance, had a particularly elaborate cosmology describing how the divine substance was trapped in the world of matter, forming the secret core of human souls. The hope Manicheanism offered was that someday this divine essence would all be finally released in a terminal conflagration. Details vary among Gnostic systems, but they generally hold that the creation of the world shattered God. History and the world will end when the fragments are reassembled. Often this takes the form of the reintegration of the Primal Adam, the cosmic giant whose fragments are our souls. While this aspect of gnosis can also be taken metaphorically, the fact is that Gnostic millenarianism has not been at all rare in history.

One of the impediments to understanding apocalyptic is the secular superstition, perhaps best exemplified by E.J. Hobsbawm’s book “Primitive Rebels,” that millenarianism is essentially a form of naive social revolution. Thus, one would expect people with an apocalyptic turn of mind to be ill-educated and poor. Bloom is therefore at something of a loss to explain the ineradicable streak of millenarianism in American culture, a streak found not least among comfortable middle-class people who worship in suburban churches with picture windows. His confusion is unnecessary. Indeed, one could argue that the persistence of American millenarianism is some evidence for his thesis that America is a Gnostic country, since gnosticism is precisely the context in which apocalyptic flourishes among the world’s elites.

Sufi-influenced Islamic rulers, from the Old Man of the Mountain to the last Pahlevi Shah of Iran, have a long tradition of ascribing eschatological significance to their reigns. Kabbalah has an explosive messianic tradition that has strongly influenced Jewish history more than once, most recently in the ferment among the Lubavitchers of Brooklyn. (The tradition is itself part of an intricate system of cosmic cycles and world ages, in which more or less of the Torah is made manifest.) Regarding Christian Europe, Norman Cohn has made a special study of the Heresy of the Free Spirit, which from the 13th century forward offered Gnostic illumination to the educated of the West in a package that came with the hope of an imminent new age of the spirit. As Bloom knows, the Renaissance and early modern era, and not least Elizabethan England, was rife with hermeticists like Giordano Bruno who divided their time between political intrigue and their own occult apotheosis. The gentlemanly lodge-politics of the pre-revolutionary 18th century made a firm connection between hermetic theory and the hope of revolution (as well as providing endless entertainment for conspiracy buffs who think that secret societies like the Bavarian Illuminati are somehow immortal). Whatever else can be said about gnosis, it is clearly not hostile to apocalyptic thinking.

In the light of this history, it is hard to accept Bloom’s complacent assertion that gnosis bears no guilt because it has never been in power. It has frequently been in power, though rarely under its own name. There is even a good argument to be made that the Nazi regime was fundamentally Gnostic. Certainly Otto Wagener, one of Hitler’s early confidants, made note of his master’s admiration for the Cathars, those martyrs of the Gnostic tradition. Some segments of the SS even cultivated Vedanta. For that matter, as Robert Wistrich argued in “Hitler’s Apocalypse,” the regime’s chief aim was the expungement of the Judeo-Christian God from history. Marcion, the ancient heresiarch who rejected the Old Testament as the work of the evil demiurge, might have been pleased.

Is there a logical connection between gnosis and apocalyptic? Of course. Apocalypses come in various flavors. Some are hopeful, some are fearful, some are actually conservative. There is also an apocalypse of loathing, of contempt and hatred for the world and its history. We can clearly see such a mood in societies that nearly destroy themselves, such as Pol Pot’s Cambodia or sixteenth-century Mexico, but to a lesser degree it has also informed less extreme revolutions and upheavals throughout history. Gnosis has much in common with this mood. Gnostics at best seek to be reconciled with the world. Some seek to purify themselves of it. Others look forward to its destruction in a grossly literal fashion. More than a few, it seems, have been willing to help the process along.

Finally, at the risk of making a churlish comment about what is, after all, supposed to be a personal spiritual statement, one might question the credentials of gnosis to be the treasured possession of a true spiritual elite. Bloom mentions at one point that C.S. Lewis’s “Mere Christianity” is one of his least favorite books. One may be forgiven for wondering whether this antipathy arises because even a cursory acquaintance with Lewis’s writings shows him to have been a Gnostic who eventually grew out of it. (If you want a popular description of serious angels, no book but Lewis’s novel “That Hideous Strength” comes to mind.) As St. Augustine’s “Confessions” illustrates, gnosis may be a stage in spiritual maturity, but it has not been the final destination for many of the finest spirits. Bloom seems to think that his version of gnosis has a great future in the next century, after people tire of their current millennial enthusiasms. Perhaps some form of spirituality has a great future, but it is unlikely to be the one he has in mind.

An abbreviated version of this article appeared in the February 1997 issue of First Things magazine.

Copyright © 1997 by John J. Reilly
