The Long View: Name the Present, Name the Future

Integrated rococo carving, stucco and fresco at Zwiefalten

The identification of the present with nominalism ascendant is plausible, especially if you combine that with its handmaiden, antinomianism.


Name the Present, Name the Future

 

The term "postmodern" is an unsatisfactory way to refer to the last few decades of the 20th century. (The era itself was not altogether satisfactory, either, though not for that reason.) Postmodern is a definition-by-negation, which is rarely a good idea: consider the sad example of those atheists who devote their lives to combating their nonexistent god. Moreover, there never really was much evidence that the period was moving beyond the modern era in any serious sense. In both its popular and elite forms, the "postmodern" spirit is largely a matter of living off the achievements of the modern age by making fun of them. Postmodernity is just modernity's late phase, rather like the rococo is to the baroque.

But then, what of the term "modern" itself? Strictly speaking, any era can (and does) call itself modern. When we speak of modernity, we usually have something more specific than "the present" in mind. Even so, the term is elastic. Modernity can mean the 20th century after the First World War, or the 19th and 20th centuries, or everything after Columbus. The macrohistorian William McNeill once plausibly suggested that the modern world system actually began in 11th-century China.

It makes most sense, I think, to consider that our modern world began with the French Revolution. The era is an episode within the Enlightenment, some of whose possibilities it realized and some of which it has forever precluded. Modernity has had a great deal in common with the Hellenistic Age of the Classical West and with the Warring States period in ancient China. It is a good bet that, like those epochs, it will last rather less than three centuries. Probably some watershed like 1789 lies in the 21st century, more likely in its second half than in its first. On the other side of it, history flows in another direction.

The future will look after its own nomenclature, but I for one find it hard to resist speculation about how the future will characterize our modernity. Even if we entertain the notion that there have been analogous periods in the past, still every such era must also be unique. "Warring States" would not be appropriate for the modern West, for instance, since the era has not been one of continual warfare, but of unusually long periods of tranquillity, punctuated by apocalyptic explosions. Hermann Hesse made a better suggestion in "The Glass Bead Game," where modernity is seen from the future as the "Age of Feuilletons." That is just strange enough to happen.

Certainly the name would have to evoke the tendency toward analysis and reduction that has characterized the West these last two centuries. The great movements in intellectual life, from philosophy to economics, have been toward atomization, even as sovereign states multiplied in accordance with the principle that every little language must have its own country. The modern era is really the Age of Nominalism. As for its postmodern coda, these decades are simply the stage when nominalism achieved its natural culmination in the solipsism of language speaking itself.

This brings us to the age to come. There is ample precedent for naming undiscovered countries. "Brazil" and "Australia," for example, were appearing on maps before the territories were discovered to which those names finally stuck. ("Brazil" was a Celtic paradise, and "Australia" was the generic name for a southern continent.) In naming the future, it seems fitting to proceed with a little help from Hegel. Historical epochs really do tend to react against the excesses of their predecessors, though that is never all that they do. If the Age of Nominalism is the thesis, then any medievalist can tell you that the obvious antithesis will be an Age of Realism.

Maybe already we see the advancing shadows of a future that is more interested in synthesis than in analysis. These adumbrations take various forms, from the proposals for a "final theory" of physics to the two-steps-forward, one-step-back progress toward world government. Perhaps we see a hint of the mind of the future in E. O. Wilson's ambitious, metaphysically naive notion of "consilience," a universal structure of knowledge that would have a sociobiological backbone. More ambitious and not at all naive is the project outlined in John Paul II's "Fides et Ratio," which looks toward a harmonization of our understanding of all levels of reality, something not seen since the Thomistic synthesis. None of these projects is likely to have quite the results their proponents have in mind, but they may tell us something about the cultural climate of 2100.

 

End

 

An edited version of this piece appeared in the symposium, "What Can We Reasonably Expect?" (First Things, January 2000)

Copyright © 2000 by John J. Reilly

Why post old articles?

Who was John J. Reilly?

All of John's posts here

An archive of John's site

The Long View: A Doomsday Reader: Prophets, Predictors, and Hucksters of Salvation

Order of the Solar Temple imitating Catholic ritual

I've said before that Freud was a Fraud. One concept of Freud's that may have predictive validity, projection, is featured in this book review.


A Doomsday Reader: 
Prophets, Predictors, and Hucksters of Salvation
by Ted Daniels
New York University Press, 1999
253 Pages, $15.16
ISBN: 0-8147-1909-0

 

"Doomsday" can suggest a variety of things. Literally, the term means "Judgment Day," and in that sense it is familiar from Christian eschatology. However, without much stretching, the term is also an apt characterization for the role "the revolution" plays in the Marxist model of history. The fact that the present order of things is judged by "historical necessity," rather than by God, is inessential. In fact, this basic pattern of belief is familiar around the world and throughout history, though the mechanism that is supposed to bring about doomsday varies according to the local sense of the possible. The world we know is flawed, it will presently be destroyed, and it will be followed by a better one. This is the faith and the patience of the Saints, of Deep Ecologists, and of social revolutionaries alike.

"A Doomsday Reader" does not purport to cover the whole world, though it includes an overview of the role millennialism plays in the major world religions. Its stated goals are ambitious enough: to illustrate the modern role of "apocalypse in the West and its effects on our politics and our lives." The author is Ted Daniels, a folklorist at the University of Pennsylvania and the Director of the Millennium Watch Institute. The Institute is part of a network of organizations that have been monitoring what is loosely called "millennial fever" in the run-up to the year 2000. For instance, Dr. Daniels contemporaneously collected the popular rumors that began growing up about the Hale-Bopp Comet in late 1996. At the time, I wondered why he bothered. Then, in 1997, the Heaven's Gate cult committed mass suicide, motivated in part by these beliefs. Shows you what I know. (Nevertheless, I am mentioned in the Acknowledgments.)

The book consists of 11 brief "apocalyptic" texts of relatively recent vintage (none is earlier than the excerpt from "The Communist Manifesto"), prefaced by longer analytical essays that provide historical context. Many readers will find the final five chapters particularly useful. These deal with the major violent or suicidal groups of recent years whose beliefs incorporated a large apocalyptic element. Daniels does not attempt to devise a unified theory to explain the Branch Davidians, the Order of the Solar Temple, Aum Shinri Kyo, Heaven's Gate and the Montana Freemen. Nevertheless, he does make what may turn out to be very useful observations about the dynamics of such groups. For instance, he suggests that the reason the Freemen eventually surrendered to the authorities, while the other groups either killed themselves or tried to kill everybody else, was simply that the Freemen lacked a charismatic leader.

Daniels offers a general Freudian interpretation of the leaders of destructive apocalyptic groups that may persuade even non-Freudians. In this view, such leaders are narcissistic personalities who understand, at some level, that they are deficient. However, rather than locating the deficiency in themselves, they project it onto the world. Thus, rather than trying to heal themselves, they seek to heal the world. In extreme cases, rather than try to kill themselves, they will try to kill the world. When many people come to share such a person's projections, then you have an apocalyptic movement. (Usage varies in the terminology, but a movement is often said to be specifically "millenarian" if it seeks to help create a future age quite different from the world we know, and "millennial" if it seeks a future that is better than but continuous with the past. An "apocalyptic movement" might be any drive for fundamental change based on a "revelation" of some sort.)

"A Doomsday Reader" is of far more than historical interest. Though the number of readings is small, they and the groups that produced them are nevertheless typical of quite durable apocalyptic traditions. This is particularly the case with the global conspiracy theories that the media have released from the subcultural subcellar in recent years. One version of them, the Jewish international conspiracy, runs through both "The Protocols of the Learned Elders of Zion" and "The Turner Diaries," which were written 80 years apart and excerpted for this collection. It is particularly illuminating to read this material in conjunction with the history of the Order of the Solar Temple, which really was a secret society that aspired to exert its influence internationally. The fact that the Order was not a rousing success did nothing to dampen the conviction among conspiracy buffs of the potency of such groups. Some bad ideas just don't go away.

This is not to say that all the characteristics of apocalyptic thinking are without value. Daniels notes that apocalyptic is often an expression of the desire for vengeance, a forum where the high and mighty are brought to answer for their malefactions. For my money, at least, he does a bit of it himself, by naming some smug secular organizations as apocalyptic actors. He caps his discussion of the anti-human wing of the ecology movement with the text of the "World Scientists' Warning to Humanity," issued by the Union of Concerned Scientists in 1992. As Daniels observes, this document does not request, but requires, dramatic changes in every area of life, everywhere, if total ecological calamity is to be avoided. Indeed, this Warning has the distinction of being the bossiest text in an anthology that also includes an excerpt from "Mein Kampf." We need more of this willingness to tell the educated that, when they are looking for millenarians, they can often forget about looking in trailer-parks and just look in the mirror.

"A Doomsday Reader" does have bloopers which should have been picked up by the editor. To pick a few nits, "chiliasm" is not a Greek cognate of "millenarism," but merely its equivalent in meaning. The pyramids of Egypt were not "rediscovered" by Napoleon's armies "at the beginning of the eighteenth century," or indeed at any other time. Hegel was far more likely to have been a major influence on Comte than the other way around, since Hegel was 28 years older. More seriously, while the statement, "Augustine surrendered the world to evil," might be defended, the defense would need to engage the widely held belief that Augustine is the father of the idea of progress. Finally, though I recognize the point is really beyond the scope of the book, the fact that Chinese cultural history is little concerned with "eschatology," in the sense of the final end of history, does not mean that it lacks a conspicuous millenarian element.

Still, these are minor blemishes. "A Doomsday Reader" is a groundbreaking book. Dr. Daniels's special forte has been the mastery of the Internet as a medium for research into popular culture. The references in this book do not simply tell you about what is happening in various apocalyptic subcultures; they give you the tools to go online and watch it happening yourself. Additionally, this book could have important implications for public policy. Its close analysis of the successes and failures that the authorities have had in dealing with apocalyptic groups may help to prevent more disasters like those we have seen in the 1990s. While we may not always find other people's ideas about the imminence of the new age plausible, the fact that they think this world is about to end usually means they have some real complaints against it. We should pay more attention in the 21st century.

Copyright © 1999 by John J. Reilly


Thrawn Book Review

IMAGE CREDIT: LUCASFILM LTD.

Thrawn
by Timothy Zahn
Del Rey, 2017
$28.99; 427 pages
ISBN 9780425287071

This was everything I ever wanted out of a Star Wars novel. Timothy Zahn's Heir to the Empire Trilogy was my introduction to Star Wars novels. If this had to be the last one I ever read, I think I could be happy. Much like Rogue One, Timothy Zahn's Thrawn expertly painted in the gaps left by the original trilogy of movies, making an already great work of popular art even better.

This is so because Zahn managed to answer more questions than he posed. For example, the complete lack of aliens within the Imperial Navy gets a plausible explanation that fits in with all of the movies. Also, the process by which the Imperial Navy and Army came to be staffed and led by incompetent lackwits, despite having the resources of an entire galaxy to call on, is laid bare. With an intentionally open-ended story, this is a remarkable feat. Not least because the available narrative space for Zahn is, if anything, more limited than it was when he initially created Grand Admiral Thrawn.

In 1991, references to the Clone Wars were just that: references. Zahn ended up making choices to move his story along [without any objections at the time] that were simply abrogated when George Lucas wrote the prequels. The gap thus created was a large part of what doomed the previous iteration of Thrawn's origin story, Outbound Flight. Trying to shoehorn in the continuity of the prequels made that book plod along without any sparkle. Thus, Disney's decision to sweep away all of the previous books, comics, and videogames seems to have given Zahn an opportunity to reimagine Thrawn without being bound even by his own works.

In my opinion, this Thrawn instantiates who he was meant to be better than any earlier version. He is the best version of himself. Zahn commented on his Facebook page that this is the same Thrawn we saw before, just in a new light. Maybe so. But he sure feels a bit different to me. It has been twenty-five years since Zahn introduced us to the Grand Admiral, and Disney's decision plus the weight of twenty-five years of additional writing experience created an opportunity to make something new.

The vehicle by which we are introduced to Thrawn's early Imperial career is a political thriller [along with excerpts from Thrawn's journal]. I didn't expect that [I'm not sure what I expected], but I think it works. While we do get to see Thrawn's tactical and strategic virtuosity, the scope of his campaigns is limited by the scope of his responsibilities. What we see in abundance are the political machinations that characterized the day-to-day business of the Empire.

We get to see Imperial politics mostly through the eyes of Arihnda Pryce [a tie-in to the ongoing Star Wars Rebels series] and Eli Vanto, an Imperial ensign who speaks a common language with Thrawn and ends up trailing Thrawn throughout his career. Pryce is exactly the kind of person who prospered in the years following the end of the Clone Wars: amoral, ruthless, and calculating. Here we find the root cause of the Empire's rot. Pryce, while bright and capable, came to be Governor of Lothal because of whom she knew, and whom she had betrayed. A functioning bureaucracy requires a bit more probity than this.

Thrawn himself does not seem much bothered by the venality and incompetence of Imperial officers and politicians. Which strikes me as odd, and also as perfectly appropriate. Zahn made me feel that Thrawn was very alien. He just didn't want what I want, at least not in the same way. The dust jacket for the book features this quote, which also looms large in an incident in Thrawn's Imperial career:

There are things in the universe that are simply and purely evil. A warrior does not seek to understand them, or to compromise with them. He seeks only to obliterate them.

The things that Thrawn finds abominable, and the things he finds excusable, are very different from what almost everyone around him would choose. He clearly disliked the chaos of the late Republic, and liked the orderliness of the new Empire, despite its tyranny. I appreciated this: Thrawn really is an outsider, an alien, from a culture with a completely different point of view.

Yet at the same time he felt very familiar. The analogue I find ready to hand is the classical Romans. The Thrawn we meet in the early Empire is good at the hard virtues. His courage and stoicism are undeniable. As is his lack of pity. He is honest to the point of bluntness. He lacks the soft virtues: kindness, gentleness, compassion. He never seeks wanton destruction, but suffering as such does not faze him. Disorder does.

Scipio Africanus

The portrait of Rome I have in mind is the one Chesterton painted in the first part of The Everlasting Man. Rome represented the best of the ancient world, but it was still very different from the Christian civilization that eventually replaced it. Just, but harsh. Uncompromising and stern. And very, very good at war. Most of us modern Westerners, if we ever met a 1st-century Roman commander, would also be taken aback at things he would find obvious and proper.

Much as with Scipio Africanus, Thrawn's political adversaries tend to find him a bit of a naïf at politics. It is left ambiguous whether Thrawn is really bad at politics. All the Imperials think he is, but Clausewitz said that war is a continuation of politics by other means. Insofar as Thrawn is quite adept at manipulating his opponents on the battlefield, the idea that he cannot do the same to politicians seems strange.

It is just possible that Thrawn isn't interested, or doesn't care, because that is the way his alien mind works. He could just have a blind spot there. It is also possible that he is playing a really long game. The story I'm thinking of is about John von Neumann: that he could have a normal conversation with absolutely anyone, from a 5-year-old to one of his peers in physics and mathematics. The idea is that he was so smart that he was just simulating what normal looked like to whomever he was talking. This is like that, but with the added goal of manipulating and controlling the person you are talking to.

What that long game really is, we don't know. We know more than when we started, however, which is good enough for me. I loved this book, and I suspect many Star Wars fans will too. You might even like it if you aren't a fan. I've read a lot of Star Wars books, and I haven't liked most of them. This one is good: a thought-provoking exercise in order and justice through the mind of an enemy commander.

My other book reviews

 


The Long View 2004-11-04: Voting with their Flyer Miles; Integrity; Marketing

The continuing saga of John Reilly versus the HotLlama DVD player is pretty funny. This player doesn't seem to be commercially available anymore, but all of the Google search hits are for people complaining about it.


Voting with their Flyer Miles; Integrity; Marketing

 

Celebrities are not the only ones threatening to leave the United States because they find it ideologically uncongenial. Ordinary upper-middle-class people are more or less advanced in their plans for comfortable exile.

"I can no longer in good conscience support a nation that believes it is OK to lie to start wars," she said. "I will not live in a country where dumb and dumber are my two choices for president. I'm taking my assets out of the country and moving to Central America, where ironically, I will have more freedom to live my life without interference from a corrupt government. My husband and I will leave within four months."

Unless this woman is moving to Costa Rica, her expectations for a corruption-free future are likely to be rudely disappointed. And if she is moving to Costa Rica, she is likely to find an expatriate American community that moved there in the 1990s to escape what they perceived as creeping socialism.

Still, things could be worse. The last time emigrant fever broke out in the United States was during the early years of the Depression, when hundreds of Americans accepted offers to lend their expertise to help build socialism in the Soviet Union. For the most part, these people disappeared during the Purges.

* * *

And if you do flee the stultifying confines of the Great Republic, you may find that its politics is not as idiosyncratic as you have been led to believe:

CANBERRA, Australia, Nov. 8 (UPI) -- The issue of abortion is becoming an increasingly hot topic in Australia, with the federal treasurer claiming it is a regional, and not a federal matter...The issue arose recently when the federal health minister, deputy prime minister and other senior coalition members of parliament called for a reduction in the number of abortions, particularly late terminations.

I don't know enough about Australian federalism to say what the principled pro-life position should be there. If a matter has usually been handled locally, people will often react badly if the matter is arbitrarily preempted by the national government. Certainly the pro-abortion faction in the US never made a bigger mistake than when they federalized the issue.

In this regard, we should note that the one really objectionable thing about John Ashcroft's Justice Department has been its studied refusal to allow its pro-life litigation to be affected in any way by considerations of mere constitutional principle:

Oregon Gov. Ted Kulongoski is criticizing outgoing Attorney General John Ashcroft for appealing a federal appeals court's decision preventing the federal government from declaring that federally-controlled drugs can't be used in assisted suicides because they don't constitute a medical purpose.

On this narrow question, the governor is right: the federal government cannot control the practice of medicine in this fashion. Why did Ashcroft continually bring cases like this?

* * *

On November 9, the PBS affiliate WNET aired The Persuaders, another exposé by Douglas Rushkoff of marketers and their wicked ways. The program emphasized that the problem of "clutter" is becoming critical: there is so much advertising that ordinary ads are becoming invisible. That is why advertisers are increasingly turning to "product placement," the strategy in which products are incorporated into entertainment. The program also had the first acknowledgment I have seen in a long while that the real target of marketers is their clients. Marketers are creative types who are more interested in exercising their talents than in selling goods and services; the real challenge lies in coaxing the client to pay the marketer to amuse himself. The expression "he who pays the piper calls the tune" is a marketing slogan devised by pipers.

Critiques of this sort have been with us for 50 years, and they still have some validity. Still, I wonder whether they are becoming anachronistic, at least with regard to some topics. "The Persuaders" addressed the question of political advertising, but without once addressing the fact that this was the year when the "broadcast" model of politics began to break down. No doubt it is shabby, as the program pointed out, that Republican strategists managed to replace the term "Estate Tax" with "Death Tax" during their campaign to repeal the tax on transfers of wealth at the time of death. But is that really more important than the successful revolt in the blogosphere against Dan Rather's Texas Air National Guard hoax?

* * *

Speaking of marketing issues, I would like to take back some of the harsh things I said about the Hotllama software company in my blog entry of October 25, in which I basically said that their DVD player was the sort of software you would have expected Lovecraft's monsters to write. The Hotllama customer service department found that entry, and emailed me a friendly note of explanation. Glitches happen, and it is too much to expect every application to work seamlessly with my increasingly archaic software. Still, I would like to highlight one point that Hotllama made in its note, in response to my complaint about the amount of personal information the DVD player asked for during installation:

But as with any software, installation is necessary, but you could have opted out of adding your email address, etc. but we did require your ZIP Code. We assure you that we treat any information we received as purely and totally anonymous, always have, and always will.

Is it possible that marketers do not know that no sane person responds to this sort of prompt accurately? Or do they really think that 25% of their customers are 100-year-old women who live in Alaska?

Copyright © 2004 by John J. Reilly


Linkfest 2017-04-21

Why is this so funny?

Depending on how you do the accounting, emergency rooms aren't the most expensive way to get care.

If you combine this essay from 2016 by Rusty Reno, editor of First Things, with this 2017 article by Steve Sailer, you can get a sense of just how weird American elite universities have gotten.

Another Rusty Reno / Steve Sailer pairing, this time on how corporate and political diversity initiatives are used to shore up the status quo.

Tyler Cowen points out that, stats-wise, West Virginia isn't so bad. This is an interesting article on its own merits, but it also makes me wonder whether standard economic metrics are all they are cracked up to be.

Bryan Caplan points out that talking about IQ doesn't have to make you a monster, but in his experience it often does. Since I follow a lot of IQ/psychology/genetics researchers on Twitter, I got to see many of them questioning Caplan about this in real time.

This story is almost ten years old now, but I didn't know that the monasteries at Mt. Athos still run under their Byzantine grant.

A number of my favorites made this list: Gattaca, Screamers, and Event Horizon.

H. P. Lovecraft is a favorite author of mine. I think these are indeed good places to start.

This title is horribly misleading. This is really an article about intellectual property law, and how a clever strategy almost allowed Google to publish orphaned books.

 

The Long View 2004-11-04: Questions about Convertibility

With my recent discovery of a large number of John Reilly's book reviews and essays, I had been enjoying a respite from John's topical political commentary from thirteen years ago. Unfortunately, we need to get back at it.


Questions about Convertibility

 

As with most things in life, the important reactions to the reelection of George Bush were summed up long ago by Ambrose Bierce, in The Devil's Dictionary:

President, n. The leading figure in a small group of men of whom -- and of whom only -- it is positively known that immense numbers of their countrymen did not want any of them for president.

It would, of course, be churlish to fault the Democratic Establishment for declining to eat their ration of crow all at once. They are still in the denial phase, which is only to be expected. As students of millenarianism know, the failure of the Parousia to occur on a predicted date simply excites the people who had hoped in it to greater efforts to convert the unbelievers. This dynamic is evident in Thomas Frank's column in today's New York Times, "Why They Won":

To short-circuit the Republican appeals to blue-collar constituents, Democrats must confront the cultural populism of the wedge issues with genuine economic populism. They must dust off their own majoritarian militancy instead of suppressing it; sharpen the distinctions between the parties instead of minimizing them; emphasize the contradictions of culture-war populism instead of ignoring them; and speak forthrightly about who gains and who loses from conservative economic policy.

If something does not work, and you don't know what else to do, the natural impulse is to do it harder. That is what made the First World War what it was, and it seems a fair description of the (American) liberal strategy in the Culture War. At this point, one can only point out that the contradictions are not on the side of the Christian Realists. Thomas Frank in particular has promoted the thesis that the cultural and value issues are not real issues at all, but devices to deceive and bewilder the masses. The contradiction lies in the refusal of progressives to give even one inch on the abortion license or the normalization of homosexuality.

If the points are unimportant, then they should be conceded. If they cannot be conceded, then they must be important enough to figure prominently in public debate. I predict the points will be conceded, however much that outrages and alienates Left Reactionary elements. And then everything will change.

* * *

Many intemperate things have been said since Wednesday, when the results of the election became known. For sheer shock value, however, none exceeds Ann Coulter's blasphemy against Karl Rove:

If Rove is "the architect" -- as Bush called him in his acceptance speech -- then he is the architect of high TV ratings, not a Republican victory. By keeping the race so tight, Rove ensured that a race that should have been a runaway Bush victory would not be over until the wee hours of the morning...Seventy percent to 80 percent of Americans oppose gay marriage and partial-birth abortion. Far from appealing exclusively to a narrow Republican base..."Boy Genius" Rove decided Bush shouldn't even run radio ads on gay marriage,

And Boy Genius was right. The values issues were important in the 2004 election, but they were scarcely the only factors; one might mention the continuing low-grade world war, for instance. Just shy of a quarter of the electorate said they were voting chiefly on moral questions. Very good: but a presidential candidate who talked about nothing else would be rightly dismissed as a crank.

Regarding the gay marriage issue in particular, may I point out that the chief difficulty in combating it is that it is nonsense? One falls silent when the matter is raised, not for fear of being revealed as a bigot, but because the notion is incoherent. Arguing about it is like talking about the man who was not there. It's an embarrassment, not a controversy. The people don't want gay marriage refuted; they want it to go away.

* * *

It was with these thoughts in mind that I viewed the press conference that President Bush gave yesterday. The president said something that Bill Clinton was never brave enough to say, much less do:

"I earned political capital during the campaign, and now I intend to spend it."

Good for him, and for the most part I wish him well, but the president needs to remember that he was reelected to win the Terror War and the Culture War. The capital he has amassed is like grocery-store coupons: it can be spent only on certain things.

I am not very keen on the Administration's proposals to partially privatize Social Security. I bow to no one in my eagerness to reform the tax code, but I was distressed to hear that the president has not adopted the position that the way to reform the code is to design it to do nothing but raise revenues for the federal government. If the government must subsidize industries, then let it do so through rebates, which must be separately budgeted and authorized by Congress. As for the federal deficit, the thought of it makes me nearly frantic.

I am not alone in these reservations.

* * *

A good argument has been made (by Glenn Reynolds, probably) that what got the Republicans where they are today is the process of "disintermediation," which means the diminishing importance of the institutional gatekeepers of information. So far, at least as a political phenomenon, disintermediation has been most important in America. If Medienkritik gets his way, however, Germany will not be far behind:

In the United States, consumers have talk radio, Fox News and the blogosphere as an alternative information source to the left-leaning, "mainstream" media. In Germany, none of that exists. The deepest fear of the German media elite and the angry left is that such an alternative could emerge and compete with or even replace them...It will be the stated goal of Davids Medienkritik over the next four years and beyond to continue to offer such an alternative to the German media and to encourage and support others seeking to do so. WE WILL BREAK THIS MONOPOLY, we will provide an alternative, we will seek to bridge the widening transatlantic gap and not to deepen it. And we will do so with your help and support.

And as Germany goes, so goes Europe.

* * *

Finally, here is what that Other Spengler had to say about the election. As usual, every slap on the back from this fellow comes with the jab of a needle:

What brought 4 million more evangelical voters to the polling stations than in the previous presidential election?... It is the hard, grinding reality of American life in the liberal dystopia that makes the "moral issues" so important to voters. Partial-birth abortion and same-sex marriage became critical issues not because evangelical voters are bigots. On the contrary, parents become evangelicals precisely in order to draw a line between their families and the adversary culture. This far, and no more, a majority of Americans said on November 2 on the subject of social experimentation...Unlike the Europeans, whose demoralization has led to depopulation, Americans still are fighting against the forces of decay that threaten - but do not yet ensure - the ultimate fall of American power. That is the message of November 2.

And speaking of Spengler, would you all please buy my damn book, so you know what's going on?

Copyright © 2004 by John J. Reilly


The Long View: Fighting for Liberty and Virtue: Political and Cultural Wars in Eighteenth-Century America

21-year-old George Washington

Olasky's Fighting for Liberty and Virtue seems a bit like Fernandez-Morera's Myth of the Andalusian Paradise, a regrettably polemical look at an inherently interesting subject.


Fighting for Liberty and Virtue:
Political and Cultural Wars in Eighteenth-Century America

by Marvin Olasky
Crossway Books, 1995
$25.00, 316 pp.
ISBN 0-89107-848-7

In the Beginning Was the Future

"[Chairman Mao] went on to a comparison between his China as seen from [the Communist rebel base at] Yenan and the American Revolution as a foreign reporter might have seen George Washington at Valley Forge. ...Did George Washington have machinery? he asked. Did George Washington have electricity? No. The British had all those things and Washington did not, but Washington won because he had the people with him."

---From "In Search of History," by Theodore White, page 260.

 

While Chairman Mao was perhaps a little confused about the sequence of technological progress in the West, few people would quarrel with his general point that the American Revolution succeeded because of widespread popular support, despite the general material inferiority of the rebels to their British opponents. In this book, Dr. Olasky seems to be similarly confused about the strength of the parallels between the political climate of the Revolutionary era and that of our own time, though many of his points about the importance of non-conformist Protestantism in the politics of eighteenth century America are well taken. America is, in some sense, the same country that it was two centuries ago, so some hardy perennials of American history, such as tax protests and calls for local control, were discernible even in late colonial times. The fact is, however, that the late eighteenth century was not, as he would have us believe, a time of culture war between virtuous American Republicans and decadent British monarchists. One could argue, in fact, that he slights his own partisan interest by claiming that Presbyterians and Baptists played as strong a role in the Revolution as the Christian Right is playing today. Contemporary political Christianity is not as important as it was in the 1770s: it is much more important.

Marvin Olasky is a journalism professor at the University of Texas at Austin, editor of the Christian weekly news magazine "World," and, among other things, a general editor of the Turning Point Christian Worldview Series, published by Crossway Books and the Fieldstead Institute. One of his recent books, "The Tragedy of American Compassion," is a historical critique of welfare policy that received the special approbation of Speaker Gingrich. "Fighting for Liberty and Virtue" similarly uses history to advance a contemporary agenda. His thesis is that the American Revolution was made possible by a coalition of those interested in small government (particularly as manifested by low taxes) and those interested in holy government. The book will, no doubt, provide rhetorical ammunition with which cultural conservatives can defend themselves against the charge of injecting theological values into today's politics that would have been alien to America's founders. (There is even an Appendix with the helpful title, "Sound Bites from the 1780s for the 1990s.") The difference between polemic and scholarship is that the latter is careful to provide contrary evidence, and to state opposing views fairly. "Fighting for Liberty and Virtue" is clearly an example of the former.

The people of the Revolutionary era, we are told, favored every possible device to keep government "close to the people." For instance, as Olasky notes, colonial legislators generally served without pay, except for expenses. He contrasts this with representation in the British parliament, which he characterizes as "potentially enriching." Actually, there was nothing "potential" about the financial benefits that flowed to loyal faction members in the legislatures on both sides of the Atlantic in those years. However, he might also have mentioned that service in parliament during the eighteenth century was also without pay, except for members of the cabinet, and that it continued to be an unpaid honor until 1911. The Liberal Party government then provided a small salary, so that people who were not independently wealthy or in the employ of some interest could afford to serve. The Founding Fathers made this reform in the Constitution we have today, but then Olasky suggests, as we will see, that the Constitution was a ghastly mistake.

The book's most peculiar thesis is that the moral depravity of the ruling class of Georgian England doomed the Empire in the Revolutionary War. That eighteenth century British aristocrats were often very naughty is not in dispute. The point is easy to prove from the contemporary literature: we are talking about a time and place that produced Swift and Sam Johnson and Hogarth. And of course, the stories themselves are entertaining. Thus, we are regaled with tales of how Lord Cornbury, appointed to be royal governor of New York in the early 1700s because he was a cousin of Queen Anne, used to flounce about the ramparts of Manhattan in a woman's dress. We hear about the eighteenth century "Hell Fire" clubs (there was apparently more than one), and the odd parties that the Earl of Sandwich, who as First Lord of the Admiralty was perhaps most responsible for the neglect of the fleet that permitted the French to provide vital aid to the American insurgents, used to host at his renovated Cistercian abbey. (The incidents involving the baboon were particularly deplorable.) George Sackville, Secretary of State for North America during the Revolution and primarily responsible for grand strategy, seems to have been uncommonly fond of certain young officers, thus suggesting that even then "gays in the military" were a morale problem.

In sum, we are reminded that eighteenth century British government was rife with bribery, that business enterprises were frequently scams (as the South Sea Bubble illustrates), and that military officers embezzled and mismanaged supplies. Olasky does not claim that the American leaders were without stain. Only Samuel Adams and Patrick Henry get his unmitigated approval, whereas Benjamin Franklin, the old goat, comes in for special denunciation. (Indeed, we are told that Franklin only "changed sides" to support the colonies a year before the Revolution began.) Curiously absent from this rather Confucian tally of virtue and vice in office is the figure of George III. By all accounts, he was hardworking and well-meaning. He was so faithful to his Queen Charlotte (they had 15 children) that his less monogamous courtiers thought it odd. After the American Revolution he suffered from bouts of insanity, which doctors now believe to have been caused solely by body chemistry. (During his incapacitation, his relative virtues were made all the more evident by the appalling behavior of his son, the Prince Regent.) The story of George III bears less resemblance to that of Belshazzar than to that of Nebuchadnezzar, or even of Job.

It is simply false to say that the Revolution constituted a "culture war" between England and the United States. The leaders of both sides were a mixture of deists, agnostics, libertines and Christians of ordinary piety. The populations they led were not so different from their leaders. The British government, unlike the liberal establishment of today, had no new morality of their own devising which they hoped to impose on the colonies. The British leadership seems not to have had any ideological policy at all, beyond the maintenance of parliamentary supremacy throughout the Empire. The "culture war" of the Enlightenment started when the Revolution in France sought the end of class distinctions, traditional morality, and the expurgation of Christianity. In that war, Britain and the young United States were pretty much on the same side. (The United States also had its Jacobin "Left," of course, which agitated for direct support of revolutionary France.) Though the culture war that began with the French Revolution has suffered various mutations, that is the conflict which continues to this day.

Eighteenth century Americans were certainly amenable to the idea that the British Empire was a hopelessly corrupt institution on its way to a resounding collapse. When Edward Gibbon's "The Decline and Fall of the Roman Empire" began to appear in 1776 (the whole work was published over a period of about ten years), it soon became part of the favorite reading of America's leaders. It was cited as a mirror of the contemporary condition of Great Britain, and, at the Constitutional Convention, as a prophecy of what could happen to the United States if the participants did not do a good job. While its application as prophecy remains to be seen, as a diagnosis of Britain it was clearly wide of the mark. The British polity in the second half of the eighteenth century was a society striving to get a grip on itself. This, really, was what started the American Revolution. Parliament was trying to introduce some rudimentary fiscal and regulatory order into its haphazard Empire. The problem in America was that these reforms ran afoul of well-established traditions of self-government, with what results we know. However, in the decades immediately following the loss of the American colonies, decadent Britain went on to defeat Napoleon, reform its domestic life in the reign of a queen whose name became a byword for pious rectitude, and create the largest empire in the history of the world. Eighteenth century Britain was a disordered society. However, its disorder did not spring from moral exhaustion.

Olasky's chief focus in the book is the role of religion in eighteenth century politics, and the material he presents is worth reading. He quotes at length from the tracts and sermons of the time, sources which probably reflect the feelings and ideas of ordinary people, even of ordinary educated people, far better than do the writings of people like Franklin or Tom Paine. He is quite correct in noting that the role of religion in American history has often been shortchanged by historians. As G. K. Chesterton remarked, America is a nation with the soul of a church. This is why American history is more than usually baffling. It is almost certainly true that the Great Awakening, that strange movement of the national spirit that erupted in the 1730s and 40s, was a necessary predicate to the Revolution forty years later. Olasky quotes a Hessian officer as saying that the Revolution was essentially a revolt of Scotch-Irish Presbyterians, and certainly religious non-conformists looked on the Revolution as a way to free themselves from the government that had persecuted them or their ancestors. To the English after their victory in the Seven Years War (the French and Indian War in America), it seemed perfectly reasonable to consider establishing an Anglican bishop in America; after all, Anglicanism was the established church in several of the colonies. To American Presbyterians and Baptists, however, the very idea smacked of the crypto-papist tyranny of the Stuart dynasty.

The Stuarts had been deposed in the Glorious Revolution of 1688, an event that gets curiously scant attention from Olasky, though it was certainly much on the minds of people in the English-speaking world in the century that followed. To the Awakened, as Olasky calls those touched by the Great Awakening, that prior Revolution was a milestone in the history of Christianity. Because of it, the personal conscience of Englishmen was freed from the dictates of an established creed. To the Enlightened, which included such people as Franklin and Adams and Hamilton, it was the foundation of consensual, rational, limited government. To both groups, it provided a model for insurrection that could be put into action, if some issue arose that engaged both strands of American culture. Olasky notes some pre-Revolutionary controversies that seemed to do this. Notable among them was the "Parsons' Cause." In that case, Patrick Henry defended against a suit by clergy of the established Anglican Church in Virginia in which they sought to be paid in money rather than tobacco. (He won in that the jury awarded the parsons only a penny in damages.) This was one of those incidents, according to Olasky, that made it possible for the Awakened with their concern for holy government to join forces with the Enlightened, whose idea of good government included the need for low taxes. Persons wishing to draw comparisons with the present state of the Republican Party are invited to do so.

The problem is that, if you draw such a comparison, the apparent power of eighteenth century piety will suffer from it. The concerns about church establishment and holy government which Olasky (and others) have documented are absent from the Declaration of Independence. God is much mentioned in that document, of course, but nowhere does it suggest that He is much exercised by the prospect of an American bishop or by the cross-dressing ways of certain royal governors. Even though the recent British reorganization of neighboring Quebec is alluded to, the document does not complain about the re-establishment of Roman Catholicism as the official church there, a step that is well-known to have excited outrage from American Protestants. Olasky explains these omissions by noting that the Declaration was a compromise between the Enlightened and the Awakened. Besides, its primary drafter was that notorious free-thinker, Thomas Jefferson. Still, in a compromise you are supposed to get at least part of what you want. If we judge by the output of the Continental Congress, however, it would appear that Olasky's Awakened were not so much accommodated as co-opted.

This brings us to the question of the merit, and even the legitimacy, of the current United States Constitution. There is a long tradition in American historiography, most famously propounded by the Marxist historian Charles Beard, which holds that the Revolution itself was a popular rebellion, but the Constitutional Convention of 1787 was a sort of counter-revolution, an anti-democratic cabal of Masons and proto-capitalists intent on wresting power from the populist state governments. Olasky is of similar mind, with the addition that the Convention retreated not just from the principles of small government, but from the desire for holy government. A powerful central government, it seems, is an occasion of sin.

This assessment of the origins of the Constitution is counterfactual. If anything, it was the Continental Congress that smacked of conspiracy. It was composed, after all, of the representatives of provisional revolutionary governments, themselves with shaky legality and questionable popular support, who met to declare themselves the effective government of the continent. The Declaration it issued looked for its ultimate ratification not in ratifying assemblies, but on the battlefield. The Constitutional Convention of 1787, in contrast, was composed of the representatives of legal states. They assembled simply to draft a document, a constitution they claimed no power or right to impose themselves.

In further contrast to the Congress of 1776, the Convention discussed in detail all those issues of religious establishment and freedom of conscience which so deeply troubled pious Americans of the time. Unlike the Declaration of Independence, the Constitution does not mention God. On the other hand, it actually does make some provision for liberty of conscience by forbidding the imposition of a religious test for public office. (It also allows for "affirmations" rather than "oaths," in deference to Quaker sensibility.) The Bill of Rights, soon to be promulgated by the First Congress, would provide more protections. It is doubtless true, as Olasky suggests, that the Supreme Court today has radically misinterpreted the First Amendment as a protection from religion, rather than as a prohibition of a state church. However, the flaw there is not the Constitution, but the prejudices of the modern judiciary.

What makes Olasky's argument somewhat grotesque is the assumption that small government and holy government normally go together. If by "small government" you mean low tax, low service states with lengthy constitutions that make it almost impossible for the state to do anything at all, then there has been no lack of such regimes throughout American history, particularly in the Deep South. They have been notable chiefly for the corruption of their public officials and the poverty of their people. They have also been fantastically anti-democratic. Their relative isolation from federal oversight until the desegregation era permitted many a governor or party leader to entrench his squalid little tyranny with small danger of ever having to face a fair election. Much the same thing happened in the cities of the Northeast in the late nineteenth and early twentieth centuries. Taxes were higher, perhaps, but the urban Democratic political machines were as "close to the people" as any post-Revolutionary anti-federalist might have wished. Probably rather more so.

The Founding Fathers, who for all their faults set a standard of intelligent civic virtue unique in human history, discussed the merits of large and small government at length in the Constitutional Convention. They concluded, correctly, that small states were at least as prone to tyranny and corruption as large ones. They also saw that the attempt to turn the question of the powers of the federal government (indeed, the question of whether there should be a United States at all) into a choice of "big government, small government" was a red-herring. The real danger was anarchy and civil war, with tyranny to follow. This is one parallel with the past that today's cultural conservatives would do well to remember.

(This article originally appeared in Fidelity magazine.)

Copyright © 1996 by John J. Reilly


The Long View: Samuel Johnson and His Dictionary of Doom

Samuel Johnson may be one of the most influential figures in English orthography.


Samuel Johnson and His Dictionary of Doom

Perhaps the most common defense of the traditional orthography of English is that the spelling is supposed to reflect the etymologies of the words, and so gives useful clues to their meanings. This argument is, of course, a red herring. The orthographies of all the modern European languages take etymology into account; only in English is this an excuse to allow spellings to become so phonetically ambiguous that standard dictionaries must provide a pronunciation key for each word. There are many reasons why this condition has been allowed to persist. Among the most important is the support it received from Samuel Johnson's Dictionary of the English Language, which appeared in 1755. This great work is credited with standardizing the spelling of English for the first time, but at the cost of phonetic incoherence.

Dr. Johnson set out his principles of lexicography in the Dictionary's Preface, which, happily, is available from the Gutenberg Project. In that fascinating essay, he demonstrated a proper understanding of the use of etymology, which any reform of the writing system of a language with an ancient and extensive literature would have to employ:

Such defects [as the divergence of loan words from their roots] are not errours in orthography, but spots of barbarity impressed so deep in the English language, that criticism can never wash them away: these, therefore, must be permitted to remain untouched; but many words have likewise been altered by accident, or depraved by ignorance, as the pronunciation of the vulgar has been weakly followed; and some still continue to be variously written, as authours differ in their care or skill: of these it was proper to enquire the true orthography, which I have always considered as depending on their derivation, and have therefore referred them to their original languages: thus I write enchant, enchantment, enchanter, after the French and incantation after the Latin; thus entire is chosen rather than intire, because it passed to us not from the Latin integer, but from the French entier.

It is too much for any speaker of a major language to expect that its orthography will perfectly mirror his pronunciation; it is enough if every spelling yields a possible pronunciation. For other European languages, the standardization of orthography has gone hand in hand with a process of modifying the historical spellings to satisfy that criterion. Dr. Johnson, however, enunciated a contrary principle, to the continuing cost of English-speakers ever since:

In this part of the work, where caprice has long wantoned without controul, and vanity sought praise by petty reformation, I have endeavoured to proceed with a scholar's reverence for antiquity, and a grammarian's regard to the genius of our tongue. I have attempted few alterations, and among those few, perhaps the greater part is from the modern to the ancient practice; and I hope I may be allowed to recommend to those, whose thoughts have been perhaps employed too anxiously on verbal singularities, not to disturb, upon narrow views, or for minute propriety, the orthography of their fathers. It has been asserted, that for the law to be KNOWN, is of more importance than to be RIGHT. Change, says Hooker, is not made without inconvenience, even from worse to better. There is in constancy and stability a general and lasting advantage, which will always overbalance the slow improvements of gradual correction. Much less ought our written language to comply with the corruptions of oral utterance, or copy that which every variation of time or place makes different from itself, and imitate those changes, which will again be changed, while imitation is employed in observing them.

One might say, in the eminent lexicographer's defense, that no one was paying him to reform English spelling. The Dictionary was supposed to record contemporary good usage. That it did, and had Johnson tried to legislate a new orthography for English, he would have had few readers. However, one cannot help but imagine how different the last quarter of a millennium would have been if, in that same Preface, he had noted the unnatural and unnecessary divide between written and spoken English, and called on those who cared for the language to close the gap.

Copyright © 1997 by John J. Reilly


The Long View: Why We Need a Philosophy of History

The driver of world history. No, I'm really not kidding.

Francis Fukuyama's legacy will surely be his essay, "The End of History", or the book version thereof. In the narrow sense that liberal democracy does indeed seem to reflect a completion, or an exhaustion, of the Western political tradition, I think Fukuyama's thesis can still be broadly defended.

This was a recurring theme of John's; he called it The Perfection of the West. This title is John Reilly's gloss on the cyclical historical theories of Spengler and Toynbee. Spengler famously titled his version The Decline of the West, but John noted that Spengler himself said he could just as easily have called it the Completion or the Perfection of the West.

The idea here is that progress [in just about any fashion you want to define that] is not linear. It goes through periods of growth that result in an efflorescence of novelty, followed by long periods of stasis. However, the periods of stasis are really just as important as the periods of growth, because the times when it seems like nothing is changing are when the advances of the previous period of growth are turned into permanent features of civilization.

The periods of stasis are a winnowing, separating the wheat from the chaff. If you take the long view, you can use past experience to filter current enthusiasm through a version of that winnowing:

One of the best reasons to study philosophy is so that you know enough not to worry too much about the world historical implications of things like Prozac. Lots of drugs, notably alcohol, also produce a sense of accomplishment and self-esteem sufficient to deaden the struggle for recognition. Prozac will have to be very widely prescribed indeed before it has as much effect on the state of human consciousness as Heineken beer.

If you add in human genetics, you can probably understand the modern world very well indeed.


Why We Need a Philosophy of History

 

In the summer of 1989, Francis Fukuyama published an essay in "The National Interest" entitled "The End of History." Appearing in one of the great revolutionary years of modern history, the essay provided a Hegelian interpretation of the collapse of Eastern European Marxism and the apparent universal vindication of liberal democracy. The essay (later expanded into a book, "The End of History and the Last Man") became famous, but not because so many people leapt to embrace its thesis. The title invites attack, especially attacks that do not engage the fairly narrow meaning that "history" has in Hegelian philosophy.

On this tenth anniversary of "The End of History," Fukuyama is at it again with another essay in "The National Interest," this one entitled "Second Thoughts." To put it briefly, he says that his 1989 essay was correct on its own terms, but that those terms were wrong. He continues to assert that liberal democracy is the only possible philosophy of society that satisfies both the economic and the "spirited" sides of human nature, the latter being that aspect of the personality which craves recognition as a moral agent. Thus, liberal democracy truly is the terminus of the long struggle between "master and slave" that constitutes political history in the Hegelian sense. Fukuyama now says, however, that this terminus is not really final, because science is still progressing.

While Hegel knew that different aspects of human nature were manifest in different historical eras, still he assumed that this nature was in some sense constant. A constant human nature implied the possibility of some form of society that would optimally satisfy all its aspects. In 1989, Fukuyama announced that we at last had such a society, or at least a situation where the principles for such a society were universally acknowledged. Societies prior to liberal democracy were inherently unstable, because they could not provide for the physical needs of their members adequately, and because they were so structured as to invite struggles for personal recognition. Liberal democracy is the first society that can no longer be disturbed by these factors, but it is nonetheless mortal. Human nature may have been constant in the past, but it will probably not be in the future. Modern science is on the verge of making fundamental changes in the physical and psychological nature of the species.

Society would change dramatically, for instance, if people could be made immortal, a goal that Fukuyama says is at least conceivable in light of some recent findings in the genetics of aging. Less speculative is the use of psychoactive drugs, such as Prozac and Ritalin. These are already used on large numbers of school children, mostly boys, to control newly discovered "behavior disorders." It is not hard to imagine a world in which the struggle for recognition, or for anything else for that matter, is contained by the use of chemicals rather than by liberal economic and political institutions. This is how "soma" was used in Aldous Huxley's "Brave New World," a novel that also illustrated how reproductive technology could be used to maintain an inherently stable caste system.

For myself, I have to say that I never had much problem with the conclusion of Fukuyama's original essay, if it is understood as a statement about intellectual history. There is a sense in which Western classical music "ended" in the 19th century, just as political philosophy is supposed to have ended with Hegel. (Of course, it took until 1989 for all the alternatives to liberal democracy to be disposed of in practice, but then people persisted in composing new kinds of music after Brahms, too.) The relationship of a "final" theory of society to the actual practice of politics and economics was less clear to me. For instance, it is possible that "democracy" could persist as a venerated fossil in a world where hardly anyone bothered to vote and government was largely the business of a small corps of judges and bureaucrats, or for that matter of plutocrats and soldiers. "The End of History" in this sense means not the achievement of a state of perfection, but the admission of a failure of imagination. Thus, while I too did not quite accept Fukuyama's original thesis, I found it a valuable exercise.

The level of pure philosophical analysis found in the earlier essay, very rare in today's public life, is missing from "Second Thoughts." One of the best reasons to study philosophy is so that you know enough not to worry too much about the world historical implications of things like Prozac. Lots of drugs, notably alcohol, also produce a sense of accomplishment and self-esteem sufficient to deaden the struggle for recognition. Prozac will have to be very widely prescribed indeed before it has as much effect on the state of human consciousness as Heineken beer. The really interesting point raised by neuropharmacology is the credulity with which its claims are received. These are, in reality, based on materialist superstitions about the mind that contemporary philosophy is often unable or unwilling to combat.

Genetic and reproductive technology might seem to be a more serious issue, but I wonder whether it really presents important systemic implications. Human cloning, when it occurs, will be a misguided enterprise, but it is not going to change the nature of life as we know it. If the human genome were tampered with in such a way as to create a wholly new kind of intelligent animal, that might indeed end human history. However, as E.O. Wilson notes in one of the responses that accompany Fukuyama's article, making a new animal on purpose is very hard. Since one gene sequence is often involved in a number of somatic and behavioral expressions, you cannot change the biological characteristics of an organism to fit arbitrary specifications. As for immortality in higher organisms, if it were possible, it would occur somewhere in nature.

Francis Fukuyama was interviewed by John Horgan for the book, "The End of Science," so it is a good bet that he has at least heard the phrase. It is a little mysterious why the subject is not mentioned in "Second Thoughts." We do indeed live in a world of brilliant basic research, particularly in cosmology, and of astonishing breakthroughs in engineering, not the least of which concern genetics. Still, what we also see today, perhaps, is the beginning of a failure of the imagination that is not so different from that which began in political theory in the 19th century. Fundamentally new ideas in the physical sciences are surprisingly hard to come by. There is still a great deal of development to be done with the chief established theories, particularly in biology, and the limits of technology are very far away in most areas. However, it is not at all clear that science really has much further to go, in the sense of revealing really new things about the physical world. We may well be entering an age of synthesis rather than of exploration.

It is possible that we are not at or near the end of history, even in the narrow sense of the completion of a set of long-running trends in intellectual life and economics. It may even be foolish to speculate about such things. Still, I count myself among those who cannot help making the attempt. In this pursuit, different people find different philosophical approaches helpful. In fact, different people seem to mean different things by "philosophy." In the context of history, what philosophy means to me is viewing contemporary enthusiasms skeptically.

 

End

Copyright © 1998 by John J. Reilly


The Long View: Son of Rosemary

I like the late 90s idea that devotees of Ayn Rand might prove to be unusually resistant to the false religion of the Antichrist, because of how sweetly naive it is. Rand built up a formidable cult of personality around herself that was probably limited only by her intentional eschewing of religious elements. Thank God.

I have some inkling of this, because I too felt the siren call of Rand's individualist philosophy as a teenager. The scholarship programs aimed at high school students that encourage them to read The Fountainhead or Atlas Shrugged are a stroke of persuasive genius. Intelligent high school students are the perfect targets for this kind of thing. Some small percentage are probably hooked forever.

As a teenager, I read everything I could find by and about Rand. And then I discovered how weird she really was. The best story [recounted by Greg Cochran in his recent interview] is how her adulterous lover Nathaniel Branden decided to end the affair they had been carrying on and marry a normal woman. In response, Rand required all remaining members of her inner circle [including future Federal Reserve chairman Alan Greenspan] to denounce Branden, and forsake all future association with him.

That incident, above all else, helped me see how batty it all was. I also fondly remember my parents sweetly pooh-poohing this bosh.

Which is just as well. I think the Objectivists are about as likely to end the world as anyone.


Son of Rosemary
by Ira Levin
Penguin Books, 1997
255 pages, $22.95
ISBN: 0-525-94374-9

 

Bloodfest at Tiffany's

 

One of the rules of supernatural fiction seems to be that the devil gets the best lines but the Antichrist sounds like an unpersuasive used-car salesman. This pattern holds in "Son of Rosemary," Ira Levin's long-delayed sequel to his well-known 1967 novel, "Rosemary's Baby." ("Son of Rosemary" is dedicated to Mia Farrow, who starred in the film version of the earlier book.) Mr. Levin at least has an excuse. He is perhaps best known as the author of the play "Deathtrap," the longest-running thriller in Broadway history, so it is not surprising that "Son of Rosemary" is really a murder mystery that runs on the dialogue. (The title of this review is taken from a tabloid headline in the story.) Though of course there is some action and other descriptive writing to illuminate the situation, still most of the burden of arousing our suspicions falls on the Antichrist himself. As much as his mother loves him, she thinks he sounds just too good to be true. The only problem with this technique is that an intimate family drama is not really the appropriate setting for a murder mystery whose victim is the entire human race.

As doubtless the whole world knows, "Rosemary's Baby" dealt with the birth of the Antichrist in a noted New York City apartment house that bore a more than passing resemblance to the Dakota. This building darkly and famously overlooks Central Park in Manhattan, and its reputation has grown still darker since the assassination of resident John Lennon in its lobby in 1980. In the sequel, we learn that Rosemary Reilly divorced her loathsome husband Guy, who had sold her body to the building's coven for insemination by Satan. The coven put her into a coma when the resulting child was six years old and she was secretly planning to flee with him. (The fact that she stayed in the building six years is another illustration of how hard it is to find a decent apartment in the city.) Rosemary comes out of the coma 27 years later, just as the last member of the coven, a retired dentist, is run over by a taxi. She then goes about discovering what her little demon-eyed tyke has been up to in the interim.

By 1999, of course, Andy is 33 years old, the same as Jesus at the time of the crucifixion. The difference is that, unlike Jesus at that age, he is the most popular man in the world. It is hard to say why this is the case, exactly. He goes around negotiating international peace agreements and encouraging people to be nice to each other, apparently to some effect, but he lives the life of the sort of media mogul whose natural environment is Manhattan Island south of 90th Street. Still, for whatever reason, most of the people in the world wear lapel buttons that say "I Love Andy" ("Love" is represented by a heart-shaped symbol). Soon they start wearing "I Love Rosemary" buttons, too. He does not ask much of his admirers. All that he requests is that everyone in the world light a candle at midnight, Greenwich Mean Time, on New Year's Eve, 1999. Exactly at 12:00 a.m. A harmless gesture. Surely.

When Rosemary comes out of the coma, she is not unreasonably dubbed "Rip Van Rosie" by the media. The interesting thing, though, is how little explanation the 1990s seem to require. Aside from personal computers and the end of the Cold War, there is not much that is really new. (One cannot help but reflect that, had this novel been written 10 or 15 years ago, it would have dealt at length with how much New York had worsened.) Certainly Rosemary's politics seem well-preserved from the late 1960s. Andy the Antichrist is in cahoots with certain easily recognizable conservative Republicans and members of the Religious Right ("Rob Patterson," for one), who want him to endorse a slightly goofy millionaire publisher in the presidential race of 2000. (Ah, if only they knew!) Even more remarkable than the Antichrist's friends are his enemies, who seem to consist mostly of the followers of Ayn Rand. Known generically as "P.A."s (Paranoid Atheists), they are the only people in the world who do not buy Andy's talkshow piety. The main problem they pose, however, is not that they threaten his personality cult, but that they might not light their candles with everyone else.

"Rosemary's Baby," or at any rate its popular success, is often cited as evidence for an anti-natalist streak in popular culture that is supposed to have appeared at about the time of its publication. Certainly in the United States those were the years when the Baby Boom ended, so it is not unreasonable to suggest that people might have been more open to a story that did not view the birth of a baby as an unalloyed blessed event. (Levin's 1976 novel, "The Boys from Brazil," was a high-tech version of the same theme.) Be this as it may, there are certainly none of the conventional anti-natalist motifs in "Son of Rosemary." There is no huffing and puffing about overpopulation, for one thing, though that theme is hardly unknown in eschatological fiction. There is no occasion to mention kids as a career drag, and certainly there are none of the gruesome descriptions of morning sickness that figured so prominently in "Rosemary's Baby." Of course, the whole human race is exterminated, so you could say the book illustrates the effect of a really strict population control program, but somehow I don't think that is the point.

Something else that is not the point is universal eschatology. Although the Antichrist (and of course the Anti-Mary) are the central characters, "Son of Rosemary" really has nothing to do with late 20th century beliefs about the Last Days, or for that matter the endtime beliefs of any time or place that I am aware of. In both this novel and the earlier one, we are dealing not with apocalyptic, but with the world of ritual magic. Though this sort of thing does have its demotic side, the Levin books follow the literary tradition that places it among the educated and well-to-do. Its ceremonies must fit into private apartments (however high-ceilinged), and its conspiracies are little vendettas. You cannot profitably fit an apocalypse onto a stage so small. We see the world end on television and in that spectacular view of the Park.

Still, "Son of Rosemary" is a genial book, considering the subject, and it will please people who remember the earlier novel when it was new. My memory played tricks with me as I read "Son of Rosemary." At first, I did not recall having read "Rosemary's Baby" at all; I thought that I remembered the story just from the movie. Gradually, though, I realized that I recalled information that could not have been on film, so I probably did read it while I was in grammar school. The little details are lovingly recalled in the new book. The tannis root. The Scrabble. And then, of course, there is the wicked anagram, ROAST MULES. One word. No, I won't tell you.

Copyright © 1997 by John J. Reilly

 


The Long View 2004-10-28: The Last Scandal; Good Usage; Little People

Homo floresiensis from ATOR (Arc-Team Open Research)

It is a little unclear where the small hominins on Flores Island came from, but the speculation is fascinating.


The Last Scandal; Good Usage; Little People

 

There are four things to keep in mind about the Al Qaqaa Explosives Scandal:

(1) The site was interfered with before the first US units arrived in the area, so there is no way to tell when the explosives were moved;

(2) As a defense of the Administration, point (1) is irrelevant; the Coalition should have determined the status of all IAEA sites from the beginning, even if it could not secure them;

(3) The story seems to be making the electorate's gorge rise; Bush's poll numbers have actually firmed up since Kerry started to talk about it;

(4) Next time, could we please invade a country with prettier place names?

* * *

Speaking of language, Geoffrey K. Pullum at Language Log has some remarks about the evolution of the generic "they" in English:

But the fact is that singular they is becoming completely standard, at least among younger Americans, whenever the antecedent is of a sort that could in some contexts refer to either sex. I heard a radio piece about pregnant high-schoolers in which a girl said something like I think if someone in my class was pregnant I would be sympathetic to them. In such cases it's not the inability to assign sex to the referent that drives the selection of singular they, it's the mere fact of the antecedent being quantified or headed by a noun like person that can in other contexts be used of either sex.

If it was good enough for Chaucer, it should be good enough for us.

* * *

And here is a further point of usage: what do you call a Westerner who makes common cause with Islamofascists to discredit his domestic political opponents? Consider using the term "Catilinarian," after L. Sergius Catilina, the scuzzy politician of the late Roman Republic. After losing several elections for the consulship to Cicero in the 60s BC, he tried to ally his urban supporters with a Gaulish tribe to overthrow the state. Cicero, of course, was an insufferable windbag, and since we know about Catiline (sometimes spelled "Cataline" in English) mostly through what Cicero had to say about him, he may not have been quite the demon we remember. Still, he was certainly a bad enough fellow that we may use his name for invective.

* * *

Someone else with a cavalier attitude toward Classical allusions is that Other Spengler, the one who writes for the Asia Times. Speaking in praise of the principle of preemptive military action, he recently produced this exercise in alternative history:

If Kaiser Wilhelm II had had the nerve to declare war on France during the 1905 Morocco Crisis, Count Alfred von Schlieffen's invasion plan would have crushed the French within weeks. Russia's Romanov dynasty, humiliated by its defeat in the Russo-Japanese War and beset by popular revolt, likely would have fallen under more benign circumstances than prevailed in 1917. England had not decided upon an alliance with the Franco-Russian coalition in 1905. The naval arms race between Germany and England, a major source of tension, was yet to emerge. War in 1905 would have left Wilhelmine Germany the sole hegemon in Europe, with no prospective challenger for some time to come.

I don't think you can run an international system on that basis, but it may be the only way to run a postnational one.

* * *

One of the many interesting points about the discovery of homo floresiensis is how often the term "hobbit" occurs in the press reports:

Not only did anthropologists find the skeletal remains of a hobbit-sized, 30-year-old adult female, in this fairy-tale-like discovery they also uncovered in the same limestone cave the remains of a Komodo dragon, stone tools and dwarf elephants. ... Subsequent finds of other similarly sized, 3-foot-tall humans with brains the size of grapefruits in a cave on the Indonesian island of Flores suggest these 18,000-year-old specimens weren't a quirk of an ancient hominin, but part of an entire species of miniature people whose existence overlapped with that of modern Homo sapiens.

I have often wondered what would have happened to the hobbits, if The Lord of the Rings were the real past. Nothing good, I fear. It is sad to think of Samwise's remote descendants being harried into increasingly marginal savagery. On the other hand, the floresiens used to hunt Komodo dragons. As with hobbits, it may have been wise for any big people in the neighborhood to stay on their good side.

It is not clear when the floresiens became extinct. They may have been destroyed in a volcanic eruption about 12,000 years ago. They may have blended into modern populations, though that is questionable: the floresiens were descended directly from homo erectus; they were not eccentric homo sapiens. Inevitably, we are told that they may have survived into historic times, since modern people on Flores have stories about the little people who used to live in the caves.

The same argument has been made for the faery folk of northwestern Europe: maybe there was a race of small aborigines whose memory was preserved in folklore. Perhaps, but the fact is that people in Europe still see the damn things. Such apparitions could have other explanations.

Copyright © 2004 by John J. Reilly


Linkfest 2017-04-14

Brazil, like many Latin American countries, has a color spectrum instead of a color line, the result of not having anything like a one-drop rule defining who is black and who is not. This has interesting implications if you also want to have a binary white/black affirmative action program.

Noah Smith looks at the failures of macroeconomic models.

This is the kind of thing Razib Khan calls being a 'star-man', the result of genetic success. I am a bit nonplussed by the assertion in the article that Lindbergh was being untrue to his eugenic principles by fathering children with women who had difficulty walking due to a childhood illness. Susceptibility to infectious disease has some genetic component, but it is largely random, and so often has little impact on genetic fitness. I wouldn't be surprised if this kind of thing was more obvious to Lindbergh. On the other hand, maybe he was just a horndog.

The Library of Congress has a list of books that helped shape American culture. This is a pretty good list, and it seems about right to me. It is also much, much funnier if you read the list annotated with intersectional Pokemon points by Steve Sailer. Intersectionality is largely about status, which is also about class, which proponents would like you not to think about.

Joel Kotkin looks at the disenfranchisement and poverty of rural California.

A recent look at the research on whether videogames cause violence. [short answer, still no.]

A very clever bit of work in making a localizable font for displaying characters in Chinese, Japanese, and Korean languages.

Michael Anton AKA Publius Decius Mus makes an argument for a Trumpian foreign policy [one that arguably better instantiates Trump's campaign rhetoric than his actual behavior as President].

You need to be a well-educated Westerner to be surprised by this. Almost everyone else in the world is massively ethnocentric, and only cares about people like them. A notable exception is Nelson Mandela, a fellow recipient of the Nobel Peace Prize, who blew people up and spent years in prison for it, yet negotiated a political compromise that preserved the power of whites in South Africa.

Tyler Cowen riffs on Shashi Tharoor's book Inglorious Empire: What the British did to India. Some of the claims of Tharoor's book are a little odd, like that deindustrialization was a British policy in eighteenth-century India. I'm not sure traditional artisans count as "industry".

The Long View: Omens of Millennium: The Gnosis of Angels, Dreams, and Resurrection

Ross Douthat pointed out today that atheism, as such, isn't particularly rational. For most of recorded history, gnosticism has been the preferred alternative for intellectuals to classical monotheism or paganism. The argument that God is evil is a far stronger one than that God doesn't exist.

Also, this paragraph:


The short answer to this view is that apocalypse and gnosis usually go together. Certainly they did in Zoroastrianism, the apparent source of much of Judeo-Christian apocalyptic. It is common in religious systems for eschatology to be expressed on both the personal and the universal level. In other words, the fate of the world and the fate of individual human souls tend to follow parallel patterns, and Gnostic theology is no different. Manicheanism, for instance, had a particularly elaborate cosmology describing how the divine substance was trapped in the world of matter, forming the secret core of human souls. The hope Manicheanism offered was that someday this divine essence will all be finally released in a terminal conflagration. Details vary among Gnostic systems, but they generally hold that the creation of the world shattered God. History and the world will end when the fragments are reassembled. Often this takes the form of the reintegration of the Primal Adam, the cosmic giant whose fragments are our souls. While this aspect of gnosis can also be taken metaphorically, the fact is that Gnostic millenarianism has not been at all rare in history.

is the best summary of End of Evangelion I've ever seen. Far better than this psychoanalytic take [Freud was a fraud, dammit].


Omens of Millennium: The Gnosis of Angels, Dreams, and Resurrection
By Harold Bloom
Riverhead Books (G.P. Putnam's Sons), 1996
$24.95, pp. 255
ISBN: 1-57322-045-0

 

Getting Over the End of the World

 

Harold Bloom, perhaps, needs no introduction. A professor at both Yale and New York University, he is primarily a Shakespearean scholar who in recent years has taken an interest in religious questions in general and American religion in particular. This book is a personal spiritual meditation. Though quite devoid of index, footnotes or bibliography, it is well-informed, and the author is good about citing his sources. In fact, the book has something of the appeal of G.K. Chesterton’s historical works: the author relies on a modest selection of books with which many of his readers are probably familiar, so the argument is not intimidating. Reading it, you will learn a great deal about Sufism, Kabbalah and those aspects of popular culture that seem to be influenced by the impending turn of the millennium. You will, however, learn less about millennial anticipation than you might have hoped. The lack is not an oversight: apocalypse is a kind of spirituality that holds little appeal for Bloom. While this preference is of course his privilege, it does mean that, like the mainline churches which prefer to take these things metaphorically, his understanding of the spiritual state of today’s millennial America has a major blind spot.

Bloom's subject is his experience of "gnosis," the secret knowledge that is at once self-knowledge and cosmic revelation. The book's method is a review of different kinds of gnosis. Bloom has much to say about "Gnosticism" properly so-called, which was the religion of heretical Christians and Jews in the early centuries of the Christian era. (It would be churlish to put "heretical" in quotation marks here. The word, after all, was coined with the Gnostics in mind.) He is also concerned with contemporary popular spiritual enthusiasms. We hear a lot about the fascination with angels, dreams, near-death experiences and intimations of the end of the age that take up so much shelf-space in bookstores these days. Bloom is at pains to show that these sentimental phenomena in fact are part of a long Gnostic tradition that has engaged some of the finest minds of every age.

This aspect of the book is perhaps something of a patriotic exercise, since Bloom reached the conclusion in his study, "The American Religion," that America is a fundamentally Gnostic country, whose most characteristic religious product is the Church of Latter-day Saints. Bloom's conclusions struck many people familiar with the professed theologies of America's major denominations as a trifle eccentric, but he was scarcely the first commentator to claim that the people in the pews actually believe something quite different from what their ministers learned at the seminary. Besides, Tolstoy thought much the same thing as Bloom about the place of the Mormons in American culture, so who will debate the point?

Bloom is perfectly justified in complaining that the angels in particular have been shamefully misrepresented in America today. In the popular literature of angels, they appear as a species of superhero. They are friendly folks just like you and me, except they are gifted with extraordinary powers to make themselves helpful, especially to people in life-threatening situations. Angels in art have been as cute as puppies for so long that the popular mind has wholly lost contact with the terrifying entities of Ezechiel's vision. Bloom seeks to reacquaint us with these images, particularly as they have survived in Kabbalah and in Sufi speculation. He is much concerned with Metatron, the Angel of America, variously thought to be the Enoch of Genesis and the secret soul of the Mormon prophet Joseph Smith. His treatment of Metatron never quite rises to that of Neil Gaiman and Terry Pratchett in their novel, "Good Omens," who describe him as, "The Voice of God. But not the 'voice' of God. A[n] entity in its own right. Rather like a presidential spokesman." Nevertheless, it is good to see some hint of the true depths of angelic theology made available to the general public.

While “Omens of Millennium” is not without its entertaining aspects for people who do not regularly follow New Age phenomena, Bloom does seek to promote a serious spiritual agenda. The central insight of gnosis (at least if you believe Hans Jonas, as Bloom does without reservation) is the alienage of man from this world. We are strangers to both matter and history. Bloom despairs of theodicy. Considered with an objective secular eye, the world is at best a theater of the absurd and at worst a torture chamber. If there is a god responsible for this world, then that god is a monster or a fool. And in fact, for just shy of two millennia, Gnostics of various persuasions have said that the god of conventional religion was just such an incompetent creator. The consolation of gnosis is that there is a perfect reality beyond the reality of the senses, and a God unsullied by the creation of the world we know. The fiery angels, the prophetic dreams, the visions of an afterlife that make up much of the occult corpus are images of that true reality. They move in a middle realm, connecting the temporal and the eternal, ready to guide human beings desperate enough to seek the secret knowledge that gives mastery over them.

The people take these images literally. They believe they will not die, or that the resurrection is an event that will take place in the future. They believe that spiritual entities wholly distinct from themselves love them and care for them. They wait, sometimes with anxiety and sometimes with hope, for the transformation of this world. The Gnostic elite, in contrast, knows that these things are symbols. They understand that there is something in themselves that was never created, and so can never die. They can learn to use the images of the mid-world to approach these fundamental things, but without investing them with an independent reality. They need neither hope nor faith: they know, and their salvation is already achieved.

All of this sounds wonderfully austere. It allows for an aesthetic spirituality that avoids the twin perils of dead-between-the-ears materialism and vulgar supernaturalism. It is, one supposes, this sort of sensibility that accounts for the popularity of chant as elevator music. Neither is this spirituality without formidable literary exponents. Robertson Davies, for instance, suffused his fiction for decades with a genial Gnostic glow, marred only occasionally by a flash of contempt for the “peanut god” of the masses. Of even greater interest to Bloom, perhaps, would be the fiction of John Crowley. His recent novel, “Love and Sleep,” is entitled with the esoteric terms for the forces by which the truly divine is imprisoned in the world of matter. The story even treats in large part of Shakespeare and Elizabethan England, Bloom’s special province. If gnosis as such still seems to have a relatively small audience, this could be reasonably ascribed to its very nature as a philosophy for a spiritual elite. The problem with Bloom’s particular take on gnosticism, however, is that it is not only alien to sentimental popular religion, it is also alien to the esoteric forms gnosis has taken throughout history.

Bloom believes that gnosis appears when apocalyptic fails. This is what he believes happened in Judaism around the time of Jesus. By that point, Palestine had been bubbling with literal millenarianism for two centuries. Generation after generation looked for the imminent divine chastisement of Israel's enemies and the establishment of a messianic kingdom. This universal regime would endure for an age of the world that, thanks to the Book of Revelation, finally came to be called "the millennium." The dead would rise, the poor would be comforted, and the wicked would be infallibly punished. It was the stubborn refusal of these things to happen that prompted the strong spirits of those days to consider whether they may not have been looking for these things on the wrong level of reality. They were not arbitrary fantasies; they spoke to the heart in a way that mere history could not. Rather, they were images of realities beyond what this dark world could ever support. This was true also of the image of the apocalypse, in which this world comes to the end it so richly deserves. Apocalypse properly understood is not prophecy, but an assessment that puts this world in its place. More important, it points to the greater reality that lies eternally beyond the world. Bloom hints that this process of ontological etherealization is in fact the explanation for Christianity itself, since he suspects that Jesus himself was a Gnostic whose subtle teachings were grossly misinterpreted by the irascible apostle Paul.

The short answer to this view is that apocalypse and gnosis usually go together. Certainly they did in Zoroastrianism, the apparent source of much of Judeo-Christian apocalyptic. It is common in religious systems for eschatology to be expressed on both the personal and the universal level. In other words, the fate of the world and the fate of individual human souls tend to follow parallel patterns, and Gnostic theology is no different. Manicheanism, for instance, had a particularly elaborate cosmology describing how the divine substance was trapped in the world of matter, forming the secret core of human souls. The hope Manicheanism offered was that someday this divine essence will all be finally released in a terminal conflagration. Details vary among Gnostic systems, but they generally hold that the creation of the world shattered God. History and the world will end when the fragments are reassembled. Often this takes the form of the reintegration of the Primal Adam, the cosmic giant whose fragments are our souls. While this aspect of gnosis can also be taken metaphorically, the fact is that Gnostic millenarianism has not been at all rare in history.

One of the impediments to understanding apocalyptic is the secular superstition, perhaps best exemplified by E.J. Hobsbawm's book "Primitive Rebels," that millenarianism is essentially a form of naive social revolution. Thus, one would expect people with an apocalyptic turn of mind to be ill-educated and poor. Bloom is therefore at something of a loss to explain the ineradicable streak of millenarianism in American culture, a streak found not least among comfortable middle class people who worship in suburban churches with picture windows. His confusion is unnecessary. Indeed, one could argue that the persistence of American millenarianism is some evidence for his thesis that America is a Gnostic country, since gnosticism is precisely the context in which apocalyptic flourishes among the world's elites.

Sufi-influenced Islamic rulers, from the Old Man of the Mountain to the last Pahlevi Shah of Iran, have a long tradition of ascribing eschatological significance to their reigns. Kabbalah has an explosive messianic tradition that has strongly influenced Jewish history more than once, most recently in the ferment among the Lubavitchers of Brooklyn. (The tradition is itself part of an intricate system of cosmic cycles and world ages, in which more or less of the Torah is made manifest.) Regarding Christian Europe, Norman Cohn has made a special study of the Heresy of the Free Spirit, which from the 13th century forward offered Gnostic illumination to the educated of the West in a package that came with the hope of an imminent new age of the spirit. As Bloom knows, the Renaissance and early modern era, and not least Elizabethan England, was rife with hermeticists like Giordano Bruno who divided their time between political intrigue and their own occult apotheosis. The gentlemanly lodge-politics of the pre-revolutionary 18th century made a firm connection between hermetic theory and the hope of revolution (as well as providing endless entertainment for conspiracy buffs who think that secret societies like the Bavarian Illuminati are somehow immortal). Whatever else can be said about gnosis, it is clearly not hostile to apocalyptic thinking.

In the light of this history, it is hard to accept Bloom's complacent assertion that gnosis bears no guilt because it has never been in power. It has frequently been in power, though rarely under its own name. There is even a good argument to be made that the Nazi regime was fundamentally Gnostic. Certainly Otto Wagener, one of Hitler's early confidants, made note of his master's admiration for the Cathars, those martyrs of the Gnostic tradition. Some segments of the SS even cultivated Vedanta. For that matter, as Robert Wistrich argued in "Hitler's Apocalypse," the regime's chief aim was the expungement of the Judeo-Christian God from history. Marcion, the ancient heresiarch who rejected the Old Testament as the work of the evil demiurge, might have been pleased.

Is there a logical connection between gnosis and apocalyptic? Of course. Apocalypses come in various flavors. Some are hopeful, some are fearful, some are actually conservative. There is also an apocalypse of loathing, of contempt and hatred for the world and its history. We can clearly see such a mood in societies that nearly destroy themselves, such as Pol Pot's Cambodia, or sixteenth-century Mexico, but to a smaller degree it has also informed less extreme revolutions and upheavals throughout history. Gnosis has much in common with this mood. Gnostics at best seek to be reconciled with the world. Some seek to purify themselves of it. Others look forward to its destruction in a grossly literal fashion. More than a few, it seems, have been willing to help the process along.

Finally, at the risk of making a churlish comment about what after all is supposed to be a personal spiritual statement, one might question the credentials of gnosis to be the treasured possession of a true spiritual elite. Bloom mentions at one point that C.S. Lewis's "Mere Christianity" is one of his least favorite books. One may be forgiven for wondering whether this antipathy arises because even a cursory acquaintance with Lewis's writings shows him to have been a Gnostic who eventually grew out of it. (If you want a popular description of serious angels, no book but Lewis's novel "That Hideous Strength" comes to mind.) As St. Augustine's "Confessions" illustrates, gnosis may be a stage in spiritual maturity, but it has not been the final destination for many of the finest spirits. Bloom seems to think that his version of gnosis has a great future in the next century, after people tire of their current millennial enthusiasms. Perhaps some form of spirituality has a great future, but it is unlikely to be the one he has in mind.

An abbreviated version of this article appeared in the February 1997 issue of First Things magazine.

Copyright © 1997 by John J. Reilly


The Long View: Cthuluism and the Cold War

Despite John's protests, I think this is pretty funny, and disturbingly topical.


Cthuluism and the Cold War

 

Preface:

Some of the references in this parody are admittedly obscure. You have not only to know a bit about Lovecraft's fiction, you also have to be familiar with public affairs programming on the US Public Broadcasting System. It also helps to be up on the latest (circa 1998) twist of Cold War revisionism. Even then, of course, you may also have to be pretty easy to amuse to find any of it funny.

Well, here goes anyway. Happy Halloween!

Disclaimer
Any resemblance between living persons and the dead is deeply regretted.

 

"Welcome to the Bob Lerner News Hour. I'm your host, Bob Lerner. That's why I am telling you this.

"Tonight, our main story is something else you have probably already seen done to death on CNN: new revelations about the role of Cthuluism in American politics during the Cold War. Our guests tonight are Dr. Timothy Turnip, professor of Comparative Eschatology and author of the widely banned `McCarthy versus the Starry Wisdom Party,' and Charles Dexter Ward, publisher of `The Burrower,' a journal of disturbing political opinion."

LERNER: "Good evening, Dr. Turnip; good evening, Mr. Ward."

TURNIP: "I've been waiting for years to get on this show. What happened to the smart host?"

WARD: "Loser."

LERNER: "Dr. Turnip, can you tell us about the significance of the recently declassified sections of the Venona Codex?"

TURNIP: "The Codex proves beyond a shadow of a doubt that people like Alger Hiss and the Rosenbergs were in fact in league with unspeakable evil throughout the 1930s and `40s. We not only have names and dates, we even have Henry Wallace's fingerprints on the Silver Key."

LERNER: "And Mr. Ward, what do you have to say to that?"

WARD: "Highly mephitic, I say. This is pure American triumphalism. Maybe 100 million people have been consumed since the Old Ones returned in 1917, but that is no reason to condemn as a traitor everyone who ever attended an invocation of the Crawling Chaos. We are talking about the fundamental legitimacy of progressive politics here."

LERNER: "Dr. Turnip?"

TURNIP: "Throughout the 20th century, the term `progressive' has been the silken mask of the High Priest Not to be Described. It's people like the readers of `The Burrower' who became pacifists when the Ribbentrop-Nyarlathotep Pact was signed, but suddenly changed their minds when Hitler invaded Leng."

WARD: "This is McCarthyism of the most eldritch kind. In the 1930s, no one but the Starry Wisdom Party was doing anything in this country about racial equality and the condition of working people. That's what the Cthuluist tradition is really about."

TURNIP: "If you read the Party platform from those years, you will see that what 'equality' meant to Cthuluists was that all non-initiates were equally tasty. As for the condition of workers, you know perfectly well that the old CIO demanded that the membership surrender their souls on election day."

LERNER: "Gentlemen, please. To change the subject slightly, it is often said today that the only place that Cthuluism still finds adherents is on college campuses. Mr. Ward, would you agree with that?"

WARD: "That is a squamous calumny on multiculturalism. There are indeed a few campuses today where gender equity and anthropophagy are actively promoted by the administration, but the reality is that most institutions of higher education in this country are highly reactionary. To this day, in fact, a few colleges refuse to hire faculty who cannot tolerate direct sunlight. But doubtless this situation pleases Dr. Turnip and his neoconservative friends at Miskatonic University."

TURNIP: "The real fact of the matter is that our universities have been taken over by Black Diaper Babies."

WARD: "You know, it's people like you who see a zoog under every bed."

TURNIP: "There usually are zoogs under my bed; it's people like you who send them."

LERNER: "Dr. Turnip, isn't what you say a little extreme? Aren't you free at Miskatonic University to write and teach whatever you want about the influence of the Starry Wisdom Party?"

TURNIP: "Let me begin by saying that Dean Golder at Miskatonic has done a very good job of keeping the more obviously non-human applicants out of the tenure track, at least in the liberal arts. And it is also true that, nationally, the number of undergraduates who are inexplicably dismembered on Lammas Night has fallen to its lowest level since the late `60s. Nevertheless, the situation only grows worse and worse. Spontaneous deliquescence is now a protected condition under the Americans with Disabilities Act. Literature survey courses used to start with 'Moby Dick.' Now they start with 'The Pnakotic Manuscripts.' There's postmodernism for you. Most of these ideas are simply imported from France, where Cthulu always had a large following."

LERNER: "That brings us to an important point. Is it really fair to identify French postmodernism completely with Cthuluism?"

TURNIP: "Well, Michel Foucault did die by being torn in pieces by a nightgaunt over the Boulevard Saint Germain."

WARD: "Excuse me, but I think it is simply bigotry to invoke the tragic circumstances of Foucault's death as a way to discredit his ideas. It expresses contempt for the thousands of people who suffer similar afflictions everywhere in the world today."

LERNER: "Point taken, Mr. Ward. Let me bring this discussion to a close by asking you both about the significance of the events of 1989. Do you think that the fall of the Gate in that year permanently discredited Cthuluism as a viable intellectual option, or do you think that the Old Ones might be objects of worship again? Dr. Turnip?"

TURNIP: "I believe that the Starry Wisdom Party will continue to be discredited. The Shadow may grow again, but it will have to take a different form.

LERNER: "Mr. Ward?"

WARD: "If you knock down a Gate, you not only make a way for yourself to go out, you make a way for what is on the other side to come in. `That is not dead which can eternal lie; the struggle continues.'"

LERNER: "And there we must end it. Gentlemen, good night."

TURNIP: "What does Michael Beschloss know that I don't know, eh?."

LERNER: "YOG SOTH...er, yes, well, good evening."

Copyright © 1998 by John J. Reilly


The Long View: The Coming Age of Cathedrals

Times Square in 1978

In 2014 I speculated that John Reilly probably knew Richard Landes, because of their common interests. I managed to miss this essay of John's where he talked about meeting with Landes in New York City. I'm not sure how, since I referenced the ideas here in a couple of talks I gave at my local Catholic parish on millennialism.

This essay is also an interesting point of contact with my unreleased review of Christopher Nolan's The Dark Knight. By 2004, New York City had begun to decisively move away from the archetype of Gotham City that it had been embodying ever since the 1970s [popularly referred to as the Sixties, this trend really started about 1968 and peaked in 1973]. The fate of Times Square is a synecdoche for the city as a whole. The only time I have been to Times Square was in 1998, on a school trip, and it was less seedy than in 1978, but far less clean than John saw it in 2004, or the sanitized version we have now.

The story of how that happened is a fascinating one, and it illuminates the curious nature of American politics at present. But that is a story for another post.

I really like Richard Landes' theory that millennialism is embarrassing to most educated Westerners, while also being absolutely fascinating to almost everyone, even the people who find it embarrassing. John takes that idea here, and links it up with a great many other ideas that he often featured on his blog, and produces a truly great essay on how the idea of historical progress fits in to the broader cultural trends of the West.

Written in 1997, this essay is more optimistic than it would have been in 2017. In 1997, the United States was in the middle of an economic boom, had no serious rivals, and had not yet been humbled by 9/11. An interesting twist from the vantage of 2017 is that the optimism of 1997 really did manage to leave a number of Americans out of the increasing prosperity; but since those hidden losers of the dotcom boom were largely concentrated in the declining industrial heartland, the coastal elites largely ignored them. This was likely helped by a very robust late 90s stock market. Pensions were generally pretty strong then.

For all that, John had enough historical depth to know that good times don't last forever.


The Coming Age of Cathedrals

 

by John J. Reilly

 

I rarely have occasion to walk through Times Square in Manhattan. My visits to the city usually have to do with business to the east or south. That was why, when I walked through the area one morning in the summer of 1996 on one of my even rarer visits to Lincoln Center, I was taken aback by how much the place had improved since the last time I saw it. It was still noisy and crowded (it is hard to imagine that location in the city being otherwise), but the area was clean. The facades and many of the buildings were new. If any of the stores specialized in pornography, they were discreet about it. Shabby persons did not wait under the eaves of storefronts to offer goods and services to passersby. There was a cop or security guard on every other corner.

If the dark, dystopic film "Blade Runner" is the popular image of the future American city, then here was a city that was evolving in a different direction. It was not just Times Square, of course. The reason you see few politicians trekking to the South Bronx these days is that the burned-out neighborhoods that provided such dramatic photo opportunities for several election cycles have been substantially torn down and rebuilt. Actually, good news like this seems to crop up more and more these days, in areas ranging from medicine to crime statistics. Like many of the people who pass through Times Square each day, I generally just note the improvement and continue on my way. That morning, however, I was going to a meeting that gave me reason to consider such things in a broader context.

I was in Manhattan to speak to one Richard Landes, a medievalist from Boston University and an authority on the year 1000. With each year's calendar getting closer to the double-millennium figure, this previously obscure subject is becoming increasingly topical. It is already fashionable to attribute this or that event to "millennial fever." (In a way, that is what I am going to do here.) Anyway, we were meeting to talk about several of the academic projects that are in the works in connection with the upcoming turn of the millennium.

Landes is something of a revisionist. Like many revisionists who seek to overturn the accepted wisdom on a subject, his new interpretation is a dialectical synthesis that strongly resembles the view of the matter which preceded the accepted wisdom he is revising. For reasons which I trust I will be able to make clear, his ideas about the 11th century may have a great deal to do with the once and future Times Square.

It was perhaps the nineteenth[-century] historian, Jules Michelet, who was most responsible for popularizing the idea of the "terrors of the year 1000." You can find contemporary, or nearly contemporary, chronicles of the period which describe the people of Western Europe as living in an agony of apocalyptic expectation. There are accounts of civil disturbances, of grotesque acts of mass public repentance, of popular prophets and their crazed followers. All in all, Michelet made the turn of the millennium sound like the sixteenth century on a particularly bad day. By the beginning of the twentieth century, historians realized there was something fishy about this picture. For one thing, while these accounts turn up in some historical literature from the period, they do not dominate it. More generally, Western Europe in the decades following the year 1000 really did not act like a society that was paralyzed by fear of the imminent end of the world, or that was disappointed by the failure of its eschatological schedule.

The 11th century was the time when the great cathedrals began to go up and the crusades were launched, following decades of increasing contact with Byzantium and the Levant. Western Christendom in those decades was an expanding, curious, inventive society. To that extent, it did resemble the Western Europe of the sixteenth century. However, this earlier age of discovery and change was not characterized by the dark disasters of the century that followed Columbus and Luther. This perhaps is the chief reason why for nearly a hundred years historians have generally believed that the "terrors of the year 1000" existed largely in the minds of the nineteenth century Romantics.

Well, maybe not. Landes and other medievalists are taking a third look at the primary sources, and finding both more and less in them than did their predecessors. It is true that nothing happened around the beginning of the second millennium on the order of the Peasants' War in sixteenth-century Germany. (For purposes of eschatological anxiety, by the way, the millennium did not turn in an instant. The year 1033, for instance, was at least as good a year for the Second Coming in the minds of apocalyptic literalists as was the year 1000.) On the other hand, it is not hard to find discussion about questions of universal eschatology in the writings of the period. Evidence of popular interest in these questions is fragmentary, but it is there. More accessible is the scholarly debate which arose about when the age might be expected to end.

Landes believes he detects a degree of censorship among the writers of the period in favor of Augustine's model of history. Whatever else might be said of Augustine's ideas about the end of the world, certainly they tended to downplay the catastrophic and revolutionary. (Violent, popular endtime belief is sometimes characterized as "millenarian," to be distinguished from the less dramatic "millennialism" with which Augustine is often associated.) Augustine, in most interpretations of him, preserved the events of the Endtime depicted in Revelation and the Prophets as literal expectations for the indefinite future. However, his system (to the extent he had one) was very wary of any attempts to discern eschatological significance in the events of secular history.

The medieval Latin Church, in its eschatology as in so much else, was at least nominally Augustinian. The Church around the year 1000, however, dealt in two ways with what probably was perceived to be a crisis of apocalyptic expectation. The immediate response was to deal with millenarianism on its own terms. The more long-term and more important response, however, was to transform apocalyptic into theodicy.

The proper Augustinian reaction to millenarian enthusiasm, particularly to enthusiasm sparked by calendrical considerations, is to declare the time of the end to be unknowable. What many of the authorities around the year 1000 did, however, was to quibble about chronology. Thus, accepting for the sake of argument the old thesis that the world would last 6,000 years, they answered doom-mongers with estimates for the age of the world that put the beginning of the seventh millennium a comfortable distance into the future. Such arguments were not always wholly convincing on their merits, and they did have the disadvantage of leaving time bombs for later Augustinians. (The excitement about the year 1000 was perhaps a time bomb planted by Augustine himself.) Be that as it may, such arguments sufficed for the immediate occasion, and they probably did contribute to the pacification of millenarian sentiment, especially among the lower clergy.
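To make the chronological arithmetic concrete (the worked example is mine, though the figures are the conventional ones): the older, Septuagint-based reckonings of the age of the world placed creation at roughly 5500 BC, which put the fateful year 6000 of the world at about AD 500; later recalculations, such as Bede's dating of creation to 3952 BC, deferred the year 6000 to a comfortably remote AD 2048. A clerk armed with the right chronicle could thus always show that the seventh millennium was nobody's immediate problem.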

On the other hand, there is a great deal more to Augustinian eschatology than the suppression of other people's enthusiasms. Augustine is sometimes called "the father of progress." This view can be exaggerated, as it was perhaps in Robert Nisbet's "History of the Idea of Progress." Certainly St. Augustine's ideas about the future bore little resemblance to those of, say, the Fabian socialists. Nevertheless, there is a great deal to be said for the proposition that his model of time is the basic template on which more specific ideas about history can form, of which progress is simply one instance.

Augustine freed time from the constriction of an imminent eschaton, thereby making history a theater of grace and will. Augustinian history need not be progressive, but it can be. In fact, it has a predilection to be progressive under certain circumstances. The Augustinian view of time is not unique in being linear or in its suspicion of revolutionary enthusiasm. Neo-Confucianism, for instance, has these characteristics. For that matter, Neo-Confucian historiographers, like Landes's turn-of-the-millennium ecclesiastics, did indeed tend to de-emphasize or mischaracterize popular millenarian movements. What makes Augustinianism different is its ability to impart meaning to favorable historical trends.

Although the idea of historical progress has received more than its share of derision in recent years, the fact is that many facets of history, and even whole historical eras, really are progressive. The statistics on population growth and economic output in certain parts of the world often rise steadily for a long time. New arts and sciences appear and are perfected over the course of a few centuries. These things were almost as true of the Hellenistic Age as they were of the West in the nineteenth and twentieth centuries. Yet notoriously the ancients were without an idea of progress, despite the fact that at least part of their history was progressive by any measure. Other fortunate times and places have suffered from a similar lack of imagination.

Was Western Christendom at the turn of the first millennium the first society to take steps toward giving historical meaning to "progress," to great social enterprises terminating only at a horizon of unguessable distance? Naturally, the classical nineteenth-century idea of progress is no more medieval than it is Hellenistic or Neo-Confucian. However, the cathedrals and the crusades may stand as symbols for a wider cultural assumption that social development can be a moral enterprise, perhaps even a morally necessary enterprise. Such a conviction would be far less deterministic than, for instance, the theology of the Social Gospel. Socially progressive Christianity in this century has demanded progress from history. Augustine merely hoped for it. Perhaps he did not hope for very much, just that the Vandals would go away and that future emperors would be more edifying. Nevertheless, he hoped with good reason.

Whatever the validity of these reflections with respect to the 11th century, certainly this interpretation of Augustine is alive and well and being expounded from the Throne of St. Peter. John Paul II's 1994 apostolic letter on the celebration of the coming turn of the millennium, "Tertio Millennio Adveniente," can hardly be described as a millenarian document. Nevertheless, it looks forward to the turn of the century as far more than a peculiarly obvious occasion for historical commemoration. For reasons which are perhaps intuitive, the Pope anticipates that the beginning of the next millennium will be a time of novel significance in the history of salvation. The letter puts the Second Vatican Council into perspective as a providential event whose true significance was to prepare for this new era. The specifics of the document are concerned with how the Church should ready herself to take advantage of these coming opportunities.

If in fact the next century is another time of constructive hope, future historians who attend to such things will probably see this transformation as in part a reaction to the dark images of the future that have prevailed since the 1960s. While the '60s were a time of exhilaration for the young, we should not forget that one of the tenets of the Counter Culture of that era was the coming collapse of civilization. William Irwin Thompson perhaps best captured the mood of the period in his still-interesting book, "At the Edge of History" (1971). Visiting the Esalen Institute retreat center in the summer of 1967, he learned that the end of civilization was not only expected by the hippies of nearby San Francisco, it was devoutly hoped for. Many of the people attending Esalen with him that summer, who, like himself, would soon become prominent figures in the nascent New Age movement, were of similar if subtler mind. They were less likely than the hippies to put their faith in predictions of world-changing earthquakes or in the public arrival of the flying saucers. Instead, they anticipated salutary effects from the breakdown of American society from more conventional causes. After an era of "broken-back" technocracy, they expected a new spiritual age to emerge. The immediate future did not turn out the way that the budding opinion-makers of those years anticipated, but as with so much else about the Counter Culture, their view of the future became a popular orthodoxy.

The hope for a new spiritual age has waxed and waned, but the expectation of a future with a broken back has shown a quarter century of resilience. Examples of it can be found from before the 1970s, of course. It is related to "post-apocalypse" stories, tales built around the idea of a new barbarism that arises after some great catastrophe, usually a world war. H.G. Wells's novel "The Shape of Things to Come" (1933) may be the classic of the genre, despite the fact that it antedates the invention of nuclear weapons (which were another one of Wells's ideas, but that is another story). However, while many of these stories, including Wells's, are about the rebuilding of civilization, the broken-back future is about civilization's progressive darkening. It depicts a society in which social chaos coexists with high technology. Among its prominent literary exponents is Doris Lessing, in such novels as "The Memoirs of a Survivor" and "Shikasta." It achieved a somewhat cultish respectability in the work of J.G. Ballard. It appears in short novels by John Crowley, notably "Engine Summer." As for cinema, we find it in films from "Soylent Green" to the appallingly influential "Blade Runner." The influence of the "Mad Max" series has, of course, long been inescapable. In fact, in recent years it has become difficult to find fictional presentations of the near future that do not feature decaying cities, a ruthless ruling class, economic collapse and impending ecological catastrophe.

These images have effects beyond the arts. They informed, though of course they did not determine, a great deal of the social and economic thought of the last twenty years. The "declinist" school of geopolitics, most notably associated with Paul Kennedy's provocative book "The Rise and Fall of the Great Powers," sometimes seemed to depict a future for America more than a little like the shabby International Style cities of the "Max Headroom" television series. On a more serious level of ethical reflection, John Lukacs' "The End of the Twentieth Century" anticipated a new dark age, in which ethnic nationalisms clash in the twilight like street gangs with national anthems. This image of the geopolitical future is not original with Lukacs. It is close to being the consensus view of the post-Cold War world.

One might be tempted to see these motifs as simply reflections of America as it changed during the Reagan Administration, but they antedate the Reagan years. It is, indeed, quite likely that they affected the way those years were reported. Certainly they affect the way America is perceived by its public officials at this writing. Charles Lane, writing recently in "The New Republic," describes how a group of staffers from the Clinton White House met for a briefing by a German economist to get some international perspective on U.S. economic policy. They had come expecting a lecture on the comparative disadvantages of slovenly American work habits and the lack of coherent U.S. industrial policy. They quickly became highly disoriented. The economist lamented the way things are done in his own country. He came close to depicting the United States as a Shangri-La of job growth and technological innovation. The staffers were at a loss to know what to say.

America is not Shangri-La, nor likely to become such a place anytime soon. However, it is also no longer the country in desperate need of restructuring that it was in 1976. At various levels, the realization that this is the case is gradually seeping into elite opinion. It is, perhaps, even affecting public administration, as projects like the renovation of Times Square illustrate. If you expect Mad Max to rule the future, then you are unsurprised by the decay of public places and disinclined to do much about it. What is perhaps most interesting about American culture today is the revulsion, sometimes inarticulate but increasingly clear, against the assumption of a dark future.

Regarding the material side of things, the case for a merry beginning to the next millennium was recently put by the economist (and senator's nephew) Michael Moynihan in "The Coming American Renaissance." Even if the title is over-optimistic, nevertheless it is useful to have a handy compendium of good news that is only gradually becoming reportable.

Economics is not everything, of course. If you want a positive image for American society as a whole in the next century, you could do much worse than to consult William Strauss and Neil Howe's "Generations." It appeared in 1991, and it apparently has something of a cult following. It stays in print for good reason, since its anachronistic forecasts of declining crime rates and rising academic performance have proven remarkably accurate.

The book is another attempt to interpret American history as a recurring sequence of generational psychologies. The elder and younger Arthur Schlesingers tried this with a model using two types of generation, whereas Strauss and Howe use four. Strauss and Howe's thesis owes its popularity in large part to its description of Generation X as a set of demographic cohorts fated to be misfits and tragic heroes. They are like the Lost Generation of the 1920s, and thus more noble than the be-ringed and be-whiskered zombies they appear to be at first sight. The Xers, it seems, are destined to be the parents of a new "civic" generation, one that could accomplish works of daring and organization in the next century as great as those accomplished by the "civic" generation that began to come of age about 1940. They will be the sort of people who could colonize Mars, create universal peace and end poverty. They will have their faults, just as the World War II generation did. Still, any future they would create would be more in the spirit of the 11th century than in that of "Blade Runner." One hopes that at least the lighting will be brighter than in the film.

Attempts to predict the future are best kept to the briefest of outlines, unless you want to afford amusement to people who live in the future you attempt to describe. Certainly these reflections have been at a sufficiently high level of abstraction to protect them from disconfirmation by grubby facts. All I am suggesting, really, is that if the turn of the second millennium is significantly similar to the turn of the first, then we should look for a dynamic century of hope and progress on many levels. On the other hand, these reflections have also been too specific, since they referred mostly to the United States. A millennial future would involve the whole West as well, since all our civilization runs to some degree on the same historical clock.

Many people make a point of ignoring the current pope, including some who work in the same building as he does. In this case, however, I wonder whether he may not have sniffed a change in the eschatological wind. We live at the end of a chaotic interlude. That it was going to end should not have surprised us: few conditions are so ephemeral as chaos. Order always reasserts itself, whether in international politics or in personal mores. This insight is likely to be a commonplace of the other side of what John Paul II calls the "Holy Door" of the year 2000.

End

Copyright © 1997 by John J. Reilly


The Long View: What If the Second Temple Had Survived AD 70?

A fun bit of alternative history exploring the likely impact of the survival of the Second Temple upon the religion and politics of the Middle East.


What If the Second Temple Had Survived AD 70?

 

This note takes issue with Donald Harman Akenson's recent book, "Surpassing Wonder: The Invention of the Bible and the Talmuds." You can find my review of the book by clicking here

--John J. Reilly

 

----------------------------------------------------------------------------

 

Akenson's governing assumption is that the key event that created Christianity and Rabbinical Judaism was the destruction of the Temple at Jerusalem in AD 70. Actually, he holds that there never was such a thing as non-rabbinical Judaism. Akenson uses the word "Judahism" to refer to the religion of Yahweh that existed in Palestine between the end of the Babylonian Captivity in the sixth century BC and the fall of Jerusalem to the Romans. This was a religion of very many sects, which often had little in common and sometimes were mutually hostile.

One growing sect after about AD 30 was the Jesus Faith. Another was the closely related (and therefore antagonistic) movement known to us as Pharisaism. (Akenson makes the interesting observation that we know of just two self-proclaimed Pharisees. One was St. Paul, the other was Flavius Josephus, the turncoat author of "The Jewish War.") Like the rest of Judahism, these two groups greatly revered the Temple, and their religious practice was closely connected with it. According to Akenson, it was only the destruction of the Temple that made it possible for them to become separate religions. They then set themselves to replace the physical temple with mental temples. Thus, the Christian scriptures came to refer to Jesus as the Temple, while the rabbis came to equate studying the rituals that had been performed in the Temple with actually conducting them.

The year AD 70 (well, the Roman-Jewish War of AD 66-73) is a comforting landmark to historians of religion. God alone knows precisely when Jesus was born or what the Sadducees really believed. To study the first century, scholars of religion must interpret and reinterpret partisan texts of ambiguous provenance, all while living in terror that someone will blow their beautiful theories to smithereens. (As, indeed, they themselves plan to do to the theories of their colleagues.) For the Jewish War, in contrast, they have vivid first-person accounts and sober descriptions by the standard historians of the second century. Scholars are greatly tempted to attribute decisive significance to this event for the perfectly understandable reason that they happen to know a lot about it.

The problem is that the fall of the Temple need not have been decisive for the history of either Christianity or Judaism.

The case of Christianity need not detain us. It is possible that the whole of the "Jesus Faith" was reconfigured after AD 70 to show that it had always been independent of its homeland. Maybe all that the earliest Jesus People wanted was to add a little filigree about the Messiah to their Temple-based religious practice. Perhaps the entire canon of the New Testament grossly misrepresents both the life of Jesus and the careers of the Apostles, particularly that of St. Paul. Well, maybe. The problem with this sort of argument is like the problem with the argument that God created the world in 4004 BC, fossils and all, to look as if it were billions of years old. The fact is that the texts of the New Testament say what they say. They do not suggest that the Temple was central to the concerns of the earliest Christians, or even to Jesus himself. If the New Testament is judged to be wholly misleading on this matter, then fancy can wander freely. However, the result will have nothing to do with history.

With Judaism, the matter is more complicated. The Mishnah, the code of the "oral law," does consist in large part of loving recollection of the structure of the Temple and the rites performed there. Prayers for the reconstruction of the Temple featured in public and private devotions for centuries. These observations, however, do not address the question of whether this preoccupation could have developed even if the Temple had not been destroyed.

The obvious analogy is Islam. Like Judaism before AD 70, Islam has a ritual center, in Mecca. It has a legal tradition, the Sharia, which resembles the Babylonian Talmud in seeking to be completely comprehensive both of secular life and religious practice. It has a Book, the Koran, which like the Torah is held to be a special, textual revelation from God. If anything, the Koran is even more insistent on the importance of the ritual center at Mecca than is the Jewish canon about Jerusalem, since the Koran enjoins Muslims to make a pilgrimage to Mecca if they possibly can.

Something else that Judaism and Islam have in common is that their adherents have been spread out all over the world for a very long time. This was true of Judaism (let us forget this "Judahism" hypothesis) even during the period of the Second Temple. This is not the kind of thing you would normally expect of a cult tied to a particular place, which is what is usually meant by a "temple religion." The religion of the Classical world, like that of much of the Far East today, is built around the particular shrines of local gods. Grand abstractions like "Zeus" or "Shiva" are really for poets. The piety of the practitioners of these cults is always local. They worship the god of one temple because he is the god of where they live. If they travel, then naturally they worship the gods of the places through which they pass. To do otherwise would seem nonsensical.

In contrast, what Judaism and Islam, as well as Christianity and some forms of Buddhism, have in common is that they are fairly portable. You can find God wherever you are, and if a holy book directs your attention to a sacred site on the far side of the world, then the site's sacredness comes from the book and not the other way around. This is true today in the case of Islam, even though a ritual center is an important part of its theology. It also has been true of Judaism since the Babylonian Captivity. The term for this is monotheism, and it has more to do with how a religion works than do the details of its ritual dimension.

That said, though, it is hard to imagine that the destruction of the Second Temple did not have some effect on the evolution of Judaism. Here is what might have happened if the Angel of Death had passed over the Temple in AD 70.

It is not difficult to imagine a history in which the Temple survives. The Roman-Jewish War was also a civil war. The contenders actually held different parts of the Second Temple and fought each other as the Romans invested the place. Supposedly, the Pharisees were not really very keen on rebelling against Rome in the first place. That is why many of them were expelled from Jerusalem by the zealots. One of their leaders, Yohanan ben Zakkai, then made a deal with the Emperor Vespasian to allow Yohanan to found the academy at Jamnia, where the Mishnah began to be composed. Suppose that, instead of abandoning Jerusalem, the Pharisees had contrived to gain control of the Temple complex, or some large fraction of it. They might then have negotiated with the Romans to, in effect, trade Jerusalem for the Temple by holding the latter against the rebels. Though much of the city might have been destroyed in the Roman assault, still the Temple would have been spared.

Thereafter, the Temple would have continued to function as a ritual center as before, but with some differences. For instance, immediately after the rebellion was put down, the Temple would have found itself in the odd position of being a huge religious center without much of a surrounding population. The Temple would have been in small danger of being abandoned: Jews from all over the world came to visit and sent donations. Doubtless Jerusalem would have been rebuilt, as it had been before. Still, activity in the Temple would have begun to shift away from ritual and toward scholarship, particularly if the Pharisees were running the place. This would have accelerated trends that had long existed in Judaism.

Even before the Babylonian Captivity, the prophets complained that God was less impressed by offerings in the Temple than by, say, the fair treatment of tenant farmers and the even administration of justice. The ethical dimension to Judaism would certainly have continued to develop, whether there was a temple or not. There is also some reason to suppose that the ritual practiced at the Temple might have begun to change dramatically.

We have to remember that, when we talk about ritual in this context, we are talking about animal sacrifice. This, of course, was typical of temples throughout the ancient world: they were abattoirs. The difference was that the Jerusalem Temple was huge, one of the wonders of the world, and to some extent it must have been a terrifying place. While this assessment may seem to be the projection of modern delicacies onto ancient people, there is some evidence otherwise. Noted Jewish authorities, including Maimonides himself, have argued that animal sacrifice was a brutal practice that God sought first to restrict and then to eliminate. Also, for what it is worth, we should remember that the other major religious survivor of first-century Palestine, Christianity, dropped the practice of animal sacrifice from the first. (This was the case even though Christianity, too, retained the basic texts on the subject in its Old Testament.)

Ironically, the emphasis given to the old rituals in the Mishnah and the Talmuds was due precisely to the abruptness with which they were cut off. In the normal course of events, one suspects, temple sacrifice would have become rarer and more symbolic, until eventually no actual animals were killed at all. As it was, though, all the early rabbis were left with were memories to record, which they did with great thoroughness.

We must therefore imagine the Temple continuing to function through late antiquity, becoming all the while less like a Classical temple and more like an academy. There was one more major Jewish revolt in Palestine, the Bar Kochba rebellion of the 130s. It is entirely possible that the continued existence of the Temple would have defused this uprising. That rebellion is famous in the study of Messianic millenarianism. (Bar Kochba was called the Messiah, though he may not have claimed the title for himself.) However, richly endowed religious foundations usually take a dim view of militant endtime movements, as the history of the Catholic Church illustrates.

Even if the influence of the conservative Temple failed to prevent the outbreak, the existence of the Temple would still have altered matters. It is likely that the Temple authorities would have stood aloof from the rebellion. Jerusalem might have been declared an open city, or it might actually have resisted Bar Kochba in the name of Rome. Even if the insurgents gained control of Jerusalem for a period, in this case the Romans would have had no reason to destroy the city or the temple when they reconquered the country. Unlike the situation in AD 70, there would have been a normative form of Judaism, one more concerned with the affairs of the spirit than with those of this world. The Romans would have made haste to reestablish this orthodoxy in its chief center as soon as they could. This would have been the quickest way to restore peace. After all, this was pretty much what the Emperor Vespasian did with Rabbi Yohanan.

By the time Christianity became the Imperial religion in the fourth century, it is quite likely that Jerusalem would have been a university town, like Athens or Alexandria. Like them, it would have had increasing trouble with the Imperial government's wildly gyrating religious policies. In the fifth century, these resulted in the closing of the academies in Palestine in which the Jerusalem Talmud was composed. In 529, the Byzantine Emperor Justinian closed even the Academy at Athens. It would thus be reasonable to suppose that, sometime in those centuries, the Temple would have been converted into a church, and the associated schools into seminaries.

In the seventh century, with the appearance of Islam, the role of Jerusalem in world history would have become considerably different. It is conceivable that the attraction of Jerusalem, with the Temple intact, might have preempted the choice of Mecca as the center of Muslim worship. (Mohammed prayed toward Jerusalem for a time, even without the Temple.) This would have had considerable consequences for the development of later Islamic civilization. Neither Mecca nor Medina is a suitable point from which to administer a great empire. They are too isolated, too small, and they depend on local resources that are too thin. To a lesser extent, the same is also true of Jerusalem. As the Umayyad and Abbasid Dynasties realized, Damascus or Baghdad was far preferable. However, if Jerusalem had been the goal of the Haj, with the Temple now the holiest of Mosques, it was close enough to the Mediterranean's major trade routes that it could have continued its role as a center of learning. Jerusalem is wrongly placed to be a large city. With the Temple, however, it would never have become a backwater.

In later centuries, Jerusalem would have been captured and lost by the Crusaders, patronized and abused by the Turks. Its political history might not have been dramatically different from that in our own world. The biggest difference would have come in the 20th century. In 1900, Palestine was a relatively lightly populated country. Its cities, including Jerusalem, were of mainly historical interest. Had the Temple been the center of Islam, however, these things would not have been the case. Certainly the enterprise of Zionism would have been inconceivable. Jews might well have had easy access to the Temple by the second half of the 20th century. Christians have been able to hold services in the Hagia Sophia under the Turkish Republic, to take a comparable case. Nevertheless, we must consider the possibility that one consequence of the preservation of the Temple in the first century might have been the non-existence of Israel in the twentieth.

Copyright © 1999 by John J. Reilly


The Long View: Surpassing Wonder: The Invention of the Bible and the Talmuds

Textual analysis and criticism of religious texts was a real innovation in the mid-nineteenth century. However, historical criticism of the Bible and the Talmud exhibits the amusing spectacle, as work has slowly progressed, of converging on something very much like the traditional account.


Surpassing Wonder:
The Invention of the Bible and the Talmuds
by Donald Harman Akenson
Harcourt Brace & Company, 1998
658 Pages, $35.00
ISBN 0-15-100418-8

 

Few areas of academic endeavor are so in need of public debunking as are biblical studies. The physicist Richard Feynman coined the term "Cargo Cult science" for work that mimics the vocabulary and formats of the physical sciences in order to borrow their authority. Much the same relationship holds between such enterprises as the Jesus Seminar and the methods of serious historians. In this book, Donald Akenson, a noted scholar of Celtic studies, uses his own discipline's perspectives on textual analysis to critique the modern treatment of the Old and New Testaments, as well as the less well-known evolution of the rabbinical literature that was created during the five centuries following the destruction of the Second Temple in AD 70. This is merely by the way, since the primary purpose of "Surpassing Wonder" is to set out Akenson's own theory of a "grammar of invention" that supposedly governed the evolution of all this vast body of material.

The result is a book that is valuable for its general insights, and especially for its history of the tradition that led to the Babylonian Talmud. Still, though the author makes light of postmodern analytical methods, nevertheless "Surpassing Wonder" is willfully tendentious in a way that is characteristic of postmodernism. The author flips back and forth between the assertion that "all we have is a text" and appeals to questionable historical reconstructions, all the time drawing analogies from dated popular science. Akenson succeeds in clearing away some learned nonsense, but the effect is sometimes like that of a critique of "recovered memory" therapy made by an intelligent astrologer.

"Surpassing Wonder" begins with a bold hypothesis. According to Akenson, the first nine books in the Bible (excluding Ruth and counting First and Second Samuel and First and Second Kings as one each) were composed as a unified whole. The work may even have had a single editor, who would have lived among the exiles in Babylon in the sixth century BC, after the destruction of the First Temple. The Genesis-Kings unity was based on material deriving from the Temple writings of the now-destroyed Kingdom of Judah, and of course the editor did not imagine that his compilation was in any way innovative. This, in fact, is the first rule of the "grammar of invention" that Akenson describes: novelty is never admitted. Still, the result was substantially new, if for no other reason than that it suppressed other traditions that existed before the Babylonian Exile.

For one thing, Genesis-Kings established that the religion of the People of God would be a monotheism centered on Yahweh. (This point was not altogether clear in the strand of documents coming from the northern Kingdom of Israel, whose name for the divine was "elohim," a plural.) The work ensured that this religion would not be just a temple religion, but one centered on a single Temple and the ritual performed there. (This point, too, had not been clear, since Solomon's Temple at Jerusalem had never been able to entirely suppress all other centers of ritual sacrifice.) It established the notion of the various covenants with God, first with Noah, then with Abraham, then with Moses. It also established the notion of history as a theodicy governed by the results of breaking these covenants. Finally, it did all this by being very conservative of its sources. The Bible is filled with "doublets" of text that tell somewhat different stories about the same thing. Thus, famously, there are two creation stories in Genesis, and they are not obviously consistent. Consistent or not, all the narratives in Genesis-Kings routinely echo each other: covenant with covenant, punishment with punishment, prophet with prophet.

The "grammar of invention" of which Akenson speaks consists of just these themes and these methods of presenting them, repeated and transformed throughout the thousand years after the completion of Genesis-Kings. Literatures grew up speaking this grammar, and so produced works that later could be incorporated into carefully constructed canons. While not wholly original, the notion that the Bible and its associated literatures are essentially a riff on a few basic measures has a lot to recommend it. As a method of historical investigation, however, it does have certain pitfalls.

Akenson describes the religion of the ancient Davidic kingdom as "Yahwehism," and asserts that its beliefs and practices are simply unknowable. During the Babylonian Exile a new religion arose, "Judahism," that both composed the Genesis-Kings unity and was shaped by it. It was Judahism that returned to Palestine with the small band of enthusiasts who built a new Temple. Then another cloud of unknowing descends on the religious condition of the People of God until AD 70, when Judahism is succeeded by "Judaism" and Christianity.

Here we see the problem: this hypothesis that the literature is "invented" from a stock of traditional elements leads to a kind of dispensationalism. (I suppose Foucault might have called these periods "epistemes.") The invention of a text serves to consume all the information that went into it, leaving us with a textual wall through which no image of the more distant past can penetrate. Time and again in this book, Akenson concludes some demonstration of the irretrievability of the history of Judaism and Christianity with the injunction to admire the "surpassing wonder" of the literary construct that blocks our view. There is something a little fishy about all this.

Most of these little fishies live in the "Pool of Siloam," which is Akenson's term for Palestine in the period from the Maccabean revolt of 167 BC to the destruction of the Second Temple in AD 70. During the first half of this period, the "Judahist" commonwealth was more or less free of foreign political control. In the second half, the area was more and more ruled by Rome, either directly or through allied kings. The period was politically chaotic, to put it mildly, and, as is often the case in such situations, it was also culturally creative. This was the period in which there was a great efflorescence of religious literature, much though not all of it apocalyptic in nature. The Pool of Siloam swarmed with sects and philosophies that were more or less violent, pious, or merely bizarre. Some of their literature was recovered during this century in the Dead Sea Scrolls, and Akenson makes the interesting observation that, on the whole, this material rather confirms the significance of the apocryphal literature that we had already possessed, such as the "Books of Enoch." He also deplores the lazy habit of attributing all this material to the Essenes.

Akenson compares the late Second Temple era to the "Cambrian Explosion," the period about 540 million years ago when, suddenly, a vast variety of multicellular organisms appeared. There were some earlier, but the fossil record suggests a real acceleration in the pace of evolution early in this era. At the end of the Cambrian, these creatures were almost all destroyed in one of the mass extinctions that punctuate Earth's history. Akenson likens the destruction of the Second Temple to this catastrophe. The Romans, he says, effectively sterilized the Pool of Siloam, leaving only a few surviving sects to make their way in a wider world. In Stephen Jay Gould's account of the late Cambrian die-off, given in "Wonderful Life," the few survivors of the catastrophe were selected wholly at random, thus ensuring that the nature of life on Earth after the disaster was quite different from life before it. Akenson tries to make a similar argument for Judaism and Christianity, but his heart clearly is not in it. This is just as well, since Gould's thesis has been pretty well refuted.

Clearly, both Pharisaism, which evolved into rabbinical Judaism, and Christianity were pre-adapted to live in a world without a Temple. Neither was it an "accident" that some of the principal figures in these sects were not in Jerusalem when the Romans destroyed it. It was precisely because the religious practices of these groups were relatively independent of the Temple that they could set down roots in distant locales. One perhaps perverse effect of "Surpassing Wonder" may be to make some readers wonder whether AD 70 was very important at all.

It really is not true that the post-Exilic religion of the People of God was a "temple religion" in the same sense as was the religion of the reign of Solomon. The people who composed the primary components of the documents that became Akenson's Genesis-Kings unity worked at Solomon's Temple; there, the writings were auxiliary to the cult. The Second Temple, in contrast, was built to conform to the books that the exiles brought with them from Babylon. Akenson, apparently thinking himself a naughty fellow for making the suggestion, calls the Temple an "icon" for a religion that supposedly had none. In reality, it was more like a textual illustration. Akenson tells us in great detail about how the rabbinical survivors of Roman Palestine constructed a "temple of the mind" in the Mishnah and the commentaries on it, so that study about the Temple could substitute for worship in it. However, he also notes that imaginary Temples are features of the Dead Sea Scrolls literature, so the discontinuity of AD 70 was not total. Did the existence of Mecca prevent the composition of the Muslim Sharia?

Similarly, Akenson's tub-thumping insistence that it is absolutely, positively established that none of the Gospels "in their present form" could have been written before AD 70 arouses the suspicion that he is chiefly concerned to wall off the dispensations before and after that year with the impenetrable text of the evangelists. In fact, this assertion about dating backfires on some of his other arguments. According to Akenson, so perfectly did the "inventors" of the Gospels make use of material from the Old Testament that there is almost nothing that Jesus says or does in the New Testament that cannot be attributed to literary construction. (The exceptions are the Virgin Birth and the bodily Resurrection, ideas that Akenson considers inexplicable literary blemishes.)

Now, the texts that Akenson (and most modern scholars) rely on to date all the Gospels after AD 70 are apparent predictions by Jesus in his apocalyptic discourses of the destruction of the Temple. The problem is that these references, particularly in Mark and Matthew, closely echo the description of the desecration of the Temple in the Book of Daniel. They are not different in kind from similar remarks in Paul's Second Letter to the Thessalonians, which is generally thought to have been written a good ten years before the Temple was destroyed. (II Thessalonians is not on Akenson's short list of epistles certain to have been written by Paul, though I gather that is an unusual position these days.) Thus, these "predictions" could easily be the sort of literary inventions that Akenson so loves, something that could have been written at any time after the middle of the second century BC. Akenson will have none of it, however. The problem is not theological. The problem is Akenson's apparent horror at the prospect that an ancient text might tell him something about history he did not already know.

In some ways, Akenson's textualism works better with the rabbinical material, most of which does not purport to be historical. The Mishnah, the code of the "oral law" that was probably compiled by AD 200, was apparently designed to be memorized, and may not have been collected together in written form until the second millennium. Akenson is familiar with the mnemonic devices that the Celts used in the early centuries AD for their legends and legal codes, and it seems to him that the rabbis used much the same methods.

Akenson describes the later chief works of the rabbinical literature, the Tosefta, the Sifra, and the Jerusalem and Babylonian Talmuds, as essentially attempts to tame the inflexible Mishnah. All but the Sifra, which comments on the text of Leviticus, qualify and expand on the Mishnah and follow its structure. The Talmuds quote great slabs of it, along with commentaries and comments on the commentaries. Since the Babylonian Talmud, the last of these great works, was probably finished about AD 600, we have the odd spectacle of direct quotations from a book that was not entirely written down yet. Well, sentence structure in ancient Hebrew, as in modern Irish, runs verb + subject + object, so perhaps it really was grammar that made establishing precisely what was under discussion a secondary question.

Regardless of the criticisms I make here, "Surpassing Wonder" is a very amusing book. There is something to be said for any work that uses a biography of baseball manager Casey Stengel ("oracle and miracle worker") as a model for how a gospel might be written. There are 100 pages of chatty but informative footnotes. There are four major appendices, one of which, "Modern Scholarship and the Quest for the Historical Yeshua," would be a good candidate for expansion into a small book. Still, "Surpassing Wonder" should be consulted with caution. This is one book where it pays to check the citations.

Copyright © 1999 by John J. Reilly


The Long View 2004-10-25: Halloween Activities; Bad Book; Good Movie; Bad Software

I also hated Catcher in the Rye, so I was glad to find that someone else did.


Halloween Activities; Bad Book; Good Movie; Bad Software

 

Here's a conference you might want to attend this Halloween:

Marseille, France (PRWEB) October 14, 2004: ICCF-11: The 11th International Conference on Condensed Matter Nuclear Science (Formerly the International Conference on Cold Fusion)...will gather on October 31, in Marseille, France, to present scientific research, exchange ideas, and debate this most controversial and engaging field of research...The worldwide cold fusion community [is] awaiting a conclusion from the U.S. Department of Energy's review of the field...

As are we all, no doubt, but when are we going to find out whether this alleged effect can be scaled up?

* * *

Even with cold fusion, there are some things you should not do on October 31:

CHURCHILL, MANITOBA -- Polar bear and seal costumes are definite Halloween no-nos in this northern Manitoba town. Costume selection takes on unique significance in the Churchill area, as Halloween coincides with the bears' migration to the ice. Seal costumes are especially worrisome, as seals are the polar bears' natural source of food on the ice. Officials fear the costumes could attract the unwanted attention of hungry bears. "I've never seen a kid dressed up as a seal -- but the message would be don't dress up as a polar bear or a seal," said conservation official Richard Romaniuk.

Readers will adapt this warning to their local fauna.

* * *

When I was 14, I found a dog-eared copy of J.D. Salinger's The Catcher in the Rye. I well remember settling down to read the first page, and then the second. Then I tossed the book aside, because the protagonist, Holden Caulfield, was such a jerk that nothing he did or said was going to be of much interest. I did read the book many years later, and confirmed my initial impression. I was recently pleased to find that my reaction as a youth was not unique:

Washington Post, Tuesday, October 19, 2004: J.D. Salinger's Holden Caulfield, Aging Gracelessly
By JONATHAN YARDLEY

I shared Caulfield's contempt for "phonies" as well as his sense of being different and his loneliness, but he seemed to me just about as phony as those he criticized as well as an unregenerate whiner and egotist. It was easy enough to identify with his adolescent angst, but his puerile attitudinizing was something else altogether....

Why is Holden Caulfield nearly universally seen as "a symbol of purity and sensitivity" (as "The Oxford Companion to American Literature" puts it) when he's merely self-regarding and callow? Why do English teachers, whose responsibility is to teach good writing, repeatedly and reflexively require students to read a book as badly written as this one?

The "why" here is clear enough: it is more important to start kids reading fiction than to trouble overmuch about what they read at first. This is the reason for the promotion of the Harry Potter books. The difference is that the Potter books are good.

* * *

Over the weekend, I viewed the film The Day After Tomorrow. That's the one about global warming triggering a new ice age, all in the space of a week. As disaster movies go, this one is pretty good. It's supposed to be a commercial for global-warming anxiety, and maybe it is. Happily, it has so little to do with science, even speculative science, that you can just accept the flooding and freezing of New York City in the same spirit that you accept rampaging dinosaurs and giant monkeys lowering the quality of life in the same locale. An odd thing is that you also have to forget whatever you happen to know about the neighborhood of the 42nd Street Manhattan Library. There's a perfectly good restaurant attached to the rear of the building; people trapped there by a blizzard would have been in small danger of starving.

In any case, since the movie was released earlier this year, the state of Florida has been through the sort of unprecedented meteorological disaster that the film contemplates. The chief result of that seems to have been an increase in the popularity of Governor Jeb Bush, because of his management of the emergency. That had the collateral effect of increasing the reelection chances of his brother, George. As any ecologist in a disaster movie can tell you, the most important consequences are often unexpected.

* * *

My one problem with the movie, which I saw on DVD on my PC, was the glitchy and intrusive player software, Hotllama. It made a great fuss about installing itself. It's one of those players that try to force users to go online, and it demanded demographic information and an email address before it would let me see the movie. Once it was installed, there followed 45 minutes of crashes and freezes (computer crashes and freezes: I could not open the section of the disk that would let me see the damn freezes in the movie). Finally, I found an inconspicuous "Configure" option, from which I enabled some obscure script. The film then played, but the audio was out of sync with the video for most of it.

My assessment of Hotllama is best summed up by this passage from Lovecraft's The Dream-Quest of Unknown Kadath:

These latter [idols] did not, despite their material, invite either appropriation or long inspection; and Carter took the trouble to hammer five of them into very small pieces.

In other words, I used GoBack to remove the abomination from my hard drive.

Copyright © 2004 by John J. Reilly


The Long View: Arguing the Apocalypse: A Theory of Millennial Rhetoric

William Miller

In case anyone needs help with the terminology of millennial studies, I have a glossary in my lecture notes.


Arguing the Apocalypse: A Theory of Millennial Rhetoric
by Stephen D. O'Leary
Oxford University Press, 1994
314 Pages, US$19.95
ISBN 0-19-512125-2

 

The study of millennialism did not begin with the build-up to the year 2000. Theologians, sociologists and anthropologists had been writing for decades (in the case of the theologians, for centuries) about the end of the world and about the ways that people react to that prospect. After a long period of subcultural obscurity, the subject again came to the notice of the general public in the 1980s, and a flurry of academic and journalistic treatments appeared in the 1990s. Among the most theoretically ambitious was this book by Stephen O'Leary, Associate Professor in the Annenberg School for Communication at the University of Southern California.

"Arguing the Apocalypse" attempts nothing less than a "general theory" for millennial studies, one that could help relate the many disciplines that have dealt with one aspect or other of the Last Things. The book develops the theory through a detailed examination of two familiar episodes of apocalyptic thinking in American history, the Millerite Movement that culminated in the "Great Disappointment of 1844," and the return of date-setting premillennialism that began, very approximately, with the publication of Hal Lindsay's "Late Great Planet Earth" in 1973.

The theory is useful, though the book does share some of the defects of late 20th-century literary studies. (I hope never to see the words "rhetor" and "topoi" again.) The historical exposition is gripping, and the author's insights are essential to anyone interested in the field.

"Apocalyptic" is really a term for a genre of biblical and apocryphal literature that flourished in the Near East around the beginning of the Christian era. It deals with a class of ideas that are part of the broader category of eschatology, the study of the final or ultimate things. The latter also includes questions addressed by philosophy, cosmology, anthropology and other disciplines. The aspect of eschatology that usually attracts the most interest, however, is the study of what societies do with apocalyptic literature, particularly with the prophetic books of Daniel and Ezekiel in the Old Testament, and the Book of Revelation in the New.

The most conspicuous social manifestations of apocalyptic ideas are often called "millenarian" or "millennial," with reference to the thousand-year reign of the Saints, or "Millennium," mentioned in Chapter 20 of the Book of Revelation. (To put an extraordinarily complicated matter quickly, "millenarian" usually refers to violent or even revolutionary expectations for the future, while "millennial," a more general term, can also refer to hopes for gradual improvement as history nears its end.) Though not all eschatological systems, not even all models of history, necessarily have a moral dimension, O'Leary deals with apocalypse as a solution to the problem of theodicy, of how God can permit evil to exist in the world. Essentially, the apocalyptic solution is that God will not permit evil indefinitely, and in the final accounting, all the suffering in history will have been justified.

There is a considerable literature that attempts to explain all or most millennial activity in terms of some single sociological or psychological cause. Class conflict was an early contender, but equally plausible cases have been put for millennial activity as a delayed reaction to disaster, or as a reaction to modernization, or as a manifestation of one kind of mass psychological pathology or another. "Arguing the Apocalypse" starts with the sensible observation that there is no obvious single cause underlying all the millennial activity in the world, but that there is quite a lot of similarity in the way that people talk about it. The beginning of wisdom in the understanding of millennial behavior, in fact, is the appreciation of the fact that apocalyptic rhetoric is persuasive. By examining millennial activity from the perspective of rhetoric, O'Leary is able to look at texts, the "rhetor" who expounds the text and the rhetor's audience as an interactive system.

"Arguing the Apocalypse" amplifies the long-standing thesis that apocalyptic is essentially a form of drama. (This is particularly the case with the Book of Revelation, which looks for all the world like a classical Greek play; it even has a chorus.) Now drama, according to Aristotle, comes in two flavors. There is tragedy, which features good and evil characters who proceed to an inevitable catastrophe. Dramatic plots tend to be about how sin is met with revenge. Comedy, on the other hand, is about foolish or mistaken characters who stumble into a happy ending. Error is cured by enlightenment, eventuating in reconciliation.

The Book of Revelation has both tragic and comedic strands: the Beast and his followers prosper mightily in this profane age but meet with everlasting punishment on the last day, while the sufferings that the Saints endure in this age are all set right at the end. These tragic and comedic strands also appear in the history of millennial movements, often as pure types.

According to O'Leary, the topics (that's "topoi" to you, partner) on which apocalyptic rhetors engage audiences are "evil," "time" and "authority." There is some reason to suppose that, for the earliest Christians, the evil that faced them was the malice of the devil working through the powers of the Roman Empire. The time when the evil would be amended was very near, and the authority for these propositions was the direct prophecy of the apostolic generation and then of texts ascribed to them. This type of apocalyptic is often associated with "premillennialism," the belief that the Second Coming will occur before the Millennium. Premillennialists are often profoundly pessimistic about the future, which scripture says will be filled with disaster and persecution in the days prior to the Second Coming. Postmillennialism, in contrast, is the belief that the Second Coming will not occur until the end of the Millennium, during which period the church will have gradually rid the world of natural evil. The "authority" invoked by postmillennialists tends to be a metaphorical interpretation of scripture at the service of pragmatism. This distinction between pre- and postmillennialism roughly corresponds to the tragic and comedic "frames" described above. (St. Augustine was a comedian? Wonders never cease.)

The Second Great Awakening, a generation of reform and revival that characterized the first few decades of the nineteenth century in the United States, produced just about every possible form of millennial activity. Its earlier phase, however, was predominantly postmillennial in theology. This Awakening was associated with a variety of reform movements, from the abolition of slavery to the prohibition of alcohol. These movements were attended by acute religious fervor. When some of the reform movements made little or no progress even after years of mass rallies and evangelism, however, some members of the generation of the Awakening began to doubt whether real reform was possible in the current world. The result was a turn toward premillennialism, manifested most spectacularly in the Millerite Movement and the Great Disappointment of 1844.

William Miller was a respectable farmer in Upstate New York who came to believe, probably about 1830, that the Second Coming would occur around 1843. A diligent amateur student of scripture, Miller rested his authority on arithmetic, as applied to the complex prophetic number system of the Old Testament prophets and the Book of Revelation. The transparency and reasonable tone of his argument seized the imagination of a large fraction of the public.

Respectable and learned ministers from many denominations either embraced Millerite ideas wholeheartedly or expressed sympathy for them. (Miller himself was an influential voice in the movement rather than its prophet. Indeed, the date of the Great Disappointment, October 22, 1844, was not set by Miller, but welled up out of the movement.) Publications with large circulations sprang up to spread the doctrine, and the mass meetings used to promote the reform movements of the earlier phase of the Awakening were put to new uses. As O'Leary notes, all this activity was not intended solely to persuade people. Proselytism was supposed to be one of the features of the latter days. By proselytizing, the Millerites were not just telling people about the apocalypse; they were enacting it.

The Disappointment itself was dealt with in various ways. The kernel of the Millerite movement decided that the event actually foretold by Miller's computations was an event in Heaven that prepared for the earthly Second Coming at some imprecise point in the future. Many went on to found the Adventist movement. Other Millerites threw themselves into the Abolitionist movement. O'Leary reports that the fiasco of 1844 ensured that, for a long time to come, only the most marginal rhetors would dare set a specific, near-term apocalyptic date. However, we should also note that the turn to premillennialism evidenced by Millerism survived the Great Disappointment, at least in evangelical circles. After the end of the Civil War, the historical pessimism associated with premillennialism was one of the factors that induced evangelicals to withdraw as far as possible from public life and practical politics.

There are many reasons why evangelical Christianity returned as a public force in the last quarter of the 20th century. One of the chief reasons, as O'Leary notes, was that history was making the evangelical worldview more plausible. The Jews really had returned to Israel, something that evangelical eschatologists had been talking about for over a century. Furthermore, the invention of the atomic bomb made the apocalypse something that everyone could believe in, one way or another. Indeed, not only did premillennialism again challenge the implicitly postmillennial "civic religion" of the United States, but apocalyptic date-setting came back, too.

O'Leary is at pains to emphasize the differences between Millerism and the brand of apocalypticism that Hal Lindsey promoted in the fantastically popular books that began with "The Late Great Planet Earth." Their scenarios were different, for one thing. Although the doctrine of the pretribulation rapture of the Saints existed in the 1830s, it was not incorporated into Millerism, and did not really become important until after the Civil War. Lindsey's future, in contrast, contains both the prospect of another world war and a pretribulation rapture of the Saints to Heaven that would save believers the trouble of living through the final struggle. The difference that chiefly impresses O'Leary is that, granted their premises, the logic of "The Late Great Planet Earth" is much shakier than that of William Miller and his followers.

Lindsey's warrant for starting the countdown to the end is the assurance given by Jesus in the Olivet Discourse that "this generation" would see the fulfillment of all apocalyptic prophecy. In Lindsey's model of history, the machinery of salvation paused when the Jews failed to accept Jesus as the Messiah. Salvation history started up again only when Israel was founded in 1948. (This approach is called "dispensationalism," as opposed to the Millerite "historicism.") "This generation," therefore, refers to the people who were alive in 1948. In his earlier work, Lindsey made bold to wax more specific. Alleging that a biblical generation is about 40 years, he speculated that 1988 would be a reasonable date for the rapture to occur, followed by seven years of tribulation, and then the Second Coming.
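Granted Lindsey's own premises (the 40-year generation is his figure, and the seven-year tribulation is standard in his scenario), the arithmetic of the countdown is easy to reconstruct:

\[ 1948 + 40 = 1988 \ \text{(rapture)}, \qquad 1988 + 7 = 1995 \ \text{(Second Coming, by implication)} \]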

Even granting the greatest deference to scripture, these interpretations are not obvious. That was not the case with the Millerite computations: they may not have been correct, but they were reasonably clear. Furthermore, Miller and his colleagues invited criticism and answered their critics in print, something that Hal Lindsey never did. Nonetheless, while Millerism was extinguished in a bit over a decade, the apocalyptic revival of which Lindsey was so conspicuous a part is not completely extinct, even after 30 years. This is partly because Lindsey's system was tentative enough to avoid outright disconfirmation, even after the end of the 20th century. A factor that was at least as important, perhaps, was that evangelicalism had gained a measure of cultural acceptance, and even political power.

O'Leary devotes an interesting chapter to the conservative revival of the 1980s, and particularly to the eschatological aspects of the Reagan Administration. This period posed a problem for apocalypse-minded conservatives. Not only was the clock running out on the best-known estimate for the rapture, but evangelicals now needed a theory that would justify them in helping to reform a society that was doomed in several senses of the word. In O'Leary's nomenclature, they needed to move from the tragic frame to the comic frame. To a limited degree, this is what they did.

In his later books, Hal Lindsey held out the hope that conservatives could keep America out of the hands of the Antichrist right up to the rapture, if they all pitched in to aid the process of conservative reform. This was an exhortation to his readers to become tragic heroes, united in the last stand against the forces of darkness. Ronald Reagan became, in effect, "President of the Last Days" for some of his supporters. Like the reign of his medieval type, the mythical Emperor of the Last Days, Reagan's tenure ensured present safety, while in no way compromising the inevitability of apocalypse in the more distant future.

Televangelist Pat Robertson went even further in his serious though failed bid for the White House. He stopped making premillennialist predictions of doom entirely, and began to speak about the future with the sunny optimism of a postmillennial preacher of the early Second Great Awakening. The strategy did not lessen the suspicion in which the press held him, though it did cause his erstwhile supporters to suspect him of backsliding on doomsday. Still, what did not work for Pat Robertson may work in other contexts.

"Arguing the Apocalypse" ends with a meditation on just what we are supposed to do with the apocalypse. There is obviously no getting rid of it. O'Leary suggests that the best course would be to seek to keep it in the comic frame. The idea seems to be that the apocalypse can be permanently tamed by turning it into the ever-receding horizon on the road of progress. People might still dread impending disaster, but they would not think some final disaster to be inevitable, and so would not be tempted to historical fatalism.

While there is something to be said for this strategy, we should keep in mind that the comic frame is not coextensive with postmillennialism, or St. Augustine's amillennialism. Even if the images of disaster and judgment in the Book of Revelation are taken as metaphors whose application is never exhausted by any particular event in history, that does not mean that ultimate questions are not posed by historical events. To take the most obvious example, even if all persecutions are types of an ultimate persecution by Antichrist that never arrives, martyrs throughout history have nevertheless been killed just as dead as the hypothetical Tribulation Saints are supposed to be. To make the apocalypse immanent or episodic does not lower the stakes. The opposite, rather. This is the real meaning of the saying of Franz Kafka that O'Leary quotes: "Only our concept of Time makes it possible to speak of the Day of Judgment by that name. In reality it is a summary court in perpetual session."

Copyright © 2001 by John J. Reilly


The Long View: The Great Disappointment of 1844

John maintained the HTML for his website by hand. I also started making webpages in the late nineties, and that was just how you did it. As such, he had indexes by topic for his major interests, for example eschatology. I debated recreating these for a long time, but I finally decided to do it because a few items slipped through the cracks of the blog-centric chronological method I had been using to repost John's writings.

This also gives me an opportunity to escape the tedium of John's topical political blog posts from twelve years ago. While nothing looks more dated than old sci-fi movies, old political controversy is an especial trial to read.

Thus, let us move on to this short book review of a book that never existed, combining John's interests in eschatology and alternative history into one!


The Great Disappointment of 1844
by John de Patmos
Miskatonic University Press, 2001
567 pages, US$30
ISBN: 0-7388-2356-2

This item is Alternative History.

The Second Coming did not actually occur in 1844.

The Great Disappointment is a real historical term, however.

Look under Eschatology for the review of Arguing the Apocalypse.

The Millerite Movement and its sequel are, for obvious reasons, the most studied manifestations of mass millennialism since the New Testament period itself. Indeed, so carefully has this grand finale of America's "Second Great Awakening" been examined that one may wonder whether there is anything new for historians to say. Certainly the author of the present study does not aspire to novelty. Rather, "The Great Disappointment of 1844" performs the invaluable service of sifting through the last generation of scholarship on the subject to provide a narrative that is both readable and current.

The optimism of America in the early decades of the 19th century was reflected in the "postmillennial" view of history that underlay the great outbreak of religious revival and social reform that we know as the Second Great Awakening. Postmillennialism, as all students of eschatology know, was the doctrine that the Second Coming of Christ would occur at the end of the thousand-year reign of the Saints, the Millennium foretold by Chapter 20 of the Book of Revelation. The implication was that the Saints would themselves put the world in order in preparation for the great event.

The Second Great Awakening was in fact characterized by a high level of political and cultural engagement by Christians. The reform movements of the time, from Abolitionism to Women's Suffrage to the Prohibition of Alcohol, began as aspects of postmillennial religious revival. While some progress was made on these fronts, the failure of the reform movements to remake society as a whole caused many persons to despair of the possibility that the world could be perfected purely by human efforts. The time was ripe for a return of premillennialism, the doctrine that the Second Coming would inaugurate rather than conclude the Millennial kingdom, which would then develop under divine guidance.

The name that became inextricably linked with the triumph of premillennialism was William Miller, a respectable farmer and keen amateur student of scripture living in northern New York State. His reexamination of the dating of people and events in the Bible, set alongside certain familiar interpretations of the complex prophetic number systems of Daniel, Ezekiel and Revelation, convinced him that the Second Coming would occur around the year 1843. Though his analysis was multi-layered, a key feature of his logic was a recalculation of the generations mentioned in the Old Testament, which showed that Bishop Ussher, who had famously announced that the world was created in 4004 BC, had in fact underestimated its age by a good 150 years. Thus, the six-thousandth year of the world would occur in the first half of the 19th century. Then would begin the "Seventh Day of Creation," a concept long associated with the Millennium.
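The arithmetic is easy to reconstruct, on the illustrative assumption that Miller's correction of roughly 150 years is applied directly to Ussher's date:

\[ 4004 + 150 = 4154 \ \text{BC (creation)}, \qquad 6000 - 4154 \approx \text{AD } 1846 \]

give or take a year for the want of a year zero: squarely in the first half of the 19th century.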

William Miller was not the first student of scripture to set a near-term date for the Parousia. Still, he was a little unusual in the transparency of his argument and his willingness to engage critics. Miller was never the "prophet" of Millerism; his authority was arithmetic, not personal revelation. It was possible to disagree with his calculations, and many people did. The argument, however, was of such a nature that it could not be merely dismissed; it had to be refuted.

William Miller reached his conclusions about the dating of the Second Coming around 1830. He soon began to disseminate them in print and, more diffidently, on speaking tours. His message took on a life of its own, becoming the template for an interdenominational network of evangelists and publications. People abandoned their ordinary affairs to propagate the gospel of the last days, often giving away their property or neglecting to plant their fields. The precise date for the great event, October 22, 1844, did not come from Miller, or indeed from any of the leading figures of the movement. Rather, it welled up from the mass of believers, who gave it immediate and overwhelming acceptance.

Of course, as we now know, the prediction was correct. The study of the Parousia Event of 1844 naturally overshadows the Millerite Movement (as it does the contemporaneous Taiping and Babist movements). However, the Days of the Presence required the creation of a new historiographical discipline, which the present study only briefly outlines. The Millerite story picks up when coherent documentation again begins to become available in January of 1845.

Against the unsettled economic and cultural landscape of the early Millennial world, Millerism presents the not unfamiliar spectacle of a movement destroyed by its own success. The ironic details are well known. Even historical survey courses devote some attention to accounts of the attempts by exasperated Millerites to regain control of property that they had given away, sometimes by arguing in court that they had been temporarily insane during the months leading up to the Advent. Far more important, however, was the fact that Millerism, and premillennial Christianity in general, had nothing to say to the Millennium.

The movement had come into existence as a reaction to the theory that Christians, as Christians, had a duty to leaven the world. Premillennialists had consciously recoiled from the labor of formulating a social philosophy, or even a coherent political program. The Millerite Movement had been entirely about chronology. Though the train left at the expected time, the premillennialists found that they had no idea where they were going.

This vacuum at the heart of post-Millerite evangelicalism had profound implications for the role of religion in the English-speaking world during the 19th and 20th centuries. It is a commonplace among historians that the great events of those years, the US Civil War and the First and Second World Wars, were to a greater or lesser extent "Wars of Armageddon," fought by societies for reasons that were essentially millenarian. All the great social movements of the period were also informed by the millennialist "Social Gospel." However, though evangelicals took part as individuals in the general historical process, they did not engage the great issues on a soteriological level. It was only in the last quarter of the 20th century that they began to emerge from the isolation of the denominational subcultures into which they had retreated. The end of the long alienation of so large a fraction of Christianity can only be applauded.

We will never cease to experience the influence of the events of 1844. Even the completion of the current Sabbatical Millennium will not nullify the process that began with the Parousia of that year. However, there are stories within that greater story, some of the saddest of which deal with the disappointment occasioned by the fulfillment of prophecy. Those stories can have an ending. Thus, though the historical debates may go on, we may hope that the long afterlife of Millerism is at last drawing to a close.

Copyright © 2001 by John J. Reilly
