The Long View 2007-03-30: Human Nature, Human Rights, Iranian Motives, Goodbye Bees

This is a fun one: I hadn’t remembered that John J. Reilly referenced Greg Cochran and John Hawks in 2007.

John also mentions a fairly standard criticism of any attempt to understand human behavior in terms of evolution: that such explanations are Just So Stories in the manner of Kipling. It is sometimes true that such explanations are just ad hoc rationalizations in the mode of fiction, but the charge tends to get leveled regardless of the merits of the original argument.

More interesting is that the best arguments for understanding human behavior in light of evolution and genetics often take advantage of final and formal causation: they argue that we can know something to be true without knowing a detailed mechanism, which causes the truest of true believers in the supremacy of efficient causes to point and splutter.

Also, I find it a little sad that there have been rumors of war with Iran for the last twelve years at least. Give it up already.

Human Nature, Human Rights, Iranian Motives, Goodbye Bees

Conservatives can appropriate Darwinism in any of several ways. The least problematical is at the intersection of culture and demographics: certain cultural regimes seem to be inconsistent with maintaining the magic replacement-fertility rate of 2.1 children per woman. In this sense, the conservative agenda will have succeeded in all essentials on the day when the phrase "The Darwin Award Agenda" principally calls to mind terms like "same-sex marriage" or "reproductive rights." However, there is also a Darwinian conservatism that aspires to make use of the full resources of sociobiology. Larry Arnhart's blog, Darwinian Conservatism, is an able presentation of this position.

There are two points that anyone interested in following this line of thought should consider. The first is one associated with most applications of "applied Darwinism": the explanations often look suspiciously like Just So Stories. There is also this point: maybe human nature ain't what it used to be:

Human evolution has been speeding up tremendously, a new study contends—so much, that the latest evolutionary changes seem to largely eclipse earlier ones that accompanied modern man’s “origin.” ....The authors are Cochran and anthropologist John Hawks of the University of Wisconsin Madison. “Holocene [from -10K years ago] changes were similar in pattern and... faster than those at the archaic-modern transition,” A “thing that should probably worry people is that brains have been getting smaller for 20,000 to 30,000 years,” said Cochran. But brain size and intelligence aren't tightly linked, he added. Also, growth in more advanced brain areas might have made up for the shrinkage, Cochran said; he speculated that an al­most breakneck evolution of higher foreheads in some peoples may reflect this. A study in the Jan. 14 British Dental Journal found such a trend visible in England in just the past millennium, he noted, a mere eye­blink in evolutionary time. ...[I]n a 2000 book The Riddled Chain..[b]ased on computer models, [John McKee] argued that evolution should speed up as a population grows...Many of the changes found in the genome or fossil record reflect metabolic alterations to adjust to agricultural life, Cochran said. Other changes simply make us weaker.

In the June 2003 issue of the research journal Current Anthropology, Helen Leach of the University of Otago, New Zealand wrote that skeletons from some populations in the human lineage have undergone a progressive shrinkage and weakening, and reduction in tooth size, similar to changes seen in domesticated animals. Humans seem to have domesticated themselves, she argued, causing physical as well as mental changes.

Never let anyone scare you with visions of the human race being replaced by artifacts. We are the artifacts.

* * *

"Human Rights" has become an Orwellian term, according to Joseph Bottum at First Things:

“Peace is a communist plot,” Irving Kristol used to observe back during the Cold War...every organization with the word peace in its title was a communist front...the equation holds as true now as did then: Human rights are a communist plot, and international human rights are an international communist plot...Well, maybe not communist...Some amorphous radical leftism is clearly afloat in the world. Generally undefined in philosophy, economics, or eschatology, it seems nonetheless able to unite the most unlikely bedfellows: terrorists, and sexual-transgression artists, and agitators for radical Islam, and abortion activists, and third-world dictators—anybody, anywhere, who thinks there’s an advantage to be gained from claiming that the West is wrong. And they can always join under a banner emblazoned with that noble phrase “human rights.”

There is something to this, particularly at those United Nations agencies where the foxes are in firm possession of the chicken coops. Still, we should remember the insistence by the United States that the Helsinki Accords of 1975 contain a human-rights plank. The Soviet Union had wanted the Accords to set in stone the Cold War division of Europe, but the human-rights plank delegitimized the European Marxist regimes in a mere 15 years.

What's the difference between "human rights" as principles that protect freedom and "human rights" as an ideology that justifies enslavement and promotes extinction? About this, Dinesh D'Souza was perfectly correct: the civil liberties that the Founding Fathers understood are workable and almost universally attractive; the social engineering projects that come out of the transnational human rights industry are disliked and dysfunctional. Could the distinction be as simple as the one that Oliver Wendell Holmes proposed, that between procedural and substantive rights?

* * *

Speaking of catchy turns of phrase, was Vox Day the first to refer to the US presidency as The Cherry Blossom Throne?

* * *

About the Iranian seizure of British sailors in the Persian Gulf, Time Magazine has this to say in connection with the question, Is a U.S.-Iran War Inevitable?:

This week Iranian diplomats are telling interlocutors that, yes, they realize seizing the Brits could lead to a hot war. But, they point out, it wasn't Iran that started taking hostages — it was the U.S., when it arrested five members of the Islamic Revolutionary Guard Corps in Erbil in Northern Iraq on January 11. They are diplomats, the Iranians insist. They were in Erbil with the approval of the Kurds and therefore, they argue, are under the protection of the Vienna Convention.

Iranian grievances, real and perceived, don't stop there. Tehran is convinced the U.S. or one of its allies was behind the March 2006 separatist violence in Iranian Baluchistan, which ended up with 20 people killed, including an IRGC member executed. And the Iranians believe there is more to come, accusing the U.S. of training and arming Iranian Kurds and Azeris to go back home and cause problems. Needless to say the Iranians are not happy there are American soldiers on two of its borders, as well as two carriers and a dozen warships in the Gulf. You call this paranoia? they ask.

Actually, I would call the Iranians mendacious, and I would call the editors of Time that, too, were not honest stupidity a more economical explanation. Surely the only explanation the incident requires is that the recent votes in the US Congress to, in effect, lose the war in Iraq by a date certain show that US hegemony is evaporating; the Iranians took the sailors to demonstrate that Iran can now act with impunity, and the states of the region should restructure their foreign policies accordingly.

I suspect that Iran will release the sailors in short order; the Iranians probably believe their point has been made. Of course, it is possible that Iran wants a war now, believing that, however much damage they suffer at first, the US and UK will be unable, for domestic reasons, to sustain it for more than a few days.

* * *

Any reader of The Hitchhiker's Guide to the Galaxy can read these reports only with great distress:

Across the country, honey bees are disappearing by the thousands. ...

“This is unique in that bees are disappearing,” Hayes said. “The hives are empty. You don’t see dead bodies. The colony, over time, dwindles until you don’t see anything left in the colony.”

So long, thanks for all the gardens?

Copyright © 2007 by John J. Reilly

Why post old articles?

Who was John J. Reilly?

All of John's posts here

An archive of John's site

Support the Long View re-posting project by downloading Brave browser. With Both Hands is a verified Brave publisher, you can leave me a tip too!

The Long View 2007-03-09: Physics, Warrior Robots, Glottochronology, & Reforms Good and Bad

A small sample of the high-temperature superconductor BSCCO-2223.

By James Slezak, Cornell Laboratory of Atomic and Solid State Physics - Own work, CC BY 2.5.

High-temperature superconductors are much like nanotechnology: just another kind of vaporware that has gone nowhere. I should probably update my cocktail party theory of why science can’t seem to do anything cool anymore, in light of an additional ten years of experience.

Also interesting to note that John J. Reilly was a fan of the national popular vote, and not a fan of daylight saving time, at least as implemented.

Physics, Warrior Robots, Glottochronology, & Reforms Good and Bad

It's about time, that's all I can say:

JERUSALEM (Reuters) - An Israeli defense firm on Thursday unveiled a portable robot billed as being capable of entering most combat zones alone and engaging enemies with an onboard armory that includes a machine-pistol and grenades.

Click on this totally misleading image to see what the robot really looks like:

I am inclined to think that this is just a minor improvement in SWAT technology rather than the beginning of the end of infantry, but I could be wrong. In 1914, hardly anyone appreciated the implications of the machine gun. In any case, we have a way to go before we see the slinky Cylons of Battlestar Galactica.

* * *

Why are there no flying cars in the early 21st century? In part because so little came of this:

Twenty years ago this month, nearly 2,000 physicists crammed into a New York Hilton ballroom to hear about a breakthrough class of materials called high-temperature superconductors, which promised amazing new technologies like magnetically levitated trains...But today the heady early promises have not yet been fully filled. High-temperature superconductors can be found in some trial high-capacity power cables, but they have not made any trains levitate. The rise in transition temperatures has stalled again, well below room temperature. Theorists have yet to find a convincing explanation for why high-temperature superconductors superconduct at all.

In those days, Chaos Theory had just recently been the flavor of the month, and was still supposed to be a new, culture-transforming model of causality. Room-temperature superconductors were supposed to provide the hardware component for the new world. Only the geekiest geeks and a few SF writers had a clue about the Internet, which really was an important technological development (and whose effect, as I have argued at tedious length, has been essentially conservative).

Let those of us take a lesson who think that neuroscience will make all things new.

* * *

The peoples of the British Isles are all pretty much cut from the same genetic cloth, according to a piece in The New York Times. For the most part, they have been there since the ice age, if not before, so we can forget about all that Saxon versus Celt business. Well, okay, but genetics is one thing; what are we to make of conclusions like this?

Dr. Oppenheimer has relied on work by Peter Forster, a geneticist at Anglia Ruskin University, to argue that Celtic is a much more ancient language than supposed, and that Celtic speakers could have brought knowledge of agriculture to Ireland, where it first appeared. He also adopts Dr. Forster’s argument, based on a statistical analysis of vocabulary, that English is an ancient, fourth branch of the Germanic language tree, and was spoken in England before the Roman invasion.

The hypothesis that Anglo-Saxon was spoken in England before the arrival of the Angles or the Saxons is, perhaps, counterintuitive, but no doubt the argument is more persuasive in detail. In any case, this attempt to date language change is based on glottochronology. That procedure rests on a reasonable notion for estimating how long ago two languages diverged from a common ancestor: count the cognates in a list of 100 or 200 basic words in the daughter languages. Morris Swadesh estimated that 14% of that vocabulary would diverge in a millennium. That worked well for the Romance languages, but there were counterexamples in other language groups. Sergei Starostin suggested that a count should be made only of "autonomous" changes in the basic wordlist, excluding loanwords. With that stipulation, the rate of change falls to 5 or 6 native replacements per millennium.

The problem is that, to apply these rules, we need to already know so much about the histories of the languages in question that the glottochronological estimate will usually be superfluous. Alas.
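For what it's worth, the arithmetic behind a Swadesh-style date is simple enough to sketch. Assuming the standard lexicostatistic model, in which both daughter languages independently retain a fraction r of the basic wordlist per millennium (the 74% cognate figure below is invented purely for illustration):

```python
import math

def divergence_time(shared_cognates: float, retention: float = 0.86) -> float:
    """Millennia since two daughter languages split, given the fraction of
    shared cognates on a basic wordlist and the per-millennium retention
    rate. Both daughters lose vocabulary independently, so
    c = r**(2*t)  =>  t = ln(c) / (2 * ln(r))."""
    return math.log(shared_cognates) / (2 * math.log(retention))

# Swadesh's rate: 14% loss per millennium, i.e. retention 0.86.
print(round(divergence_time(0.74), 2))        # ≈ 1.0 millennia for 74% shared
# Starostin's slower rate of 5-6 native replacements, retention near 0.95:
print(round(divergence_time(0.74, 0.95), 2))  # ≈ 2.94 millennia, same data
```

The factor of two in the denominator is the crux: both daughters drift away from the ancestor, so shared cognates decay at twice the single-language rate. Note how sensitive the estimate is to the assumed retention rate, which is part of why the method fell out of favor.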

* * *

Friends of civil peace must regret the failure of the House of the Colorado legislature to pass the National Popular Vote bill, after the Senate had approved it. As the measure's proponents put it:

Under the National Popular Vote bill, all of the state’s electoral votes would be awarded to the presidential candidate who receives the most popular votes in all 50 states and the District of Columbia. The bill would take effect only when enacted, in identical form, by states possessing a majority of the electoral votes—that is, enough electoral votes to elect a President (270 of 538).
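The compact's trigger condition described above is just a threshold check; a minimal sketch (the member-state vote counts below are hypothetical):

```python
TOTAL_ELECTORAL_VOTES = 538
MAJORITY = TOTAL_ELECTORAL_VOTES // 2 + 1   # 270, enough to elect a President

def compact_active(member_state_votes):
    """The NPV compact takes effect only once the member states jointly
    control a majority of the Electoral College."""
    return sum(member_state_votes) >= MAJORITY

print(compact_active([55, 31, 29, 20]))                          # 135 votes: False
print(compact_active([55, 31, 29, 20, 38, 29, 20, 18, 16, 15]))  # 271 votes: True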

The most discouraging thing about the opposition to this necessary measure is the transparent nonsense of arguments like this:

Law professor Robert Hardaway from the University of Denver was equally critical.

He said problems with a candidate winning the popular vote but losing the electoral vote are rare, but result in cries for changing the system.

Without the electoral college, close votes would be a nightmare, Hardaway said.

"You think 2000 was bad? You’d have recounts in every precinct, in every state," he said.

In reality, of course, the NPV mechanism does not change the local rules about when a recount can be demanded. No matter how close the national vote, districts with 60% to 40% majorities for one candidate would not have a recount. Districts with electoral results that are close within the definition of local law would have recounts, just as they do today. The NPV does not abolish the Electoral College; the College would still turn pluralities into majorities.

And why is the NPV necessary? It's necessary because if George Bush had won an Electoral College victory in 2004 there would have been gunfire. It is necessary because the US cannot promote democracy abroad if its chief executive is chosen by gerrymander. It is necessary so that the rural populations of the big electoral-vote states are no longer disenfranchised in presidential elections. The last point is maddening: it's the Republican Party that is chiefly handicapped by the current system.

* * *

Brothers and sisters, I can no longer keep silent, since we are just days away from the fulfilment of this scripture:

And he shall speak great words against the most High, and shall wear out the saints of the most High, and think to change times and laws:

And what is Microsoft doing about it??

IT workers have been waiting three or four hours to get telephone support from Microsoft [regarding the start of Daylight Saving Time on March 11 under the new federal law], whose Exchange Server serves as the official calendar for many of the world's largest businesses.

Aiming to shorten that wait, Microsoft has boosted the number of people addressing the time change issue. Earlier Thursday, the company opened up a "situation room" devoted to monitoring customer issues and providing support to the software maker's largest customers.

Unlike Y2K, this change could be a real nuisance. Supposedly, businesses like this change, because it gives people more daylight in which to shop. Again, I can only ask: why not just institute spring and autumn schedules? If federal offices were directed to open at 8:00 A.M. in March and 9:00 A.M. in November the rest of society would follow suit and we would not need to reset the damn clocks.

Copyright © 2007 by John J. Reilly


Superbugs: The Race to Stop an Epidemic Book Review

Superbugs: The Race to Stop an Epidemic
by Matt McCarthy, MD
Avery (May 21, 2019)
ISBN 0735217505

I received a free copy of this book through LibraryThing’s Early Reviewers program.

Superbugs is a fascinating book, and I’m glad I had the chance to review it. This book is a window into the management, and hopefully curing, of difficult antibiotic-resistant infections from the point-of-view of a physician who sees the worst the world has to offer. McCarthy wrote it in a chatty, personable, and slightly ADD style that probably makes it more accessible. This is a difficult thing to get right with a work of popular science, which I take this book to be.

There is an infamous rule of thumb that including one mathematical formula in your book will reduce your readers by half. Each additional formula continues the process of exponential decay. McCarthy has clearly decided to maximize his potential readership by avoiding mathematical formulae, or worse, skeletal formulae of organic molecules.
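That folk rule is just exponential decay with base one half; as a toy illustration (the starting audience of 10,000 is invented):

```python
# "Each formula halves your readers": readership after n formulas
# is N0 * (1/2)**n. The initial audience figure is purely hypothetical.
initial_readers = 10_000
for n_formulas in range(5):
    print(n_formulas, initial_readers // 2 ** n_formulas)
```

Five formulas in, the hypothetical author is down to a few hundred readers, which explains McCarthy's caution.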

Dalbavancin, public domain

By Hbf878 - Own work, CC0.

However, while he doesn’t show them, he talks about them a lot. If you know what is going on, you can either envision the diagrams or look them up, but organic chemistry isn’t needed to tell the stories that McCarthy wants to tell.

The first story is McCarthy’s work with Allergan on the antibiotic dalbavancin, and his journey to learn how to write a protocol for a clinical trial and gain consent from often frightened and bewildered patients who show up in Emergency Rooms with methicillin-resistant Staph Aureus infections. His meandering style allows him to digress into the second story, which is a capsule history of the development of antibiotics, and the sometimes checkered history of human experimentation in medicine.

Sir Alexander Fleming, looking rather intense for the photographer

By Official photographer - photograph TR 1468 from the collections of the Imperial War Museums, Public Domain.

His history of antibiotic development includes well-known figures like Alexander Fleming, and the overlooked, like Elizabeth Lee Hazen and Rachel Brown, who developed nystatin, the first antifungal drug.

Elizabeth Lee Hazen and Rachel Brown

By Smithsonian Institution - Flickr: Elizabeth Lee Hazen (1888-1975) and Rachel Brown (1898-1980), No restrictions.

The book is probably worth it just for this well-done short summary of the powerhouses of modern pharmaceuticals [and more evidence for my theory that the greatest period of technological advancement in the twentieth century was between 1920 and 1950]:

By the early 1950s, ninety percent of the prescriptions filled by patients were for drugs that had not even existed in 1938. pg 101 [citing Miracle Cure by William Rosen 2017]

However, you also get a good look at how medicine is practiced in the United States today, from the practitioner’s point-of-view. Physicians need to manage conflicts of interest, like the portion of McCarthy’s salary that is paid by Allergan and other corporations, patients that are bound and determined to pursue courses of treatment that the evidence doesn’t support, and the sheer soul-crushing burden of seeing so much suffering day-in and day-out.

We Americans expect our doctors to be superhuman: to work without rest, to diagnose without fail, and to resist the siren call of wealth. Doctors receive enormous deference in exchange for these unrealistic expectations, but a subtext of McCarthy’s book is the toll this takes on our often genuinely selfless and dedicated physicians, who do in fact accept honoraria and speaking fees from pharmaceutical companies, and who miss their children while they work long hours.

Another interesting aspect of American medical practice is its insularity. Nearly every reference in McCarthy’s book is from a medical journal, which is the mental world of most physicians. However, medicine might progress faster if physicians were a little more widely read. For example, McCarthy devotes a fair bit of space to the research of Vincent Fischetti, who isolates enzymes from bacteriophages. But phage therapy was a thing before antibiotics were invented, and was largely forgotten in the initial enthusiasm for antibiotics. Phages and adjacent technologies would be a useful adjunct to antibiotics, but medicine, meaning mostly expert physician opinion, has been pointedly uninterested for seventy years or more. I appreciate that McCarthy is trying to do something about that, but reading and citing mostly medical journals will only perpetuate the attitude that pushed useful therapies aside because they weren't the hot new thing, or because they came from the wrong field.

All in all, I enjoyed this book. I think McCarthy did a fine job making the history of antibiotics accessible, and was remarkably honest about himself and his field, frankly admitting the challenges physicians face today. This book could have been dry, but it wasn’t, so I am willing to embrace the rapid alternation between the present and the past. McCarthy made this style work. One can learn a lot about the world, past and present, from this book.

In a final note, there was a short letter tucked into my review copy saying that public results for McCarthy’s dalba study are expected on or around May 21st, just under a week from the publication of this review. I hope everything went well, because I like having options when the bacteria evolve faster than we do.

My other book reviews | Reading Log

The Long View 2007-01-26: The Dead Hand of the Seventies Flexes but Is Mitigated by Scientific Advances

The end of low mass stars

I have never bothered to closely follow popular science news for two reasons:

  1. It is often confused, at best

  2. Even when it is well-reported, whatever results are covered tend to be invalidated over time

Nothing here strikes me as especially dumb, but I like to wait for the dust to settle.

The Dead Hand of the Seventies Flexes but Is Mitigated by Scientific Advances

Homer Simpson himself once characterized the 1970s as "a dark time when folly and madness ruled the earth." Now Mark Steyn sees that decade's return:

“It feels like August,” wrote National Review’s editor, Rich Lowry, about eight months after 9/11. August 2001, that is: he meant America’s war on terror seemed to have lost its urgency and the “sleeping giant” appeared to be resuming his slumbers. Five years on, it’s worse than that: it feels like the Seventies.

But it does not feel like the Seventies. The characteristic of the Seventies (and the Sixties after 1965) was that all the West's establishments, political, artistic, and spiritual, abdicated their historical moral authority. At least in public, they deferred to the superior wisdom of "the kids," or simply embraced chaos. That is not the case today. The establishments know more or less what they are doing. They are far more guilty.

* * *

On a happier note, we are getting closer to a reliable projection for the ultimate fate of Earth. Lee Anne Willson of Iowa State University has been doing the math:

The life-giving, aging star we orbit is using up its fuel supply and will collapse within 7 billion years. Before that, though, there will be an agonizing period of repeated swelling, as the sun grows into a red giant. How giant?...

"Earth will end up in the sun, vaporizing and blending its material with that of the sun," said Iowa State University's Lee Anne Willson. "That part of the sun then blows away into space, so one might say Earth is cremated and the ashes are scattered into interstellar space"...

Willson and her colleague George Bowen studied other red giants, medium-sized stars like our sun that are near death, and used their findings to calculate the fate of Earth.

"If the sun loses mass before it gets too big, then Earth moves into a larger orbit and escapes," Willson told "The sun would need to lose 20 percent of its mass earlier in its evolution, and this is not what we expect to happen."

I had read an estimate that the sun would in fact lose enough mass to allow Earth to rise to a sustainable orbit. Now I'll have to change my plans.

But what about the moon, you ask?

During the red giant phase the Sun will swell until its distended atmosphere reaches out to envelop the Earth and Moon, which will both begin to be affected by gas drag—the space through which they orbit will contain more molecules.

The Moon is now moving away from Earth and by then will be in an orbit that's about 40 percent larger than today. It will be the first to warp under the Sun’s influence...

If left unabated the Moon would continue in its retreat until it would take about 47 days to orbit the Earth. Both Earth and Moon would then keep the same faces permanently turned toward one another as Earth’s spin would also have slowed to one rotation every 47 days....

[T]he drag caused by the Sun's extended atmosphere will cause the Moon's orbit to decay. The Moon will swing ever closer to Earth until it reaches a point 11,470 miles (18,470 kilometers) above our planet, a point termed the Roche limit.

“Reaching the Roche limit means that the gravity holding it [the Moon] together is weaker than the tidal forces acting to pull it apart,” Willson said.

I had read that solar tidal forces would continue to slow the rotation of Earth even after the day became equal to the month, causing a tidal drag that would draw the Moon to the Roche limit irrespective of friction from the evaporation of the Sun.

I am sorry, but I find this question troubling. Can't these people keep their story straight?
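At least the Roche-limit figure itself checks out. Using the standard fluid-body approximation, d ≈ 2.44 · R · (ρ_primary/ρ_satellite)^(1/3), with rounded reference values for the radii and densities (my numbers, not the article's):

```python
def roche_limit_km(primary_radius_km: float,
                   primary_density: float,
                   satellite_density: float) -> float:
    """Fluid-body Roche limit: inside this orbital distance, tidal forces
    exceed the satellite's self-gravity and it begins to come apart."""
    return 2.44 * primary_radius_km * (primary_density / satellite_density) ** (1 / 3)

# Earth radius 6371 km; mean densities in g/cm^3 (Earth 5.51, Moon 3.34).
d = roche_limit_km(6371, 5.51, 3.34)
print(round(d))   # about 18,400 km, in line with the quoted 18,470 km figure
```

The rigid-body version of the formula uses a smaller coefficient and gives a distance roughly half as large; the quoted figure evidently assumes the fluid case, which is reasonable for a body being tidally kneaded apart.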

* * *

Speaking of troubling thoughts, there are few more troubling than the ones mentioned recently by Wesley J. Smith at First Things in the comment Zoos: Not for Children Anymore:

Perhaps it is wrong for me to comment about a movie I have no intention of seeing: But if this review of the new semi-documentary Zoo is accurate, it apparently has a sympathetic take on “the last taboo,” meaning bestiality. (”Zoos” in this context don’t refer to animal viewing facilities but are apparently the chosen moniker of people who like to have sex with animals. It is a take off on zoophilia. Who knew?)

Surely this is not the last taboo. That would be consensual cannibalism, of which there have been a few incidents in recent years. Actually, if the courts discern an autonomy right to suicide, it would be hard to see what the objection to this form of self-expression would be. Certainly it would present fewer objections than bestiality, where the consent of the animal is always in doubt. The limiting factor, perhaps, is that any society that really did not see a problem with the practice would already be so chaotic that it would make little difference what the law said.

Getting back to the Seventies for a moment: this film will have to be very strange indeed to be stranger than Equus (1977).

* * *

A scientific cliché may be about to bite the dust: String Theory may be falsifiable!

[R]esearchers at the University of California, San Diego, Carnegie Mellon University, and The University of Texas at Austin have now developed an important test for this controversial "theory of everything" [using] the Large Hadron Collider, or LHC, a subatomic particle collider scheduled to be operating later this year at the European Laboratory for Particle Physics, or CERN....

"The beauty of our test is the simplicity of its assumptions," explained Grinstein of UCSD. "The canonical forms of string theory include three mathematical assumptions-Lorentz invariance (the laws of physics are the same for all uniformly moving observers), analyticity (a smoothness criteria for the scattering of high-energy particles after a collision) and unitarity (all probabilities always add up to one). Our test sets bounds on these assumptions"...

He added, "If the test does not find what the theory predicts about W boson scattering, it would be evidence that one of string theory's key mathematical assumptions is violated. In other words, string theory-as articulated in its current form-would be proven impossible."

If those pesky W bosons do scatter as theory predicts, that does not prove that String Theory is true, just that it will live to face further tests. That is the best that can be said for any theory.

* * *

On the other hand, there is disturbing news from Mars:

Dried up riverbeds and other evidence imply that Mars once had enough water to fill a global ocean more than 600 metres deep...Some scientists have proposed that the Red Planet lost its water and CO2 to space as the solar wind stripped molecules from the top of the planet's atmosphere. Measurements by Russia's Phobos-2 probe to Mars in 1989 hinted that the loss was quite rapid....Now the European Space Agency's Mars Express spacecraft has revealed that the rate of loss is much lower...Its measurements suggest the whole planet loses only about 20 grams per second of oxygen and CO2 to space, only about 1% of the rate inferred from Phobos-2 data...If this rate has held steady over Mars's history, it would have removed just a few centimetres of water, and a thousandth of the original CO2.

As the link explains, it is possible, even likely, that the rate of solar-wind erosion has not been constant over time. Still, there is an apparent anomaly in the lack of a modern Martian hydrosphere.
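The "few centimetres" figure is easy to sanity-check, if one makes the crude (and generous) simplifying assumption that the entire escaping 20 g/s is water sustained over roughly four billion years:

```python
# Rough sanity check of the quoted escape rate. All inputs are rounded
# reference values; treating the entire escaping mass as water overstates
# the water loss, so this is an upper bound.
loss_rate_g_per_s = 20.0
seconds_per_year = 3.156e7
years = 4.0e9                 # rough age of Mars
mars_surface_m2 = 1.44e14     # about 4 * pi * (3390 km)^2
water_density = 1000.0        # kg/m^3

total_kg = loss_rate_g_per_s * seconds_per_year * years / 1000.0
depth_cm = total_kg / (mars_surface_m2 * water_density) * 100
print(round(depth_cm, 1))     # on the order of 2 cm -- "a few centimetres"
```

So the Mars Express rate really cannot account for a 600-metre ocean, which is what makes the missing hydrosphere an anomaly.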

Or is it lacking? Once again, the Seventies extend a hand of dementia into the 21st century, and I recall the lyrics of the song A Horse with No Name (1971):

An ocean is a desert
with its life underground
and a perfect disguise

And no, that was not recorded by Neil Young, but by America.

* * *

Finally, here is the strangest item in a post notable for strange items:

Bush Pushes Health Care Plan

He's doing it again. He won re-election in 2004 because he assured people he would not lose the war in Iraq; then he spent months promoting a Social Security reform that was incoherent and repulsive. Today, he just lost an election, he pleaded with the Congress and the public in the State of the Union Address to let him win the war in Iraq, and the first thing he focuses on is that nitwit health-insurance proposal.

This is beyond satire.

Copyright © 2007 by John J. Reilly


Scott Locklin on Quantum Computing: More Support for my Cocktail Party Theory of Science

Scott Locklin has a new post up on quantum computing, Quantum Computing as a Field is Obvious Bullshit. I have to agree, quantum computing [and AI] is largely bullshit, and we should probably nuke and pave the whole field. On a side note, I wonder whether this post functions as an extended reply to this comment thread on Greg Cochran’s West Hunter blog.

Which reminds me, I really should update my cocktail party theory of why science doesn’t work anymore. Locklin makes a much longer and more detailed argument than I did that a major problem for science in the twenty-first century is that too many scientists never actually build anything with their own hands to see whether their ideas work. In the case of quantum computing, the problem is as simple [and as hard!] as aligning all of the optical elements in the system well enough that you don’t introduce errors. A foolish idea has crept into science that such things are the tasks of mere technicians!

The great scientists of the past were often obsessed with problems that would now be derided as mere engineering; however, the challenge of applying the powerful ideas of science to the real world often informed further theoretical advances. Since I’ve worked for years in manufacturing, I think you also learn a lot by trying to have someone else follow your instructions and still make the thing work every time. Your idea of what a big problem is changes once you try to make it happen, either in the lab or the factory, and that is exactly what modern science doesn’t seem to want to do.

I love this image because of the silly conceit that AI research is the example of the hardest thing people do

I am of course exaggerating for effect, but you really should try on Locklin’s argument for size. It is fundamentally similar to the reason I don’t worry much about automation taking jobs anytime soon, because I have tried to do it myself, and I know how hard it really is. McDonald’s has been applying automation for almost 80 years, which is why they can successfully put kiosks in their stores to replace cashiers. Lots of people see the kiosks, and wrongly conclude that all such jobs will disappear in a few short years. It just isn’t that easy, because there is a very deep foundation of streamlining, elimination of waste, and optimization of workflows in the background that such people do not see.

Others can imitate what McDonald’s has done in a shorter timeframe, but someone still has to understand what needs to be done and do the work to make it happen. The greatest of fools assume that AI will do this work too.

Locklin also makes an argument that is similar to my reaction to Paul Romer’s Nobel Memorial Prize in Economics:

…the total number of Einsteins in the world, or even merely serious thinkers about physics is probably something like a fixed number. It’s really easy, though, to create a bunch of crackpot narcissists who have the egos of Einstein without the exceptional work output.

It isn’t at all clear that we have really benefited by vastly increasing the number of people who work in research. We should redirect many of the bright and technically minded people who do dead end science like quantum computing into more applied fields. Hell, we would probably be better off if we just convinced them to actually apply what they are doing now. They might learn something useful.

Fst and Selection

Fst by number of migrants

Greg Cochran had an instructive Twitter exchange on Fst and adaptation, which he expanded into a blog post at West Hunter.

Genetic similarity is usually described using the statistic Fst, the fixation index. The fixation index is a useful number, but it doesn’t mean what a lot of people seem to think it means.

There are a variety of ways to calculate genetic similarity. Let’s look at the definition Greg gives:

Fixation index, via Greg Cochran

N_e m stands for the number of migrants, with the subscript e reminding you that it represents people who not only moved into a new location, but successfully had kids there (along with some simplifying assumptions). Since this is about gene flow, the mechanism is reproduction. If we take this formula, we can see what Fst looks like by number of migrants:

A plot of Fst values

So when Greg says that the number of migrants per generation needed to keep populations genetically similar is 1, he is describing where the knee of that plot is. You get really big changes in Fst to the left of that point, but the plot is basically flat as the number of migrants goes up.
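
The formula in the image is, I believe, Wright's island-model approximation, Fst = 1/(4·N_e·m + 1). Here is a minimal sketch of it, with the caveat that the island model builds in strong simplifying assumptions:

```python
# Island-model approximation for the fixation index:
# Fst = 1 / (4 * Ne * m + 1), where Ne*m is the effective number
# of migrants per generation.

def fst(n_migrants):
    """Expected Fst given an effective number of migrants per generation."""
    return 1.0 / (4.0 * n_migrants + 1.0)

# Below one migrant per generation Fst climbs steeply; above it the
# curve flattens out fast -- the "knee" of the plot.
values = {nm: round(fst(nm), 3) for nm in (0.1, 0.5, 1, 2, 10)}
```

At the knee, fst(1) gives 0.2, close to observed human values like CEU versus YRI, while at ten migrants per generation Fst is already down near 0.02.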

I think I can see why people get confused. One migrant per generation seems trivially small. And it is! But what does this degree of genetic similarity really mean? Wikipedia has a chart taken from the International HapMap project:

Fst across the world

An average number of migrants per generation equal to one gets you approximately the genetic distance between white Mormons in Utah plus some Italians (CEU) and the Yoruba people (YRI).

Yoruba dancers

Ayo Adewunmi [CC BY-SA 4.0], from Wikimedia Commons

In this context, we can see that similar is far from identical. Other than obvious differences in appearance, sub-Saharan Africans like the Yoruba often differ from Europeans in things like malaria resistance or salt retention, so there are real differences in addition to real similarities. In theory, Fst goes from zero to one, but in practice we see numbers of 0.16 or less.

Much of the argument on Twitter was whether you could get any real genetic differences by selection with an average number of migrants per generation around 1. It certainly seems possible to me! Fst is a pretty high level model, and in general is calculated looking at lots of different loci. For selection to occur, you only need the frequency of one gene to change [for simple adaptations], which could readily happen without affecting Fst estimates at all. Based on this, Fst isn’t really useful in determining whether selection occurred. You would be better off looking for selection directly.

The Long View: Two Scientists

Albert Einstein and Marie Curie

This essay from John is an extended reflection on the lives of Albert Einstein and Marie Curie, who were famous at about the same time for about the same reason.

One hundred years later, I note that famous people’s personal lives are approximately as scandalous now as they were then. The mid-twentieth century placed a big premium on at least the appearance of propriety, and perhaps even encouraged it some, but the fin de siècle and Edwardian eras were a little bit more rowdy.

Because of Tim Powers’ novel Three Days to Never, I was familiar with Einstein’s marital adventures, but Curie’s were new to me. There is a fair bit here on the context in which both scientists’ work went on, which is handy if you like to see how everything fits together.

Two Scientists:
Common Topics in Biographies of
Albert Einstein and Marie Curie

An Essay
John J. Reilly

Toward the end of 2006, I happened to read back-to-back the biographies of two scientists who rose to prominence around 1900: Denis Brian’s Einstein (1997) and Susan Quinn’s Marie Curie (1996). Albert Einstein’s dates are 1879 – 1955 and Marie Curie’s (born Maria Sklodowska) are 1867 – 1934. To some extent their areas of study overlapped, so it’s not surprising that they often met, or even that their families sometimes vacationed together. Nonetheless, I was struck by the parallels in their lives, so much so that I began to wonder whether the parallels lay primarily in the lives of these contemporaneous people or in the interests of their contemporaneous biographers.

The biographies emphasize that both began as marginal, even bohemian figures, and ended their careers in that eminence beyond emeritus that attaches to the founders of major institutions. Einstein’s prestige saved the Institute for Advanced Study at Princeton from becoming just a make-work project for émigré Europeans, while the French state eventually created the Radium Institute to accommodate the Widow Curie. They both won Nobel Prizes: Curie twice, once for physics in 1903 with her husband Pierre Curie and Henri Becquerel, and again for chemistry in 1911 on her own. She is the only person so far to win two Nobel prizes. Both her prizes were connected with the accomplishment she is famous for, the discovery and refinement of radium. Einstein received his prize in 1921 for his early work on the photoelectric effect rather than for relativity. Both Curie and Einstein became celebrities before the term was invented. Both had scandalous personal episodes that these biographies treated with similar, lengthy sympathy, though Einstein and his executors did a good job of keeping his scandal confidential until long after his death.

Maria Sklodowska’s outsider status rested on being Polish, and a woman, and an agnostic; in the first and third points she resembled her father. She came from a landless branch of the numerous Catholic gentry in the Russian-controlled part of Poland. Both her parents were teachers in Warsaw. Her biographer emphasizes just what an odd place Russian Poland was. The official language of instruction in the schools, even the private schools, was Russian. This led on one hand to the preparation of Potemkin curricula to show the state inspectors when they came to visit, and on the other to remarkably high levels of illiteracy. Still, readers will be struck by how much less onerous the Russian Empire was than Poland’s later status as a nominally independent ally of the USSR. Apparently, people, goods, and money could flow to and from Poland without much difficulty.

When the time came for Maria to consider a place for higher education (which really was impossible for women in Poland at the time), the options were St. Petersburg or Paris: Berlin was not even on the screen. That, perhaps, tells us something about the cultural spheres of influence in Eastern Europe at this period. When Marie, as she was soon known, arrived at the Sorbonne, she was very unusual in being a female student but less so by being a foreigner. The two actually went together: a far higher percentage of foreign students were women than could be found among the native French students.

Einstein was an outsider in part because he was Jewish, but far more because of his personal eccentricities. His family was not observant. His biographer describes how Einstein’s parents took care to dampen an episode of adolescent piety on their son’s part. Einstein was born in Germany, but spent much of his youth in Italy, where his father and uncle operated one of a series of unsuccessful electrical engineering firms. His biographer points out that Einstein was a pretty good nuts-and-bolts engineer. He held several patents, for instance, and continued to freelance as a patent consultant long after he had become a famous physicist. Curie’s biographer makes much the same claims for her subject: after all, Curie received two Nobel prizes for practicing table-top physics on a nearly industrial scale. Later, as a director of a government-subsidized institute, she managed what was in effect a small processing-plant for heavy elements.

Einstein’s early academic career was spent largely in Switzerland, with a brief posting to Prague in what was then the Austro-Hungarian Empire. Again, the biographer is keen to make Eastern Europe seem as surreal as possible. In contrast to Curie, he was not a super-student. He was obviously very good at math and physics: he was able to secure just enough faculty patronage to ensure that his weaknesses in other areas were overlooked. He was not aggressive or confrontational, but he had no gift for faculty politics. He was the sort of person who would appear at an important social event not wearing socks. He wasn’t being rude; he would just forget. Once he had colleagues rather than superiors, however, he was able to develop the personal contacts that would secure him an appointment in 1914 at the University of Berlin. There he would remain until leaving Europe for Princeton in 1933.

Pierre and Marie Curie would have one of the great collaborative marriages in the history of science. Pierre Curie was older than Marie (his dates are 1859 – 1906). He came from yet another agnostic family, which was not so odd in academic France at the time, but he was very unusual in not having passed through the great preparatory schools. He was home-schooled; then he just showed up at the Sorbonne and started passing tests. He shared Einstein’s ineptitude for professionally advantageous socializing, but the lack of institutional alternatives to the Sorbonne in the French system may actually have helped to keep his professional progress steady. The Curies had two daughters, one of whom, Irène, would become a physicist and marry Frédéric Joliot: they became another noted husband-and-wife scientific team.

The interesting thing about the personal life of Einstein is that he tried to do just that with one Mileva Maric. She had been a science student, too. Unfortunately, she was not only not a super-student, she was unable to pass the Swiss qualifying exams that would have allowed her to teach science. They would have a daughter out of wedlock, whom Einstein insisted on giving up for adoption. He lost track of the child, but later dreaded the effect on his reputation should she make public her connection with him. He and Mileva married and later had two sons. The trajectory of their marriage seems to have been that, first, they could talk about science; then they could talk about domestic matters; then they could not talk at all. Einstein made no effort to remain faithful, and while he never abandoned Mileva in the sense of leaving her and his sons without support, the tale of the collapse of the marriage does him little credit. They divorced in 1919, against Mileva’s wishes. Albert later married his cousin, one Elsa Löwenthal, who never aspired to be his intellectual equal.

All these people were affected, more or less, by the intellectual fashions of their time. Einstein was mildly interested in spiritualism, at least to the extent of being persuaded that some reports of psychical phenomena were true, but he never pursued the matter. Pierre Curie, in contrast, was an enthusiast and a frequent participant at séances. Marie seems to have been persuaded, too, but like Einstein, she never chose to devote her own time to research in this area. Perhaps in a victory of ideology over the metaphysical impulse, both Curies felt that Émile Zola had said all that needed to be said when he attacked the reports of miracles at Lourdes. It would be inaccurate to call Einstein an atheist, or even an agnostic: he had a strong mystical streak, and though he repeatedly insisted he did not believe in a personal God, he made quite a few references to the Lord that do not seem to have been meant wholly metaphorically.

Einstein and the Curies were “people of the Left” in a general sort of way, but none was particularly interested in politics. Nonetheless, Marie’s academic career after the death of her husband in a traffic accident was affected by the Dreyfusard – anti-Dreyfusard structure which the politics of the Third Republic retained long after the Dreyfus Affair itself had been resolved.

Marie Curie was an obvious candidate when an opening became available for membership in the French Academy. However, there was another, somewhat senior, scientific candidate, one associated with the Catholic Institute, who had respectable qualifications. Curie’s work had been more important, but she was young and it was arguably the older candidate’s turn. This was the sort of question about which reasonable people could differ, but the anti-Dreyfusard press in particular was not in the business of being reasonable. Through some newspaper alchemy, Marie Curie became the candidate of a Jewish-atheist cabal at the Sorbonne, a cabal that sought to undermine French family life by touting the accomplishments of a woman working outside the home. We are informed that, even then, the French were worried about demographic collapse because of low birthrates: a reasonable point, though we should also note that France would have actually lost population in the first half of the 20th century had it not been for immigration from Eastern Europe. For whatever reason, Curie lost the election. That would not have been very important, were it not for the fact that the press had assigned her an ideological category and would react accordingly when she next became a figure of public note.

Marie Curie came close to public disgrace when, not long after the affair of the French Academy, she was revealed as The Other Woman in a different sort of affair, this one concerning the separation of her colleague, Paul Langevin, from his wife.

Langevin was an important physicist: Einstein once named him the person most likely to have formulated Special Relativity if Einstein had not done so first. In any case, by his own account, Langevin had a singularly unhappy marriage. His wife was physically abusive, he said, and extremely jealous; in Curie’s case, with good reason. After the scandal passed and the Langevins reconciled, she demonstrated that she was willing to tolerate her husband keeping an ordinary mistress of lower social status. The Nobel Laureate Curie, however, was at least his equal and a threat to the marriage.

The aggrieved wife went to the press and played the story brilliantly. Once again, the press chose sides; the biographer treats us to long samples of the invective that the reactionary Right heaped on Madame Curie. Entertaining as all this is, we may note that, by concentrating on the many unfair things that were said about her subject, the biographer relieves herself of the need to defend her subject. Curie really was conspiring to break up the marriage of a woman with four children, even if the mother-in-law told the newspapers there were six. In this case, Dreyfus was guilty.

The First World War obviated these matters. Curie soon won great public credit by designing and organizing the production of mobile X-ray wagons for the Allied field hospitals. Meanwhile, Einstein in Berlin continued to refine General Relativity: this biography makes clear the extent to which relativity was always a work-in-progress for him. Robert Heinlein once cruelly remarked that Einstein was a pacifist until his own ox was gored. In the First World War, that had not happened yet, and Einstein found himself more and more alienated from nationalist colleagues at the university. After the war, he tended to blame both sides equally. He generally sided with the moderate Left and maintained an early skepticism toward the Soviet Union. In later years he hesitated to criticize the USSR, however, having decided it was the chief bulwark against Nazism. Curie, for her part, never doubted that the Allies were wholly in the right, and was comfortable with such conventional initiatives as the League of Nations.

The one type of politics that always engaged the enthusiasm of both Curie and Einstein was ethnic. One of the first two new elements that the Curies isolated was named “polonium” after its co-discoverer’s homeland. Before Polish independence, Curie supported Polish causes to the extent she could do so without getting other émigrés in trouble; afterward, she was understandably made a national hero, and did what she could to promote Polish science and education. Einstein, for his part, was an early Zionist. After the First World War, his triumphal first trip to the United States was actually a fundraising campaign for Zionist causes. Of course, Einstein being Einstein, he could never quite stay on message: he was quite capable of describing Zionism as an effort to establish a Jewish homeland that would not necessarily be in the Middle East.

Curie made fundraising trips to America, too. The biographer seems rather scandalized that she let herself be taken in hand by one “Missy” Meloney, the editor of The Delineator, one of the major American women’s magazines of the era. This was a pro-family publication which made much of the fact Curie was raising two children. Nonetheless, the magazine saw no reason why Curie’s being a mother meant that she should not have the radium she needed for her institute, so a tour was organized to raise money to buy a gram of it. The gram, or rather the key to the lead-lined box in which the gram was kept, was presented to her by President Harding. A decade later, President Hoover presented her with another. Some people just attract hard-luck presidents, it seems.

Though Einstein and Curie were scientific celebrities during the same decade, and in large part because of their receptions in the United States, there was a difference in how the public viewed them. Neither of the Curies had ever dealt with the press very well, at least until Missy Meloney came along. They gave interviews, but favored one-word answers, and usually let reporters know they would be happy not to see them again. Einstein could be short with the press, too, but he quickly perfected the persona of the Trickster Sage. People wanted to know his opinions about everything. It was one of Einstein’s great strengths as a human being that he resisted the temptation to believe that he was omniscient just because everyone assumed he was. Nonetheless, he produced more than his share of quotable quotes on a wide range of subjects, most of them tactful and none with intent to cause offense. He was politely ironic, at a distance, to Adolf Hitler.

As for Marie Curie, Einstein knew her and liked her, but he remarked in correspondence that she was a grouch. If there are treasuries of the wit and wisdom of Madame Curie, her biographer does not mention them. She gave intelligent answers to intelligent questions about the medical uses of radium. For the most part, though, no one seemed much interested in what she had to say.

Einstein and Curie in later years were notable for their solicitude in helping young scholars get the support they themselves never had. As an academic bureaucrat, Marie Curie in particular was in a position to offer not just recommendations but jobs. Perhaps that is a predicate for biographies like these: you must produce a class of associates willing to talk about you.

A final point: Everyone knows that Einstein’s brain was removed from his body for study. Most of it is still in one jar, but parts have gone missing. From this biography I learned that Einstein’s eyes were taken, too: his ophthalmologist dropped by the morgue at the university hospital in Princeton where Einstein died and asked politely for them. They are now in a closet somewhere in New Jersey. Madame Curie died relatively young of what seems to have been anemia caused by radiation poisoning. You can read her notes at the Bibliothèque nationale if you have a mind, but to see some of the material you must sign a waiver of liability: it’s still “radioactive,” a term she and Pierre coined.

Copyright © 2006 by John J. Reilly


What works in healthcare?

Short answer: almost nothing.

I’m being flip about a very serious subject, but at the same time I am in fact serious. Modern medicine works a lot less well than you probably think it does.

How could I say such a thing when you look at graphs like this?

Maternal mortality over time
Our World in Data Global Health

I say it because that is precisely what the evidence shows. If you look at Cochrane, the world’s preeminent aggregator of medical statistics, it is hard not to come away a bit disappointed. Effect sizes [usually the difference between the test group and the control group scaled to standard deviation units] tend to be rather modest.
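
As a concrete sketch of the statistic in brackets, here is Cohen's d, the most common standardized effect size. This is my illustration with made-up numbers, not a Cochrane calculation:

```python
import statistics

def cohens_d(treatment, control):
    """Difference in group means scaled by the pooled standard deviation."""
    n1, n2 = len(treatment), len(control)
    v1, v2 = statistics.variance(treatment), statistics.variance(control)
    pooled_sd = (((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2)) ** 0.5
    return (statistics.mean(treatment) - statistics.mean(control)) / pooled_sd

# Hypothetical symptom scores: the groups overlap heavily, yet d comes
# out around 0.77, fairly large by the usual conventions and much larger
# than most reviews report.
d = cohens_d([5, 6, 7, 8], [4, 5, 6, 7])
```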

Here are a few examples:

You can amuse yourself by finding your own examples. There are a few things that genuinely work well. But even for things like MMR, the evidence isn’t as good as you might think.

I suspect what is going on is that medicine works, just barely, on average. You get things like the long slow decline of maternal mortality from the confluence of lots and lots of little things added together. If you look at almost anything else, like heart disease or cancer, you will see the same pattern. Vaccines are an exception. Disease rates for things with effective vaccines just drop off immediately.

Polio dropped right off

Heart disease is on a long, slow decline

Which brings us to my motivation for bringing this up at all. Random C. Analysis just published an updated argument that healthcare spending in the United States isn’t badly out of line with the rest of the world, once you take into account how much richer we are than just about everyone else. We have more, so we spend more. According to RCA’s data, when income goes up 1%, healthcare spending goes up about 1.6%.

My contribution to this is to suggest that the reason we purchase more and more healthcare is precisely because it doesn’t work very well. A lot of modern medicine treats symptoms better than causes, because we don’t understand the causes very well. You buy as much symptom relief as you can afford. Even when treatments are genuinely curative, the success rates are often low. For example, the controversial statistic number needed to treat attempts to quantify how many patients need to be treated in order to produce one cure.

This number is often quite large, on the order of 50 to 100. Even for a really good treatment with an NNT of 10, 9 times out of 10 the treatment does no better than the alternative it is being compared to. That is a lot of wasted time, effort, and money.

We just don’t know how to predict the 1 time it works, so we treat everybody and hope for the best.
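
The arithmetic behind NNT is simple: it is the reciprocal of the absolute risk reduction between control and treatment groups. The event rates below are made up for illustration:

```python
def number_needed_to_treat(control_event_rate, treatment_event_rate):
    """Patients treated, on average, to prevent one bad outcome."""
    absolute_risk_reduction = control_event_rate - treatment_event_rate
    return 1.0 / absolute_risk_reduction

# If 12% of untreated patients have the bad outcome versus 10% of
# treated patients, the absolute risk reduction is 2 percentage points
# and the NNT is 50: treat fifty people to help one of them.
nnt = number_needed_to_treat(0.12, 0.10)
```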

You can find speculation like this from Goldman Sachs that you can’t make enough money curing disease as compared to offering palliative care. I’m sure there really are businessmen who would gladly milk you for everything you are worth, but I don’t worry about it in reality because no one understands how to cure most of the things that ail us. It isn’t like miracle cures are being withheld, or even that research is being directed away from cures. We don’t know enough to do that.

If we really wanted to limit costs for healthcare, it might be possible if we ruthlessly limited access to only things that really worked. We would get 80% of the benefit for 20% of the cost. But people would be pissed. The other 20% of the benefit does actually work, sometimes. I don’t think this is possible, or even really a good idea. Any real solution will involve technology we don’t currently possess.

Linkfest 2018-08-06: Now with more science!

Barrow Steelworks

By unknown - 1877 or earlier, republished by University of Strathclyde project, Public Domain

Somehow I had never really captured the term, Second Industrial Revolution. This is the far more interesting one that came in the late-nineteenth, early-twentieth century. This is where we got electricity and steel and mass production.

A long journey to reproducible results

Reproducibility is often an afterthought in science, which means it is often quite hard to *actually* reproduce someone's results from their method section. Sometimes it is hard even if you call the scientist and ask them how they did it. True standardization is one of the fruits of the second industrial revolution, but we have forgotten how to use it.

Plan to replicate 50 high-impact cancer papers shrinks to just 18

A high profile project runs into trouble because of a lack of attention to standardization and reproducibility when experiments were first run. If you have experience doing this, it can be easy to help the next experimenter down the line. But you only get that experience by doing it....

Not a problem limited to the sciences either. One of the ways in which you can enable replication is to make all of the intermediate products of your research available, which I think ought to be a wider practice, especially for publicly funded research. With the raw data and the analysis script(s), you can then run the numbers yourself and see what happens. With online appendices, this could be easy.

A fine thread on the implications of the ability to make guns at a craft scale instead of the factory scale. 3D printing isn't the real issue, it is about machining know-how and a ready market in non-gun parts that can be turned into truly functional modern firearms.


I missed this one somehow, possibly because I wouldn't have waited for it to download when I was on dial-up. I just wanted to play Quake.

Why is so little plastic actually recycled?

A Danish and Swedish report on the practical difficulties of plastic recycling.

Grandmotherhood across the demographic transition

Longer lives meant more time with grandparents.

A step closer to BMD shield: India successfully test-fires interceptor missile

Outside of the context of American politics, a number of countries are working on missile interceptor technology.

Parking rules raise your rent

How Much Should Parking Cost?

Two data driven looks at the true cost of parking requirements.

Brief evolution of European armor

A nicely done graphic.

Linkfest 2018-07-30

Lord of the Rings by Frank Frazetta

The images from today's linkfest are Frank Frazetta illustrations of the Lord of the Rings. Frazetta was a prolific illustrator of comics, book covers, album covers, and paintings. His style is instantly recognizable to any fan of science fiction and fantasy, and perhaps is the epitome of SFF cover art. There are a lot of links this week about science fiction and fantasy works, so this just seemed right when it came through my Twitter feed. His children and grandchildren still benefit from his work, so please patronize their online shops.


Warhammer 40k is the thing I had most often heard described as grimdark, but it turns out there is a wide variety of books that could be described by that label. I might have to check it out.


The first of two related Brad DeLong links this week. A nice capsule history of China's relative position in the world during the twentieth century.

Curing cancer statistically via mammography

Many modern diagnostic techniques, while quite accurate in absolute terms, can have false positive results in numbers higher than true positives because the actual occurrence rate of what is being sought is low.
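
The base-rate arithmetic here is worth seeing once. This sketch uses illustrative numbers, not actual mammography statistics:

```python
def positive_predictive_value(prevalence, sensitivity, specificity):
    """Probability that a positive test result is a true positive."""
    true_positives = prevalence * sensitivity
    false_positives = (1.0 - prevalence) * (1.0 - specificity)
    return true_positives / (true_positives + false_positives)

# A test that catches 90% of cases and has only a 5% false-positive
# rate still yields mostly false alarms when prevalence is 1%:
ppv = positive_predictive_value(0.01, 0.90, 0.95)
```

With these numbers only about 15% of positive results are true positives, because the healthy population is roughly a hundred times larger than the sick one.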

A slightly gloating post, but arguably deservedly so, arguing with lots of graphs that self-published authors are overtaking traditional publishing at a rapid pace in science fiction and fantasy. Even more damning is the fact that much of traditional publishing's science fiction and fantasy sales are The Handmaid's Tale, currently trendy as an anti-Trump book.

Congress is giving the officer promotion system a massive overhaul

I once considered a career in the military. This is a big change in how promotions work, especially the end of up-or-out.


Robots and Jobs: A Check on Fear

A reasonable take, based on historical data about automation.


I might argue he never left, but there is a genuine neo-Aristotelian moment in analytic philosophy.

Underestimating the power of gratitude – recipients of thank-you letters are more touched than we expect

I just received a handwritten thank you note from my mother, so this came at the right time.

Why did the Industrial Revolution occur in England?

Why did the Industrial Revolution occur in England?

Pseudoerasmus tweets a chart looking at how few people were employed in the English agricultural sector in the eighteenth century.


A counter-point to DeLong's piece on China above, but with a disputed claim about agricultural productivity in Japan.

Compulsory Licensing Of Backroom IT?

I would genuinely like to know whether the claim is true that different executions of custom IT software are a large differentiating factor in the market right now.

Dollars for Docs

Public records on payments to physicians from pharmaceutical companies and medical device companies in the US.


Some data on why it makes economic sense [for developers] to build expensive housing right now.


The Marriage of Sam Gamgee and Rosie Cotton

A beautiful reflection on the little touches that make Tolkien so great, and why the Fellowship was composed of bachelors.

When Ramjets Ruled Science Fiction

Some of the most fun ideas in science fiction get disproven later. Ah well.

I need this for professional purposes.

The humanities are suffering from not being vocational.



LinkFest 2018-07-16

On the left, what everyone thinks machine learning is. On the right, what it actually is.

Ways to think about machine learning

I've been a skeptic about artificial intelligence in general, and a critic of the ways the actual technology has been hyped. This is a pretty reasonable take from someone who is willing to invest a lot of money in machine learning. Machine learning is another kind of automation. We've been seeing big things come out of automation for 100 years; it makes modern life possible, but it is easy to lose perspective.


Why the Future of Machine Learning is Tiny

An example of what machine learning can mean in practice.


Snapping Spaghetti

Applied mechanics of fracture with slo-mo video! Why does a piece of spaghetti break into three or more pieces when bent? Now you can find out!

Manufacturing output per capita, colored by what percent of the economy manufacturing is

Manufacturing output divided by employment in manufacturing; Canada and Taiwan were missing employment estimates

Global manufacturing scorecard: How the US compares to 18 other nations

Manufacturing stats are a subject of interest to me. I don't find much of interest in the Brookings manufacturing scorecard, which is just their subjective rating of various things. Rather, I plotted the manufacturing output for each country per capita, and per person employed in manufacturing, a kind of crude productivity number.

I think the *really* interesting thing here is how much Switzerland sticks out. The parts of the economy in Switzerland I am most familiar with are chemical precursors for pharmaceuticals and medical devices, which are both high value sectors.
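The crude productivity numbers described above reduce to two divisions. A minimal sketch, using made-up placeholder figures rather than the real Brookings data:

```python
def output_ratios(output_bn, pop_m, emp_m):
    """Return (output per capita, output per manufacturing worker) in dollars,
    from output in $billions, population in millions, mfg employment in millions."""
    per_capita = output_bn * 1e9 / (pop_m * 1e6)
    per_worker = output_bn * 1e9 / (emp_m * 1e6)
    return per_capita, per_worker

# Placeholder country figures: (output $B, population M, mfg employment M)
countries = {
    "Examplia": (500.0, 50.0, 5.0),
    "Sampleland": (200.0, 8.0, 0.6),
}

for name, figures in countries.items():
    per_capita, per_worker = output_ratios(*figures)
    print(f"{name}: ${per_capita:,.0f} per capita, ${per_worker:,.0f} per worker")
```

A small, specialized country like the hypothetical Sampleland can post a modest output per capita while showing very high output per worker, which is roughly the pattern that makes Switzerland stick out.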


When Evidence Says No, But Doctors Say Yes

This is a great article on how hard it is to find clear evidence that common therapies work, and how hard it is to disseminate that knowledge once we have it.


Israeli space probe to land on Moon in 2019

I was going to say this isn't surprising from a country that also made their own nuclear weapons, and then I saw the money for it came from a South African businessman. Israel and South Africa *probably* cooperated on nuclear weapons too.


Thou Shalt Not Wirehead: Religion vs Gratification

This is pretty good. I think I mostly agree, except I am also very interested in whether religion is *true*. Religion can be pretty helpful in encouraging behaviors that help you in this world, for example, the prosperity Gospel is pretty popular because it actually works out that way. If you give up drinking, gambling, and whoring, usually your life materially improves. But sometimes religion can make you do things that are the opposite of helpful in this world. For example, the Xhosa.


Welcome to the Party, Pal

A reflection on how the political coalitions in the United States came to be.


Does Free Trade Bring Lower Prices?

Dani Rodrik reminds us that we have to describe the world as it is when we make economic projections, not a model of it.


Donald Trump tells us truths we don’t want to hear

Matthew Parris argues that Donald Trump acts like an Emperor, and you shouldn't be surprised by that.


The Fear of White Power

What is the value of political correctness to a minority in society? And what is its cost?


Shortwave Trading | Part III | Fourth Chicago Site, East Coast, Patent, Regulation, and Farmer Kevin Mystery

High volume traders are rolling their own radio networks to get a leg up on the competition.


Traditional Euro-bloc: what it is, how it was built, why it can't be built anymore

The perfect counter-point to my post on modern urban development. We can't just build things because we like how they look, we have to care about money, and how neighborhoods evolve, and what will actually work for the people who live there.

Linkfest 2018-06-18

Perhaps Monday is the new Friday around here.

Conan the Barbarian: A Review, an Analysis, and a Little Bit of a Misunderstood and Improperly Played - While Talking About the Pulps

I found this reading the Conan roundup from Monday. I also rate the 1982 Milius Conan higher than Rick Stump. I love that movie, and I am astounded by how well it holds up. Nonetheless, this is a fantastic reflection on Robert E. Howard and his influence on the storytelling of the twentieth century.


There are reputable companies working in the same space as Theranos, but since there is either no hype or no scandal, we don't hear much about them.

There’s a Place for Us: Revoice and Gay Christian Futures

There’s a Place for Us Part II: More on Revoice & Gay Christian Homemaking

I really enjoyed Eve Tushnet's two-parter on being a gay Catholic, and I think she's completely right that an obsession with avoiding even the possibility of sexual feelings has cramped the friendships of too many people. As Eve rightly notes, this is not limited to those who identify as gay or lesbian, but affects all of us to some degree. This reminds me of things the Art of Manliness has written about friendship, from a completely different direction. Anytime I find two people with completely different perspectives and agendas talking about the same thing, I take notice.

The Murder That Changed Germany

I read John Schindler extensively for a while, then I started to be concerned that he had lost his mind. I'm glad to see he can still write a cogent column. The murders of so many young women in Germany by migrants of various sorts was the kind of thing predicted after Angela Merkel so unwisely threw open the borders. This prediction was then dismissed as racist trash, and inconveniently, happened anyway.

Violent crime rises in Germany and is attributed to refugees

This Reuters report states the facts succinctly.

Why Working on the Railroad Comes With a $25,000 Signing Bonus

Railroad work is irregular, hard, and dangerous. Consequently, it also pays well. Of course, this kind of thing can be highly cyclical, and under railroad union rules, the guys who get laid off will be the ones with the least seniority. Nonetheless, this is really good work.

The Lesser Cruelty on Immigration

Ross Douthat pens the kind of column on the fuckup at the southern US border that I wish I had written. I am resolutely against mindless cruelty, but there has to be some level of cruelty in a rich nation's border enforcement, or that nation will end.

McMoon: How the Earliest Images of the Moon Were so Much Better than we Realised

The more classified stuff comes out that we did during the Cold War, the more sympathetic I am to the idea that innovation in the US has slowed down.


Time has been kind to Francisco de Orellana.

Tollins: Explosive Tales for Children Book Review



Tollins: Explosive Tales for Children
by Conn Iggulden and illustrated by Lizzy Duncan
HarperCollins (2009)
175 pages
ISBN 978-0-06-173098-6

I previously knew Conn Iggulden from his work, The Dangerous Book for Boys, soon to be an Amazon Original series. Since I rather enjoyed that book, I picked this one up on sight. I wasn't disappointed.

In collaboration with illustrator Lizzy Duncan, Iggulden has created a rather charming children's book that is not-so-secretly a paean to science and the industrial revolution, in a very English way. I enjoy the dry, subtly sarcastic humor Iggulden uses to describe the Tollins, and their home of Chorleywood.

I opened up the book in the store and I read the opening paragraph:

Tollins, you see, are not fairies. Though they both have wings, fairies are delicate creatures and much smaller. When he was young, Sparkler accidentally broke one and had to shove it behind a bush before his friends noticed.

And I immediately started snickering. Paging through the first chapter, I quickly found more bon mots like this. My kids wanted to know what was so funny, so I had to sit down and start reading it to my 6-year-old and 3-year-old. My 6-year-old especially loves this book. The mixture of humor, adventure, and romance is just right for him.

Lizzy Duncan's illustrations really make this book work. Her work is expressive and in perfect counterpoint to the text. I enjoyed Sparkler and Wing's joy, consternation, and determination written on their faces. And of course, the super pathetic Tollins in jars.

Link is apparently making fireworks

This is a fine work that I look forward to reading many, many times to my children. I'll probably pick up the sequels as well.

My other book reviews

The Long View: The Hedgehog, the Fox, and the Magister's Pox

Greg Cochran has written one entry in what he threatened would be a new series, Speaking Ill of the Dead. The subject of that essay was Lynn Margulis, who came up by coincidence here recently. Stephen Jay Gould would probably be a feature in Greg's series also, if he hadn't so frequently excoriated him already. Even the gentle Henry Harpending [RIP] wasn't above mocking Gould.

Since I know that Gould wasn't above lying in the name of what he thought was a good cause, I have to consider all of his work suspect until proven innocent. I find my opinion of Gould ameliorated some by the fact that he knew the Renaissance humanists were an intellectual black hole: knowledge went in, but it never came back out.

Nonetheless, despite Gould's immense popularity, I have to question whether his net contributions to science were positive, largely because he used his popularity to taint well-supported ideas he found politically distasteful.

The Hedgehog, the Fox, and the Magister's Pox:
Mending the Gap Between Science and the Humanities
By Stephen Jay Gould
Harmony Books, 2003
274 Pages, US$25.95
ISBN 0-609-60140-7


This is Stephen Jay Gould's last book on natural history. The book began as the author's presidential address to the National Academy of Sciences in 2000, but most of the text is new, and not a compilation of previously published articles. Unfortunately, the author did not live long enough to proof the manuscript, with the result that some of the prose is a bit mysterious. This is true even of the title. The “hedgehog” and the “fox” need no introduction, perhaps (one of them knows one big thing, and the other a thousand small things), but the “magister” is merely identified as an Inquisitor of the Diocese of Pisa who failed to excise Erasmus's name from a book of natural history. Readers will puzzle why the magister is there. Despite these defects, however, this is a gripping book for anyone concerned with the place of the natural sciences in intellectual history. What we have is a moderate postmodernist's concept of a Theory of Everything.

Gould is trying to do three things in this book. The first is to trace “the battle between the ancients and the moderns,” which began with the beginning of modern science in the 17th century, and which has erupted episodically ever since. The second is to set out a model of knowledge that can reconcile the scientific and humanistic portions of the modern mind. The third is to refute a competing model devised by Gould's Harvard colleague, the sociobiologist E. O. Wilson. In 1998, Wilson published a book on this subject entitled “Consilience,” a term coined by the Victorian divine and Cambridge naturalist, William Whewell. Gould reminds us that he had rescued Whewell's term from almost complete obscurity some years ago. In this book, Gould plainly wanted it back.

Echoing a theme he has made in previous works, Gould points out that the Whiggish historical model of the development of science as a long battle with religion is simply false. The early scientists were all at least conventionally pious, some were fervently devout, and not a few looked on their research as a form of apologetics. Their real opposition was the humanistic tradition of the Renaissance. That type of scholarship, even when it dealt with natural history, was wholly literary. It was concerned with recovering and ornamenting the Classical past. Therefore it was suspicious of novelty, including new results from scientific research. To overcome that cast of mind, early modern science developed some bad habits, such as a studied indifference to literary style, and an implausible insistence that science is free, not just from biases, but also from preferences.

Gould recognizes that he is walking across a historiographical minefield here. The very idea of a “Scientific Revolution” has been called into question: early modern science was in fact pretty continuous with late medieval science. He also recognizes that “the Battle of the Books” or “The Battle of the Ancients and the Moderns” was not essentially a struggle between natural scientists and humanists. Many of the new science's greatest supporters were notable humanist scholars. Still, a point that he does not make is that humanist studies changed quite as much in the 17th century as science did. The whole of European intellectual life was moving away from the mere amassing of facts and discernment of correspondences, and toward synthesis of ideas and an appreciation of fresh observation. Science, really, was just one instance of a larger trend.

One may question whether the later episodes of intellectual conflict that Gould discusses are really rematches of the Battle of the Books, but they do have their own interest. Regarding the 19th-century conflict over evolution, Gould notes that, again, it was not a matter of science versus religion. Many scientists did not accept Darwin's conclusions, while many sophisticated religious people did. Actually, of those proponents of evolution who were inclined to pick a theological quarrel, it was not so much religion they opposed as Catholicism: Protestantism was supposed to be compatible with science.

The next supposed battle of the books, the controversy sparked by C.P. Snow's famous book, “The Two Cultures,” happened just at the beginning of Gould's academic career. He is now inclined to dismiss it as a tempest in an Oxbridge teapot. Maybe there were common rooms, in the colleges familiar to Snow, where all the dons were literary types who knew no science, and maybe there were other common rooms where the dons were scientific philistines whose minds were untainted by humane learning. If so, these creatures rarely ventured into the public eye.

Gould is no doubt right when he says that the “Two Cultures” debate was a battle of strawmen, but the matter is more problematical when we come to the “Science Wars” of the 1990s. At that time, some social theorists began applying the sort of deconstructive techniques to the methods and results of the physical sciences that had long been familiar to the humanities. At the extreme, this project could be taken to imply a denial of the possibility of an objective account of physical reality: the scientific view of the world becomes just another construct. Gould denies that any considerable number of scholars ever went to that extreme.

Frankly, my own impression is that a large percentage of the participants in these polemics conformed to stereotype, but maybe they were not representative of their disciplines. (In any case, as Gould points out, few working scientists even heard about the debate.) Gould himself was a member of the moderate social-constructivist wing. Though not denying the final validity of confirmed scientific results, he has always been keen on the relationship between science and cultural history. He is, moreover, a notable essayist: like Freud, he might retain some status as a literary figure, even if all his science were later dismissed. In fact, when Gould talks about the reconciliation of the sciences with the humanities, one cannot escape the impression that he means that more people should do what he does.

And how might science and the humanities learn to live together? Through the recognition of NOMA: “non-overlapping magisteria.” The physical sciences have their own methods and manner of growth. They deal with the objective material world. Their results, however, tell us nothing about art or beauty or ethics or theology (theology properly understood, no doubt). Gould quotes Hume to the effect that you can't get to “ought” from “is,” but insists that “ought” is vitally important. Ethical issues may be addressed through systems of organized knowledge, and the results of scientific research may settle the factual context of ethical questions. However, questions of ethics and value, though quite legitimate, just are not scientific questions.

One might say that what we have here is another example of the model of the “Two Swords.” “Render to Caesar the things that are Caesar's and to God the things that are God's”: this grasping for a dichotomy that is not a dualism appears in couplets from the distinction between noumenon and phenomenon to the separation of Church and State. However, this model did not prevent the Investiture Controversy, and its performance has been spotty ever since. For one thing, it is never clear which side has a right to what. Gould sometimes seems to suggest that science is the rightful possessor of quantity, whereas theology and the humanities are the proud possessors of qualia, but he never quite makes the leap. One could see why: that would be an admission that science is the realm of gray theory, while the humanities deal with real life. Of course, there are other reasons to hesitate: the humanities don't really seem to be up to shouldering the responsibility.

Gould's model of knowledge, then, is not a monolith, but a “coat of many colors.” In fact, it bears more than a little resemblance to his model of the biosphere. It is governed by contingency. Its future evolution is fundamentally unpredictable. However, even though it is non-hierarchical, it is yet by no means an anarchy. Furthermore, it is progressive. As we noted above, Gould revived the notion of “consilience,” or “jumping together,” from the 19th-century historian of science, William Whewell, who coined the term to refer to the act of induction that creates a new synthesis. The most famous example is Newton's realization that the fall of an apple and the orbit of the moon are expressions of the same principle. Gould proposes Darwin's formulation of biological evolution as the most important example of consilience in history. In any case, while Whewell noted that consilience operates within each physical science and even between them, he cautioned against attempting to synthesize the sciences and the humanities by this method. If I understand Gould's précis correctly, Whewell's position was that the humanities just did not function like the sciences. Consilience was the way that Whewell had observed science to proceed. No similar pattern appeared in art or politics or theology. Though Gould suggests that the sciences and the humanities might unite in a consilient fashion to address important issues, he agrees with Whewell that they cannot be incorporated into a single structure, especially not one that assigns places of higher and lower.

It was precisely such a hierarchical structure that Gould's Harvard colleague, E. O. Wilson, proposed in his book, “Consilience.” Wilson's hierarchy actually appeals to Gould. Scientific hierarchies tend to take physics as the summit and the model, granting a progressively lower status to each discipline as its subject matter becomes more complex and murky. In such systems, Gould's own subject of evolutionary biology is pretty far down the scale (Wilson is an expert on ants, by the way). The social sciences are submerged in the nether muck.

Wilson flips this structure. Physics is only the beginning, the least of the sciences. The hard questions are the ones that deal with life, with community, and finally with mind. So, if Gould accepted Wilson's model, Gould's area of study would be raised up high, though not so high as, say, musicology. The price would be that the phenomena of every discipline must be assumed to be reducible to a mechanical physical substrate.

Gould tries to distinguish the practice of materialist reduction, which Wilson advocates, from Whewell's consilience. It's harder than you might think. In any case, what Gould really advocates is the different, but essentially complementary, attitudes toward knowledge that are represented by the animals of the title of Gould's book. The hedgehog addresses a problem by taking it apart into ever-smaller problems, and always finding more to discover. The fox takes bits and pieces of information from every direction of the compass, and puts them together to make something new. Gould's objection to Wilson's version of consilience is that it would require all foxes to become as monomaniacal as hedgehogs.

Wilson is aware that his materialism is a metaphysical proposition. Still, he feels confident that his metaphysics will be borne out. The only reason that Logical Positivism failed in the first half of the 20th century, Gould presents Wilson as saying, was that we did not know how our brains work. Now, according to Wilson, we are starting to learn just that, and eventually we will have a neurological explanation of qualia. (Curiously, though Wilson addresses this question in his book, the term “qualia” does not appear in the index.) Then we will have not only a mechanical account of subjective experience; we will understand ethical and even esthetic issues through neuroscience, informed by evolutionary biology. In Wilson's sociobiology, the moral sense is simply an embedded version of evolutionary experience. Conformance to that experience is the only meaning that “right” can ever have.

There are some obvious things to say about this model, and Gould says them. He points out that Logical Positivism was not abandoned because of a lack of information, but because of internal incoherence. As a general matter, in fact, it is hard to see how questions of pure logic could ever be usefully reduced to neurological events. More important from Gould's point of view, it is difficult to see how a subject like evolutionary biology could be incorporated into a reductionist system, since so much of life seems to be a collection of frozen accidents. And if you take a strong view of emergent behavior, which is to say, of those features a system displays but its parts do not, then reduction becomes impossible in principle: you can't reduce some phenomena down to a lower level because they just do not appear on a lower level.

Yet for all this, I thought that Gould refuted Wilson less thoroughly than might have been the case, simply because they are working from the same metaphysical premises. Putting aside the awkward question of the ontological status of mathematics, we see Gould objecting that Wilson's system would provide no method for critiquing such venerable practices as cannibalism or infanticide, if they turned out to be part of the evolutionary heritage of the moral sense. Actually, Gould does concede that tendencies to such behavior may well be part of our evolutionary heritage, but he assures us that we are not bound by those tendencies. We can critique and control them through other ways of knowing. To that, Wilson could respond that higher morality is also part of the inherited moral sense, and that it is precisely the goal of consilient science to show why these manifestations of the moral sense are arranged hierarchically.

So what can we say about Gould's scheme of reconciliation? Certainly his instinct toward tact is to be applauded, as is his honest curiosity about opposing points of view. Every intellectual era has its characteristic virtue, and the gift of postmodernism was the technique of “teaching the conflict.” If some great question cannot be resolved, or cannot be resolved now, still there is great merit in the ability to set out all the major arguments and make them talk to each other. The characteristic flaw of postmodernism was to make the postponement of finality a permanent principle. This is why Gould's attempt to mend the gap between the sciences and the humanities had to fail. A Theory of Everything is a sort of eschaton. That is where all the questions have to be answered.

Copyright © 2005 by John J. Reilly

Why post old articles?

Who was John J. Reilly?

All of John's posts here

An archive of John's site

The Long View 2006-01-18: Suicide, Iran & Environmental Collapse? Damn the Demographics!

I was first introduced to James Lovelock and Lynn Margulis' work with the Gaia Hypothesis through the simulation game SimEarth. A year ago I posted to an interview with Lovelock, and at the time I assumed that he said crazy things because he was old. I can now see that Lovelock was always crazy.

Lovelock is 98, and still around to give astonishing interviews. Margulis is not, but she was equally crazy. She did one big thing right, supporting the notion that mitochondria and chloroplasts originated as independent prokaryotes, and then went massively off the rails everywhere else.

Suicide, Iran & Environmental Collapse? Damn the Demographics!


Regarding the Oregon Suicide-by-Doctor law, which the Supreme Court has just upheld, let me repeat that I dislike this law as a matter of public policy. Nonetheless, the statute was valid. It did not contradict federal drug-control statutes, much less the federal constitution. The question, in fact, was enough of a no-brainer that it was an embarrassment to see that three justices were willing to say they thought otherwise.

Look, the Conservative Party is supposed to be the Principled Party. That means the party that is willing to accept defeat on a partisan issue if that is what is necessary to maintain the rule of law. Had a majority of the court voted to overturn the law, a specific evil would have been avoided, but at the expense of the rule of law and of the credibility of the justices. The court is going to need that credibility if the case that overturns Roe is to be seen as anything more than a press release from the Republican National Committee.

* * *

And what of Niall Ferguson's credibility? A historian is always regarded with suspicion by his colleagues when he speculates about the future, as Ferguson does in the brief essay: The origins of the Great War of 2007 - and how it could have been prevented:

The devastating nuclear exchange of August 2007 [between, probably, Iran and Israel] represented not only the failure of diplomacy, it marked the end of the oil age. Some even said it marked the twilight of the West. Certainly, that was one way of interpreting the subsequent spread of the conflict as Iraq's Shi'ite population overran the remaining American bases in their country and the Chinese threatened to intervene on the side of Teheran.

Yet the historian is bound to ask whether or not the true significance of the 2007-2011 war was to vindicate the Bush administration's original principle of pre-emption. For, if that principle had been adhered to in 2006, Iran's nuclear bid might have been thwarted at minimal cost. And the Great Gulf War might never have happened.

We should note that any such conflict today could not expand in the fashion of the First World War. In 1914, an algorithm of treaty obligations and diplomatic understandings ensured that, in a matter of weeks, the war would spread across Europe. (Ferguson has said otherwise; he's wrong.) Even the lesser European powers had deployable forces all ready to go. That was an unusual start for a major war, of course. Compare World War II, which arguably began when the Japanese pushed south into China in 1937 and was still adding major participants in 1941. That is the sort of slow accretion that an initial strategic nuclear exchange would exclude. Israel as a nation might not survive such an exchange. It would be surprising if the Iranian state survived. The wild card would be what would happen to Islam if the holy sites in Saudi Arabia were nuked and the hajj became impossible.

And of course, the hypothesis of an "exchange" is unlikely, too. Iran wants ballistic nuclear weapons to obtain a measure of immunity from regime removal by the United States. That would allow Iran to operate a terror network and conventional forces with a measure of impunity. That calculation would work until missile defenses can reliably stop short and medium-range ballistic missiles. We can expect that to happen at no distant date.

* * *

According to James Lovelock, originator of the Gaia hypothesis, The Earth is about to catch a morbid fever that may last as long as 100,000 years. As he notes more in sorrow than in anger in The Independent:

This article is the most difficult I have written...Gaia has made me a planetary physician and I take my profession seriously, and now I, too, have to bring bad news. ...She has been there before and recovered, but it took more than 100,000 years. We are responsible and will suffer the consequences: as the century progresses, the temperature will rise 8 degrees centigrade in temperate regions and 5 degrees in the tropics...Curiously, aerosol pollution of the northern hemisphere reduces global warming by reflecting sunlight back to space... We are in a fool's climate, accidentally kept cool by smoke, and before this century is over billions of us will die and the few breeding pairs of people that survive will be in the Arctic where the climate remains tolerable.

Collapse is not quite so imminent that we might not hope to read all about it in Dr. Lovelock's forthcoming book, The Revenge of Gaia.

Regarding this article at hand, readers of Olaf Stapledon's Last and First Men may be reminded of the end of the First Men, who survived in the Arctic in a few small groups after Earth undergoes a sudden and violent heating. Elsewhere in this article, mention is made of a future ruled by barbarian warlords, which gets us back to Mad Max country. All in all, the article is an exercise in apocalyptic nostalgia.

Lovelock's assessment is that ecological collapse is irreversible. Some of the brighter environmentalists understand that this view might interfere with fundraising:

"If any of us back up behind that idea we might just as well slit our wrists," said Aubrey Meyer, the director of the Global Commons Institute, which campaigns hard for an approach to limiting greenhouse gas emissions known as Contraction and Convergence, based on moving to equal emissions entitlements per person everywhere around the globe.

The Gaia Hypothesis has a sensible version: the biosphere and the atmosphere interact over time to keep the surface temperature within a narrow range. That's probably true. However, there is also a nonsensical version, promoted at times even by Lovelock himself, that says the biosphere is a living thing, with many of the attributes of a deity. That is, to put it politely, a category mistake.

I am a great fan of decreasing CO2 and methane emissions, if only as a matter of better engineering. Actually, I am quite as capable of panicking about global warming as the next guy: when I woke up this morning, the temperature was 60 degrees Fahrenheit. By 8:00 AM, the sky was still so dark that the street lights were shining. Neither of these things is supposed to happen in the New York area in mid-January. However, the environmentalists are the last people I would ask for an explanation of what is happening or what to do about it. The environmental industry derailed nuclear power in the 1970s; today it advocates vacuous non-solutions like wind power. The really strange ones are trying to discourage hydroelectric power. And all their specific predictions have been wrong for 40 years.

* * *

And demography is worse, if you believe J. R. Dunn in his article How Demography Fails. He takes particular aim at the notion that Europe must inevitably turn into Eurabia, a development now regarded in some quarters as being as inevitable as environmental collapse:

The argument is straightforward: the native European population is dropping, with birthrates in all countries below replacement level. The Muslim populace, for the most part unassimilated, is still expanding. One curve is going up, the other down. When they cross, Europe will have effectively come under Muslim control.

But is it truly that simple? After all, there’s a reason why you’re not reading this in a U.S. with a population of 500 million+, which is what demography foresaw in 1950. Or in the 2006 world of 8 billion souls, as predicted ten years later. And certainly not in the 21st century universally forecast in the 70s, in which a few survivors grub about in the ruins left by the Great Crash following a runaway population explosion.
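Dunn's crossing-curves argument can be made concrete with a toy projection. The sketch below uses invented numbers (they are not Dunn's, nor anyone's real estimates) and assumes constant annual growth rates, which is precisely the naive extrapolation the article criticizes: real populations change their fertility, migration, and assimilation behavior long before the curves cross.

```python
# Toy model of the "one curve going up, the other down" argument.
# All figures are purely illustrative, not real demographic estimates.
def years_until_crossing(major: float, minor: float,
                         major_rate: float, minor_rate: float) -> int:
    """Return the first year in which the minority population exceeds
    the majority, assuming constant annual growth rates."""
    year = 0
    while minor <= major:
        major *= 1 + major_rate
        minor *= 1 + minor_rate
        year += 1
    return year

# e.g. 60 million shrinking 0.2%/yr vs. 5 million growing 2%/yr
print(years_until_crossing(60e6, 5e6, -0.002, 0.02))  # -> 114
```

Even with a twelvefold head start eroding, the crossing is more than a century away under these assumptions, which is ample time for either rate to change; that is the weakness of straight-line demographic forecasting.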

Yes: where do they sell Soylent Green? Perhaps they will on Svalbard, when the remnants of humanity in that Arctic archipelago grow peckish.

Copyright © 2006 by John J. Reilly

Why post old articles?

Who was John J. Reilly?

All of John's posts here

An archive of John's site

The Long View 2006-01-13: After Roe; Predators; Indigos; Pigs; Panspermia

Lots of fun little bits in this one. There are reasonable protections extended to wilderness, and then there is the Wildlands Project, a scheme to return vast areas of the United States to howling wilderness. This goal isn't particularly hidden: you can find references to it in the project materials linked from Wikipedia.

I like natural landscapes too, but this is all a little nutty.

There is also a reference to Arthur C. Clarke's creepy book Childhood's End. Many of the mid-twentieth century sci fi authors had really weird streaks, and Childhood's End displays Clarke's. Gnosticism is always lurking somewhere.

Also. Glowing pigs.

After Roe; Predators; Indigos; Pigs; Panspermia


Plausible deniability: this was all the Alito hearings were about. There was never much hope that the nomination could be stopped. The Democratic senators were chiefly concerned to ensure that the cultural left does not strike at them when Roe v. Wade is overturned. As I have remarked, the party will greatly benefit from that event, because the party will be able to field pro-life candidates, or at least candidates who do not have to take a pro-choice position. The opportunities in the Red States are for the future, however. In the near term, the senators had to placate the interest groups that made their election possible.

The hearings were yet more evidence that Roe would have to be repudiated even if it were about double parking. (I know I have said that before; it's a good line.) The cultural left has adopted the position that there is an invisible ham sandwich in the Liberty Clause of the Constitution. Since the existence of this sandwich cannot be demonstrated by text or history, constitutional jurisprudence has become the art of selecting those very special judges who can see it.

I cannot emphasize too strongly that it will not be enough for the Supreme Court to repudiate the holding in Roe (it might be possible to justify the holding in Griswold, the case that found a right to use contraceptives, but with far narrower reasoning). The Court has to repudiate the style of constitutional interpretation that made the decision possible. If the decision that overturns Roe simply declares that the Court, of its goodness, has determined to exercise its discretion in the opposite direction, then we will just be waiting for the next constitutional explosion.

* * *

But foreign courts are worse, as we see in this outrage from Sweden:

A Swedish farmer sentenced to six months in prison for shooting a wolf on his farm in Dalsland, central Sweden, is appealing to the government for a pardon...The man was found by the district court not to have broken the law, but he was convicted in the court of appeal. It was decided that although the man had reason to believe that the wolf would attack, as the wolf had attacked his neighbour's sheep an hour earlier, too much time had passed between the attack on the sheep and the farmer shooting the wolf.

I am at a loss to understand the fondness of the environmental lobby for dangerous predators. The reasons for keeping dangerous creatures away from farms and homes are primordial:

A South African anthropologist said Thursday his research into the death nearly 2 million years ago of an ape-man shows human ancestors were hunted by birds...[An] Ohio State study determined that eagles would swoop down, pierce monkey skulls with their thumb-like back talons, then hover while their prey died before returning to tear at the skull. Examination of thousands of monkey remains produced a pattern of damage done by birds, including holes and ragged cuts in the shallow bones behind the eye sockets...Berger went back to the [Australopithecus child's] skull, and found traces of the ragged cuts behind the eye sockets.

The dangerous animals don't need to be extinct. Because developed countries are rapidly becoming reforested, there should be no lack of places for them to live. That's why they should be shot on sight when they enter inhabited areas.

Tough, but fair.

* * *

Speaking of impending extinction, this lifestyle piece from the New York Times is extremely sinister:

If you have not been in an alternative bookstore lately, it is possible that you have missed the news about indigo children. They represent "perhaps the most exciting, albeit odd, change in basic human nature that has ever been observed and documented," Lee Carroll and Jan Tober write in "The Indigo Children: The New Kids Have Arrived" (Hay House). The book has sold 250,000 copies since 1999 and has spawned a cottage industry of books about indigo children.

More prosaically, "indigo children" seem to be intelligent tykes with attention deficit disorders whose mothers prefer to believe that their offspring are the next stage in human evolution rather than that these kids need either drugs or no-nonsense discipline. "Indigo" is supposed to describe the auras of these prodigies; the term is sometimes used these days simply to mean a young, creative person.

I knew from my studies of esoteric fascism that this notion of a mutant generation was the sort of thing that Madame Blavatsky used to go on about. I quickly discovered that I was not the first to make the connection:

We are the last generation of the 5th root race. Our soul color is violet. Since about 1975 the first generation of the 6th root race has been coming in. Since the year 2000, 100% of the children being born are of the 6th root race. Their soul color is indigo. These are the "new and improved" spiritually evolved humans. Every Indigo Child has a " creative genius" within them waiting to be discovered and expressed. They have something new, something advanced, to bring to the world to evolve humanity, be it in the field of art, science, technology, philosophy, religion, and so on.

We went through all this in the 1960s, you know. Remember Consciousness III in The Greening of America? Now the question is whether sales of Arthur C. Clarke's Childhood's End are picking up again.

* * *

Indigo children may be imaginary, but fluorescent pigs are a fact, according to the BBC:

Scientists in Taiwan say they have bred three pigs that glow in the dark...They claim that while other researchers have bred partly fluorescent pigs, theirs are the only pigs in the world which are green through and through...The pigs are transgenic, created by adding genetic material from jellyfish into a normal pig embryo....In the dark, shine a blue light on them and they glow torch-light bright.

The scientists did not just do this on a bet: it's easier to work with genetic material if it fluoresces. If they begin to work on flying pigs, however, we will know they are not serious.

* * *

The term panspermia does not appear in the NASA pages about the Stardust mission. Stardust is the space probe that collected dust from the comet Wild 2; Stardust is supposed to land in the Mojave Desert this weekend. "Panspermia," of course, is the notion that life spreads through space in the form of living or nearly living spores. If that is the case, then we are relieved of the embarrassment of figuring out how microorganisms appeared so quickly after the Earth formed.

Stardust was dispatched in part to answer questions about proto-biology, but I have not seen any speculation about whether the probe might bring back something living. When samples were brought back from the moon, both the samples and the astronauts were quarantined against the possibility that they might carry an infectious lunar organism. In connection with a comet, though, such precautions would make little sense. Cometary material is always raining into Earth's atmosphere: if Wild 2 carries spores from deep space, they would be here already.

Actually, I seem to recall that an Indian scientist proposed that novel infectious diseases do in fact drift down from space. Better not to think about it.

* * *

You are generous people, you who have been sending money and buying stuff through the boxes on this website. This is all very much appreciated. Thank you.

Copyright © 2006 by John J. Reilly


The Long View: The Years of Rice and Salt

N = 1

This book review is the source of one of my favorite cocktail party theories: a number of seemingly well-established sciences are built upon an n of 1. In a grand sense, geology and biology fall into this category, since the big theories like plate tectonics and evolution depend on one big sequence of inter-related events. In a micro-sense, you can see if similar things happen in different times and places, but the overall development of life on earth, or the development of the earth itself, only happened once, and we lack the capacity to conduct meaningful experiments about such things. Of course, the universe itself, the subject of the grandest of all theories in science, also falls in this category. Perhaps that explains the need to invoke the multiverse.

I don't have any complaints about the way these sciences have been pursued; it just strikes me as funny that some really big scientific ideas aren't actually amenable to experiment. We can conduct experimental programs that build up the foundations of such ideas, but we can't wind the universe back up, set it down again, and see what happens the second time, which is the foundation of all experimental philosophies of science. Maybe that is why I like alternative history and science fiction: they are how we acknowledge this weakness.

The Years of Rice and Salt
By Kim Stanley Robinson
Bantam Paperback 2003
(Hardcover 2002)
763 Pages, US$7.99


This review appeared in the
Spring 2006 issue of
Comparative Civilizations Review


Once upon a time, a course in science-fiction writing was offered at Rutgers University. The grade was based on stories written by the students, but the instructor offered an exam option as a joke. It included this memorable question: “Describe the influence of the papacy on medieval Europe.” The question posed by this novel is actually more ambitious: what was the effect of post-medieval Europe on world history; or more precisely, what would the world be like if there had never been a European modernity? In the course of answering this question, Kim Stanley Robinson has written what may be the finest example thus far of Alternative History: historiographically sophisticated, with plausible characters, the book is essentially world history made readable as a series of biographies. Best of all, at least from the perspective of an admiring reviewer, the book presents a model of history that is both demonstrably and instructively false.

The premise of the story is that the outbreaks of plague in 14th century Europe were far more deadly than they historically were. The whole continent, from Britain to Constantinople, and from Gibraltar to Muscovy, is wholly depopulated. The action starts around 1400, when a deserter from the horde of Timur the Lame gets an inkling of the disaster as he wanders through the deserted landscapes of Hungary and the Balkans. He is enslaved by Turks; he is sold to the treasure fleet of Zheng He, who happened to be in East Africa on one of his famous oceanic expeditions. Eventually, the deserter dies as an innocent bystander at a court intrigue of the early Ming Dynasty.

In the course of this man’s adventures we meet pretty much all the people we will be meeting for the next 700 years. The conceit that holds the book together is that people are reincarnated, in much the way contemplated by Tibetan Buddhism, and that they normally progress through time with the same companions. In “The Years of Rice and Salt,” the principal companions are the Revolutionary, the Pious Man, and the Scientist; the Idiot Sultan puts in several appearances, too. Some of the most interesting passages in the book are set in the bardo state, between incarnations. Depending on the period in which they most recently lived, the companions take these interludes more or less seriously. During one such incident, the Revolutionary becomes exasperated with the Pious Man’s spiritual and historical optimism: “We may be in a hallucination here, but that is no excuse for being delusional.”

Macrohistory in this scenario differs from that of the real world more in detail than in broad outline. The 15th century discovery of the Americas is cancelled, for obvious reasons. Less than a century later, however, a Chinese fleet sent out to establish a base in Japan discovers the Inca Empire. Not long thereafter, the oceanic explorers from Firanja, a Europe resettled from North Africa, discover the east coast of the western continents. These penetrations from Eurasia are slow enough, however, to allow the politically ingenious people around the northern continent’s great freshwater lakes to adapt to the new diseases and to organize defenses. In later years, their model of democratically representative federal government would become the best hope of mankind.

The parallels continue. In Samarqand, in what would have been the late 17th century if anyone were using that reckoning, an alchemist notes that different weights of the same material fall at the same speed; soon there is a mathematics to express acceleration. Move forward another century, and we see scholars in the fracture area between China and Islam trying to reconcile the intellectual traditions of the two. The result is the beginning of a secular, enlightened science of humanity. A noble passage from their work runs thus:

“History can be seen as a series of collisions of civilizations, and it is these collisions that create progress and new things. It may not happen at the actual point of contact, which is often wracked by disruption and war, but behind the lines of conflict, where the two cultures are most trying to define themselves and prevail, great progress is often made very swiftly, with works of permanent distinction in arts and technique. Ideas flourish as people try to cope, and over time the competition yields to the stronger ideas, the more flexible, more generous ideas. Thus Fulan, India, and Yinzhou are prospering in their disarray, while China grows weak from its monolithic nature, despite the enormous infusion of gold from across the Dahai. No single civilization could ever progress; it is always a matter of two or more colliding. Thus the waves on the shore never rise higher than when the backwash of some earlier wave falls back into the next one incoming, and a white line of water jets to a startling height. History may not resemble so much the seasons of the year, as waves in the sea, running this way and that, crossing, making patterns, sometimes to a triple peak, a very Diamond Mountain of cultural energy, for a time.”

The hopes of this period for universal reconciliation are shattered by power politics; the power in this case coming from the steam engines of the trains and warships of southern India, whose Hindu regions were the first to master mechanical industrialization. These techniques soon spread universally, however. In the earlier parts of the book, it sometimes seemed to the characters that China would take over the world. This fear performed the minor miracle of uniting the huge and fractious Islamic world, which in turn posed a threat to China and India. Thus, in the closing decades of the decrepit Qing Dynasty, the Long War began, which essentially pitted eastern and southern Asia against the Middle East, Firanja, and northern Africa. It went on for 67 years, killing perhaps a billion people all told. Even in the middle of what would have been the 21st century, the world had still not recovered from it psychologically, however much social and technological progress had occurred.

In some ways, the postwar parts of the book are the most fun. In western Firanja, disgruntled intellectuals chatter in cafes about the history of everyday life and the perennial oppression of women. A musician takes the name “Tristan” and becomes a sort of one-man Solesmes, resurrecting the plainchant of the vanished Franks. There is a subplot about how physicists collude to avoid building an atomic bomb. There are conferences of historians in which the author gets to critique his own devices. A panel on the nature of the plague that destroyed Europe comes no closer to explaining what happened, perhaps for the excellent reason that the real Black Death was probably the worst that could have happened. We get a discussion of reincarnation as a narrative device and, better still, of narrative structures in historical writing, particularly in narratives of historical progress.

The book ends peacefully, with an elderly historian, the Pious Man, settling into semi-retirement at a small college in a region that is not called California. In a way, he had achieved the era of perpetual Light that people like him had always hoped for, but the eschaton is more like that of Francis Fukuyama than of any of the great religions. There was really only one way that history could go, we are led to believe. In the closing sections, children in his campus village hunt for Easter eggs in springtime, but of course they don’t call them Easter eggs.

The speculation in "The Years of Rice and Salt” presents the same sort of issue that Stephen Jay Gould addressed in “Wonderful Life.” In that work, Gould considered what would happen if biological history were begun again. Would it follow the course of the history we know, and arrive at something like our world? Gould answered “no.” His principal evidence, an interpretation of the Burgess Shales, collapsed a few years later when better preserved fossils from the same period were discovered. His larger contention is still open to debate; the matter can be decided only when we can compare the evolutionary history of Earth to that of another earth-like planet. At this point, it seems to me that Gould was probably wrong: evolution does tend toward certain solutions. I would say the same about human history, and so, apparently, would Kim Stanley Robinson. In this novel, however, the most remarkable effect of the deletion of the West is that there is no effect. This is almost surely wrong.

Consider a few of the notable figures in this alternative history: a Chinese Columbus, an Uzbek Newton, an Indian Florence Nightingale. They not only perform roughly the same historical functions as their real-world counterparts; except for the Columbus figure, they each do so at roughly the same time as their real-world counterparts. It is hard to see why this should be. The West did not decisively influence the internal affairs of the two greatest non-Western imperia, China and the Ottoman Empire, until well into the 19th century. There is no particular reason why sailors from Ming China could not have discovered America. For all we know, maybe a few did. Even if that discovery had become well-known, however, it would have made little difference. For internal reasons of cultural evolution, China was no longer looking for adventures. Similarly, there is no reason why the physics of Galileo and Newton could not have been discovered in Central Asia in the 17th century, if all that was necessary was cultural cross-fertilization and a frustrated interest in alchemy.

There are in fact good reasons for making India the site of an alternative industrial revolution. Its patchwork of states, so reminiscent of Baroque Europe, might well have offered both the intellectual sophistication and the political license to develop a machine economy. The problem is that no such thing seems to have been happening when the English acquired control over most of the subcontinent in the 18th century. There was considerable Indian industry, of course, but it was not progressive in the way that European industry was in the same period. It was not just a question of technique; industrial development requires financial sophistication and acceptable political risk quite as much as it requires engineering. India was kept from developing by the government of the Idiot Sultan, and he was wholly indigenous.

Toynbee defined civilization to be a class of society that affords an intelligible unit of historical study. The nations or other units that comprise a civilization could not be understood in isolation from each other; the larger ensembles to which a civilization might belong are accidental or not constant in their effects. Toynbee modified his ideas in later life, but this definition is helpful here.

We see even in the dates in this book that something literally does not compute. Most numerical dates are given in the Muslim reckoning; actually, it is easiest to find your way around if you keep a chronological list of Chinese emperors handy. Even though there is a very sketchy timeline at the beginning of the book, there are still occasions for confusion. Because of the difference between the lengths of the lunar and solar years, a Muslim century is (if memory serves) only about 97 Gregorian years. The omission of the Christian calendar, however necessary because of the book’s premise, makes the world history the book seeks to describe almost inconceivable.
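The author's half-remembered figure checks out. A minimal sketch of the conversion, using the standard astronomical averages for the mean Islamic (lunar) year and the mean Gregorian year:

```python
# Compare a span of Islamic calendar years to the equivalent span of
# Gregorian years. Both constants are standard mean-year lengths in days.
LUNAR_YEAR_DAYS = 354.367       # mean hijri year: 12 synodic months
GREGORIAN_YEAR_DAYS = 365.2425  # mean Gregorian year

def hijri_to_gregorian_years(hijri_years: float) -> float:
    """Convert a span measured in Islamic years to Gregorian years."""
    return hijri_years * LUNAR_YEAR_DAYS / GREGORIAN_YEAR_DAYS

print(round(hijri_to_gregorian_years(100), 1))  # -> 97.0
```

So a Muslim century really is about 97 Gregorian years, and over the book's seven centuries the two reckonings drift apart by roughly two decades, which compounds the reader's disorientation.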

There is a sense in which Columbus, and Newton, and Florence Nightingale were world-historical figures, but if we are to discuss them as a group, we must start with the fact they were all products, indeed characteristic products, of Western civilization. The line of development that led from one to the other (or from the social milieu that produced one to the social milieu that produced the other) was a process within Western civilization. There had, perhaps, been figures parallel to these great names during the pasts of other civilizations, but the parallels were not chronologically simultaneous.

This does not mean that there is no such thing as world history. Another of Toynbee’s notions is helpful: the idea that civilizations appear in generations. The most ancient civilizations, those of the river valleys, were local affairs, however widely their influence spread. The “classical” civilizations of the next generation, of Rome and the Han and the Gupta, were regional. The third generation includes the Islamic cultures, post-Tang China, and the West after the Dark Age, as well as other societies, notably Japan and Hindu India. What Islam, the West, and China have in common is that they are all, in principle, universal. During their great ages, Islam and China both reached just shy of global influence before consolidating their activities to certain broad regions. The West finally did achieve global scale, in the 15th century, and so created the possibility of a genuinely ecumenical society.

This is the gospel according to Toynbee, and you can take it or leave it; as we have noted, “The Years of Rice and Salt” includes a quite sophisticated discussion of metahistory. Nonetheless, the incontestable fact is that, whatever malign influence you might want to ascribe to European imperialism in the 19th and 20th centuries, the other great civilizations during early modern times were simply not efflorescent in the way the West was. Without too much speculation, we can make a good estimate of the course of the world’s major civilizations in the absence of the West.

China was winding down from its Song climax; the Ming and Qing Dynasties would have followed much the same course with or without Western influence. The result would have been another minor dark age in the 20th century, as after the Latter Han in antiquity. Similarly, the Ottoman Empire, the greatest of Islamic states, was losing control of North Africa and the hinterland of the Middle East before the Europeans ever became a factor. The empire would probably have unraveled in pretty much the way it did in our timeline, perhaps with the exception that the caliphate might have survived as a venerable anachronism. As for India, it is a commonplace that the English stepped into a vacuum left by the decline of the Mughal Empire. Doubtless other forces would have stepped in if the English had not been available, but there is no particular reason to suppose that the new situation would have been discontinuous with earlier Indian history.

There would still have been dynamic societies in the world, of course. Japan’s social evolution has its own internal logic; Western contact in the mid-19th century was an opportunity that Japanese elites chose to exploit. During the same period, Burma was literate, mechanically ingenious, and of an imperial turn of mind; only annexation by the British Empire prevented what might have been a new Buddhist civilization from forming. Anything at all might have happened in the Americas, but for the time being, it would have been of only local significance. The “classical” generation of American civilizations would still have been in the future.

On the whole, Earth by the middle of the 20th century might have seemed like a planet with a great future behind it. However, there have been general breakdowns of civilization before, notably at the end of the Bronze Age. Even in the barbarous early Iron Age that followed, however, techniques and ideas spread from land to land. Similarly, in the third millennium, it would have been just a matter of time before one or more societies wove the new ideas into a civilization with universal potential.

That history would have taken another 500 to 1000 years to reach the state of things that we see from the college in the land that is not California. A book about it would have to be very good indeed to compare to “The Years of Rice and Salt.”

Copyright © 2005 by John J. Reilly



The Long View: The Rosicrucian Enlightenment



The Rosicrucian Order is the kind of thing I couldn't get enough of when I was a teenager and a young man, and now bores me to tears. Accordingly, I'm sympathetic to John's critical reading of what the history of such movements really means.

The Rosicrucian Enlightenment
By Frances Yates
Routledge, 2004
(First Published 1972)
333 Pages, US$14.95
ISBN 0-415-26769-2


Did companies of English actors once prowl the capitals of western Germany and Mitteleuropa like Cathar troubadours, providing entertainment to the masses, to be sure, but also heralding to the wise the impending arrival of an alchemical Millennium? And did the mind of modernity really spring from the Monas Hieroglyphica, John Dee’s dense and enigmatic little book, whose evangel Dee spread during his time in Europe in the 1580s, when he became a figure at the uncanny court of Holy Roman Emperor Rudolph II at Prague? To answer a flat “yes” to either of these questions would be to put the matter more crudely than it appears in this careful classic study of the Rosicrucian moment in European intellectual history. (And actually, in asking those questions, I put in the comparison to the troubadours myself, though the author does make much of the role of Elizabethan drama.) Conspiracy theorists cannot be greatly comforted by this book, since it deals in large part with a public if overambitious political project, though the book does touch on the origins of the Freemasons and the other not-particularly-secret societies that began to flourish in the 17th century. Be that as it may, the real theme of the work is that the origins of the culture of modern science are closely linked with millennialism and a form of neoplatonism.

The centerpiece of the story is the brief reign of the Winter King and Queen of Bohemia: Frederick V (1596-1632), Prince-Elector of the Palatinate, and Princess Elizabeth Stuart (1596-1662) of England. In 1619, the Kingdom of Bohemia rejected the Catholic Habsburg heir to the throne and chose the Protestant Frederick instead. His marriage in 1613 to Elizabeth, the daughter of James I of England, had been seen as a great strengthening of the Protestant cause in Europe. The offer of the Bohemian crown raised the possibility of a league of Evangelical princes that would break the hegemony of Habsburg and Spanish power. The supporters of the Bohemian project, as manifested in the literature of those years that purported to issue from an ancient but theretofore secret order of the “Rosy Cross,” also seem to have hoped that Frederick’s move from Heidelberg to Prague, the old capital of Rudolph II (1552-1612), might be the preliminary to Frederick’s ascension to the throne of the Holy Roman Empire (an elected position, remember: Frederick was one of the electors). Thus the connection between that ancient federation and the Church of Rome would be broken, and the empire would become the instrument, in the words of the Rosicrucian literature, of “a general reformation of the whole wide world.”

As it happened, few enterprises have ever turned out quite so badly. There had been a long truce in the wars of religion in the decades before the Bohemians chose Frederick. In those days, every state in Germany and Middle Europe seems to have been ruled by Ludwig the Mad, and the recluse Rudolph with his hermetic studies and keen interest in alchemy is usually denounced as the most frivolous of all. In retrospect, though, his absent-minded tolerance was probably the best course. His Habsburg successor (after the brief reign of his brother) refused to accept the loss of Bohemia to the Protestant cause (though it was largely a Protestant country). The Thirty Years War began with the Battle of the White Mountain in 1620, when Frederick and Elizabeth were driven from Prague. Frederick simultaneously lost the Palatinate to invading Catholic armies. They then established a long-running but penurious court at the Hague.

The key documents of the Rosicrucian Furore, as it was called in Germany, were the pseudonymously published pamphlets, the Fama (to use the abbreviated title), which appeared in printed form in 1614, and the Confessio, which appeared the following year. To some extent, they were just partisan literature extolling the future of Frederick and his House. However, as we have seen, they also announced the existence of a secret society, an “Invisible College” of long-lived persons founded by one Christian Rosenkreuz, who was said to have acquired his knowledge in the East. This society promised to inaugurate an era of universal enlightenment in the very near future. The recovery of ancient wisdom was to be the foundation of this new reformation, but a crucial feature of it was to be the perfection of natural knowledge gained by experiment and by the consultation of scholars.

These documents and associated publications included the numerological reworking of ancient prophecies to prove that a great change was imminent. The model of history they proposed was not so different from the postmillennialism familiar from later centuries, which holds that the Millennium will be established on Earth by human effort before the Second Coming; it also resembles the Age of the Holy Spirit forecast by Joachim of Fiore, to which the author of this book tells us the literature actually alluded.

The Christian Rosenkreuz after whom the Rosicrucian fashion was named was not so much a myth as a joke, an imaginary monk who was said to have studied in Damascus and Fez. The spirit of the anonymous literature is captured in the work of a “Rosicrucian” whose name we do know, Johann Valentin Andreae. His allegory, written in German but apparently under the influence of English drama, is known in English as The Chymical Wedding of Christian Rosenkreutz. The work depicts a royal wedding spread over seven days, whose events track in some ways the alchemical process understood as a spiritual exercise. The Wedding ends with the sending out of “missionaries” to spread the new science. It is reasonably clear that this did not describe an actual missionary enterprise, but the spread of a new historical optimism based on the hope for a new synthesis of knowledge.

Persons less astute than Andreae took the Rosicrucian brotherhood literally. In Germany, until the defeat of Frederick V in 1620 burst the bubble of irrational enthusiasms, there was a flood of literature by people defending and attacking the brotherhood; many sought admittance to it. The echo in France of the German furore was a sort of witch hunt, occasioned by the appearance in 1623 of posters in Paris announcing the arrival of the invisible Rosicrucians. “Invisibility” was sometimes taken to mean not just “clandestine,” but unable to be seen. Young René Descartes, who had actually fought at the Battle of the White Mountain on the Catholic side, was rumored to be a Rosicrucian until his return to Paris proved him to be visible after all.

What was this new science that the Rosicrucian literature claimed to be about to transform the world? It was Renaissance Hermeticism, heavily focused on mathematics but with a keen interest in the mechanical arts developed by the engineers of antiquity. Frederick’s gardens at Heidelberg, for instance, were famous for their automata and other mechanical marvels. “Hermetic” in this context usually meant the theosophy of “Hermes Trismegistus,” who was purported to be a philosopher of ancient Egypt whom the Renaissance had identified with Moses, though in fact the writings ascribed to him date from the Greco-Roman period. Like the earlier Renaissance, it included a systematic interest in alchemy, but now in the “new” alchemy of Paracelsus (1493-1541), with its heavy focus on medicine and the philosophy of the parallel nature of the macrocosm and microcosm. The novel feature was the Cabala. This, too, had been an element of Renaissance thought since at least the 15th century, but the Rosicrucian Cabala was the new, Lurianic Cabala that developed in the Levant after the expulsion of the Jews from Spain. It was not just Messianic, it was “reformist,” looking to the reconstruction of a damaged world.

The English element clarifies the story greatly. The order of the Rosy Cross itself, for instance, is plausibly explained as an allegorical reworking of the rose and cross among the symbols of the English Order of the Garter, which James I bestowed on his new son-in-law Frederick, and which had recently been given to the prince of Württemberg. Most significant of all, however, was the role of John Dee (1527-1609).

Dee was a serious mathematician and a notable statesman. He is sometimes credited, perhaps with a measure of exaggeration, with founding the British Secret Service. He was interested in the natural world as such; and to use Francis Bacon’s later phrase, he hoped to use natural knowledge for the relief of man’s estate. He was also quite chatty with angels.

In his Monas Hieroglyphica, Dee tried to unite all these themes in a synthesis whose ambitions are at least as great as, say, Thomism, or the search for a Theory of Everything. Empirical science was an element of what Dee sought to promote, but as a component of a grander structure whose focus was elsewhere. As we read in this history:

To return to the general analysis of the Rosicrucian outlook: magic was a dominating factor, working as a mathematics-mechanics in the lower world, as celestial mechanics in the celestial world, and as angelic conjuration in the supercelestial world. One cannot leave out the angels in this world view, however much it may have been advancing towards the scientific revolution. The religious outlook is bound up with the idea that penetration has been made into higher angelic spheres in which all religions were seen as one; it is the angels who are believed to illuminate man’s intellectual activities.

Readers will note how these ideas reflect the doctrine of Perennialism and anticipate later speculation about the transcendental unity of religions. In connection with ecumenism, this reviewer notes that Dee’s Hermetic progressivism seems to have been an element of what Paul Johnson later called the Third Force: Johnson’s treatment of the topic in his History of Christianity (1976) is largely a paraphrase of The Rosicrucian Enlightenment. According to Johnson, this Third Force operated in both the Catholic and Protestant regions of Europe before the Thirty Years War to mitigate the friction between Catholic and Protestant, and between the different denominations in the Protestant camp, with a view to eventual reconciliation. The author of this book does note the existence of associations during that period whose members were systematically indifferent to confessional affiliation. They might claim to belong to any church, while adhering to their own version of slightly esoteric Christianity.

Sometimes the esotericism of these years overbalanced the Christianity. That seems to have been the case with Giordano Bruno (1548-1600), yet another familiar of the court of Rudolph II. Unlike the pious Evangelical Dee, Bruno espoused turning to the “Egyptian Religion,” by which he meant the new synthesis of Hermeticism and alchemy.

As for Dee himself, his version of what we must call “Rosicrucianism” (though that is not necessarily a term he would have heard himself) certainly had a political dimension. In addition to his still somewhat murky adventures in Prague, during his stay in Europe in the 1580s he seems to have been attempting to forge some such link between England and the Palatinate as the marriage of Frederick and Elizabeth later achieved. This policy was not necessarily anti-Catholic. Dee’s own Anglicanism had not quite gelled yet as a Protestant confession, for one thing. Dee could still talk to the Emperor Rudolph without a strong sense that each was a member of a different confession. The strongest insistence that Catholic and Protestant choose sides, this book suggests, came from the Society of Jesus. Officially recognized as an order in 1540, the Jesuits required some decades to become the ubiquitous, and allegedly omniscient, face of Counter-Reformation Catholicism. Indeed, maybe the phantasm of the Order of the Rosy Cross was intended as an image of the Jesuits as they should have been. (We may note that rumors were not lacking that the Rosicrucians actually were the Jesuits, presenting themselves in other guise.) In any case, by the time of the marriage of Frederick and Elizabeth, the Rosicrucian movement had become less generically reformist and more specifically anti-Catholic, or at least anti-Jesuit. At the same time, its focus on the improvement of the secular world had become more emphatic:

The Rosicrucian manifesto may now take a somewhat wider meaning. It calls for a general reformation because the other reformations have failed. The Protestant Reformation is losing strength and is divided. The Catholic Counter Reformation has taken a wrong turning. A new general reformation of the whole wide world is called for, and this third reformation is to find its strength in Evangelical Christianity with its emphasis on brotherly love, in the esoteric Hermetic-Cabalist tradition, and in an accompanying turning towards the works of God in nature in a scientific spirit of exploration, using science or magic, magical science or scientific magic, in the service of man.

The actual outcome of Frederick’s Bohemian adventure was sufficiently appalling to occasion what Richard Landes of Boston University has called “millennial disappointment,” which is what happens when the perfection of the world is promised but does not arrive. This was a key theme in Endless Things, the last novel of John Crowley’s Ægypt series. The series is based on an analogy of the Rosicrucian Enlightenment to the Consciousness Revolution of the 1960s; it tells tales from both eras in parallel. However, as The Rosicrucian Enlightenment reminds us, it is possible to argue that the Rosicrucian Millennium did arrive, though not quite in the manner expected by John Dee or Frederick V.

The later story of the Rosicrucians links in obscure ways to the other obscure beginnings of the 17th century. It had something to do with the beginning of the Freemasons (of the real Freemasons, as distinct from the bogus lineage that runs from the Temple of Solomon through the Templars). It also had something to do with the foundation of the Royal Society in 1660. That august institution is, perhaps, the Invisible College made visible, however much its founders sought to distance themselves publicly from all the occult sciences, and especially from any taint of association with John Dee. The important link here is Francis Bacon (1561-1626), the statesman and philosopher who is sometimes credited, not altogether accurately, with the discovery of the scientific method.

Certainly some of Bacon’s ideas were diametrically opposed to those that we have been considering. He had no interest in secret societies or invisible colleges; he was keen, rather, to promote the exchange of scholars and discoveries among the visible colleges of Europe. Though he, too, urged the development of the sciences, mathematics does not seem to have been on his list of disciplines that needed perfection. Mathematics seemed to him to be too close to conjuration. (Some of his contemporaries thought the same, and supported the development of mathematics for just that reason.) No doubt his lack of interest in the subject was related to his rejection of the Copernican model of the solar system. Be this all as it may, though, to read Bacon’s New Atlantis is to be confronted with a Rosicrucian utopia, down to the rosy crosses on the turbans of the Christian priest-scholars who benevolently manage the great temple and research institute on the hidden continent with which the story deals. These scholars dispatch secret observers to the rest of the world, to keep abreast of developments in every country. What more could a Rosicrucian ask for?

Bacon was certainly the spiritual founder to whom the first histories of the Royal Society looked back, though this book reminds us that the Society had a pre-history at Oxford before its official founding in London, a prehistory when its membership may have felt less need to be intellectually respectable. Be that as it may, the precautions of the founders to disassociate themselves from subjects that Bacon would have considered questionable were in vain. In its second generation, the world reputation of the society was made by the mathematical attainments of Isaac Newton, who was also an alchemist and a millenarian, though he was discreet about those interests. Unlike John Dee, he did not talk to angels, or at least not that we know of.

This book emphasizes the reciprocal Rosicrucian influences that went back and forth between England and the continent, particularly in the form of refugees from Germany and Bohemia. It may have been that the foundation of the Freemasons was “blowback” (a term the book does not use) from Dee’s sojourn in Europe. The most interesting political figure in this story is Elizabeth, the Winter Queen of Bohemia. She maintained her court-in-exile after the death of her beloved but not particularly useful husband. During the English Civil War and the Protectorate, she contrived to stay on good terms with both Roundheads and Royalists. Meanwhile, intellectuals and persons of a mystical bent flocked to the Netherlands to be near her. (Descartes was devoted to her daughter.) If she had seriously hoped to become empress of a magically transformed Europe, then no doubt she was disappointed. Still, she did live to see her son restored as ruler of at least part of the Palatinate. Much later, her grandson became George I of England.

Copyright © 2009 by John J. Reilly

Why post old articles?

Who was John J. Reilly?

All of John's posts here

An archive of John's site

Linkfest 2017-09-09

Posting has been light of late; my home PC needs a new power supply. The replacement should be here by Tuesday.

When Correlation Is Not Causation, But Something Much More Screwy

UCLA sociologist Gabriel Rossman explains how easy it is to fool yourself with the way you collect your data.

Toyota’s Research Institute head says full autonomous driving is “not even close”

I'm a bit of a skeptic about how easy it really is to completely automate driving.

The Tater Tot Is American Ingenuity at Its Finest

The Tater Tot was made out of french fry waste products.

Moving the Finish Line: The Goal Gradient Hypothesis

This is a fancy term for the idea that the closeness of a goal can influence our motivation. This is the idea Uber uses to get drivers to work longer, and how video games are made more addictive to play. Something that doesn't get discussed here is risk. For example, a big difference between the cited example of getting $12,000 at the end of the year as a bonus, or $1,000 at the end of the month, is that bonuses are dependent on financial performance. In the real world, you might get more money from the monthly option, which chops up the risk of the company not making enough money into smaller bits.

A Simple Design Flaw Makes It Astoundingly Easy To Hack Siri And Alexa

I imagine it was easier not to take frequency into account when designing these apps. This seems easy to fix, in principle.

Voynich manuscript: the solution

This turned out to be a thick problem. You needed a lot of the right knowledge in the right head to solve it.

My shelf [and a half] of Jerry Pournelle books



Jerry Pournelle, one of my all-time favorite authors, died yesterday. I followed Jerry's website and writings for 16 or 17 years. Jerry was an early adopter of the Patreon method of earning a living, as he was an early adopter of so many things. I supported him for the last eight years or so. Jerry outlasted a stroke and brain cancer, and while those slowed him down a lot, he was actively writing and blogging until the end. 

Jerry led a long and interesting life. I would have loved to read his memoirs, which he never got around to writing. Hopefully someone else will fill the gap.


The Long View: Strange Angel: The Otherworldly Life of Rocket Scientist John Whiteside Parsons

I think Tim Powers should write a book about Parsons. A native Angeleno, with ties to both rocketry and the occult, who died young under mysterious circumstances. This would probably write itself.

Strange Angel:
The Otherworldly Life of Rocket Scientist John Whiteside Parsons
By George Pendle
Harcourt, 2005
US$25.00, 350 Pages
ISBN 0-15-100997-X


The death of John Whiteside Parsons (1914-1952) sketched the major themes of his life with tabloid strokes. Within a few days of the fatal explosion in his laboratory in Pasadena, California, the public was reminded of his history as a founder of the science of rocketry and as a forensic explosives expert, while being introduced to his parallel career as a black magician and former head of the Agapé Lodge of the Ordo Templi Orientis (OTO), the local outpost of Aleister Crowley's Thelemite religion. The rumors that the explosion was really a spectacular suicide were probably false; his elderly mother did kill herself within a few hours of learning of the accident, however.

And that was only the half of it. Jack Parsons seems to have known, and sometimes rented rooms to, most of the great names of the Golden Age of science fiction. His circle included Ray Bradbury, Robert Heinlein, L. Sprague de Camp, and many others. L. Ron Hubbard was of their number: Parsons not only performed apocalyptic ritual magic with him, but was fleeced out of more than $20,000 by him. Moreover, Parsons moved at the fringes of the Communist underground of scientists and technicians; he attended Leftist salons with Robert Oppenheimer's brother, a known Party member, which was among the reasons for Parsons' many awkward interviews with the FBI in later years. In retrospect, however, we see that politics was the least of his subversions; he was actually trying to inaugurate a new age through magic. Most dreadful of all, there was this: Jack Parsons wrote macabre poetry that makes Lovecraft's look good.

George Pendle is chiefly a science writer. He focuses on Parsons' work on rocket fuels, which is as it should be. However, by no means does he neglect Parsons' Californian milieu of science, magic, and popular literature, or the question of how Parsons' occult interests meshed with his scientific ones. Parsons was the black sheep of the story of rocketry, scarcely mentioned for many years by the newly respectable rocket division at Caltech, or by the Aerojet Corporation, or at the Jet Propulsion Laboratory, all of which he helped found. This sober biography helps to correct the record. Parsons also has an apparently growing influence on popular culture, for which this book provides essential background.

Parsons was born in Los Angeles, to a wealthy but broken family; his philandering father was forced to return to his native New England. (Father and son were both named “Marvel” at birth, by the way, a technicality that both forgot as quickly as possible.) He was raised by his mother and maternal grandparents in Pasadena, where Parsons would spend the most important periods of his life. The Great Depression hit his family hard. He did finish high school, but he would never hold a higher degree. He built his long-standing interest in explosives and fireworks into a career in explosives manufacturing, an industry where he always seemed able to find work when other opportunities were scarce.

Parsons and his friend, Edward Forman, had experimented with rockets since childhood. They did not simply like fireworks; they dreamed of manned space flight, a concept made real to them by the budding genre of science fiction. This notion then found little favor in the academic establishment. Robert Goddard, the inventor of the liquid fuel rocket, found that out the hard way in 1919, when he was humiliated by the press after he alluded to the possibility of sending a rocket to the moon.

(Goddard plays oddly little part in this story, however. After his public disgrace, he retreated like the mad scientist of science fiction to the desert, to pursue his cautious and secretive researches on a foundation grant. The desert he retreated to was Roswell, New Mexico. You can't make this stuff up.)

Undaunted by respectability, Forman and Parsons visited the California Institute of Technology (Caltech), conveniently located in Pasadena, with a proposal for a program of modest but systematic rocket research. They did find a graduate student, Frank Malina, who was able and willing to provide the mathematical treatment their empirical research lacked. They also found a senior faculty patron in Theodore von Kármán, who allowed them the off-hours use of the laboratories and other facilities, but no funding. This was the beginning of the Suicide Squad, a growing group of rocket enthusiasts whose numerous explosions, planned and otherwise, irked the university hierarchy but delighted the students and local press. Before long, they took the more pyrotechnic aspect of their work to the desert, where it became the basis of the American aerospace industry.

One of the merits of Strange Angel is that it illustrates how an international network of nerds was quite capable of instigating a technological revolution long before the invention of the Internet. The world's chief rocket societies were located in Britain, Germany, and the United States. They were part of a network that included the new subculture of science fiction, which communicated through the old pulp magazines. The most technically precocious domain of this network was in Germany. The Germans had to drop out of the network quite early in the 1930s, when the military became serious about funding their research. Nonetheless, Wernher von Braun kept up his subscription to Astounding Science Fiction even during the war years, through a maildrop in neutral Sweden.

The role of military subsidy in the development of rocket research in the United States was not so different, except that the Suicide Squad and their supporters could not attract the military's attention until the eve of World War II, and even then they had to employ euphemisms in their proposals: the Jet Propulsion Laboratory in Pasadena has that name because the term “jet” evoked less official disdain than “rocket.”

Parsons ensured the success of the project that made the military take rockets seriously. What they wanted was a way to assist the takeoff of heavily laden aircraft from confined spaces, such as bombers taking off from aircraft carriers. The obvious answer was some sort of auxiliary rocket, but there were problems. Liquid-fuel rockets were too complicated and required hard-to-store fuels. Solid-fuel rockets gave a thrust that was of short duration, and also notoriously uneven. Parsons solved the problem by making the biggest innovation since solid-fuel rockets were invented: he replaced black powder with a homogeneous solid fuel that consisted, in essence, of asphalt plus an oxidizer. In later years, the biggest boosters would use liquid fuel, but Parsons' solid-fuel invention remains the basis of most practical applications of rocketry.

Parsons himself was a handsome and even fashionable young man with wide cultural interests. In the mid-1930s, when he was just an explosives engineer with an odd hobby at Caltech, he acquired an even odder hobby when he began to attend celebrations of the Gnostic Mass at the Agapé Lodge of the OTO. His attendance was not in itself all that remarkable. Academics, members of the film industry, and bohemians of various sorts visited the Lodge. The meetings, as one member later recalled, were “like a Fellini movie.” Parsons, however, progressed from an appreciation of the meetings as occasions for fancy dress to a systematic interest in the works of Aleister Crowley and the Church of Thelema. He and Helen, his first wife, joined the Lodge.

Parsons became the apple of Aleister Crowley's eye, not least because Parsons was doing well financially with his war work; while Crowley, living in England with his personal demons and an expensive heroin habit, desperately needed the money. And in fact Parsons was good to the Lodge, if not necessarily good for it. He transferred its headquarters to the mansion in Pasadena that had been built by Arthur Fleming, the lumber magnate who had, not altogether coincidentally, provided the funding to establish Caltech. In addition to providing a venue for the performance of ritual magic, the mansion became a boarding house for artists and writers. The life of the house anticipated the communes of the 1960s and '70s, except that it included a full range of generations, from children to the elderly. We see it reflected, perhaps, in the plural families described in the novels of Robert Heinlein, such as The Moon is a Harsh Mistress and Stranger in a Strange Land.

Parsons' own marriage disintegrated in the warm bath of sexual openness. The Lodge leader, Wilfred Smith, left with Parsons' wife when Smith finally accepted Crowley's decision that Parsons should replace him as head of the Lodge. (Frustrated by Smith's reluctance to leave, Crowley wrote a scripture that revealed Smith was actually a god, and needed to go on a spiritual retreat to discover which god he was; human resource managers should take note of this procedure when they need to terminate a senior executive.) Parsons himself had taken up with his wife's indictably younger half-sister, who would eventually run off with L. Ron Hubbard, along with most of Parsons' money.

The Church of Scientology acknowledges the presence of Hubbard at the Lodge. They say that he was assigned by naval intelligence to investigate a black-magic cult, which he dispersed, rescuing a girl in the process. Certainly Hubbard was not merely a passive assistant in the operation that Parsons conducted in 1946, which he believed to be of world-historical significance. Pendle describes it thus:

The “magical working” he now began was his most ambitious yet. He believed he could incarnate an actual goddess on earth, a female messiah named Babalon. The goddess Babalon (its spelling “corrected” by Crowley from Babylon to provide a more auspicious cabbalistic number) first appears as a character in literature in the Revelation, where she is described as a scarlet woman riding on the back of the Great Beast. Parsons believed that his Babalon would also ride on the back of the Beast—Crowley—and augment Crowley's teachings. He hoped that this “Babalon Working” would resound his name through the ages like the name of a William Bolitho hero.

Bolitho was the author of Twelve against the Gods, a book about heroes who changed the world. The most immediate influence for Parsons' concept of the Working, however, seems to have been a story by one of his science-fiction-writer friends, Jack Williamson, entitled Darker Than You Think. This short story, later expanded to a novel, is actually about werewolves, and their campaign to reassert the dominance over mankind that they had enjoyed in prehistoric times.

This conjunction of ideas explains a few odd corners of popular culture. Female messiahs and Antichrist-like women have been appearing in print and on television for some years now: one thinks particularly of the Buffy series and its spinoffs. Since the Babalon Working is apparently not particularly obscure, it could be the origin of the notion. Note also that Williamson may have conceived the motif of lycanthropy as a genetic condition, not a spiritual one.

In any case, the Agapé Lodge did indeed break up not long after the Babalon Working, though perhaps less because of anything Hubbard did than because Parsons was starting to break up, too. There has never been a lack of scientists and engineers with a mystical streak; most histories of the invention of the atomic bomb note Robert Oppenheimer's claim that he recited a verse from the Bhagavad-Gita to himself at the first bomb test: “I am become death, the destroyer of worlds.” Parsons was less pretentious, but more alarming: during static tests of rocket engines, he would stamp his feet in the test pits as he recited Crowley's “Hymn to Pan.” He was no longer surrounded by his Animal House friends from the Suicide Squad, but by a new cadre of humorless engineers and industrial bureaucrats, who noted that this dangerous lunatic had also stopped taking baths and was obviously using amphetamines.

He first lost his security clearance because of his earlier political contacts. He got it back temporarily, but his later personal behavior (including negotiations to move to Israel that involved transferring information from his current employer, Hughes Aircraft), assured that he would never have anything to do with government-funded research again. He resigned from Agapé and sold the building to developers, who demolished it. He did retain a second wife, nicknamed “Candy,” whom he believed he had summoned through magic. In his final years, he took some steps toward founding his own religion, which he called “Witchcraft,” but it did not come to much. Pendle notes that, despite Crowley's efforts to build a spiritual empire, the only one in this story to launch a spiritual enterprise with popular appeal was L. Ron Hubbard.

Parsons continued to work as an explosives expert. His last job was as a special-effects man in the film industry. He probably killed himself accidentally while working on a rush order just before he planned to leave with his wife for a long stay in Mexico.

Parsons' life can be explained in part through the working of “stigmatized knowledge,” which Michael Barkun describes in A Culture of Conspiracy. Knowledge, and apparently also the desire for knowledge, that has been dismissed by society as a whole may yet survive in little subcultures. These intermingle, so that rejected ideas of all sorts come to seem more plausible just because they have been rejected.

Pendle tells us again and again that, until the fourth decade of the 20th century, rocketry was an underground enthusiasm, particularly as it applied to space flight. Those we now see as the fathers of the theory of space flight, Konstantin Tsiolkovsky and Hermann Oberth, attracted no academic interest. Robert Goddard and the Suicide Squad at Caltech continuously met with the confused idea, even among physicists, that rockets would not work in a vacuum.

Parsons and friends were not quite in the position of someone who believes that all civilization comes from Atlantis: the Suicide Squad could produce evidence, given enough work and detonations. Still, they could only conjecture that their grander objectives were possible, and there was no way at all to verify the desire for space flight. In the world where Parsons lived, his project was no more or less plausible than the Stalinism and Theosophy with which it coexisted. These ideas were of uneven quality, to say the least, but it is not at all surprising that they incubated in the same milieu, and even in the same minds.

Parsons ensured that space flight would shuffle toward Pasadena to be born, and that still seems the most likely of his enthusiasms to prosper in the long run. Still, as this is written in 2005, that assessment is a matter of faith: NASA seems to be doing for manned space flight what the Soviet Union did for socialism. In contrast, “Babalon” for many people is not just a misspelling anymore. Parsons was wise to cultivate science-fiction writers: the future really is full of surprises.

Copyright © 2005 by John J. Reilly
