The Long View 2006-07-19: The Road to Perdition

Model of the temple district of Tenochtitlan at the National Museum of Anthropology

By Thelmadatter - Own work, Public Domain, https://commons.wikimedia.org/w/index.php?curid=3744781

I recently read that Tenochtitlán, unlike almost all other pre-modern cities, was a population source instead of a population sink. This was due to a combination of the productivity of the floating farms and a lack of the nasty diseases typical of cultures that have been farming for longer periods of time.


The Road to Perdition

 

Is this news, or just tabloid mischief? We read in today's New York Post:

July 19, 2006 -- JERUSALEM - Hezbollah yesterday warned the United States: You're next on our hit list. The threat against U.S. interests came as the FBI revealed it is searching for Hezbollah terrorist agents operating on American soil.

I have no doubt the facts are true: Hezbollah is making threats and the FBI is looking for sleepers. On the other hand, there are no signs of an imminent terrorist attack in the US, unless you count President Ahmadinejad's remarks yesterday that Muslims will rejoice soon. Should an attack in the US occur, of course, it would remove the political and legal obstacles to US strikes against Iran and Syria. Iraq's connections with Al Qaeda were tenuous, elliptical, and contingent. Hezbollah, in contrast, is publicly subsidized and facilitated by Syria and Iran.

I am still inclined to think that all this unpleasantness will blow over: Hezbollah started it, not because it felt confident, but because it had to do something to stay relevant. Still, one can at least see a way now in which the situation could become awkward.

* * *

Meanwhile, this news from the Western Front: California: SACRAMENTO -- Gov. Arnold Schwarzenegger has proposed a powerful new centralized authority under his direct control that would be charged with implementing one of the nation's most far-reaching initiatives to curb global warming.

And we let ourselves be convinced that California would be his last territorial demand in North America.

* * *

Here's a disease I can relate to: prosopagnosia, or face blindness. Without the tireless efforts of the disability industry, I would not now have a term for the fact I am slightly worse than usual at remembering faces. It's certainly true that I tend to identify people by their hair. There was a time in the 1980s when I could not tell one English rock singer from another.

* * *

In a piece about the reintroduction of primitive violence to modern societies, Mark Steyn notes just how ghastly primitive life really is:

Lawrence Keeley calculates that 87 per cent of primitive societies were at war more than once per year, and some 65 per cent of them were fighting continuously. "Had the same casualty rate been suffered by the population of the twentieth century," writes Wade, "its war deaths would have totaled two billion people." Two billion! In other words, we're the aberration: after 50,000 years of continuous human slaughter, you, me, Bush, Cheney, Blair, Harper, Rummy, Condi, we're the nancy-boy peacenik crowd. "The common impression that primitive peoples, by comparison, were peaceful and their occasional fighting of no serious consequence is incorrect. Warfare between pre-state societies was incessant, merciless, and conducted with the general purpose, often achieved, of annihilating the opponent."

We sometimes hear that Late Neolithic man was healthier and better fed than early civilized man, so much so that a question has arisen about how civilization could have started at all. The answer seems to be that civilization is lots safer.

* * *

But is civilization demographically sustainable? Demographers of the early 20th century used to express surprise that pre-modern cities did not, for the most part, sustain themselves by local births, but relied on immigration to maintain their populations. Now those worrywarts at Brussels Journal remind us yet again that the same seems to be true of modern urban societies:

“Europe and Japan are now facing a population problem that is unprecedented in human history,” said Bill Butz, president of the Population Reference Bureau. Countries have lost people because of wars, disease and natural disasters but never because women stopped having enough children. Japan announced that its population had shrunk in 2005 for the first time, and that it was now the world’s most elderly nation.

The current situation is not absolutely unprecedented. The population of France fluctuated between 12 and 20 million in the late medieval and early modern eras. The same, or worse, was happening in the rest of Europe. The Ottoman Empire seems to have coincided with a period of population stagnation in its territories. War and disease sometimes depressed populations violently, but sometimes, for economic reasons, people did decline to have the large number of children it took to overcome the high infant mortality. The novelty, perhaps, is fertility levels that are inadequate even when infant mortality is close to zero.

Readers who need to worry about demographic collapse will enjoy the site of The Population Reference Bureau, and especially the PRB's country-by-country demographic profiles.
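The arithmetic behind John's point about fertility and infant mortality is worth making explicit. A common demographic rule of thumb is that replacement fertility is roughly 2.05 births per woman divided by the share of newborns who survive to adulthood. The sketch below is a back-of-the-envelope illustration of that rule; the survival figures are illustrative assumptions, not historical data.

```python
# Back-of-the-envelope: replacement fertility as a function of survival.
# Roughly 2.05 births per woman suffice when nearly everyone survives
# (covering the sex ratio at birth plus a little adult mortality).

def replacement_tfr(survival_to_adulthood: float) -> float:
    """Approximate total fertility rate needed for a stable population."""
    return 2.05 / survival_to_adulthood

# Illustrative survival shares, not historical data:
print(round(replacement_tfr(0.98), 2))  # modern: ~2.09 children per woman
print(round(replacement_tfr(0.60), 2))  # pre-modern: ~3.42 children per woman
```

With infant mortality near zero, anything much below about 2.1 children per woman shrinks the population, which is the novelty John identifies.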

* * *

And yes, Virginia, there are spelling reform blogs. Here is an example from MySpace. My own preferences for an upgraded orthography, though fluid, are otherwise.

Copyright © 2006 by John J. Reilly

Why post old articles?

Who was John J. Reilly?

All of John's posts here

An archive of John's site

The Long View 2005-08-23: The Perfection of the Species

The image in the header is the image John referenced in his joke about contributing to the state of perpetual surveillance. The man in the image is Herbert Kitchener, 1st Earl Kitchener, scourge of the Boers and one of the few generals who thought the Great War would be long.

I appreciate John's simple computation of the average tenure of Supreme Court justices in groups of ten. It is a simple thing now, as it was in 2005, to look up such information to double-check something like now-Chief Justice John Roberts's 34-year-old speculation that the framers of the Constitution hadn't anticipated how long people live now.

Justice Roberts made a common mistake, which is thinking that increasing average lifespans mean adults live 20 or 30 years longer than they used to. There is some increase for adults, but almost all of the change in the average was driven by changes in deaths under the age of 5.
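A toy computation makes the point concrete; the numbers below are invented for illustration, not historical data.

```python
# Toy numbers showing how child mortality dominates life expectancy at
# birth. Suppose 30% of people die at age 2 and the rest die at 62
# (invented values for illustration).
child_share, child_age_at_death = 0.30, 2
adult_share, adult_age_at_death = 0.70, 62

life_expectancy_at_birth = (child_share * child_age_at_death
                            + adult_share * adult_age_at_death)
print(life_expectancy_at_birth)  # 44.0

# "Life expectancy at birth" is 44, yet anyone who survives childhood
# lives to 62. Eliminating child mortality raises the average by 18
# years without any adult living a day longer.
```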

Something that struck me just now is that I've seen a lot of things on the subject of average human lifespans that assume that childhood mortality was as high in Classical times or earlier as it was in early modern Europe. However, we know that what we now call childhood diseases are mostly recent, arising largely within the last 2,000 years or so. The human disease burden has slowly been getting worse, which might mean that childhood was somewhat less dangerous before the arrival of measles and smallpox.


The Perfection of the Species

 

Supreme Court Nominee John Roberts had some thoughts many years ago about limiting the terms of federal judges, and was foolish enough to put them on paper:

The Constitution "adopted life tenure at a time when people simply did not live as long as they do now,'' Roberts wrote in an Oct. 3, 1983, memo to White House Counsel Fred Fielding that is now on file at the Ronald Reagan Presidential Library..."A judge insulated from the normal currents of life for 25 or 30 years was a rarity then but is becoming commonplace today,'' Roberts wrote. "Setting a term of, say, 15 years would ensure that federal judges would not lose all touch with reality through decades of ivory tower existence.''

Term limits for judges may or may not be a good idea, but I had my doubts about the premise of Roberts' critique. The great increases in life expectancy we have seen over the past two centuries chiefly relate to infant mortality; the older you get, the less dramatic the increases become. Certainly it is not the case that maximum human longevity is increasing. How does this relate to the Supreme Court?

On Wikipedia, I found a list of the justices of the Supreme Court of the United States in chronological order of appointment. Then I took the average of the terms of service of each group of ten. In the list of these averages set out below, the date is the end of the period in which each group of ten was appointed:

1796: 8.9 yrs
1811 (John Marshall appointed): 20.9 yrs
1823: 19.2 yrs
1845: 19 yrs
1864: 14.3 yrs
1903: 13 yrs
1921: 15.4 yrs
1939: 13.3 yrs
1953: 17.3 yrs
1970: 20.6 yrs
1994 (current): 20.6 yrs

The average tenure for the first ten justices was indeed short, but that had little to do with longevity. The Supreme Court was new and not very prestigious in the early days of the Republic. The justices tended to quit in order to move on to better things. It was only during the tenure of John Marshall as Chief Justice that the Court acquired an authority comparable to that of Congress and the President. There then followed a long period during which justices stayed on the court for about as long as they have since the beginning of the final quarter of the 20th century. The composition of the current Court is uniquely old, but again, that's not biology: the continuing Roe v. Wade controversy has blocked the normal turnover of the Court.

John Roberts was probably correct if he thought that the current, long tenure of Supreme Court justices is contrary to the expectation of the Founders, but not for the reason he cited. The Founders probably did not expect that justices, once appointed to the Court, would cling to their office for the rest of their lives.
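John's computation is easy to reproduce. Here is a minimal sketch, assuming only a list of terms of service in order of appointment; the sample numbers below are placeholders, not the actual data from Wikipedia.

```python
# Minimal sketch of the computation: average the terms of service in
# consecutive groups of ten, in order of appointment.

def group_averages(terms, group_size=10):
    """Average each consecutive group of `group_size` terms of service."""
    return [sum(terms[i:i + group_size]) / len(terms[i:i + group_size])
            for i in range(0, len(terms), group_size)]

# Placeholder terms of service in years, NOT the real data:
terms_of_service = [5, 9, 6, 20, 31, 4, 13, 10, 1, 34]
print(group_averages(terms_of_service))  # [13.3]
```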

* * *

Recently I saw Gattaca, a film released in 1997 about a near-future world (though not quite so near as our own, evidently) in which pre-natal genetic enhancements and genetic testing in general put people who are conceived naturally at a considerable disadvantage. The story is about one such Invalid (accent on the second syllable) who steals the genetic profile of a supernormal in order to qualify to pilot the first manned spaceship to Titan.

Gattaca has a reputation as an underappreciated minor film. I can only agree. It comes close to the ideal of science fiction played on a bare stage. The sets are subdued Modern; there are no special effects. As for the cast, no less a person than Gore Vidal has a bit part as Director Josef of the Gattaca organization. He even turns out to be the murderer, though the murder is a red herring. There were several real actors, too.

Since I saw this film, I have been trying to track down a quotation that I am almost sure comes from Tolkien. It runs something like this:

No, I have never much liked the idea of spaceflight. It seems to be promoted mostly by people who want to turn the whole world into a big train station, and then to establish similar stations on other planets.

The journey to Titan (which we do not see) is just a MacGuffin, like the statuette in The Maltese Falcon, but it leaves the film hollow, intentionally so. It is not at all clear why the impeccably dressed and immaculately clean personnel of Gattaca would want to do something as crudely industrial as explore another planet. As for the colonization of Titan, we must ask whether the universe really needs another planet covered with office parks and Ikea furniture. Indeed, does it really need any?

The character of the hero is defined by his determination to belie the projection for a mediocre future that his real genetic profile suggested, including a high probability of an early death from heart failure. Though fraud was necessary to allow him to compete for his ambitions, he fought against his fate chiefly through study and exercise. A friend of mine in high school received a similar prognosis. He became the first fitness fanatic I ever met. He died at 28.

* * *

Incidentally, Gattaca is available in Esperanto. So are 14 other films: look here.

* * *

Speaking of near-future paranoia, I have done my bit to bring about a world in which no public moment goes unrecorded; my condominium now has security cameras. To ensure that no one forgets this fact, I made this poster [BIE I put this in the header] to remind everyone to be good.

Speaking of graphics, the Latin Mass folks at Holy Rosary Church asked me to do a simple webpage for them. So, I did this[BIE link removed, since Holy Rosary Church isn't really the point here. A fine chapel though, as I verified]. The sound file of the Magnificat is surprisingly good, considering the microphone we were using; the church has wonderful acoustics.

That page is supposed to be uploaded to the parish website. No doubt it will be, eventually, but getting the authorization is harder than authorizing that expedition to Titan.

* * *

"Nothing Burger" is a good characterization of the whole embryonic stem-cell controversy. Even if omni-potent stem cells turn out to have clinical applications, it is hard to imagine a goofier way to get them than by harvesting them from embryos, cloned or otherwise. In any case, new techniques should soon return the subject to its deserved obscurity, as we see in The Washington Post:

Scientists for the first time have turned ordinary skin cells into what appear to be embryonic stem cells -- without having to use human eggs or make new human embryos in the process, as has always been required in the past, a Harvard research team announced yesterday.

So are we done with the subject? Not quite:

Because it involves the fusion of a stem cell and a person's ordinary skin cell, the process leads to the creation of a hybrid cell. While that cell has all the characteristics of a new embryonic stem cell, it contains the DNA of the person who donated the skin cell and also the DNA that was in the initial embryonic stem cell.

The Post notes this, however:

They do not mention that several teams, including ones in Illinois and Australia, have said in recent interviews that they are making progress removing stem cell DNA from such hybrid cells...Some even suspect that the new technique for making personalized stem cells would still work even if the "starter" stem cells' DNA were removed before those cells were fused to the skin cells.

Nonetheless, embryonic stem cells have become like ethanol fuels to some people: it's something they want the government to subsidize whether it does any good or not:

"I think we have to keep our eye on the ball here," [John Gearhart, a stem cell researcher at Johns Hopkins Medical Institutions] said. "If this stuff proves to work, that's wonderful. But we're just not there yet, and it's going to take a long time to demonstrate that. Meanwhile, other techniques already work well. So let's get on with it."

By all means; but the useful research has little to do with the public polemic.

Copyright © 2005 by John J. Reilly

Why post old articles?

Who was John J. Reilly?

All of John's posts here

An archive of John's site

LinkFest 2016-01-22

Source Criticism is not Credible

Pascal-Emmanuel Gobry is one of my recent follows on Twitter. Here, Gobry discusses one of the roots of the recent argument between Ross Douthat and members of the Catholic academy: source criticism. One of the most popular theories in the academy is that a hypothetical lost source document, known as Q for Quelle, was used to compose the Gospels. This theory was advanced in the 19th century to explain the similarities in the Gospels, which at the time were thought to have been composed in the second or third centuries. Present biblical scholarship estimates much earlier dates of composition, so the original reason for proposing the source theory is no longer pertinent. The theory, however, persists. Part of the reason seems to be that it allows for an essential flexibility in biblical exegesis. Unfortunately, source criticism doesn't seem useful or correct on its own terms, so the current popularity of the theory seems to be an exercise in special pleading.

Everything Aristotle Said is Wrong

This essay was mentioned in passing by Gobry when he was discussing Source Criticism. In part, this essay is about modern philosophy, but it is also a fascinating history of Great Books programs in the United States.

The decline of deaths from Coronary Heart Disease worldwide

I've seen this paper before, and this graph is the most dramatic image in it. Other papers on the same subject do not show quite as sharp a peak, and the starting and ending rates here seem too low as well, but it is pretty clear that CHD deaths in Western countries peaked in the 1970s and have steadily declined since. Compare this image from the NIH:

What is a lot less clear is exactly why. There have been many, many changes in treatments for CHD over this time period, along with big changes in diet and smoking rates. It is pretty hard to tease all of these apart, but looking at dietary cholesterol in particular, the amount of cholesterol people in Western countries eat has steadily increased over the period in question, which is part of the reason why dietary cholesterol is no longer seen as so critical to heart disease.

The Long View 2003-04-22: Pandemics

This bit about SARS is interesting twelve years later. The official mortality rate went up from the 5.6% John cites here to 9.6%, according to the WHO. Unfortunately, better record-keeping doesn't always equal better stats: that number is likely to be a massive overestimate, since only the sickest get counted in official tallies.
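A toy computation shows how that reporting bias works; the case counts below are invented, with the apparent rate set near the 5.6% figure John quotes.

```python
# Toy illustration of ascertainment bias in a case-fatality rate:
# if only the sickest cases are counted, the disease looks deadlier.
deaths = 56
severe_cases_detected = 1000   # cases sick enough to enter official tallies
mild_cases_missed = 1500       # invented: infections never counted

apparent_cfr = deaths / severe_cases_detected
true_cfr = deaths / (severe_cases_detected + mild_cases_missed)

print(f"apparent CFR: {apparent_cfr:.1%}")  # 5.6%
print(f"true CFR:     {true_cfr:.1%}")      # 2.2%
```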

Influenza, and other similar diseases that infect both animals and humans, is no joke. It is easy to dismiss, especially since many of the deaths are concentrated in the elderly. We haven't had anything nearly as bad as the 1918 Spanish Flu, but such a thing would probably be much, much worse in an age of frequent air travel. You haven't seen panic yet.

I also got a laugh from John's comment on The Stand. Of course, if resistance to the virus in that book were a gene, then it would run in families. King isn't really a sci-fi author, so perhaps that lapse is forgivable. One might postulate that the gene in question was a de novo mutation, of which everyone has about 100, but those get passed down too, changing frequency depending on their relative fitness. So, unless everyone involved got the same de novo mutation at birth around the same time, it would still run in families.

This post also features a mostly successful prediction: that private space companies would be capable of routine manned spaceflight within ten years. Private spaceflights are becoming routine, although manned flights are still a little less so.

Pandemics

As a sanguine soul, my first reaction to the advent of Severe Acute Respiratory Syndrome (SARS) was the observation that it's not as bad as the great influenza pandemic that occurred around the end of the First World War. Now I learn that the mortality rate for SARS is actually four times higher, though the absolute number of deaths from SARS is still infinitesimal in comparison to the tens of millions who died between 1917 and 1919. Even more disturbing is this headline from The New York Times: Death Rate from Virus More Than Doubles.

Normally, mortality rates for new infectious diseases fall fairly quickly. This is partly because treatments are developed, and partly because physicians learn to spot asymptomatic cases of the disease. The jump in the world-wide SARS mortality rate to 5.6% is almost certainly a statistical mirage, which will disappear when reporting improves. Even the idea of "world" statistics for SARS means little, considering the different ways the disease behaves in each country. On the other hand, it is possible that the virus is mutating quickly, and the changes in the statistics reflect real changes in the lethality of the disease.

We know that SARS has already had an appreciable effect on business in Asia. The travel industry in particular is in sackcloth and ashes. If the disease is not contained, or otherwise made manageable, SARS could also create a new issue for the US presidential election of 2004. Forty million people in the US have no health insurance. Many others, like me, have deductibles so high that they will not visit a doctor until they are at death's door. This kind of health system is inefficient even at the best of times. In conjunction with an epidemic disease that kills one out of 20 victims, it would be a template for a public health catastrophe.

The question of health insurance in the US has long been discussed in terms of esoteric notions of "portability" and "choice." The political system lost sight of the fact that the first function of any health system is to preserve public order by detecting and treating epidemic disease. You can't fight the Black Death with tax incentives.

* * *

Here is a very small pet peeve. Readers may be familiar with Stephen King's novel, The Stand. That is the one in which almost the entire population of the world is killed by an influenza virus; designed in a weapons lab, it mutates after a victim contracts it until it finally kills him. The only survivors were people with a certain rare gene, which granted immunity. The book dwells on sad scenes in which each of the rare survivors loses his family.

May I ask what Mr. King's editors thought they were about? If immunity were genetic, then it would be passed down in family lines. We learn late in the book that a single parent with the gene will provide enough immunity for their children to recover from the virus. Whole families should have lived through the plague. This anomaly has been bothering me for years.

* * *

Speaking of minor peeves that have been bothering me for years, a bunch of them met at the University of Chicago recently and declared that modern critical theory was a waste of time. We learn this from another Times article, The Latest Theory Is That Theory Doesn't Matter.

The panel discussion at which this declaration of intellectual bankruptcy occurred was organized by Critical Inquiry, a noted journal of theory. There were more than two dozen participants, including Henry Louis Gates Jr., Homi Bhabha, Stanley Fish, and Fredric Jameson. As the Times mildly observes, "the leftist politics with which literary theorists have traditionally been associated have taken a beating." People's political hopes are often disappointed, of course. The tragedy for this crew was that, when you took away the politics, there was nothing left.

It's the students you feel sorry for. At some point, they must have liked literature. They intuited that it was important; that was why they majored in it, or went on for graduate degrees. By and by, their healthy instincts were corrupted, while their prose became more unreadable and ideologically subservient. At the Critical Inquiry panel, however, they would have learned that they damned themselves to no purpose. As Stanley Fish told them: "I wish to deny the effectiveness of intellectual work. And especially, I always wish to counsel people against the decision to go into the academy because they hope to be effective beyond it."

"Effective" here does not mean helping others to become better people, or adding to knowledge for its own sake. One deluded student complained "how much more important the actions of Noam Chomsky are in the world than all the writings of critical theorists combined." Noam Chomsky has been shilling for concentration-camp states for 30 years. The impotence of scholarship for these people means their regret that they did not succeed in turning their own country into North Korea.

One panelist did try to defend the life of the mind, as he saw it: "intellectual work has its place and its uses...[y]ou can have poems that are intimately linked with political oppositional movements, poems that actually draw together people in acts of resistance." The notion that you can also have poems that are good as poems, that civilization exists in part so that there can be poetry, is completely absent. So, of course, is any value in literature aside from its use as political propaganda. The panelists' problem is that now even they cannot deny that it fails at that, too.

Critical theory has sacked the liberal arts. The theorists, in their folly, have driven away the funding and the graduate students from the departments they came to dominate. No doubt, after the panelists die or retire, literary studies will recover. The next time, maybe, they will be about literature.

* * *

On the subject of next times, I often correspond with people about the future and historical significance of manned spaceflight. It is easy to be unfair about NASA (as perhaps I have been myself), but it is pretty clear that the era of manned flight that began in the 1960s was a false dawn. In some sense, we have to begin again.

"Why," you ask? Because it's there. As C.S. Lewis once observed about the question of life on other planets, this is a matter that people are either passionately interested in or find too repulsive to discuss. "Passionate" may be too strong a word to describe my interest in spaceflight, but certainly I support it. I am therefore greatly encouraged by headlines like this: Passenger-Carrying Spaceship Makes Desert Debut.

The spaceship in this case is the work of the ingenious Burt Rutan. The flight he hopes to make in the near future will be suborbital, but he does claim to have a full, reusable launching system, capable of reaching LEO. This is not a prototype, he emphasizes: this is hardware. He says that manned flight could be routine within the next ten years.

I have been hearing that since I was eight years old. The difference now seems to be a convergence of private investment and the slow accumulation of off-the-shelf technology. This time, maybe there will be an industrial-technological breakthrough. Manned spaceflight may yet be The Next Big Thing. I would much prefer that to nano-technology, which I dislike almost as much as wireless.

* * *

Even if our timeline does begin to overlap that of Heinlein's The Man Who Sold the Moon, we could also be threatened by vampires in the streets, many of them tourists.

Copyright © 2003 by John J. Reilly

Why post old articles?

Who was John J. Reilly?

All of John's posts here

An archive of John's site

The Stand
By Stephen King

The Long View 2003-04-03: Consider the Alternatives

Here is an interesting one on the Euro and Germany. John felt that Germany couldn't pursue a sane fiscal policy because of the Euro. Twelve years later, this turned out to be very true, but not precisely for the reasons John thought. He did get the overall dynamic right, however: the big economies, like Germany, are tied down by monetary union, while the smaller ones, the PIIGS, are overstimulated and prone to meltdown. Well, we proved that one right. John was also correct that no one dared question the idea of making Germany and Greece in some sense economic equals, at least until everything blew up.

A prediction that did not go so well is that China's financial system would also have blown up by now. There has been a market crash over there recently, but the kind of thing Gordon Chang has been going on about for fifteen years keeps not happening. John once pointed out that things like the unusual financial growth China has been experiencing of late tend not to go on forever. To date, the Chinese have been giving it their best shot.

I do think John was right to fear another nasty influenza epidemic. SARS looked pretty bad for a while, but it fizzled in comparison to the Spanish flu. Another pandemic of that magnitude would do very bad things to a world with daily international flights. We have just been lucky.

Consider the Alternatives

Here's an interesting point that Kermit L. Schoenholtz, the chief economist of Salomon Smith Barney, made about the decline in foreign investment in the US:

"If people believe that the events we've seen in Iraq are not one-off events, it will affect their investments."

The New York Times piece in which this appears deals chiefly with the failure of the gradual decline in the value of the dollar to spur exports. Currency fluctuations are temporary, but we could have a long-term problem. It is likely that the Iraq campaign will not be the last of the 911 Wars (though it may be the largest: North Korea could turn out to be surprisingly brittle). Will people decline to invest their money in the US if the country is conducting a string of military campaigns?

* * *

Well, as the ancient comedian George Burns used to say when people asked if he minded getting old: "Not when I consider the alternatives." There is a good article in the Spring issue of The National Interest by Adam Posen: "Germany's Path to Economic Perdition," which covers not only the German economy, but also Japan's. Most important, there is a critique of the euro system.

We won't dwell on Japan's problems. Posen endorses the familiar assessment that it's an institutionally "blocked" society that can't summon the political will to pump the bad debt out of its submerged financial system. He says that Germany has not quite reached the same point, but it is in danger of a deflationary spiral. The reasons are different from Japan's. Germany's markets are freer, and the economy on the whole is more dynamic. The problem is that the euro system prevents Germany from adopting a sane fiscal policy. Today we are in the sort of period in which a country with control over its own currency would run large deficits and reduce taxes. Germany, however, is biting the bullet by keeping its deficits within the range prescribed by the European Central Bank in Frankfurt.

The charter, not just the policy, of the Bank requires a deflationary bias. This might not have been such a bad idea, if Europe had consisted of many small, roughly equal economic units. Unfortunately, it consists of highly unequal ones. The pattern seems to be that the smaller economies in the euro system are overstimulated (hence the Irish "Celtic Tiger," now gone a bit mangy), while the larger ones are depressed. No one has dared address the fact that this is an inherent feature of the system.

The French deal with the fiscal prescriptions of Frankfurt by ignoring them. When the Germans start to do that, there won't be much of a system left.

* * *

Then there's Asia. We have already noted the problems of the Japanese financial system. The Chinese have a similar situation, but exacerbated by an order of magnitude by the remnants of a command economy. Still, the economy posts large nominal gains, so people you would think would know better continue to pour money into the country.

At any rate, they did so until recently. Now the SARS pandemic has come along, which is going to make people reluctant to travel to southern China, or indeed to receive people from that region as guests.

As any epidemiologist can tell you, SARS is not a big deal as pandemics go. (I knew an old man who had been a teenager at the time of the influenza epidemic that occurred toward the end of the First World War. There were seven people in his immediate family before the epidemic; just he and his father survived it.) The Chinese problem is that their investment-based economy is still more prospect than actual return. It is not quite a Ponzi scheme, but it is vulnerable to an interruption in capital investment. It is conceivable that SARS could occasion the bursting of the bubble that Gordon Chang writes about.

* * *

Contrary to some expectations, the United Nations did not turn into a pumpkin when the Iraq War began; neither did NATO turn into six white mice. However, even though the UN is going to survive, it will be hard to take it altogether seriously hereafter. Ideas are surfacing for a supplementary organization that could be trusted with serious security issues, but which would be more than the telephone numbers on the American president's speed-dial.

Anyone in immediate need of a proposal might take a look at Adam Garfinkle's Democratic Union. He actually proposed this in the early 1990s, but to small effect. It is time to work out the details, I think.

Why post old articles?

Who was John J. Reilly?

All of John's posts here

An archive of John's site

Host Cell Lines and Homo Sapiens

Greg Cochran has re-posted an interesting article about host cell lines.

A host cell line is a microorganism that was until fairly recently a part of some higher organism – roughly speaking, a contagious cancer. We know of one good example, transmissible venereal tumor, also known as canine venereal sarcoma or Sticker’s sarcoma, a contagious neoplasm of dogs. It is not contagious in the same sense as liver or cervical cancer, which are (usually) consequences of viral infections. In those cases, it is the virus that is infectious; here it is the cancer itself. Viable cells become engrafted onto mucous membranes and grow in the new host animal. Transmission is usually sexual, but licking or inhaling sometimes causes oral or nasal tumors. Chromosomal and genetic studies indicate that all cases of TVT share a common origin – all share a particular pattern of chromosomal rearrangement and carry characteristic insertions.

Greg goes on to speculate that the cell line derived from a cervical adenocarcinoma in Henrietta Lacks in 1951, HeLa, might be something like a host cell line. Descended from a homo sapiens, but a new species. Since I just happen to be reading Dune, this reminds me of the Bene Gesserit belief that not everyone who happens to be a homo sapiens counts as a human.

Which further reminds me of one of my favorite ideas of John Reilly's: humans, homo sapiens, and persons are different things. Discoveries like this just reinforce my conviction that John was right. However, I've found that right-thinking people seem obscurely scandalized when I repeat this. That wariness is probably a good thing, because de-humanizing people is usually the first step in justifying doing something bad to them. To say that a human being and a person are not logically identical is not the same thing as saying we should de-personalize some human beings, but it does open that up as a possibility. Thus I am not surprised when people seem put off.

Still, it does not follow that those three things are logically identical. They cannot be, because they are different kinds of things; it is a category mistake to identify them. John summarized it thus:

A human is an essence (if you don't believe in essences you don't believe in human beings); a homo sapiens is a kind of monkey; and a person is a phenomenon. Perhaps I read too much science fiction, but it is not at all clear to me that every human must necessarily be a homo sapiens. As for person, which is an entity, conscious or otherwise, that you can regard as a "thou," is conflated with the notion of person, as an entity able to respond in law, either directly or through an agent.

I think that the human beings we know of are homo sapiens, and that homo sapiens are persons. I just think you have to make an argument that these things are true, rather than making an indefensible assumption about it.

The last distinction John makes in the quote above often trips people up. If you conflate the two senses of the word person, and then further identify that with human being, I can see how the idea might be offensive. The problem is, it isn't true. If you can look past the controversies of contemporary American politics, the idea that a corporation can be a person has allowed institutions to flourish in the West, as opposed to tribes, nations, or dynasties, which are defined by common descent. An institution can continue through time once its founder has died, regardless of the familial relations of the people who comprise it.

Societies that lack the ability to create groups with a common purpose that do not depend on ties of kinship are weaker than those that can. We shouldn't cast that aside blithely.

I Am Legend Movie Review

I AM LEGEND

Directed by Francis Lawrence
Written by Mark Protosevich, Akiva Goldsman, Richard Matheson, John Corrington, and Joyce Corrington
Starring Will Smith and Alice Braga


Will Smith is not the first action star to portray Robert Neville on the big screen. Charlton Heston's Omega Man is another memorable performance based on the same story by Richard Matheson. Matheson's novel is one of my favorites, original and unexpected. It is a modern apocalyptic classic, identifiably influential on any number of zombie and vampire stories. Anyone who has not read it should.

This version of I Am Legend differs from the book in a number of ways, including having an ending that reverses the circumstances of Neville's death. However, I think this version is both a faithful adaptation of the source material, and in some ways superior to the original.

Smith's Neville is a masterful study in loneliness. This is an element done well in each medium. Gaunt and haunted, Smith gives us a good sense of what it would feel like to be the last man on Earth.

Always dancing on the edge of madness due to grief and isolation, Neville relies on his routines to sustain him. Fortunately, his obsession with finding a cure for the plague that swept man from the Earth is strong too. A bit of OCD seems to have some survival benefit.

When Neville finally meets another human, he no longer knows how to interact with her. This is really where this version diverges from Matheson's. 

The crucial difference is hope. There was no room in the novel for hope, or providence, both of which figure in the theatrical ending of I Am Legend. Another ending was filmed first, but it performed poorly with test audiences. I can see why. That first ending attempted to combine Matheson's dark and subtle conclusion with a bright and happy Hollywood ending where everyone rides off into the sunset. It was terrible, and I'm glad they redid it.

The theatrical ending is the happier, because in the Aristotelian sense, you cannot really say a life was a happy one until you see how it ends. In the alternate ending, Neville's whole life has been for naught. All his efforts, fruitless. All his suffering, pointless. Plus we get a lame version of the "monsters are people too, they are just misunderstood" trope. The way the vampire/zombies were handled in the novel made this seem like a good idea for the movie too, but their characterization in the movie is too feral to support this. 

By spending his life, Neville redeems himself from the brink of despair. The sappy alternate ending is not plausible because Neville no longer had any will to live. The providential arrival of another survivor allowed him an opportunity to die for something instead of nothing. Admittedly, the Legend tagline doesn't make nearly as much sense anymore, but the Job-like turn of this remake more than makes up for it.

My other movie reviews