The Long View 2005-05-02: Human Mice; Roe Misapprehension; Gödel versus Immanence

I demand to see my attorney!


I remember being struck by this story when it came out. If your human-mice hybrids start acting too human, the obvious solution is to just kill them all!


Human Mice; Roe Misapprehension; Gödel versus Immanence

 

No, deranged scientists are not trying to create human-mouse hybrids that have squeaky voices and demand to see lawyers to get them released from their cages. However, the scientists' lawyers do have a contingency plan, just in case:

In January, an informal ethics committee at Stanford University endorsed a proposal to create mice with brains nearly completely made of human brain cells. Stem cell scientist Irving Weissman said his experiment could provide unparalleled insight into how the human brain develops and how degenerative brain diseases like Parkinson's progress.

Stanford law professor Hank Greely, who chaired the ethics committee, said the board was satisfied that the size and shape of the mouse brain would prevent the human cells from creating any traits of humanity. Just in case, Greely said, the committee recommended closely monitoring the mice's behavior and immediately killing any that display human-like behavior.

There is more to humanity than cytology, so no doubt Stanford is correct in dismissing the possibility that the mice will be even partially human in any serious sense. On the other hand, if the mice do start to manifest human behaviors, might it not be a better idea to stop cutting their skulls open and begin being really nice to them?

* * *

For those of you who cannot wait for the news, here's an item dated November 9, 2008, by Stuart Taylor Jr. of National Journal, entitled How the Republicans Lost Their Majority:

In a succession of blockbuster 5-4 rulings, the Bush Court in 2007 approved state-sponsored prayers at public school functions such as graduations and football games (overruling the 1992 decision Lee v. Weisman); went out of its way to overrule Lawrence v. Texas, the 2003 decision that had recognized a constitutional right to engage in gay sex; and struck down key aspects of the Endangered Species Act as unconstitutional overextensions of Congress's power to regulate interstate commerce.

Then, this June, the same five justices banned consideration of race in state university admissions, overturning another 2003 precedent (Grutter v. Bollinger); this ruling sets the stage for a dramatic plunge in black and Hispanic enrollments at elite schools. Two days later, the same five-justice majority overturned Roe v. Wade, holding that it was up to elected officials to decide whether to allow unlimited access to abortion, to ban the procedure, or to specify circumstances in which it should be allowed or banned.

This last decision roiled the country and immediately transformed many elections -- for state legislature, governor, Congress, and the presidency -- into referenda on abortion. Republican candidates at all levels found themselves facing a politically impossible choice that put many on the road to defeat: Those who declared their support for a broad ban on abortion scared moderates into the arms of the Democrats. Those who opposed such a ban, or waffled, were deserted by much of their conservative base.

As I have noted before, the reversal of Roe would, and probably will, be a net boon for the Democrats, but not for the reasons this piece suggests. That decision has been an albatross around the neck of the Democratic Party for 30 years. If the court deconstitutionalized the question, then Democratic candidates around the country would be able to adapt their platforms to the views of their constituents. That would leave them free to focus their campaigns on economic issues, where it is not at all clear that the Republicans have an advantage. In other words, it would turn the clock back to about 1960, when the Democrats usually won.

* * *

On a not entirely different note, there is a new book by Rebecca Goldstein, Incompleteness: The Proof and Paradox of Kurt Gödel (Great Discoveries), which was favorably reviewed by Polly Shulman in yesterday's New York Times. I've been reading about Gödel for years (I have a review of another biography here), but the review helped to clarify some points for me. Shulman notes:

The dream of these formalists [of the early 20th century] was that their systems contained a proof for every true statement. Then all mathematics would unfurl from the arbitrary symbols, without any need to appeal to an external mathematical truth accessible only to our often faulty intuition.

This reminded me that the main thrust of phenomenology during this period was in the same direction: toward a philosophy that was completely immanent, with no transcendent elements. For existentialists, and for postmodernists who hold that the only truth is intersubjective, immanence is still the last word in sophistication. (In a foggy way, the notion even wafts through legal theory.) What the review, and apparently the book, remind us of is that, in formal logic, this project collapsed:

To put it roughly, Gödel proved his theorem by taking the Liar's Paradox, that steed of mystery and contradiction, and harnessing it to his argument. He expressed his theorem and proof in mathematical formulas, of course, but the idea behind it is relatively simple. He built a representative system, and within it he constructed a proposition that essentially said, ''This statement is not provable within this system.'' If he could prove that that was true, he figured, he would have found a statement that was true but not provable within the system, thus proving his theorem. His trick was to consider the statement's exact opposite, which says, ''That first statement -- the one that boasted about not being provable within the system -- is lying; it really is provable.'' Well, is that true? Here's where the Liar's Paradox shows its paces. If the second statement is true, then the first one is provable -- and anything provable must be true. But remember what that statement said in the first place: that it can't be proved. It's true, and it's also false -- impossible! That's a contradiction, which means Gödel's initial assumption -- that the proposition was provable -- is wrong. Therefore, he found a true statement that can't be proved within the formal system.

Thus Gödel showed not only that any consistent formal system complicated enough to describe the rules of grade-school arithmetic would have an unprovable statement, but that it would have an unprovable statement that was nonetheless true. Truth, he concluded, exists ''out yonder'' (as Einstein liked to put it), even if we can never put a finger on it.

Note that this does not mean the truth is unknowable, only that some truths cannot be known through logic alone. Again, we are back to Aquinas.
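For readers who want the skeleton of the argument in symbols, here is a minimal sketch of the standard construction (my notation, not Shulman's or Goldstein's; Prov is the formal provability predicate of a consistent system T):

    % The diagonal lemma supplies a sentence G that "says" of itself
    % that it is unprovable:
    T \vdash G \leftrightarrow \neg\mathrm{Prov}(\ulcorner G \urcorner)
    % If T proved G, it would also prove Prov(<G>) and, via the
    % equivalence, \neg Prov(<G>) -- a contradiction. So:
    \text{if } T \text{ is consistent, then } T \nvdash G
    % And since unprovability is exactly what G asserts, G is true.

Formalizing this argument inside T itself yields the second incompleteness theorem: a consistent T cannot prove its own consistency.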

* * *

Speaking of which, anyone in the New York area who is interested in a Latin liturgy for Ascension Thursday (May 5) might consider Holy Rosary Church in Jersey City. Mass is at 5:30 PM; the address is 344 Sixth Street. And while I'm at it, here's my chant commercial again.

Copyright © 2005 by John J. Reilly

Why post old articles?

Who was John J. Reilly?

All of John's posts here

An archive of John's site

Linkfest 2017-06-23

The Stock Market Speaks: How Dr. Alchian Learned to Build the Bomb

Sometimes public information is the best spy you have.

I saw a series of Tweets this week about how the boss-class is sticking it to workers, usually with the active involvement of the political Left. You expect that kind of thing from the Right, but the increasing alignment of wealth and education with Left-wing parties means the boss votes liberal. My favorite story was the one where the drywall guy complained that no one but illegal immigrants from Mexico wanted to work 90-hour weeks for him. I miss unions with teeth.

How the Democrats Lost Their Way on Immigration

A look at the political battles that made center-Left immigration skepticism disappear.

I (Don't) Like You! But Who Cares? Gender Differences in Same Sex and Mixed Sex Teams

Hopefully it replicates.

OUR OUTDATED DEBATES

A fascinating article at First Things makes the argument that technological change is leaving our static political debates behind. For example, the double-edged sword of artificial wombs.

Social Justice and the End of Moral Certainty

A look at the progressive mindset.

8 Figures on Gun Ownership, and Attitudes, in America

There continues to be a partisan gap in America, but it is smaller than the gap in the professional activists who make the most noise on the subject. For example, the vast majority of both Democrats and Republicans are not in favor of concealed carry without a permit. 

The Long View: Is Mathematics Constitutional?

A recent popular [well, as popular as a massive book full of equations can be] exposition of mathematical Platonism is Roger Penrose's The Road to Reality. It even has practice problems in it, with devoted communities of amateurs trading tips on how to solve them. Mathematical Platonism, or something much like it, really is the default position of many mathematicians and physicists.

Since I ended up an engineer, perhaps it isn't really surprising that I always found the moderate realism of Aristotle and Aquinas more appealing. 

There is a good quote in this short essay that I've used to good effect:

"Because the whole point of science is to explain the universe without invoking the supernatural, the failure to explain rationally the 'unreasonable effectiveness of mathematics,' as the physicist Eugene Wigner once put it, is something of a scandal, an enormous gap in human understanding."
I, for one, was a little taken aback by the proposition that science had any "point" other than to describe the physical world as it actually is, but let that pass.

Philosophy of science is a field in fine shape, but many fans of science try to use it as a cudgel against religious believers. Insofar as that attempt is mostly ignorant of both science and philosophy, it isn't particularly illuminating.


Is Mathematics Constitutional?

 

The New York Times remains our paper of record, even in matters of metaphysics. For proof, you need only consult the article by George Johnson that appeared in the Science Section on February 16, 1998, entitled: "Useful Invention or Absolute Truth: What Is Math?" The piece was occasioned by a flurry of recent books challenging mathematical Platonism. This is the belief, shared by most mathematicians and many physicists, that mathematical ideas are "discovered" rather than constructed by the mathematicians who articulate them. Consider the following sentence:

"Because the whole point of science is to explain the universe without invoking the supernatural, the failure to explain rationally the 'unreasonable effectiveness of mathematics,' as the physicist Eugene Wigner once put it, is something of a scandal, an enormous gap in human understanding."

I, for one, was a little taken aback by the proposition that science had any "point" other than to describe the physical world as it actually is, but let that pass. The immediate philosophical peril to the world of the Times is narrower. That is, it is hard to be a thoroughgoing secular materialist if you have to acknowledge that there are aspects of reality that cannot be explained as either products of blind chance or of human invention. Supreme Court Justice Anthony Kennedy has even suggested that systems of ethics claiming an extra-human origin are per se unconstitutional. Judging by some of the arguments against mathematical Platonism presented by the Times piece, however, we may soon see Establishment Clause challenges to federal aid for mathematical education.

The best-known of the books that try to de-Platonize mathematics is "The Number Sense: How the Mind Creates Mathematics," by the cognitive scientist Stanislas Dehaene. His argument is that the rudiments of mathematics are hardwired into the human brain, and thus that mathematics is foundationally a product of neurology. The evidence is various. There are studies of accident victims suggesting there may be a specific area of the brain concerned with counting, as well as stimulus-response studies showing that some animals can be trained to distinguish small-number sequences. (Remember the rabbits in "Watership Down," who had the same name for all numbers from five to infinity?) Relying on even more subtle arguments is a recent article by George Lakoff and Rafael E. Núñez, "Mathematical Reasoning: Analogies, Metaphors and Images." [BE: the actual article is titled The Metaphorical Structure of Mathematics: Sketching Out Cognitive Foundations for a Mind-Based Mathematics] The authors suggest that numbers are simply extrapolated from the structure of the body and mathematical operations from movement. (The article is part of an upcoming book to be called "The Mathematical Body.")

I have not read these works, so it is entirely possible I am missing something. Still, it seems to me that there are two major problems with analyses of this sort. First, if the proposition is that mathematical entities are metaphysical universals that are reflected in the physical world, it is no argument against this proposition to point to specific physical instances of them. In other words, if numbers are everywhere, then it stands to reason that they would be inherent in the structure of the brain and body, too.

If Dr. Dehaene has really found a "math-box" in the head, has he found a fantasy-gland or an organ of perception? The Times article paraphrases him as saying that numbers are "artifacts of the way the brain parses the world...like colors. Red apples are not inherently red. They reflect light at wavelengths that the brain...interprets as red." The distinction between things that are "really red" and those that "just look like red" has always escaped me, even in languages with different verbs for adjectival predicates and the copula. Doesn't a perfectly objective spectral signature identify any red object? In order to avoid writing the Monty Python skit that arguments about perception usually become, let me just note here that the experience of qualia (such as "redness") has nothing to do with the cognitive understanding of number, for instance the numbers that distinguish the wavelengths of colors.
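To make the "objective spectral signature" point concrete, here is a minimal sketch; the band boundaries are rough conventions I am supplying for illustration, not anything from the Times piece:

    def looks_red(wavelength_nm: float) -> bool:
        """Crude classifier: is the dominant wavelength inside the band
        conventionally labeled 'red' (roughly 620-750 nm)?"""
        return 620 <= wavelength_nm <= 750

    print(looks_red(700))  # True: long-wavelength light, the ripe apple
    print(looks_red(530))  # False: mid-spectrum green

Whatever one makes of qualia, the number on the left of that comparison is as objective as measurement gets.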

There is a more basic objection to the physicalistic reductionism at work here, however. Consider what it would mean if it worked. Suppose that proofs were presented so compelling as to convince any honest person that mathematics was indeed nothing more than an extrapolation of the structure of the nervous system, or of the fingers on the hand, or of the spacing of heartbeats. We would then have a situation where we would have to explain the "unreasonable effectiveness" of the human neocortex, or even the universal explanatory power of the human anatomy. This would be anthropocentrism come home to roost. You could, I suppose, argue that we only imagine that the human neurological activity called mathematics lets us explain everything; the reality is that we only know about the things that our brains let us explain. Well, maybe, but then that suggests that there are other things that we don't know about because our brains are not hardwired to explain them. Maybe those are the things that are really red?

There are indeed problems with mathematical Platonism, the chief of which is that it is hard to see how the physical world could interact with the non-sensuous ideal forms. (John Barrow's delightful "Pi in the Sky" will take interested readers on a fair-minded tour of the philosophy and intellectual history of this perennial question.) The most workable solution is probably the "moderate Realism" of Aquinas. He held that, yes, there are universals, but that we can know about them only through the senses. This seems reasonable enough. In fact, this epistemological optimism is probably the reason science developed in the West in the first place. There may even be a place for Dr. Dehaene's math-box in all this, if its function is regarded as perceiving numbers rather than making them up. What there can be no place for is the bigotry of those who believe that science exists only to support certain metaphysical prejudices.

Copyright © 1998 by John J. Reilly


The Long View 2005-03-18: Extinctions: Periodic & Deserved

As far as I know, nothing serious ever came of Rohde and Muller's 2005 paper on cyclical extinctions, but I linked the image above to a copy of the original paper. I think it is usually a mistake to look for alternative explanations for the Cretaceous extinction.


Extinctions: Periodic & Deserved

 

Just when you thought it was safe to read the paleontological journals again, this story appears:

BERKELEY, CA -- A detailed and extensive new analysis of the fossil records of marine animals over the past 542 million years has yielded a stunning surprise. Biodiversity appears to rise and fall in mysterious cycles of 62 million years for which science has no satisfactory explanation...For their study, Muller and Rohde defined fossil diversity as the number of distinct genera alive at any given time...Muller suspects there is an astrophysical driving mechanism behind the 62 million year periodicity...[On the other hand] "My hunch, far from proven," Rohde said, "is that every 62 million years the earth is releasing a burst of heat in the form of a plume formation event..."

That asteroid impact on the boundary between the Cretaceous and the Tertiary, the one that is supposed to have exterminated the dinosaurs, is pretty well established. On the other hand, there have always been paleontologists who insist that extinctions were already underway when the asteroid (or asteroids, in some accounts) struck the Earth. Perhaps the impact simply worsened a bad situation. Conversely, maybe a similar impact at another time would not have such serious effects.

Oh, and by the way: if that last big die-off was about 65-million years ago, and the period of diversity collapse is about 62-million years, then...OH MY GOD!!!

* * *

As another example of an aggravating factor, consider the lead opinion piece in the Weekly Standard of March 21, entitled "Let 'er Rip." Written by Fred Barnes, who suffers from the delusion he is doing the Administration a favor, the piece encourages President Bush to ignore all those reports he has been hearing that his privatization plan for Social Security makes the national gorge rise. Rather, Barnes advises, the president should promote his proposal in season and out, until the public awakes to the splendor of the concept. Barnes concludes with this flourish:

The Washington Times checked what would have happened if individual accounts, invested in market index funds, had been established in 1978. The Dow since then has soared from 820 to nearly 11,000, the S&P 500 from 96 to more than 1,220, and Nasdaq from 118 to roughly 2,050. Retirees would be living in high style. So, Mr. President, let 'er rip on individual accounts. You've got nothing to lose and momentous reform and a booming Republican Party to gain.

This is like an incident in The Hitchhiker's Guide to the Galaxy: the captain of the Golgafrinchan B Ark that crashes on the primitive Earth decides to jumpstart the new colony's economy, so he declares the leaves on all the trees to be currency. That does make everyone on the B Ark a multi-trillionaire, but it means that a single peanut from ship's stores will cost three deciduous forests.

Look, all those corporations listed in the securities exchanges are worth only so much. You can't make them worth more by pouring the savings of tens of millions of people into the markets in which their securities are traded. At best, you would get a much lower return on capital. At worst, and more likely, you would get a bubble bigger than the South Sea, followed by a bust big enough to scare a dinosaur to death. That is in fact what happened to the private-account systems in Britain and Sweden, to the great consternation of all concerned.
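For scale, here is a quick back-of-the-envelope check of what the Washington Times figures quoted above imply as compound annual returns over the roughly 27 years from 1978 to 2005 (the index levels come from the quotation; the arithmetic is mine):

    def cagr(start: float, end: float, years: float) -> float:
        """Compound annual growth rate implied by start and end values."""
        return (end / start) ** (1 / years) - 1

    for name, start, end in [("Dow", 820, 11_000),
                             ("S&P 500", 96, 1_220),
                             ("Nasdaq", 118, 2_050)]:
        print(f"{name}: {cagr(start, end, 27):.1%} per year")
    # Dow: 10.1%, S&P 500: 9.9%, Nasdaq: 11.2%

Those figures are close to the long-run average for American equities, and nothing in the arithmetic suggests such returns would survive tens of millions of new accounts bidding up the same underlying assets.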

Would someone please put this wounded Grendel of an idea out of its misery?

* * *

And speaking of bad ideas, here's the worst idea for a new performance genre since cold-water mud-wrestling:

"Baghdad Burning: Girl Blog From Iraq" is not a very good play, but it's worth your attention for two reasons. It's the only political drama in New York written from the point of view of an Iraqi who lived through the American invasion, and, for better or worse, it inaugurates an entirely new (and seemingly inevitable) theatrical genre - the blog play...

The blog in question is Riverbend, by the way. And what exactly happens on stage?

Instead of building a character, the show includes readings of her words from three women and one man, which adds to the muddled feel...When not speaking, the actors pace in a triangle or perform synchronized gestures that make them look like backup singers to a 1960's pop band.

It might be better to put laptops on the stage, linked to big flat-panel screens, on which the text of the blog and of the reader comments scrolls down. That way, the actors could be absent as well as the audience.

Copyright © 2005 by John J. Reilly


Linkfest 2017-06-09

I love the movie Cars, and I am very excited about Cars 3, which looks less like a cashing-in on the initial success of the first movie [you gotta pay for studios somehow] and more like a real Pixar-worthy sequel with better animation technology.

This post is nearly ten years old now, and I think still pertinent. 

I didn't learn anything new here, but I've got a better memory and a deeper interest in history than the average American. The thing that gets me is no historical figure can stand this kind of scrutiny. For example, here is Lincoln debating with Stephen Douglas. This is the argument of the Progressive Left, that no one before the present is redeemable in any way, but it surprises me when less radical people advocate for ideas that destroy their own position.

Speech on the Kansas-Nebraska act at Peoria, Illinois


It has been a while since I posted something about statistical software and graphs, so here you go!

An article attempting to link the Younger Dryas with Göbekli Tepe. I lack subject matter knowledge, but this is an interesting idea. Greg Cochran thinks the Younger Dryas impact hypothesis is bunk.

David Warren argues for the return of unsafe spaces at universities [with tobacco!]. 

I think I started reading David Warren about the same time I started reading John J. Reilly, shortly after 9/11. He wrote a column for the Ottawa Citizen at the time, and his beat was terrorism.

Warren was better suited than most. He had been to Afghanistan in his salad days. He ended up a big supporter of George W. Bush and of the War in Iraq as a clash of civilizations. Warren has since spent quite a bit of time in the wilderness repenting of his post-9/11 sins, and I still occasionally read his essays.

Answer: somewhat.

I've never felt like the Hobbit was a kid's book, but I have friends who feel otherwise.

The Long View: The Obvious Proof

I think there are better arguments for classical theism than the Argument from Design, but as eminent a philosopher as Antony Flew was eventually convinced by it. Philosophical atheism is a respectable position, but many of atheism's most vocal defenders are not actually espousing that position, but rather a juvenile and reactive atheism that does them no credit. For those individuals, a psychological explanation may have merit.


The Obvious Proof: A Presentation of the Classic Proof of Universal Design
by Gershon Robinson and Mordechai Steinman
CIS Publishers, 1993
$13.95 Hardcover, $10.95 Paper
141 Pages

 

Is there such a thing as an honest atheist? Maybe not, according to Gershon Robinson and Mordechai Steinman (both of whom are writers, the latter with a physics degree). This short book (really an extended essay) does not add much new to the Design debate. What it does do is try to turn the intellectual tables by interpreting atheism as a species of willful irrationality.

The thesis of "The Obvious Proof" is that the scientific evidence for intelligent design in nature is at least as great as the evidence that would normally persuade us that something is artificial. The authors' benchmark for common sense in this matter is the black obelisk buried beneath the surface of the Moon in the film "2001," which audiences around the world immediately intuited to be a product of intelligence. (This argument is set out more briefly at the website The 2001 Principle, where you can also order the book.) The authors present a useful summary of several popular treatments of the Anthropic Principle in cosmology and the extraterrestrial "seeding theory" of the origin of life on Earth. However, the book does not attempt a comprehensive presentation of the Argument from Design. (Among other things, such a presentation would require a discussion of the evidence from chaos and complexity studies that the natural world is in large measure self-organizing.) Rather, the authors assume that Design is such an obvious explanation for order in nature that the reluctance of certain scientists to accept it can have only a psychological explanation.

The explanation that the authors favor is the Gestalt psychology principle of "cognitive dissonance," which causes people to reject empirical information that does not fit into their mental categories. The authors sometimes seem to equate intellectual cognitive dissonance with Freudian repression. (Perhaps the distinction may not be hard and fast. In any case, a more purely Freudian explanation for atheism was developed a few years ago by the psychologist Paul Vitz.) What the authors are talking about here is not a failure of the imagination among scientists, which is what cognitive dissonance normally implies in a scientific context. Rather, they seek to define the reasons for the emotional reluctance found among at least some scientists to accept the theistic implications of empirical research.

The five emotional grounds the authors present for this reluctance are rather intriguing. Three are things you might expect: the desire for complete moral autonomy, outraged intellectual pride faced with the unknowable, and mere intellectual habit. One of the others, however, is the ontological anxiety that might occur should you accept that you are a product of another will. It's an interesting point: a meaningless universe is less threatening than an arbitrary one. The most engaging reason for atheism, though, is almost a kind of shyness. If God exists, then He must have abandoned us, since otherwise He would not be so enigmatic. Do you really want someone to exist who probably does not like you?

"The Obvious Proof" could be taken as a commentary on Rabbi Elchonon Wasserman's commentary on Maimonides' commentary on the beginning of the Decalogue. Maimonides concluded from the words "I am G-d, your Lord, Who took you out of the land of Egypt," that there is an actual duty under Jewish Law to believe in God. Such a commandment is reasonable, according to Rabbi Wasserman, because the existence of God should be obvious even to a boy by the time of his Bar Mitzvah. Those who deny the evidence for God, according to this view, do so because they have intellectual or emotional "investments" in a non-theistic universe.

It is certainly true that some scientists have a psychological ax to grind on the question of the existence of God. (My suspicion is that a disproportionate number of these people write popular science for just this reason.) It is probably also true that the perception of design in nature is a matter of intuitive common sense. However, intuitive common sense, even when it is correct, is not the same thing as a rigorous philosophical proof.

 

End

Copyright © 1998 by John J. Reilly


The Long View: The Lucifer Principle

Oath of the Horatii


This is a rare book I read before John recommended it. I think this is a worthy summary.


The Lucifer Principle: A Scientific Expedition into the Forces of History

by Howard Bloom

The Atlantic Monthly Press, 1995
466 pp., $24.00
ISBN 0-87113-532-9

 

If, like the present reviewer, you are a sucker for theories of history, you could do worse than this Social Darwinist theory-of-everything by Howard Bloom, a writer of popular science and noted PR man. The basic idea is reasonable enough: ideas are subject to much the same kind of survival pressures that genes are. Richard Dawkins, who popularized the notion that living organisms are simply carrying-cases that genes use to preserve and multiply themselves, also suggested that clusters of ideas, which he called "memes," similarly use human brains to survive. Bloom has taken this notion for a spin around the block. Just as genes care little for the fate of individuals and frequently promote behaviors that result in short, violent lives for individuals, so ideas drive people to war, revolution and communal strife in such a way as to promote the ideas' own distribution. The "Lucifer Principle" of the title is the thesis that history has been so bloody because ruthless "natural selection" applies to all levels of life, both the biological and the social.

Theory-of-history buffs know, of course, that one of the charms of this type of literature is the opportunity to watch reasonable-sounding universal principles turn into parodies of themselves as their implications are developed. "The Lucifer Principle" does not disappoint on this score. The book has especially wonderful chapter titles, such as "Oliver Cromwell--The Rodent Instincts Don a Disguise," "Righteous Indignation = Greed for Real Estate," and "Are There Killer Cultures?" It is, of course, perfectly true that ideas often prosper because of their success as social glue rather than because of any intrinsic merit they may have. Still, the fact of the matter is that the chief "survival" tests that ideas must pass are their own internal consistency and their conformance with the empirical world. Bloom sometimes seems to suggest that any idea which is not a matter of immediate sense experience is fictitious, a denizen of "the Invisible World," and so can be judged only with regard to its ability to survive. However, it is perverse to simply assume that abstractions which have spread widely through the world, from the theology of the Trinity to the idea of the number "zero," owe no part of their success to their intrinsic aesthetic properties or to practical utility. To return to the biological analogy, there are indeed genes that survive simply as genes, as biochemical tricks that are played on living organisms. We call such genes "viruses," and we recognize that they are parasitic on real life. Much the same is probably true of ideas.

An interesting feature of the book is the way that memes and their deterministic control over everything seem to have less and less explanatory power as you get to social questions that interest the author. Thus, for instance, the answer to the question "Are There Killer Cultures?" is "yes," and the chief example is Islam. This religion is, it seems, a particularly bloody-minded "meme." It is far more aggressive and cruel than today's Western civilization. Although social interactions are governed by the inhuman struggle of memes to survive, still we are told, with no preamble, that these things are "a matter of degree." The final chapters of the book simply restate the arguments of the "declinists," such as those found in Paul Kennedy's "The Rise and Fall of the Great Powers," augmented by a metaphorical interpretation of international relations as a barnyard pecking order. Bloom's policy recommendations sound less like those of a Social Darwinist than of a back-to-basics education reformer.

The author's metaphysical system (which is what his "theory" is) really leaves no perspective from which to criticize nature, "red in tooth and claw." It is therefore not clear why he expresses the hope we might channel our violent tendencies into peaceful pursuits. It would be more logical to argue, like Nietzsche, that the most natural thing to do is overcome our inherited humanitarian prejudices. The fundamental incoherence of the book is not because of some particular flaw in the author's logic, however, but because of the impossibility of what he is trying to do. Historical reductionism simply does not work, whether the universal principle you propose is class conflict, or a Masonic conspiracy, or a covert extraterrestrial breeding program. In the final analysis, the best such theories of history can be is entertaining.

Copyright © 1996 by John J. Reilly


The Long View: Richard Feynman and Isaac Asimov on Spelling Reform

It is not just me, John J. Reilly, and Emil Kirkegaard who talk about this.

I will note that I'm a little dubious of the idea that orthographic reform will boost English reading ability dramatically.


Richard Feynman and Isaac Asimov on Spelling Reform

 

by John J. Reilly

Note: This essay appeared in issue No. 25 of the Journal of the Simplified Spelling Society (1999/1).

The subjects of this note were both American scientists who became major figures in popular culture. [1] The physicist Richard Feynman (1918-1988) is best known for the work in quantum electrodynamics that won him a Nobel Prize in 1965. His career extended from work on the Manhattan Project in the early 1940s to a conspicuous role on the official commission of inquiry into the causes of the space shuttle Challenger disaster of 1986. Isaac Asimov (1920-1992) was a biochemist who taught at Boston University for many years, but became famous as a prolific writer of science fiction and popular science. Estimates of the number of his books run to over 500; he himself lost count.

Both Feynman and Asimov became public sages of a sort. Many scientists, given a little encouragement, are willing to express opinions on anything under the sun, but these two belonged to the rather smaller class of such people whose opinions were actually sought by a wide audience. Considering the range of topics on which they commented, it is not really surprising that they touched on the reform of English spelling. While both advocated reform, neither had more than a passing interest in the subject. The few remarks I discuss here may be all they ever had to say on the matter.

While my research has not been exhaustive, the only recorded remarks by Richard Feynman on spelling reform I have been able to discover were made in the course of a talk entitled "This Unscientific Age," one of the John Danz Lectures that Dr. Feynman delivered at the University of Washington in April, 1963. [2] The burden of that talk is that social and scientific progress is inhibited by received opinions. In the course of his remarks, Feynman compares psychiatrists to witchdoctors and professors of English to medieval scholars who neither jettisoned old errors nor made useful innovations. Having disposed of literary scholarship, he went a step further: "Now let me get to a lower level still in this question. And that is, all the time you hear the question, `Why can't Johnny read?' And the answer is, because of the spelling."

After making a few allusions to the history and theory of alphabetic writing, Dr. Feynman observes that "things have gotten out of whack in the English language," which leads him to ask, "[w]hy can't we change the spelling?" In what may be taken as an expression of exasperation with his colleagues in the liberal arts, he declares: "If the professors of English will complain to me that the students who come to the universities, after all those years of study, still cannot spell `friend,' I say to them that something's the matter with the way you spell `friend.'"

So obvious does Dr. Feynman find the need for improvements in English spelling that he has trouble seeing what arguments could be raised against such a project: "[I]t can be argued ..... that [language reform is] a question of style and beauty in the language, and that to make new words and new parts of speech might destroy that. But [the professors of English] cannot argue that respelling the words would have anything to do with the style. There's no form of art form or literary form, with the sole exception of crossword puzzles, in which the spelling makes a bit of difference to the style. And even crossword puzzles can be made with a different spelling."

This brings us to the question of how a reform might be accomplished: "And if it's not the English professors that do it, and if we give them two years and nothing happens -- and please don't invent three ways of doing it, just one way, that everybody [can get] used to -- if we wait these two or three years and nothing happens, then we'll ask the philologists and the linguists and so on because they know how to do it. Did you know that they can write any language with an alphabet so that you can read how it sounds in another language when you hear it? [sic] That's really something. So they ought to be able to do it in English alone."

In some ways, Feynman's ideas are most illuminating for what they fail to consider. Even a cursory acquaintance with the history of attempts to reform English spelling shows that more than "two or three years" have been needed to devise a universally acceptable reformed system. Experience has also shown that, at any one time, there are likely to be far more than "three ways" under consideration as candidates for such a system. One interesting point is that Dr. Feynman seems to regard the problem as purely technical. It should be entrusted to the "philologists and linguists," who at least use an abstruse symbology, rather than to those frowzy-minded professors of English.

Reading this, I was reminded of a critique I read some time back, entitled Higher Superstition, [3] that sought to explain the postmodern assault on the objectivity and institutional prestige of the natural sciences. According to the authors, the attempt to reduce science to a merely cultural phenomenon is revenge for the dismissive attitude taken by natural scientists toward the liberal arts during the late `50s and early `60s, when the hard sciences got all the grant money.

Of course, the most important element that is lacking in Dr. Feynman's remarks is any consideration at all of how a reformed system would be implemented. The assumption seems to be that, once the linguists have cooked up a way to reproduce the phonetic precision of the IPA in the English version of the Latin alphabet, then the new spelling could be adopted simply by fiat. Again, history suggests otherwise. As we turn to Isaac Asimov's thoughts on spelling reform, we will find more serious attention to the problem of how to get people to use a reformed system. There are, however, other conceptual problems with what this popular sage has to say.

In 1982, Dr. Asimov published two essays that touched on spelling reform. In the later of the two, "A Question of Spelling," [4] he followed Dr. Feynman in linking the deficiencies of English spelling with the problems of education. The particular occasion for the essay, he says, was a mail solicitation from an organization calling itself the "Reading Reform Foundation." The letter recited the familiar complaints about the high degree of functional illiteracy in the United States. However, Asimov was not much persuaded by the Foundation's argument that the key to alleviating the problem is better teaching methods (which no doubt the letter asked him for money to promote). He was particularly unimpressed with the letter's claim that 87% of all English words are spelled phonetically. That left 13% that were not phonetically spelled, and those were likely to be the most commonly-used words in the language.

Unlike Feynman, Asimov jumps right in and makes a stab at some suggested respellings. Consider "through," "coo," "do," "true," "knew" and "queue," he asks. Why not just spell them "throo," "koo," "doo," "troo," "nyoo" and "kyoo"? These respellings would in fact fit within some familiar reform proposals, though perhaps few reform advocates would go along with his assertion that the obvious respelling of "night" should be "nite." Then there is a larger problem.

Noting that the plural of "man" is "men," but that young children will naturally assume that "mans" is the plural, he goes on to assert that the children are right. Thus, along with his advocacy of spelling reform, he includes an argument for a completely regularized grammar, though he does not elaborate on it as fully. The suggestion, "Why not reform grammar, too?" is a common retort made by people who have just been introduced to the idea of spelling reform. Why some people confuse these things is a mystery to people who don't confuse them. In any case, Asimov's essay is the first instance I have ever seen of someone who equated spelling and grammar and who also proposed to reform them both. [5]

Asimov does acknowledge that a great deal of trouble would be occasioned by implementing the reforms he proposes. However, he gives three reasons why it would be worthwhile for everyone to take the trouble:

(1) However much trouble the reforms would be to us, they would make the lives of our children and grandchildren immeasurably easier. This is the sort of sacrifice that parents should be willing to make for their children.

(2) The reforms, once in place, would promote literacy. This would boost worker productivity and assist in enhancing national prosperity.

(3) Earth is in need of a common second language, and English is the most widespread current candidate. Removing the idiosyncrasies of English would promote its spread, which would promote international understanding and world peace.

The gist of the article is the suggestion that computers, particularly word-processing dictionaries, could greatly facilitate a transition to reformed spelling. Certainly he did not think that much hope of change was offered from any other quarter: "...I think that the home computer industry won't be putting out reformed `dictionaries' in response to an independent movement for spelling reform. I have no hope for an independent movement being powerful enough to achieve anything."

Nevertheless, history was on the side of spelling reform. We could expect to see modifications in the graphical representation of English in order to make it easier for machines to use: "...I think it is inevitable that computers [will] be designed to read the written word, and reproduce it; and even to hear the spoken word and put it into print or follow its orders. This can be done with the language as it is, but how much easier it would be if spelling is phonetic and grammar is regular." How much indeed.
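As a concrete sketch of the kind of "reformed dictionary" software Asimov was imagining, consider a word-processor filter along these lines; the mappings are just his examples from the essay, not a worked-out proposal:

    # Toy "reformed dictionary": rewrite traditional spellings on the fly.
    REFORMED = {
        "through": "throo", "coo": "koo", "do": "doo",
        "true": "troo", "knew": "nyoo", "queue": "kyoo", "night": "nite",
    }

    def respell(text: str) -> str:
        return " ".join(REFORMED.get(word.lower(), word)
                        for word in text.split())

    print(respell("I knew the queue would go through the night"))
    # -> I nyoo the kyoo would go throo the nite

A real transition tool would also have to handle capitalization, punctuation, and homographs, which is where an authorizing body would earn its keep.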

In this essay, Asimov seems to have foreseen a great deal of software that had not been written yet. Still, despite his genuine prescience, the arrival of the technologies he anticipated has made little impact on the chaotic nature of English spelling. Neither is there much sign that anyone is about to take his suggestion to create an "Academy of Spelling Reform," a body he hoped would be authorized to issue those new "word-processing dictionaries." (The term "spell-checker" had perhaps not yet been coined at the time this essay was written.) History has taken a frustrating turn. In 1900, it was common sense to many educated people that English spelling should be reformed, while the suggestion that machines might someday read texts aloud was inconceivable even to science fiction writers. Today, just shy of the year 2000, I have software that reads texts aloud, while it is spelling reform that has become inconceivable.

In closing, it should be emphasized again that neither Richard Feynman nor Isaac Asimov was greatly interested in spelling reform. To them, English spelling was just another inheritance from an irrational past that needed to be restructured. It is clear from what we have seen that their accomplishments in other areas gave them no special insight into the question. Nevertheless, it is worth considering their ideas in some detail and spreading awareness of them further. The substantial posthumous fame of Feynman and Asimov makes even their slight engagement with the subject a possible enticement for their many admirers to examine the question more closely.

NOTES

[1] Elaborate websites with eponymous URLs have been dedicated to each, a good indication that they have risen to at least subcultural significance. As of November 1998, the chief website relating to Feynman was at http://www.feynman.com. The most useful Asimov sites are at http://www.clark.net/pub/edseiler/WWW/asimov_home_page.html and http://www.asimov.com. All three links have extensive bibliographical information. The material relating to Asimov is particularly comprehensive.

[2] "The Meaning of It All: Thoughts of a Citizen Scientist," by Richard Phillips Feynman, (Helix Books, 1998), page 116.

[3] Higher Superstition: The Academic Left and Its Quarrels with Science, by Paul R. Gross and Norman Levitt (The Johns Hopkins University Press, 1994), page 86.

[4] "A Question of Spelling," in The Roving Mind, by Isaac Asimov, (Prometheus Books 1983), page 340. First published in Popular Computing (July 1982). An earlier essay, which I have been unable to obtain, is "Spell that Word!" in The Dangers of Intelligence, by Isaac Asimov, (1986). First published in American Way (March,1982).

[5] There are three conventional answers to the assertion that grammar reform and spelling reform are equivalent:

(a) Written alphabetic language is the servant of the spoken language. Alphabetic writing systems can be assessed by how well they represent speech. This is a fairly objective criterion. In contrast, there is no similarly objective way to assess which grammar is better than another.

(b) English grammar is not particularly irregular compared to most European languages. The same cannot be said of the written form of English as compared to the written forms of those languages.

(c) Shut up.

End

Copyright © 1998 by John J. Reilly


The Long View: Higher Superstition: The Academic Left and Its Quarrels with Science

The kind of thing that John Reilly laments in this book review is alive and well. If you want a taste of it, check out New Real Peer Review on Twitter, which simply reprints abstracts of actual, peer-reviewed articles. A favorite genre is the autoethnography. Go look for yourself; no summary can do it justice.

I will disagree with John about one thing: race and sex matter a lot for many medical treatments. For example, the drug marketed under the trade name Ambien, generically called zolpidem, has much worse side effects in women than in men, and it takes women longer to metabolize it.

This effect was memorably referred to as Ambien Walrus. I find this pretty funny, but I delight in the sufferings of others.

You can't ignore this stuff if you want to do medicine right. The reasons people ignore it vary, but you'll get a better result if you don't.


Higher Superstition: The Academic Left and Its Quarrels with Science
by Paul R. Gross and Norman Levitt
The Johns Hopkins University Press, 1994
314 pp, $25.95
ISBN 0-8018-4766-4

 

The Enemies You Deserve

 

If you are looking for an exposé of how political correctness in recent years has undermined medical research, corrupted the teaching of mathematics and generally blackened the name of science in America, this book will give you all the horror stories you might possibly want. There have been rather a lot of indictments of the academic left, of course, but this is one of the better ones. However, the book is most interesting for reasons other than those its co-authors intended. To use the same degree of bluntness that they use against the "science studies" being carried on by today's literary critics, what we have here is an expression of bewilderment by a pair of secular fundamentalists who find themselves faced with an intellectual crisis for which their philosophy provides no solution.

Paul Gross is a biologist, now at the University of Virginia but formerly director of the Woods Hole Marine Biological Laboratory, and Norman Levitt is professor of mathematics at Rutgers University in New Jersey. They repeatedly insist, no doubt truthfully, that they have no particular interest in politics and that they are not programmatic conservatives. What does worry them is the increasing number of faculty colleagues in the liberal arts who take it as an article of faith that the results of empirical scientific research are biased in favor of patriarchy or capitalism or white people. The people who have this sort of opinion they call "the academic left," a catchall category that includes deconstructionists, feminists, multiculturalists and radical environmentalists.

The authors have a good ear for invective, such as this happy formula: "...academic left refers to a stratum of the residual intelligentsia surviving the recession of its demotic base." There has always been something rather futile about the radicalization of the academy, and in some ways the movement is already in retreat. The ideas of the academic left are based in large part on Marxist notions that were originally designed for purposes of revolutionary agitation. Revolutionary socialist politics has not proven to have the popular appeal one might have hoped, however. Marxism has therefore been largely replaced among intellectuals by that protean phenomenon, postmodernism. Although postmodernism incorporates large helpings of Freudianism and the more credulous kind of cultural anthropology, it remains a fundamentally "left" phenomenon, in the sense of maintaining an implacable hostility to market economics and traditional social structures. However, postmodernists have perforce lowered their goal from storming the Winter Palace to inculcating the "hermeneutics of suspicion" in undergraduates. The results of these efforts were sufficiently annoying to incite Gross and Levitt to write this book.

Postmodernists presume that reality is inaccessible, or at least incommunicable, because of the inherent unreliability of language. Science to postmodernists is only one of any number of possible "discourses," no one of which is fundamentally truer than any other. This is because there are no foundations to thought, which is everywhere radically determined by the interests and history of the thinker. Those who claim to establish truth by experiment are either lying or self-deluded. The slogan "scientific truth is a matter of social authority" has become dogma to many academic interest groups, who have been exerting themselves to substitute their authority for that of the practicing scientists.

The French philosophical school known as deconstructionism provided the first taste of postmodern skepticism in the American academy during the 1970s. It still provides much of its vocabulary. However, self-described deconstructionists are getting rare. Paul de Man and Martin Heidegger, two of the school's progenitors, were shown in recent years to have been fascists without qualification at certain points in their careers, thus tainting the whole school. On the other hand, while deconstruction has perhaps seen better days, feminism is as strong as ever. Thus, undergraduates in women's studies courses are routinely introduced to the notion that, for instance, Newton's "Principia" is a rape manual. Even odder is the movement to create a feminist mathematics. The authors discuss at length an article along these lines entitled "Towards a Feminist Algebra." The authors of that piece don't seem much concerned with algebra per se; what exercises them is the use of sexist word problems in algebra texts, particularly those that seem to promote heterosexuality. The single greatest practical damage done by feminists so far, however, is in medical research, where human test groups for new treatments must now often be "inclusive" of men and women (and also of certain racial minorities). To get statistically significant results for a test group, you can't just mirror the population in the sample; you have to have a sample above a mathematically determined size for each group that interests you. In reality, experience has shown that race and gender rarely make a difference in tests of new medical treatments, but politically correct regulations threaten to increase the size of medical studies by a factor of five or ten.
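The "mathematically determined size" is the standard power calculation. Here is a minimal sketch for comparing two means (normal approximation, the conventional 5% significance level and 80% power; the effect size is an arbitrary illustration of a modest treatment effect):

    from scipy.stats import norm

    def n_per_group(effect_size: float, alpha: float = 0.05,
                    power: float = 0.80) -> float:
        """Approximate subjects per arm for a two-sided, two-sample
        comparison of means with standardized effect size d."""
        z_alpha = norm.ppf(1 - alpha / 2)
        z_beta = norm.ppf(power)
        return 2 * (z_alpha + z_beta) ** 2 / effect_size ** 2

    print(round(n_per_group(0.3)))  # about 174 subjects per arm

Powering the same comparison separately within each of k demographic subgroups multiplies total enrollment roughly k-fold, which is the factor-of-five-or-ten increase the authors complain about.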

Environmentalism has become a species of apocalyptic for people on the academic left. It is not really clear what environmentalism is doing in the postmodern stew at all, since environmentalists tend to look on nature as the source of the kind of fundamental values which postmodernism says do not exist. The answer, perhaps, is that the vision of ecological catastrophe provides a way for the mighty to be cast down from their thrones in a historical situation where social revolution appears to be vastly improbable. Environmentalists seem to be actually disappointed if some preliminary research results suggesting an environmental danger turn out to be wrong. This happens often enough, notably in cancer research, where suspected carcinogens routinely turn out to be innocuous. However, on the environmental circuit, good news is unreportable. The current world is damned, the environmentalists claim, and nothing but the overthrow of capitalism, or patriarchy, or humanism (meaning in this case the invidious bias in favor of humans over other animals) can bring relief. Only catastrophe can bring about this overthrow, and environmentalists who are not scientists look for it eagerly.

The basic notion behind the postmodern treatment of science is social constructivism, the notion that our knowledge of the world is just as much a social product as our music or our myths, and is similarly open to criticism. The authors have no problem with the fact that cultural conditions can affect what kind of questions scientists will seek to address or what kind of explanation will seem plausible to a researcher. What they object to is the "strong form" of social constructivism, which holds that our knowledge is simply a representation of nature. The "truth" of this representation cannot be ascertained by reference to the natural world, since any experimental result will also be a representation. Constructivists therefore say that we can understand the elements of a scientific theory only by reference to the social condition and personal histories of the scientists involved. This, as the authors correctly note, is batty.

The lengths to which the principle of constructivism has been extended are nearly unbelievable. Take AIDS, for instance, which has itself almost become a postmodernist subspecialty. The tone in the postmodernist literature dealing with the disease echoes the dictum of AIDS activist Larry Kramer: "...I think a good case can also be made that the AIDS pandemic is the fault of the heterosexual white majority." Some people, particularly in black studies departments, take "constructed" quite literally, in the sense that the AIDS virus was created in a laboratory as an instrument of genocide. Kramer's notion is more modest: he suggests that the extreme homosexual promiscuity which did so much to spread the disease in the New York and San Francisco of the late 1960s and early 1970s was forced upon the gay community by its ghettoization. This is an odd argument, but not so odd as the assumption that you can talk about the origins of an epidemic without discussing the infectious agent that causes it. The upshot is that AIDS is considered to be a product of "semiological discourse," a system of social conventions. It can be defeated, not through standard medical research, but through the creation of a new language, one that does not stigmatize certain groups and behaviors. (Dr. Peter Duesberg's purely behavioral explanation of AIDS, though it has the attractions of scientific heresy, gets only a cautious reception because of its implied criticism of homosexual sex.) The postmodern academy actually seems to have a certain investment in a cure for AIDS not being found, since the apparent helplessness of science in this area is taken as a license to give equal authority to "other ways of knowing" and other ways of healing, particularly of the New Age variety.

The postmodernist critics of science usually ply their trade by studiously ignoring what scientists themselves actually think about. The anthropologist Bruno Latour, for instance, has made a name for himself by subjecting scientists to the kind of observation usually reserved for members of primitive tribes. Once he was commissioned by the French government to do a post-mortem on their Aramis project. This was to be a radically new, computerized subway system in which small trams would travel on a vastly complicated track-and-switch system along routes improvised for the passengers of each car. The idea was that passengers would type their proposed destination into a computer terminal when they entered a subway station. They would then be assigned a car with other people going to compatible destinations. The project turned into a ten-year boondoggle and was eventually cancelled, and Latour was hired to find out what went wrong. Now, the basic conceptual problem with the system is obvious: the French engineers had to come up with a way to handle the "traveling salesman" problem, the classic problem of finding the shortest route that visits every point in a set. This seemingly simple question has no neat solution, and the search for approximate answers keeps the designers of telephone switching systems and railroad traffic managers awake nights. Latour did not even mention it. He did, however, do a subtle semiological analysis of the aesthetic design of the tram cars.
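To give a sense of why the routing question is hard, here is a minimal sketch of the classic nearest-neighbor heuristic for the traveling salesman problem. The function names and sample coordinates are invented purely for illustration: the cheap greedy answer comes easily, while the true optimum requires searching a factorially large space of tours.

```python
import math

# A minimal sketch of the nearest-neighbor heuristic for the traveling
# salesman problem. It greedily visits the closest unvisited point,
# which is fast but carries no guarantee of producing the shortest
# tour; the coordinates below are invented for illustration.
def nearest_neighbor_tour(points):
    unvisited = list(points[1:])
    tour = [points[0]]
    while unvisited:
        last = tour[-1]
        nxt = min(unvisited, key=lambda p: math.dist(last, p))
        unvisited.remove(nxt)
        tour.append(nxt)
    return tour

def tour_length(tour):
    # Sum of leg lengths, closing the loop back to the start.
    return sum(math.dist(tour[i], tour[(i + 1) % len(tour)])
               for i in range(len(tour)))

stations = [(0, 0), (3, 1), (1, 4), (5, 2), (2, 2)]
tour = nearest_neighbor_tour(stations)
print(tour, round(tour_length(tour), 2))
```

The greedy tour is often noticeably longer than the optimum, which is why the designers of real switching and routing systems settle for ever-better approximations rather than exact answers.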

Postmodernists regard themselves as omniscient and omnicompetent, fully qualified to put any intellectual discipline in the world in its place. They have this confidence because of the mistaken belief that science has refuted itself, thus leaving the field clear for other ways of understanding the world. They love chaos theory, for instance, having absorbed the hazy notion that it makes the universe unpredictable. Chaos theory in fact is simply a partial solution to the problem of describing turbulence. Indeed, chaos theory is something of a victory for mathematical platonism, since it shows that some very exotic mathematical objects have great descriptive power. The implications of chaos theory are rather the opposite of chaos in the popular sense, but this idea shows little sign of penetrating the nation's literature departments. The same goes for features of quantum mechanics, notably the uncertainty principle. Quantum mechanics actually makes the world a far more manageable place. Among other things, it is the basis of electronics. To read the postmodernists, however, you would think that it makes physicists flutter about their laboratories in an agony of ontological confusion because quantum theory phrases the answers to some questions probabilistically.
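The predictability point can be made concrete with the logistic map, a standard textbook example of deterministic chaos (my illustration, not one drawn from the book). The rule is a one-line deterministic formula; the "chaos" consists entirely in the rapid divergence of nearby trajectories, not in any failure of the mathematics to describe them.

```python
# The logistic map at r = 4, a standard textbook example of
# deterministic chaos (illustration only). The rule is trivial and
# fully deterministic; the complexity lies in the trajectory, where
# almost-identical starting points diverge quickly.
def logistic(x, r=4.0):
    return r * x * (1.0 - x)

x, y = 0.2, 0.2000001  # two nearly identical initial conditions
for step in range(25):
    x, y = logistic(x), logistic(y)

print(abs(x - y))  # the tiny initial gap has grown enormously
```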

On a more esoteric level, we have the strange cult of Kurt Goedel's incompleteness theorem, first propounded in 1931. Now Goedel's Theorem is one of the great treasures of 20th century mathematics. There are several ways to put it, one of which is that logical systems beyond a certain level of complexity can generate correctly expressed statements whose truth value cannot be determined within the system. Some versions of the "Liar Paradox" illustrate this quality of undecidability. It is easy to get the point slightly wrong. (Even the authors' statement of it is a tad misleading. According to them, the theorem "says that no finite system of axioms can completely characterize even a seemingly 'natural' mathematical object..." It should be made clear that some logical systems, notably Euclid's geometry, are quite complete, so that every properly expressed Euclidean proposition can be proved either true or false.) Simply false, however, is the postmodernist conviction that Goedel's Theorem proved that all language is fundamentally self-contradictory and inconsistent. Postmodernists find the idea attractive, however, because they believe that it frees them from the chains of logic, and undermines the claims of scientists to have reached conclusions dictated by logic.
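For readers who want the claim pinned down, here is a standard modern statement of the first incompleteness theorem; this is a conventional paraphrase, not the authors' wording:

```latex
% A standard modern statement of the first incompleteness theorem
% (a conventional paraphrase, not the wording used in Higher Superstition).
\textbf{Theorem (G\"odel, 1931).} Let $T$ be a consistent, effectively
axiomatizable formal theory that interprets elementary arithmetic.
Then there is a sentence $G_T$ in the language of $T$ such that
\[
  T \nvdash G_T \qquad \text{and} \qquad T \nvdash \neg G_T ,
\]
so $T$ is incomplete.
```

Note how little this says about language in general: the hypotheses (consistency, effective axiomatizability, enough arithmetic) are precise, and the conclusion concerns one formal theory at a time.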

Postmodernism, say the authors, is the deliberate negation of the Enlightenment project, which they hold to be the construction of a sound body of knowledge about the world. The academic left generally believes that the reality of the Enlightenment has been the construction of a thought-world designed to oppress women and people of color in the interests of white patriarchal capitalism. Or possibly capitalist patriarchy. Anyhow, fashion has it that the Enlightenment was a bad idea. Now that modernity is about to end, say the postmodernists, the idea is being refuted on every hand. Actually, it seems to many people of various ideological persuasions that the end of modernity is indeed probably not too far off: no era lasts forever, after all. However, it is also reasonably clear that postmodernism is not on the far side of the modern era. Postmodernism is simply late modernity. Whatever follows modernity is very unlikely to have much to do with the sentiments of today's academic left.

Granted that the radical academy does not have much of a future, still the authors cannot find a really satisfying explanation for why the natural sciences have been subject to special reprobation and outrage in recent years. In the charmingly titled penultimate chapter, "Why Do the People Imagine a Vain Thing?", they run through the obvious explanations. It does not take much imagination to see that today's academic leftist is often a refugee from the 1960s. Political correctness is in large part the whimsical antinomianism of the Counterculture translated into humorless middle age. Then, of course, there is the revenge factor. In the heyday of Logical Positivism, from the end of World War II to the middle 1960s, physical scientists tended to look down on the liberal arts. In the eyes of that austere philosophy, any statement which was not based either on observation or induction was literally "nonsense," a category that therefore covered every non-science department from theology to accounting. The patronizing attitude of scientists was not made more bearable by the unquestioning generosity of the subsidies provided by government to science in those years. The resentment caused by this state of affairs still rankled when the current crop of academic leftists were graduate students and undergraduates. Now they see the chance to cut science down to size.

While there is something to this assessment, the fact is that the academic left has a point. Logical Positivism and postmodernism are both essentially forms of linguistic skepticism. Both alike are products of the rejection of metaphysics, the key theme in Western philosophy since Kant. The hope of the logical positivist philosophers of the 1920s and 30s was to save just enough of the machinery of abstract thought so that scientists could work. Science is not skeptical in the sense that Nietzsche was skeptical, or the later Sophists. It takes quite a lot of faith in the world and the power of the mind to do science. And in fact, the authors note that Logical Positivism, with a little help from the philosophy of Karl Popper, remains the philosophical framework of working scientists to this day. The problem, however, is that Logical Positivism leaves science as a little bubble of coherence in a sea of "nonsense," of thoughts and ideas that cannot be directly related to measurable physical events.

Logical Positivism has many inherent problems as a philosophy (the chief of which is that its propositions cannot themselves be derived from sense experience), but one ability that even its strongest adherents cannot claim for it is the capacity to answer a consistent skepticism. In their defense of science, the authors are reduced to pounding the table (or, after the fashion of Dr. Johnson's refutation of Berkeley's Idealist philosophy, kicking the stone). Thus, it is a "brutal" fact that science makes reliable predictions about physical events, that antibiotics cure infections while New Age crystals will not, that the advisability of nuclear power is a question of engineering and not of moral rectitude. Well, sure. But why? "Because" is not an answer. Without some way to relate the reliability of science to the rest of reality, the scientific community will be living in an acid bath of skepticism and superstition.

The authors tell us that the scientific methodology of the 17th century "almost unwittingly set aside the metaphysical assumptions of a dozen centuries...[that] Newton or Leibnitz sought...to affirm some version of this divine order...is almost beside the point...Open-endedness is the vital principle at stake here...Unless we are unlucky, this will always be the case." In reality, of course, it surpasses the wit of any thinker to set aside the metaphysical assumptions of a dozen centuries, or even to set aside entirely those of his own generation. The scientists of the early Enlightenment did indeed scrap a great deal of Aristotle's physics. Metaphysically, however, they were fundamentally conservative: they settled on one strand of the philosophical heritage of the West and resisted discussing the matter further.

As Alfred Whitehead realized early in this century, science is based on a stripped-down version of scholasticism, the kind that says (a) truth can be reached using reason but (b) only through reasoning about experience provided by the senses. This should not be surprising. Cultures have their insistences. Analogous ideas keep popping up in different forms throughout a civilization's history. When the Senate debates funding for parochial schools, it is carrying on the traditional conflict between church and state that has run through Western history since the Investiture Controversy in medieval Germany. In the same way, certain assumptions about the knowability and rationality of the world run right through Western history. The Enlightenment was not unique in remembering these things. Its uniqueness lay in what it was willing to forget.

It would be folly to dismiss so great a pulse of human history as the Enlightenment with a single characterization, either for good or ill. Everything good and everything bad that we know about either appeared in that wave or was transformed by it. Its force is not yet wholly spent. However, one important thing about the Enlightenment is that it has always been a movement of critique. It is an opposition to the powers that be, whether the crown, or the ancient intellectual authorities, or God. The authors of "Higher Superstition" tell us that the academic left hopes to overthrow the Enlightenment, while the authors cast themselves as the Enlightenment's defenders. The authors are correct in seeing the academic left as silly people, who do not know what they are about. The authors are mistaken too, however. The fact is that the academic left are as truly the children of the Enlightenment as ever the scientists are. Science was once an indispensable ally in the leveling of ancient hierarchies of thought and society, but today it presents itself to postmodern academics simply as the only target left standing. Is it any wonder that these heirs of the Enlightenment should hope to bring down the last Bastille?

This article originally appeared in the November 1995 issue of Culture Wars magazine.

Copyright © 1996 by John J. Reilly


The Long View 2004-12-15: Good Ideas; Bad Reasons

Sunset Crater with the San Francisco Peaks in the background

I read State of Fear and I liked it. I also thought it was massively unfair. It was a fun story, given its premises. Taken as a flight of fantasy, it was a blast. If you think it was an unbiased representation of the facts....

Some of the action in the book takes place in Northern Arizona. Many of Crichton's books feature Arizona in some fashion. I remember reading that one of the inspirations for his work was the natural beauty of Sunset Crater. I'm a sucker for books that mention Flagstaff or Northern Arizona in some way.


Good Ideas; Bad Reasons

 

The scandal of the season seems to be Michael Crichton's new novel, State of Fear. The book is built on the thesis that the hypothesis of human-caused global warming, or even the claim that warming is occurring at all, is politicized pseudo-science. In an address last year, Crichton argued that global-warming belief is just one example of an increasing corruption of science. The paradigm case, he tells us, is the theory behind the Search for Extraterrestrial Intelligence (SETI):

[Consider the] Drake equation:

N = N* × fp × ne × fl × fi × fc × fL

Where N is the number of stars in the Milky Way galaxy; fp is the fraction with planets; ne is the number of planets per star capable of supporting life; fl is the fraction of planets where life evolves; fi is the fraction where intelligent life evolves; and fc is the fraction that communicates; and fL is the fraction of the planet's life during which the communicating civilizations live.

This serious-looking equation gave SETI a serious footing as a legitimate intellectual inquiry. The problem, of course, is that none of the terms can be known, and most cannot even be estimated. The only way to work the equation is to fill in with guesses. And guesses, just so we're clear, are merely expressions of prejudice. Nor can there be "informed guesses." If you need to state how many planets with life choose to communicate, there is simply no way to make an informed guess. It's simply prejudice.

Most of the policy points that Crichton makes in this essay have merit. The nuclear-winter hypothesis was a propaganda scam. Second-hand smoke has never been shown to do any harm except to the people who are sued for claims that it causes cancer. The Malthusian "Population Bomb" argument is the Thing That Will Not Die, no matter how far it diverges from the facts. Nonetheless, regarding the one example of alleged scientific malpractice that Crichton discusses in detail, SETI, he is plainly talking nonsense.

There may or may not be space aliens, but the question is not undecidable. Certainly it is possible to make observations that would yield a good estimate of the number of Earth-like planets. The values for the final variables in the Drake Equation cannot be known a priori, but even they can be constrained by whether or not signals are detected. In fact, to some degree they already have been.
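The arithmetic itself is trivial, which is part of the trouble. Here is a minimal sketch of the equation as computation; the two sets of parameter values below are pure guesses, chosen only to show how many orders of magnitude the answer swings as the guesses change:

```python
# Illustrative only: the Drake equation as arithmetic. The parameter
# values below are hypothetical guesses, not measurements; the point
# is how wildly N moves as the guessed factors change.
def drake(n_stars, f_planets, n_habitable, f_life, f_intel, f_comm, f_lifetime):
    return (n_stars * f_planets * n_habitable *
            f_life * f_intel * f_comm * f_lifetime)

optimist = drake(4e11, 0.5, 2, 0.5, 0.1, 0.1, 1e-4)
pessimist = drake(4e11, 0.1, 0.1, 1e-3, 1e-3, 0.01, 1e-7)
print(f"optimistic guesses:  {optimist:,.0f} civilizations")   # 200,000
print(f"pessimistic guesses: {pessimist:.2e} civilizations")   # 4.00e-06
```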

If this is Michael Crichton's idea of science, we should take what he says with a grain of salt.

* * *

Meanwhile, the ingenious argument for Intelligent Design has gained a notable convert:

New York (AsiaNews/CWN) Anthony Flew, the British scholar who has been one of the world's most [eloquent proponents] of atheism, has conceded that scientific evidence points to the existence of God...Early this year, writing in Philosophy Now magazine, Flew had indicated that his commitment to atheism was wavering. He wrote: "It has become inordinately difficult even to begin to think about constructing a naturalistic theory of the evolution of that first reproducing organism..."...Flew--whose 1984 essay, The Presumption of Atheism, fixed his place as the leading proponent of that view--emphasises that he has not accepted Christianity. He said: "I'm thinking of a God very different from the God of the Christian, and far and away from the God of Islam." He likened his current position to the deism of Thomas Jefferson, explaining that he is now sympathetic to the researchers who theorise about an 'intelligent design' in the working of creation.

The Argument from Design for the existence of God is the only major proof, if I am not mistaken, that critics never claimed to have refuted as a matter of logic. Hume himself simply noted that it was an empirical question that was very difficult to answer. The recent Intelligent Design hypothesis claims to have settled the empirical question by proving that biochemistry could not have evolved by chance within the estimated age of the universe.

Theism as a metaphysical postulate is actually a clarifying agent. If you assume that the universe is personal, or at least rational, you will find out more about it than would a true skeptic. In fact, I would argue that the great dead ends in intellectual history have been caused by "fear of religion." In biology and the social sciences, the rejection of teleology in principle has hindered research for decades.

So, I don't think that Anthony Flew will come to much harm for having dropped his guard on the God question. However, with regard to the argument from design, Hume was right.

* * *

Whether or not there are space aliens, we can still look forward to strange things appearing in the sky:

WASHINGTON, Dec 14 (Reuters) - Top U.S. Air Force officials are working on a strategy to put surveillance aircraft in "near space," the no man's land above 65,000 feet but below an outer space orbit, Air Force chief of staff Gen. John Jumper said on Tuesday.

Jumper said he would meet next Tuesday with the head of the Air Force Space Command, Gen. Lance Lord, to map out plans to get lighter-than-air vehicles into that region above the earth, where they could play a vital role in surveillance over trouble spots like Iraq...But in near space, such aircraft could carry out radar and imaging missions, carry communications nodes and even potentially relay laser beams from a ground-based source against a wide variety of targets, industry sources said.

That last bit is important. The tactic that will finally put paid to the strategic nuclear era is local defense of the targets. The lasers are more or less ready. One can imagine these relay platforms perpetually on station by the middle of this century, as prominent a feature of major cities as skyscrapers are today. Of course, the loss of nukes would make conventional intercontinental war possible again, but you can't have everything.

* * *

On the subject of strange forms of life, consider this bewildered report from the New York Times's Richard Bernstein on the long-awaited and perfectly predictable European rejection of immigration:

A Continent Watching Anxiously Over the Melting Pot: And so the question: why are Germans - and not just Germans but other Europeans as well - in such a state of anxiety and uncertainty about matters that have been more or less settled in the world's biggest country of immigration, the United States, for years? Why this discomfort with multiculturalism, this belief that assimilation, accepting the leitkultur, is the only way?

What planet does this writer live on? The issues in the United States are different, of course, but can even the Times not know that most immigrants in America want immigration drastically reduced? And that people look on multiculturalism as a racket the nation can no longer afford? Just wait till 2008.

* * *

A parting threat: my Spelling Reform top page has been updated. No more Mr. Nice Guy.

Copyright © 2004 by John J. Reilly


 

Linkfest 2017-04-28

How Western Civilization could collapse

Cyclical models of history are always popular.

Eye Makeup Used To Protect Children Can Poison Them Instead

This is something that should be on the short list of things to get rid of that will improve life for everyone.

The deep imagery of coal mining in the 1970s shows a lifestyle of peril and persistence

I love photo essays like this.

Hope for preemies as artificial womb helps tiny lambs grow

This kind of technology is always a double-edged sword; you can use it to save premature babies, or you could use it to completely separate sex from reproduction.

America's Next Great Metropolis is Taking Shape in Texas

An older article about the San Antonio-Austin corridor.

How enemies became friends in this unique lesson of Vietnam

Soldiers often find they have more in common with their enemies than with the civilians at home.

50 Years Ago, This Was a Wasteland. He Changed Everything

A beautiful example of wealth well spent.

How Online Shopping Makes Suckers of Us All

Armies of PhDs with algorithms are competing to fleece us of our money.

From Vilified to Vindicated: the Story of Jacques Cinq-Mars

Sometimes it takes a very long time to be vindicated.

What is adaptation by natural selection? Perspectives of an experimental microbiologist

Our understanding of evolution owes much to E. coli.


10 questions for Adam K. Webb

A fascinating 2006 interview by Razib Khan. I largely agree with Webb's positions, and I want to read his books.

The synthesis of ancient and modern physics and politics.

It has been a long time since I stopped by James Chastek's Just Thomism, but this is the kind of post that keeps me coming back. This is exactly how I feel about the philosophy of Aristotle and St. Thomas Aquinas.

The Long View: Island of the Day Before

The accurate measurement of longitude was a driving force of much of science during the Age of Exploration. A hell of a lot of good research came out of nations competing to do this faster and better, in search of filthy lucre. This is one of the foundational elements of my cocktail party theory of progress in science.

This book review from 1996 is the ultimate source of my characterization of hard science fiction [as opposed to space opera] as a way of introducing the reader to some useful concept in story form. In addition to the importance of measuring longitude, the baseless self-referential system of symbols that dominated the thought of the Renaissance and the Enlightenment is on display here.


The Island of the Day Before

 

by Umberto Eco
Harcourt Brace & Company, 1995
(trans. 1995 by William Weaver)
$25.00, 515 pp.
ISBN: 0-15-100151-0

The Memory-Scow of Fr. Wanderdrossel, S.J.

No matter how complicated a novel's plot or how subtle its message, all reviews of novels should start by telling you what the book is about. This novel is perfectly simple. Boy grows up during the Thirty Years' War. Boy goes on quest in order to find a way to determine longitude. Boy finds Jesuit. Boy goes mad and drowns. Everything else is a digression. Which is the problem.

If you believe you live in a world where getting there is not just half the fun but the only fun you are likely to have, novels should be written as a garland of digressions. Doubtless there has to be some unifying thread of plot to keep the whole thing together, but the treasure chest the characters have been seeking must always turn out to be empty. Of course, it would not do to have the characters ever quite realize just how much they are wasting their time. The author and the reader can know that the world, or at any rate the story, is meaningless; the characters' job is to try to find meaning and to fail in the attempt.

Umberto Eco, professor of semiotics at the University of Bologna, has written this kind of novel more than once. The trick is to use the book as a lecture room in which to instruct the reader in the milieu of some historical period or social setting, but without waxing tediously didactic. This, of course, is the method of good "hard" science fiction, which leaves the reader usefully instructed in certain principles of physics or biology after reading a story that otherwise closely resembles a Western. Eco does this very well. In "The Name of the Rose," we learned a great deal about late medieval ecclesiastical politics in the course of a story that did not pretend to be anything more than a merry parody of a Sherlock Holmes adventure. In "Foucault's Pendulum," we became much the wiser about the subsidy publishing business while following what I for one think was a slightly superior occult conspiracy. (Of course, Eco's occult conspiracy was not as good as the one in Theodore Roszak's underappreciated novel, "Flicker," but you can't have everything.)

"The Island of the Day Before" is even more ambitious, since we are treated to nothing less than a tour of the episteme of the 17th century. If you believe Foucault (the twentieth century deconstructionist, not a man after whom any type of pendulum is named), the eighteenth century was a time of logical, schematic knowledge. As exemplified by Linnaeus's system of biological classification, the Enlightenment mind was a-historical, given to discerning timeless formal patterns. The episteme of the nineteenth century, in contrast, was evolutionary in its view of both the physical world and of society. Together, these two epistemes constitute the mind of modernity. As a way, perhaps, to discerning the characteristics of the postmodern era, Eco tries to give us a sense of the European mind on the eve of modernity, before the epistemes of the modern era overwhelmed other ways of understanding the world.

People in that age did not expect the world, or their own lives, to make much sense as a linear narrative. It was not that they were suspicious of such narratives; as with Eco's plots, there was always one handy to tie things together. Rather, their first instinct was to look for subtle connections between particular and particular. Their politics and their science, and not least their prose, were complex, obscure, allusive. One did not try to understand the world by extrapolating from first principles. Rather, they lived in a world of signs and symbols. There was no high road to understanding. An education meant going from book to book, ancient and modern, in order to understand obscure allusions made by others, and as preparation to make a few of your own.

Most symbols were obviously of human manufacture. It was a great age for emblems and crests and heraldic devices, from which a suitably informed person might be able to deduce a great deal about the user's history and philosophy. Since they also thought that the natural world worked in much the same way, natural knowledge was a catalogue of the hidden sympathies between metals and birds and plants and planets. Medicine was an understanding of how the humors and parts of the body fit into this dense web of sympathies and pointers. I have long suspected that the Hermetic tradition fascinated Yeats because it provided a language in which things and not just words could rhyme. In the period in question, all sophisticated people thought like that.

All of this sounds as text-driven as recent schools of literary criticism, and maybe in practice it was. However, the big difference between late premodernity and early postmodernity was that the former was not incredulous of the possibility of certainty, of reliable foundations for thought and belief. Europe in the first half of the 17th century was clearly in a transitional state. Western Christendom had broken up politically and confessionally into divisions that could not yet acknowledge each other's legitimacy. Traditional Ptolemaic cosmology was no longer acceptable, but no alternative was available that was consistent with contemporary physics. Europe had become aware of the size of the planet and how alien many of the societies on it were, but as yet had no idea how to fit this new information into received ideas about history and providence. Of course, in another few decades, all these questions would be answered in what seemed to be a perfectly satisfactory manner. The connecting theme of this book, however, is a search for certainty that failed. The search was cartographical, the search for a method to determine longitude.

An extended discussion of this most interesting problem in the history of applied science is perhaps out of place here; readers who want a full account are referred to Dava Sobel's excellent recent book, "Longitude." Basically, the chief problem faced by early oceanic navigators was that, while it is not hard to tell how far north or south of the equator you are (the measure of latitude), there is no comparably simple way to tell how far east or west you are (the measure of longitude). You can determine latitude, for instance, by measuring how far the sun rises above the horizon at noon. It is a natural quantity, produced by the fact that the earth is a sphere that spins on its axis. Longitude, however, is a relative, artificial concept. You must pick an arbitrary line that runs north and south all around the earth, through both poles, and then try to figure how far east or west of this line you are. (Today, of course, this line, called the prime meridian, is the meridian of Greenwich in England.) If you know the difference between your local time and the time at the baseline, you can easily determine how far east or west of the line you are, since every hour's difference means 15 degrees difference in longitude. If you have an astronomical observatory and the leisure to make certain very fine astronomical measurements, such as the relative position of the moon to the fixed stars, you can determine what time it is at the prime meridian. However, such measurements are hard to do aboard ship. It took until John Harrison's invention in 1761 of a spring-regulated clock, suitable for use aboard ship, to finally solve the problem. Until then, the ambiguity of longitude created a doubt about one's position in the world that seemed almost ontological, or so Eco would have us believe.
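The clock method reduces to a single multiplication. Here is a minimal sketch, assuming you already know local apparent time (say, from the noon sun) and the simultaneous time at the prime meridian; obtaining that second number at sea was, of course, the whole problem:

```python
# A minimal sketch of the clock method for longitude described above.
# Assumes you know local apparent time and the simultaneous time at
# the prime meridian (e.g., from a chronometer). The earth turns 360
# degrees in 24 hours, i.e., 15 degrees per hour.
def longitude_from_clocks(local_hours: float, prime_meridian_hours: float) -> float:
    """Degrees east (positive) or west (negative) of the prime meridian."""
    return (local_hours - prime_meridian_hours) * 15.0

# Example: it is local noon while the chronometer set at the prime
# meridian reads 16:30; we are 4.5 hours behind, hence 67.5 degrees west.
print(longitude_from_clocks(12.0, 16.5))  # -67.5
```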

Supposedly, we know about the story in this novel because the author acquired the papers of one Roberto della Griva. Born in 1614, he was a member of a minor noble family of northern Italy, self-described vassals of the marquis of Monferrato. This memoir-romance-love letter collection was written while the author was cast away on an abandoned ship, whose whole company save one had been eaten by cannibals. The ship was anchored off an island in the south Pacific, located on a meridian which Roberto believed to be the natural prime meridian. (For reasons which still make a fair amount of sense, many people of the time thought the prime meridian should run through the Canary Islands.) He thus believed that he was on the west side of what today we would call the international date line, the island on the east side. When he looked at the island, he was therefore looking at the day before. As I said, the book is simple.

As a child, Roberto conceived the notion that he had a wicked brother, kept secret by the family, to whom Roberto ascribes all his own bad actions. Roberto believes, with varying degrees of seriousness, that he goes through life being punished for his brother's misdeeds. This imaginary brother, named Ferrante, serves not so much to relieve Roberto of moral responsibility as to explain Roberto's bad luck. If something bad happens to Roberto, it is Ferrante's fault, one way or another. Roberto finds the putative existence of Ferrante ever less comforting with the passage of time.

Roberto's experience of the homicidal meaninglessness of life begins at age 16 at the siege of Casale, whose fortress is key to the frontier between France and Italy. Eco explains with great lucidity the dispute which caused the French and their Italian allies, including Roberto and his father, to defend the city against the Spanish and the Holy Roman Empire. Even after the explanation, the siege still makes little sense. None of the participants was acting irrationally; logic simply worsened the tangle. The siege eventually degenerates into a truce whereby the Spanish occupy the town and the French the citadel. Finally the whole thing is settled by negotiation. Roberto's father is killed early on, to no particular purpose. Roberto returns to his ancestral land only long enough to arrange for an income for himself, and then travels to France.

The early 1640s find him in Paris, at the moment of the transition between the regime of Cardinal Richelieu and that of Cardinal Mazarin. Roberto does not really have a philosophical mind, but he is interested in scientific and metaphysical questions, so he frequents salons attended by astronomers and philosophers. We thus learn a great deal about what the early 17th century thought about the plurality of worlds and the possibility of a vacuum. The young Pascal puts in an appearance, and one character gets a letter from an officer serving in Holland who, though we are never told so, is Descartes. Roberto sees a successful application of a substance called the "powder of sympathy." This is used to treat wounds, not by application to the wound itself, but by application to the weapon that caused it. He becomes something of an expert on sympathetic medicine, with grave consequences for his future. He also becomes infatuated with one of the great ladies of the salons. He believes, through a fanciful interpretation of the available information, that she is equally infatuated with him. He starts writing her self-revealing letters, a practice he continues even when there is no way to deliver them. This habit eventually produced Eco's holographic manuscript.

These pleasant years in Paris are ended when Cardinal Mazarin dragoons Roberto for a machination. (Roberto blames Ferrante for the misunderstanding that puts Roberto into the Cardinal's power.) Mazarin, like the leaders of other maritime states of his time, was much interested in the longitude question. He was particularly concerned that the English might find a solution before France did. Learning that the English were about to conduct experiments using the principle of the powder of sympathy to transmit the time to ships at sea, the Cardinal blackmails Roberto to take passage on the Amaryllis, a Dutch vessel on which the experiments would be made. Since Roberto is being sent to act as Mazarin's spy, the Cardinal gives him a good measure of sound advice about human nature and the ways of the world, such as one might expect from a contemporary of Baltasar Gracian, author of "The Art of Worldly Wisdom." (Actually, Mazarin's instructions also sound like Elrond's farewell address to the Fellowship of the Ring in the "Lord of the Rings," or at least what Elrond would have sounded like, had he been a pompous ass.)

Both the Dutch and the English being too stupid not to take paying passengers on a secret mission, Roberto has no trouble booking passage on the Amaryllis and sailing to the south seas. He also has no trouble finding out what the English are up to. A dog had been wounded with a sword and brought on board, where an appalling English physician kept the wound from closing. The sword remained in London. At set times in the day, the sword was heated, which was supposed to make the dog howl and whimper. Noting when the dog exhibited acute distress, its tormentors on the Amaryllis believed that they could tell exactly what the time was in London. Happily, the ship sank in a storm. Roberto was the only survivor. He could not swim, but he had sense enough to grab a plank.

Roberto washes up, not on a deserted island, but on a deserted ship. This is the Daphne, another Dutch ship, also obviously on some kind of scientific expedition. There is a roomful of all manner of time pieces. There are a garden and an aviary. There is an unending succession of storerooms filled with remarkable stuff. Indeed, one of these cubbyholes turns out to contain Father Caspar Wanderdrossel of the Society of Jesus. It should be mentioned that the first third or so of the book consists of Roberto's recollections incited by one or another of the chambers of the Daphne. To me, at least, this procedure is reminiscent of "The Memory Palace of Matteo Ricci," Jonathan Spence's biography of the great missionary to China. The book was much concerned with Ricci's science of mnemonics, which works by creating associations between facts you want to remember and an imaginary structure you know well. I could be wrong. Roberto had also been exploring the Daphne's seemingly endless stores of aqua vitae, so it is hard to say. ("Aqua vitae" is Latin for the Irish "uisce beatha," which of course is whiskey. Did 17th century Dutch ships carry barrels of whiskey? Rum maybe? The question is irrelevant, but in keeping with the spirit of the book.)

Fr. Wanderdrossel was himself looking for a way to determine longitude (hence the roomful of clocks), but only to help prove a larger thesis about the origin of the waters of the Great Flood. The priest believed that the excess water arose physically from submarine fissures in the antipodes, and then was magnified temporally by being passed from one day to another across the international date line. At least, this is what I think he said. Fr. Wanderdrossel had spent many years in Rome, but unfortunately he spent most of his time there speaking Latin to other Jesuits. His talk is therefore a jargon of Latin and German and such English (doubtless meant to represent the Italian of the original) as used to be found in the comic strip, the "Katzenjammer Kids." The result is unlovely, yet his dialogues with Roberto go on for pages and pages. However, the content of their discussions, which dealt in large part with the structure of the solar system, is very interesting. Roberto defended the Copernican system, whereas Fr. Wanderdrossel endorsed the more moderate hypothesis of Tycho Brahe, which had the sun and moon orbiting the Earth and the planets orbiting the sun. The remarkable thing is that, absent Newton's laws of motion, Tycho Brahe has the better of the argument.

For reasons that seemed sufficient at the time, the crew had abandoned Fr. Wanderdrossel and taken the Daphne's only boat to the island, where the cannibals got them. Despite the cannibals, Roberto and the priest look for a way to reach the island. The priest cannot swim either, so he tries to teach Roberto how to swim. Despairing of the young man's progress, he brings a remarkable machine out of storage, a sort of diving bell with an open bottom that would let the occupant walk to shore over the sea floor. Roberto lowers him over the side, and he is never seen again. Roberto considers the possibility that the floors of all the seas are covered with hidden Jesuits.

Roberto thereafter divides his time among swimming practice, the aqua vitae, and his increasingly fanciful writings. The latter come to deal almost exclusively with the wicked deeds of his brother, Ferrante. By and by, Roberto describes how Ferrante himself sets out in search of the prime meridian. When Ferrante reaches it, he uses its time-travelling capacity to sail back to the time of Christ, whom he kidnaps from the Garden of Gethsemane and imprisons on the Island of the Day Before. The Redemption never having taken place, the whole human race is damned. Roberto does devise a fitting end for Ferrante, however. Not long after, Roberto drowns, leaving his papers to astonish a later world.

There is nothing wrong in principle with a story that has no particular point, or whose point is that there is no point. Unfortunately, none of this was really as much fun as it should have been. A pessimist, or a professor of semiotics, might say the same about life. However, books are supposed to be better than life.

Copyright © 1996 by John J. Reilly


Linkfest 2017-04-07

A really fun map, and as the article reluctantly notes, not especially credulous about the tall tales travelers shared.

I've linked stories on Uber's evil genius before. This seems to be their attempt to make Hari Seldon real. The New York Times article includes spiffy simulations.

An intriguing look at what kept the Romans, so very advanced in some ways, from an industrial revolution.

Father Matthew Schneider defends oil pipelines from a Catholic point of view.

I'm not surprised, but I've also known some truly gifted scam artists.

A fun interactive map that lets you see where people live.

A think tank associated with labor unions argues that free trade lowered wages in Mexico.

Noah Smith defends Case and Deaton.

James Miller interviews Greg Cochran, part 1. A wide-ranging conversation, covering Less Wrong, microbiology, federal funding, and free speech. Well worth a listen. [I helped fund this.]

Greg Cochran points out the flaws and misunderstandings in Cordelia Fine's new book. [I also helped fund this.]

Hollywood accounting is pretty shady. I wish Shearer the best of luck.

Damon Linker points out that about as many people die from alcohol overdose as heroin overdose in the US, and the overall rate of alcohol problems mirrors the rate of things like depression: about 1 in 3 over a lifetime.

Kevin Drum has a chart of the rates of death by drug overdose from selected periods and causes that dovetails nicely with Linker.

Ireland's population still hasn't recovered from the combination of famine and mass emigration in the mid-nineteenth century.

  • Crime rate in the US

This chart comes from US Census Bureau data, and the paper containing it seems to be this: J. A. Miron, 1999. "Violence and the U.S. prohibitions of drugs and alcohol," American Law and Economics Review, vol. 1(1), pages 78-114. I am intensely curious about the very low homicide rate at the beginning of the twentieth century. Particularly since it differs from other charts of the same thing, for example this one from Steven Pinker's Better Angels.

The solution seems to be that a 1995 article in Demography contains a model of homicide data that has been widely used to estimate missing data. Why do we need a model, you ask? Because the data from the period didn't cover everywhere in the United States. I'm no expert, but the model seems reasonable; just don't forget that it is a model.

Eckberg, D. L.  1995. “Estimates of Early 20th-Century U.S. Homicide Rates,” Demography.

I've stumbled on the same dataset twice now. The Tuskegee Institute started keeping track of lynchings in 1892. The data only goes back to 1882, which is the year the Chicago Tribune started keeping numbers. The NAACP also started collecting numbers in 1912. You can see in the chart the point when lynching stopped being just a kind of frontier justice, and started being a way to terrorize black Americans. If data existed for the entire 19th century, I think this trend would be even clearer. Data in Excel here.

A post on the reddit for Slatestarcodex questioning the qualifications of Emil Kirkegaard, whom I've linked several times. I think most of the points made by the anonymous poster are reasonable, and also wrong in Kirkegaard's case.

Emil Kirkegaard's response.

Ross Douthat argues that we should just go all the way to true Imperialism. A position that I endorse. S. M. Stirling gives a notable fictional example of how an actual empire, a universal state, can be genuinely multicultural. Also, the Hapsburgs.

Linkfest 2017-03-26

Tollense Battle

Last summer I posted an article about a battlefield from the Bronze Age. Here is another article about the same archaeological site, with an alternative explanation of what happened. Archaeology always requires interpretation, so it is wise to keep in mind exactly what you found, and how it could have gotten there.

'London Bridge is down': the secret plan for the days after the Queen’s death

The end of an era has been planned in advance.

How Aristotle Created the Computer

This is a pretty good survey of the developments in mathematics and philosophy that allowed us to develop digital computers. While the article notes that Boole's algebra was seen as spectacularly useless when he came up with it, it also took WW2 and the Cold War to really develop this technology. Also see James Ross' Revenge of the Analysts: Aristotle's Revenge: Software Everywhere.

More evidence for my cocktail theory of science

A long comment at Greg Cochran's blog gets at the cultural difference between East and West that seems to lie at the heart of why Science is a Western thing, via a comparative study of martial arts.

Do Immigrants Import Their Economic Destiny?

Short answer: Yes.

Johan Egerkrans' Balder

© All rights reserved by Johan Egerkrans

C. S. Lewis once said he "loved Balder before he loved Christ". Seeing an image like this, I can imagine why.

Charles Murray’s SPLC page as edited by Charles Murray

Murray said he had fun writing this.

Germany's grand First World War jihad experiment

This sort of thing has been going on for a very long time. I enjoyed the Talbot Mundy reference in the article, since I'm currently reading King of the Khyber Rifles.

'Fallout: New Vegas' Writer Chris Avellone: "Fantasy is Not My Happy Place"

Chris Avellone has worked on many of my favorite games, from Fallout to FTL.

American Indian Firewater Myths Are No Myths

The Pine Ridge reservation may be one of the saddest places in America.

Organelle Transplant

Yesterday on Twitter, Razib Khan remarked that he hadn't realized pro-life Christians relate genetics to souls.

Since I wasn't party to the conversation, I have no idea what was said. I have heard things like this however, and it made me go hmmm....

I decided to respond in a blog post, since Twitter sucks for anything moderately complicated.

The bigger context for this is a proposal to treat mitochondrial disease that was approved by the Human Fertilisation and Embryology Authority in the United Kingdom. In what seems like an attempt to annoy the maximum number of people possible, this procedure is usually described as a 'three-parent baby'. While there is a germ of truth in this description, you could also call it an organelle transplant, since the intent is to replace defective mitochondria with working ones.

The germ of truth is this: the replacement mitochondria should breed true, because the technique referenced in the article, pronuclear transfer, removes the male and female pronuclei from one fertilized egg [the one with defective mitochondria in the cytoplasm] and moves them to another fertilized egg [whose pronuclei have been removed] with different mitochondria. These new mitochondria are in fact from a third person, and are genetically distinct from the other woman's.

I use organ transplant as a reference point, because a donated organ also contains DNA different from the recipient's. The key difference here is that your donor liver's DNA cannot be passed down to future descendants.

So why does anyone care? People care because 1) pro-life Christians are generally essentialists, meaning that essences or forms [in the Platonic or Aristotelian sense] define what things are, and 2) popular science accounts of genes or DNA usually describe these things as our 'essence' [in the loose popular sense of the word]. Thus our genes probably seem real important to some folks, and tampering with them is tantamount to playing God. I think this is a misunderstanding, albeit a predictable one.

I don't think the fact of getting DNA from a different source matters at all in its own right. One reason is much the same one Razib talks about in his tweet:

Some of our genes are indeed from viruses and stuff. There is a theory that mitochondria were once separate organisms that have become symbionts. A lot of genes are common to all life on Earth. Strictly considered, a gene is just a way of encoding information about proteins. Any gene that works in some fashion is a real gene, although some clearly work better than others.

The second reason is that I think a lot of pro-life Christians have made a philosophical mistake in conflating the terms we use to talk about people. John Reilly said it, and I just stole it:

A human is an essence (if you don't believe in essences you don't believe in human beings); a homo sapiens is a kind of monkey; and a person is a phenomenon. Perhaps I read too much science fiction, but it is not at all clear to me that every human must necessarily be a homo sapiens. As for person: the notion of an entity, conscious or otherwise, that you can regard as a "thou" is conflated with the notion of a person as an entity able to respond in law, either directly or through an agent.

Human ≠ homo sapiens. It just ain't. Popular science accounts are correct insofar as homo sapiens is a biological concept; it can be usefully defined using genes. Human is a philosophical concept, moreover one that is dependent on a specific context to really be cogent. I think that at the very least Neanderthals were humans too, and possibly other hominins. Hell, if we were consistent, pygmies might be considered a separate species from homo sapiens, because they split off from other humans 300,000 years ago, which is before the currently defined date of the origin of anatomically modern humans.

I have my doubts about the current theories, but that doesn't matter. Human is a status that is in principle independent of lineage. In practice, it isn't, but that is different from saying that they are identical.

Now, what about this mitochondrial replacement therapy? I'm still opposed. The reason has nothing to do with genes. In my philosophical tradition, there are three criteria an act must meet to be considered good:

  1. Right act
  2. Right end
  3. Right circumstances

The techniques in the Wikipedia article all involve IVF, which means creating embryos using harvested eggs and sperm, a process with a pretty horrible success rate [10-20%]. That in itself isn't damning, but the way in which unused embryos are discarded [that essentialism again], and the way in which sperm and eggs are collected, are objectionable in their own right. Only criterion 2) is met: preventing disease is a very good thing, especially if you can help reduce future occurrences. Anyone who doesn't share my premises about human embryos [if you don't believe in essences, you don't believe in humans] will likely not agree with my objections to IVF, although I do note that even people who are in theory in favor of it tend to find it icky and horrible when they see it.

Linkfest 2017-03-17: St. Patrick's Feast Day Edition

No evidence to back idea of learning styles

Steven Pinker [among others] writes a letter to the Guardian against currently fashionable learning styles fads in education. In the post pointing to this, Steve Sailer offers a mild counterpoint based on his position that a lot of "neuroscience" findings are better thought of as something like marketing, a real benefit, but nothing lasting like science should be.

Why so many conservative Christians feel like a persecuted minority

Damon Linker pens a sympathetic and critical take on Rod Dreher's The Benedict Option.

Geoarchaeologist Proposes There Was a “World War Zero”

I first came across this idea on Jerry Pournelle's website as the first dark age. This was a period of steep decline that makes the dissolution of the Western Roman Empire seem minor in comparison. In the first dark age, even the memory of writing was lost. When the Greeks began to rebuild, the fortifications of their predecessors were seen as the work of monsters, rather than men, because no one could conceive of building anything as massive. I had not heard the term 'Luwians' to describe the people of the Anatolian peninsula who may perhaps be the 'Sea People' who overran much of the civilized Eastern Mediterranean in that time.

The Fall of Rome and "The Benedict Option"

I'm not really sympathetic to Rod Dreher's Benedict Option, and a big part of the reason is that his metaphor is a really bad description of what actually happened in the fifth century.

When Public Policy meets Elementary Biology

To go along with Ross Douthat's plan to create a series of immodest proposals to try and shift public policy debates into more useful channels, here is Henry Harpending's take on how we should shift welfare policies to take into account human biology. Henry implied at the end of the post that his suggestion needed amendment to prevent bad consequences, but to my knowledge, Henry never published a followup to this before he died, which is a damn shame.

An Oxford comma changed this court case completely

I've always been a fan of the Oxford comma.

Immigrations and Public Finances in Finland Part I: Realized Fiscal Revenues and Expenditures

Emil Kirkegaard posted this on Twitter. The graph in the source report is astonishing.

Net current transfers without indirect taxes by country of birth in 2011.

Net fiscal effects by country of birth in 2011. Averages for populations aged 20-62 years old.

The originating organization is a Finnish anti-immigration group, but the results astonished just about everyone. The methodology is an attempt to account for all taxes, direct and indirect, as well as government spending of all kinds. I'm not sure I would hang my hat on it, but I'm not sure it's wrong either.

Lean In’s Biggest Hurdle: What Most Moms Want

Any attempt at statistical parity in childcare is doomed to failure, because many women actually like having kids and raising them. This isn't to say that every woman wants kids, or that every woman must stay home, but given the option, many women do choose to either work part-time, or leave work entirely for a period of time.

BOOK REVIEW: SEEING LIKE A STATE

I can't improve on SSC's opening paragraph:

Seeing Like A State is the book G.K. Chesterton would have written if he had gone into economic history instead of literature. Since he didn’t, James Scott had to write it a century later. The wait was worth it.

Right or wrong direction: The nation generally

This Reuters poll on whether the nation is generally going in the right direction is pretty striking. Especially if you compare it to this Gallup poll on President Trump's approval ratings.

I naively expected these results would roughly track [keep in mind the timeframes are very different]. They don't at all, which is pretty interesting. 

Consistent Vegetarianism and the Suffering of Wild Animals

I also have a hard time taking complaints about modern animal husbandry seriously.

Linkfest 2017-03-10

Why the campus protests at Middlebury matter

C. C. Pecknold argues the position I took last week: the existing political coalitions in the US are breaking up.

The four fallacies of warfare, according to Donald Trump’s new national security advisor

John J. Reilly used to argue for #4 on a regular basis. 

We need more useless knowledge

I don't buy this argument. Most of the examples cited were wartime research projects.

A Different Bargain on Race

Ross Douthat channels Steve Sailer to argue that we should alter the deal.

What if Donald Trump and Hillary Clinton Had Swapped Genders?

The actors for this nailed the delivery, gestures, and body language of the candidates.

Letter from Secretary Ryan Zinke

I'm glad to see he says he is opposed to the sale of public lands.

American Carnage

This is a harrowing read about the rise of opioid deaths in the US.

Peter Thiel talks fracking and globalization

Thiel credits fracking with a bigger impact than Silicon Valley, which seems about right.

The Ancient Ghost City of Ani

The ruins of a medieval Armenian capital.

F.D.A. Official Under Bush Is Trump’s Choice to Lead Agency

The FDA drug approval process is actually pretty prompt, given what needs to be done. The hard part is getting the necessary evidence to prove your therapy works.

The plot against the Pope

Damian Thompson points out that even some of Pope Francis' supporters seem to hope he will resign.

Linkfest 2017-01-22

Happy New Year! I've taken two weeks off for the birth of my third child, but now I am back at it!

Peak conception time by daylight hours

I saw the following chart on Twitter, and found it intriguing.

Roy Wright was interested enough to write a blog post going into further detail.

A Lot of What Is Known about Pirates Is Not True, and a Lot of What Is True Is Not Known.

A great piece about the gradual transformation of piracy in the American colonies from just another job to an act of rebellion.

The first ever vending machine stopped people from stealing holy water

Hero of Alexandria is a remarkable figure, known for his almost modern-seeming machines.

San Francisco Asks: Where Have All the Children Gone?

Philosophies that frown on reproduction usually don't survive.

Peanut Allergy Prevention Advice Does a 180

A nice summary of how the conventional wisdom on peanut allergies was upended by one good study.

Should social science be more solution-oriented?

Duncan Watts argues in Nature: Human Behavior that my cocktail party theory of science is correct.

Housing supply is catching up to demand

Unfortunately, supply grows very slowly in this area. But this is good news.

Origin of computing terms "patch", "bug", "loop", "library"

Great history.

LinkFest 2016-12-31

Last link roundup of 2016. Happy New Year!

Stun guns and male crew: Korean Air to get tough on unruly passengers

This article is interesting for all kinds of reasons: the proffered explanation that "Asian carriers including us [Korean Air] have not imposed tough standards because of Asian culture"; the claim that the offending passenger had two and a half shots of whiskey and then claimed to be blackout drunk [possible for a Korean, but not plausible in my mind]; and the recent increase in violent incidents on Korean airlines.

Nine charts that show how white women are drinking themselves to death

Part of my sad, continuing series on how American women are having increasing problems with alcohol.

What Made 2016's Doom Great

I'm less and less comfortable with graphic violence in videogames, but I appreciate this review of the new Doom.

Farewell

Thomas Sowell was a formative influence on me. I read a number of his books in high school, and while I haven't read his column in a very long time, I think of him very fondly. Enjoy your retirement, Dr. Sowell.

Varieties of Religious Experience

Ross Douthat explores the mysteries of human life.

The volatile history of Star Wars videogames

A bit of history on LucasArts, Lucasfilm's in-house videogame studio.

The desert that revealed the ultimate ice age

A short piece about the Snowball Earth hypothesis.

A Call for a New Strenuous Age

Brett McKay at the Art of Manliness argues that we need to rediscover healthy challenges to restore our masculinity to balance.

Accounting for Thanksgiving’s Ghosts

Jacobin Mag entertains a counterfactual about what the United States would be like if disease hadn't killed most of the inhabitants of the Americas post-Columbus.

LinkFest 2016-12-18

Who was first in the race to the moon? The tortoise

I expected a bit more out of the punny title, but it's a nice bit of space history.

Skill Builder: Understanding Handsaws

A nice primer on the way saws work.

When an Animal's Sex Is Set by a Microbe

Genetics is weird. And fun.

The Parochial Progressive Obsession with Ayn Rand

I went through a Rand phase myself, so I understand the confusion, but outside of teenagers, very few people take Rand seriously.

Scientists reconstructed the face of St. Nicholas – here’s what they found

A week late, but let us see St. Nick!

How Rogue One’s Alan Tudyk Turned Himself Into a 7-Foot Droid

After my Thrawn review, I'm in a Star Wars mood. Plus I'm a fan of Alan Tudyk.

Approve Drugs for Safety Only – it’s Like “Back to the Future” – Not for Me

There really isn't any such thing as "safety only": all drugs [and all medical treatments] carry risks as well as benefits. You can't assess the value of a treatment without taking both into account.

Childhood forecasting of a small segment of the population with large economic burden

A paper published using the Dunedin data, one of the longest-running longitudinal studies. I've seen this kind of result before, so I don't find it particularly surprising. I haven't really got any idea of what practical public policy could come out of this, at least nothing that hasn't already been tried. The authors found that poor "brain health" at age 3 was associated with poor outcomes later in life. Searching through the paper, I found "brain health" to be a euphemism for a combination of intelligence, the ability to defer gratification, and the ability to get along with others. Right, I could have told you that mattered. Personality psychology is the one part of psychology that replicates well.
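To make the forecasting idea concrete, here is a toy sketch of that style of analysis on synthetic data. The three predictors stand in for the intelligence, self-control, and sociability measures; the effect sizes and cutoffs are invented, not taken from the Dunedin study.

    # Toy sketch: predict a high-cost adult outcome from age-3 measures.
    # All data here is synthetic; nothing is drawn from the Dunedin study.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 1000

    # Hypothetical standardized childhood scores:
    # [intelligence, deferred gratification, getting along with others]
    X = rng.normal(size=(n, 3))

    # Synthetic outcome: low scores on all three raise the risk of
    # landing in the top 20% of adult service use.
    risk = -0.8 * X.sum(axis=1) + rng.normal(size=n)
    y = (risk > np.quantile(risk, 0.8)).astype(int)

    model = LogisticRegression().fit(X, y)
    print("coefficients:", model.coef_.round(2))  # all should be negative

    # How concentrated are the high-cost cases in the predicted top decile?
    top_decile = np.argsort(model.predict_proba(X)[:, 1])[-n // 10:]
    print("share of top decile that are high-cost:", y[top_decile].mean())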