Superbugs: The Race to Stop an Epidemic Book Review

Superbugs: The Race to Stop an Epidemic
by Matt McCarthy, MD
Avery (May 21, 2019)
ISBN 0735217505

I received a free copy of this book through LibraryThing’s Early Reviewers program.

Superbugs is a fascinating book, and I’m glad I had the chance to review it. This book is a window into the management, and hopefully curing, of difficult antibiotic-resistant infections from the point-of-view of a physician who sees the worst the world has to offer. McCarthy wrote it in a chatty, personable, and slightly ADD style that probably makes it more accessible. This is a difficult thing to get right with a work of popular science, which I take this book to be.

There is an infamous rule of thumb that including one mathematical formula in your book will reduce your readers by half. Each additional formula continues the process of exponential decay. McCarthy has clearly decided to maximize his potential readership by avoiding mathematical formulae, or worse, skeletal formulae of organic molecules.
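Taken literally, the rule of thumb is simple exponential decay. A toy sketch (the halving factor is the folklore number, not anything measured):

```python
def expected_readers(base_readers, n_formulas):
    # Each formula included halves the remaining readership,
    # per the (apocryphal) rule of thumb.
    return base_readers * 0.5 ** n_formulas

print(expected_readers(10_000, 1))  # 5000.0
print(expected_readers(10_000, 4))  # 625.0
```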

Dalbavancin, public domain

By Hbf878 - Own work, CC0, https://commons.wikimedia.org/w/index.php?curid=73536309

However, while he doesn’t show them, he talks about them a lot. If you know what is going on, you can either envision the diagrams or look them up, but organic chemistry isn’t needed to tell the stories that McCarthy wants to tell.

The first story is McCarthy’s work with Allergan on the antibiotic dalbavancin, and his journey to learn how to write a protocol for a clinical trial and gain consent from often frightened and bewildered patients who show up in Emergency Rooms with methicillin-resistant Staphylococcus aureus infections. His meandering style allows him to digress into the second story, which is a capsule history of the development of antibiotics, and the sometimes checkered history of human experimentation in medicine.

Sir Alexander Fleming, looking rather intense for the photographer

By Official photographer - http://media.iwm.org.uk/iwm/mediaLib//32/media-32192/large.jpg (photograph TR 1468 from the collections of the Imperial War Museums), Public Domain, https://commons.wikimedia.org/w/index.php?curid=24436974

His history of antibiotic development includes well-known figures like Alexander Fleming, and the overlooked, like Elizabeth Lee Hazen and Rachel Brown, who developed nystatin, the first antifungal drug.

Elizabeth Lee Hazen and Rachel Brown

By Smithsonian Institution - Flickr: Elizabeth Lee Hazen (1888-1975) and Rachel Brown (1898-1980), No restrictions, https://commons.wikimedia.org/w/index.php?curid=18386483

The book is probably worth it just for this well-done short summary of the powerhouses of modern pharmaceuticals [and more evidence for my theory that the greatest period of technological advancement in the twentieth century was between 1920 and 1950].

By the early 1950s, ninety percent of the prescriptions filled by patients were for drugs that had not even existed in 1938. (pg 101, citing Miracle Cure by William Rosen, 2017)

However, you also get a good look at how medicine is practiced in the United States today, from the practitioner’s point-of-view. Physicians need to manage conflicts of interest, like the portion of McCarthy’s salary that is paid by Allergan and other corporations, patients that are bound and determined to pursue courses of treatment that the evidence doesn’t support, and the sheer soul-crushing burden of seeing so much suffering day-in and day-out.

We Americans expect our doctors to be superhuman: to work without rest, to diagnose without fail, and to resist the siren call of wealth. Doctors receive enormous deference in exchange for these unrealistic expectations, but a subtext of McCarthy’s book is the toll this takes on our often genuinely selfless and dedicated physicians, who do in fact accept honoraria and speaking fees from pharmaceutical companies and miss their children while they work long hours.

Another interesting aspect of American medical practice is its insularity. Nearly every reference in McCarthy’s book is from a medical journal, which is the mental world of most physicians. However, medicine might progress faster if physicians were a little more widely read. For example, McCarthy devotes a fair bit of space to the research of Vincent Fischetti, who isolates enzymes from bacteriophages. But phage therapy was a thing before antibiotics were invented, and was largely forgotten in the initial enthusiasm for antibiotics. Phages and adjacent technologies would be a useful adjunct to antibiotics, but medicine, meaning mostly expert physician opinion, has been pointedly uninterested for seventy years or more. I appreciate that McCarthy is trying to do something about that, but reading and citing mostly medical journals is only going to perpetuate the attitude that pushed useful therapies aside because they weren’t the hot new thing, or because they came from the wrong field.

All in all, I enjoyed this book. I think McCarthy did a fine job making the history of antibiotics accessible, and was remarkably honest about himself and his field, frankly admitting the challenges physicians face today. This book could have been dry, but it wasn’t, so I am willing to embrace the rapid alternation between the present and the past. McCarthy made this style work. One can learn a lot about the world, past and present, from this book.

In a final note, a short letter tucked into my review copy says that public results for McCarthy’s dalba study are expected on or around May 21st, just under a week from the publication of this review. I hope everything went well, because I like having options when the bacteria evolve faster than we do.

My other book reviews | Reading Log

Is the rate of production of useful ideas really dependent on the number of people involved?

Paul Romer at the Nobel Memorial Prize Ceremony

By Bengt Nyman from Vaxholm, Sweden - EM1B6039, CC BY 2.0, https://commons.wikimedia.org/w/index.php?curid=74934767

Conversations with Tyler is one of my favorite long reads at the moment. This recent talk with economist Paul Romer [recent winner of the Nobel Memorial Prize in Economics] overlaps nicely with many of my current obsessions [including English orthography!]. Today, let’s look at the rate of production of useful ideas. Romer brings up a paper by Bloom, Jones, Van Reenen, and Webb, “Are Ideas Getting Harder to Find?”.

The model used here is a pretty simple one:
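The equation itself was an image in the original post; reconstructed from the description in the text (the symbols are mine), it reads:

```latex
\text{Economic growth}
  \;=\;
\text{Research productivity} \times \text{Number of researchers},
\qquad\text{i.e.}\qquad
\frac{\dot A_t}{A_t} \;=\; \alpha_t \, S_t
```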

Here is what Romer says on the subject:

Chad Jones has really been leading the push, saying that to understand the broad sweep of history, you’ve got to have something which is offsetting the substantial increase in the number of people who are going into the R&D-type business or the discovery business. And that could take the form either of a short-run kind of adjustment cost effect, so that it’s hard to increase the rate of growth of ideas. Or it could be, the more things you’ve discovered, the harder it is to find other ones, the fishing-out effect.

I’ve had some thoughts myself about whether it really is harder to find new ideas, but I wonder whether the model posited is really telling us anything interesting. The equation has the form of a rate [research productivity per researcher] multiplied by the number of researchers. But per the notes in the paper, research productivity is defined as TFP growth divided by research effort, which is proxied by the number of researchers scaled by wages. This just cancels out the number of researchers, and gives us something like growth equals research productivity with an average wage fudge factor.
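Spelled out, the cancellation works like this, writing $g$ for TFP growth, $S$ for effective researchers, and $w$ for the wage deflator (symbols mine, following the paper's definitions as described above):

```latex
\text{research productivity} \;\equiv\; \frac{g}{S},
\qquad
S \;\equiv\; \frac{\text{R\&D spending}}{w}
\quad\Longrightarrow\quad
g \;=\; \frac{g}{S} \times S
```

The $S$ on each side cancels identically, so the wage series $w$ is the only quantity doing any real work in the comparison.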

Number of researchers = people who work in IP generation

These things are anti-correlated, given the way they are described.

What it looks like to me is the rate of intellectual discovery is flat to slightly declining [when defined as equal to TFP growth], and that the number of people involved is completely irrelevant once you reach some relatively small threshold. I think the growth model referenced above is mostly useless for what it purports to be about.

That is a pretty bold statement, but I stand by it. My prediction is that you would get about the same rate of growth if you took most of the people doing “research” and had them do something else. On balance, they contribute nothing. [Or in my darker moments, I suspect a net negative contribution is possible….] When I think about this, I’ve made a number of simplifying assumptions, so let’s look at those.

#1: Innovation and scientific discovery are almost wholly the product of a few brilliant minds

This growth model matches up with other kinds of growth models in economics. When you are talking about how fast you make stuff, it is pretty plausible to think that adding more people will increase the overall rate of making stuff, even if you account for differences in ability. This is because, given a method of production, there really isn’t much absolute difference in ability to make stuff. You can probably usefully model such things as a random normal, which works well in general linear models.

For intellectual activity, this doesn’t seem to be the case. The distribution of accomplishment looks nothing like the distribution of ability. In practice, a tiny fraction of scientists produce the vast majority of results, with a distribution that looks something like a power law. This isn’t a particularly obscure result, but it doesn’t seem to enter into the model.
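A toy sketch of such a distribution, using a Lotka-style 1/n² law; the cutoff and scale here are arbitrary illustration, not data:

```python
# Lotka-style law: the number of authors with exactly n papers falls off
# roughly as 1/n^2, so a small minority produces most of the output.
def lotka_counts(max_papers=100, scale=10_000):
    return {n: scale / n**2 for n in range(1, max_papers + 1)}

counts = lotka_counts()
authors = sum(counts.values())
papers = sum(n * c for n, c in counts.items())

# Authors with ten or more papers: a few percent of the people,
# but nearly half of the total output.
prolific_authors = sum(c for n, c in counts.items() if n >= 10)
prolific_papers = sum(n * c for n, c in counts.items() if n >= 10)

print(prolific_authors / authors)  # well under 10% of authors...
print(prolific_papers / papers)    # ...produce over 40% of the papers
```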

How Jones et al. attempt to compensate for different levels of ability

What was done instead was an attempt to account for variations in productivity by looking at average wages. But again, this has the wrong form. Wages don’t vary as much as productivity does, or in the same way.

#2: The roles of the most innovative researchers are already filled by the most productive people

I think this one is arguable, but close enough, especially for the kind of outsize talents that really drive Solovian growth. Especially in a meritocratic age, the vast majority of bright, talented people already get a chance. To a first approximation, the most talented people are already doing what they are good at, so if you add more people, you are going to be adding researchers with a small probability of adding anything of huge impact. This is true even if you find smart people to do more research, given assumption #1.


Counterarguments

I can think of plausible arguments that should count against my argument above. I’ve made some of them before.

#C1: We aren’t talking about science, but engineering

The data behind the Lotka curve and other similar metrics mostly looks at unusual accomplishments, like publishing a lot of papers or winning big prizes. However, the data Jones et al. are looking at are mostly about total factor productivity growth, which is pretty clearly applied science, or what most of us call technology and engineering. This, by its very nature, is more diffuse, and needs a broader range of talents to actualize than a seminal paper does.

#C2: The historical rates of accomplishment in technology growth probably should be discounted because important things were left out

It is easy to build great things fast if you don’t need to worry about fracture analysis or environmental impacts. I’m sometimes horrified by the huge costs borne by the public during the Industrial Revolution, but I don’t face the choices they did either. Modern engineering is more labor intensive than it used to be because we have to integrate a much more comprehensive body of knowledge. And consequently, accidents of all types, environmental pollution, and infrastructure disasters are all less common than they used to be [with a huge caveat for China].

I take this as a justification for including all of the extra people who get paid to generate intellectual property. To be fair, not everyone involved in STEM work in the US falls into this bucket, and depending on how broadly IP is defined, it could also include a lot of non-STEM workers too.

#C3: Sectors like Pharmaceuticals seem to show this pattern of declining efficiency

Eroom’s Law


On balance, I still think it is a little off-base to inflate the number of effective researchers so heavily over the last 90 years. When you take everything into account, I think that even in technology, real advances are more Lotka-curve-like; you also need more people to get things done, but not the 20 or 30 times as many that Figure 1 from the Jones paper implies.

Pharma does look bad, but if you look at something like how much better imaging is, which heavily leverages Moore’s Law, medicine as a whole has developed quite a lot of new technology. What you get for it is another story, of course.

What works in healthcare?

Short answer: almost nothing.

I’m being flip about a very serious subject, but at the same time I am in fact serious. Modern medicine probably works a lot less well than you think it does.

How could I say such a thing when you look at graphs like this?

Maternal mortality over time
Our World in Data Global Health https://ourworldindata.org/health-meta

I say it because that is precisely what the evidence shows. If you look at Cochrane, the world’s preeminent aggregator of medical statistics, it is hard not to come away a bit disappointed. Effect sizes [usually the difference between the test group and the control group scaled to standard deviation units] tend to be rather modest.
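For concreteness, the standardized effect size in that bracketed definition is roughly Cohen's d: the difference in group means divided by the pooled standard deviation. A minimal sketch with made-up trial numbers:

```python
import math

def cohens_d(mean_treat, mean_control, sd_treat, sd_control, n_treat, n_control):
    # Difference in means, scaled by the pooled standard deviation.
    pooled_sd = math.sqrt(
        ((n_treat - 1) * sd_treat**2 + (n_control - 1) * sd_control**2)
        / (n_treat + n_control - 2)
    )
    return (mean_treat - mean_control) / pooled_sd

# Hypothetical trial: a 2-point average improvement against a 10-point spread
# gives d = 0.2, conventionally read as a "small" effect.
print(cohens_d(12.0, 10.0, 10.0, 10.0, 100, 100))  # 0.2
```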

Here are a few examples:

You can amuse yourself by finding your own examples. There are a few things that genuinely work well. But even for things like MMR, the evidence isn’t as good as you might think.

I suspect what is going on is that medicine works, just barely, on average. You get things like the long slow decline of maternal mortality from the confluence of lots and lots of little things added together. If you look at anything else, heart disease, or cancer, you will see the same pattern. Vaccines are an exception. Disease rates for things with effective vaccines just drop off immediately.

Polio dropped right off

Heart disease is on a long, slow decline

Which brings us to my motivation for bringing this up at all. Random C. Analysis just published an updated argument that healthcare spending in the United States isn’t badly out of line with the rest of the world, once you take into account how much richer we are than just about everyone else. We have more, so we spend more. According to RCA’s data, when income goes up 1x, healthcare spending goes up 1.6x.
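One way to read that 1.6 figure is as an income elasticity of healthcare spending. A toy sketch (the exponent is RCA's number; the income ratios are made up for illustration):

```python
def relative_spending(income_ratio, elasticity=1.6):
    # If spending scales as income ** elasticity, a country with twice the
    # income spends about 2 ** 1.6, roughly 3x, as much on healthcare.
    return income_ratio ** elasticity

print(relative_spending(2.0))  # ~3.03
```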

My contribution is to suggest that the reason we purchase more and more healthcare is precisely that it doesn’t work very well. A lot of modern medicine treats symptoms rather than causes, because we don’t understand the causes very well. You buy as much symptom relief as you can afford. Even when treatments are genuinely curative, the success rates are often low. For example, the controversial statistic called number needed to treat [NNT] attempts to quantify how many patients need to be treated in order to produce one cure.

This number is often quite large, on the order of 50 to 100. Even for a really good treatment with an NNT of 10, 9 times out of 10 the treatment isn’t better than the alternative it is being compared to. That is a lot of wasted time, effort, and money.

We just don’t know how to predict the 1 time it works, so we treat everybody and hope for the best.
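The arithmetic behind the NNT statistic mentioned above is simple: it is the reciprocal of the absolute risk reduction. A minimal sketch with hypothetical event rates:

```python
def number_needed_to_treat(control_event_rate, treated_event_rate):
    # NNT is the reciprocal of the absolute risk reduction (ARR).
    arr = control_event_rate - treated_event_rate
    return 1.0 / arr

# Hypothetical: 10% of untreated patients have the bad outcome vs 8% treated,
# so you treat about 50 people to prevent one outcome.
print(number_needed_to_treat(0.10, 0.08))  # ~50
```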

You can find speculation, like this from Goldman Sachs, that you can’t make enough money curing disease as compared to offering palliative care. I’m sure there really are businessmen who would gladly milk you for everything you are worth, but I don’t worry about it in reality, because no one understands how to cure most of the things that ail us. It isn’t as though miracle cures are being withheld, or even that research is being directed away from cures. We don’t know enough to do that.

If we really wanted to limit costs for healthcare, it might be possible if we ruthlessly limited access to only things that really worked. We would get 80% of the benefit for 20% of the cost. But people would be pissed. The other 20% of the benefit does actually work, sometimes. I don’t think this is possible, or even really a good idea. Any real solution will involve technology we don’t currently possess.

LinkFest 2018-07-16

On the left, what everyone thinks machine learning is.
On the right, what it actually is.

Ways to think about machine learning

I've been a skeptic about artificial intelligence in general, and a critic of the ways the actual technology has been hyped. This is a pretty reasonable take from someone who is willing to invest a lot of money in machine learning. Machine learning is another kind of automation. We've been seeing big things come out of automation for 100 years; it makes modern life possible, but it is easy to lose perspective.

 

Why the Future of Machine Learning is Tiny

An example of what machine learning can mean in practice.

 

Snapping Spaghetti

Applied mechanics of fracture with slo-mo video! Why does a piece of spaghetti break into three or more pieces when bent? Now you can find out!

 
Manufacturing output per capita, colored by what percent of the economy manufacturing is

Manufacturing output divided by employment in manufacturing; Canada and Taiwan were missing the employment estimate

Global manufacturing scorecard: How the US compares to 18 other nations

Manufacturing stats are a subject of interest to me. I don't find much of interest in the Brookings manufacturing scorecard, which is just their subjective rating of various things. Rather, I plotted the manufacturing output for each country per capita, and per person employed in manufacturing, a kind of crude productivity number.

I think the *really* interesting thing here is how much Switzerland sticks out. The parts of the economy in Switzerland I am most familiar with are chemical precursors for pharmaceuticals and medical devices, which are both high value sectors.

 

When Evidence Says No, But Doctors Say Yes

This is a great article on how hard it is to find clear evidence that common therapies work, and how hard it is to disseminate that knowledge once we have it.

 

Israeli space probe to land on Moon in 2019

I was going to say this isn't surprising from a country that also made their own nuclear weapons, and then I saw the money for it came from a South African businessman. Israel and South Africa *probably* cooperated on nuclear weapons too.

 
 

Thou Shalt Not Wirehead: Religion vs Gratification

This is pretty good. I think I mostly agree, except I am also very interested in whether religion is *true*. Religion can be pretty helpful in encouraging behaviors that help you in this world, for example, the prosperity Gospel is pretty popular because it actually works out that way. If you give up drinking, gambling, and whoring, usually your life materially improves. But sometimes religion can make you do things that are the opposite of helpful in this world. For example, the Xhosa.

 

Welcome to the Party, Pal

A reflection on how the political coalitions in the United States came to be.

 

Does Free Trade Bring Lower Prices?

Dani Rodrik reminds us that we have to describe the world as it is when we make economic projections, not a model of it.

 

Donald Trump tells us truths we don’t want to hear

Matthew Parris argues that Donald Trump acts like an Emperor, and you shouldn't be surprised by that.

 

The Fear of White Power

What is the value of political correctness to a minority in society? And what is its cost?

 

Shortwave Trading | Part III | Fourth Chicago Site, East Coast, Patent, Regulation, and Farmer Kevin Mystery

High volume traders are rolling their own radio networks to get a leg up on the competition.

 

Traditional Euro-bloc: what it is, how it was built, why it can't be built anymore

The perfect counter-point to my post on modern urban development. We can't just build things because we like how they look, we have to care about money, and how neighborhoods evolve, and what will actually work for the people who live there.

The Long View 2005-08-23: The Perfection of the Species

The image in the header is the image John referenced in his joke about contributing to the state of perpetual surveillance. The man in the image is Herbert Kitchener, 1st Earl Kitchener, scourge of the Boers and one of the few generals who thought the Great War would be long.

I appreciate John's simple computation of the average tenure of each Supreme Court Justice in groups of ten. It is a simple thing, now as in 2005, to look up such information to double-check something like now-Chief Justice John Roberts's 34-year-old speculation that the framers of the Constitution hadn't anticipated how long people live now.

Justice Roberts made a common mistake, which is thinking that increasing average lifespans mean that adults live 20 or 30 years longer than they used to. There is some increase for adults, but almost all of the change in the average was driven by changes in deaths under the age of 5.

Something that struck me just now is that I've seen a lot of things on the subject of average human lifespans that assume that childhood mortality was as high in Classical times or earlier as it was in early modern Europe. However, we know that what we now call childhood diseases are mostly recent things, largely within the last 2000 years or so. The human disease burden has slowly been getting worse, which might mean that childhood was somewhat less dangerous before the arrival of measles and smallpox.


The Perfection of the Species

 

Supreme Court Nominee John Roberts had some thoughts many years ago about limiting the terms of federal judges, and was foolish enough to put them on paper:

The Constitution "adopted life tenure at a time when people simply did not live as long as they do now,'' Roberts wrote in an Oct. 3, 1983, memo to White House Counsel Fred Fielding that is now on file at the Ronald Reagan Presidential Library..."A judge insulated from the normal currents of life for 25 or 30 years was a rarity then but is becoming commonplace today,'' Roberts wrote. "Setting a term of, say, 15 years would ensure that federal judges would not lose all touch with reality through decades of ivory tower existence.''

Term limits for judges may or may not be a good idea, but I had my doubts about the premise of Roberts' critique. The great increases in life expectancy we have seen over the past two centuries chiefly relate to infant mortality; the older you get, the less dramatic the increases become. Certainly it is not the case that maximum human longevity is increasing. How does this relate to the Supreme Court?

On Wikipedia, I found a list of the justices of the Supreme Court of the United States in chronological order of appointment. Then I took the average of the terms of service of each group of ten. In the list of these averages set out below, the date is the end of the period in which each group of ten was appointed:

1796: 8.9 yrs
1811 (John Marshall appointed): 20.9 yrs
1823: 19.2 yrs
1845: 19 yrs
1864: 14.3 yrs
1903: 13 yrs
1921: 15.4 yrs
1939: 13.3 yrs
1953: 17.3 yrs
1970: 20.6 yrs
1994 (Current): 20.6 yrs

The average tenure for the first ten justices was indeed short, but that had little to do with longevity. The Supreme Court was new and not very prestigious in the early days of the Republic. The justices tended to quit in order to move on to better things. It was only during the tenure of John Marshall as Chief Justice that the Court acquired an authority comparable to that of Congress and the President. There then followed a long period during which justices stayed on the court for about as long as they have since the beginning of the final quarter of the 20th century. The composition of the current Court is uniquely old, but again, that's not biology: the continuing Roe v. Wade controversy has blocked the normal turnover of the Court.
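The grouping-by-ten averaging John describes can be sketched as follows; the tenure numbers here are illustrative, chosen only to reproduce the first two averages above, not the actual terms of service:

```python
def group_averages(tenures, group_size=10):
    # Average tenure of each consecutive group of `group_size` justices,
    # taken in order of appointment.
    return [sum(group) / len(group)
            for group in (tenures[i:i + group_size]
                          for i in range(0, len(tenures), group_size))]

# Hypothetical tenures (in years) for the first twenty appointments.
tenures = [5, 9, 6, 1, 20, 8, 4, 13, 10, 13,       # first ten
           21, 34, 16, 30, 18, 14, 20, 23, 15, 18]  # next ten
print(group_averages(tenures))  # [8.9, 20.9]
```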

John Roberts was probably correct if he thought that the current, long tenure of Supreme Court justices is contrary to the expectation of the Founders, but not for the reason he cited. The Founders probably did not expect that justices, once appointed to the Court, would cling to their office for the rest of their lives.

* * *

Recently I saw Gattaca, a film released in 1997 about a near-future world (though not quite so near as our own, evidently) in which pre-natal genetic enhancements and genetic testing in general put people who are conceived naturally at a considerable disadvantage. The story is about one such Invalid (accent on the second syllable) who steals the genetic profile of a supernormal in order to qualify to pilot the first manned spaceship to Titan.

Gattaca has a reputation as an underappreciated minor film. I can only agree. It comes close to the ideal of science fiction played on a bare stage. The sets are subdued Modern; there are no special effects. As for the cast, no less a person than Gore Vidal has a bit part as Director Josef of the Gattaca organization. He even turns out to be the murderer, though the murder is a red herring. There were several real actors, too.

Since I saw this film, I have been trying to track down a quotation that I am almost sure comes from Tolkien. It runs something like this:

No, I have never much liked the idea of spaceflight. It seems to be promoted mostly by people who want to turn the whole world into a big train station, and then to establish similar stations on other planets.

The journey to Titan (which we do not see) is just a MacGuffin, like the statuette in The Maltese Falcon, but it leaves the film hollow, intentionally so. It is not at all clear why the impeccably dressed and immaculately clean personnel of Gattaca would want to do something as crudely industrial as explore another planet. As for the colonization of Titan, we must ask whether the universe really needs another planet covered with office parks and Ikea furniture. Indeed, does it really need any?

The character of the hero is defined by his determination to belie the projection for a mediocre future that his real genetic profile suggested, including a high probability of an early death from heart failure. Though fraud was necessary to allow him to compete for his ambitions, he fought against his fate chiefly through study and exercise. A friend of mine in high school received a similar prognosis. He became the first fitness fanatic I ever met. He died at 28.

* * *

Incidentally, Gattaca is available in Esperanto. So are 14 other films: look here.

* * *

Speaking of near-future paranoia, I have done my bit to bring about a world in which no public moment goes unrecorded; my condominium now has security cameras. To ensure that no one forgets this fact, I made this poster [BIE I put this in the header] to remind everyone to be good.

Speaking of graphics, the Latin Mass folks at Holy Rosary Church asked me to do a simple webpage for them. So, I did this[BIE link removed, since Holy Rosary Church isn't really the point here. A fine chapel though, as I verified]. The sound file of the Magnificat is surprisingly good, considering the microphone we were using; the church has wonderful acoustics.

That page is supposed to be uploaded to the parish website. No doubt it will be, eventually, but getting the authorization is harder than authorizing that expedition to Titan.

* * *

"Nothing Burger" is a good characterization of the whole embryonic stem-cell controversy. Even if omnipotent stem cells turn out to have clinical applications, it is hard to imagine a goofier way to get them than by harvesting them from embryos, cloned or otherwise. In any case, new techniques should soon return the subject to its deserved obscurity, as we see in The Washington Post:

Scientists for the first time have turned ordinary skin cells into what appear to be embryonic stem cells -- without having to use human eggs or make new human embryos in the process, as has always been required in the past, a Harvard research team announced yesterday.

So are we done with the subject? Not quite:

Because it involves the fusion of a stem cell and a person's ordinary skin cell, the process leads to the creation of a hybrid cell. While that cell has all the characteristics of a new embryonic stem cell, it contains the DNA of the person who donated the skin cell and also the DNA that was in the initial embryonic stem cell.

The Post notes this, however:

They do not mention that several teams, including ones in Illinois and Australia, have said in recent interviews that they are making progress removing stem cell DNA from such hybrid cells...Some even suspect that the new technique for making personalized stem cells would still work even if the "starter" stem cells' DNA were removed before those cells were fused to the skin cells.

Nonetheless, embryonic stem cells have become like ethanol fuels to some people: it's something they want the government to subsidize whether it does any good or not:

"I think we have to keep our eye on the ball here," [John Gearhart, a stem cell researcher at Johns Hopkins Medical Institutions] said. "If this stuff proves to work, that's wonderful. But we're just not there yet, and it's going to take a long time to demonstrate that. Meanwhile, other techniques already work well. So let's get on with it."

By all means; but the useful research has little to do with the public polemic.

Copyright © 2005 by John J. Reilly

Why post old articles?

Who was John J. Reilly?

All of John's posts here

An archive of John's site

The Long View 2005-06-07: The New Jersey Primary, and Worse

Rofecoxib

This is a great quote:

When I was young, I was quite a prig about not using drugs, and I have not mellowed over time. On the other hand, I often shock people who know me with the argument that most drugs should be legalized, even for recreational purposes, simply because prohibition causes more trouble than it is worth. I have no opinion about the efficacy of medical marijuana, but I think it could never do as much harm as prescription blood-pressure medicine.

The problems with rofecoxib [trade name Vioxx] had hit shortly before John wrote this. I'm not sure that legalizing recreational drugs is a good idea, but I am at least willing to consider it, given the way prescription drugs overseen by otherwise responsible doctors have harmed the public too. With ever-increasing opioid overdoses in the US, we enjoy the worst of both worlds: legal and illegal drugs working together to really mess people up.


The New Jersey Primary, and Worse

 

The party primary elections in New Jersey are today. Registered Republicans who have listed telephone numbers have been bombarded day and night, and sometimes in our dreams, by political telemarketing calls for one or another of the gaggle of people who are seeking the Party's nomination for the gubernatorial election in November. Many of the calls are from a roster of Formerly Famous People, such as Jack Kemp, Tom Kean, and Steve Forbes. Actually, I soon developed the habit of hanging up before I was told whom these people were endorsing, but most of them seem to be partial to Bret Schundler. Doug Forrester's ads tend to feature Ordinary Citizens and doting family members. The other five often endorse themselves. I am not sure that all their calls are recordings.

Walking to the polling place to vote this morning, I was interviewed by WPIX, the local television affiliate of UPN. Taken by surprise, I was quite unable to make an apt allusion to Julius Evola, or even James Madison. I did mention property taxes, which seemed to be what the news lady wanted to hear.

When I voted, the polls had already been open for two hours, on as fine a late-spring day as you could ask for. I was voter number 6. The people are disenthused, I fear.

* * *

Meanwhile, I see that the United States Supreme Court, in Gonzales v. Raich, has held that the federal Controlled Substances Act does allow the federal government to prosecute the users of medical marijuana, even if the users have a valid prescription issued under state law.

The chief curiosity here is that the Controlled Substances Act is based largely on the Commerce Clause of the Constitution, which allows the federal government to control the distribution of goods in interstate commerce. The Court held that power also justified the application of the Act to this situation, where the marijuana was homegrown and had not moved in commerce at all. This reasoning is not a novelty: the courts have long held that the federal government could regulate what farmers grow on their own land for their own use, on the theory that local production displaces goods from outside the state. That principle is probably necessary, but it still looks like an instance of coaxing a camel through the eye of a needle.

When I was young, I was quite a prig about not using drugs, and I have not mellowed over time. On the other hand, I often shock people who know me with the argument that most drugs should be legalized, even for recreational purposes, simply because prohibition causes more trouble than it is worth. I have no opinion about the efficacy of medical marijuana, but I think it could never do as much harm as prescription blood-pressure medicine.

Still, I have to say the Supreme Court majority was right: both the Controlled Substances Act and its application here are necessarily valid. Whatever doubts I might have had about the matter were dispelled by this bit of incoherence from Justice O'Connor's dissent from Justice Stevens's majority opinion:

There is simply no evidence that homegrown medicinal marijuana users constitute, in the aggregate, a sizable enough class to have a discernible, let alone substantial, impact on the national illicit drug market --or otherwise to threaten the CSA regime.

If Congress has to present "evidence" to Justice O'Connor's satisfaction every time it passes a law, the Republic is doomed.

* * *

Speaking of formerly famous people, Mark Steyn has taken to prophesying that Senator Hillary Rodham Clinton may well win the presidential election in 2008. In a column entitled Last Man Standing, he issues these oracles:

If the Democrats ever want to take back the White House, 2008 is their best shot. After the 2010 census, the electoral college apportionment for the 2012 Presidential campaign will reflect the population shifts to the south and west ...

Frankly, that sounds a little like the belief in the British Labour Party in the 1920s that the rise of the Party to the status of permanent governing party could be calculated with arithmetical certainty. After all, the electorate could grow only more working class over time, couldn't it? But let the point pass.

Bill Clinton was about as good a Democrat as you could get: he liked to tell friends he governed as an "Eisenhower Republican"...

Wasn't that how John Kerry during last year's election promised to govern? The leading sentiment within the Democratic Party now might be: "Let's give the real Left a chance."

As a rule, Governors make the best Presidential candidates...The Republicans do have a popular governor of a large state, but his name's Jeb Bush, and even loyal Baathists might have drawn the line at Saddam being succeeded by both Uday and Qusay. On the other hand, if Jeb wants to avoid being penalised by American distaste for dynastic succession, the 43rd President's brother running against the 42nd President's wife may be the most favourable conditions he'll ever get.

Jeb has said he will not run in 2008, and I see no reason to doubt him. Still, that is a good point: a Hillary candidacy would short-circuit the nepotism issue.

You see that we are already well into the next election cycle? The presidency is becoming Ixion's Wheel.

* * *

Once again, let me repeat that I am attending the annual conference of the International Society for the Comparative Study of Civilizations later this week, at the University of St. Thomas in St. Paul, Minnesota. The conference topic is Civilizations, Religions and Human Survival. I will post comments about the conference when I get back. My paper, "The Second Religiousness in the 21st Century," will appear online eventually, but the ISCSC might want first-publication rights for their journal, the Comparative Civilizations Review.

Finally, you are again invited to send money here to support this study of metahistory. That, and the Heineken Brewing Company, green jewel of the Dutch Empire.

Thanks!

Copyright © 2005 by John J. Reilly

The Long View: Higher Superstition: The Academic Left and Its Quarrels with Science

The kind of thing that John Reilly laments in this book review is alive and well. If you want a taste of it, check out New Real Peer Review on Twitter, which simply reprints abstracts of actual, peer-reviewed articles. A favorite genre is the autoethnography. Go look for yourself; no summary can do it justice.

I will disagree with John about one thing: race and sex matter a lot for many medical treatments. For example, the drug marketed under the trade name Ambien, generically called zolpidem, has much worse side effects in women than in men, and it takes women longer to metabolize it.

This effect was memorably referred to as Ambien Walrus. I find this pretty funny, but I delight in the sufferings of others.

You can't ignore this stuff if you want to do medicine right. The mechanisms behind these differences vary, but you'll get a better result if you take them into account.


Higher Superstition: The Academic Left and Its Quarrels with Science
by Paul R. Gross and Norman Levitt
The Johns Hopkins University Press, 1994
314 pp, $25.95
ISBN 0-8018-4766-4

 

The Enemies You Deserve

 

If you are looking for an exposé of how political correctness in recent years has undermined medical research, corrupted the teaching of mathematics and generally blackened the name of science in America, this book will give you all the horror stories you might possibly want. There have been rather a lot of indictments of the academic left, of course, but this is one of the better ones. However, the book is most interesting for reasons other than those its co-authors intended. To use the same degree of bluntness that they use against the "science studies" being carried on by today's literary critics, what we have here is an expression of bewilderment by a pair of secular fundamentalists who find themselves faced with an intellectual crisis for which their philosophy provides no solution.

Paul Gross is a biologist, now at the University of Virginia but formerly director of the Woods Hole Marine Biological Laboratory, and Norman Levitt is professor of mathematics at Rutgers University in New Jersey. They repeatedly insist, no doubt truthfully, that they have no particular interest in politics and that they are not programmatic conservatives. What does worry them is the increasing number of faculty colleagues in the liberal arts who take it as an article of faith that the results of empirical scientific research are biased in favor of patriarchy or capitalism or white people. The people who have this sort of opinion they call "the academic left," a catchall category that includes deconstructionists, feminists, multiculturalists and radical environmentalists.

The authors have a good ear for invective, such as this happy formula: "...academic left refers to a stratum of the residual intelligentsia surviving the recession of its demotic base." There has always been something rather futile about the radicalization of the academy, and in some ways the movement is already in retreat. The ideas of the academic left are based in large part on Marxist notions that were originally designed for purposes of revolutionary agitation. Revolutionary socialist politics has not proven to have the popular appeal one might have hoped, however. Marxism has therefore been largely replaced among intellectuals by that protean phenomenon, postmodernism. Although postmodernism incorporates large helpings of Freudianism and the more credulous kind of cultural anthropology, it remains a fundamentally "left" phenomenon, in the sense of maintaining an implacable hostility to market economics and traditional social structures. However, postmodernists have perforce lowered their goal from storming the Winter Palace to inculcating the "hermeneutics of suspicion" in undergraduates. The results of these efforts were sufficiently annoying to incite Gross and Levitt to write this book.

Postmodernists presume that reality is inaccessible, or at least incommunicable, because of the inherent unreliability of language. Science to postmodernists is only one of any number of possible "discourses," no one of which is fundamentally truer than any other. This is because there are no foundations to thought, which is everywhere radically determined by the interests and history of the thinker. Those who claim to establish truth by experiment are either lying or self-deluded. The slogan "scientific truth is a matter of social authority" has become dogma to many academic interest groups, who have been exerting themselves to substitute their authority for that of the practicing scientists.

The French philosophical school known as deconstructionism provided the first taste of postmodern skepticism in the American academy during the 1970s. It still provides much of its vocabulary. However, self-described deconstructionists are getting rare. Paul de Man and Martin Heidegger, two of the school's progenitors, were shown in recent years to have been fascists without qualification at certain points in their careers, thus tainting the whole school. On the other hand, while deconstruction has perhaps seen better days, feminism is as strong as ever. Thus, undergraduates in women's studies courses are routinely introduced to the notion that, for instance, Newton's "Principia" is a rape manual. Even odder is the movement to create a feminist mathematics. The authors discuss at length an article along these lines entitled "Towards a Feminist Algebra." The authors of that piece don't seem much concerned with algebra per se; what exercises them is the use of sexist word problems in algebra texts, particularly those that seem to promote heterosexuality. The single greatest practical damage done by feminists so far, however, is in medical research, where human test groups for new treatments must now often be "inclusive" of men and women (and also of certain racial minorities). To get statistically significant results for each group, you can't just mirror the population in the sample; you have to have a sample above a mathematically determined size for every group that interests you. In reality, experience has shown that race and gender rarely make a difference in tests of new medical treatments, but politically correct regulations threaten to increase the size of medical studies by a factor of five or ten.
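The arithmetic behind that "factor of five or ten" is easy to sketch. The following is a back-of-the-envelope illustration, not anything from the book or from any actual regulation: a standard normal-approximation sample-size formula for a two-arm comparison of means, with illustrative values for effect size, significance level, and power.

```python
import math
from statistics import NormalDist

def n_per_arm(effect_size, alpha=0.05, power=0.8):
    """Approximate sample size per arm for a two-sample comparison of
    means. effect_size is the difference in means divided by the
    common standard deviation (Cohen's d)."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)   # two-sided test
    z_beta = z.inv_cdf(power)
    return math.ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)

# One pooled two-arm trial, versus the same trial powered separately
# for each of five demographic subgroups:
pooled = 2 * n_per_arm(0.5)
stratified = 5 * 2 * n_per_arm(0.5)   # five times the patients
```

The required sample size does not shrink when you slice the population: each subgroup needs the full per-arm count on its own, so five separately powered subgroups need five times the patients of one pooled trial.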

Environmentalism has become a species of apocalyptic for people on the academic left. It is not really clear what environmentalism is doing in the postmodern stew at all, since environmentalists tend to look on nature as the source of the kind of fundamental values which postmodernism says do not exist. The answer, perhaps, is that the vision of ecological catastrophe provides a way for the mighty to be cast down from their thrones in a historical situation where social revolution appears to be vastly improbable. Environmentalists seem to be actually disappointed if some preliminary research results suggesting an environmental danger turn out to be wrong. This happens often enough, notably in cancer research, where suspected carcinogens routinely turn out to be innocuous. However, on the environmental circuit, good news is unreportable. The current world is damned, the environmentalists claim, and nothing but the overthrow of capitalism, or patriarchy, or humanism (meaning in this case the invidious bias in favor of humans over other animals) can bring relief. Only catastrophe can bring about this overthrow, and environmentalists who are not scientists look for it eagerly.

The basic notion behind the postmodern treatment of science is social constructivism, the notion that our knowledge of the world is just as much a social product as our music or our myths, and is similarly open to criticism. The authors have no problem with the fact that cultural conditions can affect what kind of questions scientists will seek to address or what kind of explanation will seem plausible to a researcher. What they object to is the "strong form" of social constructivism, which holds that our knowledge is simply a representation of nature. The "truth" of this representation cannot be ascertained by reference to the natural world, since any experimental result will also be a representation. Constructivists therefore say that we can understand the elements of a scientific theory only by reference to the social condition and personal histories of the scientists involved. This, as the authors correctly note, is batty.

The lengths to which the principle of constructivism has been extended are nearly unbelievable. Take AIDS, for instance, which has itself almost become a postmodernist subspecialty. The tone in the postmodernist literature dealing with the disease echoes the dictum of AIDS activist Larry Kramer: "...I think a good case can also be made that the AIDS pandemic is the fault of the heterosexual white majority." Some people, particularly in black studies departments, take "constructed" quite literally, in the sense that the AIDS virus was created in a laboratory as an instrument of genocide. Kramer's notion is more modest: he suggests that the extreme homosexual promiscuity which did so much to spread the disease in the New York and San Francisco of the late 1960s and early 1970s was forced upon the gay community by its ghettoization. This is an odd argument, but not so odd as the assumption that you can talk about the origins of an epidemic without discussing the infectious agent that causes it. The upshot is that AIDS is considered to be a product of "semiological discourse," a system of social conventions. It can be defeated, not through standard medical research, but through the creation of a new language, one that does not stigmatize certain groups and behaviors. (Dr. Peter Duesberg's purely behavioral explanation of AIDS, though it has the attractions of scientific heresy, gets only a cautious reception because of its implied criticism of homosexual sex.) The postmodern academy actually seems to have a certain investment in a cure for AIDS not being found, since the apparent helplessness of science in this area is taken as a license to give equal authority to "other ways of knowing" and other ways of healing, particularly of the New Age variety.

The postmodernist critics of science usually ply their trade by studiously ignoring what scientists themselves actually think about. The anthropologist Bruno Latour, for instance, has made a name for himself by subjecting scientists to the kind of observation usually reserved for members of primitive tribes. Once he was commissioned by the French government to do a post-mortem on their Aramis project. This was to be a radically new, computerized subway system in which small trams would travel on a vastly complicated track-and-switch system along routes improvised for the passengers of each car. The idea was that passengers would type their proposed destination into a computer terminal when they entered a subway station. They would then be assigned a car with other people going to compatible destinations. The project turned into a ten-year boondoggle and was eventually cancelled. The French government hired Latour to find out what went wrong. Now, the basic conceptual problem with the system is obvious: the French engineers had to come up with a way to handle the "traveling salesman" problem, the classic problem of finding the shortest route that visits every point in a set. This seemingly simple question has no neat solution, and the search for approximate answers keeps the designers of telephone switching systems and railroad traffic managers awake nights. Latour did not even mention it. He did, however, do a subtle semiological analysis of the aesthetic design of the tram cars.
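Why the traveling salesman problem has "no neat solution" can be made concrete in a few lines. This is an illustrative sketch, not anything from Latour's study: the only guaranteed-exact method is to try every ordering of the stops, and the number of orderings grows factorially, which is why real switching and routing systems settle for approximations.

```python
import math
from itertools import permutations

def shortest_tour(points):
    """Brute-force traveling salesman: try every ordering of the stops
    and keep the cheapest round trip. The number of orderings grows as
    (n-1)!, so this is only feasible for a handful of points."""
    start, *rest = points
    best_tour, best_length = None, float("inf")
    for order in permutations(rest):
        tour = (start, *order, start)
        length = sum(math.dist(a, b) for a, b in zip(tour, tour[1:]))
        if length < best_length:
            best_tour, best_length = tour, length
    return best_tour, best_length
```

Ten stops already mean 362,880 orderings; twenty mean about 10^17, which is the wall the Aramis engineers would have run into.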

Postmodernists regard themselves as omniscient and omnicompetent, fully qualified to put any intellectual discipline in the world in its place. They have this confidence because of the mistaken belief that science has refuted itself, thus leaving the field clear for other ways of understanding the world. They love chaos theory, for instance, having absorbed the hazy notion that it makes the universe unpredictable. Chaos theory in fact is simply a partial solution to the problem of describing turbulence. Indeed, chaos theory is something of a victory for mathematical platonism, since it shows that some very exotic mathematical objects have great descriptive power. The implications of chaos theory are rather the opposite of chaos in the popular sense, but this idea shows little sign of penetrating the nation's literature departments. The same goes for features of quantum mechanics, notably the uncertainty principle. Quantum mechanics actually makes the world a far more manageable place. Among other things, it is the basis of electronics. To read the postmodernists, however, you would think that it makes physicists flutter about their laboratories in an agony of ontological confusion because quantum theory phrases the answers to some questions probabilistically.
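The point that chaos is deterministic rather than lawless can be seen in the simplest chaotic system there is. This example is my own illustration, not something discussed in the book: the logistic map is a one-line rule with no randomness in it, yet two trajectories that start almost identically diverge completely, which is all "sensitive dependence on initial conditions" means.

```python
def logistic(x, r=4.0):
    """One step of the logistic map, a fully deterministic rule."""
    return r * x * (1 - x)

# Two starting points that differ by one part in a billion:
a, b = 0.2, 0.2 + 1e-9
for _ in range(40):
    a, b = logistic(a), logistic(b)
# After 40 steps the trajectories bear no resemblance to each other,
# even though nothing random ever happened.
```

The rule is predictable in the short run and statistically well-behaved in the long run; what it rules out is only indefinite-horizon forecasting from imprecise measurements, which is the opposite of the "anything goes" reading the literature departments favor.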

On a more esoteric level, we have the strange cult of Kurt Goedel's incompleteness theorem, first propounded in the 1930s. Now Goedel's Theorem is one of the great treasures of 20th century mathematics. There are several ways to put it, one of which is that logical systems beyond a certain level of complexity can generate correctly expressed statements whose truth cannot be decided within the system. Some versions of the "Liar Paradox" illustrate this quality of undecidability. It is easy to get the point slightly wrong. (Even the authors' statement of it is a tad misleading. According to them, the theorem "says that no finite system of axioms can completely characterize even a seemingly 'natural' mathematical object..." It should be made clear that some logical systems, notably elementary Euclidean geometry, are complete, so that every properly expressed statement in them is either provable or refutable.) Simply false, however, is the postmodernist conviction that Goedel's Theorem proved that all language is fundamentally self-contradictory and inconsistent. Postmodernists find the idea attractive, however, because they believe that it frees them from the chains of logic, and undermines the claims of scientists to have reached conclusions dictated by logic.
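For readers who want the theorem pinned down, one standard modern formulation (my wording, not the reviewer's or the authors') runs as follows; note how narrow its hypotheses are compared with the postmodernist gloss:

```latex
\textbf{First Incompleteness Theorem.} Let $T$ be a consistent,
effectively axiomatizable theory that interprets elementary
arithmetic. Then there is a sentence $G_T$ in the language of $T$
such that
\[
  T \nvdash G_T \quad\text{and}\quad T \nvdash \neg G_T .
\]
```

The theorem applies only to formal systems strong enough to encode arithmetic, and it says nothing about natural language being self-contradictory; it exhibits one undecidable sentence per such theory, no more.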

Postmodernism, say the authors, is the deliberate negation of the Enlightenment project, which they hold to be the construction of a sound body of knowledge about the world. The academic left generally believes that the reality of the Enlightenment has been the construction of a thought-world designed to oppress women and people of color in the interests of white patriarchal capitalism. Or possibly capitalist patriarchy. Anyhow, fashion has it that the Enlightenment was a bad idea. Now that modernity is about to end, say the postmodernists, the idea is being refuted on every hand. Actually, it seems to many people of various ideological persuasions that the end of modernity is indeed probably not too far off: no era lasts forever, after all. However, it is also reasonably clear that postmodernism is not on the far side of the modern era. Postmodernism is simply late modernity. Whatever follows modernity is very unlikely to have much to do with the sentiments of today's academic left.

Granted that the radical academy does not have much of a future, still the authors cannot find a really satisfying explanation for why the natural sciences have been subject to special reprobation and outrage in recent years. In the charmingly titled penultimate chapter, "Why Do the People Imagine a Vain Thing?", they run through the obvious explanations. It does not take much imagination to see that today's academic leftist is often a refugee from the 1960s. Political correctness is in large part the whimsical antinomianism of the Counterculture translated into humorless middle age. Then, of course, there is the revenge factor. In the heyday of Logical Positivism from the end of World War II to the middle 1960s, physical scientists tended to look down on the liberal arts. In the eyes of that austere philosophy, any statement which was not based either on observation or induction was literally "nonsense," a category that therefore covered every non-science department from theology to accounting. The patronizing attitude of scientists was not made more bearable by the unquestioning generosity of the subsidies provided by government to science in those years. The resentment caused by this state of affairs still rankled when the current crop of academic leftists were graduate students and undergraduates. Now they see the chance to cut science down to size.

While there is something to this assessment, the fact is that the academic left has a point. Logical Positivism and postmodernism are both essentially forms of linguistic skepticism. Both alike are products of the rejection of metaphysics, the key theme in Western philosophy since Kant. The hope of the logical positivist philosophers of the 1920s and 30s was to save just enough of the machinery of abstract thought so that scientists could work. Science is not skeptical in the sense that Nietzsche was skeptical, or the later Sophists. It takes quite a lot of faith in the world and the power of the mind to do science. And in fact, the authors note that Logical Positivism, with a little help from the philosophy of Karl Popper, remains the philosophical framework of working scientists to this day. The problem, however, is that Logical Positivism leaves science as a little bubble of coherence in a sea of "nonsense," of thoughts and ideas that cannot be directly related to measurable physical events.

Logical Positivism has many inherent problems as a philosophy (chief among them that its propositions cannot themselves be derived from sense experience), but one ability that even its strongest adherents cannot claim for it is the capacity to answer a consistent skepticism. In their defense of science, the authors are reduced to pounding the table (or, after the fashion of Dr. Johnson's refutation of Berkeley's Idealist philosophy, kicking the stone). Thus, it is a "brutal" fact that science makes reliable predictions about physical events, that antibiotics cure infections while New Age crystals will not, that the advisability of nuclear power is a question of engineering and not of moral rectitude. Well, sure. But why? "Because" is not an answer. Without some way to relate the reliability of science to the rest of reality, the scientific community will be living in an acid bath of skepticism and superstition.

The authors tell us that the scientific methodology of the 17th century "almost unwittingly set aside the metaphysical assumptions of a dozen centuries...[that] Newton or Leibnitz sought...to affirm some version of this divine order...is almost beside the point...Open-endedness is the vital principle at stake here...Unless we are unlucky, this will always be the case." In reality, of course, it surpasses the wit of any thinker to set aside the metaphysical assumptions of a dozen centuries, or even entirely of his own generation. The scientists of the early Enlightenment did indeed scrap a great deal of Aristotle's physics. Metaphysically, however, they were fundamentally conservative: they settled on one strand of the philosophical heritage of the West and resisted discussing the matter further.

As Alfred Whitehead realized early in this century, science is based on a stripped-down version of scholasticism, the kind that says (a) truth can be reached using reason but (b) only through reasoning about experience provided by the senses. This should not be surprising. Cultures have their insistences. Analogous ideas keep popping up in different forms throughout a civilization's history. When the Senate debates funding for parochial schools, it is carrying on the traditional conflict between church and state that has run through Western history since the Investiture Controversy in medieval Germany. In the same way, certain assumptions about the knowability and rationality of the world run right through Western history. The Enlightenment was not unique in remembering these things. Its uniqueness lay in what it was willing to forget.

It would be folly to dismiss so great a pulse of human history as the Enlightenment with a single characterization, either for good or ill. Everything good and everything bad that we know about either appeared in that wave or was transformed by it. Its force is not yet wholly spent. However, one important thing about the Enlightenment is that it has always been a movement of critique. It is an opposition to the powers that be, whether the crown, or the ancient intellectual authorities, or God. The authors of "Higher Superstition" tell us that the academic left hopes to overthrow the Enlightenment, while the authors cast themselves as the Enlightenment's defenders. The authors are correct in seeing the academic left as silly people, who do not know what they are about. The authors are mistaken too, however. The fact is that the academic left are as truly the children of the Enlightenment as ever the scientists are. Science was once an indispensable ally in the leveling of ancient hierarchies of thought and society, but today it presents itself to postmodern academics simply as the only target left standing. Is it any wonder that these heirs of the Enlightenment should hope to bring down the last Bastille?

This article originally appeared in the November 1995 issue of Culture Wars magazine.

Copyright © 1996 by John J. Reilly

Linkfest 2017-01-22

Happy New Year! I've taken two weeks off for the birth of my third child, but now I am back at it!

Peak conception time by daylight hours

I saw the following chart on Twitter, and found it intriguing.

Roy Wright was interested enough to write a blog post going into further detail.

A Lot of What Is Known about Pirates Is Not True, and a Lot of What Is True Is Not Known.

A great piece about the gradual transformation of piracy in the American colonies from just another job to an act of rebellion.

The first ever vending machine stopped people from stealing holy water

Hero of Alexandria is a remarkable figure, known for his almost modern-seeming machines.

San Francisco Asks: Where Have All the Children Gone?

Philosophies that frown on reproduction usually don't survive.

Peanut Allergy Prevention Advice Does a 180

A nice summary of how the conventional wisdom on peanut allergies was upended by one good study.

Should social science be more solution-oriented?

Duncan Watts argues in Nature: Human Behavior that my cocktail party theory of science is correct.

Housing supply is catching up to demand

Unfortunately, supply grows very slowly in this area. But this is good news.

Origin of computing terms "patch", "bug", "loop", "library"

Great history.

LinkFest 2016-11-06

This LinkFest has been delayed three weeks. I had better publish it before the election and everything gets falsified!


Divided by meaning

A great piece on how Americans are divided by their attachments to hearth and home, or the lack thereof. Fascinating to me, since by education and career, I ought to be a member of what the author calls "the front row kids" who run the country, but I have chosen to live and work in the same small town I grew up in, much like Steve Jobs.

Is there a dietary treatment for multiple sclerosis?

It wouldn't be that hard to design a double-blind RCT on this if you wanted to. You could put everyone on the vegan+fish diet and then supplement animal fats in pill form. If the IRB balked, you might then suspect they secretly believed it might work.

The Crony Economy

Everyone agrees they don't like it, but no one has produced a lasting reform.

How Half Of America Lost Its F**king Mind 

Cracked continues to impress me. Companion piece: Divided by meaning

Is there a dietary treatment for multiple sclerosis?

A really good look at the incentives that push medicine towards pharmaceuticals and away from other kinds of therapies. Related reading: Lions, Tigers, and Bears. Is the placebo powerless?

Revealed: Nearly Half The Adults In Britain And Europe Hold Extremist Views

There is no end to the humor in this, but I find the commitment to democracy kind of sweet and endearing in people who are otherwise horrified when they find out what average people really think. Can you imagine the headline if one were able to conduct the same survey world-wide?

Why Tokyo is the land of rising home construction but not rising prices

Because they almost always tear down old houses and build new ones instead of just moving into them. There are a variety of interesting cultural and practical reasons for this, but one that doesn't appear in the article is the way the Yakuza use construction as their legitimate front. A lot of blue collar work in NYC works much the same way.

How Democrats killed their populist soul

Part of my on-going series of how the Economic Right and the Cultural Left are currently dominant in the West. Until I read this, I hadn't appreciated how the economic theories of Right and Left alike had turned against trust-busting and monopoly prevention.

Estates of Mind

A bit more about anti-trust laws as applied to intellectual property.

Mergers raise prices not efficiency

Since I have worked in manufacturing for my entire career, I don't find this surprising at all. The idea that mergers allow for standardization looks a lot easier on paper than in reality. Supply chains and manufacturing lines can't change with a memo.

The Rise of the alt-Right

Definitely one of the best things I have read about the alt-right. What is going on in the US has a lot of ties to what is going on in Europe.

On the reality of race and the abhorrence of racism

Bo Winegard, Ben Winegard, and Brian Boutwell point out that studying race doesn't make you deplorable.

The ruthlessly effective rebranding of Europe's new far-Right

I said above that the Scott McConnell piece was the best thing I have read about the alt-right, but you really need this one as background.

The election that forgot about the future

In John's review of The Fourth Turning, one of the things that Strauss and Howe said made the Civil War worse than it could have been was the failure of the aging Transcendentalists to step aside and let someone else solve new problems. According to Strauss and Howe's model, the Baby Boomers are currently filling the same role in the United States. And you might note that one of two aging Baby Boomers is about to win the Presidency in a bitterly contested election.

Hacksaw Ridge

Mel Gibson makes another great war movie, about a guy who would not carry a gun.

LinkFest 2016-04-01

April Fool's Edition

The CDC is trying to make 86 million Americans sick

I've long thought the pre-diabetes thing was a bit foolish. While it is a good thing to be able to quantify, if you don't understand what you are measuring it can make you far too certain. Pre-diabetes is a lot like a risk factor: something that is correlated with diabetes, but in totality a poor predictor.
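
A back-of-the-envelope illustration of that last point, with made-up numbers rather than real epidemiology: a label can be genuinely correlated with a disease and still tell any individual patient very little, because most people who carry the label never convert in a given year.

```python
# Hypothetical numbers, for illustration only: suppose 35% of a
# population gets flagged "pre-diabetic," and 5% of the flagged
# group actually develops diabetes in a given year.
population = 1000
flag_rate = 0.35
annual_conversion = 0.05

flagged = population * flag_rate
converts = flagged * annual_conversion

print(f"{flagged:.0f} flagged, {converts:.1f} develop diabetes this year")
# The label is correlated with the disease, yet the overwhelming
# majority of flagged people do not convert in any given year.
```

Under these (invented) assumptions, only 17 or 18 of 350 flagged people convert in a year, which is why a correlated marker can still be a weak predictor for individuals.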

It's all Geek to Me

Neal Stephenson's review of the movie 300 is now nine years old, but I still enjoyed reading it. I liked 300 when it came out, and mostly for the same reasons Stephenson did.

My journey through Molenbeek

A nice synopsis of the way in which not particularly devout partially assimilated children of immigrants get radicalized.

These unlucky people have names that break computers

Parsing text is hazardous.
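
A minimal sketch of the two failure modes articles like this usually describe; the validator, the lookup function, and the test names here are my own illustration, not code from the article. One bug is a validator that assumes names are a single run of ASCII letters; the other is treating the literal surname "Null" as missing data.

```python
import re

# A naive validator of the sort the article warns about: it assumes
# a name is one unbroken run of ASCII letters, which rejects many
# perfectly real names.
NAIVE_NAME = re.compile(r"^[A-Za-z]+$")

def naive_is_valid(name: str) -> bool:
    return NAIVE_NAME.fullmatch(name) is not None

# A sentinel-comparison bug: the string "null" is wrongly treated as
# "no value", so a person actually surnamed Null loses their record.
def buggy_lookup(surname: str):
    if surname.lower() == "null":   # the bug: name collides with sentinel
        return None
    return {"surname": surname}

print(naive_is_valid("Smith"))      # accepted
print(naive_is_valid("O'Brien"))    # rejected: apostrophe
print(naive_is_valid("Zoë"))        # rejected: non-ASCII letter
print(buggy_lookup("Null"))         # record silently vanishes
```

The fix, of course, is to validate and store names as opaque Unicode text rather than pattern-matching them against assumptions about what a name looks like.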

A researcher explains the sad truth: we do know how to stop gun violence. But we don't do it.

Unfortunately for this well-meaning researcher, his suggestions involve pattern recognition, which is currently disfavored.

Peak Water: United States water use drops to lowest level in 40 years

The story is similar for gasoline. Technological progress means we do more with less.

HVAC Techs — Hackers who make house calls

The kind of unglamorous but well-paid job Mike Rowe likes to talk about.

America may DUMP algebra as new study finds it is the main cause of high school drop-outs - and only 5% of jobs need it

This is a fantastic idea. We have raised the bar to graduate high school so far that we are penalizing people of normal intellectual ability.

Immigration and the Political Explosion of 2016

This is a recurring pattern in United States history.

Philosophical Reflections on Genetic Interest

Frank Salter's concept of genetic interest is a philosophical concept that is muddled up with a scientific one. Unfortunately, his philosophy isn't too sharp.

How much of the placebo 'effect' is really statistical regression?

Courtesy of the ever-contrary Greg Cochran, a reason to doubt the placebo effect. Here is a recent blog post expanding on this idea, with further reading suggestions.
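
Regression to the mean is easy to demonstrate with a toy simulation; this sketch is my own, not taken from the linked post. If trials enroll patients when a noisy symptom score is at its worst, the untreated follow-up score drifts back toward each patient's true average all by itself, which can masquerade as a "placebo effect."

```python
import random

random.seed(0)

# Each "patient" has a stable true severity plus day-to-day noise.
def measure(true_severity: float) -> float:
    return true_severity + random.gauss(0, 10)

# Enroll only patients whose measured score is above a cutoff (a bad
# day), then re-measure later with no treatment at all.
enrolled_first, enrolled_second = [], []
for _ in range(100_000):
    true_severity = random.gauss(50, 10)
    first = measure(true_severity)
    if first > 65:                       # enrollment cutoff
        enrolled_first.append(first)
        enrolled_second.append(measure(true_severity))

avg_first = sum(enrolled_first) / len(enrolled_first)
avg_second = sum(enrolled_second) / len(enrolled_second)
print(round(avg_first, 1), round(avg_second, 1))
# The untreated follow-up average is lower than the enrollment average:
# pure regression to the mean, no treatment required.
```

This is why a no-treatment arm, not just a placebo arm, is needed to distinguish a genuine placebo effect from patients simply returning to baseline.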

Beat the Reaper Book Review

Another book review that disappeared. I really love the tagline for this review: "Like Scrubs on crack."

Beat the Reaper
by Josh Bazell
Little, Brown, and Company, 2009
ISBN 0316032220; $9.99

Every time I look at this book, I think of the Blue Öyster Cult song "(Don't Fear) The Reaper". This really has nothing to do with the book, but I think it nonetheless. I'm not normally a fan of mafia books, for the reason that mafiosi are so evil that reading about them depresses me. I did like this book, but I was always struggling against the horror that any semi-realistic portrayal of gangster life elicits.

My favorite part of the book is the random medical facts scattered throughout, either in the body or in cute little footnotes. Lest anyone think that the medical mayhem of Manhattan Catholic is entirely fictional: I was recounting to my Mother the episode near the beginning of the book where Dr. Peter Brown comes in for rounds and discovers one of his patients is dead, despite the notation on the chart claiming the patient's temperature is 98.6° with blood pressure 120/80 mmHg. My Mom blurted out, "ooh, that happened to me!" She has a great deal of experience as a nurse, and this exact incident happened to her, except that it was an aide who did it to the nurse rather than a nurse to the doctor.

I have worked in hospitals myself, and am currently a designer of medical devices, so everything about Manhattan Catholic rang true, even though that much misery is not usually concentrated in one place. I can indeed confirm the typical surgeon's potty mouth; I've never heard such astounding things as you can hear in an operating room. I also appreciate Dr. Brown's gallows humor. When you work with death, you need something to help you stay sane. You can't go and cry in your beer every time something goes wrong. The most common method is black humor, which provides emotional distance. Less common is sanctification, as practiced by chaplains and religious. I never hear gallows humor out of Sr. Elizabeth. Of course, maybe she just isn't sharing.

The denouement of the book reminded me of the Hitman series by Eidos. Anyone who has played through Hitman: Blood Money will find some similarities. Not huge ones, just reminiscent. The ending was imaginative. Perhaps I should say it strains credulity, but I'm not sure I could do it better. The medical facts at least make it possible, if not plausible. If you enjoy gangster books, go for this one. If you want to know what hospitals are really like, read this too. No hospital is this bad, but they definitely share a family resemblance. This book is like Scrubs on crack.

My other book reviews