No, You Can't Be an Astronaut Book Review

No, You Can't Be an Astronaut: Why you shouldn't follow your dreams--and what to do instead
by Patience Fairweather, PhD
Plausible Press (January 18, 2018)
$9.99 paperback; $3.49 kindle edition; 186 pages
ISBN 978-1548082963
ASIN B077Q7GF3T

I received this book for free from LibraryThing's Early Reviewers Program.

Patience Fairweather [a pseudonym] is preaching to the choir. I have also been saying that the STEM crisis is a myth, and that we have plenty of well-educated Americans to do all the jobs we have. I appreciate it when anyone else says it, and backs it up with data.

What is more interesting is what we should do about it. Fairweather has written a book that provides sound, reasonable advice to individuals, especially the very young, or those contemplating a career change. This is not a book of policy, but rather a checklist combined with useful background information, meant to provide opportunity to ordinary Americans.

After the introduction, Fairweather has a section on personality assessments. This section is pretty good, especially insofar as it encourages the reader to seek out objective information about what they like and what they are good at. A variety of tests are cited, including the popular-but-flawed Myers-Briggs and the better-replicated OCEAN model. The point of all of them is to find out what you would be willing to tolerate for money, because following your dreams can end very poorly. It is often better to find out what you can stand that someone will pay you to do.

Which is the next section of the book! Fairweather looks at ways to assess your actual likelihood of graduating from college, and then whether doing so would truly be a net financial benefit to you. Sure, on average, college graduates make more money, but will you? Sometimes the answer is no, and Fairweather provides some tools, for example data from the Bureau of Labor Statistics, that can answer that question pretty accurately.

Following chapters contain commonsense advice about using time wisely in college, avoiding social media mistakes that can cost you your job, job-hunting and interview skills, and how to succeed in the workplace. I've done technical recruiting for twelve years, and this is good stuff. If you don't need this book to point this stuff out, great for you, but honest mistakes can cost people chances they would otherwise have. Don't let that be you.

Highly recommended.

My other book reviews

The Long View 2006-01-31: Save the Dinosaurs; Unfriendly Democracy; News Medley

I think John Reilly also mentioned this in other places, but here he advances the suggestion that law school is a scam, and could be folded into an undergraduate curriculum. I lack subject matter expertise to comment on law school in particular, but it seems at least plausible to me.

Bryan Caplan argues that education is less about skills and facts, and more about signaling that you have valuable qualities. In general, I find Caplan's case persuasive, although I think it would be easy to take this idea too far. Some data points in favor: I turned out to be a decent engineer, despite not studying that field in school. The English university system currently uses a much shorter timeframe for degrees, but has equivalent results.

It is always risky to generalize from an n of 1, even more so when it is your own experience. An example of what I am talking about can be seen in a Twitter exchange with Greg Cochran about the evolutionary fitness cost of schizophrenia.

It isn't really a good argument to assert "I turned out fine." With that in mind, not everyone with my educational background makes a good engineer. There are other personal qualities that matter. Conditional on those things, educational background loses importance as a predictor of success, if you know what to look for.


Save the Dinosaurs; Unfriendly Democracy; News Medley

 

Hugh Hewitt has buried the dinosaurs of the Main Stream Media (or at least so he seems to imagine) in an article in The Weekly Standard of January 30 entitled "The Media's Ancien Regime." There he describes his visit to the Columbia School of Journalism. He was well received by all concerned, particularly the dean, Nicholas Lemann. He found a relatively small graduate institution that has become very keen on teaching its students serious analytical skills, including a fair amount of number-crunching. Hewitt thinks it all for naught:

Lippman's world, Pulitzer's world, even Nicholas Lemann's world of the Harvard Crimson from 1972 to 1976--they are all gone. Every conversation with one of the old guard quoting the old proof texts comes down to this point: There is too much expertise, all of it almost instantly available now, for the traditional idea of journalism to last much longer. In the past, almost every bit of information was difficult and expensive to acquire and was therefore mediated by journalists whom readers and viewers were usually in no position to second-guess. Authority has drained from journalism for a reason. Too many of its practitioners have been easily exposed as poseurs.

It is certainly the case that journalists often seem to be woefully underinformed about what should be common knowledge. (Fr. Neuhaus remarked last week on the reporter who responded to his allusions to the pope as "the bishop of Rome" with a query about whether it was unusual that the current pope was also the bishop of Rome.) However, anyone is gravely mistaken who thinks that New Media and the blogosphere make good the deficit.

Blogs are engines of critique. They are not particularly good sources for primary news. For that, we will continue to need journalists to assemble the first-draft narratives for critique and elaboration in the noosphere (a term I use advisedly, since more than blogs are at work here). It's true that journalism used to be primarily a sort of arbitrage between information-poor and information-rich domains. It still does that, particularly at the local level, where public events enter the information stream only if a shoe-leather reporter puts them there. The effect of communications technology, which simplifies reporting for many classes of stories, is not to abolish journalism, but to allow it to focus on generating value through synthesis.

The preceding paragraph may be the geekiest thing I have ever written. Anyway:

The problem is that the type of person best suited to this is the sort of widely educated liberal-arts graduate that the universities seem determined not to produce any longer. There is something wrong with the very idea of journalism as an academic major, much less as a graduate-school subject. (Law school, in contrast, could easily be folded into an undergraduate curriculum; it's law school that's the hoax.) At the end of four years of college, you should know enough languages, and accounting, and history, and sociology, to make yourself useful in a newsroom, assuming you can write acceptable expository prose. There is a craft to reporting, of course. It is ably described in any number of memoirs. Reading them may be more helpful than journalism school.

* * *

Walter Russell Meade's proposal to abolish college, which appears in The Weekly Standard Online, is ingenious:

There is no reason the government should try to prevent American families who value the traditional college experience from paying hundreds of thousands of dollars, but perhaps it could offer an alternative: a federally recognized national baccalaureate (or 'national bac') degree that students could earn by demonstrating competence and knowledge.

This would leave us with, what? A think-tank industry to replace the research university and a Chinese-style civil-service test to replace the undergraduate college?

* * *

That Spengler takes no prisoners, if we may so judge by his latest at Asia Times:

Fight a dictatorship, and you must kill the regime; fight a democracy, and you must kill the people. Two years ago I called George W Bush a “tragic character” (George W Bush, tragic character, November 25, 2003) who “wants universal good, but will end up doing some terrible things”. Now we have begun the third act of his tragedy, which shatters the delusions that led him to the edge of disaster. President Bush met Nemesis in the form of Hamas, whose election victory in Palestine last week makes clear that democracy can empower the war party as well as the peace party.

Look, maybe this will help. The purpose of promoting democracy is not to create pro-American regimes. It is to create regimes that have relatively transparent and responsive political systems. They can be as anti-American as they please, but provided they nurse their grudges in public and have to convince the rest of the world that they are stable enough to do business with, then they are far preferable, far safer, than even the friendliest tyranny.

* * *

If all these recent elections are starting to run together in your mind, this piece of January 28 by David Warren will surely make it worse:

After the first TV reports that their party would win the Canadian election, Conservative campaign workers began smashing windows in the Parliament Buildings, and in government offices around Ottawa. They roved through the corridors, beating up clerks and civil servants suspected of having Liberal Party connexions. From St John’s to Victoria, both winning and losing Conservative candidates took to the streets, leading heavily armed supporters in ski-masks, followed by millions of happy, cheering, banner-waving CPC voters, dressed in toques and scarves. Merchants and homeowners raced to get Liberal and NDP signs out of view, as the Tory hordes marched through towns, firing their guns in the air, vandalizing post offices, and looting shops belonging to their opponents.

Perhaps a good test for a prospective journalist would be to examine a pile of putative news stories and pick out the ones that, like this one, are jokes.

Copyright © 2006 by John J. Reilly

Why post old articles?

Who was John J. Reilly?

All of John's posts here

An archive of John's site

Collegiate Learning Assessment Plus

The Wall Street Journal has an article up on the CLA+ test, with an accompanying data set from 68 public colleges obtained through FOIA requests. I put all the data in an Excel-style spreadsheet as well.

I always like to plot my data, so here is a scatterplot matrix of the whole thing:

CLA+ Scatterplot Matrix

There are several columns that are just different ways of saying the same thing, like Freshman Score and Freshmen with Below Basic Skills. So I dropped everything that was just the same data in a different format and we get this:

Subset of CLA+ Scatterplot Matrix
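
For anyone who wants to poke at the data themselves, here is a minimal sketch of how plots like these can be made with pandas and matplotlib. The filename cla_plus.xlsx and the column names are hypothetical stand-ins, not the actual headers in the WSJ spreadsheet.

# Minimal sketch, not the exact code behind the figures above.
# Assumptions: the data set is saved locally as "cla_plus.xlsx" (hypothetical
# filename) and the column names below are stand-ins for the real headers.
import pandas as pd
import matplotlib.pyplot as plt
from pandas.plotting import scatter_matrix

df = pd.read_excel("cla_plus.xlsx")  # one row per college

# Full scatterplot matrix: every numeric column against every other one.
scatter_matrix(df.select_dtypes("number"), figsize=(10, 10), diagonal="hist")
plt.show()

# Keep one column per distinct quantity, then re-plot the reduced matrix.
subset = df[["Freshman Score", "Senior Score", "Score Difference", "Graduation Rate"]]
print(subset.corr())  # pairwise correlations, to back up the eyeball test
scatter_matrix(subset, figsize=(8, 8), diagonal="hist")
plt.show()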

Just about the only scatterplot that stands out to me is that higher freshman scores are pretty correlated with higher senior scores. I never would have guessed.

The next most interesting is the relationship between freshman score and the difference between freshmen and senior scores. The correlation is negative, perhaps implying there is a score ceiling in the test, or that average college graduates tend to end up in about the same place by the end of school.

Both freshman and senior scores are correlated with graduation rates. Since we are supposed to be using this data to see whether a given college does anything useful, I plotted both freshman and senior scores against graduation rates, color-coding the points by the improvement from freshman to senior scores.

CLA+ relationship between senior score and graduation rates, color coded by score improvement

CLA+ relationship between freshman score and graduation rates, color coded by score improvement
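
The color-coded plots can be sketched the same way. Again the column names are placeholders, and df is the frame loaded in the earlier snippet.

# Sketch of the color-coded scatter plots above (hypothetical column names).
# Points are shaded by the freshman-to-senior score improvement.
fig, axes = plt.subplots(1, 2, figsize=(12, 5), sharey=True)
for ax, col in zip(axes, ["Senior Score", "Freshman Score"]):
    points = ax.scatter(df[col], df["Graduation Rate"],
                        c=df["Score Difference"], cmap="viridis")
    ax.set_xlabel(col)
axes[0].set_ylabel("Graduation Rate")
fig.colorbar(points, ax=list(axes), label="Senior minus Freshman Score")
plt.show()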

The color-coding looks random on the senior scores graph, but against freshman scores the highest improvements are concentrated at the lower-left. This might be interesting, since point difference versus graduation rates in general looks pretty random in the first scatterplot matrix.

I don't see anything groundbreaking here, which is probably why colleges don't talk about this much. If there were something to crow about, they would.

The Big U Book Review

NOT American Megaversity

I picked up The Big U while I was organizing my library, and I decided to see if I still liked it ten years [at least] since the last time I had read it.

It turns out, I do! For me, this is the perfect college satire, on the same level as Thank You For Smoking or Office Space. I read it when I was an undergraduate, and it was hilarious, a devastating send-up of the bizarre world that is the American university. Ten years later, it is still hilarious and devastating. Then I flip to the flyleaf, and I find Stephenson wrote it in 1984.

Stephenson nailed the essence of university life in a way that is still relevant thirty years later. The LARPers. The Goddess worshippers. The terrible cafeteria food. The out of control parties. This is the American university, in all of its glory. American universities have long been at the center of the culture war, fostering, even encouraging, a hothouse culture in which the strangest things can flourish. Add to that a culture that has been intellectually static for the last hundred years, a guaranteed fresh supply of naive teenagers, and you will get a system that loops through the same obsessions, over and over and over.

In the introductory chapter, Stephenson's narrator says:

What you are about to read here is not an aberration: it can happen in your local university too. The Big U, simply, was a few years ahead of the rest.

This turns out to have been prophetic. In the Big U, we have all of the current obsessions of trendy politics. Rape culture. Identity politics. Minoritarianism. Endless curricular disputes. Weird religions. There are few things in the book so outrageous that they have not managed to happen in the last thirty years. It is all so ridiculous, and all so pertinent. I liked it the first time because it seemed very much like my alma mater. I like it now because it seems like all the universities in America. If anything, my own university has only grown more like American Megaversity with the passage of time.

It is fortunate this is a book and not a movie, because it prevents you from seeing out of date clothes and assuming everything in the book happened in the past. With a few minor changes, The Big U could easily be set today. The Stalinist Underground Battalion would have to be replaced with Occupy Wall Street, smart phones would have to be added in, and the university mainframe would have to be replaced with the web, but everything else could stay the same. 

The first time I read this book, I was attracted to the commonalities to my own life. The character who was a budding physicist. The genius programmers. The awkward fit of so many of the viewpoint characters to the dominant party scene. Even the bit with the university locksmith [in college, I worked as a student locksmith for the university]. It just seemed to fit.

Ten years later, there are a few things I appreciate more now than I did the first time. The cynical university president is someone I can now identify with. The Big U administration made poor choices, but now that I have actual responsibility, I appreciate the heroic virtue that would be required to resist those temptations. S. S. Krupp is bright, decisive, and capable. His only flaw is putting the university's reputation [and lots of jobs] ahead of doing the right thing. I am glad I don't face the same choices, because it is hard to see how I could realistically do better in the same circumstances.

The sexual dynamic that drives many of the viewpoint characters is far more obvious in retrospect, especially if you were a nerd [nerds being, I presume, Stephenson's target audience]. Teenagers are driven by their hormones in strange ways, nerdy teenagers even more so, and those of us who have survived that phase can only pity them. This too shall pass.

Of all Stephenson's books, this is the one I like best. The first Neal Stephenson book I ever read was Snow Crash. Snow Crash was recommended to me by my freshman-year college roommate, and I liked it enough to try more, although I'm not sure its many fans realize it is a dystopia. The Big U was the second. I really liked The Big U, so I tried a number of Stephenson's other books, but I never really enjoyed them. Stephenson wrote Zodiac when it seemed like dioxin was the worst thing ever made by humans. By the time I read it, the evidence was a little more mixed, so I had trouble taking the plot seriously. I couldn't get through even the first volume of the Baroque Cycle. Maybe this one was a fluke.

I choose to see it as a stroke of genius. Maybe this book couldn't have been written seriously or intentionally, because we are all too identified with sides in the ongoing culture war that rages in the universities. Stephenson now sides pretty clearly with the left-libertarians, but in this book maybe he hadn't quite found his voice, because even characters on the wrong side seem sympathetic, despite some salvos in favor of his clear favorites. As Lincoln and C. S. Lewis argued in their distinctive ways, the sides we are on, and the sides that are really in the right, may not necessarily turn out to be the same.

My other book reviews

Bryan Caplan on the Signaling Theory of College

Bryan Caplan is a popular blogger and economist at George Mason University. Caplan was recently interviewed on EconTalk about the value of a college education. Short version: a college education doesn't have much intrinsic value. I'm simplifying a bit, but only a bit. Caplan argues that higher education is more important for sorting out the smart and hard-working from the rest than in teaching anything specific.  Of course, I might very well say that. I skipped out on grad school and went into the workforce precisely because I was convinced that more school wouldn't make me any smarter, or teach me anything useful. Of course, I also hated academia.

The signaling theory of education is nothing new to me. I've certainly pooh-poohed American higher education on multiple occasions here. However, it is easy to go too far. Caplan is too careful to say one learns nothing in college. What he is saying is that overall, and for the most part, specific skills are less important than intelligence, the capacity to work hard, and a willingness to play by the rules. These are the things college selects for. These are also correlates of success in America and similar societies.

On the other hand, I can certainly point you to plenty of disgruntled college graduates who cannot readily find work because they have the wrong degree. So there is a sense in which college functions in the signaling fashion that Caplan posits, but there is another sense in which employers, particularly in the much-vaunted STEM fields, really do expect college graduates to know things, very specific things. Personal experience suggests to me that this tendency is perhaps somewhat stronger than necessary to ensure competence, but one needs to understand the difference between ability and skill, or potentia and actualia. Since Caplan is an economist, perhaps he can look at the opportunity cost of hiring an able but unskilled graduate.

[Caplan] The human capital story says that you go to school; they actually teach you a bunch of useful jobs skills; you then finish and the labor market rewards you because you are now able to do more stuff. The signaling model says, no, no, no, no; that's not what's going on. What's going on is that people go to school; they don't actually learn a lot of useful stuff; however, the whole educational process filters out the people who wouldn't have been very good workers. So people who are lower intelligence, lower in work ethic, lower in conformity--those people tend to not do very well in school. They drop out. They get bad grades. And that's why the labor market cares. It's not that the school actually transforms you to a good worker from a bad worker. It's that the schooling, the school puts a little sticker on your head--you know, Grade A student, Grade B student, Grade C student.

...

[Interviewer] I have a natural skepticism about it. And I think a lot of labor economists do as well. And the reason is that it's an extremely expensive signal. So, you are saying, for 4 years, I give up the chance to work; I pay this tuition, whether it's $5000 or $10,000, or $30,000, or $40,000--at a private university. And for that enormous amount of money, I prove that I am a good worker and I get a sticker on my head. Wouldn't there be an easier, cheaper way to get the sticker? If all it's doing is measuring ability, this 4-year slog that's extremely expensive? That's the best way that people have come up with to get the sticker?

...

[Caplan] Yes, it's an arms race. And the fact, if it is a fact, the private return is high is really a very bad argument for pouring more money on. Now, the other point, as we were saying, the return that you should be looking at in terms of this argument of not being able to borrow against your future earnings--what you are looking at is return for the marginal people who are just on the edge of going or not going. And as we've seen, the return for those people is actually not, is actually quite mediocre. And then finally if you adjust for ability and everything else, really I would say that once you appreciate signaling you realize that, so we have subsidized education way past the point of [?] returns. So by my calculations, actually, the social return to education is now quite negative. And it would be a much better policy to drastically scale it back, so rather than encouraging more people to go, I think it's better to discourage them from going or at least to encourage them less. So in fact--so, the biggest policy implication that's going to come out of my book is we just have way too much education. I call this the white elephant in the room. There are way too many people going to school, maybe not from their own selfish point of view, but certainly from a social point of view to go and pour more money on this really is just throwing gasoline on the fire. And we need to do less of it.

h/t DarwinCatholic

Taking me to task

The Family Social Scientist takes me to task in the comments on my post on Decision Fatigue:

while this is an interesting quirk of the data and warrants further exploration, its hardly conclusive . To say that the marshmallow test is "known to predict future success in life" is a little misleading and perhaps a misinterpretation of the results.

Touché. I really cannot complain that the FSS is calling me out in the same way that I do to others all the time. It is indeed perilous to try to glean science out of popular articles. I am a rank amateur in the field, and I know how that seems to an expert because I have not always had the grace to deal lightly with those who have trespassed into my own technical specialty.

Yet, nevertheless, I will persist on this topic, despite the many landmines, because it is absolutely fascinating to me. The FSS brings up some really good points in his comment that need to be considered:

grades are hardly a proper measurement of academic progress and intelligence

Quite true. This was arguably less true in the past, but grades, both high school and undergraduate, have a pretty loose connection with both academic progress and intelligence. This is actually one of the things that first attracted me to psychometrics, because it gave me the mental tools to understand why grades aren't a measure of intelligence. 

By way of example, consider this work by Steve Hsu and Jim Schombert on college GPA and SAT scores. Hsu and Schombert explain some of the complexities their work demonstrated in an interview:

“Freshman GPA is not a satisfactory metric of academic success,” Hsu explains. “There is simply too much variation in the difficulty of courses taken by freshmen.” More able freshmen typically take more difficult courses, whereas less able freshmen take introductory courses “not very different from high school classes,” he says. Under these circumstances, academic success—an “A” in an introductory course versus a “B” in an advanced course—becomes too relative to accurately measure. Course variation decreases in later years, as students settle into their respective majors, working hard in required classes.


The new approach bore fruit: SAT and ACT scores, their analysis showed, predict upper-level much better than lower-level college grades, “a significant and entirely new result,” Schombert says. 

Hsu and Schombert are now working on including personality inventories in this assessment to see whether they can improve their model. As a guess, conscientiousness will probably be a big hitter. But there is a difficulty here: how do you measure conscientiousness? The short answer is: we don't know how. The longer answer is that we try various techniques to quantify a quality, such as personality inventories or the marshmallow test. Personality inventories are easy, but they are also easy to game. If you know what the questions are getting at, you can manufacture any result you want. The marshmallow test, and the ice bath test, are a little better in this respect because they push up against a hard limit that we hope is correlated with the thing we are interested in. Thus, even if you knew that holding your hand in the water was going to be used to judge your mental toughness, the test would still work, because your ability to endure unpleasantness for a positive social judgment is exactly what it is measuring.

This is also related to why grades aren't the best predictor either: the system is easy to game. In college admissions, this is part of the reason grades have become de-emphasized. Good grades in high school aren't by themselves a good predictor of doing well in college, but if you factor in participation in sports and other extracurriculars, you can get a rough estimate of a student's ability to stick something through and their ability to manage competing priorities. This can be gamed too, as Amy Chua demonstrates, but if you can successfully game this system, it means you are probably smart and likely to be wealthy, which is something colleges want anyway. 

An American Professor Weighs in on the Academic Process

Via John D Cook, Matt Welsh explains why he left Harvard for Google:

There is one simple reason that I'm leaving academia: I simply love work I'm doing at Google. I get to hack all day, working on problems that are orders of magnitude larger and more interesting than I can work on at any university. That is really hard to beat, and is worth more to me than having "Prof." in front of my name, or a big office, or even permanent employment. In many ways, working at Google is realizing the dream I've had of building big systems my entire career.

A big reason for this is the amount of time academics spend doing things that aren't teaching or research:

The biggest surprise is how much time I have to spend getting funding for my research. Although it varies a lot, I guess that I spent about 40% of my time chasing after funding, either directly (writing grant proposals) or indirectly (visiting companies, giving talks, building relationships). It is a huge investment of time that does not always contribute directly to your research agenda -- just something you have to do to keep the wheels turning.

There are good reasons to be an academic, but it is not for everyone. 

Academic Ponzi Scheme

Here is another take on the pyramid scheme that is higher education today, this time from England.

Why I am not a Professor

Teaching was not the only criterion of assessment.  Research was another and, from the point of view of getting promotion, more important.  Teaching being increasingly dreadful, research was both an escape ladder away from the coal face and a means of securing a raise. The mandarins in charge of education decreed that research was to be assessed, and that meant counting things. Quite what things and how wasn't too clear, but the general answer was that the more you wrote, the better you were. So lecturers began scribbling with the frenetic intensity of battery hens on overtime, producing paper after paper, challenging increasingly harassed librarians to find the space for them.  New journals and conferences blossomed and conference hopping became a means to self-promotion. Little matter if your effort was read only by you and your mates. It was there and it counted.  

Today this ideology is totally dominant all over the world, including North America.  You can routinely find lecturers with more than a hundred published papers and you marvel at these paradigms of human creativity.  These are people, you think, who are fit to challenge Mozart who wrote a hundred pieces or more of music.  And then you get puzzled that, in this modern world, there should be so many Mozarts - almost one for every department.  

When everyone is Mozart, no one is.

Critical Thinking

Browsing through the comments on the last post I linked to, I was struck by the first one:

Based on my grad and undergrad at elite schools (and classes taken at an excellent community college and excellent regional university while I was in high school), my belief was that college taught you how to think instead of just how to memorize. You would think like a mathematician or an economist or an english major or an artist etc. You would approach problems differently, but systematically and in a similar way to other folks in your major.

Now that I teach graduate students at a solid but not elite school, I understand degree creep and why so many employers favor the masters degree over the BA these days. A lot of what I do is teaching students to think rather than to just memorize facts. The first semester is major cognitive dissonance for most (but not all) of them, but they come out critical thinkers.

So prior to teaching I would have said a college degree teaches and demonstrates thinking skills, but as a teacher I realize that many 4 year schools are just 4 more years of high school.

This is bullshit. Or bollocks if you prefer.

How do I know this is bullshit? You would be hard pressed to find a school in the United States at any level from primary to tertiary that requires much in the way of memorization. All educational theory is against memorization, and all educators at all levels are constantly trying to find ways to encourage their students to understand concepts instead of memorizing facts. There is always some memorization to rail against, because memorization is a key part of education and cannot be expunged, but it is deemphasized at all times and in all places.

Don't believe me? If you are under the age of forty, then name all fifty state capitals. Can you do math in your head? Can you recite the entirety of the Gettysburg Address from memory? Can you recite any poem from memory? What are the principal exports of Malaysia? Some people of about my age or a little older can do these things, probably the nerdy ones. Elite education is still good. However, this ability used to be more widespread.

It is notable that this fantasy is contrasted with the sine qua non of modern American education: critical thinking. Critical thinking cannot be taught. Well, let me correct myself. The ideal of critical thinking cannot be taught. The simulacrum that most people actually refer to is taught quite well. 

Almost everyone that I have ever heard use the phrase does not actually mean that they expect students to be able to form independent judgments by carefully weighing evidence, and to reflect upon their own biases before making a decision. If you asked, most would claim this is what they are after, but if you look and see what criterion is used to determine success you will find that it is "the student forms the same opinions I have". 

This is precisely wrong. If you succeed at teaching critical thinking, then you should expect your students to disagree with you and with each other.  Reality is underdetermined by the facts we have available to us. To think that everyone would agree with you if only they could think straight is the sin of Rawls.

Most students do not learn critical thinking, but rather learn the opinions of their teachers. This is normal. This is why everyone cares about the character of teachers. As C. S. Lewis noted in The Abolition of Man, education is fundamentally like the process by which grown birds teach young birds how to fly. It is a kind of propagation. Your mind is passed down to your students. 

What a true university education is capable of teaching is analytical thinking. The ability to construct (or deconstruct) an argument. The ability to reason abstractly, using words or numbers. Enough detailed knowledge of the world to use as premises for further reasoning. This is not the same as the rarer critical thinking. One can be capable of forming good judgments without formal education, and one can also be a great fool despite extensive education. I would be willing to believe that a good education can foster good judgment, but it is far from a sure thing.

Generic Degrees

From Confessions of a Community College Dean:

Ask the Administrator: The Generic Degree

An occasional correspondent writes:

Some jobs out there are advertised as requiring a college degree, but the employers don't care what was actually studied. So these employers are in effect using college as a four-year hundred-thousand-dollar screening test, with perhaps a bit of intellectual calisthenics for good measure.

I had a chance to discuss this with a supervisor at one of the management consulting companies, and he confirmed this is in fact their policy. I suggested that since they don't care about any specific knowledge -- only smarts and the willingness to work hard -- they should be open to hiring people right out of high school. Some high-school students can point to significant intellectual accomplishments, after all. But no, this is Just Not Done.

A four-year degree seems like a very expensive way of doing intellectual quality control. Could we do better?



I hate to admit it, but there’s some truth to this.

I saw this quite a bit at PU, where some older students were already well into their careers and doing well there, but they needed their hands stamped in order to move up to the next level. They didn’t care much about the actual content of it; the point was to become eligible for management ranks. I took it as a personal victory when one of those students actually found value in a class I taught.

At an individual level, this can be kind of silly. Certainly it’s possible to be brilliant (or better, wise) without a degree, and to be bovine with one. And it’s also true that some jobs that stipulate college degrees don’t really draw on the skills that a degree is supposed to confer, whatever the major. Degree factories exist for that very reason.

That said, though, I like to think that a bachelor’s degree from a real college -- as opposed to a degree factory -- carries some meaning.

At one level, it shows the ability and willingness to stick to a program. Given the prevalence of college dropouts, those who actually finish have at least shown the ability to get their stuff together sufficiently to fulfill a multiyear commitment. (Along similar lines, students who transfer from cc’s with associate’s degrees tend to finish bachelor’s degrees at far higher rates than those who transfer with scattered credits. The graduates are those who finish what they start.) It shows the ability to navigate a bureaucracy, which is an essential workplace skill for most of the higher-paying jobs.

If the college is at least halfway serious, a degree should indicate some ability to handle complexity, to communicate at least functionally, and keep one’s balance when dealing with numbers. One of my personal indices for wisdom is the ability to handle ambiguity. Clueless people can be trained to do almost anything routine; the real test comes when the routine has to change. Some of that is temperamental, but some has to do with the ability to discern the bigger picture.

The actual content of the degree is another issue. I don’t often draw on my study of Restoration England, but I do draw on some of the skills developed through it. My social science training enabled me to stay awake and attempt to wring meaning out of long, boring, poorly-written texts; on the job, I use that skill every single day.

This is the kind of thing Charles Murray was talking about in his book Real Education. The higher education system has changed over time to meet the needs of the marketplace, but for reasons of educational romanticism has retained the dress and language of an earlier dispensation. This change isn't necessarily bad, but we should understand it for what it truly is.

The most persistent misunderstanding, here shown by the questioner's comment about "a very expensive way of doing intellectual quality control", is that sending masses of people to college has anything whatsoever to do with the life of the intellect. As Dean Dad correctly notes, today college serves as a filter for high conscientiousness. Whether college imparts more conscientiousness or simply sorts people by it is a fair question, but it is clear that those who successfully complete a four-year degree have a better work ethic, and a better ability to finish what they start and stay organized, than those who have not completed such a course. Thus it is entirely rational for businesses to sort applicants in this fashion.

Whether it is best for everyone is another matter entirely.

Physics for Future Presidents

Physics for Future Presidents is a course offered by Richard Muller at Berkeley. The course is intended primarily for liberal arts types, but Muller opened the course up to physics majors because the subject matter is not typically covered in the required courses.

This course really is amazing: it covers the essential science to understand modern technology, especially the technology that is likely to make a big impact on our lives, and is interesting besides. Nuclear power. Spy satellites. Low earth orbits. Moore's Law.

The objective of the course is understanding, not computation. An entirely reasonable goal. As Muller says, if you really need to calculate something physical, you hire a physicist (or an engineer). However, you need to be able to understand the science well enough to make a good judgment about what you are given, and what to do with it. Something along these lines would be much more effective than forcing all college students to take lab science, because this class effectively focuses on the big picture. Learning how to titrate in University Chemistry I does not necessarily make for scientific insight or good judgment. Learning how to compute a surface integral for Gauss' Law is really no better for that purpose (albeit fun).

That being said, I don't know how well this course would work on the general student population at, say, Northern Arizona University (my alma mater). Jokes about liberal arts majors aside, anyone who gets into Berkeley is pretty damn smart. Students at a state college are much closer to the population mean in intellectual ability, so this course may be too hard for them as is. However, I think there is something to this. The science courses offered at most universities really are more like vocational training for scientists than the building blocks of a liberal education. The unelected elites that Charles Murray talks about really do need to understand science, but good judgment is more important than the calculus.

If you ever wanted to know more about science, but hated the classes, or couldn't hack the math, buy the popular version of the book, or watch the lectures on YouTube. It would be better if more people understood science in this fashion.