Education

I’m planning to do some serious heavy reading, but I’m torn between several choices of what to study next.

After reading Radley Balko’s article using a police shooting video to illustrate the faulty nature of human memory, stumbling across Nathaniel Burney’s comic book explanation of the neuroscience behind faulty memory, and seeing a Reason interview about evolutionary psychology, I’d like to learn more about the science of cognition. I’ve read a few popular books on the subject over the years, and I think I’m ready for something more serious, like a college textbook. I’m not sure whether I want to focus on cognitive psychology, which explores what our minds do, or cognitive neuroscience, which explores the underlying neurological mechanisms. Textbooks are expensive, so I need to settle that question before buying anything.

Meanwhile, I’m trying to take my amateur economic studies in the direction of public choice theory, and toward that end I’ve been meaning to read Ronald Coase’s The Firm, the Market, and the Law. Coase was incredibly influential, and he changed the way economists think about transaction costs and social costs. Normally I don’t try to read the great works in a field, because great thinkers aren’t necessarily great explainers — nobody learns physics from Newton’s Philosophiæ Naturalis Principia Mathematica or evolution from Darwin’s Origin of Species. However, I’ve heard that The Firm, the Market, and the Law is quite readable for someone with a little basic knowledge of microeconomics.

On the other hand, I think I’d like to learn more about behavioral economics. The traditional microeconomic model of human decision making assumes that people are rational utility maximizers, meaning that they want to be happy, and that they work toward that goal as efficiently as possible in the face of scarce resources and uncertain outcomes. This absurdly simple model (a.k.a. homo economicus) is the economists’ equivalent of the physicist’s assumption that everything is spherical and frictionless — it’s not true, but it still gets you pretty far. Recently, economists have started trying to predict economic decision making using more sophisticated behavioral models that assume people make systematic errors that deviate from rational utility maximization. It’s an obvious point, but one that is hard to analyze rigorously. Again, I’m looking for some textbook-level reading in this area. I think there are a couple of books that might work.

Finally, I think I’d like to do some reading about attempts to apply economic thinking to crime and punishment. Economists are pretty good at explaining some types of criminal activity — markets in illegal goods and services work a lot like any other markets — but not so good at addressing issues like the deterrent effects of punishment. Economists are pretty certain that criminals must respond to incentives like everyone else, but a lot of people who work with criminals are equally certain that punishment is rarely a deterrent. Some of the new behavioral economic models may resolve the discrepancy, so I’d like to learn more about what the scientific literature says. This is not a well-defined field, however, which probably means I’d have to read a lot of primary sources, and I’m not sure I want to get into it that much.

I’ll figure something out. The world is a fascinating place.

The Volokh Conspiracy blog has finally made the move behind the Washington Post paywall, and that led to an interesting comment on Twitter by conspirator Orin Kerr about the change in audience from being an independent blog to being part of a major media outlet:

I think we’ve gained some and lost some. I’m worried we lost the law nerds and gained general interest readers.

As Roger Ford adds,

Skimming down the site now, it sort of reads like “Eugene Volokh explains the legal news for lay readers.”

The original Volokh Conspiracy site had long been a source of intelligent discussions about legal theory, but with its new larger and more varied readership, it has apparently become less focused. Scott Greenfield at Simple Justice explained the change in more detail:

Forgive me for digressing, but my thoughts are best expressed with some context. While VC historically highlighted legal scholarship from a somewhat conservative libertarian perspective, it did so with a touch of realism, in connection to real world events, that made it relevant to what practicing lawyers do, as well as judges who decide such matters. VC was the nexus between theory and practice.

SJ is written from the criminal defense lawyer perspective, which meant that it tended to be too rough and vulgar for academics. From my perspective, the critical audience was fellow CDLs; that others, from lawprofs to civil lawyers to non-lawyers, didn’t really matter.  To the extent I was concerned about other people’s views, it was the views of my colleagues, my brethren.

That VC has abandoned its effort to connect academic theory, even with its libertarian tilt, with real world practice, and instead sees its future as persuading the groundlings to embrace its theories, makes no sense to me at all.

Does that mean the ridiculous drivel dished out by Paul Cassell will be the norm?  Does that mean Eugene will no longer offer First Amendment analysis of any depth?  Does that mean Orin will only use small words and abandon trying to explain the mosaic theory?

That’s a common area of tension that shows up in many fields, including the sciences: There are people who are important in their field, and there are people who are experts at explaining their field. There’s not much overlap.

Some of the explainers achieve a degree of fame, but when you look at their scientific work, they usually haven’t made major contributions to their fields. Carl Sagan was not one of the world’s greatest astronomers, and Neil deGrasse Tyson is not one of the great astrophysicists. Much the same can be said of Richard Dawkins and Steven Pinker in their fields, and Bill Nye the Science Guy is more of an engineer and inventor than a scientist.

I’m not saying these people are idiots or fakes. I’m sure they all do their jobs very well, most of them have contributed something original to their fields, and all of them, by definition, are good at some kind of science education. Nevertheless, in the opinion of the other experts in their fields, they usually aren’t among the top people.

The real experts are rarely well known to the public. Except for major historic figures like Isaac Newton or Charles Darwin, most of us wouldn’t recognize the names of important research scientists unless they have things named after them, like Heinrich Hertz and Alessandro Volta, or they have entered popular culture, like Erwin Schrödinger, known for his cat, and Werner Heisenberg, known for his uncertainty principle (and now also for cooking crystal meth). In their time, however, they weren’t well known to the public.

(Because the major contributors to scientific fields are generally not known to the public, I’m pretty much guaranteed to have characterized someone as an explainer rather than a major contributor because I am unaware of their important contributions to a field other than my own. Sorry.)

By way of example, my background is in computer science, and I think I can come up with a few very important contributors to computer science whom you’ve probably never heard of, such as Edsger Dijkstra, Donald Knuth, C.A.R. Hoare, Fred Brooks, Grace Hopper, and Niklaus Wirth. You probably know Noam Chomsky, but for his politics rather than for his influence on computer science, and everyone seems to have heard of the Turing Test for artificial intelligence, but that was not Alan Turing’s most important contribution to computer science.

The division between contributors and explainers often occurs within academia, in the split between teaching and research. Economist Steven Landsburg illustrated the difference by analogy to a cocktail party with two groups of people: the researchers are the group in the center, talking to each other about all the interesting things they do, whereas the educators are all standing around the edges, talking about what the group in the center has been up to. (I may have mangled this a bit.)

Landsburg asked readers which group they’d rather talk to: the interesting people in the center or the people at the edges who talk about what the folks in the center are doing. To him, the obvious answer was that you’d want to talk to the people in the center, and that’s why students are better off joining academic departments that do research.

I think that misses an important point: Talking to the group in the center is only the best choice if you can understand what the people in the center are talking about. A student new to the field is unlikely to benefit from discussions that assume half a decade of education in the field. More to the point, there’s a difference between understanding complicated subjects, and knowing how to break down complicated subjects into simplified component bits of knowledge that can be taught to students.

One of my introductory calculus classes was taught by a professor who was one of the most important researchers in the math department. It was a terrible class. I have no doubt he understood the subject, but he had no idea of what it was like to not understand calculus, and he was consequently incapable of explaining it to us. Rather than using carefully crafted examples to illustrate how calculus works, he would make up ad hoc problems that required us to spend a lot of time thinking about ancillary issues. The homework problems would be straight out of the lesson plan, which was not always what he had been teaching us. The disconnect was especially bad on the test questions — I’m convinced that some of them required us to know things he didn’t realize he hadn’t taught us yet.

There’s also the question of whether the people in the center will be willing to talk to people who know very little about the subject. After all, they also want to learn the cool new stuff, and that means they have little time for newcomers who can teach them nothing interesting. Serious research professors are known for having crappy office hours.

Switching back to my own field, software development, as an experienced software engineer, I would probably have trouble figuring out how to teach an introductory course in computer programming. For example, when I approach a programming problem, I might think about many aspects of it at once — algorithmic correctness, efficiency, resource consumption, parallel processing, network traffic, database architecture, interface design, scalability, generalizability, separation of concerns, layering, composability, opportunities for refactoring, testability, and so on. I’m not trying to brag. Those are all things that pretty much any experienced software engineer will keep in mind, and they are things that all developers should learn about.

However, it would be a mistake to try to teach someone computer programming from the ground up by teaching them about all those things at the same time. A good teacher would probably start with some foundational skills such as expressions, control structures, and basic class design before moving on to details of the language and the runtime library and then some of the bigger-picture organizational concepts.
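
Just to make “foundational skills” concrete, here’s the kind of toy example a first course might build toward. The example (and the choice of Python) is mine and purely illustrative — it touches expressions, control structures, and a tiny class, and none of the production-scale concerns above:

```python
# A beginner-level sketch: expressions, control structures, and a very small class.

class Student:
    """A tiny class: some data plus one method."""

    def __init__(self, name, scores):
        self.name = name
        self.scores = scores

    def average(self):
        # An expression: the total divided by the count.
        return sum(self.scores) / len(self.scores)


students = [Student("Ada", [90, 85, 92]), Student("Grace", [70, 75, 80])]

for student in students:            # a control structure (a loop)...
    if student.average() >= 80:     # ...and another (a conditional)
        print(student.name, "is doing fine")
    else:
        print(student.name, "could use some help")
```

Only after that much is second nature does it make sense to start layering on the architectural and scalability concerns.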

Every field needs both kinds of members — those who do it well, and those who explain it well. Those who do it well sometimes look down on those who teach it, especially since the teachers often lack the detailed knowledge of the practitioners, and they often make mistakes. These errors and omissions are a problem, and they should be corrected, but when it comes to teaching a field of knowledge, a skilled teacher who gets some parts wrong can still impart more information to an audience than a skilled practitioner who knows everything but doesn’t know how to explain it.

For example, in discussing the weather, we refer to the relative humidity of the air. The basic idea is that air has a temperature-dependent maximum capacity for moisture — the higher the temperature, the more water the air can hold — and relative humidity expresses how much water is in the air as a percentage of the maximum theoretical capacity.

This concept explains things like why items in the refrigerator frost over when you leave the door open — the warm room air cools down, which reduces the water carrying capacity below the amount of water already in the air, forcing the excess water vapor to be deposited as “sweat.” This is also why air conditioners always have to drain off water: The suddenly cooled air can’t hold the water and deposits it on the evaporator coils.

This model also explains why we run humidifiers in winter — your furnace warms the air, which increases its water vapor carrying capacity, but your furnace doesn’t actually add water to the air. Since relative humidity is the amount of water vapor in the air divided by the maximum capacity, and only the capacity is increased, your furnace reduces the relative humidity of the air, and it feels too dry. A humidifier adds water vapor to bring the relative humidity back up to comfortable levels.
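
To put rough numbers on the humidifier point, here’s a quick sketch of my own (not from any meteorology text) using the Magnus approximation for saturation vapor pressure, which is the quantity the “carrying capacity” story is really gesturing at. The coefficients vary slightly from source to source, but the arithmetic comes out about the same either way:

```python
# Relative humidity = water vapor actually present / saturation value at that
# temperature. Saturation vapor pressure comes from the Magnus approximation.
import math

def saturation_vapor_pressure(temp_c):
    """Approximate saturation vapor pressure in hPa at temp_c degrees Celsius."""
    return 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))

def relative_humidity(vapor_pressure_hpa, temp_c):
    return 100.0 * vapor_pressure_hpa / saturation_vapor_pressure(temp_c)

# Outdoor winter air: 0 degrees C at 80% relative humidity.
outdoor_vapor = 0.80 * saturation_vapor_pressure(0.0)

# The furnace warms that same air to 21 degrees C without adding any water...
print(f"indoor relative humidity: {relative_humidity(outdoor_vapor, 21.0):.0f}%")
# ...and the relative humidity drops to roughly 20%, which is why it feels parched.
```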

Further, the human body’s cooling system relies on sweat evaporating from the skin to carry off heat, but if the air is already near its carrying capacity, there’s no “room” for the sweat to evaporate, so your body doesn’t cool enough. This is why dry heat is more comfortable than hot and humid weather. It’s also why we set our thermostats warmer in winter than in summer: The heated air is drier, so evaporative cooling makes us feel chilly unless we bump the temperature up a bit.

This “carrying capacity” model of humidity is widely known, it makes sense of a lot of things we observe about the world, and it is routinely taught by school teachers. And yet it is almost completely wrong. The real explanation of what’s going on is considerably more complex and harder to understand, unless you are used to thinking about systems in equilibrium and know some basic physics of gases.

To be sure, the correct explanation is much better. It can be expressed analytically, and you can use it to solve real-world engineering problems, where it will give accurate answers across a broad range of scenarios. And yet most people can get by with the simple but wrong explanation, because it’s good enough. And in this case, good enough is easier to teach.

Understand, I’m not defending teaching people things that are wrong. That’s always a bad outcome, and in fields like medicine or law, it can be dangerous. What I’m saying is that often someone who’s good at explaining things can be a better educator, even if they make some mistakes, than someone who gets everything right, but can’t get it across to anyone else.

And sometimes it’s not so much that they can’t explain it as that they won’t explain it or they don’t have the time to explain it. We can complain about some of the questionable neuroscience in Carl Sagan’s Dragons of Eden, but most real neuroscientists are busy doing real neuroscience, and they don’t have time to answer your questions or write a popular science book. We can poke fun at law professors who reveal their lack of practical knowledge when they go on talk shows, but most experienced trial lawyers are too busy practicing law to answer questions about the latest trial in the news. In both cases, as long as they don’t actively make people stupider, we’re better off with them than without them.

As for the Volokh Conspiracy, I don’t read enough there to follow what’s going on, but if they are changing their target audience from “law nerds” to general interest readers, that’s going to disappoint the nerds, but maybe it will bring a smarter, more rigorous explanation of the law to lay readers.

It’s graduation season again, which means it’s time for the usual round of news stories about controversial graduation speakers and the attempts by protesters to get them disinvited. For example, after protesters at Rutgers got former Secretary of State Condoleezza Rice to back out of giving the commencement address, the normally wonderfully cynical P. J. O’Rourke went into full curmudgeon mode, complaining about the kids these days and extolling Rice’s experience:

…she also served, from 1989 to 1991, as the Soviet expert on the White House National Security Council under President George H. W. Bush.

1989 happens to be when the Berlin Wall fell. I know, I know, most of you weren’t born, and you get your news from TMZ. A wall falling over can’t be as interesting as Beyonce’s sister punching and kicking Jay Z in a New York hotel elevator. But that 1989 moment of “something there is that doesn’t love a wall” (and I’ll bet you a personal karaoke performance of Beyoncé’s “Single Ladies (Put a Ring on It)” that you can’t name the poet who wrote it) had interesting consequences. Stop taking selfies and Google “Berlin Wall” on the iPhones you’re all fiddling with.

Condoleezza Rice was named National Security Adviser in December 2000, less than a year before some horrific events that you may know of. She became Secretary of State in 2005 during an intensely difficult period in American history (which your teach-in was not going to teach you much about).  And she saw the job through to the end of the fraught and divisive George W. Bush presidency, making moral and ethical decisions of such a complex and contradictory nature that they would have baffled Socrates, Plato, and Aristotle (of whom I suppose, perhaps naively, you have heard) put together.

You know what? Nobody gives a shit what Condoleezza Rice would have said at the Rutgers graduation ceremony. You know why? Because it was graduation day. All the exams have been taken, all the grades have been submitted. This material will not be on the test.

(O’Rourke goes on to make fun of Rutgers for being the 69th best-rated university according to U.S. News & World Report — so you know it must be true — and makes fun of a professor named Bell by nicknaming him “Jingle,” thus showing why he gets paid the big bucks while I toil here for free.)

Another of these articles comes from Stephen L. Carter, who has a sneering attack on oversensitive protesters in BloombergView:

In my day, the college campus was a place that celebrated the diversity of ideas. Pure argument was our guide. Staking out an unpopular position was admired — and the admiration, in turn, provided excellent training in the virtues of tolerance on the one hand and, on the other, integrity.

Your generation, I am pleased to say, seems to be doing away with all that. There’s no need for the ritual give and take of serious argument when, in your early 20s, you already know the answers to all questions. How marvelous it must be to realize at so tender an age that you will never, ever change your mind, because you will never, ever encounter disagreement! How I wish I’d had your confidence and fortitude. I could have spared myself many hours of patient reflection and intellectual struggle over the great issues of the day.

Look, if you’re arguing whether they’re right or wrong to protest, then your reflexive defense of free speech is missing the point. It’s just not the right occasion for controversial speakers and the “ritual give and take of serious argument.” I’m all for debate and new ideas, but by graduation day, you’ve kind of missed the window.

(And how is that argument supposed to take place, exactly? It’s been a while since I graduated, but I’m pretty sure there wasn’t a Q & A session.)

When students at Smith College protested a scheduled commencement address by Christine Lagarde, Managing Director of the International Monetary Fund, she graciously withdrew from the event, as college president Kathleen McCartney explains:

I regret to inform you that Christine Lagarde has withdrawn as Smith’s 2014 commencement speaker in the wake of anti-IMF protests from faculty and students, including a few who wrote directly to her. She conveyed to me this weekend that she does not want her presence to detract from the occasion.

“In the last few days,” she wrote, “it has become evident that a number of students and faculty members would not welcome me as a commencement speaker. I respect their views, and I understand the vital importance of academic freedom. However, to preserve the celebratory spirit of commencement day, I believe it is best to withdraw my participation.”

Lagarde understands what the people who scheduled her did not: Students have been working long and hard to get to that ceremony, and it’s supposed to be about them.

On the day I graduated with my Bachelor’s degree, my parents drove down to see me. Neither of them had been to college, so they were proud that they had been able to send me, and I was grateful for their support and encouragement. I was also pretty proud of myself. I had done well and earned a Bachelor of Science degree with High Honors, which at that point in my life was one of the most difficult things I had ever done. I remember showing them around the campus, talking about where I lived and where I did all my studying. They had brought along some friends of mine, and afterwards we went out to dinner at a nice restaurant.

Those are my memories of graduation, and I think I can safely assume that many of my friends have similar memories. And I’ll bet few of us can remember what the guest speaker said. But thank God that whoever planned the ceremony didn’t bring in some controversial lightning rod of a speaker. Our commencement address was given by physicist Leon Lederman. I think he said something about education.

(Dr. Lederman won the 1988 Nobel Prize in Physics for developing the method he and two other physicists used to discover the muon neutrino. He was also the director of Fermilab for ten years. He was one of the biggest advocates of searching for the Higgs boson, and he wrote the best-known book about it, The God Particle. By reading that book, or some textbooks on particle physics, or really even the Wikipedia article on neutrinos or the Higgs boson, you can learn much more about the really important work of Leon Lederman than anybody learned from his commencement address.)

Meanwhile, Stephen Carter had something to say about the Rutgers situation as well:

Then there are your fellows at Rutgers University, who rose up to force the estimable Condoleezza Rice, former secretary of state and national security adviser, to withdraw. The protest was worded with unusual care, citing the war in Iraq and the “torture” practiced by the Central Intelligence Agency. Cleverly omitted was the drone war. This elision allows the protesters to wish away the massive drone war that President Barack Obama’s administration has conducted now for more than five years, with significant loss of innocent life. As for the Iraq war, well, among its early and enthusiastic supporters was — to take a name at random — then-Senator Hillary Clinton. But don’t worry. Consistency in protest requires careful and reflective thought, and that is exactly what we should be avoiding here.

This just proves my point. You invite someone like Condoleezza Rice, and next thing you know, political pundits like Carter are insulting your community and somehow linking your graduation ceremony to an attack on Barack Obama and Hillary Clinton.

It gets worse. After Haverford College students protested against former University of California (Berkeley) chancellor Robert J. Birgeneau, leading to his withdrawal, his replacement speaker decided to make a stink about it:

William G. Bowen, former president of Princeton and a nationally respected higher education leader, called the student protesters’ approach both “immature” and “arrogant” and the subsequent withdrawal of Robert J. Birgeneau, former chancellor of the University of California Berkeley, a “defeat” for the Quaker college and its ideals.

So he insulted Haverford and its students, and it made the newspapers. Awesome job, whoever planned the ceremony. Are you happy with what you’ve done to your graduation this year? Is this what you wanted?

Also, as with O’Rourke, Bowen seems not to understand how commencement works:

“I am disappointed that those who wanted to criticize Birgeneau’s handling of events at Berkeley chose to send him such an intemperate list of ‘demands,’” said Bowen, who led Princeton from 1972 to 1988 and last year received the National Humanities Medal from President Obama. “In my view, they should have encouraged him to come and engage in a genuine discussion, not to come, tail between his legs, to respond to an indictment that a self-chosen jury had reached without hearing counter-arguments.”

Yeah, no question, sending lists of demands is a douchebag thing to do. But how the hell does Bowen think they’re going to “engage in a genuine discussion” during the commencement speech? Does he even hear what he’s saying?

In her letter to the Smith community, President McCartney doesn’t do any better:

I want to underscore this fact: An invitation to speak at a commencement is not an endorsement of all views or policies of an individual or the institution she or he leads. Such a test would preclude virtually anyone in public office or position of influence. Moreover, such a test would seem anathema to our core values of free thought and diversity of opinion. I remain committed to leading a college where differing views can be heard and debated with respect.

Again, “debated”? On graduation day? This is feel-good nonsense. It’s an admirable defense of free speech principles, but if your graduation ceremony has become the subject of a rancorous debate, you’ve kind of already ruined it.

I understand the point McCartney is trying to make. It’s okay to object to speech you don’t like, and it’s okay to speak out and protest against it. But it’s not okay to silence speech you don’t like, and it’s not okay to deprive other people of their right to hear speech you don’t like. Everybody say it with me: The best remedy for bad speech is good speech.

Many colleges and universities are run by people who feel it’s their role to challenge students’ preconceptions and present them with a wide range of viewpoints and opinions. I think those are perfectly valid values for an institution of higher education. It makes a lot of sense to schedule speakers who are unorthodox, who represent unpopular ideas, and who make people uncomfortable.

(Although, if I seem less than completely enthusiastic, it’s because I am annoyed by the amount of importance placed on non-curricular stuff like this. I suppose plenty of people go to college to “have experiences” and “encounter other ways of being” or whatever. But I went to school to learn shit. My degree was in Computer Science, and I spent all my time learning algorithms and data structures and the discrete mathematical structures that underlie so much of computing. I learned computer graphics and databases and operating systems. I spent long hours learning computer architecture and the deep mysteries of compilers, and as hard as some of it was, it was also absolutely fascinating. If you’re really interested in the subjects you’re studying, there are worlds to explore.)

So if you want college to present students with controversial speakers, I’m all for it, and to hell with what a bunch of whiny protesters say. But can we please stop pretending that the graduation ceremony is a crucial moment in students’ education? You’ve had four long years to mold their minds and shape their way of understanding the world. You’ve had plenty of time for all the challenging speakers — or better yet, challenging classes — you could possibly want. If you haven’t done the job right by graduation day, it’s too late.

And if you have done the job right, what’s the point of having a controversial speaker? What more good could it possibly do? The students have done everything you’ve asked for four years and now they just want to celebrate with their friends and families. After all they’ve been through to get there, making many of them sit and listen to someone you know they’ll find offensive is kind of a dick move.

(By the way, you may notice something missing from all these articles complaining about protests against commencement speakers: quotations. Condoleezza Rice, for example, has about a dozen honorary doctorates, so you know she’s given commencement addresses before, yet for all that P. J. O’Rourke extols her virtues and accomplishments, he never quite gets around to giving any examples of the awesome things she’s said at any of her other speeches.)

Finally, if there are protests and your speaker backs down, it’s only going to draw the kind of ugly media attention that Haverford, Smith, and Rutgers have been getting. All you will have accomplished is making your college look stupid and marring the day for your graduating students. Again, it’s wrong that the protesters are able to prevent someone from being heard. But it was a predictable consequence that could have been avoided if you had kept the students in mind and chosen a speaker who would complement the occasion instead of dominating it.

Over at Ethics Alarms, Jack Marshall writes:

Most of all, I do not understand the persistence of the myth that a college education can, does, or should qualify a graduate for good job, when it appears that a large percentage of students, if not a majority, leave the campus unable to write, think, or name the men on Mount Rushmore.

Mount Rushmore? That’s old media…

Seriously, though, in the context of qualifying for a job, what does knowing the faces on Mount Rushmore have to do with anything? Still, Marshall’s got a point about the mixed-up priorities of some universities. Read the whole thing.

Christie Wilcox writes one of my favorite blogs, Observations of a Nerd, and is hoping to win a $10,000 scholarship for her graduate studies. She’s an excellent blogger and scientist. Over the past week she had some great articles on evolution which you should check out.

Her competition looks lame, yet she was running behind in the polls when I voted for her. Please give her a hand and vote for Christie Wilcox! (Consider it practice for next Tuesday.)

I really enjoy studying history. I’ve moved around time and the globe, delving deeper into Mycenaean influences beyond Greece and studying the nuances of one particular commander who has been maligned for his role in the Battle of Gettysburg. Every time I think I might find some historical event boring, I run into some fascinating element that grabs my attention.

I would never consider myself a professional historian. I would certainly never claim that I was capable of writing a history book for use in a public school.

Just for the fun of it, however, let’s say I did write an elementary school history book. No school board would be stupid enough to buy it, right? I suppose that would depend upon what the school board wanted history to be.

In Virginia, as in most of the country, school board elections are popularity contests that have virtually nothing to do with academics. They were so eager to rewrite the history of the US Civil War that they adopted a school history book written not by a historian, but by Joy Masoff, who wrote the “correct” history and backed it up with links to something she happened to run across on a website. She did no fact checking. She didn’t look into the claims to see where the sites got their “facts”. It’s on the Internet, so it must be true!* (If a student of mine tried something like that on a term paper, I would have them rewrite it.)

Then the school board, which had managed to find the history it happened to like, didn’t bother to run the book past any actual historians. After all, it’s written in a book, so it must be true!

Since it looks like I’ll have some spare time away from Windy Investments, perhaps I should write history books. The first step will be finding out what some school board would like to hear. The rest is easy: just Google for the information and quote the first source I find.
———————-
* The Internet is a poor to fair resource for scientific research, but has been getting better. Google Scholar allows me to find materials that, just a few years ago, would have required out-of-state trips to university libraries. No one in their right mind, however, would take a basic Google search and assume that all of the results are Gospel.

The science blogging community has been having a good laugh over the past few days about a “scientific” conference being held in South Bend, Indiana (near Notre Dame!) on how Galileo was wrong about the heliocentric solar system. Yup, it’s a conference to discuss and review the science and politics of the geocentric model, the idea that the Earth is at the center of the solar system.

Along with the expected jabs at the whole notion, a study showing that only 79% of Americans believe the Earth travels around the Sun is often cited. Comments about this usually range from simple ridicule of public knowledge to condemnation of science education in the country. Others say the blame shouldn’t be placed on education but on religious institutions instead. I’ll stick with blaming education.

Bear with me as I make a statement that will, at first, seem as if I’m part of the ignorant 21% of America. Geocentrism (the hypothesis that the Sun and all the planets revolve around the Earth) should not necessarily be ridiculed out of hand as a completely silly notion. For someone standing on the Earth and observing the universe, it does an amazing job of explaining what you can see and measure using simple instruments. It is a good scientific theory in that it uses those measurements to construct a hypothetical model of how the solar system works and makes testable predictions that can be observed.

“But what about retrograde motion of the outer planets?” you may ask. Excellent question! Your class participation is duly noted. We can see that, from the perspective of the Earth, at times some planets appear to stop in their orbits around the Earth, move backwards for a bit, then proceed forward once again. Geocentrism can explain that by placing those planets in their own, smaller orbit about a point which itself orbits the Earth. Back when geocentrism was the accepted theory of how the solar system worked, its predictions of future observations were very accurate. The heliocentrists of the day also tried to explain such motion, but their predictions were less accurate. Science, rightly so, considered the Sun-centered model to be wrong. Geocentrism fit the data better and made better predictions.
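
In fact, for the idealized case of circular, coplanar orbits, a deferent-plus-epicycle construction and the Sun-centered picture predict exactly the same apparent direction to an outer planet, which is a big part of why geocentrism held up so well. Here’s a toy sketch of my own — rough periods and radii, nothing like Ptolemy’s actual parameters — that checks the equivalence and finds the retrograde episode:

```python
# Toy model: circular, coplanar orbits with roughly-right periods and radii.
# Not Ptolemy's actual construction -- just enough to show that a deferent
# plus an epicycle predicts the same apparent direction to Mars as the
# Sun-centered model, retrograde loops and all.
import math

R_EARTH, R_MARS = 1.0, 1.52                 # orbital radii, AU (approximate)
W_EARTH = 2 * math.pi / 365.25              # angular speed, radians per day
W_MARS = 2 * math.pi / 687.0
PHASE = math.pi                             # start Mars on the far side of the Sun

def heliocentric_longitude(day):
    """Apparent ecliptic longitude of Mars as seen from a moving Earth."""
    ex, ey = R_EARTH * math.cos(W_EARTH * day), R_EARTH * math.sin(W_EARTH * day)
    mx = R_MARS * math.cos(W_MARS * day + PHASE)
    my = R_MARS * math.sin(W_MARS * day + PHASE)
    return math.atan2(my - ey, mx - ex)

def epicycle_longitude(day):
    """Same thing with Earth fixed at the origin: Mars rides an epicycle of
    radius R_EARTH around a deferent of radius R_MARS centered on Earth."""
    cx = R_MARS * math.cos(W_MARS * day + PHASE)        # deferent
    cy = R_MARS * math.sin(W_MARS * day + PHASE)
    px = cx - R_EARTH * math.cos(W_EARTH * day)         # epicycle
    py = cy - R_EARTH * math.sin(W_EARTH * day)
    return math.atan2(py, px)

# The two constructions agree to floating-point precision.
for day in range(0, 780, 30):
    assert abs(heliocentric_longitude(day) - epicycle_longitude(day)) < 1e-12

# And both show Mars's longitude running backwards near opposition.
previous = heliocentric_longitude(0)
for day in range(1, 780):
    longitude = heliocentric_longitude(day)
    daily_motion = (longitude - previous + math.pi) % (2 * math.pi) - math.pi
    if daily_motion < 0:
        print(f"around day {day}, Mars starts moving backwards against the stars")
        break
    previous = longitude
```

The real historical models were messier than this on both sides, but the gist is the same: from Earth’s vantage point, the two constructions were largely interchangeable until better data arrived.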

Here in Chicago the Adler Planetarium has an amazing display of mechanical models of the solar system. I recall seeing one showing just how an Earth-centered solar system works. If you are ever in Chicago it’s worth taking a look at the collection.

The Sun-centered model made such bad predictions because planetary orbits are not circular. Once Kepler developed the theory that planets sweep out their orbits in ellipses rather than circles, supporters of the now-altered Sun-centered model were able to make predictions just as accurate as supporters of the Earth-centered model. The two theories now had equal evidence to support them. Some hung onto the Earth-centered model since it was better established. Others preferred the Sun-centered model for its simplicity.

There were developments, though, that tipped the balance in favor of the Sun-centered solar system. Ethan Siegel over at Starts With a Bang! explains why and compares the two competing theories. To summarize, the invention of the telescope allowed for new observations. First it was noticed that Jupiter had its own set of moons which obviously orbited Jupiter and not Earth. While this observation discredited the religious notion that everything in the universe revolved around our planet, it was not the nail in the coffin of geocentrism. After all, the outer planets already were thought to revolve around their own central point in an epicycle. Imagining that moons could orbit planets while those systems orbited the Earth was not difficult.

What was difficult to explain, though, was the telescopic observation that Venus had phases and that the phases coincided with an apparent increase or decrease in the observed size of the planet. That observation was repeated by independent astronomers again and again. No geocentrist was able to come up with a plausible, testable model to explain the observation. The heliocentric model, on the other hand, actually predicted such an observation. Science now had the final nail to drive into the geocentric coffin.

The scientific theory that the Earth was at the center of the solar system was still a good theory. You could use the scientific method to make predictions and test those predictions. It was rightly accepted as the best model until previously unavailable observations were made. It was rightly discarded once a different theory better fit the observations while still making good testable predictions. A geocentric solar system model wasn’t silly. It was good science. It just happened to be wrong.

The Sun is at the center of our solar system, with the Earth and the seven other planets revolving about it in nearly perfect ellipses. One in five Americans does not know this. I blame our education system. For an explanation of why, you will need to stay tuned to this channel.

 

The latest micro-storm to hit the legal blogosphere started simply enough with Gideon’s nearly harmless post on “10 things I didn’t learn in law school.” I thought the worst item was #5:

That law review leads to document review. If you want to do real work, take a clinic or something.

That’s a case where the priorities of law school actually hurt your chances in the real world. Everything else was the routine sort of on-the-job stuff that’s really hard to teach in school. Nothing controversial there, or so I thought.

Professor David Papke at the Marquette University Law School would doubtless disagree with me:

With the exception of item #10, I thought the list was cynical to a fault. Too many lawyers have a sad bitterness and mean anti-intellectualism about them. Maybe living in debt and working in the context of hierarchy and bureaucracy produces those attitudes. I wish somehow that lawyers could remember law school as a demanding but enriching academic experience.

Well, they’ll remember it that way if you run your law school right, but I digress.

(To digress some more, trench lawyers like Gideon or Scott or Mark may seem anti-intellectual compared to a tenured university professor, but considering that their jobs routinely involve getting into verbal knife fights, they’re a pretty thoughtful bunch of guys. I think of them like the Doc Holliday character from the movie Tombstone: educated and articulate, but if the need arises, they can put an opponent in the ground.)

Papke gets things going with this comment:

We don’t want law school to be lawyer-training school. When we cave in to demands of that sort from the ABA and assorted study commissions, we actually invite alienation among law students and lawyers. Legal education should appreciate the depth of the legal discourse and explore its rich complexities. It should operate on a graduate-school level and graduate people truly learned in the law.

Scott Greenfield takes issue in a post subtitled “Training Lawyers is Beneath Us”:

Imagine, the dirtiness of a law school teaching law students how to practice law.  Disgusting.  Revolting.  How beneath the dignity of such a distinguished scholar.

I think I understand what Scott means, but I can’t help wondering if he’s expecting too much from law school. Is it even possible for academia to teach the things that Gideon is talking about? Or would a better subtitle for Scott’s post be “Training Lawyers is Beyond Us”?

I don’t know anything about lawyering, so I’m going out on a limb here, but based on my own experience as a Computer Science graduate and software developer, I don’t think universities can teach a lot of practical job skills.

When I got my CS degrees, I learned a lot of foundational computer science like data structures, analysis of algorithms, discrete structures, and language theory. I also learned some more practical subjects such as computer graphics, database design, networking protocols, and a smattering of computer languages.

What I didn’t learn, however, were the practical skills of a working programmer. Things like:

  • Working with a team of engineers, software developers, and contract lawyers to write a 300-page proposal for a $5,000,000 project.
  • Gathering software requirements from end users who aren’t sure what the software should do—or disagree about the requirements they are sure of—but absolutely know it has to be finished by the third quarter.
  • Breaking down a large software project into parts that can be built by team members and then integrated into a working system.
  • The Iron Triangle of project management: Schedule, budget, scope. Pick any two.
  • Creating a directory tree to hold all the parts of a software system and writing scripts to build the whole system on demand (there’s a small sketch of the idea just after this list).
  • Using tools to track and manage bug reports, change requests, and the code itself.
  • Deciding when to freeze requirements, tools, and changes to make a release deadline.
  • Staging code changes into a working production environment.
  • Remembering to keep copies of every development tool you use, so that you can find them all again when the software suddenly needs maintenance five years later.
  • Integrating the latest hot technology into a 250,000-line code base that began life a quarter century ago.
  • Providing support, over the phone, to a $1000/day technician at a customer site nine timezones away.
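
By way of illustration of the “build the whole system on demand” item above: here’s a deliberately tiny, hypothetical sketch. The paths and the cc command are made up, and real build systems are far more involved, but the core idea is just “rebuild whatever is out of date”:

```python
# Toy build script: recompile any C source under src/ whose object file in
# build/ is missing or older than the source, then link everything together.
# All of this (the layout, the cc command) is hypothetical and illustrative.
import subprocess
from pathlib import Path

SRC, BUILD = Path("src"), Path("build")

def needs_rebuild(source: Path, target: Path) -> bool:
    return not target.exists() or target.stat().st_mtime < source.stat().st_mtime

def build_all():
    BUILD.mkdir(exist_ok=True)
    objects = []
    for source in sorted(SRC.rglob("*.c")):
        # Assumes source file names are unique across subdirectories.
        target = BUILD / source.with_suffix(".o").name
        if needs_rebuild(source, target):
            print(f"compiling {source}")
            subprocess.run(["cc", "-c", str(source), "-o", str(target)], check=True)
        objects.append(str(target))
    # Link all the objects into one program.
    subprocess.run(["cc", *objects, "-o", str(BUILD / "app")], check=True)

if __name__ == "__main__":
    build_all()
```

Nobody assigns you that in school; you write it (or something much hairier) because the project won’t ship without it.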

A couple of these items have been added to Computer Science curricula since I was in school, but most of this stuff cannot realistically be taught in a classroom. You learn a lot of it simply by doing it—find a professor who’s running a software project, join a team doing open source development, get a job.

I imagine the same thing is true for law school. There’s lots of stuff it can never teach you, and it’s unrealistic to expect it to do so.

Finally, when Papke writes about the pressure to “cave in to demands of that sort from the ABA and assorted study commissions” it sounds a lot like a problem faced by many Computer Science departments: The companies that hire graduating students want universities to teach them the latest hot technology, whatever it is.

In other words, they want the schools to function as their training department. But there are better ways to meet that kind of short-term need than with a university curriculum.

It is here that I think Scott and Gideon should be careful what they wish for, because if law schools become more responsive to the needs of practicing lawyers, they won’t be responding to the needs of criminal lawyers and other solo practitioners. They’ll be responding to the needs of Biglaw, and they’ll be grinding out students who are experts at document review, business law, and probably, these days, bankruptcy.

By now you may have heard (via Balko, Simple Justice, Moby Kip, or new blawger Bobby Frederick) about the brilliant idea some folks in El Camino, California came up with to teach students the importance of not driving drunk:

Many juniors and seniors were driven to tears – a few to near hysterics – May 26 when a uniformed police officer arrived in several classrooms to notify them that a fellow student had been killed in a drunken-driving accident.

About 10 a.m., students were called to the athletic stadium, where they learned that their classmates had not died. There, a group of seniors, police officers and firefighters staged a startlingly realistic alcohol-induced fatal car crash…

Though the deception left some teens temporarily confused and angry, if it makes even one student think twice before getting behind the wheel of a car while intoxicated, it is worth the price, said California Highway Patrol Officer Eric Newbury, who orchestrates the program at local high schools.

“When someone says to me, ‘Oh, my God, you’re traumatizing my children,’ I’m telling them, ‘No, what I’m doing is waking them up,’ ” said Newbury, whose father was killed by a drunken driver.

What a great idea! I’m sure the students at the school are very grateful for being taught this important lesson, and soon they’ll be looking for a way to repay Officer Newbury for his efforts.

Maybe one day, while he’s out on patrol keeping the roads safe, the students should call his wife and tell her that he was shot and killed while making a traffic stop. Just imagine the joy that will fill her heart when, a few hours later, he arrives home safe and well. It will be an important reminder to him of the need to be careful even during a routine traffic stop, and to both of them of the precious value of the time we get to spend with our loved ones.

Or maybe a few of the students could contact the media and say that those officers had sexually molested them. Later, they could reveal that it was all a hoax to remind the police of the importance of the presumption of innocence.

Or maybe the parents of one of the students could keep him home the next day, and when the school calls, they could say that he hanged himself in the garage last night, and that they don’t understand why because his therapy was going so well, so could anybody at the school think of something that might have upset him? The next day, he could return to school and explain that it was just a way to teach them an important lesson about honesty.

Or maybe a bunch of the families could get together to send the school a lot of official-looking paperwork claiming they were suing for $10 million for intentional infliction of emotional distress. Then the next day they could explain it was all a hoax to teach them an important lesson about thinking before they do things like this.

Then the day after that, they could sue the school for $10 million for intentional infliction of emotional distress. That would teach them a lesson.

Just when I think the zero tolerance rules in public schools can’t get any more idiotic, this story hits the net:

Fairfax County middle school student Hal Beaulieu hopped up from his lunch table one day a few months ago, sat next to his girlfriend and slipped his arm around her shoulder. That landed him a trip to the school office.

Among his crimes: hugging.

All touching — not only fighting or inappropriate touching — is against the rules at Kilmer Middle School in Vienna. Hand-holding, handshakes and high-fives? Banned. The rule has been conveyed to students this way: “NO PHYSICAL CONTACT!!!!!”

Do they really want a generation of children that views all touching as something wrong? Do they really want a student body in which no one ever feels the comforting touch of a friend? This is verging on child abuse.

Why, you may wonder, would they do this?

Deborah Hernandez, Kilmer’s principal, said the rule makes sense in a school that was built for 850 students but houses 1,100. She said that students should have their personal space protected and that many lack the maturity to understand what is acceptable or welcome.

And now they’ll never learn to understand. The stupidity is mind-boggling. Does Principal Hernandez plan to avoid mentioning algebra because the children don’t understand it? Gosh, if only there were some place—some sort of institution perhaps, staffed by people with special training—where young people could learn basic knowledge about how to survive in our society…

“You get into shades of gray,” Hernandez said. “The kids say, ‘If he can high-five, then I can do this.’ “

Yeah, because college degrees in Education just don’t prepare teachers to deal with questions that tricky.

Dr. Helen puts it this way:

This no touch rule seems wrong in so many ways, I don’t know where to begin. I used to think schools were becoming like prisons, but honestly, prisoners have more rights. As one parent so aptly put it in the article, “how will you teach students right from wrong?” Indeed, how? For, if every behavior is seen in terms of black and white, how will kids learn where the boundaries are? Physical touch, along with adult guidance teaches kids where the boundaries are, no touching at all teaches them that normal expressions of behavior are aberrant–or that they have to sneak behind the backs of those in authority to get or show affection. What kind of lesson is that to teach?

The comments at Dr. Helen’s blog are pretty interesting. It’s amazing how many people make references to science fiction stories about dark future dystopias. It’s that bad.

John Ruberry, the Marathon Pundit, just did some actual reporting about the veterans’ scholarship scandal at the University of Illinois.

This didn’t sound like much of a story when I first heard about it. The University of Illinois had offered 110 scholarships to Illinois veterans for the night MBA program in downtown Chicago. A bunch of veterans were accepted and received confirmation letters. Later, however, the University cancelled a lot of those scholarships, accepting only 37 of them.

Some people seemed to be trying to spin this into an example of anti-military attitudes in academia, but having worked at a university for a while, it sounded to me like a typical foul-up. The academic side of academia works best when it is very decentralized, with each department making staffing and curriculum decisions on its own. The administrative work, however, requires rigorous standards and careful attention to detail, and departments get themselves into trouble when they try to cut corners. It sounded like the department that runs the night MBA program had promised something that the rules wouldn’t allow it to deliver.

Now that John Ruberry has delved into the story a bit, including an interview with one of the principals, it’s sounding a lot shadier than I thought:

What happened next is shocking. Ghosh, DeBrock, Admissions Dean Sandy Frank and Ikenberry decided to take matters into their own hands. So they got a copy of the admissions database from the Executive MBA program, studied it, and in an ex post facto manner, put in new procedural deadlines for the completion of application materials in order to reduce the number of military veterans in the program.

They basically looked at military candidates’ application data and came up with new deadlines that they knew military candidates hadn’t met. Sort of like betting on a horse a couple days after the race…or moving the goalpost before a field goal attempt.

Read the whole thing.

Some law schools have been trying to keep military recruiters off campus, and then this happens:

The Supreme Court ruled unanimously Monday that military recruiters must have the same kind of access as other employers coming onto campus to give out information and conduct job interviews, if the campus receives federal money. Most campuses rely on some share of the $35 billion the government channels each year to higher education.

The law that blocks this funding is known as the Solomon Amendment, and it has become a point of contention for many law schools. Here’s a brief history of the Solomon Amendment that I found at a protest site:

In 1995, Congress passed the first Solomon Amendment, denying schools that barred military recruiters from campus any funds from the Department of Defense. The next year, Congress extended the law’s reach to include funds from the Departments of Education, Labor, and Health & Human Services. In 1999, legislation shepherded by Rep. Barney Frank removed financial aid funds from the federal monies potentially affected by the Solomon Amendment. Defense Department regulations proposed in 2000 and formally adopted in 2002 exponentially toughened the law by interpreting it to require revocation of federal grants to an entire university if only one of the university’s subdivisions (its law school, for example) runs afoul of the law. In 2005, Congress amended the law to explicitly state that military recruiters must be given equal access to that provided other recruiters.

In a sane world, this would be a stupid law. Presumably, these schools are receiving federal money for a reason. Either they are providing services to the government, such as research or program management, or the money is being given to them to serve a public purpose such as educating the people of this country. The point is, the schools are receiving money because the government needs something that they can provide.

The government’s need for the school’s services doesn’t go away just because the school stops allowing military recruiters on campus, so it doesn’t make sense to stop buying that service. If the government still needs whatever it’s paying the school to do, then it should keep paying the school, otherwise it should stop. Recruiting has nothing to do with it.

That would be in a sane world. In our world, a lot of schools receive money as a blatant handout by politicians trying to gain support for re-election. The schools ought to expect to find a few strings attached.