Friday, September 14, 2007

"Lookit Me! I'm a Neuroscientist!"

When you're trying to assert that conservatives aren't stupid, you probably shouldn't do it with a line of argument that makes you sound like an idiot, Will Saletan.

The basic methodology of the study he's discussing, which showed that liberals were more open-minded and willing to tolerate ambiguity, involved recording the brain activity of conservatives and liberals while they took a simple test: push a button when you see an "M", don't push it when you see a "W" (or vice versa). The test itself was just a bog-standard psychological stimulus; the real work was done with an EEG, to figure out exactly how people RESPONDED to the test.

(Remember Blade Runner? Where the point of the Voight-Kampff test was not for people to get the questions "right", but to provoke an emotional reaction? Same deal.)
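
For the curious, here's roughly what that kind of Go/No-Go trial structure looks like under the hood. This is a minimal sketch in Python: the 100 ms flash and ~500 ms response window come from Saletan's own summary of the timing quoted below, while the 80/20 Go/No-Go split and the scoring labels are illustrative assumptions, not the study's actual parameters.

```python
import random

# Rough sketch of the Go/No-Go task described above. The 100 ms flash and
# ~500 ms response window are taken from Saletan's summary of the timing;
# the 80/20 Go/No-Go split is an assumption for illustration.

STIMULUS_MS = 100         # the letter is flashed for about a tenth of a second
RESPONSE_WINDOW_MS = 500  # total time allowed to press (or withhold) the button
GO_PROBABILITY = 0.8      # assumed: "Go" (M) trials vastly outnumber "No-Go" (W) trials

def make_trials(n=500, seed=0):
    """Generate a trial list dominated by Go stimuli, with rare No-Go surprises."""
    rng = random.Random(seed)
    return ["M" if rng.random() < GO_PROBABILITY else "W" for _ in range(n)]

def score_trial(stimulus, pressed, rt_ms):
    """Classify one response the way a Go/No-Go analysis would."""
    if stimulus == "M":  # Go trial: pressing in time is correct
        return "hit" if pressed and rt_ms <= RESPONSE_WINDOW_MS else "miss"
    # No-Go trial: pressing at all is an error
    return "false_alarm" if pressed else "correct_rejection"

if __name__ == "__main__":
    trials = make_trials()
    # Fake a habitual responder who presses on every trial at ~350 ms:
    # they ace the Go trials and blow every rare No-Go trial.
    results = [score_trial(s, pressed=True, rt_ms=350) for s in trials]
    print({label: results.count(label) for label in sorted(set(results))})
```

The letters are just the trigger; the data of interest are the EEG recordings of what happens when a habitual "press" response collides with a rare "don't press" stimulus.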

But here's Saletan's response:

Fifteen minutes is a habit? Tapping a keyboard is a way of thinking? Come on. You can make a case for conservative inflexibility, but not with this study...

...An "ms"—millisecond—is one-thousandth of a second. That means participants had one-tenth of a second to look at the letter and another four-tenths of a second to hit the button. One letter, one-tenth of a second. This is "information"?

Jeebus. This is just pathetic. The stimuli were simple and the timing swift because the researchers wanted unconscious reactions, not conscious thought.

He goes on:

Go back and look at the first word of the excerpt from the supplementary document. The word is "either." Participants were shown an M or a W. No complexity, no ambiguity. You could argue that showing them a series of M's and then surprising them with a W injects some complexity and ambiguity. But that complexity is crushed by the simplicity of the letter choice and the split-second deadline. As Amodio explained to the Sacramento Bee, "It's too quick for you to think consciously about what you're doing." So, why did he impose such a brutal deadline? "It needs to be hard enough that people make a lot of errors," he argued, since—in the Bee's paraphrase of his remarks—"the errors are the most interesting thing to study."

In other words, complexity and ambiguity weren't tested; they were excluded. The study was designed to prevent them—and conscious thought in general—because, for the authors' purposes, such lifelike complications would have made the results less interesting. Personally, I'd be more interested in a study that invited such complications—examining, for instance, whether conservatives, having resisted doubts about the wisdom of the status quo, are more likely than liberals to doubt the wisdom of change.

Er, no. The "complexity and ambiguity" were handled by the pattern of stimuli itself, and revealed by the EEG data recorded while people responded to it. This is like saying that machine code is useless because it's only zeroes and ones, except (somehow) even dumber.

But the biggest problem? Saletan's making a difficult claim: that the authors screwed up, that they didn't really understand what was going on. Fine, make it. The problem is that if you want to argue neuroscience, you need to actually use neuroscience. You need to cite it, or at the very least come up with something a bit more scientific than your own half-baked definition of what the hell a "conservative" is. There was nothing scientific at all in Saletan's defense, and nothing that suggests the scientists' methodology was wrong in the slightest. In fact, he ignores the principal part of the methodology entirely by never discussing the EEG source-localization element!

I mean, I can sympathize with the idea that scientists can read too much into weak results--the oeuvre of Craig Anderson demonstrates that--but this is just ridiculous!

Then again, another study a ways back (Kruger and Dunning's "unskilled and unaware of it" work) said that people who aren't smart are generally not going to understand that they aren't. So maybe this is just par for the course, huh?

Maybe I shouldn't be making fun of Saletan. Maybe I should just feel sorry for him instead.

Edit: the funny thing is, there IS a legitimate methodological critique of this study, which is that the sample size is too small. That said, not all science is statistics: you don't necessarily need a large sample size if the data is rich enough. (Hence case studies.)
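
To put that sample-size complaint in concrete terms, here's a back-of-the-envelope power calculation: a minimal sketch in Python using a normal approximation to the two-sample t-test, with effect sizes and group sizes chosen purely for illustration rather than taken from the study itself.

```python
from math import sqrt
from statistics import NormalDist

# Back-of-the-envelope power calculation for comparing two groups, using a
# normal approximation to the two-sample t-test. The effect sizes and group
# sizes below are illustrative assumptions, not the study's actual numbers.

def approx_power(effect_size, n_per_group, alpha=0.05):
    """Approximate power of a two-sided, two-sample test with equal group sizes."""
    z_crit = NormalDist().inv_cdf(1 - alpha / 2)
    noncentrality = effect_size * sqrt(n_per_group / 2)
    return NormalDist().cdf(noncentrality - z_crit)

if __name__ == "__main__":
    # At ~20 people per group, even a "large" effect (d = 0.8) gives only about
    # 70% power, and a moderate effect (d = 0.5) only about 35%.
    for d in (0.8, 0.5):
        for n in (20, 50, 100):
            print(f"d={d}, n per group={n}: power ~ {approx_power(d, n):.2f}")
```

The normal approximation is just to keep the sketch dependency-free; a real analysis would use the t distribution and the study's actual effect estimates.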

It's hilarious that the one methodological critique that makes sense is the one that Saletan avoids, though.
