Neurosciencyness is the new truthiness

Got off the train this morning and noticed a new (ish? I’m not super attentive to these) ad:


L-theanine is an amino acid that crosses the blood-brain barrier and has been shown to increase alpha-band EEG activity. The logic here seems to be the same as that in things like the games that come with the Mindwave biofeedback system: since alpha-band EEG oscillations occur when people are relaxed, manipulations that increase alpha-band activity must be increasing relaxation. (Nobody, to my knowledge, uses this logic while also accounting for the many other cognitive activities that increase alpha-band activity, like actively remembering things or attending to one object in a crowded field, which do not seem to have much at all to do with relaxation. Brains are, unfortunately, complex, interconnected, emergent systems, and cause and effect are rarely straightforward.)

Neuro seem to have a whole range of these drinks, each with a different additive, a neurosciency gibberish explanation of why that additive will change your brain function, and a different targeted effect. A few years ago, a paper demonstrated that people are much more willing to accept explanations (even bad explanations) if a neurosciency clause is included. Shortly afterwards, another group showed that people are more likely to accept neurosciency explanations if the brain activity being discussed was shown on realistic brain pictures, as opposed to abstract images or bar graphs.


Besides calling their brand “neuro”, these guys have a cute little “EEG activity in the brain” logo, and their website has the slogan “light it up”. It's a scarily impressive appropriation of the metaphors and descriptors used to describe localized neural activity, even by those of us who should know better.

When I posted these to Facebook this morning, one friend said, “I’m not sure whether to be skeptical about their ability to actually modify neurotransmitter availability, or seriously concerned about people actually changing their brain chemistry in untested and unregulated ways.” I think this pretty well sums up my feelings too.


William Gibson on how fashion and information have changed

Maybe what we're really talking about is novelty. Because a mechanism like Twitter is probably the single most powerful and efficient aggregator of novelty that ever existed. The real function of magazines, for the most part, has been to aggregate novelty, to run around to find a lot of new things, put them in your issue, and get them out ideally before the other magazines notice it—then people buy it, bring it home, and wolf down all this novelty.

Now if you've got your Twitter feed set up right, every day you can get more raw novelty dumped on your desktop than you can get buying an entire magazine store. And it changes every day. What does that mean? One thing it means is that magazines have to find something different to do—you have to find some niche that you can operate in.

If novelty is available wholesale at that level and at that quantity for free to a 15-year-old in Nebraska, what's that going to do to the rest of us? I don't know, but I'm sure the next time I'm writing a novel that's going to be one of the post-it notes on the windshield.

at The Atlantic

This interview with William Gibson is a couple months old, but if you're a fan of his, especially of Pattern Recognition and his other recent books, it's most excellent reading.

Scholars Turn Their Attention to Attention

In the Chronicle:

Foerde and her colleagues argue that when the subjects were
distracted, they learned the weather rules through a half-conscious
system of "habit memory," and that when they were undistracted, they
encoded the weather rules through what is known as the
declarative-memory system. (Indeed, brain imaging suggested that
different areas of the subjects' brains were activated during the two
conditions.)

That distinction is an important one for educators, Foerde says,
because information that is encoded in declarative memory is more
flexible—that is, people are more likely to be able to draw analogies
and extrapolate from it.

"If you just look at performance on the main task, you might not see
these differences," Foerde says. "But when you're teaching, you would
like to see more than simple retention of the information that you're
providing people. You'd like to see some evidence that they can use
their information in new ways."

via marbury

This is one of the sanest reviews I've seen of what we know about multitasking, attention, and learning. No flailing about how the Internet is destroying civilization, but some real concerns about the compatibility of multitasking and specific tasks.

The recent kerfuffle over Chronic Fatigue Syndrome has highlighted how strongly many people still believe in a distinction between the physical and the psychological

When the philosopher A. C. Grayling (who seemed to be giving an impression of a philosopher rather than actually being one) came to speak, he peered out at the audience over his horn-rimmed glasses and said something like, "I don't expect there are any dualists here, and there certainly aren't on this panel". His point being that the old Cartesian idea that there are minds and bodies and never-shall-the-twain-meet has, as a respectable intellectual position, been demolished in recent decades by the amazing discoveries of scientists studying the brain.

And yet elsewhere this assumption lives on. In this otherwise very good piece in yesterday's Times on chronic fatigue syndrome, everyone, from the writer to the experts to the sufferers, frames the question of the condition's causes as if we were still living in a dualist world. To take one example:

Stephen Holgate, professor of immunopharmacology at the University of
Southampton, chairs the Medical Research Council’s expert group on CFS/ME.
“As a clinician who sees patients with this group of diseases I recognise
there’s a real thing here, it’s not all psychiatric or psychological,” he
says.


The Mind Hacks article he links to is also excellent.