Wednesday, April 15, 2009

I Dream of Eating

Nocturnal sleep-related eating disorder. Have you guys heard of this? It's a rare disease or disorder...or something in which people eat while they are asleep. It's like sleepwalking, but it's sleep eating. I just read a case study in which a guy sleepwalked a couple of miles to a sweets store and started going to town. Sounds like people are faking it, right? They just say that they were sleeping to cover up their desire to binge? Unlikely. First off, this guy first sleep-ate a few days after he started taking a sleeping pill that has been known to cause somnambulism. Second, people who sleep-eat often eat really bizarre things that no person in their right mind would eat, like raw chicken, soap, or dog food. You just don't do that when you're awake.

So if you wake up one day with raw chicken on your face you might want to go see a doctor.

Check out Dang et al.'s (2009) article in the International Journal of Eating Disorders for more on this crazy disorder and the above case study.

Wednesday, January 21, 2009

Videogames and Cognitive Functioning

Apparently nursing homes should allow access to Rise of Nations. Fresh out of the laboratory at the University of Illinois is a study in which older adults were trained to play the strategy computer game Rise of Nations. Measures of executive functioning (this is a somewhat abstract and confusing term, but you can think of it as one's ability to multitask) were given before, during, and after training. They found that, compared to a control group that did not play Rise of Nations, executive functioning was enhanced by the video game training. These sorts of effects are typically only found when older adults are entered into a cardiovascular exercise program. Thus, if you play Rise of Nations you can be a lazy ass and still be smart.

For those of you interested in the effects of first-person shooter games there has been some research done on younger adults. The typical finding is that the games improve visuospatial functioning but not executive functioning.

I have no idea whether cognitive research has been done on RPGs. My guess is they make you dumber.

Sunday, December 14, 2008

Forgetting is Good!

Exciting news. I got my first manuscript accepted at a journal! It’s pending some final revisions, but it’s very close to being put in press. I’m guessing it’ll appear in print midway through 2009.

So you want to hear about what I did? Yeah you do.

When I was at Furman I was interested in how people forget prospective memories, which are memories for doing things in the future, like taking your medication or calling Mom. Now when I say I was interested in forgetting prospective memories I’m not referring to those occasions in which we actually forget to take our medication (or call Mom), but instead I was interested in whether we remember (or think about) these intentions after we have finished them. Take the example of remembering to return a library book when you see the library drop box. Does seeing the library drop box several days after returning the book remind you to return it again? My guess is no.

Consider what life would be like if you didn't forget finished (and therefore irrelevant) prospective memories. A secretary who must remember to deliver messages to the boss would be reminded of many previous messages every time the boss walked by, and there'd be little chance of actually delivering the pertinent one. If that's not clear enough, what about remembering to take your medication? If you remembered to take your medication every time you saw your medicine bottle (and weren't able to at least temporarily forget or inhibit the intention after you took it) then you would either overdose or collapse under the weight of your utter confusion.

Note that I’m not saying that you cannot remember returning the library book, previously-delivered messages, or your last dose of coumadin. If someone asked you about any of these things my guess is you would remember having done them. What I’m saying is that once you have returned your library book, seeing the library drop box is unlikely to reflexively remind you of your intention to return that book (whereas it probably would if you had not yet returned the book). The idea is that reflexive reminder processes are more likely to function for current and relevant prospective memory intentions than old and irrelevant prospective memories.

To get at prospective memory forgetting in the laboratory we first had people perform some prospective memory task. For today's purposes let's just say the prospective memory task was remembering to press a specified key on the computer keyboard if a particular cue word appeared in a given context. In the real world, your cue might be the library drop box, your action might be returning a book, and your context might be going to the library to study for an exam.

The critical component of this soon-to-be-published experiment was that after performing the prospective memory task once, participants were told either that they would perform the prospective memory task again in the future (when they returned to a given context) or that the prospective memory task was completely finished (our "forget" condition). Then we gave participants a novel task in which they ONLY had to make simple yes/no decisions, such as determining whether a word was a member of a given category (e.g., is Texas a state?). During this task we still presented their prospective memory cue. If seeing the prospective memory cue reminds people of their prospective memory, then their focus on the yes/no task will be somewhat disrupted, causing them to make their yes/no decision more slowly.

The general finding is that if the prospective memory task needs to be performed again, then the yes/no decision is slower on trials in which the prospective memory cue appears. This demonstrates what we can all intuit—if you have a prospective memory intention to perform (e.g., taking medication), then stimuli associated with that intention (e.g., a medicine bottle) will cue you to remember that intention.

What was really cool was what happened in the forget condition. When people were told that their prospective memory task was finished, the aforementioned slowing/remembering effect was wiped out. It was as if the prospective memory cue was no different from any other item. We interpreted this finding as evidence that young (college-aged) adults can quickly deactivate or otherwise forget finished prospective memories.
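If it helps to see the logic in code form, here is a minimal sketch of how the slowing/remembering effect could be scored: compare yes/no decision times on trials containing the old prospective memory cue against ordinary trials, separately for the repeat and forget conditions. Everything in it (trial labels, reaction times) is invented for illustration; it is not the analysis script from the actual study.

```python
# Toy sketch of scoring the slowing/remembering effect.
# All numbers are invented placeholders, not real data.
import statistics

# Each trial: (condition, trial_type, reaction_time_ms)
trials = [
    ("repeat", "cue", 842), ("repeat", "other", 760),
    ("repeat", "cue", 815), ("repeat", "other", 748),
    ("finished", "cue", 751), ("finished", "other", 744),
    ("finished", "cue", 760), ("finished", "other", 755),
]

def slowing_effect(data, condition):
    """Mean RT on cue trials minus mean RT on ordinary trials for one condition."""
    cue = [rt for cond, kind, rt in data if cond == condition and kind == "cue"]
    other = [rt for cond, kind, rt in data if cond == condition and kind == "other"]
    return statistics.mean(cue) - statistics.mean(other)

for condition in ("repeat", "finished"):
    print(f"{condition:>8}: cue-trial slowing = {slowing_effect(trials, condition):.1f} ms")

# A sizable positive difference in the "repeat" condition, and roughly zero in
# the "finished" condition, is the pattern described above: reflexive reminding
# for active intentions and rapid forgetting of finished ones.
```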

So that’s what’s getting published. It was a pretty novel finding and reviewers (who by the way are often quite harsh) were really interested.

My research didn’t stop there. This past year I ran the same study but with older adults (ages 65-85). Just as with the younger adults, prospective memory cues triggered remembering of unfinished prospective memory intentions in older adults. This result is pretty cool because it shows that reflexive remembering processes are preserved in older adults (whereas many other memory processes show age-related declines).

What about when the prospective memory task was finished? Older adults still showed the slowing/remembering effect! Thus, the primary aging deficit in prospective memory may not be initially remembering to perform the prospective memory task, but instead the deficit may be quickly forgetting finished prospective memories.

Also interesting was that the cue didn’t remind all older adults of their finished prospective memory intention. Further, the largest risk factor for failing to forget a finished prospective memory was not age, but rather the individual’s health and how many prescribed medications the individual was taking. At least two conclusions can be drawn from these results. First, forgetting is not all bad. In the present case, it is a mark of good health! Second, the individuals we would hope do not have prospective memory problems—individuals who must remember to take many prescribed medications—are unfortunately the people who are likely to have prospective memory problems. Not good. Don’t shoot the messenger.

My current research project pushes the limits of functional forgetting. Though the slowing/remembering effect is interesting it only implies that older adults may erroneously perform a finished prospective memory intention again (such as take medication twice in one morning). In the current research I wanted to not only show that individuals may reflexively retrieve a finished intention but also demonstrate that under certain circumstances they will actually execute their intention (take medication) again.

To investigate this issue I varied whether the prospective memory cue was very distinctive (i.e., salient). I also varied whether participants reencountered their prospective memory cue in the same context in which the prospective memory task was originally performed or in a novel context. My previous research had used nonsalient cues and novel contexts.

The results showed that when the cue was not salient, forgetting was easy for younger adults. Further, if the cue was salient but was encountered in a novel context, younger adults still showed no evidence of remembering the intention. However, if the cue was both salient and encountered in the original context, then younger adults not only remembered their finished prospective memory intention, but some of them (25%) actually performed their prospective memory task again. Perhaps a real-world example will help. Let’s say the doctor gives you an abnormally large and bright medicine bottle and you keep it in the bathroom. After taking your medication, if you return to the bathroom half an hour later and see that really distinctive medicine bottle again, then there is some chance that even you high-functioning, college-educated adults will reflexively retrieve the intention to take your medication again (and do so).
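For anyone who thinks better in code, here is a toy sketch of how re-performance (commission) rates could be tabulated across the salience-by-context design. The participant rows are invented placeholders, not my data; only the salient-cue/original-context cell is set up to echo the roughly 25% figure mentioned above.

```python
# Toy sketch of tabulating how many participants performed a *finished*
# intention again in each cell of the salience x context design.
# The rows below are invented placeholders, not real data.
from collections import defaultdict

# Each participant: (cue_salience, reencounter_context, performed_again)
participants = [
    ("salient", "original", True),  ("salient", "original", False),
    ("salient", "original", False), ("salient", "original", False),
    ("salient", "novel", False),    ("salient", "novel", False),
    ("nonsalient", "original", False), ("nonsalient", "novel", False),
]

cells = defaultdict(lambda: [0, 0])  # (salience, context) -> [re-performed, total]
for salience, context, performed_again in participants:
    cells[(salience, context)][0] += int(performed_again)
    cells[(salience, context)][1] += 1

for (salience, context), (count, total) in sorted(cells.items()):
    print(f"{salience:>10} cue / {context:>8} context: "
          f"{count}/{total} performed the finished task again")
```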

If younger adults struggle somewhat under these extreme circumstances then older adults may really struggle. Unfortunately, this is where I say to be continued. I have yet to test older adults in this paradigm. For now I can only speculate (based on my previous results) that older adults will be much more likely than younger adults to erroneously repeat their prospective memory task, and under less extreme circumstances. If so, it’d be pretty convincing evidence that the primary prospective memory problem for older adults lies in forgetting old and irrelevant prospective memories. We’ll see!

My next post will probably be about my master’s thesis. There is some really cool recent research on how sleep benefits memory. It turns out that sleep is not some passive state of consciousness but actually serves to strengthen (not just stabilize) memories. So if you learn something right before going to bed, you might remember that information better following a 12-hour delay than you would following a 30-minute wake delay. My master’s thesis will examine the benefits of sleep on prospective memory, which has yet to be studied.

Until next time….

Saturday, September 27, 2008

Professor Hercules, Ph.D.

First, let me explain. I know that the URL above says blogspot.com and that by default makes me a "blogger." Further, I realize that the title above contains the word psychology, which also by default breeds expectations for paragraphs on emotions, descriptions of disorders, and the inevitable stop-overmedicating-America's-children! arguments. You're not going to find that here. I'm not going to write about feelings--yours or mine--and I'm definitely not going to write about politics. I'll leave that to Johnama. See what I did there?

As a budding scientist I am disturbed by defining concepts by what they are not, and therefore I shall attempt to explicate exactly what I wish to achieve with a blog. The major goal of my writing is to disseminate the "cutting edge" of basic cognitive research being conducted by me and others to a broader audience. In doing so, I hope to learn to more clearly communicate my research to a general audience. And just maybe you'll learn something as well.

Before we get into the nitty-gritty of lexical decisions, hierarchical linear regressions, and multinomial models, let me tell you a little about hydra psychology. I am first and foremost a psychologist, but one that is bound, fueled, and stimulated by what I call the Hydra Principle. A little background information....

In Greek mythology, there existed a large snake-like monster with multiple heads called a Hydra. It rampaged, it killed, and it had very bad breath. Eventually, the gods had enough of the Hydra monster and sent Hercules to dispose of the serpent beast. So Hercules confronted the Hydra, and being a skilled warrior, he was able to quickly slice off one of the monster's heads. To his shock and horror, instead of collapsing in beheadedness, the monster grew two heads in place of the one that had been cut off. This pattern continued with each swing of the hero's blade. Every time a head was cut off, two sprang up in its place.

So that's a hydra, but what's the Hydra Principle and how does it apply to a scientist?

Well, whenever I answer a research question, two questions pop up in its place. Whenever I finish an experiment I have to conduct two more. To illustrate, when I entered graduate school I was given 2 projects (one experiment each) to work on. I completed experiment 1 for project A and the results begged us to conduct a second experiment. The results of the second experiment were also intriguing, and we therefore wanted to conduct a third experiment. OK, so that's three experiments. Were we satisfied yet? No, because we had to write up the 3-experiment project and submit it to a peer-reviewed journal. As a side note, I should mention that submitted papers are sent out to "experts" in the pertinent field to review the paper (the methodology, statistics, theoretical conclusions, etc.), and journal editors use the reviews to determine whether to accept, reject, or invite a revision/resubmission of the paper. High-quality journals almost never accept papers on their first submission.

Well, about 6 weeks after submitting our paper the editor responded with the reviews which were, in general, pretty positive. But guess what? They wanted us to collect more data before we resubmitted. They suggested a particular experiment, and being the hydra-crazed lunatic that I am, I suggested an idea for another experiment as well. So two more experiments for the project that just won't die.

That was project A. How about project B? I took over project B from another grad student in the lab, so when I say experiment 1 for project B, I really mean experiment 3. As you can see, the hydra principle was already hard at work for this project. Like the project I discussed above, project B just won't die. In the last year I have conducted 3 experiments for it and I'm planning a fourth. I keep telling myself the fourth will be the last experiment for this project, but who am I kidding?

What's even scarier about the hydra principle is that it not only works within projects (i.e., multiplying experiments in a single line of research) but also between projects (i.e., multiplying lines of research). So when I thought I was going to begin a project C, I was really beginning a project C and D. And when I thought it was a good time to start project E, I was really starting project E and F (remember, each project has its own multiplying experiments). Maybe I'm an idiot or maybe I just didn't have any choice, but I just began project G. I guess I should expect to start project H next week.

As illustrated above, the hydra principle is the empirical finding that "completed" studies spawn new questions and new projects. Further, these new projects do not simply replace the ones you finish, but instead grow exponentially so as to overwhelm you with unanswered research questions. Such is the life of a scientist.

That's all for tonight. Next time we'll get past the principle and dive into what some of these projects have been designed to investigate. I'm thinking that I'll write a little bit about a currently "hot" line of research called survival processing that's designed to investigate the evolutionary processes that appear to have given rise to memory biases in humans. How's that for an amuse-bouche?