Saturday, April 21, 2012

Existential Medicine


Earlier this year I posted about the use of psychedelic drugs, particularly psilocybin (the active ingredient in “magic mushrooms”), to treat the clinically depressed. I’ve just finished reading a recent piece in the New York Times on a growing number of psychiatrists nationwide who are currently conducting studies on the merits of psilocybin in mitigating the fear of death in terminally ill patients. Rather than discuss the neural basis of psilocybin’s potential psychiatric benefits (about which very little is known), I feel compelled to write about the ethics of administering such a drug, which I have been struggling with since my initial post on the subject.
The New York Times piece describes two patients, both suffering from cancer, who were told they had very little time left to live. Initially, both struggled immensely with their diagnoses, but each found refuge in an experimental study in which psilocybin was administered, followed by prolonged sessions of meditation and introspection. At the end of the studies, each performed much better on a battery of depression and anxiety tests, and each reported a completely different worldview, one in which death was not the end of life but part of “…a process, a way of moving into a different sphere, a different way of being.”
Although psilocybin appears to be effective in mitigating anxiety in depressed or anxious patients, there is something very unsettling about the idea of administering a compound that affects the brain so drastically (in ways yet to be fully understood) that it converts a state of near panic into one of placidity and tranquility.
While the ends of such administration may be appreciated by the patient, are the means ethical? Perhaps, with death so near, a patient deserves to die in peace, regardless of the ethical implications. But every time I read about psilocybin as a clinical drug, I cannot help but think of Aldous Huxley’s Brave New World. True, there is a big difference between recreational use of a powerful psychedelic (as in Brave New World or 1960s America) and prescribed use by terminally ill patients, but once the ball gets rolling, I fear it will be hard to stop. As Lauren Slater, author of the New York Times piece, writes, “If, say, end-stage cancer patients can have it, then why not all individuals over the age of, say, 75? If treatment-resistant depressives can have it, then why not their dysthymic counterparts, who suffer in a lower key but whose lives are clearly compromised by their chronic pain? And if dysthymic individuals can have it, then why not those suffering from agoraphobia, shut up day and night in cramped quarters, Xanax bottles littered everywhere?” While some may say that such hypothetical scenarios will never come to be if psilocybin use is strictly regulated for terminally ill patients, people like Rick Doblin and his group MAPS (the Multidisciplinary Association for Psychedelic Studies) have already started petitioning for the legalization of psychedelics for use in a “wide range of clinical indications.”
Psychedelics such as psilocybin definitely hold promise for patients suffering from crippling depression and anxiety. They also hold great potential for abuse, and when discussing the clinical merits of psychedelics, researchers and psychiatrists need to do so with extreme prudence and caution.

Thursday, April 19, 2012

My Search for Ice Cream and How it Became a Lasting Memory


            My first memory is from when I was five years old. It is a pretty emotional memory, so I guess that’s why I still remember it. My mom and I were in the supermarket, and I was pretty upset with her for her lack of interest in the ice cream aisle. So I took the initiative and wandered off to go find it myself. Of course, the inevitable happened and I ended up losing my mom, and it wasn’t until one of the kind supermarket workers returned a very shaken up (and still ice cream-less) me to my mother that we were reunited.
            Since my first memory is from when I was five years old, I’ve always assumed we simply don’t form memories before around that age. This was the prevailing view in society, too, until about the 1980s, when research showed that even very young children do indeed have the capability to form memories. Until more recently, it was believed that these memories were only transient, with young children living only in the present. However, that idea has been overturned as well.
            It turns out that very young children remember a lot like adults do. In early infancy, the neural structures crucial for memory are already coming online: the hippocampus, which is, very roughly, in charge of storing new memories, and the prefrontal cortex, which is, very roughly, in charge of retrieving them. The difference between children and adults is that in children these pathways are still developing, so only part of the incoming information gets stored; only fragments of the present are captured as it goes by.
            Over time, children remember things for longer and longer; their memories get stickier, so to speak. That being said, we still generally cannot remember anything from before roughly the age of four or five. So at what point do memories start becoming more permanent? New research indicates that this may have as much to do with society as with neurology. Parents repeat and retell events to their children, and those events get cemented as more permanent memories. So here’s my question: Is the only reason I remember my excursion to get ice cream that my mom likes to retell the story of me making a fool of myself in search of sweets? If so, I still have no regrets. I just like ice cream a lot.

Wednesday, April 4, 2012

The Uncanny Valley


In recent years, the ability to animate human figures has soared. Companies like Pixar and DreamWorks produce film after film, each with increasingly realistic-looking animated characters. Yet when watching some of these films, the awe at the near-perfect animation, and the appreciation for what must have been a very skilled animator, is replaced by uneasiness, even creepiness. Take, for example, the recent Disney animation Mars Needs Moms. The characters in this movie are extremely well animated and look very humanoid; in fact, they look a little too humanoid. It is as if a threshold of anthropomorphism has been crossed, causing all the characters to appear, well, weird.

Believe it or not, this phenomenon (things that look almost, but not quite, human making us uneasy) has a name: the Uncanny Valley. The name, almost as strange-sounding as the phenomenon it describes, refers to the visible dip in a graph created by roboticist Masahiro Mori that plots human empathy against how closely a nonhuman object resembles a human (see above). According to the graph, human empathy for human-like objects increases up to a point, at which things get a little too real and empathy drops sharply.
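Since a “valley” is hard to picture in prose alone, here is a toy sketch of the kind of curve Mori drew. The numbers and the shape of the function are entirely my own invention for illustration (they are not taken from Mori’s work or from any real data); the only point is to show empathy rising with human-likeness, plunging just short of full realism, and recovering at perfect resemblance.

```python
import numpy as np
import matplotlib.pyplot as plt

# Purely illustrative numbers: empathy rises with human-likeness, plunges into
# the "valley" just short of full realism, then recovers near perfect resemblance.
likeness = np.linspace(0.0, 1.0, 500)  # 0 = clearly artificial, 1 = indistinguishable from human
valley = 1.5 * np.exp(-((likeness - 0.85) ** 2) / (2 * 0.04 ** 2))  # sharp dip centered near 0.85
empathy = likeness - valley

plt.plot(likeness, empathy)
plt.axvspan(0.75, 0.95, alpha=0.15, label="uncanny valley (illustrative)")
plt.xlabel("Human-likeness of the object")
plt.ylabel("Empathy / affinity (arbitrary units)")
plt.title("A cartoon of Mori's uncanny valley curve")
plt.legend()
plt.show()
```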

Although no one knows exactly why the Uncanny Valley exists, one theory emphasizes the evolutionary need for a brain mechanism that triggers dislike of something almost human, but not quite. According to this theory, the reason we are uneasy watching the animations in Mars Needs Moms is that they appear humanoid enough to trick the brain into processing them as humans, yet do not resemble humans perfectly. This contradiction, something that looks human yet does not look or behave perfectly human, is what generates the aversion. If you understand the basis of the theory (it is confusing!), the evolutionary importance follows easily: things like disease and mental disorders alter the appearance and actions of humans, and a system that detects and generates wariness of such potential threats would have been valuable for human survival.

Maybe someday animation companies will perfect the human form (appearance, movement, etc.), and the Uncanny Valley will disappear. I am not sure I want that day to come, however, because it would mean that our brains could no longer distinguish the animations from real humans, and that makes me more uneasy than the current state of humanoid animation.