The Science of Procrastination

Procrastination, or ‘the action of delaying or postponing something’, is a widespread habit common to most people in the world (unless you are superhuman, in which case this article does not apply to you). Specifically, it involves the pursuit of activities that provide short-term gratification while delaying any task which is particularly laborious or unpleasant. Procrastination often manifests as a coping mechanism for dealing with the pressure and anxiety surrounding personal trials, for example revising for exams or writing a novel. Although procrastination has acquired a reputation for laziness, avoidance and sloth, stressed-out students may now take heart to learn that scientists are attempting to understand the neurological and psychological underpinnings of task avoidance, and whether it may actually confer beneficial effects on intelligence, creativity and development.

From a psychological perspective, procrastination is a problem of self-control which leads the procrastinator towards behaviours that provide short-term relief by making a stressful or boring task immediately avoidable. Task avoidance may be easier to understand if we look at another model of self-control: the dieter. Before going to a restaurant the dieter may be set on not ordering dessert but, once the time comes, they may give in to the lure of a moist sticky toffee pudding. Inevitably, following this decision the would-be dieter is likely to regret their choice and may be racked with feelings of guilt and self-hatred. This behaviour stems from a disproportionate preference for immediate gratification (a sugary snack) over more beneficial long-term rewards (better health), and is known as ‘systematic preference reversal’. These brief but powerful lapses in self-control reflect the brain’s preference for behaviours that provide instant gratification over the pursuit of goal-directed achievement.

The amygdala brain region (highlighted in red).

On a neurobiological level, procrastination actually appears to be emotionally driven, stemming from an internal desire to protect ourselves from the negative feelings associated with fear of failure. The amygdala is a brain region associated with mediating a diverse range of normal behavioural functions, including fear, emotion, memory and decision-making, and is also involved in a number of psychological disorders including anxiety and phobias. This complex aggregation of brain cells has also been implicated in the neurobiology of procrastination, due to its role in establishing what is known as the ‘fight or flight’ response. This physiological reaction is most commonly linked to situations involving a threat to survival, but has also been applied as an explanation for task avoidance and procrastination. When we start to feel emotionally overwhelmed by an activity that is particularly challenging, or by the accumulation of multiple demanding projects, the amygdala induces this fight or flight reaction in an attempt to protect us from negative feelings of panic, depression or self-doubt. When the amygdala detects a threat, i.e. when you begin to panic about how much you have to do before your exam tomorrow, it floods your system with the hormone adrenaline. Adrenaline can dull the responses of brain regions involved in planning and logical reasoning, leaving you at the mercy of more impulsive brain systems, which may convince you that sitting on Facebook for the next few hours is really not such a bad idea, even though you have an exam tomorrow. Short-term gratification can immediately relax us and improve our mood via the production of the neurotransmitter dopamine. Dopamine has a major role in reward-motivated behaviour; indeed, most behaviours that make us happy increase the levels of dopamine in the brain. This is where emotional memory comes into play: the brain will stimulate you to repeat an activity that has increased your happiness (and thus dopamine levels) in the past. The amygdala therefore encourages you to pursue such behaviours, despite the fact that their advantages are only short-term.

Research by Laura Rabin of Brooklyn College also implicates frontal regions of the brain, involved in what is known as ‘executive functioning’, in the induction of procrastination. Executive functioning is a term that encompasses a number of different processes, including problem-solving, changing actions in response to new information, planning strategies for approaching and completing complex tasks and, most importantly, regulating self-control of our own emotions, behaviours and cognition. Although it only demonstrates a correlative link (i.e. it is unclear whether changes in executive functioning directly cause procrastination), this study suggests that procrastination may emerge as a result of dysfunction in the executive-function-producing systems of the brain.

But does procrastination deserve its bad reputation, or may task avoidance actually confer some benefit? Some studies suggest that daydreaming, a well-known form of procrastination, may in fact be beneficial for the developing mind. Research by Daniel Levinson of the University of Wisconsin-Madison demonstrated that children who are regular daydreamers actually have better ‘working memory’ (the ability to juggle multiple thoughts simultaneously) than their less dreamy counterparts. Working memory capacity has been positively correlated with reading comprehension and even IQ score, and may represent a cognitive advantage in the ability to handle multiple complex thought processes at once. Further to this, daydreaming and procrastination have been argued to be beneficial as a type of rest for the brain. However, these potential benefits remain highly controversial, and are obviously subject to individual trait and personality differences.

So, if you are reading this instead of revising, writing your dissertation, or composing a report, don’t despair. Neuroscience offers us several ways we can tackle our tendency to procrastinate.

As procrastination stems from a split-second emotional reaction that represses our reasonable thought processes (i.e. to be productive) and yearns for a happiness boost, we need to train our brains to see the completion of a task itself, rather than the procrastination, as the dopamine-producing experience. So we should focus on rewarding ourselves for completing steps of a challenging project, rather than punishing ourselves for not getting it done. If a reward (no matter how small) is in sight, the amygdala, and the rest of your brain, will encourage working towards receiving that reward, and will therefore be less likely to initiate behaviours that distract you from achieving your goals. As dopamine release is triggered by things that make us happy, dividing your work into small sections with a reward at the end of each step may be beneficial: for every chapter you read or every 100 words you write, maybe reward yourself with a YouTube clip of a cat singing, or a sneezing panda?

Another key element of procrastination is that we often set ourselves unrealistic goals, and when we fail to complete these we panic – and so does the amygdala – setting in motion a series of neurochemical reactions that will, quite literally, stimulate us to do anything but the task at hand. So throw those unrealistic goals out the window, break your work down into smaller, manageable pieces and make sure you reward yourself each time you complete one. Finally, if you start to panic about the enormity of the task ahead, stop, take a breather, and give your rational mind time to remind you that there is light at the end of the tunnel, and that the less you avoid it, the faster you will get there!

Post by: Isabelle Abbey-Vital

The Health Benefits of Kissing

Pucker up, because it seems kissing has a number of important health benefits, ranging from improving mood and stress levels to actually enhancing our body’s natural immunity to illness.

Mouth-to-mouth kissing is a behaviour seen in almost 90% of all human cultures, and is used as a non-verbal communication of intimacy, affection and love. For centuries scientists have been pondering the origins of this primitive behaviour and whether it has a functional purpose in our lives.

So where did kissing come from? Apparently, the earliest record of kissing dates back to 1500 BC, when references to ‘drinking moisture from the lips’ appeared in Northern Indian Vedic texts. What’s more, the Kama Sutra, which details over 30 different types of kissing, dates as far back as the 6th century AD. According to philematologists (scientists who study kissing!), kissing is hypothesized to have evolved from an early primitive behaviour known as the ‘maternal premastication of food’, which quite literally involved mouth-to-mouth contact between a mother and child in the exchange of food during infancy. Despite kissing not being a necessary requirement for successful reproduction, it is hypothesized that sexual kissing may have evolved from this display of care and affection to eventually promote pair bonding and to help in assessing mate suitability. While mouth-to-mouth contact is seen in numerous animals as part of courtship rituals, sexual kissing appears to be unique to our species, and may explain why our outwardly turned (everted) lips appear to differ from those of all other animals, almost as if they were shaped for such a purpose!

Despite the seemingly unhygienic nature of kissing, and the fact that it does expose us to the risk of oral infection, this primitive affectionate behaviour may represent an evolutionary benefit, conferring protection from diseases that would otherwise impose more serious consequences. Mouth-to-mouth contact essentially exposes each person to the diseases of the other, which, while not sounding particularly clean, can actually enhance our own immunological control of exposure to infection. According to research published in the journal Medical Hypotheses, kissing represents an evolutionarily conserved biological behaviour that boosts our immunity to Human Cytomegalovirus (HCMV). HCMV is a particularly nasty type of herpes virus that can carry a significant teratogenic risk for women, i.e. it can have a severe impact on their unborn children during development if primary infection occurs during pregnancy. The risks to infected neonates include a number of serious developmental abnormalities, such as enlargement of the liver and spleen, as well as a number of neurodevelopmental disorders including abnormal brain growth, seizures, cerebral palsy and mental retardation. For 30% of infected fetuses the disease is lethal and, as a result, numerous pregnancies are terminated if infection is detected. HCMV is transmitted in saliva, urine and semen. As the disease is only symptomatic during its active phase, it is not an easy virus to detect and thus avoid, especially when trying to conceive. To explain how infection might be avoided during pregnancy, researchers have hypothesized that kissing evolved to allow women to control the time of inoculation, and that transmission of small amounts of the virus through the saliva at this point confers immunity to the condition and prevents the presentation of symptoms.

It is now understood that affectionate behaviour has a number of stress-relieving effects. As stress, mainly via the ‘stress hormone’ cortisol, has a number of detrimental influences on our endocrine, nervous and immune systems, kissing may in fact confer significant health benefits by reducing these effects. Interestingly, not only does kissing improve our mood and thus reduce stress levels, it may actually act on a number of parameters that are exacerbated by stress. Stress can elevate blood cholesterol levels, in one manner through stimulating the release of cortisol. Chronic elevation of cholesterol can lead to the build-up of plaques and the clogging of arteries, which may eventually trigger the development of coronary heart disease. One study identified that an increase in kissing behaviour between married couples during a 6-week trial period led to a decrease in blood cholesterol levels, and thus an improvement in blood lipid composition and a reduced risk of cardiovascular complications.

Surprisingly, kissing may also enhance your dental health! While it wouldn’t be recommended as a replacement for brushing your teeth in the morning, the extra saliva generated during a kiss washes bacteria off your teeth and encourages the breakdown of oral plaque. Kissing also burns calories and raises your metabolism: according to the research, a vigorous kiss burns up to two calories a minute and can almost double your metabolic rate (the rate at which you process food). And it makes sense: as kissing involves the coordinated contraction of more than 30 facial muscles, this constant exercise improves muscular tone in the face. One muscle in particular, the orbicularis oris, is used to pucker the lips and has been informally termed the kissing muscle. It has been suggested that regular contraction of these muscles during a passionate kiss enhances muscle strength and tone, and may actually contribute to maintaining a youthful complexion. So a passionate kiss may be the perfect non-surgical remedy for keeping your face young!

On what is seemingly quite an obvious level, kissing enhances the release of endorphins in the brain and has a number of other emotional health-boosting benefits: it improves mood and mental well-being, reduces depression and stress, and, most importantly, promotes intimacy and pair bonding. So, not that we need an excuse, but it seems that appreciating the importance of a good kiss will benefit your health and mental well-being in more ways than one – and if nothing else, use it as a happiness boost!

For more information see;

Hendrie CA and Brewer G (2010): Kissing as an Evolutionary Adaptation to Protect Against Human Cytomegalovirus-like teratogenesis. Medical Hypotheses 74: 222-224.

Floyd K, Boren JP, Hannawa AF, Hesse C, McEwan B and Veksler AE (2009): Kissing in Marital and Cohabiting Relationships: Effects on Blood Lipids, Stress, and Relationship Satisfaction. Western Journal of Communication 73: 113-133.

What is a headache?

We all know the feeling after a long, stressful day, when the tensions of the past few hours begin to amass in your temples, perhaps starting as a dull throb before advancing in waves to a deep pounding ache. The headache is a common malady, but what mechanisms lie behind these debilitating pains, and which aspects of your life may be triggering them?

The question of why and how we experience headaches is significantly harder to answer than you might imagine, particularly since the term ‘headache’ is itself non-specific: a broad term used to describe a range of common head pains, each of which may stem from a different underlying cause. Interestingly, however, one thing we do know is that the pain you experience during a headache does not originate from the brain itself. Indeed, the brain lacks pain receptors (nociceptors) and therefore does not have the capacity to feel pain.

But then where does the pain of a headache come from? The pain we experience during an everyday headache originates in pain-sensitive structures surrounding the skull. These include the extracranial arteries, veins, cranial and spinal nerves, and neck and pericranial muscles – all of which express pain receptors and are therefore susceptible to these sensations.

It is possible to pin down a number of simple lifestyle factors which commonly contribute to the development of headaches. These include emotional disturbances, stress and mental tension, certain types of food, alcohol, cigarette smoke, exercise and even the way you wear your hair – hairstyles such as tight ponytails, braids, headbands and even tight hats can strain the connective tissue that lies across the scalp and cause headaches. Simply letting your hair down can relieve this pressure and thus the pain of the headache.

A number of the factors which lead to headaches (including certain foods, cigarette smoke and alcohol) involve the blood vessels which lie around the brain. For example, inhaling nicotine from cigarette smoke causes narrowing of blood vessels around the skull. Narrowing of these vessels can often induce extremely painful headaches. Changes in blood pressure also explain hangover and exercise headaches and why, for some people, certain foods can act as headache triggers.

The episodic tension headache (the type you may get after a long day at work) is the most commonly occurring type of headache. However, despite the extensive research into the cause of migraines, this common type of headache remains one of the least investigated. As relief can normally be found through over-the-counter painkillers, most sufferers will not consult a doctor. The mechanisms underlying these headaches remain elusive; however, a number of theories regarding their pathophysiology have been proposed:

It appears that the occurrence of headaches is commonly linked to general problems of the musculoskeletal system. Skeletal muscle constitutes the largest muscle mass of the body, controlling movement, breathing, facial expressions and numerous other normal physiological functions. Each individual skeletal muscle is composed of hundreds of cells arranged into muscle fibres, and each muscle fibre is connected to the nervous system via a single branch (an axon) of a nerve cell.

Each fibre of a muscle can relax or contract in response to signals sent from the brain via these nerves. These muscle fibres also contain sensory receptors which feed back information about the health of the muscle to the brain; this helps tell you when the muscle is tired or overstretched, for example. Abnormal activity in these nerves, perhaps as a consequence of injury, stress or poor posture, can therefore result in the relay of pain signals to the brain. For example, bad posture places abnormal pressure on the muscles of the neck, which can result in heightened tension and the subsequent development of ‘tension headaches’.

Interestingly, tension headaches can also be induced by activation of so-called ‘trigger points’. A trigger point is defined as ‘a hypersensitive area of the body, associated with taut bands within a skeletal muscle’. Pressure or compression on these localized trigger points can cause the referral of pain along linked nerves to a nearby area. So, the presence of active trigger points in your head, neck and shoulder muscles can refer pain that will be subsequently experienced as a headache.

A number of studies have confirmed this, identifying an increased number of trigger points in the muscles of the head in patients prone to headaches, compared with patients who do not regularly experience headaches. What causes these trigger points to develop in the first place still remains unclear, however, some have speculated that they may be associated with past muscular injuries, fatigue, diet and even as a result of chronic repetitive strain, such as persistent typing.

Infrequent headaches, while menacing, are nothing compared to their chronic cousins. Infrequent headaches can become chronic as a result of changes that originate in the brain and spinal cord. This involves so-called ‘second-order’ nerve cells, which act as connectors between peripheral organs (e.g. the skin and muscles) and nerve cells in the spinal cord and brain.

A number of studies have proposed that chronic tension headaches may be triggered by changes in the sensitivity of these second order nerves, particularly those in the spinal cord and an area of the brain known as the trigeminal nucleus. This process is known as ‘central sensitization’ and can alter pain thresholds and trigger nerve cell activity. It is hypothesized that, in the presence of persistent stress or pain signals from peripheral muscles (such as that brought about through regular bad posture), nerve cells can grow forming new connections and effective contacts to low-threshold nerves that do not normally signal for pain. Furthermore, increased sensitivity can be caused by the enhanced release of chemicals that facilitate nerve cell communication. This increase in the number of pain signalling nerves and their sensitivity to strain and tension results in enhanced pain sensitivity, lower pain thresholds and the development of chronic pain states.

Chronic pain states may be a result of prolonged stress and musculoskeletal tension, alongside central changes in the brain and spinal cord. So, if regular headaches are wearing you down, you might benefit from trying to reduce your stress levels, being aware of dietary triggers, improving your posture and trying exercises to relax your muscles.

Post by: Isabelle Abbey-Vital

The Laughter Prescription

On a recent trip to Indonesia I came across a temple, in a small village outside Ubud, where a group of local Balinese were rolling around on yoga mats in fits of hysterical laughter. The sound emanating from the temple walls was quite amazing and, intrigued, we went closer to see what they were doing. What we had stumbled across was a laughing yoga class run by a group called ‘Bali Happy’, who move around Bali promoting laughter as an exercise, with the aim of ridding the local people of their ailments. The lovely group leader invited us in and explained the principles of the class: apparently there are different ‘sounding’ laughs to treat problems in different areas of the body, such as your gut, lungs, throat and head. He explained that the Balinese people were suffering from illness resulting from an unusual spate of weather which had left Bali wetter and colder than usual, and that the group were helping people by focusing on the healing properties of laughter. He then invited us to join the class. This amazing experience left me questioning: aside from the psychological, emotional and communicative benefits, what are the biological principles underlying the healing properties of laughter, and could it really be of therapeutic value?

Norman Cousins

The Western ideology of laughter as a medicine began in 1976, when Norman Cousins published his paper ‘Anatomy of an Illness’, which sparked a cascade of enthusiasm for the health benefits of this innate, involuntary reaction. But is laughter really the best medicine? Well, one thing’s for sure: it can’t hurt. Indeed, unlike many forms of prescribed medication, laughter certainly has no undesirable side-effects. A number of studies have also highlighted its health benefits.

The act of laughing causes a series of physiological changes. These act rapidly and are often accompanied by many beneficial consequences, particularly for the muscular, respiratory and cardiovascular systems of the body. One of the most frequently reported benefits of laughter is that it exercises, and subsequently relaxes, many important muscles. In 1979, Cousins described laughing as “a form of jogging for the innards”; this is because when we laugh our whole body becomes involved, leading to the coordinated action of our facial, chest, abdominal, skeletal and even gastrointestinal muscles. Furthermore, after laughing we experience a period of muscle relaxation which assists in reducing tension in the neck, shoulders and abdominals.

Our cardiovascular and respiratory systems stand to benefit too. Laughter causes a prompt increase in heart rate and blood pressure, which can improve circulation. This, coupled with an elevated respiratory rate, respiratory depth and oxygen consumption, improves the rate of residual air exchange and ventilation. These physiological changes are followed by a drop in heart rate, respiratory rate and blood pressure. Indeed, research has identified an inverse association between the propensity to laugh and coronary heart disease. Laughter has also been suggested as an adjuvant therapy to reduce the risk of heart attack in high-risk diabetic patients. For a review of the health benefits of laughter, see here.

Studies have also identified that the muscle exertions involved in producing laughter may have a stimulatory effect on the production of endorphins. Endorphins are opioid compounds that stimulate feelings of euphoria and lower pain thresholds. It is widely accepted that a patient’s emotional state will often affect the course of a disease. Therefore any therapy which encourages positive emotions in patients may ultimately improve their prognosis.

Interestingly, laughter therapy is one of the most frequently used complementary therapies in cancer patients worldwide. Its success most likely stems from the observation that laughter can reduce stress levels. Any therapy which successfully reduces stress certainly can’t be a bad thing, especially since studies have correlated both laughter and reduced stress with improvements in immune function and increases in pain tolerance. Some studies even suggest that laughter may increase disease resistance. The precise mechanism is yet to be defined, but it may be linked to attenuation of serum levels of glucocorticoids, the ‘stress hormones’. Glucocorticoids are known to suppress the immune system, making stressed individuals more susceptible to disease. So, through its action on the neuroimmune system, it seems that laughter may directly improve disease resistance by modulating our innate immune responses and reducing glucocorticoid levels.

Given the known psychological benefits of a positive emotional state, it’s not surprising that laughter therapy has also been suggested to have clinical applications for neurological diseases like dementia and schizophrenia. As with most serious illnesses, dementia can place both sufferers and their families under high levels of stress. Since stress is believed to negatively affect an individual’s cognitive ability, this may exacerbate symptoms. Laughter therapy has been suggested as a way of reducing stress in both patients and their families. Indeed, when a positive attitude is shared by patients, families and staff, it can have a positive effect on the emotional-affective and cognitive functioning of the patients.

Laughter has helped patients withdraw from feelings of irritability, stress and tension, and counteract symptoms of depression; it elevates self-esteem, hope and energy; promotes memory, creative thinking and problem solving; increases self-efficacy and optimism; and improves relationships and general quality of life. In other neurological diseases like schizophrenia, laughter has been shown to reduce hostility, depression and anxiety scores and to encourage social competence.

Although complementary therapies such as this are not meant to replace mainstream treatment, and are not promoted as cures, they may often be effective in controlling symptoms and improving well-being and quality of life. While laughter research is still in its infancy, there is much to be said for its numerous psychological and physiological benefits and its potential to become a very successful complementary and alternative therapy. With laughter as an exercise emerging into the mainstream, through clubs like laughing yoga, it appears that laughter-based interventions are gaining more acceptance, and hopefully further scientific study will follow as a result. As laughter medicine continues to generate more medical and public interest, it may be worth considering that, along with eating your vegetables, exercising regularly and getting enough sleep, laughter is a wonderful way to enhance your health. Most importantly, as demonstrated here, there is good reason to believe that laughter could become a widespread and effective complementary intervention for many diseases.

Post by: Isabelle Abbey-Vital

Do infections speed up memory loss in Alzheimer’s?

How does an infection affect the progression of Alzheimer’s disease?

Microglia (green) and neurons (red) living in harmony…for now. Image by Gerry Shaw, EnCor Biotechnology Inc. (Wikicommons).

Alzheimer’s disease (AD) is the most common form of dementia. It’s a neurodegenerative condition characterized by ongoing cognitive decline, loss of functions such as memory, and behavioural abnormalities. AD usually occurs amongst the elderly, and its prevalence is now so high that its estimated overall cost to society is five times that of cancer, heart disease and stroke. While AD was first identified over a century ago, research into its causes has only really begun to gather momentum over the past 30 years. Some think that damage may stem from the formation of toxic ‘plaques’ and ‘tangles’ that appear to be associated with brain cell death. Unfortunately, understanding what triggers this neurodegeneration has proven a much more troublesome challenge.

Some teams have begun to look at the contribution of inflammation to the development of brain atrophy (shrinkage). Inflammation is the first response of the immune system to infection or injury. You can recognise inflammation when you graze your skin or sprain an ankle: as well as pain, there may be redness, swelling and the area may feel hot to the touch. This is thanks to the extra blood carrying immune cells to the site of injury to prevent infection and aid repair. The inflammatory response is an innate and usually protective reaction to injury or infection, and requires co-operation between local ‘resident’ immune cells at the site of injury and circulating immune cells in the bloodstream.

The hippocampi (red) as seen from below the human brain, looking up. (Image from Wikicommons).

The main resident immune cells of the brain are called microglia; they respond to infection and injury to trigger an inflammatory response. Microglia are hyper-sensitive to changes in their local environment. When the brain is injured, they become ‘activated’. They change shape and behave differently, releasing different chemicals which can be toxic to brain cells. Some research has suggested that it’s possible that activated microglia break down the connections between cells in the memory centre of the brain, the hippocampus. Intriguingly, hippocampal damage and memory loss are the primary symptoms of AD.

Several lines of evidence have implicated microglial inflammation in AD, including the observation of activated microglia in the brains of people with Alzheimer’s, and the possibility that anti-inflammatory drugs may be neuroprotective. However, clinical trials have yet to show any efficacy.

Professor Hugh Perry and his research group at the University of Southampton investigate how inflammation contributes to the outcome of brain diseases. About ten years ago, Perry and his team developed an animal model (the prion mouse) to better understand the complex role of inflammation in AD. A chemical that causes neurodegeneration was injected into the mouse hippocampus (the memory centre of the brain), and the researchers studied the evolution of the resulting prion disease, which bears some similarities to AD. Thirteen weeks after the injection, although the mice appeared normal, more activated microglia were found in the hippocampus, even compared to surrounding areas of the brain. The researchers argued that this microglial activation was pathological, since the mice showed some behavioural disturbances and deficits in learning tasks.

Perry and his team suggested that systemic infections in patients with AD could worsen cell death in the brain, speeding up neuron deterioration and memory loss. (‘Systemic’ infections are so-named because they infect a number of organs and tissues or affect the body ‘system’ as a whole, instead of being localized in one area.)

Resting (top) and activated (bottom) rat microglia after a brain injury. Images by Grzegorz Wicher (Wikicommons).

To test this idea, the researchers compared prion mice given an ‘infection’ (or rather, an injection of a toxin released by bacteria, to mimic one) with uninfected prion mice. The prion mice given the mock infection had twice as many dead brain cells as the uninfected ones. The researchers concluded that the microglia are primed by the ongoing prion disease, so that when the infection is added they overreact, driving the production of a number of inflammatory chemicals and triggering a whole host of damaging effects on brain cells.

Perry and his team collaborated with other research groups to find out whether the evidence they had gathered was relevant to AD patients. In a small pilot study following 85 AD patients with moderate cognitive impairment over 2 months, they found that those who had infections showed a greater cognitive decline than the other patients in the study. This was the first evidence in a clinical setting that systemic infection may affect neurological disease progression. The next study involved 300 AD patients, 50% of whom had a systemic infection within the recorded 6 months. Patients who developed an infection during those 6 months suffered three times the rate of cognitive decline, compared with the small decline seen in those who had not had an infection.

These fascinating studies have provided the first clinical evidence that as well as inflammation in the brain driving damage, infection and inflammation in the body can also worsen and speed up neurodegeneration. It appears that brain-resident microglia become primed for activation, so that when patients suffer from a bodily infection, their brain cells become more vulnerable to the damaging effects of an inflammatory response.

Not only did this research provide ideas to potentially help AD patients today, but it also formed the basis for an important direction for current disease research. The evidence on the highly complex interplay between the diseased brain and systemic inflammation can be applied, not just to AD, but as a generic concept to many nervous system diseases.

Post by Isabelle Abbey-Vital

The Brain in Pain

How brain imaging technology is placing emphasis on the potential for the mind to influence our physiology, and how this influence should not be underestimated.

What really determines how we feel pain? Scientists now suggest that the complex emotional state of the brain may bias how we process, and thus experience, this highly protective sensation. What’s more, this may explain why our pain thresholds vary so widely and why some individuals are susceptible to conditions of chronic, prolonged pain.

Our ability to detect pain acts as an alarm system, protecting and guarding our bodies against potentially damaging aspects of our environment. This vital awareness is mediated by ‘nociceptive’ nerve fibres that innervate our skin and organs and feed into the pain-processing pathways of the central nervous system. These fibres run via the spinal cord and brain stem to the pain-generating centres of the brain, most notably the somatosensory cortex.

The complexity of the relationship between peripheral pain input, the actual painful experience and the subsequent report of the sensation makes documenting pain in clinical research particularly problematic. Over the past decade, scientists have been addressing the inherent flaws in pain research and how these limit our understanding and progress in anaesthetic therapeutics. Experimental neuroimaging is emerging as a highly efficient method in mainstream pain research for accurately identifying the key areas of the brain responsible for mediating these protective and highly important sensations. By giving access to the brain activity where these sensations originate, brain imaging allows a more accurate and objective measure of the pain experience.

The multidimensionality of pain may be explained by the variety of sensory and cognitive factors that ‘tune’ our individual pain thresholds. In anaesthetic research, much emphasis is now being placed on the emotional factors and thought processes that shape how we experience pain. Consider first the manner in which we allocate our attention to different aspects of our environment. Our ability to focus on relevant events and attenuate our responses to irrelevant ones is an integral component of higher cognitive functioning. So much so, that these attentional influences can have a direct impact on the intensity of our peripheral sensations. In 2003, researchers tested the power of distraction by presenting a series of unpleasant odours to subjects whilst subjecting them to thermal pain. They found, like many other psychophysical studies, that when a subject’s attention was focused on the pain, they described the sensation as more intense and unpleasant.

Then consider how our subjective experience of pain can be shaped by our expectations, and how these expectations can be attributed in part to individual trait differences: fear and anxiety may play a prominent role in the activation of pain pathways. Our cognitive predictions can be distorted by false beliefs stemming from inaccurate memories or inappropriate anxiety, colouring our interaction with similar situations in the future. In a 2006 neuroimaging investigation, researchers found that individuals who were more anxious about pain (as determined by the Fear of Pain Questionnaire) showed a heightened response in brain areas that encode the emotional aspects of pain, showing that their anticipatory fear could physically heighten their sensitivity to the painful sensation.

To the ‘normal person’, pain is a mostly acute and infrequent sensation such as a headache, a bruise or the occasional back twinge. For some people, however, pain can persist for months at a time and is often completely unexplained. Chronic pain states include migraines, neuropathic pain and arthritis. Chronic pain is one of the largest medical health problems in the developed world because it often cannot be efficiently managed or treated. This is not, however, for lack of trying. Currently, little research focuses on understanding the biology of chronic pain, but the development of neuroimaging techniques may be opening the first window of insight into the neurological framework of such conditions.

The problem with chronic pain states is that ‘secondary pain’ often develops as a consequence of the negative impact of unsuccessful treatments. Prolonged worry and emotional turmoil about a chronic pain diagnosis lead to the secondary development of mood disorders and depression. The majority of research in this area has found that a positive mood has a significant pain-attenuating effect, while a negative mood increases sensitivity to experimentally induced pain. Furthermore, population-based longitudinal investigations have observed that depressed individuals are at greater risk of developing chronic pain conditions than those without mood disorders. This evidence establishes a role for emotion-based tuning of pain modulatory systems and provides a basis for novel strategies in chronic pain management that address the negative lifestyle impacts associated with such conditions.

It is a common fallacy that placebo effects lack credibility or significance in modern healthcare. In fact, since the post-World War II introduction of placebo effects into mainstream medicine, scientists have used placebos in controlled drug trials to gather information on the qualitative nature of pain. A placebo is an ineffectual treatment (often a sugar pill) intended to deceive recipients into believing they are taking a pharmacotherapy for their condition. Placebo research has implicated prefrontal pathways in the brain as a source of cognitive pain modulation, because studies have consistently observed that activation of this area correlates with ‘emotional detachment’ from the pain, and thus a greater ability to cope with it. With its extensive connections to the emotion and pain processing centres, this area acts as a powerful modulator of expectation and reappraisal in the placebo effect, dampening fear by suppressing activation of the amygdala (the emotion centre). Remarkably, other personality traits, such as dispositional optimism, can markedly enhance placebo analgesia. This research reinforces the importance of positive expectations about the efficacy of a drug and may explain why many analgesic treatments for chronic pain are unsuccessful at a population level.

“To consider only the sensory features of pain, and ignore its motivational and affective properties, is to look at only part of the problem, not even the most important part at that.” These are the words of the pain researchers R. Melzack and K.L. Casey who, even 50 years ago, placed emphasis on the multidimensionality of pain perception. The rapid development of neuroimaging techniques means the next 20 years are predicted to be particularly fruitful for identifying new targets for surgical and pharmacological pain relief. In the meantime, it seems a ‘mind over matter’ attitude could be more beneficial than we would ever have expected.

Post by Isabelle Abbey-Vital

News and Views: The Brain Activity Mapping Project – What’s the plan?

“If the human brain were so simple that we could understand it, we would be so simple that we couldn’t” – Dr. Emerson Pugh

Isabelle Abbey:

An ambitious project intended to unlock the inscrutable mysteries of nerve cell interactions in the brain is on its way. Labelled America’s ‘next big thing’ in neuroscience research, the ‘BRAIN’ (Brain Research through Advancing Innovative Neurotechnologies) initiative will use highly advanced technologies in an attempt to map the wiring of the human brain.

Cajal drew some of the billions of neurons in the human brain; neuroscience has come a long way since 1899

Also referred to as the ‘Brain Activity Map’ (BAM) Project, the BRAIN initiative aims to decode the tens of thousands of connections made by each of the ~86 billion neurons that form the human brain. Scientists believe completing the map will be an invaluable step that may have huge implications for tackling neurological pathology therapeutically.

Moving forward in this manner does seem particularly appropriate. For the past 10 years, we have been reaping the benefits of technologies like fMRI and PET scanning, which have allowed us to visualize the brain in a way that has never been done before. From measuring behaviours to diagnosing abnormalities, the contribution of neuroimaging to our understanding of brain physiology and pathology is undeniable.

Paul Alivisatos, the lead author of the paper detailing the BAM proposal, aims to develop novel toolkits that can simultaneously record the activity of billions of cells in the living brain, rather than from brain slices. Eventually, these technologies could allow an accurate depiction of the flow of information in the human brain, and of how this may differ in pathological states such as Alzheimer’s or autism.

Despite the daunting nature of the task at hand, the proposal has been met with much political enthusiasm. On 2nd April, Barack Obama announced that the American government would back the project, approving a $100m funding budget for its first year of operation.

The humble nematode worm, 1mm long

But might this project need some grounding? After all, Alivisatos and his co-authors have yet to establish the basis on which such tools can be developed, or the extent to which these technologies could be used. Years of extensive research concentrated on mapping the wiring of the simple nematode worm, whose nervous system consists of only several hundred cells, have yet to allow us to accurately predict the worm’s behaviour. So some scepticism does seem reasonable.

While we must be cautious in predicting ambitious benefits from such a project, the map Alivisatos and his colleagues have envisaged gives reason enough to be hopeful for the next decade of our neuroscientific appreciation of human cognition.

Natasha Bray:

As a neuroscience researcher, I can’t help but take an interest in the BRAIN initiative proposed by President Obama earlier this month. It’s a massive pot of cash designed not only to further the neuroscientific knowledge base, but also to create jobs and technologies that can’t even be described yet. As Izzy mentions above, the project is an ambitious and important undertaking that merits the now fashionable label of ‘big science’.

The BRAIN initiative is funded by a big pot of money from different sources, including DARPA (the Defense Advanced Research Projects Agency), the National Science Foundation, the National Institutes of Health, Google and various other institutes and charities.

So far, even defining the project and choosing suitable methods has been a challenge. The research leaders have proposed “to record every action potential from every neuron within a circuit”. Bear in mind that action potentials (nerve impulses) last just a couple of thousandths of a second, while a single circuit may encompass many millions of cells. At the moment, neuroscientists can record action potentials from up to about 100 cells simultaneously. We can work out anatomical circuits; we just can’t record from every cell within them. No single tool in neuroscience’s toolbox is yet capable of gathering that kind of data.

There are, however, candidate techniques that could be improved or perhaps combined. Imaging techniques, including optical, calcium or voltage imaging, or magnetic imaging such as fMRI and MEG can scan on different scales in both time and space. Neurons’ electrical activity can be recorded using silicon-based nanoprobes or very tightly-spaced electrodes. Researchers have even suggested synthesising DNA that records action potentials as errors in the DNA strand like a ticker tape. Advances in all these technologies are still being made, making them the most likely candidates.

Added to the difficult choice of method is the serious task of storing and analysing quadrillions of bytes of data, plus the fact that it’ll take about ten years just to complete an activity map of the fly brain. It’s clear there are significant hurdles to jump. Then again, no one said big science would be easy…or cheap. But the potential benefits of big science are huge. The Human Genome Project had a projected cost of $3 billion, but was completed within its budget and has already proved a huge investment both intellectually and financially. It’s famously estimated that for every dollar originally invested in the Human Genome Project, an economic return of $140 has already been made.
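To get a feel for the scale of that storage problem, multiplying the neuron count quoted earlier by a plausible sampling rate gives a rough sense of the data deluge. The sampling rate and bytes-per-sample below are my own illustrative assumptions, not figures from the BRAIN proposal:

```python
# Back-of-envelope sketch of whole-brain recording data rates.
# Only the neuron count comes from the article; the rest are
# illustrative assumptions chosen to set the scale.
neurons = 86e9           # ~86 billion neurons in the human brain
sample_hz = 1000         # 1 kHz: enough to resolve millisecond spikes
bytes_per_sample = 2     # assumed storage per measurement

bytes_per_second = neurons * sample_hz * bytes_per_sample
print(f"{bytes_per_second:.1e} bytes per second")
# ~1.7e+14 bytes/s: a quadrillion bytes accumulates in about six seconds
```

Even with generous compression, recording every cell all the time would swamp today's storage and analysis pipelines, which is why the data problem is mentioned in the same breath as the recording problem.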

I see the BRAIN initiative as a very worthy cause, a good example of aspirational ‘big science’ and a great endorsement for future neuroscience. One gripe I have with it, however, is that it seems a little like Obama’s catch-up response to Europe’s Human Brain Project (HBP). The HBP involves 80 institutions striving to create a computer infrastructure powerful enough to mimic the human brain, right down to the molecular level. This raises the question: surely, in order to build an artificial brain, you need to understand how the real one is put together in the first place? I really hope the BRAIN initiative and the Human Brain Project put their ‘heads together’ to help each other in untangling the complex workings of the brain.

The Superhuman Savants

Savant syndrome is an incredibly rare and extraordinary condition in which individuals with neurological disorders acquire remarkable ‘islands of genius’. What’s more, these ‘superhuman’ savants may be crucial to understanding our own brains. ‘Savant’, derived from the French verb savoir, ‘to know’, describes those whose condition often has a profound impact on their ability to perform simple tasks, like walking or talking, yet who show astonishing skills that far exceed the cognitive capacities of most people. Autistic savants account for 50% of people with savant syndrome, while the other 50% have other forms of developmental disability or brain injury. Quite remarkably, as many as 1 in 10 autistic people show some degree of savant skill.

Kim Peek. Copyright Darold A. Treffert, M.D. and the Wisconsin Medical Society, from WikiCommons

The best-known autistic savant was a character played by Dustin Hoffman in the 1988 film ‘Rain Man’. What few people know is that this character was based on the unbelievable skills of a real-life savant called Kim Peek. Peek suffered from developmental abnormalities that meant he was born with a malformed cerebellum – which lies at the back of the brain and is important for coordinating movement and thoughts – and without a corpus callosum, the sizeable stalk of nerve tissue that connects the left and right hemispheres of the brain. Known to friends as ‘Kim-puter’, his astonishing powers of memory fascinated scientists for years. Quite literally, he had a phenomenal capacity to store extraordinary quantities of information in his mental ‘hard drive’, and could recall it at close to the speed at which a search engine scours the internet. By 2009 he had read 9,000 books, all of which he could recite by heart, and he could simultaneously read the left page of a book with his left eye and the right page with his right eye. What seems quite unbelievable is that he remained unable to perform simple everyday tasks such as buttoning his clothes; he could not comprehend simple proverbs and struggled greatly in social situations, yet he is considered one of the most powerfully gifted savants of all time.

Considering the vast repertoire of human ability, it is fascinating that savant skills mostly occur in a narrow range of just five specific categories:

1. Music

Leslie Lemke was born with cerebral palsy and brain damage, and was diagnosed with a rare condition that forced doctors to remove his eyes. Leslie was severely disabled: throughout his childhood he could not talk or move, he had to be force-fed to teach him how to swallow, and he did not learn to stand until he was 12. Then one night, when he was 16 years old, his mother woke to the sound of Leslie playing Tchaikovsky’s Piano Concerto No. 1. Leslie, who had no classical music training, was playing the piece flawlessly after hearing it just once, earlier on the television. Despite being blind and severely disabled, Leslie showcased his remarkable piano skills for many years in concerts to sell-out crowds around the world.

2. Art

Stephen Wiltshire was diagnosed as mute and severely autistic at an early age. Despite having no language or communication skills, at the age of 7 he began the first of many masterfully detailed architectural drawings of cityscapes, remarkable for their accuracy. Known as the ‘Human Camera’, Stephen can draw these landscapes after observing them only briefly. In 2005, Stephen completed an accurate, 10m-long panoramic drawing of the Tokyo skyline from memory after just one short helicopter ride.

3. Calendar calculating

George and Charles Finn, known as the ‘Bronx Calendar Twins’, were both autistic savants. Their particular skill was being able to name the day of the week for any date, past or future. This talent extended so far that they could accurately calculate any day 40,000 years backwards or forwards.
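The calculation itself can be written down compactly, even though the twins' mental method remains a mystery. A sketch using Zeller's congruence, a standard day-of-week formula for the Gregorian calendar (so it covers nothing like the twins' reported 40,000-year range, which spans earlier calendar systems):

```python
def day_of_week(year, month, day):
    """Zeller's congruence for the Gregorian calendar.
    Returns 0 = Saturday, 1 = Sunday, ..., 6 = Friday."""
    if month < 3:        # January and February are treated as
        month += 12      # months 13 and 14 of the previous year
        year -= 1
    K, J = year % 100, year // 100
    return (day + (13 * (month + 1)) // 5 + K + K // 4 + J // 4 + 5 * J) % 7

names = ["Saturday", "Sunday", "Monday", "Tuesday",
         "Wednesday", "Thursday", "Friday"]
print(names[day_of_week(1969, 7, 20)])  # → Sunday (the Moon landing)
```

A few lines of integer arithmetic: which makes it all the more striking that calendar savants perform the equivalent computation instantly, without knowing the formula.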

4. Mathematics

The first documented savant, in 1789, was Thomas Fuller, who was severely mentally handicapped but had unbelievably rapid mental calculating abilities. When asked how many seconds a man had lived who was 70 years, 17 days and 12 hours old, he gave the correct answer of 2,210,500,800 in 90 seconds, even correcting for the 17 leap years included.
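Fuller's figure checks out. The arithmetic he did in his head, written out:

```python
# Thomas Fuller's problem: seconds lived by a man aged 70 years,
# 17 days and 12 hours, counting the 17 leap days.
SECONDS_PER_HOUR = 60 * 60
SECONDS_PER_DAY = 24 * SECONDS_PER_HOUR

days = 70 * 365 + 17 + 17    # 70 common years, 17 leap days, 17 extra days
seconds = days * SECONDS_PER_DAY + 12 * SECONDS_PER_HOUR
print(f"{seconds:,}")  # → 2,210,500,800, Fuller's answer exactly
```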

5. Mechanical or Spatial Skills

Ellen Boudreaux, despite being blind and autistic, could navigate her way around without ever bumping into things. As she walks, Ellen uses echolocation: she makes chirping noises that bounce off objects in her path, and detects the reflected sound, a bit like human sonar.

Interestingly, savant syndrome is four times more likely to occur in men than in women. This intriguing difference has sparked much interest in the scientific community, and led to the ‘right compensation theory’ of savant ability. During foetal development, the left hemisphere of the brain develops slightly more slowly than the right, and is thus exposed to detrimental influences at different stages. High levels of circulating testosterone make the male foetus more susceptible to damage, because this sex hormone can impair neuronal function and delay growth of the vulnerable hemisphere. The theory proposes that the right hemisphere then compensates for this impaired growth by overdeveloping. So, while savants may struggle to walk or talk, skill development on the other side of the brain is highly advanced, which may produce these amazing ‘superhuman’ skills. Left-hemisphere damage is often seen in autistic patients, so this ‘left damage/right compensation’ theory may explain how the savant brain develops differently from others’. Although the theory seems credible, the highly diverse nature of savant syndrome means that no single hypothesis can explain every case.

It is important to note that not all savants have developmental neurological disorders; the syndrome sometimes emerges as a consequence of severe brain injury. Orlando Serrell is an ‘acquired savant’ who, at 10 years old, was struck violently on the left side of his head by a baseball. Following the incident, Orlando suddenly exhibited astonishing complex calendar-calculating abilities and could accurately recall the weather on every day since the accident. His case, and others like it, raises the intriguing possibility that a hidden potential for astonishing skills or prodigious memory exists within all of us, expressed only as a consequence of complex and unknown triggers in our environment. The prospect of dormant ‘superhuman’ gifts is a much debated topic, and may have a whole range of implications for the future.

These examples are just a few of the thousands of savants with autism and other neurological disorders in the world today. While current anatomical and psychological evidence struggles to explain the development of such skills, the reality of the syndrome challenges our modern understanding of ‘normal’ brain function. Until we can establish how savant skills emerge, it is difficult to be confident that any proposed model of human cognition and memory is a reliable representation of neurological behaviour.

By Isabelle Abbey-Vital

Love is in the air: why sperm love the smell of flowers and how this could be used as a fertility aid

Flowers have long been prized for their natural beauty and almost guarantee a positive reception when given as a romantic gift. However, it appears that a chemical responsible for floral smells (‘bourgeonal’) is also linked with love and romance in an intriguing and unexpected way.

As humans, our sense of smell is predominantly used to enrich and inform our experience of the world around us. This can be pleasurable (e.g. smelling a flower) or functional, allowing us to avoid harm (e.g. smelling food to check whether or not it is safe to eat). In most animals smell serves a similar function; however, many animals have the added ability to detect chemical signals known as pheromones. Many species use sex pheromones to drive reproductive behaviour, sometimes relying on such signals to communicate important information about an individual’s reproductive state or sexual potency.

The existence of an equivalent pheromone detection system in humans is highly disputed, mainly because the genes responsible for making pheromone detectors in humans are inactive, meaning pheromone receptors are not produced in the human nose. Furthermore, humans do not possess the organ (the vomeronasal organ) used to detect these ‘sex signals’. But does this mean that odour plays no role in the way we reproduce?

We experience smell through a series of specialised proteins known as ‘odour receptors’ (ORs). Although we normally associate these receptors with the nose, a healthy body of evidence has now identified these smell-detecting proteins in many other areas of the body. Most bizarrely, they appear to exist in sperm cells! Several questions come to mind: why do sperm need to smell, what can they smell, and could this be how we use smell for sexual communication?

It appears that certain ORs, found both in the nose and in sperm cells, respond to the floral compound bourgeonal – the smell we associate with flowers. It would therefore seem that male reproductive cells can smell flowers! This leads to an obvious question: what functional role does the ability to detect floral odours play in sperm cell physiology and human reproduction?

The answer may seem even stranger: research suggests that the floral scent detectors in sperm may help them navigate towards an egg.

Sperm are motile cells, propelled by a ‘tail’. They may move towards the egg by a process known as chemotaxis, in which cells direct their movement according to the relative concentration of chemicals around them. Experimental evidence shows that sperm cells direct their movement towards areas of high bourgeonal concentration. Does this suggest that, contrary to popular belief, eggs do not wait passively for sperm to arrive but in fact produce floral chemicals that attract the sperm to them? This would indicate that human female reproductive cells have acquired the capacity to produce chemical attractants, which may increase the probability of successful fertilisation. Interestingly, a sperm cell’s ability to smell can also physically influence its swimming behaviour: the swimming speed of sperm can be modulated by the presence of particular chemicals, and floral compounds increase both swimming speed and directed movement. So, in a particularly remarkable fashion, it appears that sperm cells can detect floral odours and use them for navigation.
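Chemotaxis is easy to caricature in code. The sketch below is a deliberately crude toy, not a model of real sperm biophysics: a point "cell" on a line, an arbitrary concentration field, and a rule that keeps only random steps that raise the local attractant concentration, so the cell drifts toward the source.

```python
import random

def concentration(x, source=100.0):
    """Toy attractant field: highest at the source, falling with distance.
    The source position and fall-off are arbitrary illustrative choices."""
    return 1.0 / (1.0 + abs(source - x))

random.seed(1)                     # reproducible run
x = 0.0                            # the cell starts far from the source
for _ in range(2000):
    step = random.uniform(-1.0, 1.0)
    if concentration(x + step) >= concentration(x):
        x += step                  # keep only moves up the gradient

print(f"final position: {x:.1f}")  # the cell ends up essentially at 100
```

Even this blind keep-the-better-step rule reliably finds the source, which is the essential point: a cell needs no map of the egg's location, only a way to compare "more attractant" with "less".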

This research may soon radically transform the way we understand sperm–egg communication, suggesting that female eggs may produce floral odours which attract sperm for fertilisation. Most importantly, such research could provide the basis for novel ways of manipulating human reproduction, offering advances in contraception and fertility treatments.

Infertility is often caused by a deficit in the number and quality of sperm, so bourgeonal could be used in IVF to enhance the swimming ability of sperm cells and their targeting of eggs. Conversely, a drug which blocks the detection of bourgeonal-like compounds could serve as a new and effective contraceptive, an alternative to hormone-manipulation strategies. However, the future of this research ultimately lies in identifying a female-produced floral ‘scent’, which would provide the first empirical evidence for the use of pheromone sex signals in human reproduction.

Post by: Isabelle Abbey-Vital


Why can’t we tickle ourselves while schizophrenics can?

Have you ever tried to tickle yourself? Try it; you will find that the feeling will be nothing like the sensation you get when someone else tickles you. But why is this the case?

The simplest answer is to assume that when you tickle yourself you’re expecting the sensation, so are less likely to react. However, functional magnetic resonance imaging (fMRI) has shown that activity in an area of the brain known as the somatosensory cortex is comparable both when subjects are tickled unexpectedly and when they are warned that they are about to be tickled. The brain, in other words, responds to an expected sensation much as it does to an unexpected one, meaning that expectation alone cannot explain our inability to tickle ourselves.

The brain constantly receives sensory input (information about our experiences communicated by our physical senses) from everything we touch, see, hear, taste and smell. This constant barrage of information must be sorted and processed by the brain’s sensory systems in order for us to make sense of the world around us. Arguably, the most important feature of normal brain processing is the ability to identify and extract information about externally induced changes in our environment. To differentiate between spontaneous environmental changes and those we cause ourselves, the brain categorises self-produced movements as less significant than those initiated externally to our bodies. Indeed, fMRI scans have identified increased activity in the somatosensory cortex in response to externally produced tickling (as in the study above), compared to little or no change in activity when participants tickle themselves. These data suggest that brain activity differs in response to externally and internally produced stimuli, reinforcing the neurological basis of our ability to consciously distinguish between the two.

Cerebellum in purple

Research suggests that this ability to recognise a self-initiated movement may depend on a structure at the back of the brain known as the cerebellum. Circuits within the cerebellum have been termed the body’s ‘central monitor’ and may be the key to distinguishing between self-produced sensations and external stimuli. Neurons in the cerebellum can generate accurate predictions about the sensory consequences of self-tickling. This system takes predictions about our movements and compares them with the actual sensory feedback produced by the action; the difference between the two is known as an ‘error signal’. If you attempt to tickle yourself, your internal ‘central monitor’ accurately predicts the sensory consequence, because the movement is self-produced, and the error signal is small or absent. In contrast, when someone else tickles you (even if you are aware it is going to happen), you cannot predict exactly what the sensory stimulation will feel like; that is, its position or strength. Therefore, there will be a difference between your brain’s prediction and the actual sensory feedback.
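The comparator idea can be sketched in a few lines of code. Everything here (the numbers, the threshold) is illustrative, not a model of real cerebellar circuitry:

```python
def error_signal(predicted, actual):
    """The 'central monitor' comparison: the mismatch between the
    predicted sensation and the sensory feedback actually received."""
    return abs(actual - predicted)

TICKLE_THRESHOLD = 0.2   # below this, the sensation is attenuated

# Self-produced touch: the motor command lets the brain predict the
# sensation almost exactly, so the mismatch is tiny.
self_err = error_signal(predicted=1.0, actual=1.02)

# External touch: no motor command to predict from, so the mismatch
# is large and the sensation is felt at full strength.
external_err = error_signal(predicted=0.0, actual=1.0)

for label, err in [("self-tickle", self_err), ("someone else", external_err)]:
    print(f"{label}: error = {err:.2f} ->",
          "tickles!" if err > TICKLE_THRESHOLD else "attenuated")
```

The point of the sketch is that the same touch produces very different error signals depending on whether a prediction was available, which is exactly the asymmetry the cerebellar account relies on.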

So it seems that you can’t tickle yourself. Well, at least this is usually the case. However, research has now stumbled upon a remarkable feature of schizophrenia showing that, unlike the rest of us, schizophrenics actually can tickle themselves! It has been suggested that this phenomenon may be caused by neurological changes in the schizophrenic brain which disable the patient’s ability to detect self-initiated actions. It is possible that biochemical or structural changes in the brain cause a malfunction in the cerebellum’s predictive system, resulting in a miscommunication of information about internally versus externally generated actions. Essentially, although the patient is able to process the intent to move and is aware the movement has occurred, they cannot link the resulting sensation (the tickle) with their internal knowledge of making the movement. This deficit in self-awareness or monitoring could cause thoughts or actions to become isolated from the internal appreciation that the patient is producing them; consequently, schizophrenic patients may misinterpret internally generated thoughts and movements as external changes in the environment.

Our ability to control the magnitude of our responses based on prior knowledge of our own actions has numerous advantages, including the ability to distinguish between real external threats, such as a poisonous spider crawling up our leg, and those we create ourselves, such as resting a hand on our own leg. Indeed, recognising the difference between an external threat and a self-induced false alarm may, in some situations, be the difference between life and death. The multifactorial basis of the tickling sensation indicates a staggering complexity of central processing in the brain. Science is currently unravelling these complexities and, with luck, this research may lead both to a better understanding of disorders such as schizophrenia and to novel treatment strategies.

Post by: Isabelle Abbey-Vital