Superstitious mice

One of the nicest things about being part of a large University is that, if you can drag yourself away from your desk long enough, you get the opportunity to attend some pretty amazing guest lectures discussing cutting-edge scientific findings. Last week I sat in on a particularly engaging talk given by a researcher from UCL on ‘superstitious’ mice. Although the title was a bit confusing, leaving me with images of mice refusing to leave their beds on Friday the 13th and saluting whenever they saw a magpie, the actual research gave an amazing insight into how the brain balances its own internal prejudices with its actual experiences of the world. The ‘take home message’ of the talk was that mice and men don’t always believe what they see and, on occasion, will act on what they expect rather than what is actually in front of them. The research behind this finding is both elegant and eye-opening and I will attempt to do it justice here:

Research in this lab was not initially intended to test the ‘superstitious’ nature of laboratory mice. The lab was instead interested in how well mice could distinguish between two images and how similar these needed to be before the animals became confused; research such as this is important for understanding how the visual system works. The experiments relied on the notion that mice can be taught to perform specific tasks in response to different commands (similar to training a dog…just on a smaller scale). Mice were kept in special cages with two separate treat dispensers and were taught to watch a screen which flashed one of two images. Each image corresponded to a different treat dispenser, i.e. when image 1 appeared the mouse could get a treat from dispenser 1 and when image 2 appeared the mouse could get a treat from dispenser 2. To make the task a little bit trickier the scientists sometimes manipulated the images, making them harder to tell apart, with the aim of confusing the mice.

This figure expresses the concept behind the test; however, the images used in this research were not coloured dots.

What was particularly impressive about this experiment was that the scientists worked with two groups of mice, one which performed the behavioural task and another which watched the same images whilst the researchers recorded activity from the visual areas of their brain. This meant that researchers could compare how well cells in the brain responded to the different images with how well the mice performed on the task. Now this is where the findings get interesting! The mice weren’t very consistent when it came to performing the task; meaning that sometimes they would perform well, even when the images were similar, whilst other times they seemed to be unable to recognise even the clearly separated images. However, when the researchers looked at the corresponding brain activity they found that the visual cells they recorded from were consistently good at differentiating between the images. This caused some serious head scratching as the scientists tried to work out how, when the mouse’s brain could distinguish the images, the mouse itself sometimes behaved as though it could not.

What the group found was that mice based the decision of which treat dispenser to visit not only on the image they saw, but also on their past experiences – taking into account which choices had previously led them to receive a reward. The mice tried to assign a pattern to the task, making assumptions based on what they had already experienced, then combined this internal prediction with what they actually saw. Amazingly these internal predictions (which the researchers called superstitions) could be strong enough to win out over the animal’s own vision, causing it to make the wrong choice. We can perhaps understand this behaviour better by thinking about the times in our lives when we assign patterns to things which are in fact entirely random. Take for example the national lottery. There was a time when you heard news reports speculating on lottery numbers, making the assumption that since a certain number had not been drawn for weeks it was ‘due’, whilst another number which appeared more regularly might be less likely to appear again. Of course the lottery draw is entirely random, meaning that the frequency of certain numbers being drawn in previous weeks has no influence on what the current draw will be. However, this did not stop us speculating and assigning our own patterns to the draws. It seems that the brain just loves to create patterns!
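One way to picture this tug-of-war between expectation and evidence is as a weighted blend of an internal prediction with what the senses report. The sketch below is purely my own illustrative toy – the numbers, weights and function are invented, not the researchers’ actual model – but it shows how a strong enough ‘superstition’ can override even clear visual evidence:

```python
def choose_dispenser(evidence_for_1, prior_for_1, superstition_weight):
    """Combine noisy sensory evidence with an internal expectation.

    evidence_for_1: probability (0-1) the senses assign to 'image 1'
    prior_for_1: the animal's expectation that dispenser 1 will pay out
    superstition_weight: how strongly expectation overrides the senses
    """
    belief = ((1 - superstition_weight) * evidence_for_1
              + superstition_weight * prior_for_1)
    return 1 if belief > 0.5 else 2

# The senses clearly say "image 1" (evidence 0.9), but past rewards
# have built a strong expectation that dispenser 2 is 'due'.
print(choose_dispenser(0.9, 0.1, 0.2))  # low superstition: senses win -> 1
print(choose_dispenser(0.9, 0.1, 0.7))  # strong superstition: expectation wins -> 2
```

With a low superstition weight the senses dominate and the ‘mouse’ picks dispenser 1; crank the weight up and exactly the same clear evidence loses out to the expectation – the wrong choice, despite perfectly good vision.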

However, we would never be silly enough to ignore what our eyes were telling us in favour of a ‘superstitious’ belief, would we? Well… before you sit back, quietly mocking the poor mice for being slaves to their internal pattern maker, it is worth noting that they are not the only species to fall foul of the problems of overthinking a scenario. Yes, you guessed it, it appears that we do this too! A follow-up experiment used a similar protocol with people and amazingly found that we also sometimes make the wrong decision, even though our eyes are perfectly capable of telling us otherwise. So it seems that when it comes to both mice and men our superstitions can occasionally get the upper hand!

Post by:  Sarah Fox

For original work see here (subscription necessary to view full article)

Beware the men (and women) in white coats

There’s something incredibly authoritative about someone wearing a white lab coat. The minute I get into the lab and put mine on I feel powerful, knowledgeable, wise. This attitude changes as I realise I haven’t got a clue what my results actually mean… but for those 10 milliseconds each day I feel like I know things about science.

The problem is that a number of people involved in marketing have also cottoned on to the fact that someone wearing a lab coat and/or glasses looks clever and appears trustworthy. This has led to a glut of adverts featuring ‘clever-looking’ people in lab coats telling you exactly why their toothpaste, pregnancy test or shampoo is the best. They often use fancy scientific-sounding words (which are sometimes entirely made up) to explain why their product is amazing, then seal the deal by flashing you a trustworthy, knowledgeable smile – ‘trust me, I look kinda like a doctor’.

A number of cosmetic companies use images of scientists in white coats to sell their products

Beware of these people! Just because they are wearing a lab coat, and usually glasses, doesn’t mean they are scientists or doctors. Even if someone is actually called “Dr”, this still doesn’t guarantee they know what they are talking about. I will (hopefully) be a doctor someday soon. When I finish my PhD I could put on my lab coat, fix you with a serious look, introduce myself as “Dr Walker” and give you a lecture about nutrition, shampoo or teeth; and you’d probably believe me. However, I know nothing about nutrition, shampoo or teeth for that matter and I will undoubtedly be talking absolute rubbish.

These advertisers are exploiting the fact that we are more willing to believe something if it is presented to us by someone who looks authoritative –  in this case, someone wearing a lab coat and/or glasses. Such ‘blind faith’ in authority figures was most famously studied by the psychologist Stanley Milgram in 1961. In his experiment subjects thought they were giving an electric shock to another person in a different room. They had been informed that the person being shocked had a heart condition. Someone in authority would then prompt the subject to administer an electric shock to the other person in increasing doses, often causing them to scream in pain or bang on the wall. Shockingly, many test subjects were willing to administer a potentially lethal dose of electricity as long as they were prompted to do so by the person in authority. Fortunately the test wasn’t real and the person being ‘shocked’ was just an actor, but this experiment showed how people are more likely to go against their own judgement if someone in authority tells them it’s OK.

Obviously I’m not telling you to distrust the authority of your GP or any other medical specialist. These guys have spent many years studying their field and generally know what they’re talking about. I’m referring to the people who crop up on TV self-assuredly promoting their own opinions on various controversial subjects, or trying to flog you some skincare products with promises like “it’ll reverse the ageing process due to the addition of polydeageinium*” or some other equally ridiculous statement. For an excellent assessment of the ‘science’ behind cosmetics and why the ingredient names they claim to give their creams are often totally bogus, see Ben Goldacre’s website.

A good example of how people’s trust in authority has been misplaced is the PIP breast implant scandal. A French company, Poly Implant Prothèse (PIP), was using potentially dangerous non-authorised silicone for breast implants (see here for more detail on this story). This incident may have occurred because people trusted the chain of authority above them: the patient trusted their plastic surgery team, and the plastic surgery teams trusted their supplier (which was not a huge leap of faith since the PIP implants had been given the ‘CE’ mark, meaning they met European quality assurance standards). The incident has led to fears that the low-quality implants may rupture; in some cases they have already caused patients a lot of pain. This story highlights how mistakes can be made and how blind trust in another person’s authority may not always be a sensible choice.

The PIP story also highlights how divisions can appear within the medical community, with different groups claiming different things – some say the danger of rupture from the PIP-supplied implants is higher than that of medical-grade implants, others disagree. This differing of opinion has become a political issue as well as a medical one (see here for more detail). Therefore, it’s also worth keeping in mind that opinions can differ even within the scientific community. This difference in opinion is not unusual since experimental findings are rarely black and white. However, understanding comes as more experiments are conducted, meaning that the consensus scientific opinion is often the closest to fact you can get. This means you should also be wary of ‘real doctors’ expressing opinions which are not held by the rest of the scientific/medical community.

Obviously, the PIP story is a rarity, but it does illustrate how sometimes people can blindly follow someone in authority, whether it’s a doctor, manufacturer or even the European Quality Assurance board! If you see someone on TV claiming to be a doctor or specialist in their subject giving their opinion on a matter which concerns you, it should be easy to search online and discover their credentials, and investigate whether what they are saying agrees with scientific opinion as a whole.

So be aware that a white coat and/or the fact that someone is a “doctor” does not automatically mean they know best. Trust me, I’m (almost) a doctor.

* “Polydeageinium” is not a real chemical. No one would ever seriously come up with a name that stupid would they? Would they??? Maybe I should copyright it, just in case…

Post by: Louise Walker

PKMZeta: a name to remember.

Will it ever be possible to delete certain painful memories from our conscious brains, as suggested in the film Eternal Sunshine of the Spotless Mind?

We all know what it feels like to remember something, like your first kiss or childhood home, but where in your brain are these memories stored, how do we gain access to them and is it possible to enhance or remove them?

These are questions neuroscientists have spent many years researching. The search for a physical manifestation of memory has taken us on a journey from the truly bizarre (for example, a now disproved theory assumed that specific memory molecules existed in the brain and that these could be transferred from one individual to another by eating brain tissue), to our current view that memories are spread throughout the brain and develop when small changes occur in the structure of and connections between neurons (for more detail on synaptic remodeling and plasticity see my previous post). We hope that the more we understand about memory formation and storage, the closer we will come to being able to manipulate memories and potentially offer relief to people with memory-related illnesses.

PKMZeta structure.

When a memory is first formed a number of proteins become active within participating neurons. These proteins help reshape the neurons, thus making the memory permanent. Once this reshaping is complete the proteins involved in the process once again become inactive. It was believed that once reshaping had taken place it would be difficult for us to further modify these neurons to remove or enhance specific memories. However, research conducted within the past 20 years is now questioning this assumption. Researchers have uncovered a protein (PKMZeta) which, unlike others involved in memory formation, remains active in cells long after the initial memory-forming event…perhaps indefinitely. This discovery led scientists to question whether PKMZeta may hold the key to maintaining memory and, if so, whether this system could be experimentally manipulated.

Amazingly it seems that this is indeed the case. Scientists have found that blocking the activity of PKMZeta days or even months after learning has taken place can interfere with a rat’s ability to remember a location, a specific taste or an unpleasant experience. Not only does blocking its activity lead to forgetting, but boosting its activity can also enhance old, faded memories.

Total brainwashing is certainly something we should avoid.

Although the discovery of PKMZeta may be a step forward in finding a treatment for memory disorders, it is important that we proceed with caution and ensure we understand the effects this protein has on the memory system before speculating over its pharmacological value. From current research, scientists believe that the memory enhancing or eradicating effects of PKMZeta are not specific to single memories; indeed, they may influence multiple memories at once. Therefore, it is important we understand what memory traces are altered by this protein and how it could be made more selective before considering its wider uses. Removing or enhancing multiple memories non-selectively is certainly not desirable! However, the stage is now set for progress in this field and as our understanding grows there may come a time when we can play a more active role in memory formation and retention.

Post by: Sarah Fox

The Basal Ganglia: Your internal puppet master

Have you ever left your house in the morning and wondered whether you locked the door or remembered to close the window? Have you ever arrived at your destination and realised you had no recollection of the journey? Have you ever completed any mundane task, whilst thinking about, well… nothing? If you have, it’s not the case that your memory is leaving you or that something is wrong with the inside of your head. In fact, things are probably working better than you think.

Every one of us is perpetually bombarded with an assortment of stimuli. You are constantly seeing, hearing, smelling, tasting and touching. There are also the less recognised senses such as balance, proprioception (your sense of your “body position”) and changes in temperature. Whilst you are not consciously aware of most of these sensations, under the bonnet your brain works through this huge array of information and sorts the important stuff from the chaff. Even when you are asleep, you might awaken only to critical sounds such as a baby crying in the next room but not, say, a car driving past the window.

But your brain doesn’t just subconsciously extract the interesting stuff. It takes this information and combines it with your internal body state (Hungry? Tired? Bored?) and uses this to decide how you should act. This processing allows you to interact with your environment, seamlessly performing the most complex or the most humble of tasks.

An example: you are sitting in a chair in a room and the window is open. There is a cold draft so you get up to close the window. You probably don’t think about how to rise out of the chair. When you walk over to the window, you aren’t aware of the hundreds of muscles working in concert to move you. You aren’t considering the position of your legs, how balanced you are or the sense of touch on the soles of your feet. But your brain takes these sensations and executes movements. It all happens automatically without any need for you to be consciously aware of the process.

Below is a different example. This man is playing music on his guitar. He has to make a series of movements that are precise in both time and space. He does this in response to the sound of the notes and feeling and seeing the position of his hands. As he progresses, the subsequent sensations trigger the movements for the next section of the piece. It’s not entirely automatic but he wouldn’t be able to play this piece without having practiced and learned it first.

[youtube http://www.youtube.com/watch?v=OhaFINynWqY?rel=0&w=480&h=360]

So why is such concentration required for playing a guitar but not for walking? From your brain’s perspective, any movement that you repeat can be considered “practice”. The more you do something, the better you become and the less you actively think about it. Therefore, it’s simply the case that you spend a rather huge amount of time “practicing” walking but not playing a guitar. Even the greatest Rock God doesn’t spend as much time swinging his axe as he does putting one foot in front of the other. If you picked up a completely new instrument, how much time do you think you would need to learn how to play it? A month? Two months? And how long was it before you learned to walk properly?

Regular guitar playing is also known to result in questionable fashion choices.

Practicing, learning and then reciting these movements is part of your procedural memory. Unlike other forms of memory, which are governed by the hippocampus, procedural memory is controlled mainly by the basal ganglia, with a bit of tweaking by the cerebellum. In a previous post, Sarah wrote about HM, an individual who suffered damage to his hippocampi resulting in permanent amnesia. Despite this, he could still be taught mirror writing when encouraged by the scientists working with him. When prompted, he was able to write in reverse with no effort, despite insisting he had no knowledge of ever having done it before!

Players such as Dan Carter are notorious for quick, incisive actions that are beyond that of many of their contemporaries.

Learning a new skill requires a large amount of effort and attention. However, through repetition, the effort and attention required to perform the task can be reduced. For some, the practice of complex motor skills consumes their entire lives. In particular, sportsmen and women have huge demands placed upon them during matches, both physical and mental. Whilst the activity of the basal ganglia and procedural memory is certainly not the brain’s only task, players who are quick-thinking and can dictate play are thought to have greater automation of their movement skills, thus freeing up their conscious mind to analyse the game around them.

Even for everyday souls like us this system is utterly indispensable. Below is a man with Parkinson’s disease, which primarily damages the basal ganglia. He is still able to move his limbs, but coordinating himself is a huge challenge.  In the second part of the video he is given a common treatment, L-DOPA, which provides temporary respite. However, eventually even this will not restore normality.

[youtube http://www.youtube.com/watch?v=sf1N0Zf5IqA?rel=0&w=480&h=360]

Illnesses such as Parkinson’s highlight why the basal ganglia, like so many parts of the brain, are fundamental to our everyday lives. Helping to treat such disorders is the primary reason for scientific research in this area. However, if it also helps us to understand why sometimes we don’t pay attention when we pack our bags in the morning or lock our front doors, then I think that can be quite interesting too.

Post by: Chris Logie

From secret agents to drunk rats

There’s a spy film (I can’t remember which one) with a famous scene where the secret agent and his enemy sit down for drinks. The agent secretly slips a pill into his mouth to counter the effects of the alcoholic beverages they both proceed to consume. Throughout the rest of the night, the spy retains all his mental faculties, knowing that meanwhile his enemy will succumb to impaired judgement, delayed reflexes and slurred speech. This is all caused by the alcohol binding to inhibitory ‘depressant’ receptors in the brain, called GABA-A receptors, and making them more active – in turn, slowing the enemy’s brain down.

Then there are the secondary effects alcohol has on the drunken enemy’s body. Alcohol limits the production and release of the antidiuretic hormone vasopressin, meaning that important salts and fluids are excreted by the kidneys in his urine. The alcohol irritates the stomach lining so much that his brain concludes that the stomach’s contents must be harmful, thus causing a feeling of nausea. The enemy’s sleep is also affected. As a compensatory reaction to the alcohol, his body produces glutamine, a stimulant which prevents deep restful sleep and can even trigger tremors, anxiety, restlessness and high blood pressure the next day. All in all, the next morning the enemy will experience the dreaded post-intoxication syndrome – also known as a hangover.

So what is the agent’s ‘magic’ pill that protected him from this dreaded sequence of events? Today the internet is full of suggestions, many herbal or vitamin-based. Not surprisingly, there is a huge market for ‘miracle’ hangover cures. Yet hardly any claim to be able to curb the primary effects of alcohol – feeling ‘drunk’. Recently, however, scientists at the University of California have tested a natural substance called DHM (taken from an Asian tree) on rats. The rats were given a dose of alcohol equivalent to a binge of 15-20 pints of beer. The rats that weren’t given the DHM lost their ‘flipping’ reflex (their ability to stand up after being pushed over) for over an hour. In contrast, the rats given DHM before the alcohol only lost their ‘flipping’ reflex for around 15 minutes. In other words, DHM made the rats extremely tolerant to alcohol.

Still, perhaps the most important finding from this study was DHM’s longer-term effects on alcohol addiction. Rats, just like humans, can become addicted to alcohol. If the alcohol was mixed with DHM, however, the rats drank much less than their untreated counterparts, possibly because it binds to the same GABA receptors that alcohol does but without the same ‘depressant’ effects. The researchers plan to test DHM on humans, with a view to hopefully using it to treat alcoholism.

Post by: Natasha Bray

Original article can be found here: (subscription required to read full article)

Can a brew help you beat type II?

Coffee and cake – a match made in heaven. It may also be healthier than you think – well the coffee at least. A recent paper has shown that drinking coffee may help prevent obesity-linked type II diabetes. The study showed that three chemicals found in coffee can stop certain proteins from misfolding, clinging together and becoming toxic. These clusters of misfolded protein, known as amyloid bundles, are thought to lead to diabetes by damaging insulin-producing pancreatic cells. This results in the pancreas losing its ability to make insulin and regulate blood-sugar levels.

This research provides a mechanism to explain previously observed links between drinking a lot of coffee and being less likely to develop diabetes. Brilliant – I’m going to go have a large slice of sugary cake and wash it down with coffee – no diabetes for me!

Alas though, it’s not quite that simple. The authors admit that some of the links between drinking coffee and a reduced risk of diabetes may be due to the appetite suppressing properties of caffeine. If you eat less, you’re less likely to be obese and as a result less likely to develop type II diabetes. So it could be the amyloid bundle busting power of coffee or it could be a reduced likelihood of obesity. It could even be both.
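This ‘could be either, or both’ problem is a classic case of confounding, and a toy simulation makes it easy to see how it arises. Every number below is invented purely for illustration: in this made-up model coffee is given no direct effect on diabetes at all, yet coffee drinkers still end up with less diabetes, simply because they are assumed to be less often obese:

```python
import random

random.seed(0)

def simulate_person(drinks_coffee):
    # Invented numbers: coffee drinkers are assumed less likely to be
    # obese (appetite suppression), and obesity raises diabetes risk.
    # Coffee has NO direct effect on diabetes in this toy model.
    obese = random.random() < (0.15 if drinks_coffee else 0.35)
    diabetes_risk = 0.25 if obese else 0.05
    return random.random() < diabetes_risk

n = 100_000
coffee_rate = sum(simulate_person(True) for _ in range(n)) / n
no_coffee_rate = sum(simulate_person(False) for _ in range(n)) / n
print(coffee_rate, no_coffee_rate)  # coffee drinkers show less diabetes
```

The correlation is real, but here it flows entirely through the hidden obesity factor – which is exactly why the observed coffee–diabetes link on its own can’t tell us whether the amyloid-busting chemicals deserve the credit.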

Damn it! Maybe just a skinny latte and a jaffa cake for me then. So although this study doesn’t provide an all-you-can-eat cake pass, it does suggest that coffee may have some positive health effects after all.

Post by: Liz Granger

Twitter: @Bio_Fluff

Original article can be found here: (subscription required to read full article)

Scientists are People Too

Nothing stops a conversation at a party quicker than the words “I’m a scientist.” I’ve lost count of the times I’ve had the following conversation:

“So, what do you do for a living?”

“I’m a scientist”.

“Oh, really? That’s fascinating, what are you studying?”

“Biochemistry and Cell Biology”

“Errrm …”

Unsurprisingly, no one knows how to carry on from there. This is mostly because I haven’t yet worked out how to verbalise my work into something remotely understandable (even to myself). However, I do believe that the unrealistic portrayal of scientists in the media makes it harder to explain what we really do on a day-to-day basis.

The problem I find with scientists in the media is that there seem to be only a few categories they’re allowed to fit into. Below is a list of what I believe are the main types of scientist presented to the public:

The Evil Genius: Sadly, I think the most common type of scientist in the media is the megalomaniac genius who tries to take over the world. This is an unfortunate stereotype and I can state with some confidence that none of the scientists I have come across in my career have had dreams of world domination.


The Super-Geek: Usually men but sometimes women too (see the U.S. sitcom The Big Bang Theory for examples of both). They are asthmatic, allergy-ridden neurotics with an inability to communicate with the opposite sex. Some scientists are indeed like this, but then again so are some accountants. The point is that this portrayal seems to indicate that most scientists suffer from social afflictions, which just isn’t true.


The Know-it-all: These seem to crop up a lot in Hollywood blockbusters. They often manage to know about an abnormally huge range of scientific theories which help to save the day. If the Know-it-all is female, there is a high chance they’ll be wearing a tank top and tiny shorts (for example Dr. Christmas Jones from the Bond film The World is Not Enough). I don’t want to say this is unrealistic, because perhaps there are nuclear physicists who go to work in tiny shorts and have an encyclopedic knowledge of everything scientific. However, most scientists are specialists in a particular field – e.g. cancer, astrophysics or biochemistry – and are unlikely to have the extremely broad range of knowledge the Know-it-all appears to have on board.

The Moral Vacuum: To me, this is the most frustrating portrayal. This scientist ignores any moral or ethical boundaries to make the next big discovery. A good example of this was in the BBC’s most recent series of Sherlock; specifically the Hounds of Baskerville episode. This is in general an entertaining show, but I was a bit dismayed by the portrayal of the scientists in this episode. They did cruel and unnecessary experiments on both animals and humans just to see what would happen. There was even a line where one scientist, asked why she was making fluorescent rabbits, replied “because we can.” In reality, doing any sort of animal experimentation requires a licence and there are legal documents in which you have to explain exactly how your proposed experiments will be beneficial and that they have a specific purpose. “Because we can” is not an excuse and will never be accepted as one. Don’t get me wrong, I know this is just a show, but it doesn’t do our reputation any good when it appears that scientists are willing to throw out any ethics to achieve their dream of making a famous (or infamous) discovery. Admittedly some scientists, past and present, may be morally dubious but on the whole we’re an ethical lot.

Generally speaking, most scientists live a relatively normal life and don’t fit any of these stereotypes. Many of my colleagues and scientist friends are in stable relationships and are perfectly able to talk to members of the opposite sex, including non-scientists. Many go out and have a good time and regret it the day after. We too have to deal with office politics and occasionally poor boss-employee relations. Personally, when I’m not at work I like watching Pixar movies, eating at nice restaurants with my boyfriend, going to the pub and other typical sociable activities.

Of course, scientists aren’t the only career group who are pigeon-holed by popular media. I’m sure lawyers have similar gripes about Ally McBeal, and doctors with ER or House. However, I do feel that we scientists have it particularly tough as most of the stereotypes presented are negative or even downright scary.

So take it from me, if you meet a scientist at a party, don’t assume that they are like any of the characters shown in the media. Although we do know some pretty interesting technical stuff, we are just as comfortable, if not more comfortable, chatting about films or which local pubs serve the best beer!

Post by: Louise Walker

Cats on the brain

Since their domestication in ancient Egypt, cats have carved their own niche within our society, controlling pests and delighting owners worldwide. Whether they are our own or our neighbours’ pets, the vast majority of western inhabitants interact either directly or indirectly with cats on a daily basis. Therefore, there is little wonder that at times we share more than just our living space with these animals.

Toxoplasma gondii

Toxoplasma gondii is a single-celled parasite whose life cycle is intimately connected with the cat. Indeed, this parasite is entirely dependent upon conditions within its feline hosts for sexual reproduction! Despite its dependence on cats for the sexual stage of its life cycle, T. gondii is capable of infecting all mammals…including humans. The parasite can be transmitted from cats to other mammals through ingestion of T. gondii eggs. An infected cat can shed up to 20 million eggs in a single dropping, with eggs surviving in the soil for over a year. Transmission occurs when the eggs are ingested by mammals feeding around the infected area. The most common form of transmission to humans is either through unwashed vegetables or undercooked meat.

When ingested by anything other than a cat the parasite reproduces asexually, forming small thin-walled structures called cysts which lie dormant in many cells, most notably those of the brain. Although the dormant parasites can remain in this state for the entire life span of their accidental host, they cannot reproduce sexually until they return to their primary host (the cat). Therefore, ideally, the parasite must find a way back to the cat!

T. gondii can lead to fatal feline attraction in rodents.

Scientists researching the effect T. gondii has on wild and laboratory rodents have recently uncovered the unsettling means by which this parasite ensures itself a safe return to its primary host. The parasite has been found to manipulate the rodents’ behaviour patterns, making them significantly more likely to be caught and eaten by cats. Infected rats were found to be more active and less intimidated by open spaces than non-infected animals, making them easy prey for a hunting cat. However, probably the most unusual finding was that infected rats, unlike their non-infected counterparts, were not scared of the smell of cat urine, actually spending more time in the area of their enclosure treated with this odour. This is particularly unusual since all uninfected rats, even those which have never encountered a cat, show a strong innate fear of this smell.

The mind control exerted by these parasites is probably linked to the presence of cysts within the host’s brain cells. Scientists are not yet certain which aspect of T.gondii infection causes these behavioural changes; however, it has been suggested that the parasitic cysts may be able to manipulate the host’s brain chemistry. Studies have found that levels of certain neurotransmitters linked to the control of movement and behavioural responses to fearful stimuli appear to be altered in infected mice. Specifically, recent findings show that the parasite carries two genes which can increase levels of the neurotransmitter dopamine in the host’s brain; this may account for the observed changes in the animals’ behaviour.

Of course, the idea of behavioural manipulation makes sense in the case of prey animals like rats and mice, but what happens when humans, who are unlikely to fall prey to cats, become infected? Current medical understanding of T.gondii infection in humans assumes the parasite has no notable effect on the host, with the exception of infection during pregnancy and the occasional adverse reaction to first exposure. However, in light of the recent findings in rats and mice, scientists have been taking a closer look at how T.gondii may influence our behaviour. Work by Professor Jaroslav Flegr has revealed what he believes to be particular personality types linked to T.gondii infection. He found that infected men had a greater tendency to disregard the rules of their society and were generally more suspicious, jealous and dogmatic than non-infected men, whilst infected women appeared more ‘warm-hearted’, outgoing and easy-going, but also more conscientious, persistent and moralistic. Both infected men and women also appeared more prone to feelings of guilt than their uninfected counterparts. Links have also been drawn between incidences of schizophrenia and T.gondii infection, perhaps due to altered dopamine transmission.

Since the basic components of our brains are not too dissimilar to those of the rat or mouse, it seems logical to assume that something which exerts an effect on their behaviour should also influence our own. The question is therefore now open as to how often these parasitic passengers actually jump into the driver’s seat. Indeed, T.gondii is not the only parasite carried by humans, leaving open the possibility that the development of our personalities has been, and will continue to be, influenced not only by our genes and environment but also by our own personal collection of brain-dwelling parasites.

Post by: Sarah Fox

Who do you think you are?

Do our brains define who we are?

The study of the self, what makes us unique and how our brains define who we are is an intriguing and often controversial area of research. Indeed, such work may eventually shape many aspects of society from education to criminal justice.

We already know that damage to the brain can permanently alter an individual’s personality and that the type of alteration depends on the region damaged. One of the most famous examples of brain damage leading to a shift in personality is the case of Phineas Gage. Gage worked as a foreman leading a team of workers preparing the ground for a new railway line. On the 13th of September 1848, an accidental explosion blew an iron rod, over 3 feet long and 1¼ inches in diameter at its widest point, clean through his head. Amazingly, he survived the incident, although he lost sight in his left eye and suffered significant damage to his left frontal lobe. Following the accident, although his intellect remained intact, it is reported that he changed from a conscientious, well-liked man into a fitful, disrespectful individual with a particularly foul mouth. Indeed, the changes in his personality were so significant that his former employers were forced to let him go.

Gage’s tragic accident was one of the first pieces of evidence linking frontal cortex damage to antisocial personality traits. The knowledge that such physical brain-damage can precipitate a change in personality raises the question of whether undesirable traits seen within the general population can be linked to subtle changes in brain function and ultimately whether these ‘defects’ may be treatable.

An area where a combination of genetics and functional brain imaging is raising just such a question is the study of psychopathy. Scientists have observed genetic and physiological differences in the brains of a number of diagnosed psychopaths. Specifically, brain scans of these individuals reveal lower-than-average activity in, and reduced communication between, the orbitofrontal cortex and amygdaloid brain regions. Together these regions are important for making emotionally driven decisions, learning from the emotional content of experiences and controlling impulsive behaviour.

These brain changes are often seen in combination with a rare form of the monoamine oxidase A (MAO-A) gene known as the ‘warrior gene’. This gene codes for a protein (MAO-A) which influences the communication of cells in a number of brain regions. Alterations in this gene lead to abnormal cellular communication and have also been linked to behavioural abnormalities. Indeed, individuals with the ‘warrior’ version of this gene are more susceptible to developing antisocial and often violent tendencies, especially if they also experienced maltreatment during early childhood.

Professor Fallon uncovered a long list of murderers and suspected murderers on his dad’s side of the family

Although studies of many psychopathic killers reveal the presence of both the ‘warrior gene’ and reduced orbitofrontal/amygdaloid activity, this is by no means the end of the story. Amazingly, James Fallon, a neuroscientist from the University of California who had been studying the brains of psychopaths, recently uncovered some disturbing family history of his own, leading to the disquieting revelation that he himself showed both the same brain type and the same genetic profile as the psychopathic killers he studied. Although he admits to not always being in tune with others’ emotions, Professor Fallon is a caring husband and father who has never been in trouble with the law. These findings raise the question of how important a biological predisposition towards psychopathy is and how it may be overcome. Professor Fallon notes that he was lucky to have a very warm and loving childhood, and believes that his upbringing likely played a role in preventing any biological tendencies toward psychopathy from taking effect.

Work such as this is beginning to uncover the amazing interplay between our brains and our selves. However, as knowledge in this area increases, many of us can’t help but question where this research is leading and how these findings may be used. Although I believe we are a long way from hearing ‘my brain made me do it’ as an acceptable defence in our courts, this research is already influencing our education system. One pioneering school in the south of England accepts children with behavioural difficulties associated with low amygdaloid function. Along with their regular lessons children are taught how to recognise and respond to emotions in a socially acceptable manner. These children are considered to be at risk of developing antisocial or even psychopathic tendencies and it is hoped that this training will help them integrate into society and lead normal successful lives.

Post by: Sarah Fox

Could stress actually help you pass a job interview?

Now there’s an intriguing question…

I’m sure many of us recognize that feeling just before an important interview when your palms get sweaty and your stomach becomes home to a kaleidoscope of exceedingly acrobatic butterflies. These unfortunate effects of anxiety appear largely unavoidable and are certainly not often classed as beneficial.

However, scientists from the University of Kiel in Germany believe that the smell of anxiety may induce feelings of empathy in others. This means that the nerves you feel may actually buy you some sympathy in that feared interview.

The study collected armpit (or, to use the fancier scientific terminology, axillary) sweat samples from a group of people both during exercise (the non-anxiety condition) and during a nerve-wracking oral examination (the anxiety condition). A second group were then exposed to these odors whilst undergoing an fMRI brain scan, allowing the scientists to monitor how the odors influenced their brain activity. Interestingly, although subjects did not consciously recognize a difference between the smell of exercise sweat and that of anxiety sweat, their brains told a different story!

The smell of anxiety activated a number of brain regions believed to be important for recognizing anxiety in others and converting these observations into feelings of empathy. These regions include the insula, orbitofrontal cortex, precuneus, cingulum and fusiform gyrus, shown below. These regions were not activated to the same degree by the smell of exercise sweat.

The red/orange regions highlighted here show the brain areas activated to a greater extent when subjects smelled the anxiety sweat than when they were presented with the exercise sweat.

This work suggests that our brains can detect the smell of anxiety on others and respond by making us more empathetic towards that individual. However, before you decide to ditch the deodorant, it must be noted that, despite these compelling fMRI findings, none of the subjects consciously reported feelings of empathy! This could mean that, although our brains can unconsciously register the smell of anxiety and prime us to feel empathy, further physical stimuli may be required before we can consciously recognize and act on these feelings.

So, to answer my question ‘could stress actually help you pass a job interview?’: perhaps, but more work needs to be carried out to find out whether these unconscious brain signals actually translate into conscious feelings of empathy in such a situation.

Post by: Sarah Fox

The full paper can be accessed free of charge here: