Bayes and Girls

You have long hair, never miss the Great British Bake Off and are particularly good at multitasking. You also work in Parliament. So are you more likely to be a man or a woman?

Well, some pretty lazy stereotypes aside, we may believe that the initial description is more likely to fit a woman than a man. The clichés certainly push us into thinking that way.

But, we should also consider the under-representation of women at all levels in British politics, a fact which has been highlighted in the news.

So, ignoring the introductory spiel, if we were simply asked what the chance is that someone picked at random from Parliament is a guy or a doll, how could we go about answering this? Well, let’s think in terms of the proportional representation of these two groups. The chances of picking a fella are certainly going to be larger. We can reasonably then go and place our stack of chips with the chaps.

What happens when we also consider the multitasking, baking-loving characteristics? These are things that, stereotypically, we would associate with the fairer sex so, we may reason, this person is more likely to be female. The result is a conflict between these two guesses.

How many men have long hair, like baking shows and can pat their heads and rub their stomachs at the same time? Actually, probably quite a few. Maybe not proportionally as many as women, but if we then include what we already know about the relative population sizes in Parliament, there may well be, in absolute terms, more men who fit this description. Simply by the prior fact that there are so many blokes in Parliament, this may well weigh the answer in their favour.

This example is a different take on a common illustration of decision heuristics, which can lead to cognitive bias. That is, the rules of thumb that you and I use to take mental shortcuts, focusing on one aspect of a problem rather than the whole thing. It involves presupposition and is an example of where our intuition can lead us astray. The description of the person’s characteristics seduces us into imagining a woman, but the baseline gender ratio, which is sneaked in afterwards, is probably a better indicator.

What this little example can show is how we can go about combining two separate bits of information to get an overall, rational answer. In statistics this can be done formally using something called Bayes’ theorem.

In general terms, Bayes’ theorem takes what we already know (called the prior) and combines this with what extra information or data we observe (called the likelihood). As a result, we then have an answer (called the posterior) that is influenced by both of these things. How much it is influenced by each depends on the strength and conviction of the prior belief or the data respectively. If they agree, then the answer is more certain than it would have been had we used only one of them in isolation; if they conflict, then the answer represents this too.

In the British politics example above, the prior could be what we know about the proportion of men and women, and the extra data is the description of the employee. Like a tug-of-war, these two bits of information pull us in different directions. For example, we cannot simply go along with the description in isolation and bank on a broad. The conclusion could be that, before hearing the characteristics, we are fairly sure we would get a gent at random and, even after hearing the profile, we may still think this is the most likely outcome, although we’re less sure about it.

Knowing that the mystery person has long hair moves us towards thinking that it is more likely to be a woman compared to before we knew anything about their hair-do but it’s just not enough to overpower what we already know about the parliamentary gender bias. That said, even though the flowing locks may not actually change our mind, they would introduce more doubt. Bayes’ theorem could help quantify this doubt.
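To make this concrete, here is a minimal sketch of the calculation in Python. The prior and the two likelihoods are invented numbers for illustration only, not real Parliament statistics:

```python
# Illustrative numbers only -- not real Parliament statistics.
p_woman = 0.3                # prior: assumed proportion of women
p_man = 1 - p_woman

# Likelihoods: assumed chance of matching the description, given each gender.
p_desc_given_woman = 0.10
p_desc_given_man = 0.05      # fewer men fit it, proportionally

# Bayes' theorem: posterior is proportional to prior * likelihood.
evidence = p_desc_given_woman * p_woman + p_desc_given_man * p_man
p_woman_given_desc = (p_desc_given_woman * p_woman) / evidence

print(round(p_woman_given_desc, 2))  # about 0.46
```

With these made-up figures, the description pulls the probability of a woman from 30% up to roughly 46%: a gent is still the better bet, but we are noticeably less sure than before, which is exactly the tug-of-war described above.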

Bayes’ theorem has applications far and wide, including spam filtering, internet search engines and voice recognition software. Originally, its statistical fundamentals were thought a little shaky, and they have been extensively discussed and argued over, but it is fair to say a lot of progress has been made and the theorem has attained acceptance in most fields. That said, it has some way to go before it’s nearly as popular as the Great British Bake Off.

Post by: Nathan Green

Aaahh!! Real Monsters!: How parasites and pathogens colonised fiction.

After the recent torrent of zombie everything and anything, it might feel like science fiction is just about done with weird parasites and diseases. But the mystery and power of organisms sometimes invisible to the human eye has inspired fiction for decades, including some of the most famous Sci-Fi monsters. I’d take a wager that we’re still a few undead away from the total eradication of fictional parasites.

Settle in, pull on a hazmat suit and a facemask, and we’ll delve elbow deep into the parasitic ooze of film, television and video games to take a good look at some of the best parasites and pathogens Sci-Fi has to offer.

Xenomorph or Alien – Alien franchise
Best get the big guns out right away. Alien is one of my all-time favourite films, centred around one of cinema’s most iconic and terrifying Sci-Fi monsters.

Xenomorphs steal resources from their host from within the host’s body, so we can call them endoparasites. They’ve got a pretty complex life cycle: some life stages need a host and some can live in the environment. This mixture of host dependency is seen quite often in real parasites, in human-infective worms such as the roundworm Ascaris, and flatworms (flukes) like Schistosoma and Fasciola. Like the Xenomorph, these worms use their human host as a place to reproduce or develop, whilst the free-living stages search through the environment for new hosts to infect.

Putting my well-practised, “parasite-life-cycle-specific” drawing skills to good use even years on from all of my undergraduate exams.

Real parasitic worms are fairly scary too, responsible for a huge burden of severe and chronic disease especially among the world’s poorest populations. Although we can at the very least be grateful that their method of exiting the host as eggs in the faeces is a little less violent than the “chestbursting” exit of the Xenomorph.

Ooh hello! Alien, Brandywine productions. Still taken from Backwoods Horror.

Genophage – Mass Effect video game series
Some of our fear of pathogens is really a fear of our own misuse of them as bioweapons. The genophage is a phage-like virus in the Mass Effect universe, deployed by the Citadel, an intergalactic governing body, against the Krogan race to control their population.

Phages are small, simple viruses that infect bacteria. In doing so, they are able to insert genetic material, from themselves or from other host cells, into that of their current host. The modus operandi of the genophage virus is not too dissimilar: it inserts a specific mutation into all the body cells of Krogans that prevents pregnancies from carrying to term.

Phages have the power to turn the fairly unpleasant Escherichia coli bacterium into a thoroughly horrible and occasionally fatal O157:H7 form. Scientists are now trying to harness this ability, but for much less nefarious purposes. It’s hoped that modified phages could provide a new mechanism of delivering vaccines or medical treatment against certain infections: seriously cool stuff.

Ceti Eel – Star Trek II: The Wrath of Khan

As a complete non-Trekkie, my one-time viewing of 1982’s The Wrath of Khan didn’t give me a full idea of the wonderful world of Star Trek zoology (TRIBBLES. LOOK AT THEM).

TRIBBLES. Star Trek: The Original Series. Desilu productions. Still taken from Wikimedia Commons.

From that one film I was introduced to Ceti Eels, fantastic parasites that set off my love for the gory and gruesome in a manner only paralleled by real parasites on the level of loiasis and Chigoe fleas. After incubating in the body of its parent, the developed Ceti Eel enters a host through the ear, worming its way into the skull cavity and attaching to the cerebral cortex. As you can imagine, this is hugely painful.

The Ceti Eel then unveils its crowning weapon: mind control. Or to be more precise, the infected are left susceptible to suggestion – fantastic news for the enigmatic antagonist, Khan.

Mind control must surely be confined to Sci-Fi? Not so. Both the Ophiocordyceps fungus and Dicrocoelium fluke worms can manipulate their host’s behaviour to suit their own ends. The juvenile stage of the fluke is released by snails as cysts in their slime. Ants eat said slime for its moisture. Once in the ant, one key worm makes its way to the ant’s central nervous system and convinces it to climb to the top of a blade of grass and clamp down, waiting right on show to be accidentally eaten up by a cow or sheep. The worm drives the ant to get itself eaten. The real mind-controlling worm is even better at its job than the fictional eel!

Why are there so many parasites in Sci-Fi (and why are they all so damn cool)? Art and culture are vital for exploring and communicating the world around us. This stands just as true for science fiction, and just as true for the gory and the weird that nature likes to throw at us. The strange and exciting parts of nature are what take our piqued interest, and drive us to fascination and awe. So, while the current zombie tidal wave might just be past its peak, I reckon as long as we have fantastic, powerful, utterly disgusting parasites from which to draw inspiration, we’re going to be telling stories about them for a long time to come.

This post, by author Beth Levick, was kindly donated by the Scouse Science Alliance and the original text can be found here.


What’s going on in your head?: The science behind our inner voice

As a neuroscientist, one aspect of brain science that has always intrigued me is the idea that we may never know exactly how another person experiences the world and whether their experiences differ from our own. I know what a red ball looks like to me, but how do I know that you’re seeing the same thing? In fact, I’ve often wondered what it would be like to see the world through the eyes of someone whose perceptions differ from mine, for example someone with colour blindness or face blindness.

Sadly though, I’ve always assumed that my own experiences are disappointingly mundane and ‘average’. That was until ‘life guru’ Karl Pilkington taught me otherwise…

A few months ago, during a particularly long experiment, I was passing time listening to old excerpts from the Ricky Gervais show when I came across the following dialogue:

Reading from Karl’s diary: “While I sat listening to The Kinks on my iPod, I wondered if everybody thinks in their accent. I know I do.”

Stephen: What’s this? What are you talking about?

Ricky: How do you know you think in your accent? Tell me a typical thought

Karl: I thought “that’s weird innit?” not “that’s weird isn’t it?” and I thought “I actually think in my accent”

Ricky: No, but, when I think I don’t think the sentence as like I’m saying it, it’s just a thought, the thought appears, it’s conceptual and it’s already there. It’s not like I go, “Rick?” “What?” “Just err… looking at that fella over there were you?” “Yeah, I was yeah. Erm, I was think he looked a bit weird” “Oh, so was I”, I don’t think out whole sentences…

Stephen: Is that how your mind works?

Karl: In a way, yeah

Ricky: Brilliant, it’s great, he has to think out whole sentences!

Stephen: That explains a lot!

This sparked my curiosity since, as far back as I can remember, I’ve always thought in complete sentences, often to the extent that I have conversations with myself inside my own head – I just assumed that this was a pretty normal thing to do!

So, I decided to do a bit of my own research into this ‘inner monologue’. This research began life (as many eminent and respected studies often do) on Facebook, where I asked a number of friends:

“What is it like to climb inside someone else’s head? – I’m researching for a post on the inner monologue and, although I think in words like I’m narrating my own life, apparently there are people who don’t…what’s it like inside your head? and if you don’t think in complete sentences, how do you think?”

From this question I got some pretty interesting answers. In brief, most people who responded had some kind of inner voice, but few regularly thought in complete sentences or engaged this voice in conversation. Some interesting answers included:

“I think in pictures like I’m watching a silent film. In order to submit things to memory I have to have visuals as i struggle to remember audio descriptions. So most of my memory is made up of pictures and that’s how my thought processes work!”

“I sometimes imagine a highly adapted version of something I’ve read or watched – featuring me – and tailored to my real life situation of the time. Less actual words, more images, but like I’m an outsider observing myself observe my situation.”

“I think I only think in words when I’m either a) questioning something (“why’s that there?”) or b) making a decision to do something (“cup of tea!”). I often say such things aloud too when I’m alone.”

“I was wondering about my very minimal inner monologue after talking to my husband about it earlier this week. I find it incredible how most people seem to constantly be thinking in words/sentences. It sounds exhausting to me. I think in actions, visualizations, feelings, impulses and only really have a proper inner monologue when reading or writing. I never know internally what I’m about to say out loud (unless I force myself to do so, or if I’m nervous about talking in specific situations). Often my mind seems blank with no thoughts. I find meditation very easy.”

“I have narrated my life for as long as I remember. Sometimes, when something is particularly challenging, I sort of Parkinson interview myself, as if the problem is now in the past, and I’m discussing how I overcame it….I’ve done that since I was a teenager!”

So, it seems like people experience a huge spectrum of inner ruminations –  from short sharp assertions “cup of tea!” to long complex “Parkinson style” inner interviews.

But what do scientists actually know about this inner voice? Well, unfortunately it seems that this is one topic that has long been neglected by science. However, inspired by the theories of L. S. Vygotsky, modern research has now picked up the baton and started to delve into the inner workings of the verbal mind.

Where does the inner voice come from?:

Vygotsky believed that inner speech starts to develop in early childhood, evolving from a phenomenon known as ‘private speech’. Many young children talk to themselves while playing – I remember I used to talk to myself, I’d also sometimes have conversations with inanimate objects (perhaps a downside of being an only child?). Vygotsky called this dialogue private speech and suggested that it comes from social dialogues with parents which, in later childhood, becomes internalised as inner speech.

This would imply that inner speech relies on the same biological mechanisms as those used when we speak out loud. Interestingly, we know that inner verbalisation is accompanied by tiny muscular movements in the larynx – it’s as though audible speech is almost produced but is then silenced at the last minute. If anyone’s like me, they may have experienced the phenomenon of externalised inner speech: when I’m deep in internal thought I’ve been known to accidentally say things out loud which should have stayed in my head.

Neuroscientists have also found that an area within the left inferior frontal gyrus, known as Broca’s area, is active when we speak out loud and also during inner speech. Intriguingly, if this region is disrupted using magnetic brain stimulation both outer and inner speech can be altered.

And, to answer Karl’s question….It has been suggested that, assuming inner speech derives from childhood verbalisations, the voice you hear in your head should sound like your own voice – as Karl would say “everybody thinks in their accent”.

Interestingly, studies of limericks suggest that this is indeed the case! Ruth Filik and Emma Barber from the University of Nottingham asked participants to read two limericks silently in their heads, these being:

1) There was a young runner from Bath, Who stumbled and fell on the path; She didn’t get picked, As the coach was quite strict, So he gave the position to Kath.

2) There was an old lady from Bath, Who waved to her son down the path; He opened the gates, And bumped into his mates, Who were Gerry, and Simon, and Garth.

All participants were native to the UK, some having northern accents and others southern. In the UK there is a strong regional divide in the pronunciation of the words bath and path, with southerners rhyming bath/path with Garth while northerners rhyme bath/path with Kath (this being the correct way to pronounce things). By tracking participants’ eye movements the researchers were able to tell when they were reading a rhyming or a non-rhyming sentence. From this they found that both groups appeared to read silently in their own regional accent (although this is not always the case).

So, what does inner speech actually do?

Vygotsky thought that inner speech may help people to perform difficult tasks. Thinking a task through in words may make it easier to accomplish – there are definitely a lot of words going through my mind when I’m building Ikea furniture. Actually, a number of studies have found that people tend to perform worse on tasks which require planning (like playing chess) if their inner voice is suppressed while performing the task.

Recent studies have also found that inner speech often has a motivational quality. In fact, one of my friends offered this example of her inner voice: “I tend to ask myself questions and then think through the different answers. Also I cheer-lead myself along- ‘Right, ok, you can do this!’”.

The self-reflective tendency of the inner monologue may also allow us to reflect more on who we are as individuals. Indeed, Canadian psychologist Alain Morin suggests that people who use inner speech more often also show better self-understanding: “Inner speech allows us to verbally analyse our emotions, motives, thoughts and behavioural patterns,” he says. “It puts to the forefront of consciousness what would otherwise remain mostly subconscious.” This idea is further supported by the account of neuroanatomist Jill Bolte Taylor, who reported a lack of self-awareness after a stroke which damaged her language system.

But, I doubt my friends who reported the lack of an inner voice suffer from any associated lack of self awareness. Therefore, I’m sure that there are still a number of individual differences which remain unaccounted for in these studies.

The dark side:

Just as your inner voice can be your own personal cheerleader, giving you a boost when you’re feeling low, it can also be your worst enemy. Alongside my Facebook friends, I also posed my question to a group of individuals who, like myself, have been or are currently struggling with depression and/or anxiety. I was intrigued to find that, of all 30 responses, only a couple of people reported not having an internal monologue and most said that their inner voice was conversational (like my own). Not just this, but most also said that their inner voice was ‘nasty’ and ‘cruel’, repeating phrases such as “you are useless” or “you aren’t good enough”.

There are a number of studies which support this observation, specifically suggesting that depressed older people rely more heavily on negative internalised speech than on social communications when constructing their view of reality (giving them a negative outlook on life). Indeed, the backbone of cognitive behavioural therapy (a commonly used tool in the treatment of mental illness) relies on teaching individuals to re-frame or alter negative thought processes like those mentioned above – “I can’t do it” may become “it’s a challenge, but I’m capable given enough time”.

Researchers are still not sure how the inner monologue, negative thought processes and social isolation interact in the case of depression. It may be that withdrawal from social interaction leads to a greater dependence on internal processes or perhaps disordered negative thoughts breed the need to withdraw from society. Whatever the case, a better understanding of the mechanisms behind our inner critics may help understand and treat those suffering from depressive illnesses.

Researchers from Durham University found that around 60% of people report that their inner speech has the to-and-fro quality of a conversation. So, despite Ricky and Stephen’s surprise, it seems that Karl perhaps isn’t that abnormal after all. With inner speech being such a wide-spread phenomenon and knowing its possible links with mental health, perhaps it’s time scientists paid a bit more attention to the little voice in our heads?

Post by: Sarah Fox

A Spinal Emulator for Medical Training

The complexity of the human body requires medical practitioners to have an astute working knowledge of anatomy and physiology. It’s no surprise, then, that pursuing a degree in medicine is a challenging and costly pursuit!

One of the most enigmatic and challenging regions of the anatomy to diagnose is probably the spine. To diagnose problems with the lumbar region (the lower part of the spine), a doctor will lay a patient flat on his/her stomach (to achieve what’s called a lordotic spine shape). The doctor will then use the pisiform (the bony region at the bottom of the wrist) to apply pressure to each of the lumbar vertebrae. By feeling how the vertebrae respond – their stiffness and how far they move – an experienced doctor can diagnose problems. In case you can’t visualise that, check out this demonstration.

To become competent at this takes time. In fact, in medical circles it’s seen as a skill that takes years of practical experience, rather than something that can be picked up in a three-hour workshop… But back problems are becoming pretty commonplace, especially with our increasingly sedentary lifestyles, so this is not a skill that should be in short supply!

But why is it such a difficult skill to acquire?

Well, currently medical students learn to diagnose spinal problems by practising on lifeless, unrealistic plastic models.

That is, pushing down onto plastic vertebrae that don’t move or feel anything like a real spine. Following this, medical students will then continue to try it out on each other. But how can you teach students exactly what, for example, a degenerated disc feels like? Short answer: you can’t learn the realistic feel of it without trying it out on a real patient.

Not only that, how can a professor teach his students how fast or hard to press down on the spine, without knowing exactly how hard or fast a student is applying pressure?

My solution

The crux of this problem is that we’re currently using ‘low tech’ to teach this method. This project involves using some simple, low-cost technology to create a spinal emulator that a student can use to learn this method by simulating both a healthy and an unhealthy spine.

Something like this, which I have designed as an initial prototype…


At the top of this model are five lumbar spine vertebrae, the sacrum and the coccyx. Four of the lumbar vertebrae are mounted on flexible metal bars to provide passive “springiness”. But one of the vertebrae (Lumbar 3) is not. This vertebra is mounted onto a linear actuator (basically a motor system that can move up and down – it’s the black thing in the design above). Underneath the linear actuator is a load cell, a sensor that can measure force.

So we have a system where one of the vertebrae can be electrically moved, and the pressure applied to it can be sensed. Hopefully you can see where I’m going with this. We can use some embedded electronics to move the vertebra up or down, according to the amount of pressure being applied to it. With some control algorithms, we can essentially simulate what a real vertebra would feel like, by simulating a force-displacement ratio (stiffness ratio). We can go further and simulate varying stiffness profiles to correspond to specific spine problems. In other words, we can, in theory, simulate any spine problem.
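As a rough, heavily hedged sketch of what such a control algorithm could look like, here is an illustrative Python model of a piecewise-linear stiffness profile. All the numbers and both profiles below are invented, and the commented-out device calls (`read_load_cell()`, `set_position_mm()`) are hypothetical stand-ins for whatever the real sensor and actuator drivers provide:

```python
# Illustrative sketch only: the stiffness values and profiles below are
# invented, and no real hardware drivers are used.

def displacement_for_force(force_n, profile):
    """Map an applied force (N) to vertebra displacement (mm) by walking a
    piecewise-linear stiffness profile: a list of (force_threshold_n,
    stiffness_n_per_mm) pairs, ordered by increasing threshold."""
    displacement = 0.0
    prev_threshold = 0.0
    for threshold, stiffness in profile:
        segment = min(force_n, threshold) - prev_threshold
        if segment <= 0:
            break
        displacement += segment / stiffness  # x = F / k within this segment
        prev_threshold = threshold
    return displacement

# A healthy vertebra might stiffen under larger loads; a degenerated disc
# could be modelled with a softer initial segment. Invented values:
HEALTHY = [(20.0, 8.0), (60.0, 15.0)]        # (force N, stiffness N/mm)
DEGENERATED = [(20.0, 4.0), (60.0, 10.0)]

# The embedded control loop would then repeatedly read the load cell and
# command the actuator, e.g. (hypothetical driver calls):
#     force = read_load_cell()
#     set_position_mm(-displacement_for_force(force, HEALTHY))
```

Swapping the profile swaps the “feel”: under these invented numbers, the same 10 N push moves the degenerated model twice as far as the healthy one.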

Why are we only simulating one vertebra?

Simple answer: cost. This is just a prototype, our budget is small, so it’s a proof of concept. If it works we’ll be looking to create a fully actuated spinal emulator.

So will it work?

Hopefully. But the real value of the system will be in how well it can simulate the real feel of a human spine. And that depends a lot on the algorithms I use… Seems I’ll need to learn the method before I can even attempt creating software to simulate it – any volunteers?

Guest post by Josh Elijah: @yoshelij

Josh is an electronics engineer with a passion for robotics and control systems. He is currently working on a range of projects at Manchester University before embarking on a PhD.

Thinking on your feet: The effects of dance on the brain

It’s nearing the end of September: a month for colourful autumn leaves, freshly sharpened pencils and pumpkin spiced lattes.  For many dance music fans, it’s also time to head to the island of Ibiza for the legendary closing parties at some of the world’s greatest clubs. Typically, nights on the ‘white isle’ see clubbers dancing well into the night and early hours of the morning.

Let’s go dancing: DJs Disciples get the crowd moving at this year’s Cream Ibiza closing party. Credit: James Chapman Photography.

But how does dancing affect us? As anyone who has ever gone to a club night, ceilidh or even a Zumba class can testify, dancing can be excellent physical exercise, raising our heart rates and burning hundreds of calories. However, there is now growing evidence that dancing can also change the way you think.

Just ask professional dancer turned academic psychologist, Dr Peter Lovatt. Dr Lovatt runs the Dance Psychology Lab and researches the links between dance, problem solving and creativity (watch his TEDx talk). According to Dr Lovatt, the benefits of dancing are obvious: “dancing made me feel relaxed and stress free, it helped me to think more clearly, and it felt like the most natural thing in the world to do.” But where’s the empirical evidence for this claim?

One emerging area of research studies how different types of dancing can improve different types of problem solving. In a recent study, researchers tested the relationship between dancing and ‘divergent’ thinking; that is, creative thinking tasks with multiple solutions, such as brainstorming. In the experiment, primary school children were randomly allocated to participate in 10 minutes of either ‘improvised’ dance (the experimental group) or ‘command-style’ dance, where they learned a simple routine (the control group). The children then performed a creative toy design task. The results revealed that children assigned to the improvised dance group performed significantly better than the control group. In other words, improvised dancing seemed to boost the children’s creative thinking ability.

There is also growing interest in how dancing can help maintain healthy brain function in older age. Whilst the link between exercise and healthy cognitive function remains uncertain, it remains a key area of interest for researchers. However, fitness may not be the only mechanism involved. Indeed, dancing involves a combination of elements which may be beneficial, including social interaction, musical stimulation and cognitive reasoning (i.e. literally thinking on your feet). In one study, 35 older people who took part in a dancing programme, for over six months, showed a range of cognitive improvements, including improved working memory and reaction times. Yet within the group cardio-respiratory performance did not change. Furthermore, in an American cohort study that tracked over 400 older adults over several years, dancing was the only physical activity linked with lower risk of dementia. This suggests it might not necessarily be just the work-out factor involved in dancing that helps to protect cognitive and perceptual abilities.

Researchers have also explored the therapeutic effects of dance for treating clinical conditions. The findings of several small-scale studies indicate that dancing may be beneficial for people with certain neurodegenerative disorders, like dementia.  For example, residents of a dementia nursing home who took part in weekly dance sessions as part of a research study gained small improvements in certain visual functions and planning ability. Dancing may also help people with mental illness. In one study involving patients admitted to a psychiatric ward, just 30 minutes of dancing to lively music was sufficient to reduce their symptoms of depression and improve vitality.  The interesting thing about this study is that researchers also recruited a second group of patients to simply listen to the same music, without dancing. The results showed that only the patients who danced derived any benefit: in other words, for these patients music alone wasn’t enough.

Of course, the evidence in this area is still emerging and better quality studies are needed to fully understand how dance affects the brain.  The research that has been done still leaves lots of unanswered questions, like what are the effects of different types of dancing and does it matter what type of music you listen to? In the meantime, however, the next time you head off to Ibiza, Zumba or even just dance around the kitchen, just consider the possibility that you might be doing yourself more good than you think.

See you at the front.

Post by: Lamiece Hassan

Aromatherapy: what is it and does it actually work?

We all know that smells can affect the way we feel. Indeed, essential oils were used regularly in ancient Egypt and India as an adjunct to improve health and well-being. These oils are usually extracted by steam distillation from fragrant plants such as lavender, rose, orange, cinnamon or peppermint, to name just a few. The oils can be inhaled, used during massage, or even ingested.

It is theorised that the effect scent has on mood may be mediated by the architecture of the olfactory system. The areas of the brain that process scents are directly connected with areas involved in processing emotions, memories and autonomic responses.

Let’s start from the beginning, i.e. the nose. Here, receptors on olfactory neurons detect odorants (the chemicals which form a scent) and transform these particles into electrical signals. These signals travel along the olfactory nerve to the olfactory bulb in the central nervous system (Kadohisa, 2013). The olfactory bulb forms connections with other brain areas such as the amygdala (the centre of emotions) (Wilson-Mendenhall et al., 2013) and the entorhinal cortex (important in memory) (Takehara-Nishiuchi, 2014). The amygdala, in turn, is connected to the hypothalamus, a part of the brain that regulates physiological states, e.g. controlling the release of stress hormones. This is one reason why smells can have an impact on our mood and why they evoke such strong memories. Can you think of any smell which conjures up a memory for you? – If so, let us know in the comments below!

A number of people find that essential oils can affect their mood, but these are not the only odorants which can have this effect. If you like spending time in nature you have probably noticed that being surrounded by vegetation can reduce stress. One study suggests that “green odour” (the scent of leaves and vegetation) changes the electrical signals in our brain in a way that brings about a sedative-like action, reflected in a feeling of relaxation (Sano et al., 2002). Studies on rats have shown that this effect could be due to the action of the green odour on the brain circuit which releases adrenaline and cortisol (the hypothalamic-pituitary-adrenal axis) (Nakashima et al., 2004).

Another botanical scent, the essential oil of rose, may have a similar effect on the brain’s stress circuitry (Fukada et al., 2012). Women who carried a test paper soaked in rose essential oil for several days during an exam period showed no change in their cortisol levels, while those students supplied with a jasmine aroma patch, or nothing at all, had increased amounts of cortisol around their exams. One suggestion raised by this study is that rose essential oil could prevent the release of stress hormones. Further, in another study, essential oil extracted from orange peel reduced the activity in the prefrontal cortex, a part of the brain involved in integrating information, planning and making decisions (Igarashi et al., 2014). After barely ninety seconds of inhaling these oils, participants felt more “comfortable”, “relaxed” and “natural”.

Have you ever noticed that in times of stress your skin becomes dry or you are plagued by eczema? Stress causes shrinking of the lipids that form the protective skin barrier, increasing transepidermal water loss (TEWL) – the escape of moisture from the skin. Some studies suggest that inhaling the “green odour” or rose essential oil can reduce this water leakage and prevent the stress-related drying of the skin (Fukada et al., 2007).

Aromatherapy is based on a holistic approach to the patient, considering both their physical and psychological needs (meaning that any effects of aromatherapy may be person-specific). Scientific studies have shown evidence both for and against the effectiveness of aromatherapy but, with many individuals reporting benefits, further research is certainly required.

This article is for informational purposes only. Always use essential oils as instructed by the manufacturer or a therapist.

Post by: Jadwiga Nazimek

Fukada, M., E. Kano, M. Miyoshi, R. Komaki, and T. Watanabe, 2012, Effect of “rose essential oil” inhalation on stress-induced skin-barrier disruption in rats and humans: Chem Senses, v. 37, p. 347-56.

Kadohisa, M., 2013, Effects of odor on emotion, with implications: Front Syst Neurosci, v. 7, p. 66.

Nakashima, T., M. Akamatsu, A. Hatanaka, and T. Kiyohara, 2004, Attenuation of stress-induced elevations in plasma ACTH level and body temperature in rats by green odor: Physiology & Behavior, v. 80, p. 481-488.

Sano, K., Y. Tsuda, H. Sugano, S. Aou, and A. Hatanaka, 2002, Concentration effects of green odor on event-related potential (P300) and pleasantness: Chemical Senses, v. 27, p. 225-230.

Takehara-Nishiuchi, K., 2014, Entorhinal cortex and consolidated memory: Neurosci Res, v. 84, p. 27-33.

Wilson-Mendenhall, C. D., L. F. Barrett, and L. W. Barsalou, 2013, Neural Evidence That Human Emotions Share Core Affective Properties: Psychological Science, v. 24, p. 947-956.

Science and Religion – Awkward Bedfellows Through The Ages

Science and religion haven’t always seen eye-to-eye over the centuries. The ways in which they impact upon one another have changed hugely, with each civilisation and religion having its own views and rules. Here, I’ll take a look at 3 major moments in history that showcase this ever-changing, and often tumultuous, relationship.

The Ancient Egyptians (~3150 BC to ~30 BC)

The Egyptians didn’t have what we would call ‘scientific understanding’. Rather than deducing earthly and natural meanings for the phenomena they observed, they attributed everything to their Gods. Yet they learned an incredible amount about the world in their bid to understand their Gods’ wishes and to use natural phenomena in the pursuit of worship.

One notable example of this was the way in which they used the stars, mapping the paths of certain celestial bodies across the sky with such accuracy that they were eventually able to predict their movements throughout the year. Much of our knowledge of the night sky has stemmed from Egyptian observations, and so their importance cannot be overstated.

Karnak Temple. Photo credit: Vyacheslav Argenberg via Flickr. Shared under Creative Commons License 2.0

A fantastic application of their knowledge can be seen in the Karnak Temple in Luxor, built for the Sun God, Amun Re. The Egyptian astronomers, or ‘cosmologists’, realised that the sun rises at different points along the horizon, depending on the time of year. So, when building the temple, the architects positioned the building so that, on the Winter Solstice, the sun rises directly between the 2 front pillars, filling the temple with light. By all accounts it is a phenomenal sight and one that I’d love to see some day.

However, whilst the Egyptian architects and thinkers were considered great minds, they were always considered inferior to the Gods they sought to worship. Religion dominated the culture, leaving little perceived need for Science.

The Ancient Greeks (~800 BC to ~150 BC)

Arguably, it wasn’t until the Ancient Greeks developed the first recognisable scientific methodology that things began to change. Amongst the Greeks were some of the greatest minds ever known, including Pythagoras, Archimedes and Aristotle. They began to study the reasons behind phenomena, not content to just accept them as the will of the Gods, gaining reputations for being geniuses in the process, even in their own time.

Hippocrates. Engraving by Peter Paul Rubens - Courtesy of the National Library of Medicine. Licensed under Public Domain via Wikimedia Commons

The Ancient Greeks’ religion overlapped somewhat with that of the Ancient Egyptians. Their often-similar Gods were also thought to influence most aspects of life. As such, there were some things that people just weren’t ready for science to explain. For example, Hippocrates – author of the Hippocratic Oath upon which Western medicine is founded – realised that disease wasn’t a divine punishment. It was, in fact, born of earthly causes.

Obviously, such revelations didn’t always go down well. Hippocrates, whilst advancing his society’s understanding of the world, had just diminished the role of the Gods in that world. Eventually, however, these ideas took hold and arguably improved Science’s standing in society. Religion remained an integral part of society, but Science had now proven its worth, and its role would only increase during the transition to the Enlightenment.

The 19th Century

By the 19th century, contradicting religious teachings was still proving massively controversial in Western cultures. In Christianity, for example, it was an accepted fact that God created the Earth, the Moon and the Stars, as well as all of Life.

Charles Darwin. Photo credit: Wikimedia Commons. Shared under Creative Commons License 3.0

Despite rumblings amongst some scientists that this wasn’t the case, scientific establishments had a close relationship with the Church of England, so such contradictory thinking never really took hold. A certain Mr Darwin, however, was so convinced of the importance of his work, ‘On the Origin of Species’, that he had it published on 24th November 1859, courting massive controversy. The Church, naturally, rejected the theory, whilst the scientific community was split.

The general public were caught in the middle of a fascinating stage in the relationship between Science and Religion. Should they trust the Church, which held such sway in their lives, or should they trust the ever-growing number of scientists, trusted and revered minds, who dared to disagree with the Church? For their part, scientists were now forced to dig deeper and drive scientific understanding even further in an effort to answer the questions to which the public demanded answers.

The product of all this is the world in which we live now, where Science is driving forward understanding at an ever-increasing pace. Religion remains important in many people’s lives but I would argue that, for many, Science has an even greater importance, as it seeks to offer tangible, evidence-based answers to the questions we have about the universe. The question now is how the relationship between Religion and Science will change in the future. It is a dynamic relationship, no doubt, with time and location playing massive roles in its development. Only time will tell how they will get along a century from now…

This post, by author Ian Wilson, was kindly donated by the Scouse Science Alliance and the original text can be found here.

Sartorial Science

Are you sick of the lazy stereotypes that surround scientists? That we are all old, white men in lab coats, with fuzzy hair and safety goggles, and that the only thing that we find fashionable are tank tops and boiler suits? Well I was, and so that is why my colleague Sophie Powell and I have created a new blog, to challenge these conventions.

Scientists come in all shapes and sizes…

I have always been extremely interested in fashion, and at one point I believe that I had the largest collection of bowties in the North West. As well as being a PhD student at the University of Manchester, Sophie is also a keen fashion blogger, posting regularly on her website, The Scientific Beauty. We were both sick of seeing articles such as this one from the Guardian portraying scientists as socially inept and modishly incompetent troglodytes, and so we decided to create Sartorial Science.

The idea behind this blog is that any scientist, from undergraduate to professor, can send us a photo of themselves in their resplendent best, and then answer some basic questions about their research and their fashion influences. It is supposed to be a bit of fun, but like similarly minded projects ‘This is What a Scientist Looks Like’ and ‘STARtorialist’, it aims to showcase to the wider public that scientists are real people, and that many of them have a variety of interests outside of science, including fashion and looking fabulous!

Many might think that sites such as this are a waste of time, and that scientists should only concern themselves with doing their research, publishing results, and applying for grants. However, it is extremely important to humanise the people behind the science, not least because it will help to inspire a future generation of scientists. If younger students think that being a scientist is all about working in a laboratory and conforming to stereotypes, then many of them might not decide to pursue science any further than compulsory education.

…and they can even wear science!

As well as showcasing the sartorial merits of our contributors, we also hope to gather enough data to be able to start investigating the relationship between scientists and fashion, in a more detailed study that would be suitable for publication. But in order for that to happen we need lots more posts, so come on scientists, show us your style!

Post by: Sam Illingworth

Fish and their sun-protective “superpowers”

With the summer holidays in full swing and the sun making (intermittent) appearances, it’s time to start lathering on the suntan cream! Despite the hassle and general “greasiness” of these products, suntan cream is essential to protect our skin from damaging ultraviolet (UV) A and UVB rays which can lead to sunburn, premature skin aging and cancer. But while we’re busy trying not to stick to our beach towels, it may be interesting to note that not all organisms share our sticky plight: many bacteria, algae and marine invertebrates are known to produce their own sun protection. Now, research suggests that even fish may share this useful ability.

The sun is vital for maintaining life on Earth. It provides us with essential light and heat, without which our planet would be a lifeless rock covered in ice. But sunlight comprises different forms of light, including UV radiation which is invisible to the naked eye. It is this UV radiation (specifically the UVA and UVB forms) that can be harmful to our health, causing damage to the skin’s DNA. In humans, this can result in detrimental DNA mutations, leading to various skin cancers such as basal cell carcinoma and squamous cell carcinoma.

But UV radiation is also harmful to other organisms, and many bacteria, algae and invertebrates inhabit marine environments that are exposed to high levels of sunlight (e.g. reefs, rock pools, etc.), meaning they need to protect themselves against this damaging UV radiation. While we humans need to lather on the suntan cream, these clever organisms produce their own sunscreens in the form of mycosporine-like amino acids and gadusols, which are able to absorb UV radiation and provide photoprotection. Such compounds are made by an enzyme known as DDGS, a member of the sugar phosphate cyclase “superfamily” of proteins, which are involved in synthesising natural therapeutic products (e.g. the antidiabetic drug acarbose).

While mycosporine-like amino acids and gadusols have been found in more complex marine animals, such as fish, it was originally thought that these compounds had been acquired through the animal’s diet. Recently, however, a group of scientists from Oregon State University in the United States have discovered that fish can produce gadusol themselves. Interestingly, this seems to be achieved through a different pathway to that used by bacteria.

Rather than DDGS, the group found that fish (in this case, zebrafish) possess a gene responsible for making an enzyme similar to another member of the sugar phosphate cyclase superfamily, EEVS. This EEVS-like gene is found grouped with a functionally unknown gene termed MT-Ox. In fact, the researchers were able to produce their own gadusol by adding both genes to a modified strain of E. coli and growing the cells in an environment rich in the necessary components for gadusol production. This suggests the EEVS-like and MT-Ox genes are involved in the production of this UV-protective compound in fish. Importantly, both the EEVS-like and MT-Ox genes are expressed during embryonic development, providing further evidence that fish are able to synthesise gadusol, rather than simply acquiring the compound through their diet.

Unfortunately for us, the EEVS-like and MT-Ox genes are not present in mammalian genomes, including our own, but they do appear in other animals including amphibians, reptiles and birds, suggesting that the production of UV-protective compounds may be even more widespread than once thought. And while this does not save us from the dreaded, yet essential exercise of putting on suntan cream, it certainly acts as a friendly reminder that we may not be as evolutionarily superior to these animals as we might think… which I suppose is a good thing.

Post by: Megan Barrett

Citizen science: the power of the crowd

Have you ever thought about being a scientist?  The growing movement of citizen science encourages public volunteers to contribute towards ‘the doing’ of scientific research, all without giving up the day job, undergoing extensive training or putting on a white coat. Topics and activities involved can vary widely, but typically ‘citizen scientists’ get involved in collecting, processing and/or interpreting data in some way, all under the direction of professional scientists or researchers.  For example, you could be asked to record observations about the natural environment, track human or animal behaviours, perform logging or mapping activities, or interpret images and patterns.

Galaxy Zoo (a study to investigate the properties and histories of galaxies) is regarded as one of the most well-known and successful citizen science projects. Launched in 2007, the project started life as a website that invited members of the public to sort images of galaxies into different categories (ellipticals, mergers and spirals). Stunned by the speed and scale of the initial response (over 70,000 classifications received within 24 hours), the project has continued to attract an army of willing volunteers who perform ever more skilled tasks. Furthermore, the project team has developed a growing number of online resources to engage schools and the general public in astronomy. Meanwhile, in Old Weather, volunteers delve into and transcribe the contents of historical logbooks from ships. Transcribing handwritten entries about weather readings taken at sea from thousands of logbooks into a digital (and analysable) format would clearly be a mammoth, and potentially impossible, task; yet, opening it up to the public has made it feasible.

So, citizen science can be a valuable way of both advancing scientific inquiry and engaging the public in science.  The general public can indulge their own interests whilst getting the feel-good factor of knowing they’ve contributed towards science. Basically, it’s a win-win situation.  Or is it?  Cynics may argue that some projects are little more than crowdsourcing, enabling scientists to achieve what would otherwise be too expensive, time consuming or intensive for researchers working alone.  Whilst it’s true that there may be pragmatic reasons for using citizen science approaches, projects can offer opportunities for genuine dialogue and collaboration between scientists and the public.  For example, members of the public with hay fever are now involved in designing #BritainBreathing, a citizen science project aimed at understanding more about seasonal allergies such as hay fever. So far, individuals have been involved in ‘paper prototyping’ workshops, to sketch out the design of a mobile phone app that will capture data about allergy-related symptoms such as sneezing, breathing and wheezing. Hence, the goal is to engage citizens in multiple roles whereby they can act as co-designers and collaborators, and not just as passive sensors.

More than just sensors: as part of #BritainBreathing, workshop attendees try out ‘paper prototyping’ to design a mobile phone app to capture data about hay fever symptoms.

Citizen science is not a new concept. Indeed, the first project has been traced back to 1833, when the astronomer Denison Olmsted invited the public to submit first-hand accounts of a spectacular meteor shower. Nonetheless, the term ‘citizen science’ only entered the Oxford English Dictionary in 2014, and it is enjoying a ‘boom period’ at the moment, with an explosion of projects emerging. Why the sudden popularity, you might ask? Two factors stand out. First, advances in information technology and the widespread availability of internet-enabled devices (e.g. smartphones and tablets) have enabled individuals to contribute data online in real-time, on the move, and with minimal effort. Second, there is growing recognition that the public have a stake in science and that research should be more democratic, shaped by the interests and needs of the people. Whilst the former is certainly useful in enabling projects, personally it is the latter that most excites me. Citizens have more opportunities (and power) than ever to shape research to generate the knowledge and solutions we want for our futures. So go on, indulge your inner scientist. Power to the people.

Guest post by: Lamiece Hassan

Lamiece is a health services researcher and public involvement specialist at The University of Manchester. With a background in psychology, her research has explored topics such as health promotion, mental health in prisons and psychotropic medicines. Her current work focuses on how we can use digital technologies and health data in trustworthy ways to empower patients and improve health.