‘Hangry’ humans – why an empty stomach can make us mean

There’s no point denying it: at one point or another, we’ve all been guilty of being ‘hangry’. Whether you’re a frequent culprit or just an occasional offender, getting angry when hungry is a common crime in many households, and one that can result in arguments, ‘fallings out’ and even a night spent sleeping on the couch. But is it really our fault, or is there a more biological explanation? An increasing body of research suggests our blood glucose may be the real culprit.

The glucose we obtain from our diet is a key source of energy, required for our bodies to function and delivered to all of our cells via our blood. Out of all the organs of the body, our brain is the most energy-consuming, using around 20% of the energy our bodies produce. It also relies almost completely on glucose as its energy source, making an efficient supply of this sugar essential to maintaining proper brain function. This is particularly true for higher-order brain processes such as self-control, which require relatively high levels of energy to carry out, even for the brain. Since self-control allows us to resist such impulsive urges as out-of-control eating or aggressive outbursts, if our brain does not have sufficient energy to perform this process, our ability to stem these unwanted impulses can suffer.

Low levels of glucose in our blood can also result in an increase in certain chemicals in our body, believed to be linked to aggression and stress. Cortisol, for instance, colloquially named the ‘stress hormone’, has been shown to increase in individuals when they restrict their caloric (and therefore glucose) intake. Neuropeptide Y concentrations have also been shown to be higher in individuals with conditions associated with impulsive aggression when compared to healthy volunteers.

Given such evidence, it seems plausible that low levels of blood glucose, like those experienced when we are hungry, could lead us to become more aggressive. The association between blood glucose and level of aggression has been observed in multiple studies, including Ralph Bolton’s 1970s research on the Quolla Indians. These Peruvian highlanders are well-known for their high rates of unpremeditated murder and seemingly irrational acts of violence. Having observed both this behaviour and a strong sugar craving among the Quolla Indians, Bolton decided to investigate the possible link between hunger and aggression. In agreement with his hypothesis, Bolton found that the Quolla Indians commonly experienced low blood glucose levels, and that those with the lowest levels tended to be the most aggressive.

In another, more recent study, similar findings were observed in college students who took part in a competitive task. Participants were randomly assigned to consume either a glucose beverage or a placebo drink containing a sugar substitute. They then competed against an opponent in a reaction time task, which has previously been shown to provide a measure of aggression. Before beginning the task, the students could set the intensity of noise their opponent would be blasted with if they lost. As predicted, participants who drank the glucose drink behaved less aggressively, choosing lower noise intensities, compared with those who had consumed the sugar substitute. This suggested that hunger-related aggression, or ‘hangriness’, could be ameliorated by boosting one’s glucose levels.

One notable (though some may argue rather dark) study into the ‘hangry’ condition investigated the relationship between blood glucose and aggressiveness in married couples. As well as pitting spouses against each other in a reaction time task similar to the one described above, the researchers gave each participant a voodoo doll of their partner and told them to stick pins in it each evening, depending on how angry they were with their partner. (Warning: do not try this at home.) As with previous studies, lower levels of blood glucose resulted in participants blasting their spouses with higher noise intensities and sticking more pins in the voodoo dolls, suggesting greater levels of anger and aggression.

While these studies do not necessarily establish causality, the relationship between low blood glucose and the tendency to become aggressive makes biological sense, since glucose is the main energy source our brains need to control such negative impulses. As observed in studies and experienced by many of us, ‘hangry’-related crimes can also be easily avoided by supplying the potential offender with food, further supporting the role of glucose in hunger-related anger. So next time ‘hangriness’ threatens to ruin the harmony in your household, fill your mouth with food rather than foul language, and save yourself a night banished to the couch.

Post by: Megan Freeman

Mothers-to-be may be sick of “morning sickness”, but does this symptom of pregnancy serve a purpose?

Pregnancy: a beautiful time in any woman’s life when she witnesses her child growing inside her, feels her baby kick for the first time, and spends a great deal of time vomiting into her toilet. Rather misleadingly termed “morning sickness”, nausea and vomiting during pregnancy (or NVP for short) is experienced by around 70% of expectant mothers during their first trimester and is rarely confined just to the first half of the day. NVP can begin as early as 5 weeks into a pregnancy, peaks between weeks 8 and 12, and generally continues up until about week 20. But what is the point of NVP, I hear you suffering mothers-to-be cry? Is there a reason for this less than appealing part of pregnancy, or is it just an unwanted side effect of this miraculous event?

Despite the difficulties and unpleasantness NVP can bring the mother, mild to moderate forms of NVP have been widely associated with favourable outcomes for her baby. Reductions in the risk of preterm delivery, low birth weight and miscarriage have all been shown to accompany NVP and suggest this condition may in fact possess an important function in pregnancy. It should be noted that the nausea and vomiting discussed here does not include the pathological condition hyperemesis gravidarum, which occurs in approximately 1% of mothers-to-be and can lead to serious complications if left untreated.

There are a number of theories which may help explain why NVP has evolved as a part of pregnancy. The first of these sees NVP as a method of “communication” to a woman’s partner, alerting them to the pregnancy and the need to modify their behaviour accordingly. This would lead to a reduction in their desire for sexual intercourse and instead prompt them to provide more protection and an increased food supply to the expectant mother.

While it may sound like an attractive idea to have our partners evolutionarily programmed to wait on us hand and foot during pregnancy, the “communication” theory seems unlikely. Firstly, the peak of NVP occurs later than the cessation of periods, an equally clear and less unpleasant signal of pregnancy. NVP would therefore be superfluous, meaning it would be eliminated through natural selection. Secondly, there is no evidence to suggest sexual intercourse is detrimental to pregnancy, and so there is no need to reduce its desirability.

An alternative hypothesis is that NVP is a side effect, or “by-product”, of the internal conflict which occurs between the expectant mother and her foetus. This is not an aggressive or violent form of conflict of course, but rather a competition for the mother’s limited resources. Pregnancy, childbirth and parenthood are all costly investments for a mother, and while taking more of her nutritional intake allows the foetus to maximise its fitness, this act also reduces the nutrition available to the mother and consequently lowers her fitness. Such a tussle for resources is bound to result in visible side effects, hence the presence of NVP.

Like the “communication” hypothesis, this “by-product” theory has a number of flaws. For example, if NVP were a sign of foetal fitness, its presence should denote a successful pregnancy. However, NVP does not occur in all viable pregnancies, nor does its presence always result in positive pregnancy outcomes. In addition, this theory suggests that NVP symptoms should occur later during pregnancy, when the foetus is larger and therefore requires more resources, which, as previously discussed, is not the case.

The final and most widely favoured theory for the function of NVP is the “mother and embryo protection” hypothesis. This states that NVP acts to reduce an expectant mother’s intake of agents which could harm her pregnancy (known as teratogens), including caffeine, alcohol and tobacco. It also removes any dietary toxins or food-borne teratogens ingested by the mother-to-be before they reach the baby and, as a consequence, teaches her to avoid these foods – hence the well-known “food aversions” experienced by many pregnant women. By the same method, NVP and food aversions protect the expectant mother from foods that may contain pathogenic microorganisms that could make her ill. This is particularly important during pregnancy, as a woman’s immune system is suppressed during this period to prevent her body rejecting the embryo, which it would otherwise treat as foreign tissue.

Many of the features of NVP and pregnancy support the “mother and embryo protection” theory. To begin with, NVP symptoms usually occur in the first trimester, at the same time that the expectant mother and embryo are most immunologically vulnerable and therefore need increased protection from toxins and teratogens found in food. In accordance with the “protection” hypothesis, food aversions also tend to be greatest during the first trimester and the types of food that pregnant women tend to find aversive are those most likely to contain pathogenic microorganisms or teratogens, such as meat, caffeinated drinks and alcohol.

Over the years, a number of theories have been put forward, aiming to provide a reason for the characteristic “morning sickness”, or NVP, experienced by the majority of expectant mothers in their first trimester. Whether the front-running “mother and embryo protection” hypothesis is true or if another explanation exists, NVP certainly does appear to possess a function, being widely associated with positive pregnancy outcomes. While this is unlikely to make the experience of NVP a pleasant one, hopefully such knowledge will provide at least some comfort to all the mothers-to-be out there currently well acquainted with their toilet bowls.

Post by: Megan Freeman @Meg_an12

Symbiosis – harmony or harm?

We have all experienced relationships which are beneficial and others that are not. The same can be seen throughout nature. Originally defined by German scientist Heinrich Anton de Bary, symbiosis describes a close association between two species, principally a host and a symbiont, which lives in or on the host. While some partnerships may be advantageous or neutral to one or both parties, others may have a more detrimental effect.

Mutualistic symbiosis:

The first of these symbioses involves relationships between two different species which benefit both organisms. Mutualistic symbiosis can involve organisms of all shapes and sizes, from stinging ants and bullhorn acacia trees, a relationship where the tree provides the ants with food and shelter in return for protection from herbivores, to the alliance between oxpeckers and zebras, in which the bird enjoys a readily available food source while the zebra has any parasites living on it removed.

One of the best-studied forms of mutualistic symbiosis is that of ruminants (e.g. cattle and sheep), as these organisms play an important role in our agriculture and nutrition. Ruminants host an extensive microbial population in the largest of their four stomachs, the rumen. A mutually beneficial relationship exists between these two organisms because the rumen microbes are able to digest the plant matter consumed by the ruminant. In doing so, they produce fatty acids, which can be used by both parties for energy. Carbon dioxide is also released in this process, providing the rumen microbes with the oxygen-free environment they need to survive (these microbes are predominantly anaerobic, so are poisoned by oxygen).

Parasitic symbiosis:

In contrast to mutualistic symbiosis, the interaction between two organisms may be less savoury in nature. Parasitic symbiosis describes a relationship between organisms where the symbiont benefits at the expense of its host. Unfortunately for the host, this generally causes it harm, whether this be in the form of disease, reduced reproductive success or even death. The symbiosis between birds, such as the cuckoo and the reed warbler, known as brood parasitism, is a characteristic example of a parasite-host relationship. Rather than building her own nest, the parasitic cuckoo will lay her eggs in a reed warbler’s nest, leaving the warbler to raise this egg along with her own offspring. Once hatched, the cuckoo chick then ejects the warbler’s young from the nest, allowing it to receive all the food that its “adopted” mother provides.

Unsurprisingly, this antagonistic relationship has led scientists to question why warblers raise these parasitic chicks if the practice is so harmful. It has been suggested that cuckoos engage in a kind of “evolutionary arms race” with their chosen hosts, based on the host’s ability to recognise a parasitic egg. In this ongoing contest, the evolution of a host species to become more adept at spotting and rejecting a parasitic egg may drive a subsequent evolution in the cuckoo to counter this change, whether by laying eggs that more closely resemble the host’s or by moving to a new host species. Such a process could continue indefinitely.

An even more detrimental relationship exists between the parasitoid wasp and its hosts, which include a range of insects from ants to bees. Similarly to cuckoos, these wasps rely on their host to facilitate the development of their young, but do so by either laying their eggs inside the host or gluing them to its body. Once hatched, the wasp larva will feed on the host, usually until it dies.

Commensal symbiosis:

Symbiosis does not necessarily have to be beneficial or detrimental to the host organism. Commensal symbiosis describes a relationship in which one organism benefits while the host is unaffected. This benefit may be in the form of shelter, transportation or nutrition. For example, throughout their lifecycles, small liparid fish will “hitch a ride” on stone crabs, gaining transportation and protection from predators while conserving energy. The crabs, meanwhile, appear to be neither benefitted nor harmed.

One case of commensalism which may come as a surprise involves Candida albicans, a species of yeast known to cause the fungal infection candidiasis in humans. Contrary to popular belief, C. albicans can be pathogenic or commensal depending on which phenotype it adopts. Under normal circumstances, C. albicans resides in our gastrointestinal tract in a commensal symbiotic relationship with us (i.e. causing us no harm). This interaction is actually the default existence for C. albicans. When changes occur in the body’s environment, however, a “switch” to the pathogenic phenotype can occur, putting the usual commensal relationship on temporary hold.

A plethora of symbiotic relationships exist throughout the natural world, from the tiny microbes inhabiting the ruminant gut to the large acacia trees housing ants. They can offer both organisms the harmony of a mutually beneficial association, as is the case with the oxpecker and the zebra, or be parasitic and work in the favour of one player while harming the other, as seen with the parasitoid wasp. In some instances, one organism can gain benefit without impacting the other either positively or negatively. As illustrated by C. albicans and cuckoos, a symbiotic interaction may change or evolve according to the environment or evolution of the host, respectively. Symbiosis is clearly a highly important aspect of nature which many organisms rely on for survival, and one that will continue to fascinate scientists and non-scientists alike both now and in the future.

Post by Megan Barrett.

Fat vs fat in the fight against obesity and diabetes

Using fat to treat obesity and obesity-related conditions, such as type II diabetes, may sound like a strange idea. Nevertheless, this is exactly what scientists are working to achieve. So the question is, why?

There are in fact two types of fat in the body. The more familiar of the two – white adipose tissue (WAT) – is the tissue we refer to colloquially as “fat” and associate with gaining weight. The second, and lesser-known, variety of fat is brown adipose tissue (BAT). This unique tissue is densely innervated by the body’s sympathetic nervous system (SNS – initiator of the “fight or flight” response) and is well supplied by blood vessels. Originally believed to exist only in small mammals and human infants, BAT is used to produce heat without the need for shivering. This is particularly important for these organisms as a way of maintaining their core temperature in mild cold conditions.

The key to BAT’s ability to produce heat is a special protein called uncoupling protein 1 (UCP1), which is housed within the large number of mitochondria found in this tissue. In general, mitochondria act as the “powerhouses” of all cells, producing energy in the form of ATP which allows the cell to perform all its required functions.

However, in cold conditions, sensors on the skin send a signal to the part of the brain responsible for regulating body temperature. The brain, in turn, sends a message to BAT via the SNS, releasing noradrenaline and stimulating UCP1. Upon activation, UCP1 is able to “override” the usual ATP-synthesising function of BAT mitochondria, instead releasing energy as heat.

Importantly, this process of heat production requires a significant amount of energy. As the main fuels used by BAT are fat molecules, such as lipids and fatty acids, the idea that BAT could potentially be used to help reduce body weight in obesity is not as ludicrous as it may first appear. This is supported by the finding that both UCP1-deficient and BAT-deficient mouse models display increased weight gain.

Despite evidence of the role BAT plays in controlling body weight in experimental rodent models, BAT was originally believed to be absent in adult humans. More recently, however, radioactive-tracer PET and CT scans of large cohorts of patients have demonstrated the presence of functional BAT in adults exposed to mild cold. Further analysis of PET and CT data has also uncovered an inverse association between BAT activity and BMI, with lower levels of activity in patients with severe obesity. Though the average human adult is estimated to possess just 50–80g of BAT, this mass is believed to use up to 20% of our daily energy intake, making BAT a desirable candidate to aid weight loss in obesity.

BAT has also been shown to express high levels of the glucose-transporter protein and displays comparable glucose uptake to muscle in response to insulin. Given that a key risk factor for type II diabetes is being overweight, and that this disorder is characterised by high blood glucose and insulin resistance, targeting BAT as a method of treatment is also under investigation. Notably, insulin-activated glucose uptake into BAT is significantly lower in those with obesity compared to subjects of a normal weight.

Given the mechanism of BAT activation discussed above, as well as the observation during PET and CT scans that BAT activity is inhibited when human subjects are warmed, the simplest way to stimulate this tissue for therapeutic reasons may be with the use of cold temperatures. In support of this theory, a small study involving healthy Japanese volunteers, who were repeatedly exposed to mild cold stimuli over 6 weeks, reported increased BAT activity and a loss of body fat in subjects. In another study, exposing healthy volunteers with active BAT to short-term mild cold led to increased insulin sensitivity.

A second potential method of stimulating BAT with the aim of treating obesity or type II diabetes is via the SNS, specifically through beta-adrenergic receptors (βARs). It should be noted that anti-obesity medications which target these receptors have been tested in the past. However, these attempts failed due to non-specific activation of βARs throughout the body – particularly the heart – resulting in serious side effects, including increased blood pressure and heart rate. Nevertheless, more specific βAR agonists have been used successfully to increase BAT activity and lessen obesity and insulin resistance in rodent models, without such side effects.

Alternatively, it may be possible to target the thermoreceptors on the skin responsible for sensing cold, known as TRPs. As cold exposure cannot be controlled easily, activating these receptors artificially may provide a more efficient way of using BAT to treat obesity or type II diabetes. Interestingly, there are already a number of foods which activate TRPs, including menthol in mint and capsaicin in chili peppers. Animal and human studies of capsinoids – non-pungent analogues of capsaicin – have demonstrated that these compounds increase active BAT and reduce body fat.

Despite sounding rather contradictory, adipose tissue may prove useful in the treatment of obesity and the obesity-related disorder, type II diabetes. Unlike its white counterpart, BAT actually uses up lipids and glucose from the body as fuel in order to produce heat. However, in doing so, evidence suggests this can also result in weight loss, reduced insulin resistance and lowered blood glucose, making it a potential treatment for obesity and type II diabetes. Possible methods of activating BAT for this purpose may include cold stimuli or agonists which target βAR or TRP receptors. It is still early days for this treatment possibility but the idea certainly isn’t as strange as it may first appear.

Post by: Megan Barrett

The functions of bioluminescence brought to light

When picturing the dead of night or the deepest depths of the ocean, we may be inclined to think of pure, impenetrable darkness. Yet nature has quite a different image in mind – one where darkness is pierced by flashes of light, warming glows and pulses of colour. These vibrant light displays are the result of a phenomenon known as bioluminescence – the production and emission of light from a living organism, which occurs when a molecule called luciferin combines with oxygen in the presence of the enzyme, luciferase. Apart from being undeniably beautiful to watch, there is increasing evidence to suggest bioluminescence has a number of important functions for an organism.

Bioluminescence can occur in a broad range of organisms, from bacteria to fish (with the exception of higher vertebrates, e.g. reptiles and mammals), and is found across a variety of land and, more commonly, marine environments. In fact, around 60–80% of deep-sea fish are thought to be bioluminescent. The pattern, frequency and wavelength (i.e. the colour) of light emitted can also differ by species and habitat. For instance, while violet or blue light is more common in deep water, bioluminescent organisms found on land tend to produce green or yellow light.

Bioluminescence lends itself to a number of functions – the first being for reproductive success. The most prominent examples of bioluminescence’s advantage in this area are in fireflies and glow-worms. In species of firefly where only the males are able to fly, the females attract their mate by emitting a constant glow which can be spotted by the males as they fly overhead. In other species, the male fireflies are also bioluminescent and produce a flashing light in response to the female’s glow. This results in a kind of “courtship conversation”. It has been suggested that the female’s preference may be determined by the frequency of male flashes, with higher flashing rates being more desirable. There are also a variety of fish in the deep ocean which appear to use bioluminescence to facilitate reproduction. Black dragonfish, for instance, are unusual in that they emit red, near-infrared light, rather than the blue light common to deep-sea organisms. This allows dragonfish to seek out a mate in the darkness without alerting prey to their presence.

In contrast to hiding from prey, as with the dragonfish, bioluminescence can also be used to attract prey. Deep in the Te Ana-au caves of New Zealand, fungus gnat larvae construct luminescent “fishing lines” to lure other insects. These insects become trapped on the sticky lines and make a tasty meal for the lurking larvae. Back in the ocean once more, there is evidence that a type of jellyfish, known as the “flowerhat” jellyfish, also uses bioluminescence to attract small fish (e.g. young rockfishes) on which it preys. These jellyfish have fluorescent patches on the tips of their tentacles. In experiments studying how tip visibility influences predation, it was found that significantly more rockfish were attracted to the jellyfish with visible fluorescence than when the fluorescence was indiscernible, highlighting the importance of this attribute in acquiring food.

Alternatively, bioluminescence may also be useful to protect an organism from predation. This can work in a variety of ways, from providing camouflage to acting as a warning signal to predators against the dangers of attacking. A good example of the latter of these functions can be seen in glow-worm larvae. Unlike adult glow-worms, whose glow aids courtship, glow-worm larvae emit light to warn predatory toads of their unpalatability. This has been demonstrated by researchers in Belgium who found that wild nocturnal toads were more reluctant to attack if dummies resembling the larvae were bioluminescent. In addition, bioluminescence can work to protect against predation by acting as a diversion technique. Some species of squid, for example, are able to release a luminescent secretion when under threat, confusing the predator so they can escape.

For now, the role of bioluminescence seems to be clearer in animals than in those organisms outside the animal kingdom, such as fungi or bacteria. There is, however, a recent study which has suggested a potential role for bioluminescence as a method of spreading spores for a certain variety of mushroom found in Brazilian coconut forests. The investigators in charge of this study were able to attract nocturnal insects using an artificial light which replicated the mushroom’s green bioluminescence. This did not happen when the light was switched off, suggesting light may be used to help this type of mushroom entice insects which can then disperse its spores.

Bioluminescence can provide its creator with a light in the darkness. It can help an organism to seek out, attract and successfully court a mate; lure unsuspecting prey to their doom; and warn off or divert the attention of predators when under attack. Yet while there are many instances where the function of bioluminescence is fairly clear, as discussed here, scientists remain very much “in the dark” in other cases. This is particularly true for those organisms, such as fungi and bacteria, which do not belong to the animal kingdom. Nevertheless, with continued research and new discoveries forever being made, it is only a matter of time before these elusive functions are brought to light.

Post by: Megan Freeman

“Accidents will happen” – when serendipity meets drug discovery

Few would argue that advancements in science require hard work, long hours and intellectual minds. But a few accidents along the way can certainly help as well. Serendipity, the accidental, but very fruitful, discovery of something other than what you were looking for, has played a significant role in drug discovery through the ages. Here are just three examples to whet your appetite…

Warfarin

Today, warfarin is the main blood thinner (anticoagulant) used in the treatment of blood clotting disorders, heart attacks and strokes. However, its story began on the prairies of the northern United States and Canada. During the financial hardship of the 1920s and the Great Depression, farmers in these areas were forced to feed damp or mouldy hay to their livestock. As a result, many of these seemingly healthy cattle started to die from internal bleeding (haemorrhaging) – an illness later given the name ‘sweet clover disease.’ Although investigators were able to attribute sweet clover disease to the consumption of mouldy hay, the underlying molecule responsible for causing the disease remained a mystery. Consequently, the only potential solutions offered to farmers at the time were to remove the mouldy hay or transfuse the bleeding animals with fresh blood.

It was not until 1939, in the lab of Karl Link, that the guilty molecule was isolated – an anticoagulant known as dicoumarol. This is a derivative of the natural molecule coumarin, responsible for the well-recognised smell of freshly cut hay. With the idea of creating a new rat poison designed to kill rodents through internal bleeding, Link’s lab investigated a number of variations of coumarin with funding from the Wisconsin Alumni Research Foundation (WARF). One of these derivatives was found to be particularly effective. Thus, in 1948, the rat poison warfarin was born, named after the foundation that funded its discovery, with the ‘-arin’ ending taken from coumarin.

Despite its clear anticoagulant properties, reservations remained about using warfarin clinically in humans. This was until 1951, when a man who had overdosed on warfarin in a suicide attempt was successfully given vitamin K to reverse the anticoagulant effects of the rat poison. With a way to reverse these effects now available, a version of warfarin for human use was developed under the name ‘Coumadin’. This was later used successfully to treat President Dwight Eisenhower when he suffered a heart attack, giving the drug widespread popularity in the treatment of blood clots, which has continued for the last 60 years.

Penicillin

No article on serendipity in drug discovery would be complete without at least mentioning the discovery of the antibiotic penicillin by Sir Alexander Fleming. It was 1928 when Fleming returned to his laboratory in the Department of Systematic Bacteriology at St. Mary’s in London after a vacation. The laboratory was in its usual untidy and disordered state. However, while clearing up the used Petri dishes piled above a tray of Lysol, he noticed something odd about some of the Petri dishes which had not come into contact with the cleaning agent. The Petri dishes contained separate colonies of staphylococci, a type of bacteria responsible for a number of infections such as boils, sepsis and pneumonia. There was also a colony of mould near the edge of the dish, around which there were no visible staphylococci, or at most only a few small, nearly transparent colonies. Following a series of further experiments, Fleming concluded that a substance produced by this mould, which he named ‘penicillin’, was not only able to prevent the growth of staphylococcal bacteria, but could also kill them.

However, while today the value of penicillin is well recognised, there was little interest in Fleming’s discovery in 1929. In fact, in a presentation of his findings at the Medical Research Club in February of that year, not a single question was asked. It was only when pathologist Howard Florey from Oxford happened across Fleming’s paper on penicillin that the true clinical significance of the drug was acknowledged. In 1939, along with Ernst B. Chain and Norman Heatley, Florey conducted a series of experiments in which they infected mice with various bacterial strains including Staphylococcus aureus. They then treated half the mice with penicillin while leaving the others as controls. While the controls died, the researchers found that penicillin was able to protect the mice against these types of infection. These exciting findings appeared in the Lancet in 1940 and eventually led to the commercial production of the drug for use in humans in the early 1940s in the United States. Since then, an estimated 200 million lives have been saved with the help of this drug.

Viagra

Despite today being widely known for its ability to help men with erectile dysfunction, Viagra was not originally designed for this purpose. The drug company Pfizer had initially hoped to develop a new treatment for angina, one that worked to reverse the constriction of blood vessels that occurs in this condition. However, clinical trials of Viagra, known then as compound UK92480, showed poor efficacy of the drug at treating the symptoms of angina. Nonetheless, it was highly effective in causing the male recipients in the trial to develop and maintain erections.

Upon further investigation, it was identified that, rather than acting to relax the blood vessels supplying the heart, Viagra caused the penile blood vessels to relax. This would lead to increased blood flow to this appendage, resulting in the man developing an erection. Following this unexpected discovery, Pfizer rebranded Viagra as an erectile dysfunction drug and launched it as such in 1998. Today, Viagra is one of the 20 most prescribed drugs in the United States and has, I’m sure, left Pfizer thanking their lucky stars they were involved in its serendipitous discovery.

The accidental discoveries of warfarin, penicillin and Viagra are just three instances of serendipity in drug discovery from a long list – one far longer than can fit in this article. Nevertheless, these cases provide excellent examples of how unplanned factors such as mysterious deaths, unexpected side effects during drug trials, and even just plain untidiness, have led to significant advances in the field of drug discovery. Such serendipitous events have given us a range of indispensable medications which we still rely on today, and I’m sure will continue to provide new and exciting breakthroughs in the future.

Post by: Megan Barrett

When sleep becomes a person’s worst nightmare!

At the end of a long, hard day, many of us relish the comfort of our beds. We snuggle under the covers and, with a satisfying sigh, welcome the sweet onset of sleep. But for some people, sleep is not such a pleasant experience. Here are three conditions likely to turn sleep into someone’s worst nightmare:

1) Sleep paralysis

Waking up and not being able to move or speak is a terrifying prospect. But for some people, this nightmare can actually be a reality. People who suffer from sleep paralysis may experience periods, either as they wake up or when they are falling asleep, when they feel conscious but are unable to move a muscle, sometimes for up to a few minutes. During this time, the individual may also experience a crushing sensation in their chest or disturbing hallucinations.

Despite the condition being described in various ways throughout history, the term “sleep paralysis” was first coined in 1928. It is believed to be caused by a disturbance in a person’s normal sleep pattern. Briefly, our sleep occurs in approximately 90-minute cycles consisting of two stages: the non-rapid eye movement (NREM) stage, which makes up about 75–80% of our sleep, and the REM stage. It is while we are in REM sleep that we experience our most vivid dreams. During this sleep stage, our brain also sends signals to our muscles inhibiting movement. People with sleep paralysis tend to wake during REM sleep, therefore finding they cannot move or speak as their muscles are still paralysed. As a consequence, this disorder is often associated with risk factors that affect one’s sleep (e.g. stress and narcolepsy) and treatment tends to focus on addressing the related conditions.

2) REM behaviour disorder

In contrast to sleep paralysis, REM behaviour disorder is characterised by a lack of muscle inhibition while a person is in the REM stage of sleep. Consequently, people with REM behaviour disorder tend to act out their dreams physically and verbally (e.g. kicking out, screaming, etc.). This can be both distressing and potentially dangerous to themselves and any poor souls sharing a bed with them. In fact, 35–65% of people with this condition report having caused injury to themselves or their bed partner. As one may expect, diagnosis of REM behaviour disorder often follows as a result of such injuries.

REM behaviour disorder usually occurs in people over 50 years old, and may be a risk factor for disorders associated with neurological decline (e.g. Parkinson’s disease). At present, treatments for the condition focus on symptom control using medication (e.g. clonazepam) and ensuring one’s sleep environment is safe.

3) Sleep apnea

Sleep apnea is a potentially serious, and highly distressing, condition in which someone repeatedly stops breathing while they sleep. This is often accompanied by heavy snoring and disrupted sleep, resulting in excessive daytime tiredness. There are two types of sleep apnea: obstructive sleep apnea which, as the name suggests, occurs when a person’s airway becomes blocked due to the muscles and soft tissue collapsing during sleep; and central sleep apnea, a rare form of the condition, where the brain fails to signal to the muscles telling them to breathe. If left untreated, both forms of sleep apnea can lead to serious medical conditions, such as high blood pressure (hypertension), low blood oxygen levels (hypoxemia) and stroke.

Diagnosis of sleep apnea is primarily based on measuring the number of times a person stops breathing per hour while they sleep (≥15, or ≥5 in combination with other symptoms, e.g. excessive daytime tiredness). Doctors will also look for the presence of risk factors, such as obesity and high blood pressure, as indicators of the condition. Sleep apnea is usually a lifelong condition but can be managed in a number of ways, from making lifestyle changes (e.g. losing weight or sleeping on one’s side) to using a therapy called continuous positive airway pressure (CPAP), a mask linked to a ventilator which applies mild air pressure to keep the airways open.

Despite the obvious differences between these three conditions, sleep paralysis, REM behaviour disorder and sleep apnea all have one thing in common – they make going to sleep distressing, and sometimes harmful, for those living with them. So don’t take for granted a good night’s rest; for people with these sleep disorders, sleep may be a nightmare waiting to happen.

Post by: Megan Barrett

Fish and their sun-protective “superpowers”

With the summer holidays in full swing and the sun making (intermittent) appearances, it’s time to start lathering on the suntan cream! Despite the hassle and general “greasiness” of these products, suntan cream is essential to protect our skin from damaging ultraviolet A (UVA) and ultraviolet B (UVB) rays, which can lead to sunburn, premature skin aging and cancer. But while we’re busy trying not to stick to our beach towels, it may be interesting to note that not all organisms share our sticky plight: many bacteria, algae and marine invertebrates are known to produce their own sun protection. Now, research suggests that even fish may share this useful ability.

The sun is vital for maintaining life on Earth. It provides us with essential light and heat, without which our planet would be a lifeless rock covered in ice. But sunlight comprises different forms of light, including UV radiation, which is invisible to the naked eye. It is this UV radiation (specifically the UVA and UVB forms) that can be harmful to our health, causing damage to the skin’s DNA. In humans, this can result in detrimental DNA mutations, leading to various skin cancers such as basal cell carcinoma and squamous cell carcinoma.

But UV radiation is also harmful to other organisms, and many bacteria, algae and invertebrates inhabit marine environments that are exposed to high levels of sunlight (e.g. reefs, rock pools, etc.), meaning they need to protect themselves against this damaging UV radiation. While we humans need to lather on the suntan cream, these clever organisms produce their own sunscreens in the form of mycosporine-like amino acids and gadusols, which are able to absorb UV radiation and provide photoprotection. Such compounds are made by an enzyme known as DDGS, a member of the sugar phosphate cyclase “superfamily” of proteins, which are involved in synthesising natural therapeutic products (e.g. the antidiabetic drug acarbose).

While mycosporine-like amino acids and gadusols have been found in more complex marine animals, such as fish, it was originally thought that these compounds had been acquired through the animal’s diet. Recently, however, a group of scientists from Oregon State University in the United States have discovered that fish can produce gadusol themselves. Interestingly, this seems to be achieved through a different pathway to that used by bacteria.

Rather than DDGS, the group found that fish (in this case, zebrafish) possess a gene responsible for making an enzyme similar to another member of the sugar phosphate cyclase superfamily, EEVS. This EEVS-like gene is found grouped with a functionally unknown gene termed MT-Ox. In fact, the researchers were able to produce their own gadusol by adding both genes to a modified strain of E. coli and growing the cells in an environment rich in the necessary components for gadusol production. This suggests the EEVS-like and MT-Ox genes are involved in the production of this UV-protective compound in fish. Importantly, both the EEVS-like and MT-Ox genes are expressed during embryonic development, providing further evidence that fish are able to synthesise gadusol, rather than simply acquiring the compound through their diet.

Unfortunately for us, the EEVS-like and MT-Ox genes are not present in the mammalian or human genome, but they do appear in other animals including amphibians, reptiles and birds, suggesting that the production of UV-protective compounds may be even more widespread than once thought. And while this does not save us from the dreaded yet essential exercise of putting on suntan cream, it certainly acts as a friendly reminder that we may not be as evolutionarily superior to these animals as we might think…which I suppose is a good thing.

Post by: Megan Barrett

Keep your friends close – It may be good for your health

We humans are social creatures. We love to meet up with our friends, family and partners, and rely on them for support through the good times and the bad. But it turns out we may also rely on our loved ones for our health. Our social ties may be helping us to keep sickness at bay and aiding a longer, happier life.

There is no shortage of studies that suggest a potential link between feelings of social isolation and declining health in humans. In a study of 2,101 adults aged 50 years and over, a US-based group of scientists found that over a 6-year period, feelings of loneliness predicted higher rates of depression, a reduction in self-reported health and an increased risk of mortality. In 2010, an analysis of 148 separate studies showed that among the 300,000 plus participants, those with stronger social ties had an increased likelihood of survival.

So what is behind this link? Loneliness is well documented as a risk factor for co-morbidities such as increased blood pressure, obesity, lowered immune response, disrupted sleep, depression and cognitive decline in the elderly. But, is this simply due to the fact that negative feelings of loneliness lead us to take less care of ourselves, resulting in worse health? Or is there something more “biological” going on?

Although the precise biological mechanisms behind the impact of loneliness on our health remain unclear, there is a growing body of evidence to suggest this feeling may affect a number of key systems in our bodies, including the hypothalamic-pituitary-adrenocortical (HPA) axis. The HPA axis is responsible for the release of important hormones called glucocorticoids – cortisol in humans and corticosterone in rodents. These hormones help regulate such things as our sleep, blood sugar, heart function and immune response. However, chronic high levels of glucocorticoids have also been linked with disease. Long-term increased levels of cortisol, for example, have been associated with high blood pressure, diabetes and an increased susceptibility to infection, as well as a number of other chronic diseases.

Interestingly, both urinary cortisol levels and HPA activation have been found to be increased in individuals who feel lonely, with higher levels of loneliness associated with greater cortisol increases. However, this effect only appears to be significant in individuals who are chronically lonely, suggesting the length of time one feels lonely for may play an important role in how this impacts upon our health.

Given the detrimental effect loneliness appears to have on our physical and mental well-being, one must wonder what the function of this feeling is. What is the benefit of making us feel bad?

From an evolutionary point of view, the aversive nature of loneliness is pretty logical. When we feel socially isolated or our social ties start to waver, we get the desire to reconnect with others. Back when we lived in tribes, maintaining social relationships allowed us to protect each other from predators and hunt more efficiently, thus ensuring the survival of our species. Similarly, our desire to find a mate allowed us to reproduce and ensure our genetic legacy. This is strengthened by an innate desire to care for our children, as without a parent’s nurture and love, children would die.

In this respect, it also makes sense that we, as a species, are not alone in our social nature. Studies in social mammals, such as rats and rhesus monkeys, have found that social isolation of such animals can lead to anxious or depressive behaviour, altered physiology (e.g. blood pressure, inflammation, immune responses, etc.) and mortality. Social isolation has been shown to promote obesity and lead to type 2 diabetes in mice and even in the fruit fly, Drosophila melanogaster, isolation has been shown to reduce lifespan.

So, it seems loneliness may not just be an unpleasant feeling we all experience from time to time. Evidence suggests feelings of social isolation – particularly if these are chronic – could put us at risk of high blood pressure, diabetes and other health-related co-morbidities, not to mention possibly sending us to an early grave! Despite the negative feeling of loneliness coming with an evolutionary function – i.e. promoting the survival of the species – it certainly seems to be a feeling one may want to avoid. So pick up a phone and call your friends, reach out to your family and organise a meet up. Most importantly, keep those all important social ties strong – it may be good for your health!

Post by: Megan Barrett

The unsung story of amusia

We’ve all seen those contestants on shows like ‘X Factor’ or ‘Britain’s Got Talent’ who are adamant they can sing, when the evidence unfortunately suggests otherwise. We ask ourselves how it is they can’t tell, or we leap to the conclusion that it must be a set-up. And, while I admit this may sometimes be the case, bear in mind there could also be a medical diagnosis to explain the situation. These individuals may have a condition known as amusia.

More colloquially called “tone deafness”, amusia affects approximately 4% of the population. This differs from the self-diagnosed 15–17% who believe they have the condition but are just poor singers – the difference being that poor singers are aware of their difficulty while true amusics are not. Amusics also tend to find music unpleasant to listen to, leading them to try to avoid situations in which they may be exposed to it – a rather difficult feat given the popularity and prevalence of music in modern society. Amusia can be congenital (i.e. the individual is born with the condition) or acquired (as the result of a brain injury or stroke). While amusia may seem less debilitating than other potentially socially isolating conditions such as dyslexia or dyspraxia, it can cause an individual a great deal of stress, lead to social stigma and may affect their ability to process and learn tonal languages (e.g. Mandarin or Thai).

The term amusia was coined back in 1888 by a doctor called August Knoblauch, following the first diagnosis of the condition 10 years earlier. Nowadays, amusia is diagnosed by a set of six tests, collectively known as The Montreal Battery of Evaluation of Amusia (MBEA), which examine an individual’s musical ability for contour, scale, pitch interval, rhythm, meter and musical memory.

As yet, there is no consensus on the neurological causes of amusia, but a key feature apparent in those with the condition seems to be a deficit in fine-grained pitch discrimination (i.e. an individual’s ability to process a small change in pitch, such as a tone or semitone). Based on a number of studies which imaged the brains of amusics and non-amusics, two areas of the brain known to be involved in processing music appear to be affected in amusia – the auditory cortex (AC; especially the right AC) and the inferior frontal gyrus (IFG). These studies found a difference in cortical thickness of the AC and IFG, as well as a reduction in brain activity in the IFG of amusic subjects compared to matched controls. Amusics also showed reduced connectivity from the AC to the IFG (via a group of fibres called the arcuate fasciculus), which correlated with the degree of tone deafness of the individual, offering further evidence for the involvement of these areas in the condition.

Whether someone with amusia can be “rehabilitated” or trained to improve their ability to process tone and to sing in tune is also up for debate. One small study in 2012, which provided five amusics with a 7-week course held by a professional singing teacher, reported that four of the five subjects showed improved MBEA scores at the end of the study. However, whether the improvements were significant enough to warrant the time and resources invested in the study has been questioned.

It’s no secret that the term tone deafness is overused. But the condition, amusia, is a long-standing medical diagnosis which can have a significant effect on an individual’s social and educational life. Despite ongoing debate, the areas of the brain involved in music processing (the AC and IFG) appear to differ both physically and in terms of activity in amusics compared with non-amusics. So next time the X Factor hosts a “not so musically gifted” individual, I for one will hold back my cynicism and consider a more medical reason before assuming it’s a set-up.

Post by: Megan Barrett