Aaahh!! Real Monsters!: How parasites and pathogens colonised fiction.

After the recent torrent of zombie everything and anything, it might feel like science fiction is all but done with weird parasites and diseases. But the mystery and power of organisms sometimes invisible to the human eye has inspired fiction for decades, including some of the most famous Sci-Fi monsters. I’d wager that we’re still a few undead away from the total eradication of fictional parasites.

Settle in, pull on a hazmat suit and a facemask, and we’ll delve elbow deep into the parasitic ooze of film, television and video games to take a good look at some of the best parasites and pathogens Sci-Fi has to offer.

Xenomorph or Alien – Alien franchise
Best get the big guns out right away. Alien is one of my all-time favourite films, centred around one of cinema’s most iconic and terrifying Sci-Fi monsters.

Xenomorphs steal resources from within their host’s body, so we can call them endoparasites. They’ve got a pretty complex life cycle: some life stages need a host, while others can live in the environment. This mixture of host dependency is seen quite often in real parasites, including human-infective worms such as the roundworm Ascaris and flatworms (flukes) like Schistosoma and Fasciola. Like the Xenomorph, these worms use their human host as a place to reproduce or develop, whilst the free-living stages search the environment for new hosts to infect.

Putting my well-practised, “parasite-life-cycle-specific” drawing skills to good use even years on from all of my undergraduate exams.

Real parasitic worms are fairly scary too, responsible for a huge burden of severe and chronic disease, especially among the world’s poorest populations. We can at least be grateful that their method of exiting the host, as eggs in the faeces, is a little less violent than the “chestbursting” exit of the Xenomorph.

Genophage – Mass Effect video game series
Some of our fear of pathogens is really a fear of our own misuse of them as bioweapons. The genophage is a phage-like virus in the Mass Effect universe, deployed by the Citadel, an intergalactic governing body, to control the population of the Krogan race.

Phages are small, simple viruses that infect bacteria. In doing so, they are able to insert genetic material, from themselves or from previous host cells, into that of their current host. The modus operandi of the genophage virus is not too dissimilar: it inserts a specific mutation into all the body cells of Krogans that prevents pregnancies from carrying to term.

Phages have the power to turn the normally fairly harmless Escherichia coli bacterium into a thoroughly horrible and occasionally fatal O157:H7 form. Scientists are now trying to harness this ability, but for much less nefarious purposes. It’s hoped that modified phages could provide a new mechanism for delivering vaccines or medical treatments against certain infections: seriously cool stuff.

Ceti Eel – Star Trek II: The Wrath of Khan

As a complete non-Trekkie, I only caught a glimpse of the wonderful world of Star Trek zoology (TRIBBLES. LOOK AT THEM) from my one-time viewing of 1982’s The Wrath of Khan.

TRIBBLES. Star Trek: The Original Series. Desilu Productions. Still taken from Wikimedia Commons.

From that one film I was introduced to Ceti Eels, fantastic parasites that set off my love for the gory and gruesome in a manner paralleled only by real parasites on the level of loiasis and chigoe fleas. After incubating in the body of its parent, the developed Ceti Eel enters a host through the ear, worming its way into the skull cavity and attaching to the cerebral cortex. As you can imagine, this is hugely painful.

The Ceti Eel then unveils its crowning weapon: mind control. Or to be more precise, the infected are left susceptible to suggestion – fantastic news for the enigmatic antagonist, Khan.

Mind control must surely be confined to Sci-Fi? Not so. Both the Ophiocordyceps fungus and the Dicrocoelium fluke can manipulate their host’s behaviour to suit their own ends. The juvenile stage of the fluke is released by snails as cysts in their slime, and ants eat said slime for its moisture. Once in the ant, one key worm makes its way to the ant’s central nerve structure and convinces it to climb to the top of a blade of grass and clamp down, waiting right on show to be accidentally eaten by a cow or sheep. The worm drives the ant to get itself eaten: the real mind-controlling worm is even better at its job than the fictional eel!

Why are there so many parasites in Sci-Fi (and why are they all so damn cool)? Art and culture are vital for exploring and communicating the world around us. This stands just as true for science fiction, and just as true for the gory and the weird that nature likes to throw at us. The strange and exciting parts of nature are what pique our interest and drive us to fascination and awe. So, while the current zombie tidal wave might just be past its peak, I reckon that as long as we have fantastic, powerful, utterly disgusting parasites from which to draw inspiration, we’re going to be telling stories about them for a long time to come.

This post, by author Beth Levick, was kindly donated by the Scouse Science Alliance and the original text can be found here.

References: fictional
http://en.wikipedia.org/wiki/Alien_%28creature_in_Alien_franchise%29
http://masseffect.wikia.com/wiki/Genophage
http://en.wikipedia.org/wiki/Khan_Noonien_Singh
http://en.memory-alpha.wikia.com/wiki/Ceti_eel

References: better than fictional
http://en.wikipedia.org/wiki/Helminths
http://en.wikipedia.org/wiki/Bacteriophage
http://news.nationalgeographic.com/news/2014/10/141031-zombies-parasites-animals-science-halloween/
http://en.wikipedia.org/wiki/Dicrocoelium_dendriticum
http://en.wikipedia.org/wiki/Ophiocordyceps_unilateralis

Science and Religion – Awkward Bedfellows Through The Ages

Science and religion haven’t always seen eye-to-eye over the centuries. The ways in which they impact upon one another have changed hugely, with each civilisation and religion having its own views and rules. Here, I’ll take a look at 3 major moments in history that showcase this ever-changing, and often tumultuous, relationship.

The Ancient Egyptians (~3150 BC to ~30 BC)

The Egyptians didn’t have what we would call ‘scientific understanding’. Rather than deducing earthly and natural meanings for the phenomena they observed, they attributed everything to their Gods. Yet they learned an incredible amount about the world in their bid to understand their Gods’ wishes and to use natural phenomena in the pursuit of worship.

One notable example of this was the way in which they used the stars, mapping the paths of certain celestial bodies across the sky with such accuracy that they were eventually able to predict their movements throughout the year. Much of our knowledge of the night sky has stemmed from Egyptian observations, and so their importance cannot be overstated.

Karnak Temple. Photo credit: Vyacheslav Argenberg via Flickr. Shared under Creative Commons License 2.0

A fantastic application of their knowledge can be seen in the Karnak Temple in Luxor, built for the Sun God, Amun-Re. The Egyptian astronomers, or ‘cosmologists’, realised that the sun rises at different points along the horizon depending on the time of year. So, when building the temple, the architects positioned it so that, on the Winter Solstice, the sun rises directly between the 2 front pillars, filling the temple with light. By all accounts it is a phenomenal sight and one that I’d love to see some day.

However, whilst the Egyptian architects and thinkers were considered great minds, they were always deemed inferior to the Gods they sought to worship. Religion dominated the culture, leaving little perceived need for Science.

The Ancient Greeks (~800 BC to ~150 BC)

Arguably, it wasn’t until the Ancient Greeks developed the first recognisable scientific methodology that things began to change. Amongst the Greeks were some of the greatest minds ever known, including Pythagoras, Archimedes and Aristotle. They began to study the reasons behind phenomena, not content to just accept them as the will of the Gods, gaining reputations for being geniuses in the process, even in their own time.

Hippocrates. Engraving by Peter Paul Rubens – Courtesy of the National Library of Medicine. Licensed under Public Domain via Wikimedia Commons

The Ancient Greeks’ religion overlapped somewhat with that of the Ancient Egyptians. Their often-similar Gods were also thought to influence most aspects of life. As such, there were some things that people just weren’t ready for science to explain. For example, Hippocrates – author of the Hippocratic Oath upon which Western medicine is founded – realised that disease wasn’t a divine punishment. It was, in fact, born of earthly causes.

Obviously, such revelations didn’t always go down well. Hippocrates, whilst advancing his society’s understanding of the world, had just diminished the role of the Gods in that world. Eventually, however, these ideas took hold and arguably improved Science’s standing in society. Religion remained an integral part of society, but Science had now proven its worth, and its role would only increase during the transition towards the Enlightenment.

The 19th Century

By the 19th century, contradicting religious teachings in Western cultures was still proving massively controversial. In Christianity, for example, it was an accepted fact that God created the Earth, the Moon and the Stars, as well as all of Life.

Charles Darwin. Photo credit: Wikimedia Commons. Shared under Creative Commons License 3.0

Despite rumblings amongst some scientists that this wasn’t the case, scientific establishments had a close relationship with the Church of England, so such contradictory thinking never really took hold. A certain Mr Darwin, however, was so convinced of the importance of his work, ‘On the Origin of Species’, that he had it published on 24th November 1859, courting massive controversy. The Church, naturally, rejected the theory, whilst the scientific community was split.

The general public were caught in the middle of a fascinating stage in the relationship between Science and Religion. Should they trust the Church, which held such sway in their lives, or should they trust the ever-growing number of scientists, trusted and revered minds, who dared to disagree with the Church? For their part, scientists were now forced to dig deeper and drive scientific understanding even further in an effort to answer the questions to which the public demanded answers.

The product of all this is the world in which we live now, where Science is driving forward understanding at an ever-increasing pace. Religion remains important in many people’s lives but I would argue that, for many, Science has an even greater importance in their lives as it seeks to offer tangible evidence-based answers to the questions we have about the universe. The question now is how the relationship between Religion and Science will change in the future. It is a dynamic relationship, no doubt, with time and location playing massive roles in its development. Only time will tell how they will get along a century from now…

This post, by author Ian Wilson, was kindly donated by the Scouse Science Alliance and the original text can be found here.

Diabetes & Periodontitis – A Deadly Combination

Many people are unaware that diabetes mellitus, whether type 1 or type 2, goes hand in hand with an increased susceptibility to oral health problems. Even diabetics themselves often know little about the risks of infection with bacteria such as Porphyromonas gingivalis, a primary cause of periodontitis, the correct term for gum disease.

Although P. gingivalis is not normally found in people with good dental health, other bacteria are far more common. Streptococcus mutans and Streptococcus gordonii are both found in the mouth and form biofilms on the tooth surface. Regular brushing and flossing removes these unwanted visitors, but if the accumulated bacteria remain undisturbed for a long period of time they can begin to destroy the gum tissue surrounding the teeth.

Interestingly, this is where P. gingivalis comes in. A shift in the normal ecological balance of the microenvironment allows the bacterium to act as a secondary invader of the gums, and more specifically the gingival sulcus, the point where the tissue contacts the tooth. P. gingivalis colonises via its ability to adhere to salivary molecules, matrix proteins in the gum and other bacteria present in the mouth. It is clearly an opportunistic pathogen.

Figure 1 – P. gingivalis: a dental nightmare

So, why do diabetics have an increased risk of developing periodontitis? Well, Advanced Glycation End Products (AGEs) arise from chronic hyperglycaemia and are therefore common in diabetes. It is these glycated proteins and lipids which have been shown to contribute to periodontal deterioration. Although the exact mechanisms behind the interaction of AGEs with the disease are unknown, there is a general consensus on a couple of important points:

  • An accumulation of AGEs affects the host immunological response. The products can disrupt an important nuclear transcription factor called NF-κB, one which is involved in many inflammatory responses. IL-6 and TNF-α are just two important pro-inflammatory cytokines which have been shown to be upregulated in the presence of AGEs.
  • AGEs not only upregulate the production of certain cytokines, they also affect the chemotactic properties of mononuclear and polymorphonuclear cells. This enhances the inflammatory response at a given site of infection, in this case the gums and surrounding tissue.

One final problem in diabetic patients is a drop in salivary pH. Xerostomia, or hyposalivation, is a main cause of the low pH, and maintaining the correct level of fluid in the body is perhaps the greatest problem for individuals with diabetes mellitus. The presence of AGEs and glycated haemoglobin, the latter being another result of high blood sugar levels, disrupts the balance of fluid and electrolytes in the bloodstream.

Diabetes is also a condition associated with polyuria (frequent urination), which occurs because the excess glucose in the blood changes the normal osmotic gradients within the body. Simple GCSE Biology states that water moves by osmosis from areas of low solute concentration to areas of high solute concentration. Excess glucose filtered into the kidney tubules therefore holds water there, effectively forcing the kidney to produce more urine. It’s a vicious cycle – high glucose levels mean more urine is produced, causing the person to become dehydrated, which leads on to hyposalivation, leaving an environment perfect for bacterial infection.
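To put rough numbers on that osmotic shift, here is a standard clinical approximation for blood osmolarity (a textbook formula, not something from the original post):

$$\text{osmolarity} \;\approx\; 2[\mathrm{Na^+}] + [\text{glucose}] + [\text{urea}] \qquad \text{(all concentrations in mmol/L)}$$

Letting blood glucose climb from a typical ~5 mmol/L to 20 mmol/L adds roughly 15 mOsm/L. Once glucose exceeds the kidney’s reabsorption capacity (at roughly 10 mmol/L in blood), it remains in the tubular fluid and holds water with it, driving the polyuria described above.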

Figure 2 – Periodontitis: a problem for all, but one that is more worrying for diabetics

The low pH and reduced salivary flow contribute to an increase in tooth decay, and as a consequence bacterial and fungal infections are more common in individuals with diabetes mellitus. This is because most oral bacteria and yeasts thrive in the acidic conditions of the mouth, which is why dental experts warn against a sugary, carbohydrate-rich diet – carbohydrates are the main food source for all mouth-dwelling species. This is an alarming problem for experts and scientists worldwide, with an estimated 1 in 3 individuals with either form of diabetes mellitus experiencing some degree of periodontitis during their lifetime. Of course the deterioration of dental health concerns everybody, but more attention must be paid to those at higher risk. Managing the condition as a whole will pay dividends, but are there any further precautions which should be taken to preserve diabetics’ oral wellbeing? This remains the most difficult question. Antimicrobial management and regular periodontal treatment are common in the general population, but both should be more prevalent in controlling diabetes-related infections.

This post, by author Jason Brown, was kindly donated by the Scouse Science Alliance and the original text can be found here.

References
Goyal, D. et al (2012) Salivary pH and Dental Caries in Diabetes Mellitus. International Journal of Oral & Maxillofacial Pathology. 3(4):13-16
Griffen, AL. et al (1998) Prevalence of Porphyromonas gingivalis and Periodontal Health Status. J Clin Microbiol. 36(11):3239-3242
Lamont, RJ. Jenkinson, H. (1998) Life Below the Gum Line: Pathogenic Mechanisms of Porphyromonas gingivalis. Microbiol. Mol. Biol. Rev. 62(4):1244-1263
Takeda, M. et al (2006) Relationship of Serum Advanced Glycation End Products with Deterioration of Periodontitis in Type 2 Diabetes Patients. J.Periodontol. 77(1): 15-20.

The dolphins that lend a helping flipper

Interactions between humans and animals can happen on many levels, but it is rare for the human to feel like the less intelligent half of the relationship. Yet, when humans and dolphins meet, this can often seem to be the case. Generally, humans consider themselves to be in control of, and superior to, other animals. However, our encounters with dolphins often demonstrate that they may operate on a level more similar to ours than we realise.

The Common Dolphin.
Photo Credit: NOAA NMFS via Wikimedia Commons

Some pods of wild dolphins have very interesting interactions with small fishing villages, especially in Brazil. Each morning the fishermen lay out their nets ready for the fish, and each morning the local dolphin pod arrives and proceeds to herd fish towards the fishermen. As if this behaviour weren’t unusual enough, the dolphins have even begun signalling to the fishermen, using a system of fin slaps against the surface of the water, to tell them when to throw their nets. This coordination ensures the fishermen have full nets after a very short time.

This behaviour wasn’t trained or instructed by mankind; it is completely natural. But what do the dolphins gain? This is where it becomes a little less clear. Some speculate that they benefit by having an easy time of catching the fish that are trying to escape, but this isn’t known for sure.

Painting of dolphins from the Bronze Age in Crete.
Photo Credit: H-stt via Wikipedia

All that is known is that this strange working relationship is a natural occurrence that will continue. The fishermen will teach their sons to watch for the dolphins’ signals, and the dolphins will teach their calves to herd the fish.

When humans enter the sea we are, in a sense, invading the dolphins’ home. Yet, even when we place ourselves outside of our natural habitat and get into difficulty, instead of ignoring us, or despising us for intruding on their world, dolphins are well known for lending a flipper. Stories of dolphins rescuing sailors can be traced back to Ancient Greek legend. This isn’t just a myth, though – more recent accounts of dolphins staying with lost divers can be found from all around the world.

Here is an instance where dolphins aren’t just interacting with humans freely, but are going out of their way to help us when we’re in distress. They have been witnessed attacking sharks that threaten people in the water. But, again, why? Why do dolphins choose to do this? They could be the first animal, besides humans, ever shown to display true altruism.

Photo Credit: Claudia14 via Pixabay. Image used under Creative Commons Deed CC0

The dolphins could drive fish into nets to gain an easy meal, but protecting humans has no clear benefit for them. Most other animals let the ecosystem run its course and do not interfere. In these two examples, however, the dolphins have chosen not just to intervene, but to intervene to help one species at the expense of another. They drive fish to their deaths so that we may catch them. They stop sharks from having an easy meal to save the lives of humans.

We speculate about the degree of intelligence dolphins possess, and it is well recognised that they are intelligent creatures; perhaps they are intelligent enough to understand us better than we think. Perhaps, as with humans, there are both good and bad dolphins. We hear of dolphins rescuing people only from those who were rescued; we don’t hear about the people who drowned because a pod of dolphins ignored them. Some speculate that they act from choice rather than instinct, which would mean they have a higher level of awareness than we first realised.

Unless we can decipher the dolphins’ communication techniques, something we have been trying to do since the 1960s, we may never know why these magnificent beings occasionally go out of their way to help us.

This post, by author Jennifer Rasal, was kindly donated by the Scouse Science Alliance and the original text can be found here.

Epigenetics – Putting the “epi” in epic

The latest craze in the world of science is to talk about epigenetics. You may have heard about it on TV or read about it in the newspapers, quite probably associated with some wonder cure or a way of shaking off those pounds without having to do anything.

Epigenetics is an extremely young area of interest in biology. It differs from good old-fashioned genetics in that it does not concern itself with the DNA sequence. Instead, it deals specifically with how chemical modifications made to the DNA and/or the proteins with which it associates (histones) can affect gene expression. It is an area of great interest because this regulation can have quite dramatic consequences, despite being relatively short-lived. These chemical modifications can be made and unmade very quickly, and thus kick in rapidly, yet can be triggered by simple changes in factors such as diet or exercise.

DNA is a long chain of individual molecules called nucleotides, which have three main parts: a deoxyribose sugar, a phosphate group and a base. There are four possible bases found in DNA (G, A, T or C), and certain sequences of these bases are used to encode proteins.

DNA molecules are enormous in length, as you might imagine given that they encode a human being. [Insert dubious statistic about length of DNA and distance to the Moon & back]. This presents a logistical challenge, because this information has to be readily accessible so that it can be read and copied to make proteins, yet it must also be stored away and protected within the small space of a cell’s nucleus.

The way in which nature has achieved this is by developing protein molecules around which the DNA can wind – the histones. Long chains of DNA wound around histone complexes coil and wind up even further, ultimately giving rise to the familiar ‘X’-shaped chromosomes that are seen during cell division (Figure 1).

Figure 1. Zooming out from DNA (1), to DNA wrapped around histones (2), through to an entire X-shaped chromosome formed from lots of DNA wrapped around lots of histones (5). Image source: Wikimedia Commons (Author: Magnus Manske). Image used under Creative Commons License 3.0
Figure 2. DNA wrapped around a histone. Image source: Wikipedia (Author: PDBot). Image used under Creative Commons License 3.0

The phosphate groups carried within the backbone of the DNA give it a strong negative charge. Figure 2 shows how the protein has many positively-charged ‘tails’ reaching out towards the coiled DNA. These opposite charges attract, keeping the DNA tightly wound and stable. When the time comes that some of this DNA needs to be accessed and read, enzymes attach modifications (e.g. acetyl groups, –COCH3) to the lysine residues of the protein’s ‘tails’, neutralising their positive charges and loosening their grip. These modifications are completely reversible and provide flexibility in regulating which genes can be activated at a given time.
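To make the charge argument concrete, here is a deliberately crude toy model in Python. All the numbers are invented purely for illustration; only the qualitative point, that neutralising the tails’ positive charges loosens their grip on the DNA, reflects the real biology.

```python
# Toy model: histone tails grip DNA electrostatically, and acetylation
# neutralises the tails' positive charges, loosening that grip.
# All values are arbitrary illustration, not real biophysics.

DNA_CHARGE = -10         # phosphate backbone: strongly negative (arbitrary units)
LYSINES_PER_TAIL = 5     # hypothetical number of charged residues on one tail

def tail_charge(acetylated: int) -> int:
    """Each unmodified lysine contributes +1; acetylation neutralises it."""
    return LYSINES_PER_TAIL - acetylated

def attraction(acetylated: int) -> int:
    """Crude proxy for electrostatic attraction: product of the opposite charges."""
    return -DNA_CHARGE * tail_charge(acetylated)

for n in range(LYSINES_PER_TAIL + 1):
    state = "tightly wound" if attraction(n) > 20 else "accessible"
    print(f"{n} acetyl marks -> attraction {attraction(n):2d} -> DNA {state}")
```

Running it shows the DNA flipping from ‘tightly wound’ to ‘accessible’ as marks accumulate; and because the marks are removable, the flip is reversible, which is exactly the regulatory flexibility described above.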

Methyl group modifications can also be attached to the bases within the DNA. This is yet another element of epigenetics and it works in a similar way to histone modification. These groups recruit proteins that block the DNA-reading machinery from accessing the DNA.

Why is this important?

This system adds a sophisticated level of control to gene expression and regulation. This is part of what allows us, as multicellular organisms, to exist. Breakdown of this control can lead to disease and has been shown to have an important role in cancer. Harnessing the power of the ‘epigenome’ is of intense medical interest for the development of new drugs and in the use of stem cells.

The importance of epigenetics is exemplified by the field of stem cell research. In the stem cells of the embryo, all genes are accessible and there is very little epigenetic control. This is important because these cells will go on to differentiate and form all of the many varieties of cells in the body. Such stem cells, with the ability to become different cell types, are said to be ‘pluripotent’. But as these stem cells differentiate and become more specialised towards a particular task, the level of epigenetic control tightens, effectively closing off whole portions of the genome that are irrelevant for a particular cell type.

In 2012, Sir John Gurdon and Shinya Yamanaka won the Nobel Prize in Physiology or Medicine for their “discovery that mature cells can be reprogrammed to become pluripotent”. They found that introducing just four proteins that help regulate gene expression (transcription factors) is sufficient to reprogram a mature cell into a pluripotent stem cell. These transcription factors wipe the epigenetic slate clean, erasing modifications at the epigenetic level to reopen the genome.

These cells can differentiate into any other cell type, meaning that it might, one day, be possible to generate replacement cells and tissues to treat a huge range of medical conditions, including heart disease and Alzheimer’s disease.

This post, by author James Torpey, was kindly donated by the Scouse Science Alliance and the original text can be found here.

Reference:
http://www.nobelprize.org/nobel_prizes/medicine/laureates/2012/

Are all owls really nocturnal? : And other common misconceptions about owls

We’ve all been brought up (particularly in the UK) with common myths about owls: they all say ‘twit twoo’, they can turn their heads all the way round, they’re all nocturnal and, of course, they are the wisest of all creatures. But how many of these are actually true?

If you ask any child, or even an adult, what sound an owl makes, they will answer ‘twit twoo’, and tell you how they sometimes hear it down a quiet, dark country lane at night. Firstly, the only owl that says ‘twit twoo’ is the tawny owl, Strix aluco, and it’s actually a breeding pair, the female emitting the ‘twit’ and the male the ‘twoo’. However, the tawny owl, one of five recognised and protected British owl species (along with the barn owl, the little owl, the long-eared owl and the short-eared owl), is the most common of all the owl species in this country, so the chances of you hearing the ‘twit twoo’ sound are much higher than of hearing the screech of the barn owl, for example.

How about the myth that they can turn their heads all the way around, or, as a little girl once told me, ‘they can turn their heads around and around and around…’? I’m afraid the truth is that if this were the case, the owl would choke or its head would fall off! However, owls can turn their heads up to 270 degrees each way (left or right) and 180 degrees upwards, and they do this by having twice as many vertebrae in their necks as mammals do (14 rather than our 7). But why do they do this? Having such large eyes means that their eye sockets are fixed in their skulls, so unlike us they can’t look left or right by just moving their eyes. Instead, owls have to move their whole head to focus their eyes on their prey.

Next, the common thought that all owls are nocturnal. Again, a myth I’m afraid. Although about 60% of all owl species are nocturnal, the rest are diurnal (active during the day) or crepuscular (active at dawn and dusk). Interestingly, you can get a good idea of when each species of owl is most active simply by looking at its eyes: black eyes suggest a nocturnal species, yellow eyes a diurnal species, and orange eyes a crepuscular species (see Figure 1). Pretty neat, right?

Figure 1: A) Snowy owl (Bubo scandiacus), with yellow eyes, is a diurnal species; B) European eagle owl (Bubo bubo), with orange eyes, is a crepuscular species; and C) tawny owl (Strix aluco), with black eyes, is a nocturnal species

Finally, the myth of the wise old owl has existed for thousands of years, ever since people worshipped the Greek goddess of wisdom, Athena, who had a pet owl (see Figure 2). Clearly, people assumed that if Athena was wise, owls must be too. This myth has continued into modern culture with characters such as ‘Owl’ in Winnie the Pooh and ‘Archimedes’ the owl in The Sword in the Stone (great film though :D). The truth is that owls are clever when it comes to hunting, but really that’s all they need to be. They don’t need to figure out the answers to a crossword puzzle like we might, or decipher the instructions to microwave a meal; they need to hunt, and they’re very good at it. The reason owls are not ‘wise’ is that underneath all the fluff and feathers they have a skull the size of a golf ball and, inside, a brain about the size of a 5p coin: one third is used for eyesight, one third for hearing and one third for general thinking. So, although most of what you thought you knew about owls may not actually be true, I’m sure you’ll agree they are still incredibly fascinating creatures.

Figure 2: The owl of Athena

This post, by author Alice Maher, was kindly donated by the Scouse Science Alliance and the original text can be found here.

What we can learn from sharks: Ancient antibodies

Now I’m sure that many will agree with the statement that, at least in the literal sense of inspiring awe, sharks are awesome creatures. They are among the apex predators of their environment, the microscopic structure of their skin helps them swim faster, and they can constantly replenish their teeth in conveyor-belt-like fashion!

As creatures go, the shark model is pretty old, about 420 million years old, and yet sharks still show amazing complexity. Of particular interest to this article is their highly evolved – if ancient – immune system.

Strand model of an antibody; the structural heavy chains are shown in green, whilst the light chains that help to form the antigen-binding sites are shown in red.

Just like ours, a shark’s immune system relies on a class of proteins known as antibodies to function correctly. Antibodies play a variety of roles in the body, most notably the recognition and immobilisation of invading pathogens. This occurs through the binding of molecules called antigens, typically exposed at the surface of pathogens or else secreted by them, to a special recognition site on the antibody.

Fighting pathogens is not, however, the only reason researchers are interested in antibodies. Their ability to recognise very small quantities of antigen with high specificity makes them useful in biosensing, or as treatments for disease that work by binding to specific proteins and blocking their function.

For these applications it is useful to have a large amount of such antibodies; in fact, for a lot of fields of biological research (not just immunology), antibodies are a critical reagent in many experiments.

Outline of antibody production via hybridoma cells

The typical method for producing these antibodies is first to induce their production in a model organism, such as a mouse, then extract cells from the mouse’s spleen, fuse them with a myeloma (an immortal cancer cell that divides rapidly), and then use this hybrid cell (known as a hybridoma) to produce many copies of the antibody. Following this, the antibodies must be carefully purified before they can be used.

This is a complicated process, fraught with difficulties: getting mice to make the right antibodies and producing the hybridoma is hard enough for a start. An alternative approach is to produce recombinant antibodies by inserting the DNA encoding them into model cells such as E. coli or yeast, which are much easier to handle. A big advantage here is that it is also possible to make human antibodies, critical for any clinical work. However, this synthetic approach is still tough, as many recombinant antibodies are not stable enough to survive purification.

This is where the sharks come in. Sharks live in a highly saline environment, the sea, and so, to avoid their bodies simply shrivelling up through osmosis, their bodily fluids contain high concentrations of osmolytes that prevent water moving out of their tissues. One of the major osmolytes in sharks is urea, typically at a concentration of several hundred millimolar.

Urea is a powerful denaturant and a go-to reagent for biochemists who want to study how stable a particular biomolecule is. Since shark blood contains high concentrations of urea, scientists suspected that their circulating proteins, including antibodies, would be more stable than their human counterparts.

This hypothesis proved to be true; however, it was not enough just to know that they were more stable, the researchers also needed to know how and why. The best way to understand the inner workings of a protein is to have an atomic-resolution structure, and to get that you require two things: protein crystals and X-rays.

Unfortunately, making protein crystals is not so straightforward (see my earlier post on the topic), and in this case it proved impossible. Instead, fragments of the complete shark antibody were crystallised and analysed by X-ray crystallography. The atomic images were then put together by the researchers like a jigsaw puzzle to gain an insight into the inner workings of the antibody.

What they found is that whilst the sequence of amino acids that makes up shark antibodies is very different from that of human antibodies, the overall shape and fold is remarkably similar. This is an example of convergent evolution, where two different species evolve the same solution to a problem independently.

Overlay of human antibody heavy-chain fragments (red, blue and green) with the same region from a shark antibody (grey)

There were, however, still some important differences. In particular, the shark antibodies contained several extra stabilising interactions at key points within their structure that helped make the molecule more robust to chemical stress.

Armed with this new knowledge, the researchers went one step further and modified recombinant human antibodies to include these extra interactions, to test the impact they would have on stability. Happily, the stabilisation was transferable: the modified antibodies could survive temperatures up to 10 °C higher than unmodified ones, as well as being almost twice as resistant to urea.

What was most pleasing is that the rate of secretion of the modified antibodies was vastly improved. The researchers suggested that this would likely be a boon to the antibody industry and could potentially have significant diagnostic and therapeutic applications. This all goes to show that scientific innovation can come from the most unlikely of places… oh, and of course it reinforces my point that sharks are super awesome.

This post, by author Marcus Gallagher-Jones, was kindly donated by the Scouse Science Alliance and the original text can be found here.

References

Feige, M.J. et al., 2014. The structural analysis of shark IgNAR antibodies reveals evolutionary principles of immunoglobulins. Proceedings of the National Academy of Sciences. Available at: http://www.pnas.org/content/early/2014/05/14/1321502111.abstract.

Why is Snake Venom so variable?

Based on a presentation by Dr. Wolfgang Wüster (Bangor University) – 12/03/13

Saw-scaled Viper (Echis carinatus) (Photo credit: Frupus)

I hate snakes. I’m just going to say it from the start; they scare the living daylights out of me. I’d have been living with one if my girlfriend hadn’t noticed the colour drain from my face when she mentioned buying one. And yet, for reasons I cannot explain, I went along to a seminar yesterday all about venomous snakes! I’m glad I did though – Dr. Wolfgang Wüster talked about them with great energy and enthusiasm, getting quite a few laughs along the way, and, most importantly, piquing the entire lecture theatre’s interest. I found the talk so engaging that I’ve decided to share what I learned here.

Snake venoms are mixtures of toxins, usually consisting of tens to hundreds of poisonous proteins. This allows for a great degree of variation in nature, as different venoms contain different combinations and quantities of toxins. As you’d expect, different species of snake have different toxins; however, the variation goes all the way down to differences between members of the same species. In fact, in some species, an adult’s venom can differ from the venom it produced as a juvenile. This wide range of venoms has an equally diverse range of effects on prey, resulting in paralysis, haemorrhages, and massive cell death and tissue damage, amongst other things. Upon explaining this in the talk, Dr. Wüster took great pleasure in showing some truly disgusting images – remember, I go through the pains of Science so you don’t have to!

Common symptoms of any kind of snake bite poisoning (Photo credit: Wikipedia)

The main question of the talk was what causes this variation in venom composition. It probably depends on what individual venoms are used for which, in the majority of cases, is overpowering and killing prey. Diet in snakes is an example of a ‘selective pressure’: something that affects the survival of a population, thus encouraging the evolution of that population to overcome the stress.

Diet, as a selective pressure, acts upon many characteristics of venom. For example, the volume of venom stored in a snake’s glands is usually only just enough to kill the prey it needs to survive. As such, snakes requiring a greater food intake, or those that kill larger prey, will produce more venom than those that consume less food. The overriding reason for this is that producing venom requires energy, so the minimal amount necessary is made and used.

Dr. Wüster’s group saw an interesting example of this in a model system of 4 species of Saw-scaled Vipers. Whilst most snakes eat vertebrates (animals with a backbone), these vipers also eat arthropods (invertebrates with exoskeletons and segmented bodies, such as scorpions and spiders). The 4 species differ greatly in the ratios of arthropods and vertebrates that they eat, yet all 4 take 2 to 3 bites to kill scorpions, taking their time to judge how much venom is necessary to subdue their prey. This may be evidence of economy of venom usage, meaning that these snakes have evolved to favour potent, rather than voluminous, venom to reduce the amount required.

Anatomy of a venomous snake’s head (Photo Credit: How Stuff Works)

Prey resistance also plays a role in determining the volume of venom a snake produces, as well as the potency of that venom. For example, moray eels that live in the same regions as sea snakes have evolved resistance to the snakes’ venom; as a direct consequence, the snakes have evolved to produce and release more of it to compensate.

In conclusion, Dr. Wüster presented compelling evidence that venom composition differs based on dietary requirements. Different combinations of toxins affect different prey, and different snakes need their venoms to have different harmful effects. The ‘arms race’ that develops from predator-prey relationships, whereby the prey evolves to resist the venom and the snake evolves to counteract this, also drives diversification. Finally, using venom economically seems to be a very important factor for these predators. Dr. Wüster explained that future work would take a detailed look at the genetics behind venom variation, studying the genes encoding toxins and the variation that exists therein. I, for one, look forward to hearing about their findings, even if it does mean spending more time looking at pictures and videos of snakes…

This post, by author Ian Wilson, was kindly donated by the Scouse Science Alliance and the original text can be found here.

Bovine tuberculosis in the UK- the bigger picture

Bovine tuberculosis (bTB) is a disease that gathers a lot of attention in the popular press, especially around the highly controversial badger-culling trials in England and Wales. But what is this disease, and why is it so important in this country?

The disease 

Bovine TB is caused by the bacterium Mycobacterium bovis (M. bovis), a very close relative of the principal cause of human TB, M. tuberculosis. It predominantly infects cattle but, perhaps not surprisingly, can also be transmitted to, and cause disease in, people, making it a zoonosis (a disease which can be transmitted readily between animals and humans). In both cases the resulting disease is very serious; symptoms include a persistent cough, weight loss and fever of increasing severity over weeks and months, ultimately leading to death if no intervention is applied.

A high magnification image of mycobacteria captured using an electron microscope (CDC)

Historically, human infection with bTB has been a very serious problem in the UK; records from the 1930s suggest it accounted for at least 16,000 deaths in that decade alone! The major route of transmission at the time was through people drinking the milk of infected animals. Widespread implementation of pasteurisation (the process of heat-treating milk to kill disease-causing bugs) from the late 1930s onwards, combined with the routine testing of cattle for the disease and meat hygiene inspection, virtually eliminated M. bovis as a causative agent of human TB in the UK. However, in many other parts of the world, particularly developing countries (where such measures are sometimes harder to implement), human infection with bTB remains a very serious issue, with millions of people still at risk across the globe. Figures published by the WHO on human TB demonstrate the scale: it is estimated that in the 1990s, TB accounted for 30 million deaths worldwide, and M. bovis is believed to have been the causative agent in between 10 and 50% of cases in developing countries, the areas where TB is a particular problem1.

bTB in cattle: an ongoing saga 

M. bovis is a very slow-growing bug, and the disease takes several months to become physically obvious. Unfortunately, during this time infected animals may remain within their herds, potentially spreading the disease to other individuals. The slow-growing nature of the bacterium also makes it incredibly difficult to kill. As a consequence, in the UK and many other countries where bTB is a problem, the treatment of infected animals is not considered an option. Instead a “test and cull” policy is employed, whereby animals are tested for the disease and immediately slaughtered if found to be positive.

The test used in cattle is very similar to that used to detect TB in people (some of you may remember having this procedure as a child). It works by injecting a small amount of material taken from M. bovis bacteria (called the “antigen”) into the skin of the cow. Animals that are actively fighting infection react strongly to this material, resulting in a large swelling in the area; this inflammation indicates that they are bTB-positive. At present, all cattle in England and Wales are tested at regular intervals, typically every 1-4 years depending upon the current level of disease in the area. Farms where positive animals are found are placed under restrictions, whereby no animals may be moved off the premises until the farm’s ‘TB-free’ status is restored, in an attempt to stop the disease being spread through the movement of infected cattle to other farms.

Courtesy of Rachael Evans

Testing was originally introduced in the mid-1930s on a voluntary basis, but became compulsory across the entire country from the 1950s. There was initially a rapid reduction in disease prevalence, and throughout the 1960s and 70s bTB was all but eradicated. However, from the mid-1980s onwards, and for reasons which are still not fully understood, the disease underwent a resurgence, and the situation today in parts of the UK is as bad as it has ever been; between 2000 and 2010, over 250,000 cattle were slaughtered as a result of these strict measures, at an estimated cost to the UK taxpayer of £500 million. Currently Scotland is the only part of the country considered to be free of the disease, and while the most recent government figures suggest disease prevalence is slowly decreasing, whether this trend will continue, as hoped, remains to be seen2.
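For a rough sense of scale, dividing those two figures (a back-of-the-envelope calculation; the total covers compensation, testing and administration combined):

$$\frac{£500{,}000{,}000}{250{,}000\ \text{cattle}} \;\approx\; £2{,}000 \text{ per animal slaughtered}$$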


Completing the picture

One reason that bTB has remained an issue after so many decades is that a great many questions about its resurgence and transmission remain unanswered. As with many diseases, more discoveries simply raise more questions, none more contentious than the issue of badgers.

It is known that badgers can act as a “reservoir host”, meaning that they are able to carry M. bovis and transmit it back to cattle. What is less clear is how significant a role badgers play in spreading the disease, and whether or not controlling badger populations would help in controlling bTB in cattle.

In the autumn of 2013, badger-culling trials commenced in 2 areas of England, in an experimental effort to quantify whether reducing badger numbers reduces the disease in cattle. However, these trials were abandoned early in 2014, since it was not possible to achieve the target of culling 70% of the badger population, the minimum figure believed necessary to reduce overall disease prevalence. This is a hugely disappointing outcome, as it means that badgers have been culled for no beneficial reason, and the trial has therefore provided no new information as to whether culling badgers helps control disease in cattle, a situation neither pro- nor anti-culling campaigners can possibly be happy with.

In addition to the link with badgers, there are many other aspects of bTB biology that are not fully understood. One notable example is the recent and rather startling news that domestic cats have infected their owners with M. bovis. This not only underlines the public health issue surrounding bTB, but also demonstrates the variety of species capable of carrying the infection, potentially further muddying the waters as regards reservoir host species and disease control3. A second example is recent research from the Department of Infection Biology at the University of Liverpool, which has shown a new experimental link between bTB and liver fluke, a common parasitic worm that infects livestock across the UK. In this study, researchers found that liver fluke are capable of ‘modulating’ a cow’s immune system, meaning they can in effect ‘turn down’ the immune response in an effort to avoid being killed by it. This appears to have knock-on consequences for the cow, since it also reduces the animal’s ability to fight off other infections like bTB. Additionally, this dampening of immune responses has been shown to reduce the sensitivity of the bTB test, meaning that in areas where there is a lot of liver fluke, bTB may go undiagnosed for long periods. This could help explain in part why certain areas of the country remain in the clutches of bTB despite our best efforts4.

Clearly then, there is much more to bTB than the press would have you believe. The next time you see or read an article in the news about the rights and wrongs of badger culling, remember this is only one small part of a much bigger picture. For example, would the public feel differently if the government were pursuing the control of feral cat populations in an attempt to reduce the risk of TB infection in people?

This post, by author John-Graham Brown, was kindly donated by the Scouse Science Alliance and the original text can be found here.

Citations:

1. Statistics worldwide: http://wwwnc.cdc.gov/eid/article/4/1/98-0108_article.htm
2. Statistics for bTB in the UK: http://www.defra.gov.uk/animal-diseases/a-z/bovine-tb/
3. M. bovis in cats: http://171.67.117.160/content/174/13/326.2.full
4. Liverpool liver fluke research and bTB: http://www.nature.com/news/bovine-tb-disguised-by-liver-fluke-1.10685

Helen Beatrix Potter: Author, Illustrator and Scientist

Beatrix Potter with her spaniel Spot (Photo: Wikimedia Commons, copyright expired)

You may be forgiven for thinking of Beatrix Potter simply as the talented author and illustrator of a large number of children’s books, including The Tale of Peter Rabbit, but she was much more than that. Beatrix Potter was a leading mycologist (someone who studies fungi) and conservationist, and it was these interests that led her to write her best-selling books. She continues to enlighten people today: drawings of a parasitic fungus (Tremella simplex) recently discovered in Aberdeen were found to have been made by Beatrix Potter in the late 1890s. So what drew the young Beatrix to nature and its study?

Beatrix was interested in nature from a very young age and was very meticulous in recording observable detail, often drawing or painting what she saw. Although these paintings were not systematic, as Beatrix drew whatever interested her, they led her close friend John Everett Millais to acknowledge her keen eye: “plenty of people can draw, but you…have observation.” From as young as nine years of age, Beatrix was drawing watercolours of caterpillars with anatomical and field observations. Her love of nature was further enhanced by opportunities during her childhood. Beatrix was born into a wealthy family and so enjoyed summer holidays near the River Tay in Scotland, which enabled her to draw a wide range of flora and fauna in the local area. Additionally, she was able to learn photographic techniques, including detail and perspective, from her father Rupert, an amateur photographer, further enhancing her talent in painting. Subsequent trips to the Lake District also influenced a lot of Beatrix’s painting at a young age. On these trips she also exhibited a keen interest in geography and archaeology, noting in her journals observations on the formation of land and soil erosion, and painting fossils.

Beatrix was educated privately at home by governesses; her talent for drawing was recognised early, and further tuition in painting was provided. However, Beatrix detested this tuition, as she did not wish to copy other painters but to experiment with her own style, later settling on watercolours. Beatrix cared for a lot of pets at home, and these provided a great source of inspiration for many of her drawings. With her younger brother Walter Bertram, she would also draw a menagerie of animals secretly kept in the nursery, including mice, rabbits, bats, snails, insects and egg collections. Additionally, when pets died, the Potter children would boil the corpses and study the bones to learn more about the anatomy of the animals they drew.

Oyster mushroom mycelium growing on a bed of coffee granules
(Photo: Tobi Kellner, Wikimedia Commons, License: CC BY-SA 3.0)

At first, Beatrix studied the subjects of her drawings with a hand lens, then a camera, and later with her younger brother’s microscope, and this is how she became fascinated with fungi. Her interest began with their colour and structure; later, in her 30s, she became interested in the role of spores in the reproduction of different fungi, a topic highly debated within British mycologist circles at the time. On a holiday to Scotland in 1892, Beatrix formed an alliance with the noted naturalist Charles McIntosh, exchanging her accurate drawings of rare specimens for his knowledge of taxonomy, guidance on the microscopic drawing of fungi, and live specimens during winter. By 1895, Beatrix had collected and drawn the spores and spore-producing structures (basidia) of the mushroom Boletus granulatus, now called Suillus granulatus. She had also successfully germinated the spores of a number of species and produced drawings of the mycelium.

With these results, interesting for the time, Beatrix approached the Royal Botanic Gardens at Kew, only to be dismissed by the then director, William Thiselton-Dyer. However, her uncle, the chemist Henry Enfield Roscoe, encouraged Beatrix to continue her research into fungal spore reproduction, which she later offered to the Linnean Society in London, though at the time the Society did not admit women or allow them to attend meetings. The paper Beatrix submitted was titled ‘On the germination of the spores of Agaricineae’ and contained many of her microscope drawings. This paper has since been lost, but it seems Beatrix was heavily interested in the idea of hybridisation.

Around this time, the principal of London’s Morley Memorial College for Men and Women, Caroline Martineau, commissioned Beatrix to produce lithographs for use in lectures; two survive today, one of a sheetweb spider and the other of insects. After a lifetime of drawing, Beatrix donated her botanical and mycological drawings to the Armitt Museum and Library in Ambleside, in the Lake District. These are still used today by both amateur and professional mycologists, and 59 of her drawings were reproduced in a book on fungi.

Peter Rabbit and Benjamin Bunny from The Tale of Benjamin Bunny
(Photo: Wikimedia Commons, photo in the public domain)

However, these feats are not the limits of Beatrix’s love of nature. During her life, Beatrix also became fascinated with the countryside, not at all in keeping with her parents’ plans for her, and became a wealthy landowner in the North of England, running both her own farms and those she shared with the National Trust. It was through this work that Beatrix became interested in conservation, particularly in breeding native Herdwick sheep and promoting the preservation of the land in the Lake District. On her death, Beatrix Potter left her land to the National Trust, and today over 1,700 hectares are still enjoyed by thousands of visitors each year.

Therefore, through her work as both a mycologist and a conservationist, it is important that we think of Beatrix Potter as more than an author. For it was Beatrix Potter who, in the face of societies that did not acknowledge women and rejected her papers, helped lay foundations for modern mycology. In Beatrix’s own words, ‘with opportunity the world is very interesting.’

This post, by author Rebecca Jones, was kindly donated by the Scouse Science Alliance and the original text can be found here.