Acne bacteria to blame for back pain?

What do acne and chronic back pain have in common? Well, as it turns out, more than people once thought. A group at the University of Southern Denmark has found that the same bacteria that give people spots might be to blame for up to 40% of cases of lower back pain. What’s more, these infections can be treated with antibiotics.

A slipped disc popping out from between the evenly grey vertebrae

Your backbone is a column of alternating vertebrae (bones) and intervertebral discs (cushions). The bones provide the strength and support, while the cushion discs allow movement and flexibility. Occasionally, thanks to a mix of age and awkward movement, the disc can bulge out from between the bones. In some cases the jelly-like goo in the disc’s centre, called the nucleus, can even ooze out – a bit like thick jam leaking out of a doughnut. If the nuclear material or the disc itself puts pressure on nerves coming in and out of the spine, it can be even more painful.

Slipping a disc is, by all accounts, excruciating, but it usually starts to heal within 6-8 weeks. However, someone can be diagnosed with chronic back pain (CBP) when the pain doesn’t subside after three months. Trouble is, this happens all too often, with an estimated 4 million people in the UK suffering from CBP at some point in their lives. The cost of CBP to the NHS is about £1 billion per annum, and that doesn’t even cover lost working hours or loss of livelihood. Treatment usually focuses on relieving pain, preventing inflammation and, more recently, cognitive behavioural therapy to address the psychological side of the pain, especially if the organic, physical cause is no longer obvious.

Recently, scientists in Denmark found an important link between the bacteria responsible for acne, known as Propionibacterium acnes (P. acnes), and bad backs. The researchers found that in about half of their patients with slipped discs, the disc itself was infected, usually with P. acnes. A year later, 80% of the infected patients – compared to 43% of the uninfected patients – had dodgier bones either side of the slipped disc than 12 months before. The affected bones had developed tiny fractures and the bone marrow had been replaced with serum, the liquid found in blisters.

Acne is not to blame for bad teenage hairstyle choices.

So how did the discs get infected? Bacteria like P. acnes get into our bloodstream all the time, particularly when we brush our teeth or squeeze spots. P. acnes and other similar bacteria don’t like oxygen-rich environments and so don’t normally grow inside us. The spinal disc doesn’t have a lot of oxygen around, providing a perfect home for the bacteria. If the disc is damaged – say, after popping out from the spinal column – tiny blood vessels sprout into it, letting the bacteria move in and settle down.

There, the bacteria grow and, rather than spread anywhere else, they spit out inflammatory chemicals and acid. The acid corrodes the bone next to the disc and causes more swelling and pain around the area. This discovery is ground-breaking, since before this research it was thought that discs couldn’t get infected except in a few exceptional cases.

The Danish researchers then conducted a second study, testing whether simple antibiotics could get rid of these bacteria and therefore treat chronic lower back pain. Patients who already had the characteristic signs of bone inflammation (tiny fractures and swelling) were given a 100-day course of antibiotics. The patients were reassessed a year after the trial began. Patients treated with antibiotics reported less pain, less ‘bothersomeness’ (yes!), took fewer days off work, made fewer visits to the doctor and, crucially, their bones looked in much better nick than those of the patients given a placebo.

Considering the huge numbers of people who are affected by chronic back pain, and the cost of treatments like surgery versus a course of antibiotics, this discovery has been hailed as the stuff of Nobel Prizes. The revelation that bacteria may be to blame for some cases of this mysteriously untreatable condition rings familiar. It has been likened to the discovery of the culprit bacteria behind stomach ulcers, Helicobacter pylori. Like back pain today, stomach ulcers were dismissed for years as a disease of the mind, endemic among stressed-out melodramatics or people who ate too much spicy food. (And yes, Barry Marshall did get a Nobel Prize for swallowing a Petri-dishful of H. pylori.) It would be fantastic if, instead of resorting to surgery, half a million CBP patients could be effectively cured within 100 days or less!

The bacteria in the plate on the right have become resistant to many of the antibiotics (the white spots) and so have grown more widespread.
Photo by Dr. Graham Beards

Unfortunately, there is a downside. Antibiotics have long been the magical cure-all, but just like fossil fuels, housing and talent on TV, we’re running out. Bacteria are becoming resistant to antibiotics faster than we can create new, effective ones. It’s an arms race and we’re losing, very quickly. What’s worse is that, because of the recent negativity surrounding over-prescription, there are now restrictions on giving patients broad-spectrum antibiotics. Since antibiotics can’t be used as liberally as they were 30 years ago, pharmaceutical companies struggle to make a profit from developing new ones. And so, further compounding the problem of antibiotic resistance, fewer and fewer new antibiotics are being created every year.

In 2000 alone, UK doctors wrote 2.6 million antibiotic prescriptions for acne. One study by a group in Leeds looked at the number of acne patients who carried strains of P. acnes resistant to at least one type of anti-acne antibiotic. Between 1991 and 2000, the fraction of acne patients with antibiotic-resistant bacteria rose from about a third to more than a half.

The discovery that acne bacteria might be to blame for so many cases of debilitating back pain is hugely important. However, it also highlights how dependent we are on our dangerously dwindling supply of effective antibiotics, and how we might be wasting antibiotic effectiveness on comparatively trivial conditions such as spots.

Post by: Natasha Bray

Your Brain on Lies, Damned Lies and ‘Truth Serums’

Pork pies, fibs, whoppers, untruths, tall stories, fabrications, damned lies… not to mention statistics.

Apparently, every person lies an average of 1.65 times every day. However, since that average is self-reported, maybe take the figure with a pinch of salt. The truth is, most people are great at lying. The ability to conjure up a plausible alternative reality is, when you think about it, seriously impressive, but it takes practice. From about the age of 3, young children are able to make up false information at a stunning rate of one lie every 2 hours – though admittedly the lies from a toddler’s wild imagination are relatively easy to identify.

When we lie, brain cells in the prefrontal cortex – the planning ‘executive’ of the brain – work harder than when we tell the truth. This may be reflected in the physical structure of our brains as well: pathological liars have been shown to have more white ‘wiring’ matter and less grey matter in the prefrontal cortex than other people. But how can we tell if someone is telling a lie, or telling the truth?

Back in the day – 2000 years ago – in ancient India, people would use the rice test to spot liars. When someone is lying, their sympathetic (‘fight or flight’) nervous system goes into overdrive, leading to a dry mouth. If you could spit out a grain of rice, you were seen to be telling the truth. If your mouth was parched and you couldn’t spit the grain out, you were lying. Since then, several different methods of catching out liars have been used – to varying levels of success.

In several books and films (Harry Potter, True Lies, The Hitchhiker’s Guide to the Galaxy and many more), a ‘truth serum’ is used to elicit accurate information from the recipient. In actual fact, truth serums don’t exist; outside fiction, their name is an ironic misnomer. Having said that, scientists have tried for decades to develop a failsafe ‘veritaserum’ to catch out liars.

Alcohol has been used as a sort of lie preventer for millennia, as the Latin phrase ‘in vino veritas’ (in wine [there is] truth) demonstrates. Alcohol acts in the brain by increasing the activity of GABA receptors, leading to a general depression of brain activity. This is thought to suppress complex inhibitions of thoughts and behaviours, loosening the drinker’s tongue. However, drinking alcohol doesn’t prevent people from giving false information, and it by no means prompts people to tell ‘the truth, the whole truth and nothing but the truth’.

A drug called scopolamine was used to sedate women during childbirth in the early 20th century, when a doctor noticed that women taking the drug would answer questions candidly and accurately. Scopolamine actually blocks muscarinic acetylcholine receptors rather than acting on GABA, but its sedating, disinhibiting effect is comparable to alcohol’s, and a person intoxicated with the drug is just as likely to give false information as someone who’s had a few stiff drinks.

Barbiturates, such as sodium amytal, are sedatives that work on the brain in a similar way to alcohol – by interfering with people’s inhibitions so that they spill the beans. Sodium amytal was used in several cases in the 1930s to interrogate suspected malingerers in the U.S. Army, but the drug does not prevent lying and can even make the recipient more suggestible and prone to making inaccurate statements.

In the 1950s and 60s, the CIA’s Project ‘MK-ULTRA’ tested drugs such as LSD on unconsenting adults and children. Had LSD proved a reliable truth serum, it would have been an invaluable tool in the Cold War. The tests showed, however, that LSD was far too unreliable and unpredictable to use in interrogation.

Despite the repeated lack of success in the search for a ‘truth serum’, scientists have continued trying to develop alternative technologies for busting liars. The polygraph, used by respected institutions including the CIA, FBI and The Jeremy Kyle Show, measures changes in arousal – heart rate, blood pressure, sweating, and breathing rate – in order to detect deception. However, there is a lot of scepticism surrounding polygraphy. In particular, there are several hacks to avoid getting caught out by a polygraph – most notably biting your tongue, doing difficult mental arithmetic, or tensing your inner anal sphincter without clenching your buttocks (thanks for that factual gem, QI).

The improvement of brain imaging methods – in particular functional magnetic resonance imaging or fMRI – has extended the scope of detecting liars. On the internet, one might stumble across ‘No Lie MRI’, an American firm that offers a lie detection service for individuals, lawyers, governments and corporations. They claim that this service could be used to “drastically alter/improve interpersonal relationships, risk definition, fraud detection, investor confidence [and] how wars are fought.”


Currently James Holmes, the man charged with killing 12 and injuring 70 in the cinema shooting in Aurora, Colorado, is on trial. The judge has ruled that, should Holmes plead insanity, he must consent to a “medically appropriate” narcoanalytic interview and polygraph. That is, Holmes could be interviewed under the influence of sodium amytal or similar drugs in order to determine whether or not he is feigning insanity. The use of these drugs may contravene the U.S. Constitution’s Fifth Amendment right to remain silent. Clinical psychiatrist Professor Hoge says, “The idea that sodium amytal is a truth serum is not correct. It’s an invalid belief. It is unproven in its ability to produce reliable information and it’s not a standard procedure used by forensic psychiatrists in the assessment of the insanity defence, nor is polygraph.”

The potential benefits of a 100% reliable, valid method of lie detection are obvious, although there are ethical grey areas that scientists and the legal community would need to tackle if the technology is ever developed. For now, I think the evidence for using current lie-detection methods, especially for anything more serious than The Jeremy Kyle Show, is far too sparse.

Post by Natasha Bray

Pushing Scientific Boundaries: How far is too far?

Science is nothing if not controversial. From Galileo through Darwin to modern-day researchers, certain scientists have always challenged the dogma of their era and often faced persecution because of it. These scientists usually kept up their ‘heretical’ beliefs because they were sure they were right and, in some famous examples, they were eventually vindicated.

But how does controversy affect modern-day science? We have now reached a stage where almost nothing seems impossible. We are able to do things that would have seemed outrageous a century ago: flying through the air on a regular basis, transplanting hands and faces and curing cancer, to name a few. A lot of scientific breakthroughs are made when people push the ethical boundaries of their time, but at what point must we say “that is enough, this has gone too far”? As each scientific taboo is broken and assimilated into modern day research, will there ever be a time where we push too far? Even if we do, will future generations use these once-controversial techniques as freely as we now accept that the earth revolves around the sun?

One problem faced when deciding whether or not a technique is morally acceptable is that moral and ethical values vary significantly from person to person. For example, in October 2012, it was reported that scientists were able to create healthy baby mice from stem cells. This led to speculation that in the future infertile women may be able to give birth to healthy babies made from their own stem cells. When the article was reported in the Guardian, the comments below the report were divided. Some thought it was a great breakthrough which should be applauded for the sophistication of the science alone. Others were excited about the prospect of ending the misery of infertility. Some people, however, were more cautious. Arguments against the technique included the question of whether, in an already overpopulated world, we should really be celebrating something that could make that problem worse. Others feared the scientists were “playing God” and were scared at the thought of them having so much control over life itself. This research may have started as a simple question of whether such a technique was possible, or from a desire to help infertile women, but it has now entered a minefield of divided opinion and controversy.

One scientist who is no stranger to controversy is John Craig Venter. Venter, a genome specialist based in the USA, hit the headlines in 2010 when his team created the first synthetic organism. Venter and his colleagues built a bacterial genome entirely from synthetic DNA; they nicknamed the resulting organism Synthia. Synthia acted much like a normal bacterium, replicating and passing the synthetic DNA on to her offspring. Whilst Venter was praised in many scientific corners for this remarkable achievement, there were others who voiced concerns about his work. Venter defended his creation by pointing out a number of beneficial tasks it could accomplish: for example, capturing and removing excess carbon dioxide from the atmosphere, or generating an alternative fuel source.

Interestingly, sometimes the amount of controversy generated around a discovery depends on the person who made it. Venter has previously made himself unpopular with the scientific community by turning the project to sequence the human genome into a race. He has also made moves into patenting (particularly of the synthetic genome he created), ensuring that in the future he will have full control over how Synthia may be used and will reap any financial rewards attached to this. This has angered many scientists who believe that discoveries should not be owned by any one individual, and that they should not be exploited for profit. Venter’s millionaire lifestyle and self-aggrandising quotes (for example, apparently insinuating that he deserves a Nobel Prize) have also rubbed fellow scientists up the wrong way. This behaviour may make people generally mistrustful of Venter’s motives, and therefore make his discoveries controversial. Did he make Synthia because he truly wanted to help technology and the environment? Did he do it just because he could? Or because he knew it would get him publicity? Did he make it with the idea of patenting? Or is it a case of “all of the above”?

However, is Venter any different from controversial figures of the past, some of whom we now consider to be the greatest scientific minds of all time? Do we need these maverick scientists to push forward discoveries that others are too afraid to make? Had Venter not turned it into a race, the Human Genome Project would not have been finished ahead of schedule. There’s certainly no denying that, whatever you think of his methods, Venter has made remarkable achievements in his career. On the other hand, do we need these boundaries pushed? How much should science interfere with nature? Is it this type of behaviour that makes scientists appear immoral or power-hungry in the minds of the public?

Unfortunately, there is no easy answer to this question. It would be nice to say that science can stay within comfortable moral boundaries and still move forwards, but that’s not the way the world works. We need mavericks and controversial figures to push scientific discoveries into the next era and, as I stated before, what is controversial at first may become normal several years later.

For my part, I’m wary of scientists who do something which they know is controversial simply because it is possible. I call this the “Jurassic Park mentality”: doing something for no better reason than ‘because you can’. Now, before you protest that Jurassic Park is fictional, remember that sometimes truth can be stranger than fiction. Take, for example, the Harvard professor who wants a surrogate mother for a Neanderthal baby. I always like to think that research should have some greater purpose which will ultimately prove beneficial. However, I’m not sure how a Neanderthal baby would be even remotely beneficial to anything or anyone.

That said, it’s true that we can’t always tell how research will be used in the future. Sometimes small or less controversial discoveries become part of something much bigger, and there’s no way of knowing how your research may be used by other people. Just ask Albert Einstein, whose work on atomic theory went on to aid the development of the atom bombs dropped on Hiroshima and Nagasaki during World War II.

Perhaps it’s best to think of it this way: when you start pushing at the boundaries of what will be considered controversial or even downright immoral, maybe that’s the time to step back and think, “What will the point of this be? Will this be helpful to humanity, the planet or the universe, or am I just doing it for publicity, fame, glory, or just because it is possible?” And if your answer falls into the latter part of that question, then maybe you should at least carefully assess the possibility of someone getting eaten by a rampaging dinosaur before you continue.

Post by: Louise Walker