A pill to cure Alzheimer’s!?: Why science stories should be reported more carefully

On October 10th 2013, headlines claiming that “A simple pill may cure Alzheimer’s” appeared across the British media. The Times (£) and the Independent both put the story on their front pages, and it was also covered on the BBC website and breakfast show. The story was tweeted by these outlets as well as America’s Fox News.

As a scientist who has worked on Alzheimer’s disease, I find that headlines like this always provoke my cynical side. I’ve seen stories proclaiming a cure many times before, yet no cure is forthcoming. My cynicism was somewhat rewarded when I researched the story further. The study did indeed find that a pill, which inhibits a protein called PERK, was able to prevent brain cell death in mice showing symptoms of neurodegenerative disease. However, the disease the mice had was not Alzheimer’s; they had a prion disease.

To put the research in context, here’s a quick explanation. Prions are misfolded, infectious proteins which are linked to neurodegenerative diseases such as BSE in cattle (or “mad cow disease” as it was known in the 80s) and CJD in humans. Alzheimer’s is also caused, at least in part, by a misfolded protein called amyloid-beta (Aβ). Both prions and Aβ are affected by something called the Unfolded Protein Response (UPR) in cells. The UPR detects the misfolded protein and stops the brain cell making any new proteins. This means that the cell cannot make proteins which are essential to its survival and so will eventually die, leading to neurodegeneration.

Don’t get me wrong, the results from the study are promising. However, the newspaper headlines are incredibly misleading. There are some key problems with interpreting the research as a “cure for Alzheimer’s”:

    • This study only theoretically applies to Alzheimer’s disease: the authors note that Aβ is subject to the same UPR as prions, but the drug’s effects will need to be tested on Aβ before any definitive conclusion can be drawn about its effectiveness in treating Alzheimer’s. Furthermore, Alzheimer’s disease involves other destructive mechanisms unrelated to the UPR which also need to be assessed. The same is true before a link can be made to the other neurodegenerative diseases mentioned in the paper, such as Parkinson’s or ALS.
    • The research was conducted in mice rather than humans, and there is no guarantee that the drug will be usable in humans. It may not have the same effects, or the side effects may render it unusable.
    • The drug causes potentially serious side effects in mice, such as mild diabetes and weight loss. These would have to be resolved before the drug could be given to humans, which could take a significant amount of time.
    • Because of the weight loss, the drug could not be given to the mice for long, so its long-term effects are unknown. Something which targets both the brain and an essential cellular process such as the UPR may prove harmful if used over a long period.
    • The pill does not “cure” memory loss. Mice treated with the drug did not regain memories which were already lost, although treatment did stop the disease from progressing further. This pill will not help people who already suffer from mid to late-stage dementia.
    • Even if the drug is suitable for use in humans, it will have to go through clinical trials before being put to regular use. This will take years, possibly even decades.

Most of the newspapers covering this story did mention some of these problems. The Independent in particular made it very clear that the study was in mice and that a cure is “a long way off”. (However, their tweet claimed that “This breakthrough in treatment for Alzheimer’s could very soon pave the way for a simple pill to cure the disease”, showing how the message shifts between these types of news communication.) The Express, on the other hand, took seven lines to even mention that the study was in mice. Unfortunately, in this day and age, many people don’t read further than the headline, sub-heading and possibly the first two or three paragraphs. Many readers may therefore get the impression that a cure for Alzheimer’s is imminent and misunderstand the point of the study.

Later in the day, when things had calmed down a bit, many newspapers did write editorials (for example in the Independent and the Guardian). These mostly highlighted some of the points above and clarified that there is still a long way to go in curing Alzheimer’s. But this was after the damage had been done: the headlines had been seen and the tweets had been sent. The point is that the story should never have been given so much prominence in the first place.

It’s quite easy to assess how people are reacting to a story using Twitter. A quick search for “Alzheimer’s” on the day the story broke showed a lot of people re-tweeting it from various news sources, some with a link, some without. The misleading nature of the headline can be seen in some of these tweets, including one which said “They’ve found a cure for Alzheimer’s. This is big”. One of the big tweeters was the comedian Jimmy Carr, who tweeted a (rather lame) joke about the story.

With no link to the story, how are people supposed to know where he got the information from? Another problem with today’s microblogging news delivery system is that there isn’t a lot of room for detail, so the story can easily mutate.

There were some voices of caution. The Alzheimer’s Society stated: “This is a promising development as it shows this biological pathway is a potential target for new treatments. However, it is important to note that this study was carried out on mice with prion disease and so it is not clear how applicable it is to humans with diseases such as Alzheimer’s.”

But it’s not the users of Twitter who I am concerned about. The problem with misleading reporting like this is the effect it has on sufferers of the disease and their relatives. The reason this particular story has made me angry is that I have seen the effects this sort of reporting can have. Long-term readers will be aware that my grandmother suffered from Alzheimer’s disease, which took a huge toll on my grandfather.

I still clearly remember a day when a national tabloid newspaper carried the headline “Vaccine for Alzheimer’s Disease!” My grandfather read the headline, turned to me and said, “Does this mean they’ll be able to cure your grandmother?” My cynicism piqued, I read the article and had to gently tell him no. That story had many of the same caveats as this one: it was a study done in mice and no human trials had been conducted. Five years later there is no news on that subject; whether it failed at clinical trials or is still being tested, I don’t know. But the false hope it gave to my grandfather, and to the countless others who read these headlines and think their disease may soon be cured, is a sad and dangerous thing.

Who is to blame for this misinformation spreading? It’s probably a subtle combination of the scientists who wrote the paper, the journal that published it and the reporters who wrote the newspaper stories. For scientists, having work covered in national newspapers is a huge coup; national reports generate interest in your work, making you more likely to secure funding to continue your groundbreaking research. Unfortunately, newspapers, and by extension their readers, mostly respond to “interesting” stories, which translates to “treating a disease that people have heard of”.

Alzheimer’s is big news now, as it is predicted to affect 1 million people in the UK by 2021. Therefore, the scientists probably put Alzheimer’s front and centre in their findings to increase interest in their research. This is a common practice amongst researchers desperate to secure funding from a dwindling pot. I noted when researching this post that every single headline used the more evocative “Alzheimer’s” rather than “Parkinson’s disease” or “ALS”, which were also mentioned in the research paper as potential beneficiaries of the drug. Curiously, CJD isn’t even mentioned by the researchers as a disease which could benefit from the treatment, despite being the best-studied prion disease in humans. However, CJD is much rarer than Alzheimer’s (causing around 1 death per 1-2 million people each year), and the media storm that surrounded it in the 90s has died down. It is not a “sexy” enough disease to sell research or newspapers on such a grand scale.

How is this problem going to be solved? Is it possible to make research interesting if it’s not linked to a disease? I would like to think so, but then I’m biased. It’s a real bugbear for me as a biologist that an “interesting” story about biological research has to be about curing a disease. Research which simply explains how a system works can be fascinating.

Certainly, taking a small, speculative point and blowing it up into the key part of the story doesn’t work. However, an accurate headline such as “There’s a drug which prevents brain cell death in mice that have something similar to Alzheimer’s disease; it won’t be used in humans for a decade or so” is hardly catchy. But it should be made crystal clear in the opening lines of the article exactly what has been found and how it relates to the disease; the authors and journalists at least owe that to the people affected. As a reader, it’s probably best to take headlines pairing the word “cure” with a deadly disease with a pinch of salt until you’ve read the full article. Unless they involve the words “repeated successful human trials”, it’s probably best to treat the information with caution.

Post by: Louise Walker (in rant mode)

Body disorders that you never knew existed – Part 1

Welcome to the world of the weird and wonderful. You are about to be taken on a rundown of five of the most unusual, rare, fascinating and possibly unthinkable disorders that we know exist.

1. Hypertrichosis – ‘Werewolf syndrome’

Imagine having a body covered in so much hair that people mistake you for a werewolf. This is something that sufferers of hypertrichosis have to deal with on a daily basis. Hair growth isn’t restricted to the areas of the body that we consider ‘normal’; instead it spreads across the body and face of men, women and children alike. The disorder is extremely rare, with fewer than 100 known cases worldwide. But how does this unusual condition come about? Scientists think that there are two causes: one of a genetic nature, and the other developing due to certain external factors. Researchers in China tested the DNA of two unrelated patients with the condition and found that there were extra genes present in the same region of the X chromosome. This extra DNA sits near a gene involved in hair growth (SOX3) and is thought to switch on this gene, stimulating mass hair production. Next time you have a moan about having to shave or wax to get rid of your unwanted hair, spare a thought for hypertrichosis sufferers.

2. Foreign Accent Syndrome

Whilst this sounds like something from a very strange medical drama, foreign accent syndrome really does exist. Usually occurring as a result of severe brain injury such as stroke or trauma, the condition leaves the patient speaking with an accent distinct from the one they had before. One of the most recent cases occurred after a woman suffered a severe migraine. She woke up in hospital to find that she was speaking with a Chinese accent, despite never having visited China. What is to blame for this sudden change in accent? Scientists have found that damage to the parts of the brain required for speech, and for the movement of muscles during speaking, affects how we pronounce words. This changes the timing and rhythm of our speech. As our tongue forms words in a different way, it sounds as if we are speaking with an accent.

3. Congenital pain insensitivity

A condition where you are unable to feel any pain sounds like an absolute blessing. No headaches, no pain when you’ve broken a bone or whacked your knee on the side of a table. But now think about it seriously: imagine not being able to tell if you’ve pushed your body too far exercising or cut your finger whilst chopping up a carrot. Pain is one of our body’s most protective mechanisms, alerting us that something is wrong and needs our attention. Without this basic mechanism we would have no way of knowing when something has gone wrong. Individuals born with this condition have what we call a loss of sensory perception: they are unable to feel pain but can feel pressure and touch. A mutation affecting how the nerve cells form during development is thought to cause the improper functioning of these nerves in response to pain. Sadly, the condition tends to occur alongside other deficits such as intellectual disability and, in some cases, an impaired ability to regulate body temperature. Not being able to feel pain would be extremely advantageous… if you are a superhero, that is. For us mere mortals, not so helpful.

4. Fibrodysplasia Ossificans Progressiva – ‘Stone man syndrome’

Stone man syndrome does what it says on the tin. Cue an image of The Thing from the Fantastic Four: a body essentially made of rock. Slowly, over time, sufferers of this excruciatingly painful disorder start turning to bone. Due to a malfunction of the body’s repair mechanism, the gene responsible for ossification (bone formation) during development remains active. This gene is usually switched off once the bones of the fetus have developed. In time, muscles, tendons and ligaments slowly begin to harden and turn to bone. As the ossification worsens, everyday tasks such as tying your shoelaces or walking to the shop become impossible. Would surgery provide suitable relief? In short, no. Surgery is not considered an option, as this type of trauma causes the body to attempt to repair the damaged area – creating more bone and more damage than before. There are around 700 confirmed cases of FOP worldwide, and very little is known about how to treat it. Next time your body feels stiff and uncomfortable, remember that what you are experiencing doesn’t even scratch the surface of what these people made of stone are subjected to.

5. Trimethylaminuria – ‘Fish odour syndrome’

Trimethylaminuria is a rare metabolic condition that can be embarrassing for the individuals suffering from it. An enzyme (FMO3) that is needed to break down trimethylamine (TMA) into a substance called trimethylamine oxide is absent from the body. Without the enzyme to break it down, TMA gradually builds up and has to be removed from the body through other outlets such as the skin, urine and breath. Whilst sweating out waste products isn’t unusual, it is the strong fish-like odour that comes with it that sufferers find so distressing. The condition is more common in women, possibly exacerbated by female hormones. Despite the putrid odour, there are no other symptoms associated with it.

Acne bacteria to blame for back pain?

What do acne and chronic back pain have in common? Well, as it turns out, more than people once thought. A group at the University of Southern Denmark has found that the same bacteria that give people spots might be to blame for the lower back pain of up to 40% of patients. What’s more, these infections can be treated with antibiotics.

Slipped disc popping out from between the evenly grey vertebrae

Your backbone is a column of alternating vertebrae (bones) and intervertebral discs (cushions). The bones provide the strength and support, while the cushion discs allow movement and flexibility. Occasionally, thanks to a mix of age and awkward movement, the disc can bulge out from between the bones. In some cases the jelly-like goo in the disc’s centre, called the nucleus, can even ooze out – a bit like thick jam leaking out of a doughnut. If the nuclear material or the disc itself puts pressure on nerves coming in and out of the spine, it can be even more painful.

Slipping a disc is, by all accounts, excruciating, but it usually starts to heal within 6-8 weeks. However, someone can be diagnosed with chronic back pain (CBP) when the pain doesn’t subside after three months. Trouble is, this happens all too often, with an estimated 4 million people in the UK suffering from CBP at some point in their lives. The cost of CBP to the NHS is about £1 billion per annum, and that doesn’t even cover lost working hours or the loss of livelihood suffered. Treatment usually focuses on relieving pain and preventing inflammation, with cognitive behavioural therapy more recently added to address the patient’s psychology, especially if the organic, physical cause of the pain is no longer obvious.

Recently, scientists in Denmark found a really important link between the bacteria responsible for acne, known as Propionibacterium acnes (P. acnes), and bad backs. The researchers found that in about half of their patients with slipped discs, the disc itself was infected, usually with P. acnes. A year later, 80% of the infected patients – compared to 43% of the uninfected patients – had dodgier bones on either side of the slipped disc than 12 months before. The affected bones had developed tiny fractures and the bone marrow had been replaced with serum, the liquid found in blisters.

Acne is not to blame for bad teenage hairstyle choices.

So how did the discs get infected? Bacteria like P. acnes get into our bloodstream all the time, particularly when we brush our teeth or squeeze spots. P. acnes and other similar bacteria don’t like oxygen-rich environments and so don’t normally grow inside us. The spinal disc doesn’t have a lot of oxygen around, providing a perfect home for the bacteria. If the disc is damaged – say, after popping out from the spinal column – tiny blood vessels sprout into it, letting the bacteria move in and settle down.

There, the bacteria grow and, rather than spread anywhere else, they spit out inflammatory chemicals and acid. The acid corrodes the bone next to the disc and causes more swelling and pain around the area. This discovery is ground-breaking, since before this research it was thought that discs couldn’t get infected except in a few exceptional cases.

The Danish researchers then conducted a second study, testing whether simple antibiotics could get rid of these bacteria and therefore treat chronic lower back pain. Patients who already had the characteristic signs of bone inflammation (tiny fractures and swelling) were given a 100-day course of antibiotics and were reassessed a year after the trial began. Patients treated with antibiotics reported less pain and less ‘bothersomeness’ (yes!), took fewer days off work, made fewer visits to the doctor and, crucially, their bones looked in much better nick than those of the patients given a placebo.

Considering the huge number of people affected by chronic back pain, and the cost of treatments like surgery versus a course of antibiotics, this discovery has been hailed as the stuff of Nobel prizes. The revelation that bacteria may be to blame for some cases of this mysteriously untreatable condition rings familiar: it has been likened to the discovery of the bacterium behind stomach ulcers, Helicobacter pylori. Like back pain today, stomach ulcers were for years dismissed as a disease of the mind, endemic among stressed-out melodramatics or people who ate too much spicy food. (And yes, Barry Marshall did get a Nobel Prize for swallowing a Petri-dishful of H. pylori.) It would be fantastic if, instead of resorting to surgery, half a million CBP patients could be effectively cured within 100 days or less!

The bacteria in the plate on the right have become resistant to many of the antibiotics (the white spots) and so have spread more widely.
Photo by Dr. Graham Beards

Unfortunately, there is a downside. Antibiotics have long been the magical cure-all but, just like fossil fuels, housing and talent on TV, we’re running out. Bacteria are becoming resistant to antibiotics faster than we can create new, effective ones. It’s an arms race, and we’re losing it very quickly. What’s worse is that, because of the recent negativity surrounding over-prescription, there are now restrictions on giving patients broad-spectrum antibiotics. Since antibiotics can’t be used as freely as they were 30 years ago, pharmaceutical companies can’t make much profit from developing new ones. And so, further compounding the problem of antibiotic resistance, fewer and fewer antibiotics are being created every year.

In 2000 alone, UK doctors wrote 2.6 million prescriptions for antibiotics to treat acne. One study by a group in Leeds looked at the number of acne patients carrying strains of P. acnes resistant to at least one type of anti-acne antibiotic. Between 1991 and 2000, the fraction of acne patients with antibiotic-resistant bacteria rose from about a third to more than a half.

The discovery that acne bacteria might be to blame for so many cases of debilitating back pain is hugely important. However, it also highlights how dependent we are on our dangerously dwindling supply of effective antibiotics, and how we might be wasting antibiotic effectiveness on comparatively trivial conditions such as spots.

Post by: Natasha Bray

Your Brain on Lies, Damned Lies and ‘Truth Serums’

Pork pies, fibs, whoppers, untruths, tall stories, fabrications, damned lies… not to mention statistics.

Apparently, every person lies an average of 1.65 times every day. However, since that average is self-reported, maybe take the figure with a pinch of salt. The truth is, most people are great at lying. The ability to conjure up a plausible alternative reality is, when you think about it, seriously impressive, but it takes practice. From about the age of 3, young children are able to make up false information at a stunning rate of one lie every 2 hours – though admittedly the lies from a toddler’s wild imagination are relatively easy to identify.

When we lie, brain cells in the prefrontal cortex – the planning ‘executive’ of the brain – work harder than when we tell the truth. This may be reflected in the physical structure of our brains as well: pathological liars have been shown to have more white ‘wiring’ matter and less grey matter in the prefrontal cortex than other people. But how can we tell whether someone is telling a lie or telling the truth?

Back in the day – 2000 years ago – in ancient India, people would use the rice test to spot liars. When someone is lying, their sympathetic (‘fight or flight’) nervous system goes into overdrive, leading to a dry mouth. If you could spit out a grain of rice, you were seen to be telling the truth. If your mouth was parched and you couldn’t spit the grain out, you were lying. Since then, several different methods of catching out liars have been used – to varying levels of success.

In several books and films (Harry Potter, True Lies, The Hitchhiker’s Guide to the Galaxy and many more), a ‘truth serum’ is used to elicit accurate information from the recipient. In actual fact, however, truth serums don’t exist; outside fiction, the name is an ironic misnomer. Having said that, scientists have tried for decades to develop a failsafe ‘veritaserum’ in order to catch out liars.

Alcohol has been used as a sort of lie preventer for millennia, as the Latin phrase ‘in vino veritas’ (in wine [there is] truth) demonstrates. Alcohol acts in the brain by increasing the activity of GABA receptors, leading to a general depression of brain activity. This is thought to suppress complex inhibitions of thoughts and behaviours, loosening the drinker’s tongue. However, drinking alcohol doesn’t prevent people from giving false information, and it by no means prompts people to tell ‘the truth, the whole truth and nothing but the truth’.

A drug called scopolamine was used to sedate women during childbirth in the early 20th century, when a doctor noticed that women taking the drug would answer questions candidly and accurately. Scopolamine has a sedating, disinhibiting effect similar to alcohol (though it works by blocking acetylcholine receptors rather than boosting GABA), and a person intoxicated with the drug is just as likely to give false information as someone who’s had a few stiff drinks.

Barbiturates, such as sodium amytal, are sedatives that work on the brain in a similar way to alcohol – interfering with people’s inhibitions so that they spill the beans. Sodium amytal was used in several cases in the 1930s to interrogate suspected malingerers in the U.S. army, but the drug does not prevent lying and can even make the recipient more suggestible and prone to making inaccurate statements.

In the 1950s and 60s, the CIA’s Project ‘MK-ULTRA’ tested drugs such as LSD on unconsenting adults and children. Had LSD proved a reliable truth serum, it would have been an invaluable tool in the Cold War; the tests, however, showed that LSD was far too unreliable and unpredictable to use in interrogation.

Despite the repeated lack of success in the search for a ‘truth serum’, scientists have continued trying to develop alternative technologies for busting liars. The polygraph, used by respected institutions including the CIA, FBI and The Jeremy Kyle Show, measures changes in arousal – heart rate, blood pressure, sweating and breathing rate – in order to detect deception. However, there is a lot of scepticism surrounding polygraphy. In particular, there are several hacks to avoid getting caught out by a polygraph – most notably biting your tongue, doing difficult mental arithmetic, or tensing your inner anal sphincter without clenching your buttocks (thanks for that factual gem, QI).

The improvement of brain imaging methods – in particular functional magnetic resonance imaging or fMRI – has extended the scope of detecting liars. On the internet, one might stumble across ‘No Lie MRI’, an American firm that offers a lie detection service for individuals, lawyers, governments and corporations. They claim that this service could be used to “drastically alter/improve interpersonal relationships, risk definition, fraud detection, investor confidence [and] how wars are fought.”

Currently, James Holmes, the man charged with injuring 70 people and killing 12 in the Batman cinema shooting in Aurora, Colorado, is on trial. The judge has ruled that he may be required to consent to a “medically appropriate” narcoanalytic interview and polygraph. That is, Holmes could be interviewed under the influence of sodium amytal or similar drugs in order to determine whether or not he is feigning insanity. The use of these drugs may contravene the Fifth Amendment right to remain silent in the U.S. Constitution. Clinical psychiatrist Professor Hoge says, “The idea that sodium amytal is a truth serum is not correct. It’s an invalid belief. It is unproven in its ability to produce reliable information and it’s not a standard procedure used by forensic psychiatrists in the assessment of the insanity defence, nor is polygraph.”

The potential benefits of a 100% reliable and valid method of lie detection are obvious, although there are ethical grey areas that scientists and the legal community would need to tackle if such technology is ever developed. For now, I think the evidence for using current lie detection methods, especially for anything more serious than The Jeremy Kyle Show, is far too sparse.

Post by: Natasha Bray

Pushing Scientific Boundaries: How far is too far?

Science is nothing if not controversial. From Galileo through Darwin to modern-day researchers, certain scientists have always challenged the dogma of their era and often faced persecution because of it. These scientists usually kept up their ‘heretical’ beliefs because they were sure they were right and, in some famous examples, they were eventually vindicated.

But how does controversy affect modern-day science? We have now reached a stage where almost nothing seems impossible. We are able to do things that would have seemed outrageous a century ago: flying through the air on a regular basis, transplanting hands and faces and curing some cancers, to name a few. A lot of scientific breakthroughs are made when people push the ethical boundaries of their time, but at what point must we say “that is enough, this has gone too far”? As each scientific taboo is broken and assimilated into modern-day research, will there ever be a time when we push too far? Even if we do, will future generations use these once-controversial techniques as freely as we now accept that the earth revolves around the sun?

One problem faced when deciding whether or not a technique is morally acceptable is that moral and ethical values vary significantly from person to person. For example, in October 2012, it was reported that scientists were able to create healthy baby mice from stem cells. This led to speculation that in the future infertile women might be able to give birth to healthy babies made from their own stem cells. When the article was reported in the Guardian, the comments below the report were divided. Some thought it was a great breakthrough which should be applauded for the sophistication of the science alone. Others were excited about the prospect of ending the misery of infertility. Some people, however, were more cautious. Arguments against the technique included the question of whether, in an already overpopulated world, we should really be celebrating something that could make that problem worse. Others feared the scientists were “playing God” and were scared at the thought of them having so much control over life itself. This research may have started as a simple question of whether such a technique was possible, or from a desire to help infertile women, but it has now entered a minefield of divided opinion and controversy.

One scientist who is no stranger to controversy is John Craig Venter. Venter, a genome specialist based in the USA, hit the headlines in 2010 when his team created the first synthetic organism. Venter and his colleagues built a bacterial genome entirely from synthetic DNA and nicknamed the resulting organism Synthia. Synthia acted much like a normal bacterium, replicating and passing the synthetic DNA on to her offspring. Whilst Venter was praised in many scientific corners for this remarkable achievement, others voiced concerns about his work. Venter defended his creation by pointing out a number of beneficial tasks it could accomplish: for example, capturing and removing excess carbon dioxide from the atmosphere, or generating an alternative fuel source.

Interestingly, the amount of controversy generated around a discovery sometimes depends on the person who made it. Venter has previously made himself unpopular with the scientific community by turning the project to sequence the human genome into a race. He has also made moves into patenting (particularly of the synthetic genome he created), ensuring that in the future he will have full control over how Synthia may be used and will reap any financial rewards attached to this. This has angered many scientists who believe that discoveries should not be owned by any one individual and should not be exploited for profit. Venter’s millionaire lifestyle and self-aggrandising quotes (for example, apparently insinuating that he deserves a Nobel Prize) have also rubbed fellow scientists up the wrong way. This behaviour may make people generally mistrustful of Venter’s motives and therefore render his discoveries more controversial. Did he make Synthia because he truly wanted to help technology and the environment? Did he do it just because he could? Or because he knew it would get him publicity? Did he make it with the idea of patenting? Or is it a case of “all of the above”?

However, is Venter any different from controversial figures of the past, some of whom we now consider to be the greatest scientific minds of all time? Do we need these maverick scientists to push forward discoveries that others are too afraid to make? If Venter hadn’t turned it into a race, the human genome project would not have been finished earlier than planned. There’s certainly no denying that, whatever you think of his methods, Venter has made remarkable achievements in his career. On the other hand, do we need these boundaries pushed? How much should science interfere with nature? Is it this type of behaviour which makes scientists appear immoral or power-hungry in the minds of the public?

Unfortunately, there is no easy answer to this question. It would be nice to say that science can stay safely within the moral boundaries of its day and still move forwards, but that’s not the way the world works. We need mavericks and controversial figures to push scientific discoveries into the next era and, as I said before, what is controversial at first may become normal several years later.

For my part, I’m wary of scientists who do something they know is controversial simply because it is possible. I call this the “Jurassic Park mentality”: doing something for no better reason than ‘because you can’. Now, before you protest that Jurassic Park is fictional, remember that sometimes truth can be stranger than fiction. Take, for example, the Harvard professor who wants a surrogate mother for a Neanderthal baby. I always like to think that research should have some greater purpose which will ultimately prove beneficial. However, I’m not sure how a Neanderthal baby would be even remotely beneficial to anything or anyone.

It’s true, though, that we can’t always tell how research will be used in the future. Sometimes small or less controversial discoveries become part of something much bigger, and there’s no way of knowing how your research may be used by other people. Just ask Albert Einstein, whose work on atomic theory went on to aid the development of the atom bombs dropped on Hiroshima and Nagasaki during World War II.

Perhaps it’s best to think of it this way: when you start pushing at the boundaries of what will be considered controversial or even downright immoral, maybe that’s the time to step back and ask, “What will the point of this be? Will this be helpful to humanity, the planet or the universe, or am I just doing it for publicity, fame, glory, or simply because it is possible?” And if your answer falls into the latter category, then maybe you should at least carefully assess the possibility of someone getting eaten by a rampaging dinosaur before you continue.

Post by: Louise Walker