Hearing what you expect to hear.

Can you spot the error?

Can you spot the error?

Most of us have had the experience of finding a glaring error in a piece of writing that we had previously checked several times. For example, when blogging I often find at least one simple error in a post once it has actually been published, despite proofreading it thoroughly before submission. In such circumstances it seems impossible that one could have overlooked such an obvious error. The reason these mistakes get missed is that we tend to perceive what we expect to perceive. When proofreading something we have written ourselves, we know what we were planning to write. We therefore tend to perceive the words we think we put on the page, rather than those that are actually there. This is why it is always wise to get someone else to read over an important piece of written work before you submit it; they are less likely to know what to expect, and therefore less likely to miss errors.

Why does the brain attempt to predict future perceptual input when doing so can cause such errors? The reason is that, in the vast majority of cases, expectation improves perception. There is ample evidence from behavioural science that being able to predict the content of an upcoming stimulus improves our ability to perceive it successfully. This is especially true when the signal is degraded in some way, as is often the case with real-world perception (e.g. 1, 2). Despite the advantages of predictive perceptual signalling, the errors caused by expectation are of interest because they can inform us about the limits of our perceptual systems, and about how mistakes and dysfunctions within such systems can occur. For example, the ‘size-weight illusion’ (3) occurs because we expect an object’s weight to be proportional to its size. This expectation makes us less accurate at estimating the weight of objects that are unusually high or low in density.

Expectation impacts our perceptual system by biasing perception towards reporting the expected signal. When the expectation is incorrect, this can cause us to mistakenly perceive the expected signal when a different but similar signal is present (e.g. 4). For example, one might mistake the written word ‘law’ for ‘low’. More startling is the possibility that expectation may cause us to perceive an expected signal when no signal is present at all. This could explain the occurrence of some hallucinations, in which the brain appears to create often complex percepts in the absence of any concordant sensory information. Recently it has been proposed that the auditory hallucinations often experienced by those suffering from psychosis may occur because signals of expected perceptual input become misperceived as genuine sounds (5). Indeed, misperceptions due to expectation may be more likely in those with psychosis, because the cognitive dysfunctions they suffer make their perceptual predictions less accurate than those of healthy individuals (6).

Previous research has shown that expectation can cause misperceptions when the signal actually present has some similarity to the expected signal (1, 4). Further to this, it has been shown that an erroneous expectation as to the nature of a spoken word can cause a missing phoneme to be heard within that word (7). This suggests that expectation can cause the perception of speech during periods when no speech at all is present. However, hallucinated speech tends to be far more complex than individual phonemes. Furthermore, in that research the (genuine) presence of the rest of the word was required to generate the illusory perception of the missing phoneme. In contrast, auditory hallucinations can occur when little or no concordant sensory information related to the hallucinated percept is present. Thus the finding that expectation can generate the perception of missing phonemes does not readily suggest that perceptual expectation could be a cause of auditory hallucinations.

Is it possible that the impact of expectation on perception is such that it can cause entire words to be perceived when they are in fact absent? A study recently published in the British Journal of Psychology tested this possibility (8). To mimic the auditory-verbal hallucinations common in psychosis, healthy participants were asked to report whether the final word of a spoken sentence was present when that final word had either been masked by white noise or entirely replaced with it. A false positive in this task (reporting speech when only white noise was presented) could be considered analogous to an auditory hallucination, in that it represents the perception of auditory-verbal stimulation when no such stimulation is actually present. Crucially, during the task the sentence frame (the words preceding the final word) either did or did not generate a specific expectation as to the nature of the final word. This manipulation allowed the effect of expectation on the detection of speech to be tested. Replicating previous findings, the presence of expectation was found to improve the ability to detect speech within the white noise. However, it was also found that, despite this improvement, the presence of expectation biased perception to such an extent that it significantly increased the number of false positive responses (8). Thus expectation is capable of generating the perception of spoken words in the absence of any actual speech.
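The signal detection analysis used in studies like this separates how well listeners can distinguish speech from noise (sensitivity, d′) from how willing they are to report “speech” (the criterion). As a minimal sketch, with invented numbers purely for illustration (not data from the study), the following shows how expectation can raise sensitivity while also shifting the criterion in a liberal direction, so that false positives increase too:

```python
from statistics import NormalDist

def sdt_measures(hit_rate, fa_rate):
    """Signal-detection sensitivity (d') and criterion (c) from a
    hit rate and a false-alarm rate."""
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    d_prime = z(hit_rate) - z(fa_rate)             # sensitivity
    criterion = -0.5 * (z(hit_rate) + z(fa_rate))  # response bias
    return d_prime, criterion

# Hypothetical rates for illustration only.
# Without expectation, listeners respond conservatively...
d_no, c_no = sdt_measures(hit_rate=0.60, fa_rate=0.10)
# ...with expectation, hits AND false alarms both rise: sensitivity
# improves, but the criterion shifts in a liberal direction.
d_exp, c_exp = sdt_measures(hit_rate=0.85, fa_rate=0.25)
```

With these illustrative numbers, d′ rises from about 1.53 to about 1.71, while the criterion drops from positive (conservative) to negative (liberal), mirroring the study's pattern of better detection alongside more false positives.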

Interestingly, in this study the ability of expectation to induce false positive responses was not related to participants’ levels of self-reported hallucination-proneness. This may suggest that, while signals of expected input can provide the content for auditory hallucinations, a further deficit must be present in those who hallucinate, one which allows such signals to be erroneously heard as genuine in naturalistic circumstances. What such deficit(s) might be is the subject of ongoing research, which will hopefully lead to a full understanding of the basis of psychotic auditory hallucinations.


1) Krol, M. E., & El-Deredy, W. (2011). When believing is seeing: the role of predictions in shaping visual perception. Quarterly Journal of Experimental Psychology, 64(9), 1743-1771. doi:10.1080/17470218.2011.559587 <Link>

2) Boulenger, V., Hoen, M., Jacquier, C. & Meunier, F. (2011). Interplay between acoustic/phonetic and semantic processes during spoken sentence comprehension: An ERP study. Brain & Language 16, 51-63 <Link>

3) Charpentier, A. (1891). Analyse expérimentale de quelques éléments de la sensation de poids [Experimental analysis of some elements of weight sensations]. Arch. Physiol. Norm. Pathol., 3, 122-135. <Link>

4) Rahnev, D., Lau, H. & Lange, F. (2011) Prior expectation modulates the interaction between sensory and prefrontal regions in the human brain. The Journal of Neuroscience 31 (29) 10741-10748 <Link>

5) Nazimek, J. M., Hunter, M. D., & Woodruff, P. W. (2012). Auditory hallucinations: expectation-perception model. Medical Hypotheses, 78(6), 802-810. doi:10.1016/j.mehy.2012.03.014 <Link>

6) Fletcher, P. C., & Frith, C. D. (2009). Perceiving is believing: a Bayesian approach to explaining the positive symptoms of schizophrenia. Nature Reviews Neuroscience, 10(1), 48-58. <Link>

7) Samuel, A. G. (1981). Phonemic Restoration – Insights from a New Methodology. Journal of Experimental Psychology-General, 110(4), 474-494 <Link>

8) Hoskin, R., Hunter, M.D., Woodruff, P.W. (2014). The effect of psychological stress and expectation on auditory perception: A signal detection analysis. British Journal of Psychology 105(4) 524-546. <Link>

Book Review: The Sense of Being Stared At, by R. Sheldrake

Sheldrake, R. 2004. The Sense of Being Stared At, and Other Aspects of the Extended Mind. Arrow Books, UK.

The famous sci-fi writer Sir Arthur C. Clarke once said “Magic’s just science we don’t understand yet.”

This quote is repeated by Jane Foster (Natalie Portman) in the movie Thor (2011) as she persuades Erik Selvig (Stellan Skarsgård) to believe her story that the Norse god Thor (Chris Hemsworth) has come to Earth. Their argument is one that can easily be found off screen, and it is easy to see where it has come from. Show the people of the past any piece of today’s technology, or demonstrate a scientific principle to them, and they will assume that what you are doing is magic.

While it is possible that there are many things within science we don’t yet know, the Erik Selvigs of the world firmly draw the line between magic and science. This is one book that famously blurs that line, and it has consequently attracted controversy.

This book builds upon Sheldrake’s numerous publications and previous books, notably Dogs That Know When Their Owners Are Coming Home. He approaches the controversial area of telepathy in a scientific manner that befits his education and career: he did his undergraduate degree in Natural Sciences at Cambridge, went on to study Philosophy and History of Science at Harvard, and returned to Cambridge for his PhD in biochemistry. His career since then has been heavily embedded within science, and he has established scientific protocols that examine specific aspects of his theory.

His theory, and the basic premise of the book, is that the mind does not exist solely within the brain, but stretches outside the human body. He describes this as a field, much like a magnetic field, which he terms the “morphic field” and which explains supernatural-like abilities. The book is divided into three parts (Telepathy; The Power of Attention; and Remote Viewing and Foreshadowings of the Future), with chapters focusing on a specialism within each part, e.g. telephone telepathy in part 1, the sense of being stared at in part 2, and precognition in part 3. Each chapter is as formulaic as a scientific publication: he starts with definitions and the history of the specific ability, then moves on to experimental case studies, explicitly stating his methods, data and analysis (including the importance of statistics for differentiating between positive results and coincidences).

Carl Sagan once said that extraordinary claims require extraordinary evidence, and Sheldrake duly offers up a wealth of evidence and data with scientific rigour. Nevertheless, this appears not to be enough to quell the staunch position of some sceptics, and there are plenty of them! You can always check out the Controversies and Debates with Skeptics page on Sheldrake’s website to get the full story.

It is very easy to reject work that might come under pseudo-science, and the first natural tendency of any scientist is to be sceptical and question it. Sheldrake, as a scientist himself, knows the importance of healthy scepticism in scientific advancement, but he does find himself in debates and confrontations with dogmatic sceptics who openly attack his research. Sheldrake addresses the hostility to his work, outlining three reasons why it has garnered such criticism:
1- Topics like his are open to fraud
2- His work and evidence violate the deep-seated assumption that the mind exists only within the body
3- The paranormal raises the uncomfortable prospect of our privacy being infringed, an idea nobody likes

What strikes me the most (and probably anyone who reads the book) is that he doesn’t just give evidence; he encourages others to get involved and do their own experiments. He provides his data and methods in the appendices (and on his website), and explains how he has collaborated with sceptics to improve his experiments in terms of both methodology and statistical rigour. His book may blur the line, but he has the scientific background and approach that many others in pseudo-science do not. In any case, he encourages the scientific process and engages in debates and collaborations, which is always a good thing!

His work is fascinating no matter how it is received. Even if you are an open-minded “Jane Foster” or a suspicious “Erik Selvig,” I would recommend this book as excellent food for thought for anyone who enjoys an intellectual challenge! As for whether his theory holds water or not, the only way to settle that is with more objective and scientific evidence. So who is ready to participate?

Book Review: Gaia – A New Look at Life on Earth, by James Lovelock

“Not so long ago, it seemed that mankind was like a cancer on this planet” (p. 130)

This sounds awfully similar to the sentiment offered by Agent Smith in The Matrix:
“Human beings are a disease, a cancer of this planet. You’re a plague and we are the cure.”
His speech to Morpheus does sound quite like our relationship with the environment, a relationship Lovelock expands upon in his notable book Gaia: A New Look at Life on Earth.

James Lovelock started to write this book in Ireland in 1974, against the backdrop of the Cold War and the start of the environmental movement, a legacy bequeathed by Rachel Carson (1907-1964). His involvement in NASA’s planetary exploration programme to investigate life on Mars spurred him to ask the question: what defines life? His focus turned back to Earth, and from this, in partnership with Lynn Margulis, was born the Gaia hypothesis: that the Earth and the life on it together form a cybernetic system, maintaining balance (homeostasis) through feedback loops.

His book, which has been corrected and re-published with a new preface, explains his thought process and presents the evidence for the existence of Gaia. He recounts how William Golding (author of Lord of the Flies) suggested the name Gaia, from Greek mythology. He also explains how critics responded to his work; some were not at all happy with the language he used to present his theory, because it wasn’t scientific enough.

His evidence for Gaia begins with the history and evolution of the Earth, and how equilibria were established through feedback cycles. It is a global disequilibrium that keeps energy in flux, and because of this Gaia is a self-regulating system, keeping in check the components and processes that could ultimately be disastrous for life. He illustrates this principle using the atmosphere (chapter 5) and the ocean (chapter 6) as examples. While he touches upon the effects of pollution only briefly in the earlier chapters, he expands upon it in chapter 7, and those earlier chapters provide an excellent context for the chemicals we deem pollution. An important point he makes is that many of the chemicals we release into the environment are already present naturally, but through feedback Gaia regulates their amounts. Pollution, he explains, is thus an anthropocentric concept and “may even be irrelevant in a Gaian context” (p. 103). The problem, as he sees it, is that we can disrupt Gaia’s mechanisms through positive feedback, increasing these chemicals to the point where they become destructive. Species modify their environment (as Lynn Margulis pointed out to him), and even though some may believe we are divided from nature, ultimately we are not: we remain connected to nature by what we do to it.
Lovelock concludes that to succeed in our environmental efforts we need a clear understanding of the world around us, and in his epilogue he suggests that perhaps our destiny lies in becoming tamed ourselves, so that we become a community living in harmony with Gaia.

The media today constantly bombards us with messages on climate change: the latest policy updates, the battle with those who deny climate change, novel preventative methods, and even whether we can find a way to live sustainably in a world driven by economic growth (Happy Planet, New Scientist, 5 July 2014, p. 31). We seem to be divided into those who have a positive view of how much we can change and those who are cynical, a group which includes Lovelock himself. James Lovelock has since published several follow-up books, and back in April an exhibition honouring him and his life’s work was opened at the Science Museum (James Lovelock reflects on Gaia’s Legacy). In an age where we are battling tooth and nail against climate change, this classic book takes us back to the time when the seed of environmentalism was sown, and it reminds us of the bigger picture of what we are aiming to achieve for ourselves and future generations.


Some sense in sensory deprivation

How would you cope if you couldn’t hear, see or feel anything? How do sensory systems react when they have no information to process? Such questions may seem rather bizarre, but they are in fact the topic of sensory deprivation research. Sensory deprivation involves systematically preventing information from reaching one or more sensory modalities. As a research methodology it has a long and chequered history.

Some early sensory deprivation studies were funded by the CIA with the purpose of identifying effective methods of interrogation (1, 2). Such experiments involved attempts to remove all sensory information from an individual. Participants were left for long periods in secluded, sound-proofed rooms, wearing cardboard sleeves to reduce tactile stimulation and goggles to reduce visual stimulation. For the unfortunate participants, the consequences of being exposed to these conditions were often hallucinations, anxiety and mental deterioration. The extent of the volunteers’ suffering was such that they were (eventually) paid compensation in recognition of their maltreatment. Indeed, the controversy surrounding these sorts of experiments led to the introduction of stricter ethical guidelines to prevent unwitting participants in behavioural research from being exposed to potential harm or distress (1).

Although these early, total sensory deprivation experiments were clearly unacceptable, more subtle forms of sensory deprivation are still used in human research. Sensory deprivation can, if implemented correctly, reveal important information about the functioning of the nervous system without causing any harm to those participating in the research. One example of the positive use of sensory deprivation concerns its role in improving our understanding of tinnitus. Back in the 1950s it was discovered that people with normal hearing often began to experience hallucinatory sounds similar to tinnitus if they were left in a silent, sound-proofed room (3). This finding led to the idea that hearing loss may contribute to the development of tinnitus. However, most tinnitus sufferers do not exhibit the sort of total hearing loss that is mimicked by a silent, sound-proofed room. Indeed some tinnitus sufferers demonstrate very good hearing! It appears therefore that processes other than hearing loss must be involved in the development of tinnitus.

Modern theories of tinnitus point the finger at the process of homeostatic plasticity: the mechanism by which the equilibrium of neurological systems is maintained through adjustments to physiological processes (4). In the same way that a thermostat alters the activity of a heating system to maintain a certain temperature, the brain is thought to modify the degree of spontaneous firing within neuronal populations in order to maintain a consistent level of activity. In response to damage to the auditory nerve, homeostatic plasticity may cause the brain to apply a ‘gain’ to ongoing spontaneous activity within the auditory system. While this gain may reduce, or even remove, the impact that the nerve damage has on hearing ability, it may also induce tinnitus by elevating baseline neural activity to a level similar to that evoked by genuine sounds (4).
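This gain mechanism can be sketched in a few lines of code. The sketch below is loosely inspired by the computational model in reference 4 but greatly simplified, and every number in it is invented for illustration: the brain picks a central gain that keeps average auditory activity at a fixed target, and because hearing loss attenuates the sound-evoked input but not the spontaneous firing that the gain also amplifies, the compensation leaves spontaneous activity elevated in silence.

```python
TARGET_RATE = 10.0   # mean activity level the system tries to maintain
SPONTANEOUS = 2.0    # spontaneous firing that persists even in silence

def homeostatic_gain(mean_sound, loss):
    """Gain that restores TARGET_RATE given an average sound level and a
    peripheral hearing-loss factor (1.0 = intact, 0.5 = half the input)."""
    return TARGET_RATE / (mean_sound * loss + SPONTANEOUS)

def activity(sound, loss, gain):
    """Central activity: the gain scales both the (attenuated)
    sound-evoked input and the unattenuated spontaneous firing."""
    return gain * (sound * loss + SPONTANEOUS)

g_healthy = homeostatic_gain(mean_sound=8.0, loss=1.0)   # gain = 1.0
g_damaged = homeostatic_gain(mean_sound=8.0, loss=0.5)   # gain ~ 1.67

# In silence, the healthy system sits at its low baseline...
quiet_healthy = activity(0.0, 1.0, g_healthy)            # 2.0
# ...but the damaged system's elevated gain amplifies spontaneous
# firing towards sound-evoked levels: a tinnitus-like phantom signal.
quiet_damaged = activity(0.0, 0.5, g_damaged)            # ~ 3.33
```

The design point is that the spontaneous component bypasses the peripheral attenuation, so the compensatory gain overshoots it: activity in silence rises from 2.0 to roughly 3.3, closer to the level normally evoked by real sound.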

Scientists funded by the British Tinnitus Association recently used a more refined sensory deprivation methodology to test whether homeostatic plasticity contributes to tinnitus (5). Eighteen participants were asked to wear an earplug in one ear for a week. The earplug was specially designed to mimic the sort of high-frequency hearing loss that commonly occurs due to old age or noise damage. Altering the input in just one ear not only minimised the inconvenience for participants, it also allowed the effects of the auditory deprivation to be ethically tested over a far longer period than would otherwise be possible with more substantial deprivation. The methodology therefore provided a far more naturalistic assessment of the effects of hearing loss on the auditory system.

Fourteen of the 18 participants reported experiencing hallucinatory sounds during the week, with most of the sounds taking a form similar to those experienced in tinnitus. This confirmed that real-life forms of hearing loss are capable of inducing tinnitus-like symptoms. Crucially, the pitch of the hallucinated sounds matched the frequency spectrum of the deprivation induced by the earplugs: selective loss of hearing for high frequencies produced mainly high-frequency hallucinatory sounds. As the type of hearing loss induced in this study should only provoke homeostatic changes in the neuronal populations that process high-frequency sounds, this finding supports the idea that homeostatic plasticity contributes to the development of tinnitus.

Despite its somewhat inglorious history, sensory deprivation remains an extremely important methodological tool. By removing the influence of external stimuli, sensory deprivation provides a clearer view of the workings of internally-driven neurological processes such as homeostatic plasticity. As neurological disorders are often characterised by dysfunctions in these internal processes, sensory deprivation studies can provide invaluable insight into the causes of such disorders.

More information about research into the causes of tinnitus is available at http://www.tinnitus.org.uk/the-search-for-a-cure-1


(1) McCoy, A. W. (2007). Science in Dachau’s shadow: Hebb, Beecher and the development of CIA psychological torture and modern medical ethics. Journal of the History of the Behavioral Sciences, 43(4), 401-417. <Link>

(2) Klein, N. (2007). The Shock Doctrine: The Rise of Disaster Capitalism (1st ed.). New York: Metropolitan Books/Henry Holt. <Link>

(3) Heller, M.F. & Bergman, M. (1953). Tinnitus Aurium in normally hearing persons. The Annals of Otology, Rhinology and Laryngology, 62 (1), 73-83 <Link>

(4) Schaette R, Kempter R (2006) Development of tinnitus-related neuronal hyperactivity through homeostatic plasticity after hearing loss: a computational model. Eur J Neurosci 23: 3124–3138. <Link>

(5) Schaette R, Turtle C, Munro KJ (2012) Reversible Induction of Phantom Auditory Sensations through Simulated Unilateral Hearing Loss. PLoS ONE 7(6): e35238. doi:10.1371/journal.pone.0035238 <Link>

A Child’s Curiosity

“What is gravity?” asks my then four-year-old nephew to his family.
Quite an inspirational question from a four-year-old! But then again, at the time they were all watching the highly acclaimed Cosmos, the science documentary hosted by Neil deGrasse Tyson. A child’s curiosity clearly knows no bounds when it comes to the world around them, and I am astounded by the equally captivating questions his brother asks me (“What does e in math[s] mean?”).
It is sad that in later years children eventually lose their interest in science. In 2008, The Telegraph published a report with statistics showing that children were losing interest between primary and secondary school (42% of 9-year-olds were interested in science, but this dropped to 38% of 12-year-olds and 35% of 14-year-olds). In 2013 Ofsted published an assessment of science education, with recommendations, based on a survey [Maintaining curiosity: a survey into science education in schools] conducted between 2010 and 2013. CaSE in London responded positively to this report and has called for the UK Government to heed its recommendations.

I had the chance to observe such inspirational curiosity during my Easter holidays, when I visited my cousin and her children (hereafter referred to as my niece and nephews) in Boston, a centre of scientific excellence. It really doesn’t get any better than that! One of the things I was particularly looking forward to was extracting DNA from strawberries with them, something I had previously done with Science Brainwaves at a school outreach event in 2011. Just telling my 11-year-old niece and 9-year-old nephew about this got them really excited. When I told them their dining table needed to be cleaned, my nephew (who prefers to kick a football around the house, much to his parents’ disapproval) was so eager that he could not clean the table fast enough! I complemented this little experiment with a brief 101 on DNA and PCR (the polymerase chain reaction, which copies DNA). When I interviewed them later about it, they told me they loved it and would gladly do it again. Score 1 for science! If doing practical experiments benefits children this much, then Ofqual’s removal of practical work from A-level exams in the UK certainly seems dubious. Naturally, this move has been criticised, and on 12th May 2014 a hearing was hosted by the Commons Select Committee to discuss the proposal.

I also had the opportunity to visit some of the scientific attractions of Boston with the kids: the New England Aquarium and the Museum of Science. The trip to the aquarium was definitely worthwhile. My nephew had his own epiphany linking the ecology of sharks and their prey to the food web, as he dictated later in the evening. I was amazed, and a little scared, when I had the chance to touch stingrays and a baby shark in the giant ‘touch tank’. But I wasn’t the only one in seventh heaven. It was interesting to find out that my niece actually wants to be a marine biologist. Score 2 for science. I asked my 9-year-old nephew what he wanted to do and he replied with a list of potential careers: archaeologist, cosmologist or footballer. I was completely honest with him: he might make more money as a footballer, and there probably won’t be many opportunities in archaeology, but cosmology might be more fulfilling. He didn’t seem too upset or waylaid by this. I neglected to ask the youngest, as at the time he was 4 years old and most of his ambitions involved chocolate and tickling. I will ask him another time.

When it comes to encouraging them into science, my cousin and her husband do not hold back! Amazed by my nephew’s questions, the family is now subscribed to Scientific American, and together they all sit down to watch Cosmos. This documentary, playing on Fox and National Geographic, follows on from the Carl Sagan version which aired in 1980. I watched the second [Some of the Things That Molecules Do] and fourth [A Sky Full of Ghosts] episodes, glad that the second provided a neat and tidy introduction to genetics and evolution, the ideal precursor to extracting DNA from strawberries. I was really astounded at how well the show pitched itself to both adults and children. The parents and all three kids were glued to the screen, and I found the balance between explanation and visuals in complete harmony. But of course, a documentary like this does not go unnoticed by the Creationists in the USA, especially when evolution is involved. In an article in the Huffington Post, the show was criticised by Creationists for not giving due time to Creationism. I asked my niece and nephew what they thought, and they made the same argument any scientist would: why give inaccurate information to the public, especially when there is evidence to back up what we already know? Score 3 for science.

For the final flourish in this scientific journey with my niece and nephew, I told them that I write a monthly blog for Science Brainwaves, asked them to read Resurrection! Bringing extinct species back from the dead, and got their thoughts and feedback. They would both like to see the sabre-toothed cat brought back (clearly indicating they have not seen Jurassic Park). My niece is for bringing species back from the dead, as she thinks it would make great study material. My nephew, on the other hand, is a lot more cautious. I don’t think I can score this as either for or against science, but I’m definitely happy to see this younger generation at least considering the ethics. Either way, the final score is 3 for science, I’m still the Cool Fun Aunt, and now I can happily watch and nurture their scientific ambitions into reality.


DNA Extraction

Doing DNA extraction with my niece and nephew


The touch tank at the New England Aquarium


Interbreeding humans: The Sassy Palaeolithic Action

Are you confused about the different types of humans that got it on with each other?
If so, this blog entry provides a factsheet-style summary of the pairs of hominids that interbred, with the evidence for and against each pairing and the current conclusion. All dates are expressed as years before present. Be aware that in the field of ancient DNA, human contamination is always an issue that can potentially confound results.

Hominid Action Pair #1: Neanderthals + Humans (non-Africans)
Home turf: Europe/Asia | Global
Lived from: 400,000–30,000 | 200,000–present
When and where: 47,000–65,000, Middle East
Evidence for getting it on: (see figure) If the different colours represent different genetic make-up, then the answer to the question “Where did the modern humans get their genes from?” can only be “from mating with Neanderthals”.

Evidence against getting it on: (see figure) This model clearly produces the same pattern, but without mating.

Current status: The literature arguing against interbreeding has focused on the pitfalls of the methods used. Nevertheless, new methods are now confirming that Neanderthals and humans were up in a tree K.I.S.S.I.N.G. We are now starting to identify which genes we got from them.


Hominid Action Pair #2: Denisovans + Humans (Asians/Oceanians)
Home turf: Siberia (Tropics?) | Oceania
Lived from: ?–~50,000 | 200,000–present
When and where: S.E. Asia, prior to 44,000
Evidence for getting it on: It’s the same case as the evidence for mating between Neanderthals and humans, except here the genes that went into the ancestral populations of the Aboriginal Australians, Near Oceanians, Polynesians, Fijians, East Indonesians, and Philippine Mamanwas and Manobos can only have come from the Denisovans.
Evidence against getting it on: While alternative models are presented in the literature for discussion, they have been argued against. The consequence is that the interbreeding scenario is described as the best fit to the data.
Current status: We know that the action took place, BUT we don’t know enough about the Denisovans; we only have a tooth and a small finger bone from a Siberian cave. Watch this space!


Hominid Action Pair #3: Denisovans + Homo erectus?
Home turf: Siberia (Tropics?) | Africa, China, Indonesia
Lived from: ?–~50,000 | 1.9 million–150,000
When and where: ? and Asia?
Evidence for getting it on: The Denisovan tooth has some features found in older Homo species, and the DNA also appears quite archaic. Homo erectus was widespread across Asia, so it does seem likely that the two hominids crossed paths.
Evidence against getting it on: At the minute there isn’t any; this model was proposed as the most likely explanation for a result that arose from another study.
Current status: As said before, we don’t know much about the Denisovans. There are still a lot of gaps, making this a hypothesis.


Hominid Action Pair #4:             Neanderthals        +       Denisovans
Home-turf:                                    Europe/Asia              Siberia (Tropics?)
Lived from:                                400,000 – 30,000              ? – ~50,000
When and where:                                    ? and Europe/Asia?
Evidence for getting it on: Neanderthal DNA has been detected in the Denisovan genome.
Evidence against getting it on: Not presented.
Current Status: This has just recently been published, and the amount of Neanderthal DNA that entered the Denisovan gene pool is calculated to have been very small. More analyses will have to be done.


Here is a final tree summarising the relationships between the hominids, and the arrows indicate where interbreeding took place. Adapted from Prufer et al 2014.

Summary

While I used a lot of literature to create the above summary, if you are looking for more information then I recommend Veeramah and Hammer 2014, Nature Reviews Genetics 15, 149-162, which provides an up-to-date, informative review while also including relevant references.

Resurrection! Bringing extinct species back from the dead

My best friend has nightmares about dinosaurs; T. rexes chasing and searching for her as she hides behind furniture. I don’t blame her. Courtesy of Jurassic Park, I’m sure there are plenty of others who have had similar nightmares. During my PhD, I once had a nightmare that Neanderthals were back and trying to take over Europe. But is it really possible to bring back species from the dead?

At first glance, it seems to be more difficult than the movies give credit for. In 2009, the New Scientist published an article outlining the method for any Dr Moreau wannabes, but also why the technology is not available for it to work:   

1- Obtain a complete and accurate genome

2- Package and assemble this genome into chromosomes

3- Identify a suitable surrogate to provide the egg and to gestate the embryo to full term

But as technology moves at a fast pace, the limits of scientific accomplishment are being tested further. Revive and Restore is a project aiming to push the boundaries for resurrecting extinct species. Inspiration for the project came from Martha, the last passenger pigeon (a species hunted to extinction for its meat), who died in Cincinnati Zoo in 1914. Several species have since been chosen as candidates for resurrection, including a range of birds, the quagga, the Easter Island palm and several Pleistocene mammals. The first meeting, held on 8 February 2012, brought together conservation biologists and geneticists to assess the feasibility of resurrection. Since then the project has established a list of criteria to examine the suitability of each candidate species, e.g. how will bringing back the species answer scientific questions, is it possible to re-wild the species and, if selecting a species further back in time, is there enough preserved DNA? Sure enough, in 2013 an article, What If Extinction Is Not Forever?, appeared in Science discussing the risks versus the benefits. The objections fell into five categories: animal welfare, health, environment, political and moral. The benefits included scientific knowledge, technological advancement, environmental benefits, justice and the “cool” aspect of it.

One of the reasons why the passenger pigeon was selected is that there are plenty of samples young enough to yield good-quality DNA. The key issue therefore lies in the survivability of ancient DNA sequences. The potential rests firstly in obtaining increasingly older DNA sequences and secondly in ensuring that the genome is of high quality. The oldest DNA sequence obtained to date is the 700,000-year-old horse genome from Canada, a far cry from what was once a hypothesised maximum of 100,000 years (Shapiro and Hofreiter, 2014). Sequencing technologies have also drastically increased genome coverage (how many times each position is “read”: a bit like how each time you re-read a book you pick up more information, and with better accuracy). The Denisovan genome, for instance, was initially sequenced at 1.9-fold coverage, which increased dramatically to 30-fold once the new techniques were applied. Nevertheless, even with the new technology for analysing fragmented DNA, Shapiro and Hofreiter (2014, p. 1236573-2) state that “it may not be possible to sequence any eukaryotic palaeogenome truly to completion”, a sad case indeed for the 11-year-old boy who once asked me at a Science Brainwaves workshop, “can we bring monsters back?” Assuming he meant dinosaurs, and without getting into the logistics with him, I gave him the theoretical but not entirely true response of ‘yes’. He then exclaimed “COOL” really loudly and wandered off. To be honest, I didn’t really want to break his heart, and I may just have inspired him to become a scientist! A similar question was posed (in a considerably more professional manner) at the Royal Society meeting Ancient DNA: the first three decades, held in London in November 2013.
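To make the “fold coverage” idea concrete: coverage is simply the total number of bases sequenced divided by the genome length. A minimal sketch, using round hypothetical numbers (not the actual figures from the Denisovan project):

```python
def mean_coverage(num_reads, read_length, genome_length):
    """Average sequencing depth: total bases read divided by genome size."""
    return num_reads * read_length / genome_length

# Hypothetical example: 1.5 billion reads of 60 bp over a ~3 Gbp genome
depth = mean_coverage(1_500_000_000, 60, 3_000_000_000)
print(depth)  # 30.0, i.e. roughly "30-fold" coverage
```

Higher coverage means each position is read more often, so sequencing errors at any single read are more easily averaged out.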
Towards the end of the talk The Future of aDNA by Professor Michael Hofreiter, a colleague from UCL asked “can we reconstitute from ancient DNA (putting aside the technical details) a viable sequence to create a viable organism?” The answer was a resounding no. When it came to resurrecting dinosaurs, Professor Hofreiter quite clearly stated that the field of ancient DNA will never be able to go that far back in time. With an added flourish came the phrase “don’t waste your time or money!”

So should we be worried about dinosaurs chasing us, or Neanderthals taking over? Even with the advancement of technology opening up the possibility of resurrection, only species that have not been extinct for very long, such as the dodo or the passenger pigeon, could be brought back. The debate, however, continues, and it is through that debate that scientists, policy-makers and the public can work together to ensure that this new science is used in a mature and sensible way. As for my best friend, she is relieved and hasn’t had a dinosaur nightmare since.

Jurassic Park, copyright IMDB Universal Pictures 2012


Animal intelligence.

I have covered a wide range of topics throughout my Biomedical Science degree, but one thing that really sparks my interest, even though it isn’t strictly biomed-related, is the study of animal behaviour.
Animal rights have recently been a hot topic, especially after a viral article about the dolphin-killing ritual in the Faroe Islands shocked many. One main reason underlying animal cruelty may be that animals’ value is often under-appreciated and their intelligence overlooked, as it is not immediately obvious. Animals are able to show emotions and even compassion for their own species, as well as between species.
Whether it’s little zebrafish showing aggressive behaviour in a mirror test, or elephants showing caring behaviours, all sorts of animals are capable of actions that resemble many types of human emotional behaviour.

Everyday birds seen as street nuisances, such as pigeons, are often underrated in their abilities because they are viewed as vermin. Their best-known ability is remarkably accurate homing. This may rely on their sense of smell, spatially mapping the ‘olfactory positions’ of locations, or, as an older and less well-received theory suggests, on magnetic field detection.
Pigeons also show other capabilities, e.g. choosing between two options, tested with a string test on a touchscreen:

pigeon touchscreen
The test involved choosing between a string attached to an empty animated box (red string) and one attached to a box that appeared to contain food (green). This demonstrated pigeons’ ability to learn and to associate a choice with a food reward.
They are also able to pass the mirror test, as they can recognise their own reflection. However, this finding has been challenged, as the pigeons in the original experiments were trained to respond to a mirror. Although simple, passing these tests shows a lot of potential.

Crows are able to commit to memory faces associated with negative emotion or danger, and call other crows into the vicinity to learn and ‘mob’ the dangerous face. These memories may last a lifetime, so they can spread widely through the community via social learning.
Other species, like primates and cetaceans, are fairly well known for their ability to show complex behaviours.

Capuchin monkeys are known for displaying playful and lively behaviours to attract a mate. A recent amusing article showed how females can be seen throwing stones at males as a form of ‘flirtatious’ behaviour. It is one of the only ways they can attract a mate, as there are no physical characteristics displayed in this species before copulation. For males it is much easier: rubbing their bodies with urine is effective in attracting females.
Dolphins also show complex behaviours, e.g. communicating through their own series of clicks and whistles, each with a different interpretation. Another very interesting learnt behaviour is protecting their young’s snouts with sponges when teaching foraging techniques. Recently, though, people have started to question whether dolphins are really as smart as they are made out to be, since they share some social characteristics with chickens, e.g. roaming in large groups and empathetic responses, and so may be on the same intelligence wavelength, which in my opinion should not be any more of a reason for their abuse. This has led to suggestions, which I agree with, that animals should be appreciated for their different types of intelligence rather than placed on a hierarchy-type scale.

This may rule out any ‘classification’ of dolphins as ‘non-human persons’. The cetacean family, to which whales and dolphins belong, is highly respected for its complex intelligence, to the extent that some countries have decided to ban captivity and the use of dolphins for entertainment, on the grounds that their intelligence could entitle them to the same rights as a ‘non-human person’.

Across many species of varying family and size, there is an abundance of complex behaviours analogous to human ones, ranging from a caring gesture to heart-wrenching displays of empathy, even towards other species.
With many different animals, research and analysis will continually reveal that there is more than meets the eye. Behaviour is a very complex topic, with opinions coming from many different perspectives, leading to differing definitions of the term intelligent.

Image taken is a screenshot from video: TrendVideos32. 2013. Pigeons master touchscreen intelligence test. [Accessed 06 Feb 2014] from: http://www.youtube.com/watch?v=KEVUD9UM-FA

Science on the Origin of Life

Since the dawn of civilisation and the dreaming up of our early creation myths, the philosophical and scientific debate over the origin of life has enchanted people worldwide. In the thousands of years since early humans prayed to sky gods, have we got any closer to determining how life originated on Earth? And can we even prove any of the theories through the scientific method?

The most widely held theory is abiogenesis: the idea that the conditions present on the early Earth when life was beginning, such as electrical activity and a dense atmosphere, resulted in the spontaneous creation of the building blocks of life. When these early conditions were replicated in the lab, in the iconic Miller-Urey experiment of 1953, some ingredients for life, such as amino acids, were produced.

The one major problem with this theory is that, just as in cooking, adding the ingredients together doesn’t automatically make the meal. Life is amazingly complex and intricate. Having the building blocks doesn’t account for how they organised themselves into the patterns that we call life.
Scientists are working to fill in this gap, producing theories based on the original abiogenesis. These ideas attempt to explain how order was achieved. These include the Deep Sea Vent hypothesis, the Coenzyme and RNA world hypothesis, and the Iron-Sulfur World theory. Other theories stray away from the abiogenesis idea, such as Autocatalysis Clay hypothesis, Gold’s “Deep-Hot Biosphere” model, Lipid world and Polyphosphates, to name a few.

One theory looks beyond the Earth for the origin of life. This is known as panspermia, the theory that life originated in space. The main evidence behind the theory is the reported presence of dead microbes and fossils in debris in the stratosphere. However, this evidence has faced harsh criticism from the scientific world.

There are many questions yet to be answered by panspermia. In this theory, life, in the form of microbes, came to Earth piggybacking on meteors and asteroids. How did the microbes survive the harsh conditions of space, and the harsher conditions of entering or exiting an atmosphere? Where did they come from? How did they then survive on a barren planet long enough to divide and evolve? This theory doesn’t really solve the fundamental question of where life originated, but it does extend the window of time over which life could have formed. In the history of the universe, the Earth is relatively new.
Science is yet to form a watertight theory on the origin of life, and it is questionable whether it ever will. The many different ideas compete over which comes closest to the events that occurred billions of years ago. Without concrete evidence, all we can do is continue to develop these ideas based on theories and assumptions.

Neolithic Revolution in the air!

THE NEOLITHIC REVOLUTION WAS A KEY MOMENT IN THE PREHISTORY OF HUMANS. It sparked civilisation as we know it- settlements were established, crops were grown and animals were domesticated transforming the economy of subsistence globally. Beginning in the Levant (Near East) around 12,000 years ago, the Neolithic Revolution spread into Europe 8000 years ago and lasted up until 4000 years ago when the Bronze Age began.

The major question is how this revolution spread. Did the indigenous hunter-gatherers adopt farming solely through cultural transmission? Or did the farmers pass on their practices alongside their genes? These two models (see diagram), the culturally diffused model (CDM) and the demic diffused model (DDM), originally seen as polar-opposite mechanisms of the spread, have been debated throughout the 20th century. By identifying the proportion of Mesolithic/hunter-gatherer and Neolithic/farmer genes within the current gene pool (see diagram), the correct model could be identified.
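The logic of that last step can be sketched with a toy calculation. If an admixed population’s allele frequency at a marker is a weighted mix of the two source populations’ frequencies, the farmer fraction falls out algebraically. All numbers below are hypothetical and purely illustrative; real studies use many markers and far more sophisticated statistical models.

```python
def farmer_fraction(p_admixed, p_farmer, p_hunter):
    """Solve p_admixed = a * p_farmer + (1 - a) * p_hunter
    for a, the proportion of the gene pool tracing to farmers."""
    return (p_admixed - p_hunter) / (p_farmer - p_hunter)

# Hypothetical allele frequencies at a single marker
a = farmer_fraction(p_admixed=0.55, p_farmer=0.8, p_hunter=0.3)
print(round(a, 2))  # 0.5, i.e. an even mix under this toy model
```

A fraction near 1 would favour the DDM (farmers replaced the local gene pool), while a fraction near 0 would favour the CDM (locals adopted farming but contributed the genes).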

Classical genetic markers in present-day populations (such as blood groups) appear to lend support to the DDM, revealing a genetic cline from the Near East towards the west. But modern genetic markers can reflect population processes that took place both before and after the Neolithic spread. Instead, ancient DNA (aDNA) provides a unique window of opportunity to look back into the past. Ancient DNA studies are, however, fraught with difficulties: over time DNA degrades and fragments into short molecules, which usually means any contaminating modern DNA is preferentially extracted and analysed instead. Nevertheless, strict and rigorous protocols exist to minimise contamination, and new technology has been optimised for aDNA extraction.

The archaeological record has shown that as farmers migrated across Europe, two different routes were taken, as indicated by distinct ceramic styles. One route was through Central Europe, from Hungary to Slovakia, Ukraine and through to Paris, as shown by the Linearbandkeramik (LBK) and Alföldi Vonaldiszes Kerámia (AVK) pottery styles. The other route, represented by the Impressed Ware/Cardial culture, was along the Mediterranean coast. aDNA studies have been conducted on samples from these different sites and cultures, and the picture that emerges is more complex than simply picking one model over the other. It certainly appears that the two routes have their own models: while the Central European/LBK route shows little to no genetic continuity between the Mesolithic hunter-gatherers and the Neolithic farmers, the Mediterranean route tends towards genetic continuity, and therefore a level of gene flow between the two populations, a pattern which even seems to extend up into Sweden.

But this most certainly is not the end of the story. For one thing, the genetic studies carried out were analysing mitochondrial DNA (mtDNA), which is inherited solely down the female line (men inherit their mothers’ mitochondrial DNA but do not pass it on). In one study of Spanish Neolithic samples, while the mtDNA belonged to hunter-gatherer groups from the Palaeolithic, the Y chromosome was shown to derive from the Neolithic Near East. This does seem to suggest that the roles of men and women during the advance of the Neolithic differed to some extent. Additionally, it appears that the change to farming practices did not happen as rapidly as expected, and was not as clear cut. Two recent papers (with a particular focus on Germany) found that hunter-gatherers and farmers lived alongside each other for about 2,000 years and, interestingly, that while the Mesolithic hunter-gatherers and the Neolithic farmers had their own distinctive gene pools, at some point in the Neolithic there were intermediary groups with shared ancestry and lifestyle, undoubtedly reflecting the transition that was taking place.

There is a level of difficulty in studying the past; we cannot always state processes or cause and effect with a perfect degree of certainty, but we can say what the evidence appears to suggest, and in this case it suggests a high degree of complexity as the Neolithic Revolution took hold. There is never just one specific model that can answer our questions, and there will always be other lines of evidence to explore. The original question of how the Neolithic Revolution spread cannot be settled with one simple answer. It is never that easy. But as aDNA analyses show, we can still get one step closer to that very complicated answer.

More information:

Bollongino et al 2013 Science 342 (6157) 479-481

Brandt et al 2013 Science 342 (6155) 257-261

Gamba et al 2011 Molecular Ecology 21 (1) 45-56

Haak et al 2005 Science 310 (5750) 1016-1018

Lacan et al 2011 PNAS 108 (45) 18255–18259

Pinhasi et al 2012 Trends in Genetics 28 (10) 496-505

Skoglund et al 2012 Science 336 (6080) 466-469