Neglected Baby Beetles Evolve Greater Self-Reliance

Sexton beetle and larva.
source: www.cam.ac.uk

A new study reveals that when burying beetle larvae are denied parental support, they evolve bigger jaws to compensate.

Our ongoing research investigates the importance of the social environment in evolution

Rebecca Kilner

In gardens, parks and woods across the UK, the Sexton burying beetle Nicrophorus vespilloides quietly buries dead mice and other small vertebrates to create edible nests for its young.
Most parents remove the animal’s hair and slash the flesh of the carcass to help their newly-hatched larvae crawl inside. Typically they also stay on to defend and feed them, but levels of care vary and larvae can survive without their parents.
In a laboratory in Cambridge’s Zoology Department, researchers exploited the insect’s unusual natural history to establish two starkly different experimental populations and explore how parental behaviour drives evolution.
The study, published today in the journal Nature Communications, shows that larvae evolve distinctive adaptations in response to the different levels of parental care.
The scientists behind the research exposed hundreds of beetles to two levels of parental care, over 13 generations. In a No Care environment, parents were removed as soon as they had prepared their mouse carcass nest but before their larvae had hatched. By contrast, in the Control environment, the parents were allowed to care for their young until they were ready to leave home.
The researchers found that when parents fed meat to their babies mouth-to-mouth, the larvae evolved relatively smaller mandibles. These horizontally aligned, blade-like jaws play a vital role in the larvae's lives, enabling them to enter the carcass and feed on the flesh once inside, but they are less important when parents help their young to feed.
“By contrast, when the parents were removed from their young and larvae were forced to self-feed, the larvae evolved significantly larger jaws to compensate for the lack of help,” said Benjamin Jarrett, who led the study.
Many previous studies have shown that social interactions in animals can drive evolutionary change through arms races which cause traits to become increasingly exaggerated. But animals also cooperate and it has been argued that when one individual contributes more, this can diminish traits in the less active social partner. Rarely, however, has direct evidence of this process been obtained.
So what are the larval mandibles like in natural populations, where the level of parental care is very variable from family to family? Here the researchers found that larval jaws are consistently large on average, regardless of the size of the larva.
“They seem to be anticipating the worst possible scenario of receiving no help at all. This looks like a conservative bet-hedging strategy for survival,” said Jarrett.
“Whether parents eventually decide to stay or go, the larvae are equipped with large jaws and so can fend for themselves if necessary.”
The laboratory’s experimental populations of beetles are continuing to evolve and are now in the 35th generation of experiencing different levels of parental care.
“Our ongoing research investigates the importance of the social environment in evolution. We are watching the way that evolution unfolds in these experimental populations and they constantly teach and surprise us,” said Professor Rebecca Kilner, senior author of the paper.
“The better our understanding of how evolution works, the better able we are to predict how animals will evolve in a changing world”.

Reference:
Benjamin Jarrett et al. ‘A sustained change in the supply of parental care causes adaptive evolution of offspring morphology.’ Nature Communications (2018). DOI: 10.1038/s41467-018-06513-6


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. Images, including our videos, are Copyright ©University of Cambridge and licensors/contributors as identified.  All rights reserved. We make our image and video content available in a number of ways – as here, on our main website under its Terms and conditions, and on a range of channels including social media that permit your use and sharing of our content under their respective Terms.

Women Much Less Likely To Ask Questions In Academic Seminars Than Men

A seminar audience in Austin, Texas, United States
source: www.cam.ac.uk
A new study reveals a stark disparity between male and female participation in a key area of academic life and offers recommendations to ensure all voices are heard.

Junior scholars are encountering fewer visible female role models

Alecia Carter

Women are two and a half times less likely to ask a question in departmental seminars than men, an observational study of 250 events at 35 academic institutions in 10 countries has found.
This disparity exists despite the gender ratio at these seminars being, on average, equal. It also reflects significant differences in self-reported feelings towards speaking up.
The research, led by a then Junior Research Fellow at Churchill College, University of Cambridge, adds to a growing body of evidence showing that women are less visible than men in various scientific domains and helps to explain the “leaky pipeline” of female representation in academic careers. Women account for 59 per cent of undergraduate degrees but only 47 per cent of PhD graduates and just 21 per cent of senior faculty positions in Europe.
The bias, identified in a paper published today in PLOS ONE, is thought to be particularly significant because departmental seminars are so frequent and because junior academics are more likely to experience them before other kinds of scholarly events. They also feature at an early stage in the career pipeline when people are making major decisions about their futures.
“Our finding that women ask disproportionately fewer questions than men means that junior scholars are encountering fewer visible female role models in their field,” warns lead author, Alecia Carter.

Self-reported behaviour and perceptions

In addition to observational data, Carter and her co-authors drew on survey responses from over 600 academics ranging from postgraduates to faculty members (303 female and 206 male) from 28 different fields of study in 20 countries.
These individuals reported their attendance and question-asking activity in seminars, their perceptions of others’ question-asking behaviour, and their beliefs about why they and others do and do not ask questions.
The survey revealed a general awareness, especially among women, that men ask more questions than women. A high proportion of both male and female respondents reported sometimes not asking a question when they had one. But men and women differed in their ratings of the importance of different reasons for this.
Crucially, women rated ‘internal’ factors such as ‘not feeling clever enough’, ‘couldn’t work up the nerve’, ‘worried that I had misunderstood the content’ and ‘the speaker was too eminent/intimidating’, as being more important than men did.
“But our seminar observation data show that women are not inherently less likely to ask questions when the conditions are favourable,” says Dieter Lukas, who was a postdoctoral researcher at Cambridge during the data collection.

Question-asking behaviour

The researchers found that women were more likely to speak up when more questions were asked. When 15 questions were asked in total, as opposed to the median of six, there was a 7.6 per cent increase in the proportion of questions asked by women.
But when the first question in a seminar was asked by a man, the proportion of subsequent questions asked by women fell by six per cent, compared to when the first question was asked by a woman. The researchers suggest that this may be an example of ‘gender stereotype activation’, in which a male-first question sets the tone for the rest of the session and dissuades women from participating.
“While calling on people in the order that they raise their hands may seem fair, it may inadvertently result in fewer women asking questions because they might need more time to formulate questions and work up the nerve,” said co-author Alyssa Croft, a psychologist at the University of Arizona.
The researchers were initially surprised to discover that women ask proportionally more questions of male speakers and that men ask proportionally more of female speakers.
“This may be because men are less intimidated by female speakers than women are. It could also be the case that women avoid challenging a female speaker, but may be less concerned for a male speaker,” said co-author Gillian Sandstrom, a psychologist at the University of Essex.
Linked to this, the study’s survey data revealed that twice as many men (33 per cent) as women (16 per cent) reported being motivated to ask a question because they felt that they had spotted a mistake.
Women were also more likely to ask questions when the speaker was from their own department, suggesting that familiarity with the speaker may make asking a question less intimidating. The study interprets this as a demonstration of the lower confidence reported by female audience members.
Welcoming the research, Professor Dame Athene Donald, Professor of Experimental Physics at the University of Cambridge and Master of Churchill College, Cambridge, said:
“Asking questions at the end of talks is one of the activities that (still) makes me most nervous … Whatever anyone may think when they meet me about how assertive my behaviour is, it would seem that I too have internalised this gender stereotype.”

Recommendations

“This problem can only be addressed by lasting changes in the academic culture which break gender stereotypes and provide an inclusive environment,” Alecia Carter says.
The researchers accept that this will take time but make four key recommendations to improve the situation in departmental seminars:
  • Where possible, seminar organisers should avoid placing limits on the time available for questions. Alternatively, moderators should endeavour to keep each question and answer short to allow more questions to be asked.
  • Moderators should prioritise a female-first question, be trained to ‘see the whole room’ and maintain as much balance as possible with respect to gender and seniority of question-askers.
  • Seminar organisers should not neglect to invite internal speakers.
  • Organisers should consider providing a small break between the talk and the question period to give attendees more time to formulate a question and try it out on a colleague.
“Although we developed these recommendations with the aim of increasing women’s visibility, they are likely to benefit everyone, including other underrepresented groups in academia,” said Carter.
“This is about removing the barriers that restrain anyone from speaking up and being visible.”
Reference:
Alecia J. Carter, Alyssa Croft, Dieter Lukas, Gillian M. Sandstrom, ‘Women’s visibility in academic seminars: Women ask fewer questions than men.’ PLOS ONE (2018). DOI: 10.1371/journal.pone.0202743

The researchers and further info
Alecia Carter is a Researcher at the Institut des Sciences de l’Évolution, Université de Montpellier.
Dieter Lukas is a Senior Scientist at the Max Planck Institute for Evolutionary Anthropology, Leipzig, Germany.
Alyssa Croft is Assistant Professor in the Department of Psychology at the University of Arizona, Tucson, USA.
Gillian Sandstrom is a Lecturer in the Department of Psychology at the University of Essex, UK.


Ebola and Lassa Fever Targeted By New Vaccine Trial and Improved Surveillance

source: www.cam.ac.uk

Scientists hope that a new approach to vaccine development, combined with improved surveillance of potential future threats of outbreak, could help to massively reduce the impact of deadly diseases such as Ebola, Marburg and Lassa fever.

“This has the potential to have an enormous positive impact on global public health”

Jonathan Heeney

Researchers from the University of Cambridge will shortly begin clinical trials of a new vaccine that builds on almost two decades of research to protect against diseases caused by RNA viruses. At the same time, they will begin studying the natural animal reservoirs of the viruses in an attempt to predict which strains are likely to cause future outbreaks, information that will be essential for creating effective vaccines.

Ebola, Lassa and Marburg viruses cause haemorrhagic fever, leading to severe disease, often with high mortality rates. Outbreaks can cause devastating local epidemics in human populations and in wildlife, including non-human primates. The recent Ebola epidemic in West Africa (2013–2016) killed over 11,000 people and devastated the infrastructure and economies of Liberia, Sierra Leone and Guinea.

A new approach to vaccine development

Professor Jonathan Heeney and colleagues at the Lab of Viral Zoonotics, University of Cambridge, have developed and successfully tested a trivalent vaccine in guinea pigs that protects against Ebola, Lassa and Marburg viruses. As a result, Professor Heeney has been awarded a further £2 million by Innovate UK and the Department of Health and Social Care to take the vaccine to clinical trials in humans.

The research takes a new approach pioneered by Professor Heeney and builds on Cambridge’s strengths in genomics, monoclonal antibody research and computational biology. It has led to the formation of DIOSynVax, a spin-out company of Cambridge Enterprise.

A virus’s genetic code is written into its RNA (just as ours is written into our DNA), which leads to the generation of proteins. When we are infected by a virus, our immune system responds to these proteins, known as ‘antigens’, producing antibodies that can identify and try to eliminate the invading pathogen.

The approach developed by Professor Heeney involves understanding how the immune system correctly identifies the virus from its proteins, and using this information to create ‘viruses’ that can generate an immune response. Using monoclonal antibodies – copies of antibodies taken from survivors of the target diseases – they can then test whether the body can effectively eliminate these fake viruses, leading to protection.

“We’ve taken fundamental science that stretches back almost two decades and developed a new approach to vaccine development,” says Professor Heeney. “This has the potential to dramatically reduce the time needed to produce new vaccines and change the way in which the industry makes them.”

With the new funding, the team hopes to scale up production while ensuring that the quality of the vaccine is maintained. They will then carry out toxicity tests in animals and human blood samples to test for potential adverse effects; if successful, they will then trial the vaccine in healthy human volunteers.

The funding is part of a £5m commitment from the Department of Health and Social Care to fund five projects to develop new vaccines with a ‘One Health’ focus, considering how the environment, the health of animals and the health of humans interact. This sits within the government’s £120m UK aid commitment to develop vaccines to help tackle diseases with epidemic potential.

Predicting the next outbreak

In recent Ebola outbreaks, the approach used successfully by the World Health Organization is known as ‘ring vaccination’, focused on vaccinating and monitoring a ring of people around each infected individual. However, this approach can only be used in response to an outbreak. In order for a vaccine to be used proactively – to prevent an outbreak in the first place – it is necessary to predict which strain or strains of a virus are most likely to cause future epidemics.

“A disproportionally high number of emerging and re-emerging diseases – from Ebola and Lassa through to rabies and influenza – are caused by RNA viruses carried naturally by animals,” says Professor Heeney. “We know very little about the viral diversity within these reservoir species and what enables them to spread to humans – and hence where the likely future threats lie.”

Viral genomes are notoriously variable owing to the high mutation rates that occur during replication. These mutations accumulate over time, driving the evolution of the viruses as they circulate in their natural animal reservoir populations. If viral variants arise that can adapt to use human cell receptors and then escape immune defences, they may become highly infectious and cause large disease outbreaks.

“Vaccines are only as good as the antigen immune targets of the virus that they are designed for,” adds Professor Heeney. “If the antigen changes, the vaccine will no longer be effective. In most cases, current vaccine candidates against RNA viruses are from past human outbreaks with little or no information of future risks from viral variants carried in animal reservoirs, especially those with the potential for animal-to-human transmission.”

Professor Heeney has also received £1.4 million from the Biotechnology and Biological Sciences Research Council (BBSRC) to lead a project that aims to predict where future outbreaks may arise from and the likely strains, and to then use this knowledge to inform vaccine design. This One Health project enlists veterinarians, clinicians, ecologists and medical and public health workers in West Africa to understand how people catch Lassa fever from rat populations. Their work will include trapping rat species that carry these viruses and placing GPS tags to monitor their movements, as well as obtaining molecular, genomic and antibody data from the animals and viral sequences from infected rats.

Professor Melanie Welham, Executive Chair of BBSRC, says: “This important research from the team at the University of Cambridge is about providing effective treatments for some potentially deadly diseases spread by rats and bats: Lassa and Ebola respectively. Novel strategies to combat dangerous infections like these are essential and often underpin the development of much-needed next generation vaccines.

“Professor Heeney and team have already made a significant difference in this area, researching cross species transmissions of these viruses, with a view to developing vaccines for Ebola and Lassa that would be effective against multiple strains.”

In addition, the team is collaborating with Professor James Wood, Head of the Department of Veterinary Medicine at Cambridge, who is conducting a complementary study funded by the Global Challenges Research Fund to sample bat colonies in Ghana, believed to be a natural reservoir for the Ebola virus.

“Equipped with this information, we should be able to design better vaccine antigens for more effective and broadly-protective vaccines,” says Professor Heeney. “Combined with our accelerated vaccine development platform, this has the potential to have an enormous positive impact on global public health.”



Scientists Reveal Plan To Target the Cause of Alzheimer’s Disease

source: www.cam.ac.uk

Researchers have developed a new way to target the toxic particles that destroy healthy brain cells in Alzheimer’s disease.

This is the first time that a systematic method to go after the pathogens – the cause of Alzheimer’s disease – has been proposed.

Michele Vendruscolo

Academics at the University of Cambridge and at Lund University in Sweden have devised the first strategy to ‘go after’ the cause of the devastating disease, which could eventually lead to the development of new drugs to treat dementia. Their findings are reported in the journal PNAS.

“This is the first time that a systematic method to go after the pathogens – the cause of Alzheimer’s disease – has been proposed,” said Professor Michele Vendruscolo from Cambridge’s Department of Chemistry, the paper’s senior author. “Until very recently scientists couldn’t agree on what the cause was so we didn’t have a target. As the pathogens have now been identified as small clumps of proteins known as oligomers, we have been able to develop a strategy to aim drugs at these toxic particles.”

Alzheimer’s disease leads to the death of nerve cells and tissue loss throughout the brain. Over time, the brain shrinks dramatically and the cell destruction causes memory failure, personality changes, and problems carrying out daily activities.

Scientists have identified abnormal deposits called protein oligomers as the most likely cause of dementia. Although proteins are normally responsible for important cell processes, in Alzheimer’s disease these proteins turn rogue, form clumps and kill healthy nerve cells.

“A healthy brain has a quality control system that effectively disposes of potentially dangerous masses of proteins, known as aggregates,” said Vendruscolo. “As we age, the brain becomes less able to get rid of the dangerous deposits, leading to disease. It is like a household recycling system: if you have an efficient system in place, then the clutter gets disposed of in a timely manner. If not, over time you slowly but steadily accumulate junk that you don’t need. It is the same in the brain.”

The research was carried out by an international team of scientists that also included Professor Sir Christopher Dobson, Master of St John’s College, University of Cambridge, at the Centre for Misfolding Diseases (CMD), which he co-founded. “This interdisciplinary study shows that it is possible not just to find compounds that target the toxic oligomers that give rise to neurodegenerative disorders but also to increase their potency in a rational manner,” he said. “It now makes it possible to design molecules that have specific effects on the various stages of disorders such as Alzheimer’s disease, and hopefully to convert them into drugs that can be used in a clinical environment.”

There have been approximately 400 clinical trials for Alzheimer’s disease but none of them has specifically targeted the pathogens that cause it. In the UK, dementia is the only condition in the top 10 causes of death without a treatment to prevent, cure or slow its progression.

“Our research is based on the major conceptual step of identifying protein oligomers as the pathogens and reports a method to systematically develop compounds to target them. This approach enables a new drug discovery strategy,” said Vendruscolo.

The team believes their first drug candidates could reach clinical trials in around two years. They have co-founded Wren Therapeutics, a biotechnology company based in the newly opened Chemistry of Health building, whose mission is to take the ideas developed at Cambridge and translate them into finding new ways to diagnose and treat Alzheimer’s disease and other misfolding disorders.

The group’s new strategy is based on a chemical kinetics approach developed in the last ten years by scientists led jointly by Professor Tuomas Knowles, also a Fellow at St John’s College, Professor Dobson and Professor Vendruscolo, working at the new centre in Cambridge, in collaboration with scientists at Lund University led by Professor Sara Linse.

“Since the process of aggregation is highly dynamic, the framework of kinetics allows us to approach this problem in a new way and find approaches to stop the generation of toxic proteins species at their very source,” said Knowles.

“This is a detailed academic study looking at how quickly compounds are able to stop amyloid building up into toxic clumps, which are characteristic of Alzheimer’s disease,” said Dr David Reynolds, Chief Scientific Officer from Alzheimer’s Research UK. “With no treatments to slow or stop the diseases that cause dementia, it’s vital we improve approaches like this that could help refine the drug discovery process and accelerate new treatments for people living with Alzheimer’s.”

Reference:
Sean Chia et al. ‘SAR by kinetics for drug discovery in protein misfolding diseases.’ PNAS (2018). DOI: 10.1073/pnas.1807884115

Adapted from a St John’s College press release.



Mitochondrial Diseases Could Be Treated With Gene Therapy, Study Suggests

Researchers have developed a genome-editing tool for the potential treatment of mitochondrial diseases: serious and often fatal conditions which affect 1 in 5,000 people.

Mitochondrial replacement therapy is a promising approach to prevent transmission of mitochondrial diseases, however, as the vast majority of mitochondrial diseases have no family history, this approach might not actually reduce the proportion of mitochondrial disease in the population.

Payam Gammage

The researchers, led by the University of Cambridge, applied an experimental gene therapy treatment in mice and were able to successfully target and eliminate the damaged DNA in mitochondria which causes the devastating conditions.

Their results, published in the journal Nature Medicine, could provide a practical route to treating patients with these diseases and may provide a future alternative to mitochondrial replacement therapy, or ‘three-parent IVF’. This is the first time programmable genome-engineering tools have been used inside a living animal to achieve such significant modification of mitochondrial DNA.

Mitochondria are the powerhouses inside our cells, producing energy and carrying their own DNA. They are inherited from a person’s mother via the egg, but if they are damaged, it can result in a serious mitochondrial disease. For example, MELAS Syndrome is a severe multi-system disorder causing progressive loss of mental and movement abilities, which usually becomes apparent in early childhood.

There are typically about 1000 copies of mitochondrial DNA per cell, and the percentage of these that are damaged, or mutated, will determine whether a person will suffer from mitochondrial disease or not. Usually, more than 60% of the mitochondrial DNA molecules in a cell need to be mutated for the disease to emerge, and the more mutated mitochondrial DNA a person has, the more severe their disease will be. Conversely, if the percentage of mutated DNA can be reduced, the disease could potentially be treated.
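The arithmetic described above can be sketched in a few lines of code. This is purely illustrative: the figures (roughly 1,000 mtDNA copies per cell and a 60% mutation threshold) come from the article, but the function names and the strict threshold rule are simplifying assumptions, not a clinical model.

```python
# Illustrative sketch of the heteroplasmy arithmetic described above.
# The figures (~1000 mtDNA copies per cell, ~60% mutation threshold)
# are taken from the article; the functions themselves are didactic.

COPIES_PER_CELL = 1000      # typical number of mtDNA molecules per cell
DISEASE_THRESHOLD = 0.60    # fraction of mutated copies above which disease emerges

def mutation_burden(mutated_copies, total_copies=COPIES_PER_CELL):
    """Fraction of a cell's mtDNA copies that carry the mutation."""
    return mutated_copies / total_copies

def disease_expected(mutated_copies, total_copies=COPIES_PER_CELL,
                     threshold=DISEASE_THRESHOLD):
    """True if the mutation burden exceeds the disease threshold."""
    return mutation_burden(mutated_copies, total_copies) > threshold

# A cell with 700 of its 1000 copies mutated sits above the threshold,
# while one with 500 mutated copies sits below it.
print(disease_expected(700))   # True
print(disease_expected(500))   # False
```

On this simple picture, any intervention that pushes the mutated fraction back under the threshold would, in principle, avert disease in that cell.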

Mitochondrial diseases are currently incurable, although a new IVF technique of mitochondrial transfer gives families affected by mitochondrial disease the chance of having healthy children – removing affected mitochondria from an egg or embryo and replacing them with healthy ones from a donor.

“Mitochondrial replacement therapy is a promising approach to prevent transmission of mitochondrial diseases, however, as the vast majority of mitochondrial diseases have no family history, this approach might not actually reduce the proportion of mitochondrial disease in the population,” said Dr Payam Gammage, a postdoctoral researcher in the MRC Mitochondrial Biology Unit, and the paper’s first author.

“One idea for treating these devastating diseases is to reduce the amount of mutated mitochondrial DNA by selectively destroying the mutated DNA, and allowing healthy DNA to take its place,” said Dr Michal Minczuk, also from the Medical Research Council (MRC) Mitochondrial Biology Unit, and the study’s senior author.

To test an experimental gene therapy treatment, which has so far only been tested in human cells grown in petri dishes in a lab, the researchers used a mouse model of mitochondrial disease that has the same mutation as some human patients.

The gene therapy treatment, known as the mitochondrially targeted zinc finger-nuclease, or mtZFN, recognises and then eliminates the mutant mitochondrial DNA, based on the DNA sequence differences between healthy and mutant mitochondrial DNA. As cells generally maintain a stable number of mitochondrial DNA copies, the mutated copies that are eliminated are replaced with healthy copies, leading to a decrease in the mitochondrial mutation burden that results in improved mitochondrial function.
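The mechanism just described, selective elimination followed by repopulation to a stable copy number, can be illustrated with a toy simulation. The 50% elimination rate and the starting counts are hypothetical values chosen for illustration; this is a sketch of the general principle, not a model of mtZFN itself.

```python
# Toy simulation of the mechanism described above: selectively removing
# mutated mtDNA copies while the cell restores its total copy number by
# replicating the surviving pool (now enriched for healthy copies).
# The elimination rate and copy counts are hypothetical.

def treat(mutated, healthy, elimination_rate=0.5):
    """One treatment round: cut a fraction of the mutated copies,
    then repopulate back to the original total, drawing proportionally
    from the surviving pool."""
    total = mutated + healthy
    mutated *= (1 - elimination_rate)   # nuclease removes mutated copies
    survivors = mutated + healthy
    scale = total / survivors           # copy-number maintenance
    return mutated * scale, healthy * scale

m, h = 700.0, 300.0                     # start at a 70% mutation burden
for _ in range(3):
    m, h = treat(m, h)
    print(f"burden = {m / (m + h):.1%}")
```

Each round lowers the mutation burden while the total copy number stays constant, which is the qualitative effect the study reports for mtZFN-treated heart cells.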

The treatment was delivered into the bloodstream of the mice using a modified virus, which is mostly taken up by heart cells. The researchers found that the treatment specifically eliminated the mutated mitochondrial DNA and led to measurable improvements in heart metabolism.

Following on from these results, the researchers hope to take this gene therapy approach through clinical trials, in the hope of producing an effective treatment for mitochondrial diseases.

This work was supported by the Medical Research Council and was performed in collaboration with Sangamo Therapeutics and the Max Planck Institute for Biology of Ageing in Cologne.

Reference: 
Payam A. Gammage et al. ‘Genome editing in mitochondria corrects a pathogenic mtDNA mutation in vivo.’ Nature Medicine (2018). DOI: 10.1038/s41591-018-0165-9.



New ‘Rising Path’ Opens At Cambridge University Botanic Garden

source: www.cam.ac.uk

A new Rising Path, designed to offer a fresh perspective on Cambridge University Botanic Garden’s historic Systematic Beds, will open to the public on Saturday (September 22, 2018).

Observation is the cornerstone of all scientific enquiry and plants are full of shapes, patterns, numbers, colours, textures, symmetry, scents and tastes to discover.

Juliet Day

The Rising Path is part of the Garden’s Understanding Plant Diversity Project, a three-year project supported by The Monument Trust, which aims to revitalise the contemporary relevance of the Garden’s Systematic Beds for researchers, teachers and visitors by exploring how plant diversity is identified and organised – the science of plant taxonomy.

Professor Beverley Glover, Director of the Botanic Garden, said: “We are thrilled to be opening the Rising Path to the public. This innovative structure is a key part of an ongoing project to re-examine, re-interpret and re-display our Systematic Beds which will help visitors, students and scientists to understand more about plants, how they are related, and why this is so important for science.”

Rising Path Project Manager, Juliet Day, said: “Observation is the cornerstone of all scientific enquiry and plants are full of shapes, patterns, numbers, colours, textures, symmetry, scents and tastes to discover. We have developed the Rising Path and created interpretation displays to encourage visitors of all ages to explore the Systematic Beds and enjoy looking more closely at plants.”

The Rising Path leads to a viewing platform where visitors will be able to see the full extent and layout of the Systematic Beds from a 3m high vantage point. Rest points along the path highlight the innovations that allowed plants to leave the water for life on land and to proliferate into the 400,000 species known today.

At ground level, an interpretation hub expands on the twin educational purpose of the Systematic Beds: how to look at plants, and how to organise those plants in order to provide a robust framework for effective research and communication.

Glover added: “For thousands of years, humans have grouped plants into families based on observations made with the naked eye. Today, scientists also study the DNA of plants to determine relationships. Changes in scientific understanding pose interesting challenges for a Garden that seeks to be both guardian of a historic landscape and relevant to contemporary research.”

The Systematic Beds occupy nearly three acres of CUBG and are of global heritage significance. They were designed in 1845 by CUBG’s first Curator, Andrew Murray, and their design uniquely translates the leading botanic text book of the time by Augustin de Candolle into a display on the ground to represent and teach plant taxonomy – the science of identifying and classifying plant species.

The Understanding Plant Diversity project, which includes the Rising Path, seeks to ensure the Systematic Beds remain a useful teaching tool in the modern world. When renovation is complete in 2019, the Systematic Beds will represent 1,600 plant species belonging to about 78 families dispersed across 119 beds.

 



‘Significant Breakthrough’ In Understanding the Deadly Nature of Pandemic Influenza

source: www.cam.ac.uk

Researchers at the University of Cambridge and the University of Oxford have discovered a new molecule that plays a key role in the immune response that is triggered by influenza infections. The molecule, a so-called mini viral RNA, is capable of inducing inflammation and cell death, and was produced at high levels by the 1918 pandemic influenza virus. The findings appeared in Nature Microbiology yesterday (September 17).

 

We think it is a significant breakthrough and that it is particularly exciting that we are finding this factor a hundred years after the 1918 pandemic.

Aartjan te Velthuis

Influenza is one of the main infectious diseases in humans. Seasonal influenza viruses account for about 650,000 deaths per year, whereas pandemic strains such as the 1918 H1N1 pandemic virus have been linked to 50-100 million deaths worldwide. Highly pathogenic avian influenza viruses such as the H5N1 and H7N9 strains have a mortality rate of about 50% in humans.

The reasons for the difference in disease severity and lethality between seasonal influenza viruses on the one hand, and pandemic and highly pathogenic avian influenza viruses on the other, are still poorly understood. Previous research has indicated that in infections with the 1918 pandemic virus or infections with an H5N1 avian virus, a powerful immune response is established that leads to death.

This led Dr Aartjan te Velthuis of the University of Cambridge and his colleagues Prof Ervin Fodor, Dr Josh Long and Dr David Bauer of the University of Oxford, to ask what viral molecule can trigger this powerful immune response.

The British groups first looked at how viruses are detected by the cell. Normally, an infected cell spots the presence of a virus by sensing the genetic material of the virus – RNA, in the case of flu.

Work by Dr Richard Randall, a co-author on the manuscript from the University of St Andrews, has shown that influenza viruses are good at hiding their RNA. This observation prompted te Velthuis and his colleagues to look for flu RNA that the virus was not able to hide from the cellular pathogen sensing system. What they found was truncated pieces of the viral genome that the virus had produced in error. The researchers called these pieces mini viral RNAs.

Fodor and his colleagues next investigated whether different influenza viruses produce mini viral RNAs at different frequencies and whether there was a link with the strong innate immune response that, for instance, the 1918 pandemic virus induces.

A combination of in vitro and in vivo experiments performed at Oxford and Cambridge, as well as by collaborators Leo Poon of the University of Hong Kong, Debby van Riel of the Erasmus Medical Centre, and Emmie de Wit of the Rocky Mountain Laboratories, indeed revealed a strong correlation between the ability of an influenza virus to generate mini viral RNAs and the amount of inflammation and cell death the virus infection caused.

“We think it is a significant breakthrough and that it is particularly exciting that we are finding this factor a hundred years after the 1918 pandemic,” said Dr te Velthuis.

The research groups are now continuing their efforts to investigate whether there is a causal link between influenza virus mortality and the production of mini viral RNAs. Together with their latest work, these efforts may help us better understand how influenza viruses cause disease, how we can identify dangerous influenza viruses, and how to develop new antivirals against influenza virus infections.

The work was funded by the Wellcome Trust, Royal Society, Medical Research Council, NIH, and the Netherlands Organization for Scientific Research.


Congratulations to David Ely and CambridgeIC on their 2nd Queen’s Award!

David Ely, a founder and director of CambridgeIC, said: “We are delighted to have been honoured with a Queen’s Award for Enterprise again. I would like to thank all of our employees, partners, customers and associates for their support.

“I would like to extend thanks to the many creative and insightful people who helped lay the foundations of CambridgeIC even before it was founded in 2007.”

Come and see David on September 26th at our EOTM!

‘High-Yield’ Farming Costs the Environment Less Than Previously Thought – and Could Help Spare Habitats

‘High-yield’ farming costs the environment less than previously thought – and could help spare habitats

source: www.cam.ac.uk

New findings suggest that more intensive agriculture might be the “least bad” option for feeding the world while saving its species – provided use of such “land-efficient” systems prevents further conversion of wilderness to farmland.

Our results suggest that high-yield farming could be harnessed to meet the growing demand for food without destroying more of the natural world

Andrew Balmford

Agriculture that appears to be more eco-friendly but uses more land may actually have greater environmental costs per unit of food than “high-yield” farming that uses less land, a new study has found.

There is mounting evidence that the best way to meet rising food demand while conserving biodiversity is to wring as much food as sustainably possible from the land we do farm, so that more natural habitats can be “spared the plough”.

However, this involves intensive farming techniques thought to create disproportionate levels of pollution, water scarcity and soil erosion. Now, a study published today in the journal Nature Sustainability shows this is not necessarily the case.

Scientists have put together measures for some of the major “externalities” – such as greenhouse gas emission, fertiliser and water use – generated by high- and low-yield farming systems, and compared the environmental costs of producing a given amount of food in different ways.

Previous research compared these costs by land area. As high-yield farming needs less land to produce the same quantity of food, the study’s authors say this approach overestimates its environmental impact.

Their results from four major agricultural sectors suggest that, contrary to many people’s perceptions, more intensive agriculture that uses less land may also produce fewer pollutants, cause less soil loss and consume less water.

However, the team behind the study, led by scientists from the University of Cambridge, caution that if higher yields are simply used to increase profit or lower prices, they will only accelerate the extinction crisis we are already seeing.

“Agriculture is the most significant cause of biodiversity loss on the planet,” said study lead author Andrew Balmford, Professor of Conservation Science from Cambridge’s Department of Zoology. “Habitats are continuing to be cleared to make way for farmland, leaving ever less space for wildlife.”

“Our results suggest that high-yield farming could be harnessed to meet the growing demand for food without destroying more of the natural world. However, if we are to avert mass extinction it is vital that land-efficient agriculture is linked to more wilderness being spared the plough.”

The Cambridge scientists conducted the study with a research team from 17 organisations across the UK and around the globe, including colleagues from Poland, Brazil, Australia, Mexico and Colombia.

The study analysed information from hundreds of investigations into four vast food sectors, accounting for large percentages of the global output for each product: Asian paddy rice (90%), European wheat (33%), Latin American beef (23%), and European dairy (53%).

Examples of high-yield strategies include enhanced pasture systems and livestock breeds in beef production, use of chemical fertiliser on crops, and keeping dairy cows indoors for longer.

The scientists found data to be limited, and say more research is urgently needed on the environmental cost of different farming systems. Nevertheless, results suggest many high-yield systems are less ecologically damaging and, crucially, use much less land.

For example, in field trials, inorganic nitrogen boosted yields with little to no greenhouse gas “penalty” and lower water use per tonne of rice. Per tonne of beef, the team found greenhouse gas emissions could be halved in some systems where yields are boosted by adding trees to provide shade and forage for cattle.

The study only looked at organic farming in the European dairy sector, but found that – for the same amount of milk – organic systems caused at least one third more soil loss, and took up twice as much land, as conventional dairy farming.

Co-author Professor Phil Garnsworthy from the University of Nottingham, who led the dairy team, said: “Across all dairy systems we find that higher milk yield per unit of land generally leads to greater biological and economic efficiency of production. Dairy farmers should welcome the news that more efficient systems have lower environmental impact.”

Conservation expert and co-author Dr David Edwards, from the University of Sheffield, said: “Organic systems are often considered to be far more environmentally friendly than conventional farming, but our work suggested the opposite. By using more land to produce the same yield, organic may ultimately accrue larger environmental costs.”

The study authors say that high-yield farming must be combined with mechanisms that limit agricultural expansion if it is to have any environmental benefit. These could include strict land-use zoning and restructured rural subsidies.

“These results add to the evidence that sparing natural habitats by using high-yield farming to produce food is the least bad way forward,” added Balmford.

“Where agriculture is heavily subsidised, public payments could be contingent on higher food yields from land already being farmed, while other land is taken out of production and restored as natural habitat, for wildlife and carbon or floodwater storage.”



Mentoring Can Reduce Anxiety, Study Finds

Mentoring can reduce anxiety, study finds

source: www.cam.ac.uk

Mentoring of junior colleagues can reduce anxiety and improve the mental health of the mentors themselves, finds a new study.

The mentoring of junior colleagues can reduce anxiety and improve the mental health of the mentors themselves in high-pressure occupations, concludes a new study co-authored at Cambridge Judge Business School involving an English police force.

While previous research had indicated that the anxiety of mentees can be reduced through the guidance of more senior mentors, the new study finds that imparting knowledge and experience can also help mentors by making their jobs more rewarding.

“We found that mentoring relationships provide a unique context for mentors to discuss and normalise their concerns, to share ideas for managing anxieties, and to find more meaning in their work,” concludes the study, published in the Journal of Vocational Behavior.

“Mentoring relationships appeared to provide an organisational mechanism to prompt supervisor and colleague interactions, which in turn facilitated a reduction in the mentors’ anxiety.”

In England alone, mental illness accounts for annual expenditure on healthcare of £14 billion and a reduction in gross domestic product of £52 billion owing to people unable to work to their full capacity.

Policing was chosen as an appropriate setting to study how mentoring can reduce anxiety in occupations that play important social roles, including the medical profession and the military – roles that require mental strength in challenging situations coupled with political pressure to become more efficient. The study follows a mentoring programme that has been rolled out at one of the 43 territory-based police forces in England and Wales since 2013.

Despite the pressures of their roles – including threats, abuse, snap decisions and the risk of death – police officers tend not to seek support from other officers, including more senior colleagues, to avoid “negative stigma” associated with mental health disorders. Mentoring can help fill this void, the study says.

“The study suggests that a relatively inexpensive practice such as mentoring can help reduce anxiety among both senior and junior staff, and this could help organisations address the serious and costly workplace issues of anxiety and mental health,” says study co-author Dr Thomas Roulet, University Senior Lecturer in Organisation Theory at Cambridge Judge Business School. “While the study focused on high-stress roles in the public eye, we believe that the findings may also apply to other occupations that also have anxiety-provoking pressures.”

The study is co-authored by Dr Michael Gill of Said Business School at Oxford University and Chief Inspector Stephen Kerridge of the Cambridgeshire Constabulary.

Excerpts of interviews with mentors and mentees indicated that it was beneficial for people in such busy and often frantic jobs as policing to have an opportunity to be “listened to” and to take note of the fact that “we’ve all gone through” certain work experiences.

“Mentoring provided reassurance to the mentors by illuminating how other, often junior officers also experience anxiety, thereby normalising their own experiences,” the study says. “By acknowledging that anxieties are common, both the mentees and mentors in this study appeared to be more comfortable discussing such issues and therefore in developing different coping mechanisms.”

Reference: 
Michael J. Gill et al. ‘Mentoring for mental health: A mixed-method study of the benefits of formal mentoring programmes in the English police force.’ Journal Of Vocational Behavior (2018). DOI: 10.1016/j.jvb.2018.08.005

Originally published on the Cambridge Judge Business School website



Bridges Impact Award – Amazon Growing Business Awards

Bridges Fund Management to Sponsor the Inaugural Bridges Positive Impact Award as Part of This Year’s Growing Business Awards.

 

Our aim is to recognise and celebrate the UK’s most exciting impactful businesses – that is, growth businesses that are succeeding financially while also driving positive change for people and/or the planet.

In addition to being honoured at the UK’s most prestigious SME awards (with associated coverage in Real Business and Real Deals), the winning business also gets a free impact workshop delivered by Impact+, our in-house impact specialists.

Do you know of any businesses that might be eligible? If so, we’d love to hear from them! Applications are open until September 16th.

INNOVATE: POSITIVE IMPACT IN ACTION

In this short film, we show why Innovate – the latest investment from the Bridges Sustainable Growth Funds – is a great example of a high-growth business generating positive impact through its core business model.

A specialist school catering business, Innovate is helping to combat the rise in childhood obesity by providing delicious, nutritious food to over 90,000 pupils at primary and secondary schools in England. The quality of its proposition – which typically leads to a big increase in the uptake of school lunches – has enabled it to enjoy strong growth in a market now worth over £2bn a year.

 

Find out more about the award here

Scientists Pioneer a New Way To Turn Sunlight Into Fuel

Scientists pioneer a new way to turn sunlight into fuel

source: www.cam.ac.uk

The quest to find new ways to harness solar power has taken a step forward after researchers successfully split water into hydrogen and oxygen by altering the photosynthetic machinery in plants.

This could be a great platform for developing solar technologies.

Katarzyna Sokół

Photosynthesis is the process plants use to convert sunlight into energy. Oxygen is produced as a by-product of photosynthesis when the water absorbed by plants is ‘split’. It is one of the most important reactions on the planet because it is the source of nearly all of the world’s oxygen. Hydrogen, which is produced when the water is split, could potentially be a green and unlimited source of renewable energy.

A new study, led by academics at the University of Cambridge, used semi-artificial photosynthesis to explore new ways to produce and store solar energy. They used natural sunlight to convert water into hydrogen and oxygen using a mixture of biological components and manmade technologies.

The research could now be used to revolutionise the systems used for renewable energy production. A new paper, published in Nature Energy, outlines how academics at the Reisner Laboratory in Cambridge’s Department of Chemistry developed their platform to achieve unassisted solar-driven water-splitting.

Their method also managed to absorb more solar light than natural photosynthesis.

Katarzyna Sokół, first author and PhD student at St John’s College, said: “Natural photosynthesis is not efficient because it has evolved merely to survive so it makes the bare minimum amount of energy needed – around 1-2 per cent of what it could potentially convert and store.”

Artificial photosynthesis has been around for decades but it has not yet been successfully used to create renewable energy because it relies on the use of catalysts, which are often expensive and toxic. This means the findings cannot yet be scaled up to an industrial level.

The Cambridge research is part of the emerging field of semi-artificial photosynthesis which aims to overcome the limitations of fully artificial photosynthesis by using enzymes to create the desired reaction.

Sokół and the team of researchers not only improved on the amount of energy produced and stored, they managed to reactivate a process in the algae that has been dormant for millennia.

She explained: “Hydrogenase is an enzyme present in algae that is capable of reducing protons into hydrogen. During evolution, this process has been deactivated because it wasn’t necessary for survival but we successfully managed to bypass the inactivity to achieve the reaction we wanted – splitting water into hydrogen and oxygen.”

Sokół hopes the findings will enable new innovative model systems for solar energy conversion to be developed.

She added: “It’s exciting that we can selectively choose the processes we want, and achieve the reaction we want which is inaccessible in nature. This could be a great platform for developing solar technologies. The approach could be used to couple other reactions together to see what can be done, learn from these reactions and then build synthetic, more robust pieces of solar energy technology.”

This model is the first to successfully use hydrogenase and photosystem II to create semi-artificial photosynthesis driven purely by solar power.

Dr Erwin Reisner, Head of the Reisner Laboratory, a Fellow of St John’s College, University of Cambridge, and one of the paper’s authors, described the research as a ‘milestone’.

He explained: “This work overcomes many difficult challenges associated with the integration of biological and organic components into inorganic materials for the assembly of semi-artificial devices and opens up a toolbox for developing future systems for solar energy conversion.”

Reference: 
Katarzyna P. Sokół et al. ‘Bias-free photoelectrochemical water splitting with photosystem II on a dye-sensitized photoanode wired to hydrogenase.’ Nature Energy (2018). DOI: 10.1038/s41560-018-0232-y

​Originally published on the St John’s College website. 



Breeder Meerkats Age Faster, But Their Subordinates Still Die Younger

Breeder meerkats age faster, but their subordinates still die younger

source: www.cam.ac.uk

Despite rapidly ageing, dominant animals live longer because their underlings are driven out of the group – becoming easy targets for predators. The secret of a long meerkat life is to be “ruler of your community… cracking down on would-be rivals,” say scientists.

A meerkat’s place within the social group shapes the mortality risks it faces.

Dominic Cram

In many cooperative species, the dominant breeders live longest despite the wear-and-tear of leadership and reproduction.

It has even been suggested these breeders hold the secret of immunity to age-related diseases. Some social insects, such as bees, do have breeders with genetic profiles that delay ageing – but this has never been documented in our fellow mammals.

Scientists from the University of Cambridge have now investigated the lifespans of meerkats: a highly social mammal that lives in groups of up to fifty, where a single dominant couple produce around 90% of the pups.

The researchers found that the DNA of dominant breeders actually shows signs of accelerated ageing – yet they still consistently outlive the non-breeding subordinates in the group. Their study shows that dominants live an average of 4.4 years, compared with 2.8 years for subordinates.

This is because meerkat underlings are forced to take the often-fatal risk of leaving the safety of the group to find breeding opportunities, say scientists. Dominants rarely tolerate rival breeders, and violently eject subordinates from the group if they feel threatened.

On reaching the top of the social pecking order, however, meerkats remain ensconced within the group. The study shows an average subordinate spends more than six days each year in the wilderness, with this figure rising year-on-year. Dominant breeders are typically absent for under two hours per year.

“Dominant meerkats typically die due to internal stresses on their bodies, resulting in gradual, predictable declines until death. In humans we might describe this as ‘natural causes’,” said Dr Dominic Cram from Cambridge’s Department of Zoology, lead author of the study published today in Current Biology.

“Subordinate meerkats die due to sudden, unpredictable circumstances such as exposure to predators, killing them instantly. A meerkat’s place within the social group shapes the mortality risks it faces,” he said.

“The secret of long life for meerkats is not to battle the inevitable declines of ageing, but to be the ruler of your community, profiting from social support and cracking down on would-be rivals.”

Cram conducted the research as part of the Kalahari Meerkat Project: a long-term study of social behaviour and ecology, run for over twenty years at the University of Cambridge by Professor Tim Clutton-Brock – a leading figure in the study of mammal societies.

The project has helped train generations of zoologists through the observation of generations of meerkats, resulting in a wide range of data on the life histories of over 3000 meerkat individuals in over 100 groups.

The team collected blood samples from the meerkats, and measured DNA sections called telomeres that help protect DNA from damage – much like the plastic caps on shoe-laces. As they erode over time, the chance of unravelling increases, so the length of telomeres can be used to estimate “biological age”.

While the telomeres of subordinate meerkats remained stable, dominant telomeres shrank by a third in just 18 months – suggesting accelerated ageing caused by the toils of raising young and fending off rivals.

Yet the dominant meerkats still lived an average of 60% longer than subordinates, as the lower ranking meerkats were increasingly forced to risk more and more time outside the group as they grew older.

“Each year the subordinates spend over triple the amount of time outside the group as the previous year, reaching a peak of 35 days per year, or 10% of their time, outside the social group,” said Cram.

For subordinate males, all females in the group are their sisters or mother, so they must court females away from the group to avoid inbreeding. Subordinate females are bullied and chased away by the dominant when they become a reproductive rival.

Of all those that leave, some return – or try to – after a few days or weeks. A lucky few start their own group and become dominant breeders. Many are never seen again.

“Within a group, a sentinel always keeps look-out and sounds the alarm, allowing the meerkats to flee into burrows or bolt-holes. Each meerkat takes a turn on sentinel duty,” said Cram.

“Away from the group there is no early warning system, and meerkats are easy prey for eagles, goshawks and caracal. Letting down their guard to dig for food is too risky, so many starve for fear of being eaten.”

“Lone meerkats have even been known to be torn apart by members of a rival group. It’s a dangerous world for a solo meerkat.”



Experts Warn of Cardiovascular Risk From Heavy Metal Pollution

Experts warn of cardiovascular risk from heavy metal pollution

source: www.cam.ac.uk

Even low doses of toxic chemicals in the environment pose a significant risk to cardiovascular health, according to a report in today’s edition of The BMJ, led by researchers at the University of Cambridge. The researchers have also challenged the omission of environmental risk factors such as toxic metal contaminants in water and foods from the recent World Health Organization report on non-communicable diseases (NCDs).

It’s clear from our analysis that there’s a possible link between exposure to heavy metals or metalloids and risk of conditions such as heart disease, even at low doses – and the greater the exposure, the greater the risk

Rajiv Chowdhury

In recent decades, exposures to environmental toxic metals such as arsenic, copper, lead, cadmium and mercury, have become a global public health concern. Although often naturally occurring, these contaminants have made their way into water supplies and, via irrigation, into the food chain. For example, in Bangladesh, deep wells were introduced in the Ganges Delta to draw water clear of bacterial and viral pathogens, but this inadvertently led to exposure to toxic metals.

Concern has often focused on the toxicity or carcinogenic properties of the metals, particularly at high doses. However, there is increasing evidence to suggest that heavy metals may have other adverse effects on health – including cardiovascular disease such as heart disease and stroke – even at lower levels of exposure, which might be prevalent in many parts of the world, including the UK and the US.

To interpret the available evidence, a team led by researchers at Cambridge’s Department of Public Health and Primary Care carried out a systematic review and meta-analysis of published studies covering 350,000 unique participants from 37 countries.

The results of the study showed that exposure to arsenic, lead, cadmium and copper – but not mercury – was associated with an increased risk of coronary heart disease and cardiovascular disease.

“It’s clear from our analysis that there’s a possible link between exposure to heavy metals or metalloids and risk of conditions such as heart disease, even at low doses – and the greater the exposure, the greater the risk,” says Dr Rajiv Chowdhury, the study’s first author. “While people shouldn’t be overly worried about any immediate health risk, it should send a message to policymakers that we need to take action to reduce people’s exposure.”

Worldwide, those with the greatest exposure to arsenic, lead, cadmium and copper were around 30% to 80% more likely to develop cardiovascular disease than those with the lowest exposure.

The report is important, say the researchers, because it highlights the need to tackle this environmental and public health problem, one which disproportionately affects people in low- and middle-income countries, though it may still affect those in higher-income countries. Interventions need not be costly, they stress: cheap, scalable technologies (e.g. environmentally friendly water filters) and behavioural interventions (e.g. rinsing rice and vegetables before cooking) are currently being tested to reduce exposures at the household level.

Additionally, in a letter published in The Lancet at the end of June, Dr Chowdhury and colleagues expressed their disappointment that the earlier WHO report by the Independent High-Level Commission on non-communicable diseases did not include exposure to heavy metals as a key contributing factor. They wrote: “Unfortunately, this globally important report had a major omission: recognising the detrimental role of environmental risk factors, beyond the conventional behavioural factors (tobacco and alcohol use, physical inactivity, and unhealthy diet), in enhancing global NCD burden and health inequality.”

Dr Chowdhury and colleagues also recently received £8.1 million from the UK Research Councils’ Global Challenges Research Fund to set up a long-term programme (called CAPABLE) to further investigate environmental risk factors for cardiovascular disease and to help inform preventative strategies.

Reference
1) Chowdhury, R et al. Environmental toxic metal contaminants and cardiovascular risk: a systematic review and meta-analysis of observational studies. BMJ; 30 Aug 2018; DOI: 10.1136/bmj.k3310

2) Chowdhury R, et al. Reducing NCDs globally: the under-recognised role of environmental risk factors. The Lancet; 28 June 2018; DOI: 10.1016/S0140-6736(18)31473-9


Researcher profile: Dr Rajiv Chowdhury

2006 was a year of “life-changing events” for Dr Rajiv Chowdhury: not only did he receive a Commonwealth scholarship to study for a masters at Cambridge, but his wife also gave birth to their baby daughter.

Rajiv grew up in Bangladesh, where he studied medicine before moving to Cambridge for his masters, an experience that made him acutely aware of the challenges facing low-income countries. “I could see for myself the massive inequalities, the huge burden of disease, the poor infrastructure, lack of resources…” he says. But it was his encounter with non-communicable diseases (NCDs) – conditions such as heart disease, cancer and type 2 diabetes – that was to have particular relevance to his current work.

Following his masters, Rajiv went on to receive a Gates Cambridge scholarship to support his PhD in epidemiology – the first Gates Cambridge scholar from Bangladesh. He now studies the role played by both environmental factors (such as toxic metals, diet, etc.) and genetic factors in influencing the risk of chronic NCDs.

Rajiv’s particular interest involves working closely with researchers in low-income countries such as Bangladesh, Sri Lanka and Malaysia – as well as being the principal investigator on several international research projects, he is also Scientific Director for CAPABLE (Cambridge Programme to Assist Bangladesh in Lifestyle and Environmental risk reduction), funded through the Global Challenges Research Fund.

“I hope that our work will lead to the establishment of the largest NCD scientific cohort study in Bangladesh,” he says. “We’re aiming for it to recruit over 150,000 participants. This should help us generate some effective solutions to reduce the impact of environmental risk factors which affect many hundreds of millions of people worldwide.”

Rajiv says he has learned a lot from both his mentors and peers during his time at Cambridge. “Cambridge offers one of the most intellectually stimulating environments in the world for scientific endeavours,” he says. “I have come across some of the brightest minds in biomedical science here who have changed my life and the way I perceive research.”

Image: Dr Chowdhury checking survey forms filled in by our local community health workers in Bangladesh


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. Images, including our videos, are Copyright ©University of Cambridge and licensors/contributors as identified.  All rights reserved. We make our image and video content available in a number of ways – as here, on our main website under its Terms and conditions, and on a range of channels including social media that permit your use and sharing of our content under their respective Terms.

Electronic Device Implanted in the Brain Could Stop Seizures

source: www.cam.ac.uk

Researchers have successfully demonstrated how an electronic device implanted directly into the brain can detect, stop and even prevent epileptic seizures.

These thin, organic films do minimal damage in the brain, and their electrical properties are well-suited for these types of applications.

George Malliaras

The researchers, from the University of Cambridge, the École Nationale Supérieure des Mines and INSERM in France, implanted the device into the brains of mice, and when the first signals of a seizure were detected, delivered a native brain chemical which stopped the seizure from progressing. The results, reported in the journal Science Advances, could also be applied to other conditions including brain tumours and Parkinson’s disease.

The work represents another advance in the development of soft, flexible electronics that interface well with human tissue. “These thin, organic films do minimal damage in the brain, and their electrical properties are well-suited for these types of applications,” said Professor George Malliaras, the Prince Philip Professor of Technology in Cambridge’s Department of Engineering, who led the research.

While there are many different types of seizures, in most patients with epilepsy, neurons in the brain start firing and signal to neighbouring neurons to fire as well, in a snowball effect that can affect consciousness or motor control. Epilepsy is most commonly treated with anti-epileptic drugs, but these drugs often have serious side effects and they do not prevent seizures in three out of 10 patients.

In the current work, the researchers used a neurotransmitter which acts as the ‘brake’ at the source of the seizure, essentially signalling to the neurons to stop firing and end the seizure. The drug is delivered to the affected region of the brain by a neural probe incorporating a tiny ion pump and electrodes to monitor neural activity.

When the neural signal of a seizure is detected by the electrodes, the ion pump is activated, creating an electric field that moves the drug across an ion exchange membrane and out of the device, a process known as electrophoresis. The amount of drug can be controlled by tuning the strength of the electric field.
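
The dose control described here follows from basic electrochemistry: the total charge passed by the ion pump bounds the amount of drug moved, via Faraday's law. A back-of-envelope sketch – with illustrative numbers and an assumed transport efficiency, not the specifications of the actual device – might look like this:

```python
# Rough upper bound on drug delivered by an electrophoretic ion pump:
# moles = efficiency * current * time / (charge_number * FARADAY).
# All parameter values below are illustrative assumptions.
FARADAY = 96485.0  # coulombs per mole of elementary charge

def moles_delivered(current_A, seconds, charge_number=1, efficiency=0.5):
    """Moles of a singly charged drug moved by a given current and duration."""
    charge = current_A * seconds              # total charge passed (C)
    return efficiency * charge / (charge_number * FARADAY)

# e.g. 1 microamp applied for 10 seconds at 50% transport efficiency
n = moles_delivered(1e-6, 10)
print(f"{n * 1e12:.1f} picomoles")
```

Tuning the field strength (and hence the current) scales the dose linearly, which is the control knob the article describes.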

“In addition to being able to control exactly when and how much drug is delivered, what is special about this approach is that the drugs come out of the device without any solvent,” said lead author Dr Christopher Proctor, a postdoctoral researcher in the Department of Engineering. “This prevents damage to the surrounding tissue and allows the drugs to interact with the cells immediately outside the device.”

The researchers found that seizures could be prevented with relatively small doses of drug representing less than 1% of the total amount of drug loaded into the device. This means the device should be able to operate for extended periods without needing to be refilled. They also found evidence that the delivered drug, which was in fact a neurotransmitter that is native to the body, was taken up by natural processes in the brain within minutes which, the researchers say, should help reduce side effects from the treatment.

Although early results are promising, the potential treatment would not be available for humans for several years. The researchers next plan to study the longer-term effects of the device in mice.

Malliaras is establishing a new facility at Cambridge which will be able to prototype these specialised devices, which could be used for a range of conditions. Although the device was tested in an animal model of epilepsy, the same technology could potentially be used for other neurological conditions, including the treatment of brain tumours and Parkinson’s disease.

The research was funded by the European Union.

Reference: 
Christopher M. Proctor et al. ‘Electrophoretic drug delivery for seizure control.’ Science Advances (2018). DOI: 10.1126/sciadv.aau1291

 


Researcher profile: Dr Christopher Proctor

Dr Christopher Proctor is one of the first nine recipients of the Borysiewicz Biomedical Sciences Fellowship programme.

My research sets out to develop medical devices to treat and diagnose health problems – such as epilepsy, Parkinson’s disease and brain tumours – that have been difficult to address with conventional approaches. As an engineer with expertise in electronics and materials, I work closely with biologists and clinicians at all stages of device development, from early-stage design to late-stage testing.

The most exciting day I’ve had in research so far was when a concept that I had taken from a drawing on paper to a real device I could hold in my hand prevented a seizure for the third time. I say the third time because I am forever a sceptic, so I was hesitant to believe our initial results until we had repeated them a couple of times. Seeing that the result was repeatable was very exciting, because that is when you know you may really be on to something special.

I hope my research will ultimately lead to a better quality of life for people with health problems. I believe we are only scraping the surface of what is possible when we pair electronic devices with biology. It is difficult to project where early-stage research will go, but I suspect the way we address some of the most difficult to treat diseases may be radically different in the coming decades.

Cambridge is a great place to research and develop medical devices because this type of work is truly a team effort that requires expertise in everything from engineering to chemistry to medicine up to government regulations, finance and marketing. There is an ecosystem in and around the University of Cambridge that can bring all these experts together and that is exactly what is needed to take an early stage technology all the way to the patients that we are trying to help.



IoTUK Challenger South – Hack and Pitch Event

By Cambridge Hack

We’re looking for startup and SME teams who would love to use their technology, their knowledge and their experience to create solutions to pressing real-world challenges.

The IoTUK Challenger South will set SMEs and start-ups four challenges in the arenas of transport, smart cities and health, with the overarching theme of harnessing IoT to solve them.

Take part to hone your ideas and skills, and for the visibility and exposure you can gain for your company, together with the possibility of applying your knowledge and products for larger clients, or at a larger scale, than might otherwise have been the case. Subject-matter experts will be on hand to learn from, along with a panel of distinguished judges, who will nominate winners for the prizes on offer!

Cambridge Hack have been appointed by Digital Catapult and IoTUK to run IoTUK Challenger for the South of England.

More information is available on the IoTUK website, where applications to attend must be made: https://iotuk.org.uk/iotuk-challenger-south/

Please contact Dominic Bowles with any queries – dominic.bowles@cambridgehack.com 07713 075 715

Funding Announced For Almost 400 New Doctoral Places in Arts and Humanities

source: www.cam.ac.uk

The Open University, the University of Oxford and the University of Cambridge are pleased to announce the success of their bid for funding for the Open-Oxford-Cambridge Arts and Humanities Research Council Doctoral Training Partnership, which will create nearly 400 new doctoral places in the arts and humanities.

The unique collaboration between Oxford, Cambridge and the Open University opens up exciting new prospects for the next generation of doctoral research students in the Arts and Humanities

Martin Millett

The Open-Oxford-Cambridge AHRC DTP is a consortium of the three universities for doctoral training and funding in the humanities. The DTP is underpinned by world-class research and training environments, supported by strategic partnerships with the BBC World Service, the National Trust and British Telecom. National and international in outlook, it is determined to take a leading role in shaping the future of doctoral training in the UK.

The AHRC is the UK’s largest funder of postgraduate training in the arts and humanities, and plays an essential role in supporting the next generation of highly capable researchers. By working together, the AHRC, the Open University, and the Universities of Oxford and Cambridge are able to commit to investing in this partnership over its lifetime.

Professor David Rechter, incoming Director of the Open-Oxford-Cambridge AHRC DTP, said: “I am pleased by the success of our bid, and look forward to recruiting our first cohort of students next year. Supported by our partners the National Trust, the BBC World Service and British Telecom, the Open-Oxford-Cambridge DTP will offer students a wealth of opportunities to pursue research and engage in training, and to learn from each other as part of a large multi-disciplinary group. These opportunities will equip our DTP students with the research expertise and skills that will allow them to go on to a wide range of careers in academia and beyond.”

Professor Martin Millett, Head of the School of Arts and Humanities at Cambridge, said: “The success of this bid is excellent news. The unique collaboration between Oxford, Cambridge and the Open University opens up exciting new prospects for the next generation of doctoral research students in the Arts and Humanities.”

Professor Edward Harcourt, the AHRC’s Director of Research, Strategy and Innovation, said: “The AHRC is delighted to announce its renewed commitment to the Doctoral Training Partnerships model. Our support for the next generation of arts and humanities researchers is critical to securing the future of the UK arts and humanities sector, which accounts for nearly a third of all UK academic staff, is renowned the world over for its outstanding quality, and which plays a vital part in our higher education ecosystem as a whole.

“We were extremely pleased with the response to our call, which saw high-quality applications from across the UK from a variety of diverse and innovative consortia, each with a clear strategy and vision for the future support of their doctoral students.”

Professor Kevin Hetherington, Pro-Vice-Chancellor (Research and Academic Strategy), The Open University, said: “The Open University is delighted that the AHRC has chosen to recognise the commitment to innovation and diversity inherent in the Open-Oxford-Cambridge DTP, and looks forward to participating fully in the delivery of an exciting training programme for our PhD students.”

Professor Karen O’Brien, Head of the Humanities Division, University of Oxford, said: “This is good news and an endorsement of our collective commitment to developing the next generation of Humanities scholars. We are looking forward to working with the Open University, Cambridge, the AHRC and our strategic partners to deliver a truly exciting opportunity to our consortium students.”

Stephen Cassidy, Chief Researcher, System Science, BT Labs, said: “As a communication company deeply rooted in the interaction between people, communities and businesses, BT sees great benefit in being part of this DTP. Interaction with the students and academics will extend our understanding of ethical, legal and social ramifications of the possible directions the industry as a whole could (and is) embarking on. These are issues of international scale, and we are pleased to link with the DTP and to provide further links with our research collaborations around the UK and the globe.”

Jamie Angus, Director, BBC World Service Group, said: “The objectives of the Consortium and the Doctoral Training Partnership fit very well with the BBC World Service’s. The BBC World Service Group provides independent, impartial journalism to nearly 350 million people around the world each week, across cultural, linguistic and national boundaries. We look forward to working with world-class doctoral students in the humanities, drawing on their research skills and subject expertise, as well as making the most of the huge range of languages studied at Oxford, Cambridge and the OU. Working together, we will play our part so that the Consortium can provide DTP-funded students with the skills and experience they need to communicate their ideas beyond academia and reach a wider audience.”

Nino Strachey, Head of Research and Specialist Advice at the National Trust, said: “The National Trust is delighted at the success of the bid and excited to work with students and staff from these internationally recognised universities and partners. With a long history of hosting and co-supervising PhDs, we look forward to offering opportunities for students to gain experience of the heritage sector and to work with Europe’s largest conservation charity.”

Information on how to apply for scholarships via the Open-Oxford-Cambridge AHRC Doctoral Training Partnership for entry in 2019/20 will be available from www.oocdtp.ac.uk from 1 September 2018.

 



‘Believing You’re a Winner’ Gives Men a Testosterone Boost and Promiscuous Disposition

source: www.cam.ac.uk
New findings suggest that the male body tries to “optimise” self-perceived improvements in social status through hormonal shifts that promote “short-term mating”.

Our results show that both testosterone and its corresponding psychological effects can fluctuate quickly and opportunistically

Danny Longman

A new study shows that men only have to believe they’ve bested another man in competition to get raised testosterone levels and an inflated sense of their own value as a sexual prospect.

Scientists found that this hormonal and psychological shift made men more inclined to approach new potential partners.

The research team measured hormone levels, as well as self-perceived attractiveness and confidence in approaching women, in 38 men in their twenties before and after competing in head-to-head battles on rowing machines.

Unbeknownst to participants, the competitions in the study were rigged to randomly declare the winner, regardless of who was the stronger rower.

While previous studies have shown that winning can affect male hormones, it was not known whether this was down to the efforts it takes to win or the belief that one is victorious.

The latest study, led by biological anthropologists from the University of Cambridge and published today in the journal Human Nature, reveals that just being convinced you have won, or indeed lost, is enough to cause male hormonal fluctuations that can influence sexual behaviour.

Researchers say this is an example of “plasticity”: the body adapting quickly – without altering genetic make-up – to suit a change in circumstance; in this case, a perceived change in social status brought about by the men believing they have defeated a rival.

The body attempts to take advantage of this apparent status improvement by inducing chemical and consequently behavioural changes that promote a “short-term” approach to reproductive success, say the researchers. Namely, more sex with new and different partners.

“Much of evolution consists of trade-offs in energy investment,” said study lead author Dr Danny Longman, from Cambridge’s Department of Archaeology.

“A common trade-off for males both across and within species is between mating strategies. One reproductive approach is short-term, investing time and energy in attracting and pursuing many mates, and fighting off competition. Another approach is long-term, investing energy in raising offspring with a single mate.”

“We found that a perceived shift in social status can cause male physiology to adapt by preparing to shift mating strategies to optimise reproductive success.”

Longman points out that in many animal populations, male social hierarchies correspond with reproductive success, and social status is determined by competition between males.

The study used a simple proxy for social and sexual competition by pitting athletic young men against each other to see who was the most powerful rower.

“Victory in a rowing contest strongly implies the possession of greater physical strength than the opponent, a trait found to be valued by women in our evolutionary past when choosing a mate,” said Longman.

He took saliva samples to test hormone levels before and after the races. A number of psychological questionnaires were also administered, designed to gauge self-esteem, ‘sociosexuality’ (willingness to engage in casual sex), ‘self-perceived mate value’ and mating behaviour (e.g. the likelihood of approaching attractive women). Crucially, Longman and colleagues then manipulated the results of the races.

The men who believed they had won received an average testosterone increase of 4.92%, while those convinced they had lost dropped by an average of 7.24%. Overall, men who thought they were winners had testosterone levels 14.46% higher than those of their deflated opponents.

The men who thought they had lost showed no change in their perceived value as a mate or their confidence in approaching women. However, the men who felt like winners had a ‘self-perceived mate value’ that was 6.53% higher, on average, than that of their rivals, and were 11.29% more likely to approach attractive women in an effort to instigate sexual relations.

“The endocrine system that controls hormones is responsive to situational changes. Previous research has shown that testosterone is lower when men are in a committed relationship, or have children, to promote long-term mating strategies,” said Longman.

“Our results show that both testosterone and its corresponding psychological effects can fluctuate quickly and opportunistically, shifting towards short-term mating in response to a perceived change in status that may increase mating value.”

Male social status has less to do with physical strength in many modern societies, and Longman would be curious to see if similar results arise from intellectual challenges more familiar to the office-based culture many men now inhabit. There is always the issue of free will, however.

“Male physiology may shift to take advantage of certain situations, but ultimately a man’s decisions are up to him.”



Lost Norse of Greenland Fuelled the Medieval Ivory Trade, Ancient Walrus DNA Suggests

source: www.cam.ac.uk

New DNA analysis reveals that, before their mysterious disappearance, the Norse colonies of Greenland had a “near monopoly” on Europe’s walrus ivory supply. An overreliance on this trade may have contributed to Norse Greenland’s collapse when the medieval market declined.

The very thing which gave the society its initial resilience, may have also contained the seeds of its vulnerability

James Barrett

The Icelandic Sagas tell of Erik the Red: exiled for murder in the late 10th century, he fled to southwest Greenland, establishing its first Norse settlement.

The colony took root, and by the mid-12th century there were two major settlements with a population of thousands. Greenland even gained its own bishop.

By the end of the 15th century, however, the Norse of Greenland had vanished – leaving only abandoned ruins and an enduring mystery.

Past theories as to why these communities collapsed include a change in climate and a hubristic adherence to failing farming techniques.

Some have suggested that trading commodities – most notably walrus tusks – with Europe may have been vital to sustaining the Greenlanders. Ornate items including crucifixes and chess pieces were fashioned from walrus ivory by craftsmen of the age. However, the source of this ivory has never been empirically established.

Now, researchers from the universities of Cambridge and Oslo have studied ancient DNA from offcuts of tusks and skulls, most found on the sites of former ivory workshops across Europe, in order to trace the origin of the animals used in the medieval trade.

In doing so they have discovered an evolutionary split in the walrus, and revealed that the Greenland colonies may have had a “near monopoly” on the supply of ivory to Western Europe for over two hundred years.

For the latest study, published today in the journal Proceedings of the Royal Society B, the research team analysed walrus samples found in several medieval trading centres – Trondheim, Bergen, Oslo, Dublin, London, Schleswig and Sigtuna – mostly dating between 900 and 1400 CE.

The DNA showed that, during the last Ice Age, the Atlantic walrus divided into two ancestral lines, which researchers term “eastern” and “western”. Walruses of the eastern lineage are widespread across much of the Arctic, including Scandinavia. Those of the western, however, are unique to the waters between western Greenland and Canada.

Finds from the early years of the ivory trade were mostly from the eastern lineage. Yet as demand grew from the 12th century onwards, the research team discovered that Europe’s ivory supply shifted almost exclusively to tusks from the western lineage.

They say that ivory from western-lineage walruses must have been supplied by the Norse Greenlanders – obtained by hunting and perhaps also by trade with the indigenous peoples of Arctic North America.

“The results suggest that by the 1100s Greenland had become the main supplier of walrus ivory to Western Europe – a near monopoly even,” said Dr James H. Barrett, study co-author from the University of Cambridge’s Department of Archaeology.

“The change in the ivory trade coincides with the flourishing of the Norse settlements on Greenland. The populations grew and elaborate churches were constructed.

“Later Icelandic accounts suggest that in the 1120s, Greenlanders used walrus ivory to secure the right to their own bishopric from the king of Norway. Tusks were also used to pay tithes to the church,” said Barrett.

He points out that the 11th to 13th centuries were a time of demographic and economic boom in Europe, with growing demand from urban centres and the elite served by transporting commodities from increasingly distant sources.

“The demands for luxury goods produced from ivory may have helped the far-flung Norse communities in Greenland survive for centuries,” said Barrett.

Co-author Dr Sanne Boessenkool of the University of Oslo said: “We knew from the start that analysing ancient DNA would have the potential for new historical insights, but the findings proved to be particularly spectacular.”

The new study tells us less about the end of the Greenland colonies, say Barrett and colleagues. However, they note that it is hard to find evidence of walrus ivory imports to Europe that date after 1400.

Elephant ivory eventually became the material of choice for Europe’s artisans. “Changing tastes could have led to a decline in the walrus ivory market of the Middle Ages,” said Barrett.

Ivory exports from Greenland could have stalled for other reasons: over-hunting can cause walrus populations to abandon their coastal “haulouts”; the “Little Ice Age” – a sustained period of lower temperatures – began in the 14th century; the Black Death ravaged Europe.

“Whatever caused the cessation of Europe’s trade in walrus ivory, it must have been significant for the end of the Norse Greenlanders,” said Barrett. “An overreliance on a single commodity, the very thing which gave the society its initial resilience, may have also contained the seeds of its vulnerability.”

The heyday of the walrus ivory trade saw the material used for exquisitely carved items during Europe’s Romanesque art period. The church produced much of this, with major ivory workshops in ecclesiastical centres such as Canterbury, UK.

Ivory games were also popular. The Viking board game hnefatafl was often played with walrus ivory pieces, as was chess, with the famous Lewis chessmen among the most stunning examples of Norse carved ivory.

Tusks were exported still attached to the walrus skull and snout, which formed a neat protective package that was broken up at workshops for ivory removal. These remains allowed the study to take place, as DNA extraction from carved artefacts would be far too damaging.

Co-author Dr Bastiaan Star of the University of Oslo said: “Until now, there was no quantitative data to support the story about walrus ivory from Greenland. Walruses could have been hunted in the north of Russia, and perhaps even in Arctic Norway at that time. Our research now proves beyond doubt that much of the ivory traded to Europe during the Middle Ages really did come from Greenland”.

The research was funded by the Leverhulme Trust, Nansenfondet and the Research Council of Norway.



Size Matters: If You Are a Bubble of Volcanic Gas


source: www.cam.ac.uk

The chemical composition of gases emitted from volcanoes – which are used to monitor changes in volcanic activity – can change depending on the size of the gas bubbles rising to the surface, which in turn relates to the way in which the volcano erupts. The results, published in the journal Nature Geoscience, could be used to improve the forecasting of threats posed by certain volcanoes.

At first, we couldn’t understand how the gases could emerge much colder than the molten lava sloshing in the lake.

Clive Oppenheimer

A team of scientists, including a volcanologist and mathematician from the University of Cambridge, discovered the phenomenon through detailed observations of gas emissions from Kīlauea volcano in Hawaii.

At many volcanoes around the world, gas emissions are monitored routinely to help with forecasting eruptions. Changes in the output or proportions of different gases – such as carbon dioxide and sulphur dioxide – can herald shifts in the activity of a volcano. Volcanologists have long considered that these chemical changes reflect the rise and fall of magma in the Earth’s crust, but the new research reveals that the composition of volcanic gases also depends on the size of the gas bubbles rising to the surface.

Until the latest spectacular eruption opened up fissures on the flank of the volcano, Kīlauea held a vast lava lake in its summit crater. The behaviour of this lava lake alternated between phases of fiery ‘spattering’ powered by large gas bubbles bursting through the magma, and more gentle gas release, accompanied by slow and steady motion of the lava.

In the past, volcanic gases have been sampled directly from steaming vents and openings called fumaroles. But this is not possible for the emissions from a lava lake 200 metres across at the bottom of a steep-sided crater. Instead, the team used an infrared spectrometer, which is employed for routine volcano monitoring by co-authors of the study, Jeff Sutton and Tamar Elias of the Hawaiian Volcano Observatory (US Geological Survey).

The device was located on the edge of the crater, pointed at the lava lake, and recorded gas compositions in the atmosphere every few seconds. The emissions of carbon- and sulphur-bearing gases were measured during both the vigorous and mild phases of activity.

Each individual measurement was used to compute the temperature of the volcanic gas. What immediately struck the scientists was that the gas temperatures ranged from 1150 degrees Celsius – the temperature of the lava – down to around 900 degrees Celsius. “At this temperature, the lava would freeze,” said lead author Dr Clive Oppenheimer, from Cambridge’s Department of Geography. “At first, we couldn’t understand how the gases could emerge much colder than the molten lava sloshing in the lake.”

The clue to this puzzle came from the variation in calculated gas temperatures – they were high when the lava lake was placid, and low when it was bubbling furiously. “We realised it could be because of the size of the gas bubbles,” said co-author Professor Andy Woods, Director of Cambridge’s BP Institute. “Larger bubbles rise faster through the magma and expand rapidly as the pressure reduces, just like bubbles rising in a glass of fizzy drink; the gas cools down because of the expansion.” Larger bubbles form when smaller bubbles bump into each other and merge.
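The cooling mechanism Woods describes can be illustrated with a back-of-the-envelope calculation. The sketch below treats a large bubble as expanding adiabatically as it rises from depth to the lake surface; the depth, magma density and heat-capacity ratio are illustrative assumptions, not values from the study, and real bubbles sit somewhere between this adiabatic limit and full thermal equilibrium with the lava.

```python
import math

# Illustrative adiabatic cooling of a rising gas bubble.
# All input values are assumptions for this sketch, not from the paper.
GAMMA = 1.3            # heat-capacity ratio, roughly right for a CO2/SO2-rich gas
RHO_MAGMA = 2600.0     # magma density, kg/m^3 (assumed)
G = 9.81               # gravitational acceleration, m/s^2
P_SURFACE = 101_325.0  # atmospheric pressure at the lake surface, Pa

def bubble_exit_temperature(t_magma_c: float, depth_m: float) -> float:
    """Temperature (deg C) of a bubble after an adiabatic rise from depth_m."""
    p_depth = P_SURFACE + RHO_MAGMA * G * depth_m   # hydrostatic pressure at depth
    t_depth_k = t_magma_c + 273.15
    # Adiabatic expansion: T2 = T1 * (P2/P1)^((gamma - 1)/gamma)
    t_exit_k = t_depth_k * (P_SURFACE / p_depth) ** ((GAMMA - 1.0) / GAMMA)
    return t_exit_k - 273.15

# A bubble starting at the lava temperature of 1150 C, 100 m down:
print(f"{bubble_exit_temperature(1150.0, 100.0):.0f} C")
```

Under these assumed numbers the bubble would leave the lake several hundred degrees cooler than the lava, so the fully adiabatic limit over-predicts the cooling: the observed gas temperatures of around 900 degrees Celsius imply partial heat exchange with the surrounding magma on the way up.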

Woods and Oppenheimer developed a mathematical model to account for the process, which showed a very good fit with the observations.

But there was yet another surprising finding from the gas observations from Hawaii. As well as being cooler, the emissions from the large gas bubbles were more oxidised than expected – they had higher proportions of carbon dioxide to carbon monoxide.

The chemical balance of volcanic gases such as carbon dioxide and carbon monoxide (or sulphur dioxide and hydrogen sulphide) is generally thought to be controlled by the chemistry of the surrounding liquid magma, but the new findings show that when bubbles grow large enough, most of the gas inside follows its own chemical pathway as it cools.

The ratio of carbon dioxide to carbon monoxide when the lava lake was in its most energetic state was six times higher than during the most stable phase. The scientists suggest this effect should be taken into account when gas measurements are being used to forecast major changes in volcanic activity.

“Gas measurements are critical to our monitoring and hazard assessment; refining our understanding of how magma behaves beneath the volcano allows us to better interpret our observations,” said co-author Tamar Elias from the Hawaiian Volcano Observatory.

And there is another implication of this discovery – not for eruptions today but for the evolution of the Earth’s atmosphere billions of years ago. “Volcanic emissions in Earth’s deep past may have made the atmosphere more oxidising than we thought,” said co-author Bruno Scaillet. “A more oxygen-rich atmosphere would have facilitated the emergence and viability of life on land, by generating an ozone layer, which shields against harmful ultraviolet rays from the sun.”

Reference:
Clive Oppenheimer et al. ‘Influence of eruptive style on volcanic gas emission chemistry and temperature.’ Nature Geoscience (2018). DOI: 10.1038/s41561-018-0194-5

Inset image: Clive Oppenheimer in Hawaii. Credit: Clive Oppenheimer



Scientists Measure Severity of Drought During the Maya Collapse


source: www.cam.ac.uk

The severity of drought conditions during the demise of the Maya civilisation about one thousand years ago has been quantified, representing another piece of evidence that could be used to solve the longstanding mystery of what caused the downfall of one of the ancient world’s great civilisations.

The role of climate change in the collapse of Classic Maya civilisation is somewhat controversial, partly because previous records are limited to qualitative reconstructions.

Nick Evans

Researchers from the University of Cambridge and the University of Florida developed a method to measure the different isotopes of water trapped in gypsum – a mineral that forms during droughts, when water levels fall – in Lake Chichancanab on Mexico’s Yucatán Peninsula, in the heartland of the ancient Maya.

Based on these measurements, the researchers found that annual precipitation decreased by between 41% and 54% relative to today during the period of the Maya civilisation’s collapse, with periods of up to 70% rainfall reduction during peak drought conditions, and that relative humidity declined by 2% to 7% relative to today. The results are reported in the journal Science.

“The role of climate change in the collapse of Classic Maya civilisation is somewhat controversial, partly because previous records are limited to qualitative reconstructions, for example whether conditions were wetter or drier,” said Nick Evans, a PhD student in Cambridge’s Department of Earth Sciences and the paper’s first author. “Our study represents a substantial advance as it provides statistically robust estimates of rainfall and humidity levels during the Maya downfall.”

Maya civilisation is divided into four main periods: the Preclassic (2000 BCE – 250 CE), Classic (250 CE – 800 CE), Terminal Classic (800 – 1000 CE) and Postclassic (1000 CE – 1539 CE). The Classic period was marked by the construction of monumental architecture, intellectual and artistic development, and the growth of large city-states.

During the 9th century, however, there was a major political collapse in the central Maya region: the famous limestone cities were abandoned and dynasties ended. And while the Maya people survived beyond this period, their political and economic power was depleted.

There are multiple theories as to what caused the collapse of the Maya civilisation, such as invasion, war, environmental degradation and collapsing trade routes. In the 1990s, however, researchers were able to piece together climate records for the period of the Maya collapse and found that it correlated with an extended period of extreme drought.

Professor David Hodell, Director of Cambridge’s Godwin Laboratory for Palaeoclimate Research and the senior author of the current paper, provided the first physical evidence of a correlation between this period of drought at Lake Chichancanab and the downfall of the Classic Maya civilisation in a paper published in 1995.

Now, Hodell and his colleagues have applied a new method to estimate the extent of this drought. Using a new geochemical technique to measure the water locked within gypsum from Chichancanab, the researchers have built a complete model of hydrological conditions during the Terminal Classic period, when the Maya civilisation collapsed.

The researchers analysed the different isotopes of water trapped within the crystal structure of the gypsum to determine changes in rainfall and relative humidity during the Maya downfall.

They measured three oxygen and two hydrogen isotopes to reconstruct the history of the lake water between 800 and 1000 CE. When gypsum forms, water molecules are incorporated directly into its crystalline structure, and this water records the different isotopes that were present in the ancient lake water at the time of its formation. “This method is highly accurate and is almost like measuring the water itself,” said Evans.

In periods of drought, more water evaporates from lakes such as Chichancanab, and because the lighter isotopes of water evaporate faster, the water becomes heavier. A higher proportion of the heavier isotopes, such as oxygen-18 and hydrogen-2 (deuterium), would indicate drought conditions. By mapping the proportion of the different isotopes contained within each layer of gypsum, the researchers were able to build a model to estimate past changes in rainfall and relative humidity over the period of the Maya collapse.
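The evaporative enrichment described here is usually modelled as Rayleigh distillation. The sketch below uses a deliberately round fractionation factor rather than the study’s calibrated values, and shows how losing half of a lake’s water to evaporation shifts the remaining water towards the heavier isotopes.

```python
# Rayleigh distillation: as water evaporates, the lighter isotope leaves
# preferentially and the remaining lake water becomes isotopically heavier.
# alpha is an illustrative liquid-to-vapour fractionation factor, not the
# calibrated value used in the study.

def remaining_water_delta(delta0_permil: float, f_remaining: float,
                          alpha: float = 0.990) -> float:
    """Delta value (per mil) of lake water after evaporation down to fraction f."""
    r0 = delta0_permil / 1000.0 + 1.0        # isotope ratio relative to the standard
    r = r0 * f_remaining ** (alpha - 1.0)    # Rayleigh: R = R0 * f^(alpha - 1)
    return (r - 1.0) * 1000.0

# Evaporating half the lake enriches the remaining water by roughly 7 per mil here:
print(f"{remaining_water_delta(0.0, 0.5):+.2f} per mil")
```

Because alpha is below one for evaporation, the exponent is negative and the delta value of the residual water rises as the remaining fraction shrinks, which is exactly the drought signal the gypsum layers preserve.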

This quantitative climate data can be used to better predict how these drought conditions may have affected agriculture, including yields of the Maya’s staple crops, such as maize.

The research was supported by the European Research Council.

Reference:
Nicholas P. Evans et al. ‘Quantification of Drought During the Collapse of the Classic Maya Civilization.’ Science (2018). DOI: 10.1126/science.aas9871

Inset image: Lake Chichancanab, the site of the study. Chichancanab means “Little Sea” in Yucatec Maya, reflecting its relatively salty water composed dominantly of calcium and sulfate. (Credit: Mark Brenner)



Scientists Identify Exoplanets Where Life Could Develop As It Did On Earth


source: www.cam.ac.uk

Scientists have identified a group of planets outside our solar system where the same chemical conditions that may have led to life on Earth exist.

This work brings us just a little bit closer to addressing the question of whether we are alone in the universe.

Paul Rimmer

The researchers, from the University of Cambridge and the Medical Research Council Laboratory of Molecular Biology (MRC LMB), found that the chances for life to develop on the surface of a rocky planet like Earth are connected to the type and strength of light given off by its host star.

Their study, published in the journal Science Advances, proposes that stars which give off sufficient ultraviolet (UV) light could kick-start life on their orbiting planets in the same way it likely developed on Earth, where the UV light powers a series of chemical reactions that produce the building blocks of life.

The researchers have identified a range of planets where the UV light from the host star is sufficient to allow these chemical reactions to take place, and which lie within the habitable zone, where liquid water can exist on the planet’s surface.

“This work allows us to narrow down the best places to search for life,” said Dr Paul Rimmer, a postdoctoral researcher with a joint affiliation at Cambridge’s Cavendish Laboratory and the MRC LMB, and the paper’s first author. “It brings us just a little bit closer to addressing the question of whether we are alone in the universe.”

The new paper is the result of an ongoing collaboration between the Cavendish Laboratory and the MRC LMB, bringing together organic chemistry and exoplanet research. It builds on the work of Professor John Sutherland, a co-author on the current paper, who studies the chemical origin of life on Earth.

In a paper published in 2015, Professor Sutherland’s group at the MRC LMB proposed that cyanide, although a deadly poison, was in fact a key ingredient in the primordial soup from which all life on Earth originated.

In this hypothesis, carbon from meteorites that slammed into the young Earth interacted with nitrogen in the atmosphere to form hydrogen cyanide. The hydrogen cyanide rained to the surface, where it interacted with other elements in various ways, powered by the UV light from the sun. The chemicals produced from these interactions generated the building blocks of RNA, the close relative of DNA which most biologists believe was the first molecule of life to carry information.

In the laboratory, Sutherland’s group recreated these chemical reactions under UV lamps, and generated the precursors to lipids, amino acids and nucleotides, all of which are essential components of living cells.

“I came across these earlier experiments, and as an astronomer, my first question is always what kind of light are you using, which as chemists they hadn’t really thought about,” said Rimmer. “I started out measuring the number of photons emitted by their lamps, and then realised that comparing this light to the light of different stars was a straightforward next step.”

The two groups performed a series of laboratory experiments to measure how quickly the building blocks of life can be formed from hydrogen cyanide and hydrogen sulphite ions in water when exposed to UV light. They then performed the same experiment in the absence of light.

“There is chemistry that happens in the dark: it’s slower than the chemistry that happens in the light, but it’s there,” said senior author Professor Didier Queloz, also from the Cavendish Laboratory. “We wanted to see how much light it would take for the light chemistry to win out over the dark chemistry.”

The same experiment run in the dark with the hydrogen cyanide and the hydrogen sulphite resulted in an inert compound which could not be used to form the building blocks of life, while the experiment performed under the lights did result in the necessary building blocks.

The researchers then compared the light chemistry to the dark chemistry against the UV light of different stars. They plotted the amount of UV light available to planets in orbit around these stars to determine where the chemistry could be activated.

They found that stars around the same temperature as our sun emitted enough light for the building blocks of life to have formed on the surfaces of their planets. Cool stars, on the other hand, do not produce enough light for these building blocks to be formed, unless they frequently produce powerful flares to jolt the chemistry forward step by step. Planets that both receive enough light to activate the chemistry and could have liquid water on their surfaces reside in what the researchers have called the abiogenesis zone.
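The temperature dependence the researchers describe follows from blackbody physics: the UV tail of a star’s spectrum collapses rapidly as the star cools. The sketch below compares the 200–280 nm photon output per unit surface area of a Sun-like star and a cool star; the temperatures and band edges are illustrative, and the comparison ignores the closer habitable-zone orbits around cool stars, which only partly offset the deficit.

```python
import math

H = 6.62607015e-34  # Planck constant, J s
C = 2.99792458e8    # speed of light, m/s
K = 1.380649e-23    # Boltzmann constant, J/K

def uv_photon_rate(t_star, lam_min=200e-9, lam_max=280e-9, steps=1000):
    """Photons emitted per unit area per second in [lam_min, lam_max]
    for a blackbody at temperature t_star (simple midpoint integration)."""
    dl = (lam_max - lam_min) / steps
    total = 0.0
    for i in range(steps):
        lam = lam_min + (i + 0.5) * dl
        # Planck photon spectral radiance per unit wavelength, times pi for flux
        total += math.pi * (2 * C / lam**4) / math.expm1(H * C / (lam * K * t_star)) * dl
    return total

sun_like = uv_photon_rate(5800.0)   # roughly the Sun's effective temperature
cool_star = uv_photon_rate(3000.0)  # a cool M-dwarf
print(f"UV photon output ratio (Sun-like / cool star): {sun_like / cool_star:.0f}")
```

Per unit of stellar surface, the Sun-like star out-emits the cool star in this UV band by a factor of thousands, which is why surface chemistry on planets of cool stars would depend on flares to proceed.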

Among the known exoplanets which reside in the abiogenesis zone are several planets detected by the Kepler telescope, including Kepler 452b, a planet that has been nicknamed Earth’s ‘cousin’, although it is too far away to probe with current technology. Next-generation telescopes, such as NASA’s TESS and the James Webb Space Telescope, will hopefully be able to identify and potentially characterise many more planets that lie within the abiogenesis zone.

Of course, it is also possible that if there is life on other planets, it has developed, or will develop, in a totally different way than it did on Earth.

“I’m not sure how contingent life is, but given that we only have one example so far, it makes sense to look for places that are most like us,” said Rimmer. “There’s an important distinction between what is necessary and what is sufficient. The building blocks are necessary, but they may not be sufficient: it’s possible you could mix them for billions of years and nothing happens. But you want to at least look at the places where the necessary things exist.”

According to recent estimates, there are as many as 700 million trillion terrestrial planets in the observable universe. “Getting some idea of what fraction have been, or might be, primed for life fascinates me,” said Sutherland. “Of course, being primed for life is not everything and we still don’t know how likely the origin of life is, even given favourable circumstances – if it’s really unlikely then we might be alone, but if not, we may have company.”

The research was funded by the Kavli Foundation and the Simons Foundation.

Reference:
Paul B. Rimmer et al. ‘The Origin of RNA Precursors on Exoplanets.’ Science Advances (2018). DOI: 10.1126/sciadv.aar3302

Inset image: Diagram of confirmed exoplanets within the liquid water habitable zone (as well as Earth). Credit: Paul Rimmer



Cambridge Mathematician Awarded 2018 Fields Medal


source: www.cam.ac.uk

University of Cambridge mathematician Caucher Birkar has been named one of four recipients of the 2018 Fields medals, the most prestigious awards in mathematics.

Kurdistan was an unlikely place for a kid to develop an interest in mathematics – I’m hoping that this news will put a smile on the faces of those 40 million people.

Caucher Birkar

Professor Birkar, who originally came to the UK as a Kurdish refugee, was given the award today at the International Congress of Mathematicians in Rio de Janeiro, Brazil.

The Fields medals, often called the Nobel Prize of mathematics, are awarded every four years. Medallists must be under the age of 40 at the start of the year in which they receive the award, and up to four mathematicians are honoured at a time. First awarded in 1936, the medal recognises works of excellence and serves as an incentive for new outstanding achievements.


Birkar, a member of Cambridge’s Department of Pure Mathematics and Mathematical Statistics, won the award for his work on categorising different kinds of polynomial equations. He proved that the infinite variety of such equations can be split into a finite number of classifications, a major breakthrough in the field of birational geometry. Born in a Kurdish village in pre-revolutionary Iran, Birkar sought and obtained political asylum in the UK while finishing his undergraduate degree in Iran.

“War-ridden Kurdistan was an unlikely place for a kid to develop an interest in mathematics,” Birkar told the ICM today. “I’m hoping that this news will put a smile on the faces of those 40 million people.”

Birkar, who just this year received recognition for his work as one of the London Mathematical Society Prize winners, was born in 1978 in Marivan, a Kurdish county in Iran bordering Iraq with about 200,000 inhabitants. His curiosity was awakened by algebraic geometry, the same interest that, centuries earlier in that same region, had attracted the attention of Omar Khayyam (1048-1131) and Sharaf al-Din al-Tusi (1135-1213).

After graduating in Mathematics from Tehran University, Birkar went to live in the UK, where he became a British citizen. In 2004, he completed his PhD at the University of Nottingham with the thesis “Topics in modern algebraic geometry”. Throughout his career, birational geometry has stood out as his main area of interest. He has devoted himself to the fundamental aspects of key problems in modern mathematics – such as minimal models, Fano varieties, and singularities. His theories have solved long-standing conjectures.

In 2010, the year in which he was honoured by the Fondation Sciences Mathématiques de Paris, Birkar wrote, alongside Paolo Cascini (Imperial College London), Christopher Hacon (University of Utah) and James McKernan (University of California, San Diego), a paper called “Existence of minimal models for varieties of log general type” that revolutionised the field. The article earned the quartet the AMS Moore Prize in 2016.

Founded by the Canadian mathematician John Charles Fields to celebrate outstanding achievements, the Fields Medal has been awarded to 56 scholars of the most diverse nationalities, among them the Brazilian laureate Artur Avila, an extraordinary researcher from IMPA, who received the award in 2014 in South Korea.

“This is absolutely phenomenal, both for Caucher and for mathematics at Cambridge,” said Professor Gabriel Paternain, Head of the Department of Pure Mathematics and Mathematical Statistics. “Caucher was already an exceptional young researcher when he came to Cambridge, and he’s now one of the most remarkable people in this field. At Cambridge, we want to give all of our young researchers the opportunity to really explore their field early in their career: it can lead to some truly amazing things.”

The winners of the Fields medal are selected by a group of specialists nominated by the Executive Committee of the International Mathematical Union (IMU), which organises the ICMs. Every four years, between two and four researchers under the age of 40 are chosen. Since 2006, a cash prize of 15,000 Canadian dollars has accompanied the medal.

In an interview with Quanta Magazine, Birkar spoke of the math club at Tehran University, where pictures of Fields medallists lined the walls. “I looked at them and said to myself, ‘Will I ever meet one of these people?’ At that time in Iran, I couldn’t even know that I’d be able to go to the West.

“To go from the point that I didn’t imagine meeting these people to the point where someday I hold a medal myself — I just couldn’t imagine that this would come true.”

Professor Birkar is Cambridge’s 11th Fields medallist.

The other three winners of the 2018 Fields medals are Peter Scholze from the University of Bonn, Akshay Venkatesh from the Institute for Advanced Study and Alessio Figalli from ETH Zurich.



Hewitsons LLP Sponsors 2018 Cambridge Innovation Summit

Hewitsons LLP is delighted to have sponsored the third annual Cambridge Innovation Summit which took place on 11 July.

The Cambridge Innovation Summit brought together over 100 leading thinkers about innovation processes from businesses and other organisations in the USA, Russia, Japan, Australia and across Europe. Delegates from all of the consortia run by the Cambridge-based Centre for Business Innovation (CfBI) and invited guests worked together on challenges and opportunities which they are experiencing in the innovation landscape.

There were sessions on ‘Global Innovation Processes’ (with discussion leaders from Apple, Facebook, 3M and Lloyds Bank), ‘Innovation for Health and Wellness’ (with discussion leaders from Amadeus Partners, the NHS, Astra Zeneca, Philips and Illumina) and ‘Innovation Made in Cambridge’ (with Professor Tim Minshall leading the discussion), and table-top demonstrations from a number of Cambridge-based innovators.

The world-class networking opportunity culminated in a dinner at Trinity Hall, addressed by Jane Osbourn of MedImmune on the theme of ‘Why Cambridge for Innovation?’

The following day CfBI consortia for ‘Open Innovation Meets Big Data’, ‘Medical Adherence/Digital Health’, ‘Nano-Carbon Enhanced Materials’ and ‘Corporate Venturing Leadership’ met in private sessions in the Cambridge region.

Andrew Priest, Head of Technology at Hewitsons, says “It was a privilege to once again sponsor and take part in the Cambridge Innovation Summit. The event offered a great insight into why Cambridge continues to attract so many technology businesses and entrepreneurial individuals, offering innovative solutions across many different technology sectors. On the evidence of what we have seen and heard today, Cambridge will continue to be a ‘hotbed’ for technology innovation for many years to come”.

Peter Hewkin CEO of the Centre for Business Innovation says “We are delighted to bring together the members of CfBI’s eight international consortia in Cambridge (UK) each year to inspire, inform and engage them in one of the world’s best innovation clusters”.

Planning has already started for Cambridge Innovation Summit 2019.

Editors’ notes:

About Hewitsons
Hewitsons is a leading law firm that delivers services with an absolute focus on the interests of each of its clients. It advises businesses, individuals and institutions including charities, educational bodies and the public sector. Hewitsons operates UK-wide and internationally, delivering a broad range of high-quality specialist legal services. Its ability to ensure its clients obtain top-quality advice in other jurisdictions is greatly enhanced by its international network, LawExchange International. The size and depth of its resources mean its clients know that Hewitsons will consistently add value and achieve the very best results for them.

Hewitsons has offices in Cambridge, London, Northampton and Milton Keynes. Our website can be found at www.hewitsons.com

About CfBI
CfBI is a new type of service organisation, headquartered in Cambridge, UK. It creates and facilitates communities whose participants work together in a trusting environment, towards common goals or under a brand which reflects their values and beliefs, in order to ‘do more with less’ in the spirit of open innovation. CfBI’s team has been refining the formula for over fifteen years to help members all over the world get optimum benefit.

CfBI continues to expand its portfolio of consortia delivering “collaborative advantage” across Europe, the USA and beyond. Leading companies, government departments, research institutes and industry clusters participate to benefit from accelerated learning, project cost sharing, influence with regulators, the design and promotion of best practice, and training, as well as business development.

Read more at www.cfbi.com

The annual Cambridge Innovation Summit is where all of the CfBI’s consortium members come together with invited guests to discuss and help to shape the innovation landscape.

Military Spending Did Not “Crowd Out” Welfare in Middle East Prior to Arab Spring


source: www.cam.ac.uk

Findings dispute “guns versus butter” narrative as a major factor behind the Arab Spring. Researchers caution against uncritically applying lessons from Western nations to interpret public policy decisions in the Middle East.

Policy analysts should not single out military spending as a main culprit for the lack of investment in public goods

Adam Coutts

Research casts doubt on the widely-held view that spiralling military expenditure across the Middle East and North Africa (MENA) “crowded out” investment in healthcare and public services, leading to civil unrest that eventually exploded in the Arab Spring revolutions.

The so-called “guns versus butter” or “welfare versus warfare” hypothesis – that prioritising military spending resulted in the neglect of health and education, thereby creating conditions that fomented public rebellion – is considered by many experts to be a root cause of the uprisings that gripped the region during 2011.

However, a team of researchers who analysed economic and security data from MENA nations in the 16 years leading up to the Arab Spring found no evidence of a trade-off between spending on the military and public services, specifically healthcare.

The researchers, from Cambridge and the Lebanese American University, argue that much of the evidence for the ‘guns versus butter’ causal link comes from analyses of wealthy European nations, which have then been assumed to hold true for the Middle East.

They say the study’s findings, published today in the journal Defence and Peace Economics, provide a “cautionary note” against a reliance on simplistic correlations based on data from OECD nations to draw important policy conclusions about the causes of turmoil in the Middle East.

“Our research finds reports of this apparent spending trade-off prior to the Arab Spring to be somewhat spurious,” said Dr Adam Coutts, based at Cambridge University’s Department of Sociology.

“Academics and policy-makers should be careful in assuming that models and results from studies of other regions can be transplanted onto the Middle East and North Africa,” he said.

“Determining the cause of unrest is a rather more complex task than some experts may suggest. Historical experiences and political economy factors need to be considered.”

While only Saudi Arabia is among the top ten nations globally for military spending in absolute terms, six of the top ten military spenders calculated as a share of GDP are MENA nations.

Coutts and colleagues ran World Bank data through detailed statistical models to explore the trade-off between military and welfare spending – health, in this case – across 18 MENA nations from 1995 up to the start of the Arab Spring in 2011.

The team also looked at casualties resulting from domestic terror attacks in an attempt to estimate security needs that might have helped drive military spending in a region plagued by terrorism.

They found no statistically significant evidence that increased military spending had an impact on health investment. “Contrary to existing evidence from many European nations, we found that levels of military expenditure do not induce or affect cuts to healthcare in the Middle East and North Africa,” said co-author Dr Adel Daoud from Cambridge’s Centre for Business Research.
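A minimal sketch of the kind of lagged-regression check that underlies such a finding, using simulated data rather than the World Bank series the authors analysed: if last year’s military spending carries no information about this year’s health spending, its regression coefficient should be indistinguishable from zero.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a lagged-regression ("Granger-style") check: does last
# year's military spending help predict this year's health spending?
# These series are simulated and independent by construction.
years = 200
military = rng.normal(size=years)
health = rng.normal(size=years)

# Regress health_t on a constant, health_{t-1} and military_{t-1}.
y = health[1:]
X = np.column_stack([np.ones(years - 1), health[:-1], military[:-1]])
coefs, *_ = np.linalg.lstsq(X, y, rcond=None)
print(f"coefficient on lagged military spending: {coefs[2]:+.3f}")
```

With independent series the lagged-military coefficient hovers near zero, which is the pattern the study reports for the real MENA data; the published analysis, of course, also has to handle country effects, trends and confounders that this toy omits.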

The researchers also found no evidence for casualties from terrorism affecting either health or military spending – perhaps a result of the routine nature of such occurrences in the region.

“There may have been a policy adaptation in which regional conflicts and security threats are no longer the main influence on government security and military spending decisions,” said Daoud.

Adam Coutts added: “It has been argued that Arab populations accepted an ‘authoritarian bargain’ over the last forty years – one of societal militarisation in return for domestic security – and that this came at the expense of their welfare and social mobility.

“However, health and military spending cannot be predicted by each other in this troubled region. Policy analysts should not single out military spending as a main culprit for the lack of investment in public goods.

“Once again we find that straightforward explanations for unrest in the Middle East and North Africa are tenuous on close analysis.”

