High Ozone Levels In Tropical Pacific Caused By Fires Burning In Africa and Asia

Source: www.cam.ac.uk

Study indicates ‘biomass burning’ may play a larger role in climate change than previously realised.

The measurements are now starting to produce insight into how the composition of the remote tropical atmosphere is affected by human activities occurring nearly halfway around the world.

Neil Harris

While efforts to limit emissions of greenhouse gases, including ozone, tend to focus on industrial activities and the burning of fossil fuels, a new study suggests that future regulations may need to address the burning of forests and vegetation. The study, published in the journal Nature Communications, indicates that ‘biomass burning’ may play a larger role in climate change than previously realised.

Based on observations from two aircraft missions, satellite data and a variety of models, an international research team showed that fires burning in tropical Africa and Southeast Asia caused pockets of high ozone and low water in the lower atmosphere above Guam – a remote island in the Pacific Ocean 1,700 miles east of Taiwan.

“We were very surprised to find high concentrations of ozone and chemicals that we know are only emitted by fires in the air around Guam,” said the study’s lead author Daniel Anderson, a graduate student at the University of Maryland. “We didn’t make specific flights to target high-ozone areas – they were so omnipresent that no matter where we flew, we found them.”

For the study, two research planes on complementary missions flew over Guam measuring the levels of dozens of chemicals in the atmosphere in January and February 2014. One aircraft flew up to 24,000 feet above the ocean surface during the UK Natural Environment Research Council’s Coordinated Airborne Studies in the Tropics (CAST) mission. The other flew up to 48,000 feet above the ocean surface during the CONvective Transport of Active Species in the Tropics (CONTRAST) mission.

“International collaboration is essential for studying global environmental issues these days,” said CAST Principal Investigator Neil Harris, of Cambridge’s Department of Chemistry. “This US/UK-led campaign over the western Pacific was the first of its kind in this region and collected a unique data set. The measurements are now starting to produce insight into how the composition of the remote tropical atmosphere is affected by human activities occurring nearly halfway around the world.”

Researchers examined 17 CAST and 11 CONTRAST flights and compiled over 3,000 samples from high-ozone, low-water air parcels for the study. In the samples, the team detected high concentrations of chemicals associated with biomass burning—hydrogen cyanide, acetonitrile, benzene and ethyne.

“Hydrogen cyanide and acetonitrile were the smoking guns because they are emitted almost exclusively by biomass burning. High levels of the other chemicals simply added further weight to the findings,” said study co-author Julie Nicely, a graduate student from the University of Maryland.

Next, the researchers traced the polluted air parcels backward 10 days, using the National Oceanic and Atmospheric Administration (NOAA) Hybrid Single-Particle Lagrangian Integrated Trajectory (HYSPLIT) model and precipitation data, to determine where they came from. Overlaying fire data from NASA’s Moderate Resolution Imaging Spectroradiometer (MODIS) on board the Terra satellite, the researchers connected nearly all of the high-ozone, low-water structures to regions with active biomass burning in tropical Africa and Southeast Asia.
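The core of this attribution step is simple to sketch in code. Below is a minimal illustration, assuming hypothetical inputs: two comma-separated files of (latitude, longitude) rows, one holding a 10-day back-trajectory (for example, reformatted HYSPLIT endpoint output) and one holding active-fire detections (for example, from a MODIS fire product). The file names, column layout and 100 km proximity threshold are illustrative assumptions, not details of the study’s actual pipeline.

import numpy as np

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    # Great-circle distance in kilometres; inputs in decimal degrees.
    lat1, lon1, lat2, lon2 = map(np.radians, (lat1, lon1, lat2, lon2))
    a = (np.sin((lat2 - lat1) / 2) ** 2
         + np.cos(lat1) * np.cos(lat2) * np.sin((lon2 - lon1) / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * np.arcsin(np.sqrt(a))

def passes_near_fire(traj_points, fire_points, threshold_km=100.0):
    # True if any trajectory point comes within threshold_km of any fire pixel.
    for lat, lon in traj_points:
        distances = haversine_km(lat, lon, fire_points[:, 0], fire_points[:, 1])
        if distances.min() < threshold_km:
            return True
    return False

trajectory = np.loadtxt("trajectory.csv", delimiter=",", skiprows=1)  # hypothetical file
fires = np.loadtxt("fires.csv", delimiter=",", skiprows=1)            # hypothetical file
print("Air parcel passed near active burning:", passes_near_fire(trajectory, fires))

In the study itself, this kind of spatial matching sits alongside precipitation data and chemical tracers, so a proximity test like this would be only one ingredient of the attribution.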

“The investigation utilised a variety of models, including the NCAR CAM-Chem model to forecast and later analyse chemical and dynamical conditions near Guam, as well as satellite data from numerous instruments that augmented the interpretation of the aircraft observations,” said study co-author Douglas Kinnison, a project scientist at the University Corporation for Atmospheric Research.

In the paper, the researchers also offer a new explanation for the dry nature of the polluted air parcels.

“Our results challenge the explanation atmospheric scientists commonly offer for pockets of high ozone and low water: that these zones result from the air having descended from the stratosphere, where air is colder and drier than elsewhere,” said University of Maryland Professor Ross Salawitch, the study’s senior author and principal investigator of CONTRAST.

“We know that the polluted air did not mix with air in the stratosphere to dry out because we found combined elevated levels of carbon monoxide, nitric oxide and ozone in our air samples, but air in the higher stratosphere does not contain much naturally occurring carbon monoxide,” said Anderson.

The researchers found that the polluted air that reached Guam never entered the stratosphere and instead simply dried out during its descent within the lower atmosphere. While textbooks show air moving upward in the tropics, according to Salawitch, this represents the net motion of air. Because this upward motion happens mostly within small storm systems, it must be balanced by air slowly descending, such as with these polluted parcels released from fires.

Based on the results of this study, global climate models may need to be reassessed to include and correctly represent the impacts of biomass burning, deforestation and reforestation, according to Salawitch. Also, future studies such as NASA’s upcoming Atmospheric Tomography Mission will add to the data collected by CAST and CONTRAST to help obtain a clearer picture of our changing environment.

In addition to those mentioned above, the study’s authors included UMD Department of Atmospheric and Oceanic Science Professor Russell Dickerson and Assistant Research Professor Timothy Canty; CAST co-principal investigator James Lee of the University of York; CONTRAST co-principal investigator Elliott Atlas of the University of Miami; and additional researchers from NASA; NOAA; the University of California, Irvine; the California Institute of Technology; the University of Manchester; the Institute of Physical Chemistry Rocasolano; and the National Research Council in Argentina.

This research was supported by the Natural Environment Research Council, National Science Foundation, NASA, and National Oceanic and Atmospheric Administration.

Reference:
Daniel C. Anderson et al. ‘A pervasive role for biomass burning in tropical high ozone/low water structures.’ Nature Communications (2016). DOI: 10.1038/ncomms10267

Inset image: Air Tracking. Credit: Daniel Anderson

Adapted from a University of Maryland press release. 


The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


Cocaine Addiction: Scientists Discover ‘Back Door’ Into The Brain

Source: www.cam.ac.uk

Individuals addicted to cocaine may have difficulty in controlling their addiction because of a previously-unknown ‘back door’ into the brain, circumventing their self-control, suggests a new study led by the University of Cambridge.

Most people who use cocaine do so initially in search of a hedonic ‘high’. In some individuals, though, frequent use leads to addiction, where use of the drug is no longer voluntary, but ultimately becomes a compulsion

David Belin

A second study from the team suggests that a drug used to treat paracetamol overdose may be able to help individuals who want to break their addiction and stop their damaging cocaine seeking habits.

Although both studies were carried out in rats, the researchers believe the findings will be relevant to humans.

Cocaine is a stimulant drug that can lead to addiction when taken repeatedly. Quitting can be extremely difficult for some people: around four in ten individuals who relapse report having experienced a craving for the drug – however, this means that six out of ten people have relapsed for reasons other than ‘needing’ the drug.

“Most people who use cocaine do so initially in search of a hedonic ‘high’,” explains Dr David Belin from the Department of Pharmacology at the University of Cambridge. “In some individuals, though, frequent use leads to addiction, where use of the drug is no longer voluntary, but ultimately becomes a compulsion. We wanted to understand why this should be the case.”

Drug-taking causes a release in the brain of the chemical dopamine, which helps provide the ‘high’ experienced by the user. Initially the drug taking is volitional – in other words, it is the individual’s choice to take the drug – but over time, this becomes habitual, beyond their control.

Previous research by Professor Barry Everitt from the Department of Psychology at Cambridge showed that when rats were allowed to self-administer cocaine, dopamine-related activity occurred initially in an area of the brain known as the nucleus accumbens, which plays a significant role in driving ‘goal-directed’ behaviour, as the rats sought out the drug. However, if the rats were given cocaine over an extended period, this activity transferred to the dorsolateral striatum, which plays an important role in habitual behaviour, suggesting that the rats were no longer in control, but rather were responding automatically, having developed a drug-taking habit.

The brain mechanisms underlying the balance between goal-directed and habitual behaviour involve the prefrontal cortex, the brain region that orchestrates our behaviour. It was previously thought that this region was overwhelmed by stimuli associated with the drugs, or with the craving experienced during withdrawal; however, this does not easily explain why the majority of individuals relapsing to drug use do not experience any craving.

Chronic exposure to drugs alters the prefrontal cortex, but it also alters an area of the brain called the basolateral amygdala, which is associated with the link between a stimulus and an emotion. The basolateral amygdala stores the pleasurable memories associated with cocaine, but the prefrontal cortex manipulates this information, helping an individual to weigh up whether or not to take the drug: if an addicted individual takes the drug, this activates mechanisms in the dorsal striatum.

However, in a study published today in the journal Nature Communications, Dr Belin and Professor Everitt studied the brains of rats addicted to cocaine through self-administration of the drug and identified a previously unknown pathway within the brain that links impulse with habits.

The pathway links the basolateral amygdala indirectly with the dorsolateral striatum, circumventing the prefrontal cortex. This means that an addicted individual would not necessarily be aware of their desire to take the drug.

“We’ve always assumed that addiction occurs through a failure of our self-control, but now we know this is not necessarily the case,” explains Dr Belin. “We’ve found a back door directly to habitual behaviour.

“Drug addiction is mainly viewed as a psychiatric disorder, with treatments such as cognitive behavioural therapy focused on restoring the ability of the prefrontal cortex to control the otherwise maladaptive drug use. But we’ve shown that the prefrontal cortex is not always aware of what is happening, suggesting these treatments may not always be effective.”

In a second study, published in the journal Biological Psychiatry, Dr Belin and colleagues showed that a drug used to treat paracetamol overdose may be able to help individuals addicted to cocaine overcome their addiction – provided the individual wants to quit.

The drug, N-acetylcysteine, had previously been shown in rat studies to prevent relapse. However, the drug later failed human clinical trials, though analysis suggested that while it did not lead addicted individuals to stop using cocaine, it did help those who were already trying to abstain to refrain from taking the drug.

Dr Belin and colleagues used an experiment in which rats compulsively self-administered cocaine. They found that rats given N-acetylcysteine lost the motivation to self-administer cocaine more quickly than rats given a placebo. In fact, when they had stopped working for cocaine, they tended to relapse at a lower rate. N-acetylcysteine also increased the activity in the brain of a particular gene associated with plasticity – the ability of the brain to adapt and learn new skills.

“A hallmark of addiction is that the user continues to take the drug even in the face of negative consequences – such as on their health, their family and friends, their job, and so on,” says co-author Mickael Puaud from the Department of Pharmacology of the University of Cambridge. “Our study suggests that N-acetylcysteine, a drug that we know is well tolerated and safe, may help individuals who want to quit to do so.”

Reference
Murray, JE et al. Basolateral and central amygdala differentially recruit and maintain dorsolateral striatum-dependent cocaine-seeking habits. Nature Communications; 16 December 2015

Ducret, E et al. N-acetylcysteine facilitates self-imposed abstinence after escalation of cocaine intake. Biological Psychiatry; 7 Oct 2015


The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


Melting of Massive Ice ‘Lid’ Resulted In Huge Release of CO2 At The End of The Ice Age

Source: www.cam.ac.uk

A new study of how the structure of the ocean has changed since the end of the last ice age suggests that the melting of a vast ‘lid’ of sea ice caused the release of huge amounts of carbon dioxide into the atmosphere.

Although conditions at the end of the last ice age were very different to today, this study highlights the importance that dynamic features such as sea ice have on regulating the climate system.

Jenny Roberts

A new study reconstructing conditions at the end of the last ice age suggests that as the Antarctic sea ice melted, massive amounts of carbon dioxide that had been trapped in the ocean were released into the atmosphere.

The study includes the first detailed reconstruction of the Southern Ocean density of the period and identifies how it changed as the Earth warmed. It suggests a massive reorganisation of ocean temperature and salinity, but finds that this was not the driver of increased concentration of carbon dioxide in the atmosphere. The study, led by researchers from the University of Cambridge, is published in the journal Proceedings of the National Academy of Sciences.

The ocean is made up of different layers of varying densities and chemical compositions. During the last ice age, it was thought that the deepest part of the ocean was made up of very salty, dense water, which was capable of trapping a lot of CO2. Scientists believed that a decrease in the density of this deep water resulted in the release of CO2 from the deep ocean to the atmosphere.

However, the new findings suggest that although a decrease in the density of the deep ocean did occur, it happened much later than the rise in atmospheric CO2, suggesting that other mechanisms must be responsible for the release of CO2 from the oceans at the end of the last ice age.

“We set out to test the idea that a decrease in ocean density resulted in a rise in CO2 by reconstructing how it changed across time periods when the Earth was warming,” said the paper’s lead author Jenny Roberts, a PhD student in Cambridge’s Department of Earth Sciences who is also a member of the British Antarctic Survey. “However what we found was not what we were expecting to see.”

In order to determine how the oceans have changed over time and to identify what might have caused the massive release of CO2, the researchers studied the chemical composition of microscopic shelled animals that have been buried deep in ocean sediment since the end of the ice age. Like layers of snow, the shells of these tiny animals, known as foraminifera, contain clues about what the ocean was like while they were alive, allowing the researchers to reconstruct how the ocean changed as the ice age was ending.

They found that during the cold glacial periods, the deepest water was significantly denser than it is today. However, what was unexpected was the timing of the reduction in the deep ocean density, which happened some 5,000 years after the initial increase in CO2, meaning that the density decrease couldn’t be responsible for releasing CO2 to the atmosphere.

“Before this study there were these two observations, the first was that glacial deep water was really salty and dense, and the second that it also contained a lot of CO2, and the community put two and two together and said these two observations must be linked,” said Roberts. “But it was only through doing our study, and looking at the change in both density and CO2 across the deglaciation, that we found they actually weren’t linked. This surprised us all.”

Through examination of the shells, the researchers found that changes in CO2 and density are not nearly as tightly linked as previously thought, suggesting something else must be causing CO2 to be released from the ocean.

Like a bottle of wine with a cork, sea ice can prevent CO2-rich water from releasing its CO2 to the atmosphere. The Southern Ocean is a key area of exchange of CO2 between the ocean and atmosphere. The expansion of sea ice during the last ice age acted as a ‘lid’ on the Southern Ocean, preventing CO2 from escaping. The researchers suggest that the retreat of this sea ice lid at the end of the last ice age uncorked this ‘vintage’ CO2, resulting in an increase in carbon dioxide in the atmosphere.

“Although conditions at the end of the last ice age were very different to today, this study highlights the importance that dynamic features such as sea ice have on regulating the climate system, and emphasises the need for improved understanding and prediction as we head into our ever warming world,” said Roberts.

Reference:
Roberts, J. et al. ‘Evolution of South Atlantic density and chemical stratification across the last deglaciation.’ PNAS (2016). DOI: 10.1073/pnas.1511252113

 


The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


Banning Trophy Hunting Could Do More Harm Than Good

Source: www.cam.ac.uk

Trophy hunting shouldn’t be banned, but instead it should be better regulated to ensure funds generated from permits are invested back into local conservation efforts, according to new research.

There are many concerns about trophy hunting beyond the ethical that currently limit its effectiveness as a conservation tool.

Nigel Leader-Williams

Banning trophy hunting would do more harm than good in African countries that have little money to invest in critical conservation initiatives, argue researchers from the Universities of Cambridge, Adelaide and Helsinki. Trophy hunting can be an important conservation tool, provided it can be done in a controlled manner to benefit biodiversity conservation and local people. Where political and governance structures are adequate, trophy hunting can help address the ongoing loss of species.

The researchers have developed a list of 12 guidelines that could address some of the concerns about trophy hunting and enhance its contribution to biodiversity conservation. Their paper is published in the journal Trends in Ecology & Evolution.

“The story of Cecil the lion, who was killed by an American dentist in July 2015, shocked people all over the world and reignited debates surrounding trophy hunting,” said Professor Corey Bradshaw of the University of Adelaide, the paper’s senior author.

“Understandably, many people oppose trophy hunting and believe it is contributing to the ongoing loss of species; however, we contend that banning the US$217 million per year industry in Africa could end up being worse for species conservation,” he said.

Professor Bradshaw says trophy hunting brings in more money and can be less disruptive than ecotourism. While the majority of animals hunted in sub-Saharan Africa are more common and less valuable species, the majority of hunting revenue comes from a few valuable species, particularly the charismatic ‘Big Five’: lion, leopard, elephant, buffalo and black or white rhinoceros.

“Conserving biodiversity can be expensive, so generating money is essential for environmental non-government organisations, conservation-minded individuals, government agencies and scientists,” said co-author Dr Enrico Di Minin from the University of Helsinki.

“Financial resources for conservation, particularly in developing countries, are limited,” he said. “As such, consumptive (including trophy hunting) and non-consumptive (ecotourism safaris) uses are both needed to generate funding. Without these, many natural habitats would be converted to agricultural or pastoral uses.

“Trophy hunting can also have a smaller carbon and infrastructure footprint than ecotourism, and it generates higher revenue from a lower number of uses.”

However, co-author Professor Nigel Leader-Williams from Cambridge’s Department of Geography said there is a need for the industry to be better regulated.

“There are many concerns about trophy hunting beyond the ethical that currently limit its effectiveness as a conservation tool,” he said. “One of the biggest problems is that the revenue it generates often goes to the private sector and rarely benefits protected-area management and the local communities. However, if this money was better managed, it would provide much needed funds for conservation.”

The authors’ guidelines to make trophy hunting more effective for conservation are:

  1. Mandatory levies should be imposed on safari operators by governments so that they can be invested directly into trust funds for conservation and management;
  2. Eco-labelling certification schemes could be adopted for trophies coming from areas that contribute to broader biodiversity conservation and respect animal welfare concerns;
  3. Mandatory population viability analyses should be done to ensure that harvests cause no net population declines;
  4. Post-hunt sales of any part of the animals should be banned to avoid illegal wildlife trade;
  5. Priority should be given to fund trophy hunting enterprises run (or leased) by local communities;
  6. Trusts to facilitate equitable benefit sharing within local communities and promote long-term economic sustainability should be created;
  7. Mandatory scientific sampling of hunted animals, including tissue for genetic analyses and teeth for age analysis, should be enforced;
  8. Mandatory 5-year (or more frequent) reviews of all individuals hunted and detailed population management plans should be submitted to government legislators to extend permits;
  9. There should be full disclosure to public of all data collected (including levied amounts);
  10. Independent government observers should be placed randomly and without forewarning on safari hunts as they happen;
  11. Trophies should be confiscated and permits revoked when illegal practices are disclosed; and
  12. Backup professional shooters and trackers should be present for all hunts to minimise welfare concerns.

Reference:
E. Di Minin et al. ‘Banning Trophy Hunting Will Exacerbate Biodiversity Loss.’ Trends in Ecology & Evolution (2015). DOI: 10.1016/j.tree.2015.12.006

Adapted from a University of Adelaide press release.


The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


Global Learning Is Needed To Save Carbon Capture and Storage From Being Abandoned

Source: www.cam.ac.uk

Governments should not be abandoning carbon capture and storage, argues a Cambridge researcher, as it is the only realistic way of dramatically reducing carbon emissions. Instead, they should be investing in global approaches to learn what works – and what doesn’t.

If we’re serious about meeting aggressive national or global emissions targets, the only way to do it affordably is with carbon capture and storage

David Reiner

Carbon capture and storage, which is considered by many experts as the only realistic way to dramatically reduce carbon emissions in an affordable way, has fallen out of favour with private and public sector funders. Corporations and governments worldwide, including most recently the UK, are abandoning the same technology they championed just a few years ago.

In a commentary published today (11 January) in the inaugural issue of the journal Nature Energy, a University of Cambridge researcher argues that now is not the time for governments to drop carbon capture and storage (CCS). As with many new technologies, it is only possible to learn what works and what doesn’t by building and testing demonstration projects at scale; by giving up on CCS instead of working together to develop a global ‘portfolio’ of projects, countries are turning their backs on a key part of a low-carbon future.

CCS works by separating the carbon dioxide emitted by coal and gas power plants, transporting it and then storing it underground so that the CO2 cannot escape into the atmosphere. Critically, CCS can also be used in industrial processes, such as chemical, steel or cement plants, and is often the only feasible way of reducing emissions at these facilities. While renewable forms of energy, such as solar or wind, are important to reducing emissions, until there are dramatic advances in battery technology, CCS will be essential to deliver flexible power and to build green industrial clusters.

“If we’re serious about meeting aggressive national or global emissions targets, the only way to do it affordably is with CCS,” said Dr David Reiner of Cambridge Judge Business School, the paper’s author. “But since 2008, we’ve seen a decline in interest in CCS, which has essentially been in lock step with our declining interest in doing anything serious about climate change.”

Just days before last year’s UN climate summit in Paris, the UK government cancelled a four-year, £1 billion competition to support large-scale CCS demonstration projects. And since the financial crisis of 2008, projects in the US, Canada, Australia, Europe and elsewhere have been cancelled, although the first few large-scale integrated projects have recently begun operation. The Intergovernmental Panel on Climate Change (IPCC) says that without CCS, the costs associated with slowing global warming will double.

According to Reiner, there are several reasons that CCS seems to have fallen out of favour with both private and public sector funders. The first is cost – a single CCS demonstration plant costs in the range of $1 billion. Unlike solar or wind, which can be demonstrated at a much smaller scale, CCS can only be demonstrated at a large scale, driven by the size of commercial-scale power plants and the need to characterise the geological formations which will store the CO2.

“Scaling up any new technology is difficult, but it’s that much harder if you’re working in billion-dollar chunks,” said Reiner. “At 10 or even 100 million dollars, you will be able to find ways to fund the research & development. But being really serious about demonstrating CCS and making it work means allocating very large sums at a time when national budgets are still under stress after the global financial crisis.”

Another reason is commercial pressures and timescales. “The nature of demonstration is that you work out the kinks – you find out what works and what doesn’t, and you learn from it,” said Reiner. “It’s what’s done in science or in research and development all the time: you expect that nine of ten ideas won’t work, that nine of ten oil wells you drill won’t turn up anything, that nine of ten new drug candidates will fail. Whereas firms can make ample returns on a major oil discovery or a blockbuster drug to make up for the many failures along the way, that is clearly not the case for CCS, so the answer is almost certainly government funding or mandates.

“The scale of CCS and the fact that it’s at the demonstration rather than the research and development phase also means that you don’t get to play around with the technology as such – you’re essentially at the stage where, to use a gambling analogy, you’re putting all your money on red 32 or black 29. And when a certain approach turns out to be more expensive than expected, it’s easy for nay-sayers to dismiss the whole technology, rather than to consider how to learn from that failure and move forward.”

There is also the issue that before 2008 countries thought they would each be developing their own portfolios of projects and so they focused inward, rather than working together to develop a global portfolio of large-scale CCS demonstrations. In the rush to fund CCS projects between 2005 and 2009, countries assembled projects independently, and now only a handful of those projects remain.

According to Reiner, building a global portfolio, where countries learn from each other’s projects, will assist in learning through diversity and replication, ‘de-risking’ the technology and determining whether it ever emerges from the demonstration phase.

“If we’re not going to get CCS to happen, it’s hard to imagine getting the dramatic emissions reductions we need to limit global warming to two degrees – or three degrees, for that matter,” he said. “However, there’s an inherent tension in developing CCS – it is not a single technology, but a whole suite and if there are six CCS paths we can go down, it’s almost impossible to know sitting where we are now which is the right path. Somewhat ironically, we have to be willing to invest in these high-cost gambles or we will never be able to deliver an affordable, low-carbon energy system.”

Reference:
David M. Reiner. ‘Learning through a portfolio of carbon capture and storage demonstration projects.’ Nature Energy (2016). DOI: 10.1038/nenergy.2015.11

 


The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


New £369m Contract to Service RAF Hercules in Cambridge

Inset image: Hercules transport plane. Credit: AFP/Getty. Caption: RAF Hercules aircraft are used to carry troops, supplies and equipment in support of military operations around the world.

A £369m contract to maintain the RAF’s Hercules fleet has been awarded to Cambridge-based Marshall Aerospace, helping to secure 1,200 jobs.

The agreement will mean continued support for the C-130Js until 2022, the Ministry of Defence (MoD) said.

Marshall, which was founded in 1909, are specialists in servicing transport planes.

RAF Hercules aircraft are used to carry troops, supplies and equipment in support of military operations.

Inset image: Hercules transport plane. Credit: PA.

The MoD describes the Hercules as “one of RAF’s workhorses” and a “vital part of its transport fleet”.

As part of the six-year contract, work will also be undertaken by Lockheed Martin and its sub-contractors at sites in Havant, Stansted and Cheltenham.

Defence Secretary Michael Fallon said: “It (the contract) will secure around 1,200 skilled jobs and ensure our essential RAF transport aircraft are prepared for operations for years to come.”

The contract will focus on servicing the 24 RAF C-130J-type Hercules planes following the retirement of the C-130K planes.

Source: http://www.bbc.co.uk/

Bango Expands Collaboration with Microsoft to Put Carrier Billed Payments on Windows 10 Devices

Successful relationship now scales up to include PC, tablet and smartphone.

Bango offers our operator partners a sophisticated platform for launching, managing and growing carrier billing business in the Windows Store.

Bango (AIM: BGO), the mobile payments company, has expanded its agreement and integration with Microsoft Corp. to deliver carrier-billed payments across Windows 10 devices. As a result, charging payments to users’ phone bills will become available to Windows Store customers.

With more than 110 million devices running Windows 10, consumers are using the Windows Store to acquire a vast range of applications and entertainment content that can be used across devices. For the first time, customers running Windows 10 on a PC or tablet will now be able to buy digital content by charging the costs to their mobile phone bill.

Bango is working in collaboration with Microsoft and mobile operator partners globally to ensure maximum coverage for this payment method. Specific operators will be announced as they launch, starting in January. Operators will benefit from unique Bango Boost technology, which analyzes and benchmarks a wide range of KPIs to grow payment success rates from carrier billing, in some instances by over 70%.

App stores are seeing the emergence of carrier billing as a vital enabler of mobile commerce globally, generating a 3–10x increase in conversion rates. In September 2015, Progressive Research reported approximately 280 carrier billing routes live with app stores, with over 40% running through the Bango Platform.

Commenting on the announcement, Bango CEO Ray Anderson said:

“We have enjoyed our collaboration with Microsoft for the Windows Store, so it is a major milestone that Microsoft is now adopting this payment method across Windows 10. Operators using the Bango Payment Platform will get to market quickly and can then use Bango technology to maximize their Windows 10 revenue.”

Todd Brix, Windows Store General Manager said:
“This addition to Windows 10 presents a new opportunity for app and game developers to reach millions of unbanked or under-banked consumers by enabling them to easily bill content to their existing mobile operator accounts. Bango offers our operator partners a sophisticated platform for launching, managing and growing carrier billing business in the Windows Store.”

About Bango
Bango’s mobile payment platform is vital to the global growth in digital content sales. The giants of mobile choose the Bango Payment Platform to provide a delightful and immediate payment experience that maximizes sales of digital content.

With over 140 markets activated by our partners, the Bango Payment Platform is established as the global standard for app stores to offer carrier billing. As the next billion consumers pick up their first smartphone, Bango technology will be there to unlock the universe of apps, video, games and other content that bring those smartphones to life. Global leaders plugging into Bango include Amazon, BlackBerry, Facebook, Google, Samsung, Microsoft and Mozilla.

Source: http://uk.prweb.com/

Opinion: Paying People To Stay Away Is Not Always The Best Way To Protect Watersheds

Source: www.cam.ac.uk

Libby Blanchard and Bhaskar Vira from Cambridge’s Department of Geography argue that we need to consider alternative approaches in order to protect watersheds.

The successful management of the Wasatch demonstrates that an overreliance on markets to deliver watershed protection might be misguided.

In the American West, unprecedented droughts have caused extreme water shortages. The current drought in California and across the West is entering its fourth year, with precipitation and water storage reaching record low levels.

Such drought and water scarcity are only likely to increase with climate change, and the chances of a “megadrought” – one that lasts 35 years or longer – affecting the Southwest and central Great Plains by 2100 are above 80% if greenhouse gas emissions are not reduced.

Droughts are currently ranked second in the US in terms of national weather-related damages, with annual losses just shy of US$9 billion. Such economic impacts are likely to worsen as the century progresses.

As the frequency and severity of droughts increase, the successful protection of watersheds to capture, store and deliver water downstream in catchments will become increasingly important, even as the effective protection of watersheds becomes more challenging.

Since the early 2000s, the prevailing view in watershed protection has been that paying upstream resource users to avoid harmful activities, or rewarding positive action, is the most effective and direct method. This is the approach taken in the Catskills watershed in New York, where environmentally sound economic development is incentivized.

There are, however, many different ways communities can invest in watersheds to harness the benefits they provide to downstream communities.

In a recently published paper in the journal Ecosystem Services, we highlight an alternative option with the example of Salt Lake City’s successful management of the Wasatch watershed. Instead of offering financial incentives for the “ecosystem services” provided by this watershed, planners use regulations to secure the continued delivery of water, while allowing for recreational and public use.

The successful management of the Wasatch demonstrates that an overreliance on markets to deliver watershed protection might be misguided.

Perhaps part of the reason for this overreliance on market-based tools is a paucity of alternative success stories of watershed management. We note that the Wasatch story has been largely absent from much of the literature that discusses the potential of investing in watersheds for the important services that they provide. This absence results in an incomplete understanding of options to secure watershed ecosystem services, and limits the consideration of alternative watershed conservation approaches.

The Wasatch management strategy

The Wasatch is a 185-square-mile watershed that is an important drinking water source to over half a million people in Salt Lake City. This water comes from the annual snowmelt from the 11,000-foot-high peaks in the Wasatch range, which act as Salt Lake City’s virtual reservoir.

Salt Lake City’s management of the Wasatch watershed is somewhat unusual in contemporary examples of watershed protection in that it is focused on nonexclusionary regulation – that is, allowing permitted uses – and zoning to protect the urban water supply. For instance, the cities of Portland, Oregon and Santa Fe, New Mexico have worked with the US Forest Service to prohibit public access to source water watersheds within forests to protect drinking water supplies. In contrast, the governance of the Wasatch allows for public access and both commercial and noncommercial activities to occur in the watershed, such as skiing and mountain biking. It also imposes restrictions on allowable uses, such as restricting dogs in the watershed.

This socially negotiated permitted use helps mitigate the potential trade-offs associated with protection activities.

The suite of policies that protect the Wasatch do not include a “payments for ecosystem services” or other market-based incentives component, nor has there been any discussion of compensating potential resource users in the watershed for foregone economic opportunities. By not having a market-based incentives component, the Wasatch example provides an alternative regulatory-based solution for the protection of natural capital, which contrasts with the now prevalent market-based payments approach.

Importantly, the Wasatch example reinforces the rights of citizens to derive positive benefits from nature, without these being mediated through the mechanism of markets. In most payment-based systems, potential harm to a watershed is avoided by organizing beneficiaries so that they can compensate upstream resource users for foregone activities. In contrast, reliance on regulation and permitted activities supports the ‘polluter pays principle,’ which might be more appropriate in many circumstances.

Why we need alternative strategies

With the American West facing ever-increasing droughts, policymakers will be faced with the increasingly difficult task of protecting and preserving water supplies. Thus, awareness of alternative, successful strategies of watershed protection and management is crucially important.

The Wasatch offers an important example of how natural capital can be instrumentally and economically valued, but conserved via regulatory approaches and land use management and zoning, rather than a reliance on the creation of water markets, which are often misplaced and not suitable. Bringing stakeholders together to negotiate allowable uses that preserve critical watershed functions is an additional option within the policymaker’s toolkit, and one that is at risk of being forgotten in the rush to payment-based systems.

Libby Blanchard, Gates Cambridge Scholar and PhD Candidate, University of Cambridge, and Bhaskar Vira, Reader in Political Economy at the Department of Geography and Fellow of Fitzwilliam College; Director, University of Cambridge Conservation Research Institute, University of Cambridge

This article was originally published on The Conversation. Read the original article.


The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


Second Contagious Form Of Cancer Found In Tasmanian Devils

Source: www.cam.ac.uk

Transmissible cancers – cancers which can spread between individuals by the transfer of living cancer cells – are believed to arise extremely rarely in nature. One of the few known transmissible cancers causes facial tumours in Tasmanian devils, and is threatening this species with extinction. Today, scientists report the discovery of a second transmissible cancer in Tasmanian devils.

Until now, we’ve always thought that transmissible cancers arise extremely rarely in nature, but this new discovery makes us question this belief

Elizabeth Murchison

The discovery, published in the journal Proceedings of the National Academy of Sciences, calls into question our current understanding of the processes that drive cancers to become transmissible.

Tasmanian devils are iconic marsupial carnivores that are only found in the wild on the Australian island state of Tasmania. The size of a small dog, the animals have a reputation for ferocity as they frequently bite each other during mating and feeding interactions.

In 1996, researchers observed Tasmanian devils in the north-east of the island with tumours affecting the face and mouth; soon it was discovered that these tumours were contagious between devils, spread by biting. The cancer spreads rapidly throughout the animal’s body and the disease usually causes the death of affected animals within months of the appearance of symptoms. The cancer has since spread through most of Tasmania and has triggered widespread devil population declines. The species was listed as endangered by the International Union for Conservation of Nature in 2008.

To date, only two other forms of transmissible cancer have been observed in nature: in dogs and in soft-shell clams. Cancer normally occurs when cells in the body start to proliferate uncontrollably; occasionally, cancers can spread and invade the body in a process known as ‘metastasis’; however, cancers do not normally survive beyond the body of the host from whose cells they originally derived. Transmissible cancers, however, arise when cancer cells gain the ability to spread beyond the body of the host that first spawned them, by transmission of cancer cells to new hosts.

Now, a team led by researchers from the University of Tasmania, Australia, and the University of Cambridge, UK, has identified a second, genetically distinct transmissible cancer in Tasmanian devils.

“The second cancer causes tumours on the face that are outwardly indistinguishable from the previously-discovered cancer,” said first author Dr Ruth Pye from the Menzies Institute for Medical Research at the University of Tasmania. “So far it has been detected in eight devils in the south-east of Tasmania.”

“Until now, we’ve always thought that transmissible cancers arise extremely rarely in nature,” says Dr Elizabeth Murchison from the Department of Veterinary Medicine at the University of Cambridge, a senior author on the study, “but this new discovery makes us question this belief.

“Previously, we thought that Tasmanian devils were extremely unlucky to have fallen victim to a single runaway cancer that emerged from one individual devil and spread through the devil population by biting. However, now that we have discovered that this has happened a second time, it makes us wonder if Tasmanian devils might be particularly vulnerable to developing this type of disease, or that transmissible cancers may not be as rare in nature as we previously thought.”

Professor Gregory Woods, joint senior author from the Menzies Institute for Medical Research at the University of Tasmania, adds: “It’s possible that in the Tasmanian wilderness there are more transmissible cancers in Tasmanian devils that have not yet been discovered. The potential for new transmissible cancers to emerge in this species has important implications for Tasmanian devil conservation programmes.”

The discovery of the second transmissible cancer began in 2014, when a devil with facial tumours was found in south-east Tasmania. Although this animal’s tumours were outwardly very similar to those caused by the first-described Tasmanian devil transmissible cancer, the scientists found that this devil’s cancer carried different chromosomal rearrangements and was genetically distinct. Since then, eight additional animals have been found with the new cancer in the same area of south-east Tasmania.

The research was primarily supported by the Wellcome Trust and the Australian Research Council, with additional support provided by Dr Eric Guiler Tasmanian Devil Research Grants and by the Save the Tasmanian Devil Program.

For more information about the research into Tasmanian devils, see T is for Tasmanian Devil.

Reference
Pye, RJ et al. A second transmissible cancer in Tasmanian devils. PNAS; 28 Dec 2015


The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


Opinion: How Frugal Innovation Can Kickstart The Global Economy In 2016

Source: www.cam.ac.uk

Jaideep Prabhu (Cambridge Judge Business School) discusses the frugal innovation revolution that is taking the world by storm.

In late 2015 a Cambridge-based nonprofit released the Raspberry Pi Zero, a tiny £4 computer that was a whole £26 cheaper than the original 2012 model. The Zero is not only remarkable for its own sake – a computer so cheap it comes free with a £5.99 magazine – it is also symptomatic of a larger “frugal innovation” revolution that is taking the world by storm.

With the global economy struggling, this is the kind of innovation that could kickstart it in 2016. Empowered by cheap computers such as the Raspberry Pi and other ubiquitous tools such as smartphones, cloud computing, 3D printers, crowdfunding, and social media, small teams with limited resources are now able to innovate in ways that only large companies and governments could in the past. This frugal innovation – the ability to create faster, better and cheaper solutions using minimal resources – is poised to drive global growth in 2016 and beyond.

More than four billion people around the world, most of them in developing countries, live outside the formal economy and face significant unmet needs when it comes to health, education, energy, food, and financial services. For years this large population was either the target of aid or was left to the mercy of governments.

More recently, large firms and smaller social enterprises have begun to see these four billion as an enormous opportunity to be reached through market-based solutions. These solutions must, however, be frugal – highly affordable and flexible in nature. They typically include previously excluded groups both as consumers and producers. Bringing the next four billion into the formal economy through frugal innovation has already begun to unleash growth and create unprecedented wealth in Asia, Africa and Latin America. But there’s much, much more to come.

Good news

Take the case of telecommunications. Over the last decade or so, highly affordable handsets and cheap calling rates have made mobile phones as commonplace as toothbrushes. In addition to bringing massive productivity gains to farmers and small businesses – not to mention creating new sources of employment – mobile phones also enable companies to roll out financial, healthcare and educational services affordably and at scale.

Take the case of Safaricom, Vodafone’s subsidiary in Kenya. In 2007 the company introduced M-Pesa, a service that enables anyone with a basic, SMS-enabled mobile phone to send and receive money that can be cashed in a corner shop acting as an M-Pesa agent.

This person-to-person transfer of small amounts of money between people who are often outside the banking system has increased financial inclusion in Kenya in a highly affordable and rapid way. So much so that more than 20m Kenyans now use M-Pesa and the volume of transactions on the system is more than US$25 billion, more than half the country’s GDP. M-Pesa (and services like it) have now spread to several other emerging markets in Africa and Asia.

Similar frugal innovations in medical devices, transport, solar lighting and heating, clean cookstoves, cheap pharmaceuticals, sanitation, consumer electronics and so on, have driven growth in Asia and Africa over the past decade and will continue to do so in the decades to come.

Catching on

Meanwhile the developed world is catching up. Declining real incomes and government spending, accompanied by greater concern for the environment, are making Western consumers both value and values conscious.

The rise of two massive movements in recent years, the sharing economy and the maker movement, shows the potential of frugal innovation in the West. The sharing economy, exemplified by Airbnb, BlaBlaCar and Kickstarter, has empowered consumers to trade spare assets with each other and thus generate new sources of income. The maker movement, meanwhile, features proactive consumers who tinker in spaces such as FabLabs, TechShops and MakeSpaces, designing solutions to problems they encounter.

Square, a small, white, square device that fits into the audio jack of a smartphone and uses its computing power and connectivity to make credit card payments, is an example of a product that was developed in a TechShop. Launched in 2010, Square is on track to make US$1 billion in revenue in 2015.

Frugal innovation not only has the power to drive more inclusive growth by tackling poverty and inequality around the world, it is also increasingly the key to growth that will not simultaneously wreck the planet. The big issue at the Paris climate summit was the increasing wedge between the developed and the developing world. On the one hand, the rich countries cannot stop the poor ones from attempting to achieve the West’s levels of prosperity. On the other, however, poor countries cannot grow in the way the West did without wrecking the planet.

The only way to square this circle is to ensure that the growth is sustainable. The need for frugal innovation is therefore all the more vital in areas such as energy generation and use, manufacturing systems that are more local, and a move to a circular economy where companies (and consumers) reduce, reuse and recycle materials in a potentially endless loop.

Never before have so many been able to do so much for so little. Aiding and stimulating this frugal innovation revolution holds the key to driving global growth by employing more people to solve some of the big problems of poverty, inequality and climate change that stalk the planet.

Jaideep Prabhu, Director, Centre for India & Global Business at Cambridge Judge Business School, University of Cambridge

This article was originally published on The Conversation. Read the original article.

The opinions expressed in this article are those of the individual author(s) and do not represent the views of the University of Cambridge.

Inset images: M-Pesa stand (Fiona Bradley); Square readers (Dom Sagolla).


The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


Newton, Darwin, Shakespeare – and a Jar of Ectoplasm: Cambridge University Library at 600

Source: www.cam.ac.uk

In 2016, Cambridge University Library will celebrate 600 years as one of the world’s greatest libraries with a spectacular exhibition of priceless treasures – and a second show throwing light on its more weird and wonderful collections.

For six centuries, the collections of Cambridge University Library have challenged and changed the world around us. Across science, literature and the arts, the millions of books, manuscripts and digital archives we hold have altered the very fabric of our understanding.

Anne Jarvis

Older than the British Library and the Vatican Library, Cambridge University Library was first mentioned by name in two wills dated March 1416, when its most valuable contents were stored in a wooden chest. The library now holds nine million books, journals, maps and magazines – as well as some of the world’s most iconic scientific, literary and cultural treasures.

Its priceless collections include Newton’s own annotated copy of Principia Mathematica, Darwin’s papers on evolution, 3000-year-old Chinese oracle bones, and the earliest reliable text for 20 of Shakespeare’s plays.

But it is also home to a bizarre assembly of non-book curiosities, collected over centuries, including a jar of ectoplasm, a trumpet for hearing spirits and a statue of the Virgin Mary, miraculously saved from an earthquake on Martinique.

Since 1710, Cambridge University Library has also been entitled to one copy of each and every publication in the UK and Ireland under Legal Deposit – meaning the greatest works of more than three millennia of recorded thought sit alongside copies of Woman’s Own and the Beano on more than 100 miles of shelves. With two million of its volumes on open display, readers have the largest open-access collection in Europe immediately available to them.

To celebrate the Library’s 600th birthday, a spectacular free exhibition, Lines of Thought, will open on March 11, 2016. Featuring some of Cambridge’s most iconic and best-known treasures, it investigates through six distinct themes how both Cambridge and its collections have changed the world and will continue to do so in the digital era.

As well as the iconic Newton, Darwin and Shakespeare artefacts mentioned above, items going on display include:

  • Edmund Halley’s handwritten notebook and sketches of Halley’s Comet (1682)
  • Stephen Hawking’s draft typescript of A Brief History of Time
  • Darwin’s first pencil sketch of his species theory and his Primate Tree
  • A second-century AD fragment of Homer’s Odyssey
  • The Nash Papyrus – a 2,000-year-old copy of the Ten Commandments
  • Codex Bezae – a fifth-century New Testament manuscript, crucial to our understanding of the Bible
  • A hand-coloured copy of Vesalius’ 1543 De fabrica – the most influential work in western medicine
  • A written record of the earliest known human dissection in England (1564)
  • A Babylonian tablet dated 2039 BCE (the oldest object in the library)
  • The Gutenberg Bible – the earliest substantive printed book in Western Europe (1454)
  • The first catalogue listing the contents of the Library in 1424, barely a decade after it was first identified in the wills of William Loryng and William Hunden


As well as Lines of Thought, 2016 will also see dozens of celebratory events including the library’s 17-storey tower being lit up as part of the e-Luminate Festival in February. Cambridge University Library is also producing a free iPad app giving readers the chance to interact with digitised copies of six of the most revolutionary texts held in its collections. The app analyses the context of the six era-defining works, including Darwin’s family copy of On the origin of species, Newton’s annotated copy of Principia Mathematica, and William Tyndale’s translation of the New Testament into English, an undertaking which led to his execution for heresy.

From October 2016, an exhibition featuring some of the University Library’s most unusual curiosities and oddities will replace Lines of Thought as the second major exhibition of the sexcentenary.

Over the past 600 years, Cambridge has accumulated an extraordinary collection of objects, often arriving at the library as part of bequests and donations. Some of the library’s more unusual artefacts include children’s games, ration books, passports, prisoner art, Soviet cigarettes and cigars and an East African birthing stool.

University Librarian Anne Jarvis said: “For six centuries, the collections of Cambridge University Library have challenged and changed the world around us. Across science, literature and the arts, the millions of books, manuscripts and digital archives we hold have altered the very fabric of our understanding. Thousands of lines of thoughts run through them, back into the past, and forward into tomorrow. Our 600th anniversary is a chance to celebrate one of the world’s oldest and greatest research libraries, and to look forward to its future.

“Only in Cambridge, can you find Newton’s greatest works sitting alongside Darwin’s most important papers on evolution, or Sassoon’s wartime poetry books taking their place next to the Gutenberg Bible and the archive of Margaret Drabble. Our aim now, through our Digital Library, is to share as many of these great collections as widely as possible so that anyone, anywhere in the world, can stand on the shoulders of these giants.”


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

– See more at: http://www.cam.ac.uk/research/news/newton-darwin-shakespeare-and-a-jar-of-ectoplasm-cambridge-university-library-at-600

New Origami-Like Material May Help Prevent Brain Injuries in Sport

New origami-like material may help prevent brain injuries in sport

Source: www.cam.ac.uk

Researchers are developing the next generation of advanced materials for use in sport and military applications, with the goal of preventing brain injuries.

The key challenge for us is to come up with a material that can be optimised for a range of different types of impacts

Graham McShane

Researchers from Cambridge and Cardiff Universities are developing an origami-like material that could help prevent brain injuries in sport, as part of a programme sponsored in part by American football’s National Football League (NFL).

A number of universities and commercial companies are taking part in the NFL’s Head Health Challenge, which, as one of its goals, aims to develop the next generation of advanced materials for use in helmets and other types of body protection, for sport, military and other applications.

The Cambridge and Cardiff team are working in collaboration with helmet designer and manufacturer Charles Owen Inc, with funding support from the NFL, GE Healthcare, Under Armour and the National Institute of Standards and Technology, to develop and test their material over the next 12 months.

The Head Health Challenge competition is a $20 million collaborative project to develop new and innovative technologies in order to improve early-stage detection of mild traumatic brain injuries and to improve brain protection. The five-year collaboration is aiming to improve the safety of athletes, members of the military and society overall.

“The key challenge for us is to come up with a material that can be optimised for a range of different types of impacts,” said Dr Graham McShane of Cambridge’s Department of Engineering, who is part of the team behind the material. “A direct impact is different than an oblique impact, so the ideal material will behave and deform accordingly, depending on how it’s been hit – what we are looking at is the relationship between the material, geometry and the force of impact.”

In high-impact sports such as American football, players can hit each other with the equivalent of half a ton of force, and in an average game, there can be dozens of these high-impact blocks or tackles. More than 100 concussions are reported each year in the NFL, and over the course of a career, multiple concussions can do serious long-term damage.

The Head Health Challenge has created three separate challenges as part of its programme of funding: Challenge I – Methods for Diagnosis and Prognosis of Mild Traumatic Brain Injuries; Challenge II – Innovative Approaches for Preventing and Identifying Brain Injuries; and Challenge III – Advanced Materials for Impact Mitigation. The Cambridge and Cardiff team’s project, based on a material called C3, is part of Challenge III. The various projects from Challenge III will be judged in a year’s time by a review panel, with the most promising technology receiving another $500,000 to develop the material further.

The multi-layered, elastic material developed by McShane and his colleagues at Cambridge and Cardiff, called C3, has been designed and tested using a mixture of theoretical and experimental techniques, so that it can be tailored for specific impact scenarios.

C3 has its origins in cellular materials conceived in the Department of Engineering for defence applications, and is based on folded, origami-like structures. It is more versatile than the polymer foams currently used in protective helmets, which are highly limited in terms of how they behave under different conditions.

Structures made from C3 can be designed in such a way that impact energy can be dissipated relatively easily, making it an ideal material to use in protective clothing and accessories.

Dr Peter Theobald, a Senior Lecturer at Cardiff University who is leading on the project, said: “Head injury prevention strategies have remained relatively stagnant versus the evolution of other technologies. Our trans-Atlantic collaboration with Charles Owen Inc. has enabled us to pool our highly relevant skills and expertise in injury prevention, mechanics, manufacturing and commercialisation.”

“This approach has already enabled us to develop C3 which, in the words of our evaluators, presents a potentially ‘game-changing’ material with great promise to better absorb the vertical and horizontal components of an oblique impact. This highly prestigious award provides us with a platform to continue developing C3 towards our ultimate goal of achieving a material that provides a step-change in head health and protection, whilst achieving metrics that ensure commercial viability.”

Inset image: Sample of C3. Credit: Cardiff University.

Adapted from Cardiff University press release. 


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

– See more at: http://www.cam.ac.uk/research/news/new-origami-like-material-may-help-prevent-brain-injuries-in-sport

Epigenetic Discovery Suggests DNA Modifications More Diverse Than Previously Thought

Epigenetic discovery suggests DNA modifications more diverse than previously thought

source: www.cam.ac.uk

The world of epigenetics – where molecular ‘switches’ attached to DNA turn genes on and off – has just got bigger with the discovery by a team of scientists from the University of Cambridge of a new type of epigenetic modification.

It’s possible that we struck lucky with this modifier, but we believe it is more likely that there are many more modifications that directly regulate our DNA

Magdalena Koziol

Published today in the journal Nature Structural and Molecular Biology, the discovery suggests that many more DNA modifications than previously thought may exist in human, mouse and other vertebrates.

DNA is made up of four ‘bases’: molecules known as adenine, cytosine, guanine and thymine – the A, C, G and T letters. Strings of these letters form genes, which provide the code for essential proteins, and other regions of DNA, some of which can regulate these genes.

Epigenetics (epi – the Greek prefix meaning ‘on top of’) is the study of how genes are switched on or off. It is thought to be one explanation for how our environment and behaviour, such as our diet or smoking habit, can affect our DNA and how these changes may even be passed down to our children and grandchildren.

Epigenetics has so far focused mainly on studying proteins called histones that bind to DNA. Such histones can be modified, which can result in genes being switched on or off. In addition to histone modifications, genes are also known to be regulated by a form of epigenetic modification that directly affects one base of the DNA, namely the base C. More than 60 years ago, scientists discovered that C can be modified directly through a process known as methylation, whereby small molecules of carbon and hydrogen attach to this base and act like switches to turn genes on and off, or to ‘dim’ their activity. Around 75 million (one in ten) of the Cs in the human genome are methylated.

Now, researchers at the Wellcome Trust-Cancer Research UK Gurdon Institute and the Medical Research Council Cancer Unit at the University of Cambridge have identified and characterised a new form of direct modification – methylation of the base A – in several species, including frogs, mouse and humans.

Methylation of A appears to be far less common than C methylation, occurring on only around 1,700 As in the genome, but these are spread across the entire genome. However, it does not appear to occur on the sections of our genes known as exons, which provide the code for proteins.

“These newly-discovered modifiers only seem to appear in low abundance across the genome, but that does not necessarily mean they are unimportant,” says Dr Magdalena Koziol from the Gurdon Institute. “At the moment, we don’t know exactly what they actually do, but it could be that even in small numbers they have a big impact on our DNA, gene regulation and ultimately human health.”

More than two years ago, Dr Koziol made the discovery while studying modifications of RNA. There are 66 known RNA modifications in the cells of complex organisms. Using an antibody that recognises a specific RNA modification, Dr Koziol looked to see whether the analogous modification was also present on DNA, and discovered that this was indeed the case. Researchers at the MRC Cancer Unit then confirmed that the modification was on the DNA itself, rather than on any RNA contaminating the sample.

“It’s possible that we struck lucky with this modifier,” says Dr Koziol, “but we believe it is more likely that there are many more modifications that directly regulate our DNA. This could open up the field of epigenetics.”

The research was funded by the Biotechnology and Biological Sciences Research Council, Human Frontier Science Program, Isaac Newton Trust, Wellcome Trust, Cancer Research UK and the Medical Research Council.

Reference
Koziol, MJ et al. Identification of methylated deoxyadenosines in vertebrates reveals diversity in DNA modifications. Nature Structural and Molecular Biology; 21 Dec 2015


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

– See more at: http://www.cam.ac.uk/research/news/epigenetic-discovery-suggests-dna-modifications-more-diverse-than-previously-thought

Stem Cells Likely To Be Safe For Use In Regenerative Medicine, Study Confirms

Stem cells likely to be safe for use in regenerative medicine, study confirms

Source: www.cam.ac.uk

Cambridge researchers have found the strongest evidence to date that human pluripotent stem cells – cells that can give rise to all tissues of the body – will develop normally once transplanted into an embryo. The findings, published today in the journal Cell Stem Cell, could have important implications for regenerative medicine.

Our study provides strong evidence to suggest that human stem cells will develop in a normal – and importantly, safe – way. This could be the news that the field of regenerative medicine has been waiting for

Roger Pedersen

Human pluripotent stem cells for use in regenerative medicine or biomedical research come from two sources: embryonic stem cells, derived from fertilised egg cells discarded from IVF procedures; and induced pluripotent stem cells, where skin cells are ‘reset’ to their original, pluripotent form. They are seen as having promising therapeutic uses in regenerative medicine to treat devastating conditions that affect various organs and tissues, particularly those that have poor regenerative capacity, such as the heart, brain and pancreas.

However, some scientists have been concerned that the cells may not incorporate properly into the body and hence not proliferate or distribute themselves as intended, resulting in tumours. Today’s study suggests that this will not be the case and that stem cells, when transplanted appropriately, are likely to be safe for use in regenerative medicine.

Professor Roger Pedersen from the Anne McLaren Laboratory for Regenerative Medicine at the University of Cambridge, commenting on co-author Victoria Mascetti’s research findings, says: “Our study provides strong evidence to suggest that human stem cells will develop in a normal – and importantly, safe – way. This could be the news that the field of regenerative medicine has been waiting for.”

The best way to test how well stem cells would incorporate into the body is to transplant them into an early-stage embryo and see how they develop. As this cannot be done ethically in humans, scientists use mouse embryos. The gold standard test, developed in Cambridge in the 1980s, involves putting the stem cells into a mouse blastocyst, a very early stage embryo after fertilisation, then assessing stem cell contribution to the various tissues of the body.

Previous research has not succeeded in getting human pluripotent stem cells to incorporate into embryos. However, in research funded by the British Heart Foundation, Victoria Mascetti and Professor Pedersen have shown that it is possible to successfully transplant human pluripotent stem cells into the mouse embryo and that they then develop and grow normally.


Image: Mouse embryo with human pluripotent stem cells (red) incorporated into the brain region

“Stem cells hold great promise for treating serious conditions such as heart disease and Parkinson’s disease, but until now there has been a big question mark over how safe and effective they will be,” explains Professor Pedersen.

Mascetti’s research breakthrough in this new study was to demonstrate that human pluripotent stem cells are equivalent to an embryonic counterpart. Where attempts to incorporate human pluripotent stem cells had failed previously, it was because the stem cells had not been matched to the correct stage of embryo development: the cells needed to be transplanted into the mouse embryo at a later stage than was previously thought (a stage of embryo development known as gastrulation). Once transplanted at the correct stage, the stem cells went on to grow and proliferate normally, to integrate into the embryo and to distribute themselves correctly across relevant tissues.

Ms Mascetti adds: “Our finding that human stem cells integrate and develop normally in the mouse embryo will allow us to study aspects of human development during a window in time that would otherwise be inaccessible.”

Professor Jeremy Pearson, Associate Medical Director at the British Heart Foundation, which helped fund the study, said: “These results substantially strengthen the view that induced pluripotent stem cells from adult tissue are suitable for use in regenerative medicine – for example in attempts to repair damaged heart muscle after a heart attack.

“The Cambridge team has shown definitively that when stem cells are introduced into early mouse embryos under the right conditions, they multiply and contribute in the correct way to all the cell types that are formed as the embryo develops.”

Reference
Victoria L Mascetti and Roger A Pedersen. Human-Mouse Chimerism Validates Human Stem Cell Pluripotency. Cell Stem Cell; 17 Dec 2015


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

– See more at: http://www.cam.ac.uk/research/news/stem-cells-likely-to-be-safe-for-use-in-regenerative-medicine-study-confirms

‘Virtual Fossil’ Reveals Last Common Ancestor of Humans and Neanderthals

‘Virtual fossil’ reveals last common ancestor of humans and Neanderthals

source: www.cam.ac.uk

New digital techniques have allowed researchers to predict the structural evolution of the skull in the lineages of Homo sapiens and Neanderthals, in an effort to fill in blanks in the fossil record and provide the first 3D rendering of their last common ancestor. The study suggests the populations that led to the lineage split were older than previously thought.

Our models are not the exact truth, but in the absence of fossils these new methods can be used to test hypotheses for any palaeontological question, whether it is horses or dinosaurs

Aurélien Mounier

We know we share a common ancestor with Neanderthals, the extinct species that were our closest prehistoric relatives. But what this ancient ancestral population looked like remains a mystery, as fossils from the Middle Pleistocene period, during which the lineage split, are extremely scarce and fragmentary.

Now, researchers have applied digital “morphometrics” and statistical algorithms to cranial fossils from across the evolutionary story of both species, and recreated in 3D the skull of the last common ancestor of Homo sapiens and Neanderthals for the first time.

The “virtual fossil” has been simulated by plotting a total of 797 “landmarks” on fossilised crania spanning almost two million years of Homo history – including a 1.6 million-year-old Homo erectus fossil, Neanderthal crania found in Europe and even 19th-century skulls from the Duckworth collection in Cambridge.

The landmarks on these samples provided an evolutionary framework from which researchers could predict a timeline for the skull structure, or ‘morphology’, of our ancient ancestors. They then fed a digitally-scanned modern skull into the timeline, warping the skull to fit the landmarks as they shifted through history.
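The prediction step is easier to picture with a toy calculation. Below is a minimal sketch, assuming randomly generated stand-in landmark data rather than real fossil measurements: it fits a straight-line trajectory through time for every landmark coordinate and evaluates it at a hypothesised ancestral date. The study’s actual pipeline involved morphometric superimposition and far more sophisticated statistical estimation.

```python
import numpy as np

# Hypothetical stand-in data: 3D landmark coordinates for dated crania,
# assumed to be already superimposed (e.g. via Procrustes alignment).
rng = np.random.default_rng(0)
n_specimens, n_landmarks = 12, 797
ages = np.linspace(1.6, 0.0, n_specimens)      # millions of years before present
specimens = rng.normal(size=(n_specimens, n_landmarks, 3))

def predict_landmarks(specimens, ages, target_age):
    """Fit a linear trajectory through time for each landmark coordinate
    and evaluate it at target_age - a crude stand-in for estimating
    ancestral skull morphology from an evolutionary framework."""
    flat = specimens.reshape(len(ages), -1)            # (specimens, landmarks*3)
    slope, intercept = np.polyfit(ages, flat, deg=1)   # one line per coordinate
    return (slope * target_age + intercept).reshape(n_landmarks, 3)

# Evaluate the fitted trajectories at a hypothesised split time of 0.7 Myr.
ancestor = predict_landmarks(specimens, ages, target_age=0.7)
print(ancestor.shape)                          # -> (797, 3)
```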

This allowed researchers to work out how the morphology of both species may have converged in the last common ancestor’s skull during the Middle Pleistocene – an era dating from approximately 800,000 to 100,000 years ago.

The team generated three possible ancestral skull shapes that corresponded to three different predicted split times between the two lineages. They digitally rendered complete skulls and then compared them to the few original fossils and bone fragments of the Pleistocene age.

This enabled the researchers to narrow down which virtual skull was the best fit for the ancestor we share with Neanderthals, and which timeframe was most likely for that last common ancestor to have existed.

Previous estimates based on ancient DNA have predicted that the last common ancestor lived around 400,000 years ago. However, results from the ‘virtual fossil’ show that the ancestral skull morphology closest to fossil fragments from the Middle Pleistocene suggests a lineage split of around 700,000 years ago, and that – while this ancestral population was also present across Eurasia – the last common ancestor most likely originated in Africa.

The results of the study are published in the Journal of Human Evolution.

“We know we share a common ancestor with Neanderthals, but what did it look like? And how do we know the rare fragments of fossil we find are truly from this past ancestral population? Many controversies in human evolution arise from these uncertainties,” said the study’s lead author Dr Aurélien Mounier, a researcher at Cambridge University’s Leverhulme Centre for Human Evolutionary Studies (LCHES).

“We wanted to try an innovative solution to deal with the imperfections of the fossil record: a combination of 3D digital methods and statistical estimation techniques. This allowed us to predict mathematically and then recreate virtually skull fossils of the last common ancestor of modern humans and Neanderthals, using a simple and consensual ‘tree of life’ for the genus Homo,” he said.

The virtual 3D ancestral skull bears early hallmarks of both species. For example, it shows the initial budding of what in Neanderthals would become the ‘occipital bun’: the prominent bulge at the back of the skull that contributed to the elongated shape of a Neanderthal head.

However, the face of the virtual ancestor shows hints of the strong indentation that modern humans have under the cheekbones, contributing to our more delicate facial features. In Neanderthals, this area – the maxilla – is ‘pneumatized’, meaning the bone was thicker due to more air pockets, so that the face of a Neanderthal would have protruded.

Research from New York University published last week showed that bone deposits continued to build on the faces of Neanderthal children during the first years of their life.

The heavy, thickset brow of the virtual ancestor is characteristic of the hominin lineage, very similar to early Homo as well as Neanderthals, but lost in modern humans. Mounier says the virtual fossil is more reminiscent of Neanderthals overall, but that this is unsurprising: taken across the whole timeline, it is Homo sapiens that deviates from the ancestral trajectory in terms of skull structure.

“The possibility of a higher rate of morphological change in the modern human lineage suggested by our results would be consistent with periods of major demographic change and genetic drift, which is part of the history of a species that went from being a small population in Africa to more than seven billion people today,” said co-author Dr Marta Mirazón Lahr, also from Cambridge’s LCHES.

The population of last common ancestors was probably part of the species Homo heidelbergensis in its broadest sense, says Mounier. This was a species of Homo that lived in Africa, Europe and western Asia between 700,000 and 300,000 years ago.

For their next project, Mounier and colleagues have started working on a model of the last common ancestor of Homo and chimpanzees. “Our models are not the exact truth, but in the absence of fossils these new methods can be used to test hypotheses for any palaeontological question, whether it is horses or dinosaurs,” he said.


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

– See more at: http://www.cam.ac.uk/research/news/virtual-fossil-reveals-last-common-ancestor-of-humans-and-neanderthals

Low Cost, Safe and Accurate Test Could Help Diagnose Rare Childhood Cancers

Low cost, safe and accurate test could help diagnose rare childhood cancers

Source: www.cam.ac.uk

A non-invasive, low cost blood test that could help doctors diagnose some types of malignant childhood tumour has been developed by researchers at the University of Cambridge and Addenbrooke’s Hospital, Cambridge University Hospitals NHS Foundation Trust.

At the moment, we are not good enough at diagnosing these tumours and monitoring their treatment: we need better, safer and more cost-effective tests

Nick Coleman

Reported today in the British Journal of Cancer, the test could enable doctors to monitor the effectiveness of treatments without exposing patients to repeated doses of radiation.

The target of the test is a type of cancer known as germ cell cancer. Germ cells are those cells in the body that go on to develop into sperm and egg cells. Germ cells can develop into tumours – both benign and malignant – particularly in the testes or ovaries, where the cells are normally found. However, occasionally germ cells can get trapped in the wrong part of the body during development and may later turn into brain tumours, for example.

The five-year disease-free and overall survival rates for patients with high-risk malignant germ cell tumours remain below 50%, and so accurate diagnosis and monitoring are crucial to improving outcomes for patients. All of the current tests are expensive, and none are ideal.

The most reliable diagnostic method currently in use is biopsy, where a section of the suspected tumour is extracted surgically and analysed by a pathologist. However, biopsies are prone to sampling errors and so may not be representative of the tumour as a whole. Computerised tomography (CT) scans and magnetic resonance imaging (MRI) also provide useful information but are not diagnostic and do not discriminate between benign and malignant tumours.

The ideal tool for diagnosis would be a non-invasive blood test; however, currently available tests identify only around three in five malignant germ cell tumours, potentially delaying diagnosis and the ability to prioritise patients for surgery. Accurate disease monitoring with routine blood testing is not possible for the other two in five patients, who instead require follow-up CT scans, with exposure to harmful radiation and an associated increase in secondary cancer risk.

“Although relatively rare, childhood germ cell tumours need to be diagnosed accurately and followed up carefully to give us the best chances of treating them,” says Professor Nick Coleman from the Department of Pathology, University of Cambridge. “At the moment, we are not good enough at diagnosing these tumours and monitoring their treatment: we need better, safer and more cost-effective tests.”

In research funded by Sparks charity, Great Ormond Street Hospital Children’s Charity and Cancer Research UK, researchers at the University of Cambridge have developed a test for blood and cerebrospinal fluid samples that looks for a specific panel of four pieces of short genetic code known as microRNAs, which are found in greater quantities in malignant germ cell tumours. The test can distinguish malignant germ cell tumours from benign germ cell tumours and other cancers. The test can be used for diagnosis of malignant germ cell tumours in any part of the body, including the brain.
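As a rough illustration of how a panel-based test of this kind can work, the sketch below combines measured levels of four microRNAs into a single score and compares it against a calibrated cut-off. The marker names, control values and threshold are placeholders, not the study’s actual panel or pipeline.

```python
import math

# Placeholder panel: identifiers, control medians and threshold are assumed
# values for illustration, not the study's published markers or calibration.
PANEL = ["miR_A", "miR_B", "miR_C", "miR_D"]
CONTROL_MEDIANS = {"miR_A": 1.0, "miR_B": 0.8, "miR_C": 1.2, "miR_D": 0.9}
THRESHOLD = 4.0

def panel_score(sample):
    """Sum log2 fold-changes of each marker relative to control medians;
    malignant germ cell tumours show elevated levels across the panel."""
    return sum(math.log2(sample[m] / CONTROL_MEDIANS[m]) for m in PANEL)

patient = {"miR_A": 6.1, "miR_B": 3.9, "miR_C": 8.4, "miR_D": 2.2}
score = panel_score(patient)
verdict = "flag for malignancy" if score > THRESHOLD else "within normal range"
print(f"score = {score:.2f} -> {verdict}")
```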

The test can also be used to check the effectiveness of treatments and, as it is safe and cost-effective, allows for frequent testing to monitor for the recurrence of malignant germ cell tumours.

Dr Matthew Murray from the Department of Paediatric Haematology and Oncology, Cambridge University Hospitals NHS Foundation Trust, says: “This test, developed with Dr Emma Bell, a postdoctoral scientist in our laboratory, could be exactly what we need: it could help us diagnose malignant germ cell tumours cheaply, safely and above all, more accurately than current methods. Our next step is to confirm our findings in a large clinical trial and, if this is successful, we hope to see the test in routine use in hospitals in the near future.”

Reference
Murray, MJ, Bell E, et al. A pipeline to quantify serum and cerebrospinal fluid microRNAs for diagnosis and detection of relapse in paediatric malignant germ-cell tumours. British Journal of Cancer; 15 Dec 2015


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

– See more at: http://www.cam.ac.uk/research/news/low-cost-safe-and-accurate-test-could-help-diagnose-rare-childhood-cancers

Feeding Food Waste To Pigs Could Save Vast Swathes of Threatened Forest and Savannah

Feeding food waste to pigs could save vast swathes of threatened forest and savannah

source: www.cam.ac.uk

New research suggests that feeding our food waste, or swill, to pigs (currently banned under EU law) could save 1.8 million hectares of global agricultural land – an area roughly half the size of Germany, including hundreds of thousands of acres of South America’s biodiverse forests and savannahs – and provide a use for the 100 million tonnes of food wasted in the EU each year.

It is time to reassess whether the EU’s blanket ban on the use of food waste as feed is the right thing for the pig industry

Erasmus zu Ermgassen

A new study shows that if the European Union lifted the pigswill ban imposed following 2001’s foot-and-mouth disease epidemic, and harnessed technologies developed in East Asian countries for ‘heat-treating’ our food waste to safely turn it into pig feed, around 1.8 million hectares of land could be saved from being stripped for grain and soybean-based pig feed production – including over a quarter of a million hectares of Brazilian forest and savannah.

While swill-feeding was banned across the EU in 2002 following the foot-and-mouth outbreak – triggered by a UK farmer illegally feeding uncooked food waste to pigs – other countries, such as Japan, responded by creating a highly regulated system for safely recycling heat-treated food waste as animal feed.

Researchers describe the EU ban as a “knee-jerk reaction” that no longer makes sense when East Asian countries have demonstrated that food waste can be safely recycled. The models in the latest study show that pigswill reintroduction would not only decrease the amount of land the EU pork industry requires by 21.5%, but also cut in half the ever-increasing feed costs faced by European pig farmers.

Researchers describe swill as a feed which is commonly used in other parts of the world, one that could save a huge amount of global resources, and provide an environmentally sound recycling solution to the estimated 102.5 million tonnes of food wasted in the EU each year. Over 35% of food waste is now recycled into animal feed in Japan, where swill-fed “Eco-pork” is marketed as a premium product.

“Following the foot-and-mouth disease outbreak, different countries looked at the same situation, the same evidence, and came to opposite conclusions for policy,” said Erasmus zu Ermgassen from the University of Cambridge’s Department of Zoology, who led the study, published today in the journal Food Policy. “In many countries in East Asia we have a working model for the safe use of food waste as pig feed. It is a highly regulated and closely monitored system that recycles food waste and produces low-cost pig feed with a low environmental impact.”

The researchers examined data about the current land use of EU pork, the availability of food waste in the EU, and the quality and quantity of pork produced in feed trials that compared pigswill to grain-based diets, to produce a model of how much land could be saved if the pigswill ban was lifted.

Some 21.5 million tonnes of pork, around 34kg of pork per person, are produced in the EU each year. Livestock production occupies approximately 75% of agricultural land worldwide – with most of this used to produce animal feed. For EU pork, much of the environmental burden stems from the farming of soybean meal, which takes up in excess of 1.2 million hectares of land across South America.

As swill is much cheaper than grain and soybean-based pig feed, reintroducing swill feeding could reduce costs faced by EU pig farmers by 50%, say the researchers.
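A quick back-of-envelope check shows how the headline figures hang together. The sketch below uses only numbers quoted in this article; the implied total land footprint is a derived illustration, not a figure stated by the researchers.

```python
# Figures quoted in the article:
land_saved_ha = 1.8e6      # hectares saved if swill feeding were reintroduced
land_reduction = 0.215     # 21.5% reduction in EU pork land requirements
eu_pork_tonnes = 21.5e6    # annual EU pork production, tonnes

# Derived (not stated in the article): the land footprint these two
# figures jointly imply for current EU pork production.
total_land_ha = land_saved_ha / land_reduction
print(f"Implied EU pork land use: {total_land_ha / 1e6:.1f} million hectares")
print(f"Land per tonne of pork:   {total_land_ha / eu_pork_tonnes:.2f} hectares")
```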

Most objection to swill feeding in the EU stems from concerns about safety, and the sentiment that feeding human food waste to pigs is unnatural. But zu Ermgassen argues that those concerns are largely based on incorrect assumptions.

“Pigs are omnivorous animals; in the wild they would eat anything they could forage for, from vegetable matter to other animal carcasses, and they have been fed food waste since they were domesticated by humans 10,000 years ago. Swill actually provides a more traditional diet for pigs than the grain-based feed currently used in modern EU systems,” said zu Ermgassen.

“A recent survey found that 25% of smallholder farmers in the UK admit to illegally feeding uncooked food waste to their pigs, so the fact is that the current ban is not particularly safe from a disease-outbreak perspective. Feeding uncooked food waste is dangerous because pigs can catch diseases from raw meat, but a system supporting the regulated use of heat-treated swill does not have the same risks,” he said.

With the demand for meat and dairy products forecast to increase 60% by 2050, reducing the environmental footprint of current systems of meat production will become increasingly critical.

zu Ermgassen points out that economic and environmental concern is driving a reassessment of EU animal feed bans that were put in place in the 2000s, as well as attempts to recycle food waste more effectively. The EU is currently looking into repealing bans on using waste pig and poultry products as fish feed and reintroducing insects as pig and poultry feed.

“The reintroduction of swill feeding in the EU would require backing from pig producers, the public, and policy makers, but it has substantial potential to improve the environmental and economic sustainability of EU pork production. It is time to reassess whether the EU’s blanket ban on the use of food waste as feed is the right thing for the pig industry,” he said.

Erasmus zu Ermgassen’s research is funded by the Biotechnology and Biological Sciences Research Council.

Reference

Erasmus K.H.J. zu Ermgassen, et al. “Reducing the land use of EU pork production: where there’s swill, there’s a way” Food Policy Vol 58 (January 2016). DOI:10.1016/j.foodpol.2015.11.001.

Inset image: “Save Kitchen Waste to Feed the Pigs!” poster from the Imperial War Museums © IWM (Art.IWM PST 14743).


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

– See more at: http://www.cam.ac.uk/research/news/feeding-food-waste-to-pigs-could-save-vast-swathes-of-threatened-forest-and-savannah

The Periodic Table of Proteins

The periodic table of proteins

source: www.cam.ac.uk

Researchers have devised a periodic table of protein complexes, making it easier to visualise, understand and predict how proteins combine to drive biological processes.

We’re bringing a lot of order into the messy world of protein complexes

Sebastian Ahnert

A new ‘periodic table’ of protein complexes, devised by an interdisciplinary team of researchers, provides a unified way to classify and visualise protein complexes, which drive a huge range of biological processes, from DNA replication to catalysing metabolic reactions.

The table, published in the journal Science, offers a new way of looking at almost all known molecular structures and predicting how new ones could be made, providing a valuable tool for research into evolution and protein engineering.

By using the table, researchers are able to predict the likely forms of protein complexes with unknown structure, estimate the feasibility of entirely new structures, and identify possible errors in existing structural databases. It was created by an interdisciplinary team led by researchers at the University of Cambridge and the Wellcome Genome Campus.

Almost every biological process depends on proteins interacting and assembling into complexes in a specific way, and many diseases, such as Alzheimer’s and Parkinson’s, are associated with problems in complex assembly. The principles underpinning this organisation are not yet fully understood, but the new periodic table presents a systematic, ordered view on protein assembly, providing a visual tool for understanding biological function.

“We’re bringing a lot of order into the messy world of protein complexes,” said the paper’s lead author Sebastian Ahnert of Cambridge’s Cavendish Laboratory, a physicist who regularly tangles with biological problems. “Proteins can keep combining in these simple ways, adding more and more levels of complexity and resulting in a huge variety of structures. What we’ve made is a classification based on underlying principles that helps people get a handle on the complexity.”

The exceptions to the rule are interesting in their own right, added Ahnert, and are the subject of continuing studies.

“Evolution has given rise to a huge variety of protein complexes, and it can seem a bit chaotic,” said study co-author Joe Marsh, formerly of the Wellcome Genome Campus and now of the MRC Human Genetics Unit at the University of Edinburgh. “But if you break down the steps proteins take to become complexes, there are some basic rules that can explain almost all of the assemblies people have observed so far.”

Ballroom dancing can be seen as an endless combination of riffs on the waltz, foxtrot and cha-cha. Similarly, the ‘dance’ of protein complex assembly can be seen as endless variations on dimerization (one doubles, and becomes two), cyclisation (one forms a ring of three or more) and subunit addition (two different proteins bind to each other). Because these steps happen in a fairly predictable way, it is not as hard as you might think to predict how a novel protein complex would assemble.
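Those three moves can be captured in a toy enumeration. The sketch below illustrates the combinatorial idea only, and is not the authors’ computational analysis of interface data: it describes a complex by its number of distinct subunit types and the number of repeats of its repeated unit, then lists the states reachable within a few assembly steps.

```python
# Toy model of the three assembly steps; a complex is summarised as
# (distinct_subunit_types, copies_of_repeated_unit).

def dimerize(state):
    types, copies = state
    return (types, copies * 2)        # one doubles, and becomes two

def cyclise(state, ring=3):
    types, copies = state
    return (types, copies * ring)     # one forms a ring of three or more

def add_subunit(state):
    types, copies = state
    return (types + 1, copies)        # a different protein type binds on

# Enumerate every state reachable from a monomer in up to three steps.
states = {(1, 1)}
for _ in range(3):
    states |= {step(s) for s in states
               for step in (dimerize, cyclise, add_subunit)}

for types, copies in sorted(states):
    kind = "homomer" if types == 1 else "heteromer"
    print(f"{types} subunit type(s) x {copies} repeat(s): {kind}")
```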

Some protein complexes, called homomers, feature multiple copies of a single protein, while others, called heteromers, are made from several different types of proteins. The table shows that there is a very close relationship between the possible structures of heteromers and homomers. In fact, the vast majority of heteromers can be thought of as homomers in which the single protein is replaced by a repeated unit of several proteins. The table was constructed using computational analysis of a large database of protein-protein interfaces.

“By analysing the tens of thousands of protein complexes for which three-dimensional structures have already been experimentally determined, we could see repeating patterns in the assembly transitions that occur – and with new data from mass spectrometry we could start to see the bigger picture,” said Marsh.

“The core work for this study is in theoretical physics and computational biology, but it couldn’t have been done without the mass spectrometry work by our colleagues at Oxford University,” said Sarah Teichmann, Research Group Leader at the European Bioinformatics Institute (EMBL-EBI) and the Wellcome Trust Sanger Institute. “This is yet another excellent example of how extremely valuable interdisciplinary research can be.”

Reference:
Ahnert SE, et al. ‘Principles of assembly reveal a periodic table of protein complexes.’ Science (2015). DOI: 10.1126/science.aaa2245

Adapted from an EMBL-EBI press release.



Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

– See more at: http://www.cam.ac.uk/research/news/the-periodic-table-of-proteins

Millet: The Missing Piece In The Puzzle Of Prehistoric Humans’ Transition From Hunter-Gatherers To Farmers

Millet: the missing piece in the puzzle of prehistoric humans’ transition from hunter-gatherers to farmers

source: www.cam.ac.uk

New research shows that a cereal familiar today as birdseed was carried across Eurasia by ancient shepherds and herders, laying the foundation, in combination with the new crops they encountered, for ‘multi-crop’ agriculture and the rise of settled societies. Archaeologists say ‘forgotten’ millet has a role to play in modern crop diversity and today’s food security debate.

We have been able to follow millet moving in deep history, from where it originated in China and spread across Europe and India

Martin Jones

The domestication of the small-seeded cereal millet in North China around 10,000 years ago created the perfect crop to bridge the gap between nomadic hunter-gathering and organised agriculture in Neolithic Eurasia, and may offer solutions to modern food security, according to new research.

Now a forgotten crop in the West, this hardy grain was ideal for ancient shepherds and herders, who carried it right across Eurasia, where it was mixed with crops such as wheat and barley. This gave rise to ‘multi-cropping’, which in turn sowed the seeds of complex urban societies, say archaeologists.

A team from the UK, USA and China has traced the spread of the domesticated grain from North China and Inner Mongolia into Europe through a “hilly corridor” along the foothills of Eurasia. Millet favours uphill locations, doesn’t require much water, and has a short growing season: it can be harvested 45 days after planting, compared with 100 days for rice, allowing a very mobile form of cultivation.

Nomadic tribes were able to combine growing crops of millet with hunting and foraging as they travelled across the continent between 2500 and 1600 BC. Millet was eventually mixed with other crops in emerging populations to create ‘multi-crop’ diversity, which extended growing seasons and provided our ancient ancestors with food security.

The need to manage different crops in different locations, and the water resources they required, demanded elaborate social contracts and encouraged the rise of more settled, stratified communities and eventually complex ‘urban’ societies.

Researchers say we need to learn from the earliest farmers when thinking about feeding today’s populations, and millet may have a role to play in protecting against modern crop failure and famine.

“Today millet is in decline and attracts relatively little scientific attention, but it was once among the most expansive cereals in geographical terms. We have been able to follow millet moving in deep history, from where it originated in China and spread across Europe and India,” said Professor Martin Jones from the University of Cambridge’s Department of Archaeology and Anthropology, who is presenting the research findings today at the Shanghai Archaeological Forum.

“These findings have transformed our understanding of early agriculture and society. It has previously been assumed that early agriculture was focused in river valleys where there is plentiful access to water. However, millet remains show that the first agriculture was instead centred higher up on the foothills – allowing this first pathway for ‘exotic’ eastern grains to be carried west.”

The researchers carried out radiocarbon dating and isotope analysis on charred millet grains recovered from archaeological sites across China and Inner Mongolia, as well as genetic analysis of modern millet varieties, to reveal the process of domestication that occurred over thousands of years in northern China and produced the ancestor of all broomcorn millet worldwide.

“We can see that millet in northern China was one of the earliest centres of crop domestication, occurring over the same timescale as rice domestication in south China and barley and wheat in west Asia,” explained Jones.

“Domestication is hugely significant in the development of early agriculture – humans select plants with seeds that don’t fall off naturally and can be harvested, so over several thousand years this creates plants that are dependent on farmers to reproduce,” he said.

“This also means that the genetic make-up of these crops changes in response to changes in their environment – in the case of millet, we can see that certain genes were ‘switched off’ as they were taken by farmers far from their place of origin.”

As the network of farmers, shepherds and herders crystallised across the Eurasian corridor, they shared crops and cultivation techniques with other farmers, and this, Jones explains, is where the crucial idea of ‘multi-cropping’ emerged.

“The first pioneer farmers wanted to farm upstream in order to have more control over their water source and be less dependent on seasonal weather variations or potential neighbours upstream,” he said. “But when ‘exotic’ crops appear in addition to the staple crop of the region, then you start to get different crops growing in different areas and at different times of year. This is a huge advantage in terms of shoring up communities against possible crop failures and extending the growing season to produce more food or even surplus.

“However, it also introduces a more pressing need for cooperation, and the beginnings of a stratified society. With some people growing crops upstream and some farming downstream, you need a system of water management, and you can’t have water management and seasonal crop rotation without an elaborate social contract.”

Towards the end of the second and first millennia BC larger human settlements, underpinned by multi-crop agriculture, began to develop. The earliest examples of text, such as the Sumerian clay tablets from Mesopotamia, and oracle bones from China, allude to multi-crop agriculture and seasonal rotation.

But the significance of millet is not just in transforming our understanding of our prehistoric past. Jones believes that millet and other small-seeded crops may have an important role to play in ensuring future food security.

“The focus for looking at food security today is on the high-yield crops, rice, maize and wheat, which fuel 50% of the human food chain. However, these are only three of 50 types of cereal, the majority of which are small-grained cereals or “millets”. It may be time to consider whether millets have a role to play in a diverse response to crop failure and famine,” said Jones.

“We need to understand more about millet and how it may be part of the solution to global food security – we may have a lot still to learn from our Neolithic predecessors.”

Inset images: Martin Jones with millet in North China (Martin Jones); Inner Mongolian millet farmer in Chifeng (Martin Jones).


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

– See more at: http://www.cam.ac.uk/research/news/millet-the-missing-piece-in-the-puzzle-of-prehistoric-humans-transition-from-hunter-gatherers-to

Alternative Ways of Protecting Urban Water Supplies Must Be Considered in Light of Worsening Droughts in the US, Study Claims

Alternative ways of protecting urban water supplies must be considered in light of worsening droughts in the US, study claims

source: www.cam.ac.uk

Alternative models of watershed protection that balance recreational use and land conservation must no longer be ignored to preserve water supplies against the effects of climate change, argues a new study. Researchers claim that the management of Salt Lake City’s Wasatch watershed in Utah provides a valuable example contradicting the dominant view presented in academic literature that informs many current conservation strategies.

The chances of a ‘megadrought’ – one that lasts for 35 years or longer – affecting the Southwest and central Great Plains by 2100 are above 80% if climate change projections are not mitigated

Libby Blanchard

Salt Lake City’s preservation of the Wasatch watershed is an important model for protecting urban water sources through land use regulation and conservation, which could have important implications for preserving future water supplies against the effects of climate change in the American West, according to a new study. This example is currently absent from academic literature on ecosystem services, meaning that conservation discussions are instead dominated by models that focus on financial, ‘market-based’ incentives to protect watershed areas, which the researchers argue could be inappropriate in many circumstances.

The most prevalent model for water resource preservation is that of New York City’s Catskills/Delaware watershed, which is based on upstream resource users being paid to avoid harmful practices that might affect water flows and water quality, typically by beneficiaries who are downstream. These ‘market-based’ approaches (also known as Payments for Watershed or Ecosystem Services) have been widely promoted, but risk neglecting alternative approaches that do not always require monetary transactions to improve environmental outcomes.

In contrast, Salt Lake City’s management strategy allows regulated use of the watershed area for public recreation (unlike other forested catchments in the US where public access is prohibited to preserve water resources). In the Wasatch case, this means that the upstream catchment remains accessible, including for high impact uses such as skiing and mountain biking. Researchers argue that it is vital to consider these alternative strategies for solving the increasing water scarcity in the American West.

“While regulatory exclusion is often thought of as the only viable alternative to market-based incentives in managing ecosystem services, the management of the Wasatch watershed provides a third, yet under-recognised, successful conservation strategy for water resources,” says Libby Blanchard, lead author of the study from the University of Cambridge’s Department of Geography.

“The dominance of the Catskills example in discussions of watershed protection provides an unduly limited, and historically incomplete, perspective on interventions to secure water resources, and limits policy discussions about alternative conservation approaches,” she adds.

In the American West, unprecedented droughts have caused extreme water shortages. The current drought in California and across the West is entering its fourth year, with precipitation and water storage reaching record low levels. Droughts are ranked second in the US in terms of national weather-related economic impacts, with annual losses just shy of $9 billion. With water scarcity likely to increase due to advancing climate change, the economic and environmental impacts of drought are also likely to get worse.

“The chances of a ‘megadrought’ – one that lasts for 35 years or longer – affecting the Southwest and central Great Plains by 2100 are above 80% if climate change projections are not mitigated,” says Blanchard. “As the West faces more frequent and severe droughts, the successful protection of watersheds for the ecosystem services of water capture, storage, and delivery they provide will be increasingly important.”

“The sufficient and effective protection of watersheds will become more challenging, so awareness of alternative, successful strategies is critically important,” adds Bhaskar Vira, co-author of the study also from Cambridge’s Department of Geography. “The management of the Wasatch is one such strategy that should be more widely recognised amongst policymakers and researchers alike seeking effective solutions to water scarcity.”

The economic and instrumental value of the Wasatch watershed was noticed by Salt Lake City’s government as early as the 1850s, when the first legislation to protect the city’s natural resources was passed. Salt Lake City uses two tools to protect its watershed: purchasing land for conservation, and regulating land use by restricting a variety of activities within the watershed such as cattle grazing. Recreation is not altogether restricted, but is negotiated with the local community to allow public use. The Uinta-Wasatch-Cache National Forest is one of the most heavily visited national forests in the US, with 7 million annual visitors.

“Salt Lake City has been able to preserve the natural capital that protects its watershed while allowing recreational use. The preservation of the watershed actually boosts recreation, providing visitors with natural landscapes and unadulterated settings for mountain biking, hiking, skiing, and fly-fishing,” says Blanchard.

The city raises funds to buy land within the watershed through a surcharge on water customers’ monthly bills, which provides around $1.5 million each year to protect watershed lands from development. Since 1907, the city has managed to purchase over 23,000 acres of the watershed.

“Despite the popularity and power of the Catskills narrative to promote the preservation of ecosystems via market-based incentives, we found that this narrative is at best partial, and quite possibly flawed,” says Blanchard.

“The Wasatch’s absence in the ecosystem literature results in an incomplete perspective on interventions to secure watershed ecosystem services and limits policy discussions in relation to alternative conservation approaches. It is vital that such alternatives are given more recognition in order to find effective solutions for the protection of natural capital in the future.”

Reference:

Blanchard, L et al. “The lost narrative: Ecosystem service narratives and the missing Wasatch watershed conservation story” Ecosystem Services, December 2015. The paper can be accessed at http://dx.doi.org/10.1016/j.ecoser.2015.10.019 

Libby Blanchard’s research is funded by the Gates Cambridge Trust.

Inset image: Recreation in the Wasatch watershed (Libby Blanchard).


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


The Future of Intelligence: Cambridge University Launches New Centre to Study AI and the Future of Humanity

The future of intelligence: Cambridge University launches new centre to study AI and the future of humanity


source: www.cam.ac.uk

The University of Cambridge is launching a new research centre, thanks to a £10 million grant from the Leverhulme Trust, to explore the opportunities and challenges to humanity from the development of artificial intelligence.

Machine intelligence will be one of the defining themes of our century, and the challenges of ensuring that we make good use of its opportunities are ones we all face together

Huw Price

Human-level intelligence is familiar in biological “hardware” – it happens inside our skulls. Technology and science are now converging on a possible future where similar intelligence can be created in computers.

While it is hard to predict when this will happen, some researchers suggest that human-level AI will be created within this century. Freed of biological constraints, such machines might become much more intelligent than humans. What would this mean for us? Stuart Russell, a world-leading AI researcher at the University of California, Berkeley, and collaborator on the project, suggests that this would be “the biggest event in human history”. Professor Stephen Hawking agrees, saying that “when it eventually does occur, it’s likely to be either the best or worst thing ever to happen to humanity, so there’s huge value in getting it right.”

Now, thanks to an unprecedented £10 million grant from the Leverhulme Trust, the University of Cambridge is to establish a new interdisciplinary research centre, the Leverhulme Centre for the Future of Intelligence, to explore the opportunities and challenges of this potentially epoch-making technological development, both short and long term.

The Centre brings together computer scientists, philosophers, social scientists and others to examine the technical, practical and philosophical questions artificial intelligence raises for humanity in the coming century.

Huw Price, the Bertrand Russell Professor of Philosophy at Cambridge and Director of the Centre, said: “Machine intelligence will be one of the defining themes of our century, and the challenges of ensuring that we make good use of its opportunities are ones we all face together. At present, however, we have barely begun to consider its ramifications, good or bad”.

The Centre is a response to the Leverhulme Trust’s call for “bold, disruptive thinking, capable of creating a step-change in our understanding”. The Trust awarded the grant to Cambridge for a proposal developed with the Executive Director of the University’s Centre for the Study of Existential Risk (CSER), Dr Seán Ó hÉigeartaigh. CSER investigates emerging risks to humanity’s future including climate change, disease, warfare and technological revolutions.

Dr Ó hÉigeartaigh said: “The Centre is intended to build on CSER’s pioneering work on the risks posed by high-level AI and place those concerns in a broader context, looking at themes such as different kinds of intelligence, responsible development of technology and issues surrounding autonomous weapons and drones.”

The Leverhulme Centre for the Future of Intelligence spans institutions as well as disciplines. It is a collaboration led by the University of Cambridge with links to the Oxford Martin School at the University of Oxford, Imperial College London, and the University of California, Berkeley. It is supported by Cambridge’s Centre for Research in the Arts, Social Sciences and Humanities (CRASSH). As Professor Price put it, “a proposal this ambitious, combining some of the best minds across four universities and many disciplines, could not have been achieved without CRASSH’s vision and expertise.”

Zoubin Ghahramani, Deputy Director, Professor of Information Engineering and a Fellow of St John’s College, Cambridge, said:

“The field of machine learning continues to advance at a tremendous pace, and machines can now achieve near-human abilities at many cognitive tasks—from recognising images to translating between languages and driving cars. We need to understand where this is all leading, and ensure that research in machine intelligence continues to benefit humanity. The Leverhulme Centre for the Future of Intelligence will bring together researchers from a number of disciplines, from philosophers to social scientists, cognitive scientists and computer scientists, to help guide the future of this technology and study its implications.”

The Centre aims to lead the global conversation about the opportunities and challenges to humanity that lie ahead in the future of AI. Professor Price said: “With far-sighted alumni such as Charles Babbage, Alan Turing, and Margaret Boden, Cambridge has an enviable record of leadership in this field, and I am delighted that it will be home to the new Leverhulme Centre.”


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

 

Law in Focus: Parliament’s Role in Voting on the Syrian Conflict

Law in Focus: Parliament’s role in voting on the Syrian conflict

 

Source: www.cam.ac.uk

This video discusses six issues arising out of the recent statement by Prime Minister David Cameron to the House of Commons on the extension of offensive British military operations in Syria.

Following the statement by Prime Minister David Cameron to the House of Commons entitled “Prime Minister’s Response to the Foreign Affairs Select Committee on the Extension of Offensive British Military Operations to Syria”, Dr Veronika Fikfak and Dr Hayley J Hooper discuss the questionable international legality of military action, the strategic use of parliament and its potential impact upon the emerging Consultation Convention, and the responsibility of MPs to hold government to account across a broad range of relevant domestic issues.

They analyse the impact of the way government shares intelligence information with the House of Commons, especially in light of the 2003 Iraq conflict, highlighting several relevant but under-discussed rules. Finally, they discuss the role of party political discipline on armed conflict votes.

Dr Fikfak researches in the fields of public law, human rights and international law. She is particularly interested in the interface between domestic and international law and is currently writing a monograph on the role of national judges in relation to international law. Dr Hooper is currently a Fellow at Homerton College; her doctoral research at Balliol College, University of Oxford, concerned the use of “closed” or “secret” evidence in the context of judicial review of counterterrorism powers, and its extension to civil procedure more broadly.

Drs Fikfak and Hooper are currently co-authoring a monograph on parliament’s involvement in war powers entitled Parliament’s Secret War (forthcoming with Hart Bloomsbury, 2016).

Law in Focus is a series of short videos featuring academics from the University of Cambridge Faculty of Law, addressing legal issues in current affairs and the news. These issues are examples of the many that challenge researchers and students of undergraduate and postgraduate law at the Faculty. Law in Focus is available on YouTube, or by subscription on iTunes U.

Other collections of video and audio recordings from the Faculty of Law are available at Lectures at Law.


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

 

Funding Boost For Infrastructure Research at Cambridge

Funding boost for infrastructure research at Cambridge

Source: www.cam.ac.uk

Two new funding initiatives at the University of Cambridge will support the UK’s infrastructure and cities.

Research at the University of Cambridge to support the UK’s infrastructure and cities has received further backing in the form of two major funding initiatives. The Centre for Smart Infrastructure and Construction (CSIC) has secured a further five years of funding from the Engineering and Physical Sciences Research Council (EPSRC) and Innovate UK, while the UK Collaboratorium for Research in Infrastructure and Cities (UKCRIC), of which Cambridge is a partner, has secured £138 million of funding, to be match-funded from other sources, as part of last week’s spending review.

The additional funding will allow CSIC to build on its significant achievements of the past five years and become a widely recognised hub for the infrastructure and construction industry: bringing together leading academics and industrialists, developing a faster route to innovation adoption, and providing an ecosystem that builds confidence in new innovations and enables their timely implementation and exploitation.

“CSIC will continue to engage with business leaders and decision makers in key markets to ensure that our work continues to meet industry needs, and that industry leaders are well informed of the value that ‘smart’ innovations in infrastructure and construction can bring to their business,” said Jennifer Schooling, Director of CSIC. “CSIC’s ability to deliver value is unrivalled. Our outputs present real opportunities to make major improvements in how we create new infrastructure.”

CSIC’s activities have already had substantial impact through a wide variety of tools and technologies – including fibre-optic strain measurement, UtterBerry ultra-low-power wireless sensor motes, vibration energy-harvesting devices and the CSattAR photogrammetric monitoring system – deployed on some of the largest civil engineering projects, including Crossrail, National Grid, London Underground, CERN and the Staffordshire Alliance.

The application of CSIC’s capability and knowledge is now being broadened to new markets including water infrastructure, highways and power.

“Securing this funding for the next five years offers a wide range of opportunities to take CSIC’s work forward and embed a culture of innovation adoption in the infrastructure and construction industries,” said Schooling. “CSIC cannot achieve this alone – working with industry is the key to our success to date and we always welcome approaches from industry partners seeking to collaborate.”

Professor Philip Nelson, CEO of EPSRC, said: “The Centre will continue its leading role within the UK by increasing the lifetime of ageing infrastructure, making it more resilient, and making construction processes more efficient by using smart sensing technology. This collaborative research between academia and industry will increase the UK’s competitiveness, lead to savings quantified in millions of pounds and provide technology that can be exported by UK-based companies.”

Kevin Baughan, Director of Technology and Innovation at Innovate UK, said: “The work of CSIC has helped to demonstrate the value of smart infrastructure to the construction industry, and this is reflected in the recognition of innovation at the heart of the future plans of the Construction Leadership Council. By extending funding for a further five years, we underline our support for their commitment to raise the commercialisation bar even higher. This will help companies of all sizes grow through leveraging the excellent UK science base in smart infrastructure.”

UKCRIC is a collaboration of 14 UK universities which aims to provide the knowledge base to ensure the long-term functioning of the UK’s transport systems, energy systems, clean water supplies, waste management and flood defences, and to support the development of smart infrastructure.

Outside national security and medicine, UKCRIC will be one of the largest collaborative research projects in the UK. Current national and international partners include Bristol City Council, Network Rail, Mott MacDonald, Buro Happold, Atkins, National Grid, DfT, EDF and Thames Water, with many more partners to follow. In order to tap further into the UK’s expertise and creativity, UKCRIC’s founding core of 14 universities will be expanded over the coming years.

Cambridge will receive funding through UKCRIC; this will be used to support research into the application of advanced sensor technologies for monitoring the UK’s existing and future infrastructure, in order to protect and maintain it.

UKCRIC programmes will integrate research on infrastructure needs, utilisation and performance through experiments, analysis, living labs and modelling. This will provide a new combination of decision support tools to inform infrastructure operators, planners, financiers, regulators, cities, and government on the optimisation of infrastructure capacity, performance and investment.

 


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


How To Escape a Black Hole

How to escape a black hole

 

Source: www.cam.ac.uk

An international team of astrophysicists, including researchers from the University of Cambridge, has observed a new way for gas to escape the gravitational pull of a supermassive black hole.

These jets are a unique tool for probing supermassive black holes

Morgan Fraser

The results, published in the journal Science, are based on new radio observations tracking a star as it is torn apart by a black hole. Such violent events yield a burst of light, produced as the bits and pieces of the star fall into the black hole. For the first time, the researchers were able to show that this burst of light is followed by a radio signal from matter that escaped the black hole by travelling away in a jetted outflow at nearly the speed of light.

The discovery of the jet was made possible by a rapid observational response after the stellar disruption (known as ASASSN-14li) was announced in December 2014. The radio data was taken by the 4 PI SKY team at Oxford, using the Arcminute Microkelvin Imager Large Array located in Cambridge.

“Previous efforts to find evidence for these jets, including my own, were late to the game,” said Sjoert van Velzen of Johns Hopkins University, the study’s lead author. Co-author Nicholas Stone added that “even after they got to the game, these earlier attempts were observing from the bleachers, while we were the first to get front row seats.”

In this branch of astronomy, the ‘front row’ means a distance of 300 million light years, while previous observations were based on events occurring at least three times as far away.

Jets are often observed in association with black holes, but their launch mechanism remains a major unsolved problem in astrophysics. Most supermassive black holes are fed a steady diet of gas, leading to jets that live for millions of years and change little on a human timescale. However, the newly discovered jet behaved very differently: the observations show that, following a brief injection of energy, it produced short but spectacular radio fireworks.

The observed jet was anticipated by the so-called scale-invariant model of accretion, also known as the Matryoshka-doll theory of astrophysics. It predicts that all compact astrophysical objects (white dwarfs, neutron stars, or black holes) that accrete matter behave and look the same after a simple correction based solely on the mass of the object. In other words, the larger Matryoshka doll (a supermassive black hole) is just a scaled-up version of the smaller doll (a neutron star). Since neutron stars consistently produce radio-emitting jets when they are supplied with a large amount of gas, the theory predicts that supermassive black holes should do the same when they swallow a star.
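As a rough back-of-the-envelope illustration of the scaling involved (our own sketch, not taken from the study itself): two standard yardsticks for an accreting object of mass M – its gravitational timescale and its Eddington luminosity – both grow linearly with the mass,

\[
t_g = \frac{GM}{c^3} \propto M, \qquad
L_{\mathrm{Edd}} = \frac{4\pi G M m_p c}{\sigma_T} \approx 1.3 \times 10^{38} \left(\frac{M}{M_\odot}\right)\ \mathrm{erg\,s^{-1}} \propto M.
\]

On this picture, the jet of a million-solar-mass black hole should resemble the jet of a roughly 1.4-solar-mass neutron star with its timescales and luminosities stretched by a factor of order a million.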

“I always liked the elegant nature of the scale-invariant theory, but previous observations never found evidence for the new type of jet it predicted,” said van Velzen. “Our new findings suggest that this new type of jet could indeed be common and previous observations were simply not sensitive enough to detect them.”

“These jets are a unique tool for probing supermassive black holes,” said co-author Dr Morgan Fraser of Cambridge’s Institute of Astronomy. “While black holes themselves do not emit light, by observing how a star is torn apart as it falls in we can indirectly study the sleeping monster at the heart of a galaxy.”

The study hypothesises that every stellar disruption leads to a radio flare similar to the one just discovered. Ongoing surveys such as the Gaia Alerts project, led by the University of Cambridge, will find many more of these rare events.

“Gaia has exceptionally sharp eyes, and is ideally suited to find events like this, which occur in the very centres of galaxies,” said co-author Dr Heather Campbell, also from Cambridge’s Institute of Astronomy. “Finding more of these rare events may further our understanding of the processes that allow black holes to launch such spectacular outflows.”

Reference:
Van Velzen, S. et al. “A radio jet from the optical and X-ray bright stellar tidal disruption flare ASASSN-14li.” Science (2015). DOI: 10.1126/science.aad1182

Adapted from a Johns Hopkins press release. 


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


Two-Thirds of Studies On ‘Psychosocial’ Treatments Fail To Declare Conflicts of Interest

Two-thirds of studies on ‘psychosocial’ treatments fail to declare conflicts of interest

Source: www.cam.ac.uk

The creators of commercially sold counselling programmes increasingly profit from public health services across the world. However, a new study of the evidence base for some of the market leaders reveals that serious conflicts of interest habitually go undisclosed across the majority of the research.

Policy makers in public health have a right to expect transparency about conflicts of interest in academic research

Manuel Eisner

Health services in many countries increasingly rely on prescribed ‘psychosocial interventions’: treatments that use counselling techniques to tackle mental health issues and behavioural problems such as substance abuse, and to assist parents with new or troubled children.

These highly regarded therapeutic and educational programmes, devised by senior academics and practitioners, are sold commercially to public health services across the world on the basis that they are effective interventions for people in need of support – with the evidence to back them up.

However, the first study to investigate conflicts of interest in the published evidence for intervention treatments has revealed that the majority of academic studies which assert evidence of effectiveness list authors who profit from the distribution of these programmes, yet few declare a conflict of interest.

In fact, the new research shows that as many as two-thirds of the studies that list an author who financially benefits from sales of said treatment programmes declare no conflict of interest whatsoever.

While major steps have been taken to counter research bias in other fields such as pharmaceuticals, the new study’s authors say that hugely influential psychosocial treatments suffer a distinct lack of transparency from academics who both publish research on treatment effectiveness and stand to gain significantly from any positive findings.

They write that as commercial psychosocial treatments – many of which cost hundreds, even thousands, of dollars per participant – continue to gain traction with national public health services, it is important that “systems for effective transparency are implemented” to ensure clinical commissioning bodies are aware of potential research biases. The findings are published today in the journal PLOS ONE.

“Contrary to some, I have no problem with introducing commercial programmes into a national health service if decision makers and trusts come to the conclusion that a commercially disseminated treatment is more effective than their current psychosocial offerings, but this must be based on fair and transparent evidence,” said the study’s lead author Professor Manuel Eisner, from Cambridge’s Institute of Criminology.

“What you don’t want to see is an intervention system that remains as effective, or becomes less effective, despite buying in expensive programmes, because you have a public goods service competing with research that has a commercial interest to publish overly optimistic findings,” Eisner said.

“Policy makers in public health have a right to expect transparency about conflicts of interest in academic research.”

Four internationally disseminated psychosocial interventions – described by Eisner as “market leaders” – were examined: the Positive Parenting Programme (or Triple P); the Nurse-Family Partnership; the parenting and social skills programme Incredible Years; and the Multi-Systemic Therapy intervention for youth offenders.

The researchers inspected all articles published in academic journals between 2008 and 2014 on these interventions that were co-authored by at least one lead developer of the programme – a total of 136 studies.

Two journal editors refused consent to be included in the research, leaving 134 studies. Of all these studies, researchers found 92 of them – equalling 71% – to have absent, incomplete or partly misleading conflict of interest disclosures.

The research team contacted journal editors about the 92 published studies: each reported on the effectiveness of one of the four commercial psychosocial interventions and was co-authored by a primary developer of the self-same therapy, yet listed no conflict of interest or, in a few cases, only an incomplete one.

This led to 65 of the studies being amended with an ‘erratum’, or correction. In 16 cases, the journal editors admitted “mishandling” a disclosure, resulting in the lack of a conflict of interest statement.

In the remaining 49 cases, the journal editors contacted the studies’ authors seeking clarification. In every case the authors submitted a new or revised conflict of interest statement. Eisner and colleagues write that the “substantial variability in disclosure rates suggests that much responsibility seems to lie with the authors”.

The most common reason given by those journals that did not issue a correction was that they did not have a conflict of interest policy in place at the time of the published study’s submission.

While the overall rate of adequate disclosures in clear cases of conflict of interest was less than a third, just 32%, the rates for the four programmes varied significantly. The lowest rate of disclosures was found in academic studies on the Triple P programme, at just 11%.

Triple P is a standardised system of parenting support interventions based on cognitive-behavioural therapy. Initially developed by Professor Matthew Sanders at the University of Queensland, Triple P has sold around seven million copies of its standard programme across 25 countries since it began commercial operations in 1996, with over 62,000 licensed providers – mainly trained psychologists.

In 2001, the University of Queensland ‘spun out’ the licensing contract into a private company, the royalties from which are distributed between three groups of beneficiaries: the University of Queensland itself, Prof Sanders’ Parenting and Family Support Centre (also at Queensland), and the authors of Triple P.

Despite being one of the most widely evaluated parenting programmes worldwide, the evidence for the success of Triple P is controversial, say the researchers.

Several analyses of Triple P – including those by Triple P authors with previously undeclared conflicts of interest – show positive effects. However, at least one independent systematic review cited in the new PLOS ONE study found “no convincing evidence” that the Triple P has any positive effects in the long run.

“Researchers with a conflict of interest should not be presumed to conduct less valid scholarship, and transparency doesn’t necessarily improve the quality of research, but it does make a difference to how those findings are assessed,” said Eisner.

In the Journal of Child and Family Studies in January 2015, Triple P creator Prof Sanders wrote that “[p]artly as a result of these types of criticisms” his research group had “undertaken a comprehensive review of our own quality assurance practices”.

Added Eisner: “The development of standardised, evidence-based programmes such as Triple P is absolutely the right thing to do. If we have comparable interventions providing an evidence base then it promotes innovation and stops us running around in circles. But we need to be able to trust the findings, and that requires transparency when it comes to conflicts of interest.”


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.