Feeding Food Waste To Pigs Could Save Vast Swathes of Threatened Forest and Savannah

source: www.cam.ac.uk

New research suggests that feeding our food waste, or swill, to pigs (currently banned under EU law) could save 1.8 million hectares of global agricultural land – an area roughly half the size of Germany, including hundreds of thousands of acres of South America’s biodiverse forests and savannahs – and provide a use for the 100 million tonnes of food wasted in the EU each year.

It is time to reassess whether the EU’s blanket ban on the use of food waste as feed is the right thing for the pig industry

Erasmus zu Ermgassen

A new study shows that if the European Union lifted the pigswill ban imposed following 2001’s foot-and-mouth disease epidemic, and harnessed technologies developed in East Asian countries for ‘heat-treating’ our food waste to safely turn it into pig feed, around 1.8 million hectares of land could be saved from being stripped for grain and soybean-based pig feed production – including over a quarter of a million hectares of Brazilian forest and savannah.

While swill-feeding was banned across the EU in 2002 following the foot-and-mouth outbreak – triggered by a UK farmer illegally feeding uncooked food waste to pigs – other countries, such as Japan, responded by creating a highly regulated system for safely recycling heat-treated food waste as animal feed.

Researchers describe the EU ban as a “knee-jerk reaction” that no longer makes sense when East Asian countries have demonstrated that food waste can be safely recycled. The models in the latest study show that pigswill reintroduction would not only decrease the amount of land the EU pork industry requires by 21.5%, but also cut in half the ever-increasing feed costs faced by European pig farmers.
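The two headline figures above imply a total land footprint for EU pork production. A quick back-of-envelope sanity check (illustrative only, using just the numbers quoted in the article rather than the study’s own model) recovers it:

```python
# Back-of-envelope check of the land-use figures quoted in the article:
# a saving of 1.8 million hectares is stated to be a 21.5% reduction,
# which implies the total feed-related land footprint of EU pork.

land_saved_mha = 1.8        # million hectares saved (article figure)
reduction = 0.215           # 21.5% reduction (article figure)

implied_total_mha = land_saved_mha / reduction
print(f"Implied total land use of EU pork: {implied_total_mha:.1f} million ha")
```

This puts the implied total at roughly 8.4 million hectares, consistent in scale with the soybean figure (over 1.2 million hectares in South America) cited later in the piece.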

Researchers describe swill as a feed which is commonly used in other parts of the world, one that could save a huge amount of global resources, and provide an environmentally sound recycling solution to the estimated 102.5 million tonnes of food wasted in the EU each year. Over 35% of food waste is now recycled into animal feed in Japan, where swill-fed “Eco-pork” is marketed as a premium product.

“Following the foot-and-mouth disease outbreak, different countries looked at the same situation, the same evidence, and came to opposite conclusions for policy,” said Erasmus zu Ermgassen from the University of Cambridge’s Department of Zoology, who led the study, published today in the journal Food Policy. “In many countries in East Asia we have a working model for the safe use of food waste as pig feed. It is a highly regulated and closely monitored system that recycles food waste and produces low-cost pig feed with a low environmental impact.”

The researchers examined data about the current land use of EU pork, the availability of food waste in the EU, and the quality and quantity of pork produced in feed trials that compared pigswill to grain-based diets, to produce a model of how much land could be saved if the pigswill ban was lifted.

Some 21.5 million tonnes of pork, around 34kg of pork per person, are produced in the EU each year. Livestock production occupies approximately 75% of agricultural land worldwide – with most of this used to produce animal feed. For EU pork, much of the environmental burden stems from the farming of soybean meal, which takes up in excess of 1.2 million hectares of land across South America.

As swill is much cheaper than grain and soybean-based pig feed, reintroducing swill feeding could reduce costs faced by EU pig farmers by 50%, say the researchers.

Most objections to swill feeding in the EU stem from concerns about safety, and the sentiment that feeding human food waste to pigs is unnatural. But zu Ermgassen argues that those concerns are largely based on incorrect assumptions.

“Pigs are omnivorous animals; in the wild they would eat anything they could forage for, from vegetable matter to other animal carcasses, and they have been fed food waste since they were domesticated by humans 10,000 years ago. Swill actually provides a more traditional diet for pigs than the grain-based feed currently used in modern EU systems,” said zu Ermgassen.

“A recent survey found that 25% of smallholder farmers in the UK admit to illegally feeding uncooked food waste to their pigs, so the fact is that the current ban is not particularly safe from a disease-outbreak perspective. Feeding uncooked food waste is dangerous because pigs can catch diseases from raw meat, but a system supporting the regulated use of heat-treated swill does not have the same risks,” he said.

With the demand for meat and dairy products forecast to increase 60% by 2050, reducing the environmental footprint of current systems of meat production will become increasingly critical.

zu Ermgassen points out that economic and environmental concern is driving a reassessment of EU animal feed bans that were put in place in the 2000s, as well as attempts to recycle food waste more effectively. The EU is currently looking into repealing bans on using waste pig and poultry products as fish feed and reintroducing insects as pig and poultry feed.

“The reintroduction of swill feeding in the EU would require backing from pig producers, the public, and policy makers, but it has substantial potential to improve the environmental and economic sustainability of EU pork production. It is time to reassess whether the EU’s blanket ban on the use of food waste as feed is the right thing for the pig industry,” he said.

Erasmus zu Ermgassen’s research is funded by the Biotechnology and Biological Sciences Research Council.

Reference

Erasmus K.H.J. zu Ermgassen, et al. “Reducing the land use of EU pork production: where there’s swill, there’s a way” Food Policy Vol 58 (January 2016). DOI:10.1016/j.foodpol.2015.11.001.

Inset image: “Save Kitchen Waste to Feed the Pigs!” poster from the Imperial War Museums © IWM (Art.IWM PST 14743).


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


The Periodic Table of Proteins

source: www.cam.ac.uk

Researchers have devised a periodic table of protein complexes, making it easier to visualise, understand and predict how proteins combine to drive biological processes.

We’re bringing a lot of order into the messy world of protein complexes

Sebastian Ahnert

A new ‘periodic table’ of protein complexes, devised by an interdisciplinary team of researchers, provides a unified way to classify and visualise protein complexes, which drive a huge range of biological processes, from DNA replication to catalysing metabolic reactions.

The table, published in the journal Science, offers a new way of looking at almost all known molecular structures and predicting how new ones could be made, providing a valuable tool for research into evolution and protein engineering.

By using the table, researchers are able to predict the likely forms of protein complexes with unknown structure, estimate the feasibility of entirely new structures, and identify possible errors in existing structural databases. It was created by an interdisciplinary team led by researchers at the University of Cambridge and the Wellcome Genome Campus.

Almost every biological process depends on proteins interacting and assembling into complexes in a specific way, and many diseases, such as Alzheimer’s and Parkinson’s, are associated with problems in complex assembly. The principles underpinning this organisation are not yet fully understood, but the new periodic table presents a systematic, ordered view on protein assembly, providing a visual tool for understanding biological function.

“We’re bringing a lot of order into the messy world of protein complexes,” said the paper’s lead author Sebastian Ahnert of Cambridge’s Cavendish Laboratory, a physicist who regularly tangles with biological problems. “Proteins can keep combining in these simple ways, adding more and more levels of complexity and resulting in a huge variety of structures. What we’ve made is a classification based on underlying principles that helps people get a handle on the complexity.”

The exceptions to the rule are interesting in their own right, added Ahnert, and are the subject of continuing studies.

“Evolution has given rise to a huge variety of protein complexes, and it can seem a bit chaotic,” said study co-author Joe Marsh, formerly of the Wellcome Genome Campus and now of the MRC Human Genetics Unit at the University of Edinburgh. “But if you break down the steps proteins take to become complexes, there are some basic rules that can explain almost all of the assemblies people have observed so far.”

Ballroom dancing can be seen as an endless combination of riffs on the waltz, fox trot and cha-cha. Similarly, the ‘dance’ of protein complex assembly can be seen as endless variations on dimerisation (one doubles, and becomes two), cyclisation (one forms a ring of three or more) and subunit addition (two different proteins bind to each other). Because these happen in a fairly predictable way, it’s not as hard as you might think to predict how a novel protein would form.

Some protein complexes, called homomers, feature multiple copies of a single protein, while others, called heteromers, are made from several different types of proteins. The table shows that there is a very close relationship between the possible structures of heteromers and homomers. In fact, the vast majority of heteromers can be thought of as homomers in which the single protein is replaced by a repeated unit of several proteins. The table was constructed using computational analysis of a large database of protein-protein interfaces.

“By analysing the tens of thousands of protein complexes for which three-dimensional structures have already been experimentally determined, we could see repeating patterns in the assembly transitions that occur – and with new data from mass spectrometry we could start to see the bigger picture,” said Marsh.

“The core work for this study is in theoretical physics and computational biology, but it couldn’t have been done without the mass spectrometry work by our colleagues at Oxford University,” said Sarah Teichmann, Research Group Leader at the European Bioinformatics Institute (EMBL-EBI) and the Wellcome Trust Sanger Institute. “This is yet another excellent example of how extremely valuable interdisciplinary research can be.”

Reference:
Ahnert SE, et al. ‘Principles of assembly reveal a periodic table of protein complexes.’ Science (2015). DOI: 10.1126/science.aaa2245

Adapted from an EMBL-EBI press release.





Millet: The Missing Piece In The Puzzle Of Prehistoric Humans’ Transition From Hunter-Gatherers To Farmers

source: www.cam.ac.uk

New research shows that a cereal familiar today as birdseed was carried across Eurasia by ancient shepherds and herders, laying the foundation, in combination with the new crops they encountered, for ‘multi-crop’ agriculture and the rise of settled societies. Archaeologists say ‘forgotten’ millet has a role to play in modern crop diversity and today’s food security debate.

We have been able to follow millet moving in deep history, from where it originated in China and spread across Europe and India

Martin Jones

The domestication of the small-seeded cereal millet in North China around 10,000 years ago created the perfect crop to bridge the gap between nomadic hunter-gathering and organised agriculture in Neolithic Eurasia, and may offer solutions to modern food security, according to new research.

Now a forgotten crop in the West, this hardy grain – familiar today as birdseed – was ideal for ancient shepherds and herders, who carried it right across Eurasia, where it was mixed with crops such as wheat and barley. This gave rise to ‘multi-cropping’, which in turn sowed the seeds of complex urban societies, say archaeologists.

A team from the UK, USA and China has traced the spread of the domesticated grain from North China and Inner Mongolia into Europe through a “hilly corridor” along the foothills of Eurasia. Millet favours uphill locations, doesn’t require much water, and has a short growing season: it can be harvested 45 days after planting, compared with 100 days for rice, allowing a very mobile form of cultivation.

Nomadic tribes were able to combine growing crops of millet with hunting and foraging as they travelled across the continent between 2500 and 1600 BC. Millet was eventually mixed with other crops in emerging populations to create ‘multi-crop’ diversity, which extended growing seasons and provided our ancient ancestors with food security.

The need to manage different crops in different locations, along with the water resources they required, drove the development of elaborate social contracts and the rise of more settled, stratified communities and eventually complex ‘urban’ human societies.

Researchers say we need to learn from the earliest farmers when thinking about feeding today’s populations, and millet may have a role to play in protecting against modern crop failure and famine.

“Today millet is in decline and attracts relatively little scientific attention, but it was once among the most expansive cereals in geographical terms. We have been able to follow millet moving in deep history, from where it originated in China and spread across Europe and India,” said Professor Martin Jones from the University of Cambridge’s Department of Archaeology and Anthropology, who is presenting the research findings today at the Shanghai Archaeological Forum.

“These findings have transformed our understanding of early agriculture and society. It has previously been assumed that early agriculture was focused in river valleys where there is plentiful access to water. However, millet remains show that the first agriculture was instead centred higher up on the foothills – allowing this first pathway for ‘exotic’ eastern grains to be carried west.”

The researchers carried out radiocarbon dating and isotope analysis on charred millet grains recovered from archaeological sites across China and Inner Mongolia, as well as genetic analysis of modern millet varieties, to reveal the process of domestication that occurred over thousands of years in northern China and produced the ancestor of all broomcorn millet worldwide.

“We can see that millet in northern China was one of the earliest centres of crop domestication, occurring over the same timescale as rice domestication in south China and barley and wheat in west China,” explained Jones.

“Domestication is hugely significant in the development of early agriculture – humans select plants with seeds that don’t fall off naturally and can be harvested, so over several thousand years this creates plants that are dependent on farmers to reproduce,” he said.

“This also means that the genetic make-up of these crops changes in response to changes in their environment – in the case of millet, we can see that certain genes were ‘switched off’ as they were taken by farmers far from their place of origin.”

As the network of farmers, shepherds and herders crystallised across the Eurasian corridor, they shared crops and cultivation techniques with other farmers, and this, Jones explains, is where the crucial idea of ‘multi-cropping’ emerged.

“The first pioneer farmers wanted to farm upstream in order to have more control over their water source and be less dependent on seasonal weather variations or potential neighbours upstream,” he said. “But when ‘exotic’ crops appear in addition to the staple crop of the region, then you start to get different crops growing in different areas and at different times of year. This is a huge advantage in terms of shoring up communities against possible crop failures and extending the growing season to produce more food or even surplus.

“However, it also introduces a more pressing need for cooperation, and the beginnings of a stratified society. With some people growing crops upstream and some farming downstream, you need a system of water management, and you can’t have water management and seasonal crop rotation without an elaborate social contract.”

Towards the end of the second and first millennia BC larger human settlements, underpinned by multi-crop agriculture, began to develop. The earliest examples of text, such as the Sumerian clay tablets from Mesopotamia, and oracle bones from China, allude to multi-crop agriculture and seasonal rotation.

But the significance of millet is not just in transforming our understanding of our prehistoric past. Jones believes that millet and other small-seeded crops may have an important role to play in ensuring future food security.

“The focus for looking at food security today is on the high-yield crops, rice, maize and wheat, which fuel 50% of the human food chain. However, these are only three of 50 types of cereal, the majority of which are small-grained cereals or ‘millets’. It may be time to consider whether millets have a role to play in a diverse response to crop failure and famine,” said Jones.

“We need to understand more about millet and how it may be part of the solution to global food security – we may have a lot still to learn from our Neolithic predecessors.”

Inset images: Martin Jones with millet in North China (Martin Jones); Inner Mongolian millet farmer in Chifeng (Martin Jones).




Alternative Ways of Protecting Urban Water Supplies Must Be Considered in Light of Worsening Droughts in the US, Study Claims

source: www.cam.ac.uk

Alternative models of watershed protection that balance recreational use and land conservation must no longer be ignored to preserve water supplies against the effects of climate change, argues a new study. Researchers claim that the management of Salt Lake City’s Wasatch watershed in Utah provides a valuable example contradicting the dominant view presented in academic literature that informs many current conservation strategies.

The chances of a ‘megadrought’ – one that lasts for 35 years or longer – affecting the Southwest and central Great Plains by 2100 are above 80% if climate change projections are not mitigated

Libby Blanchard

Salt Lake City’s preservation of the Wasatch watershed is an important model for protecting urban water sources through land use regulation and conservation, which could have important implications for preserving future water supplies against the effects of climate change in the American West, according to a new study. This example is currently absent from academic literature on ecosystem services, meaning that conservation discussions are instead dominated by models that focus on financial, ‘market-based’ incentives to protect watershed areas, which the researchers argue could be inappropriate in many circumstances.

The most prevalent model for water resource preservation is that of New York City’s Catskills/Delaware watershed, which is based on upstream resource users being paid to avoid harmful practices that might affect water flows and water quality, typically by beneficiaries who are downstream. These ‘market-based’ approaches (also known as Payments for Watershed or Ecosystem Services) have been widely promoted, but risk neglecting alternative approaches that do not always require monetary transactions to improve environmental outcomes.

In contrast, Salt Lake City’s management strategy allows regulated use of the watershed area for public recreation (unlike other forested catchments in the US where public access is prohibited to preserve water resources). In the Wasatch case, this means that the upstream catchment remains accessible, including for high impact uses such as skiing and mountain biking. Researchers argue that it is vital to consider these alternative strategies for solving the increasing water scarcity in the American West.

“While regulatory exclusion is often thought of as the only viable alternative to market-based incentives in managing ecosystem services, the management of the Wasatch watershed provides a third, yet under-recognised, successful conservation strategy for water resources,” says Libby Blanchard, lead author of the study from the University of Cambridge’s Department of Geography.

“The dominance of the Catskills example in discussions of watershed protection provides an unduly limited, and historically incomplete, perspective on interventions to secure water resources, and limits policy discussions about alternative conservation approaches,” she adds.

In the American West, unprecedented droughts have caused extreme water shortages. The current drought in California and across the West is entering its fourth year, with precipitation and water storage reaching record low levels. Droughts are ranked second in the US in terms of national weather-related economic impacts, with annual losses just shy of $9 billion. With water scarcity likely to increase due to advancing climate change, the economic and environmental impacts of drought are also likely to get worse.

“The chances of a ‘megadrought’ – one that lasts for 35 years or longer – affecting the Southwest and central Great Plains by 2100 are above 80% if climate change projections are not mitigated,” says Blanchard. “As the West faces more frequent and severe droughts, the successful protection of watersheds for the ecosystem services of water capture, storage, and delivery they provide will be increasingly important.”

“The sufficient and effective protection of watersheds will become more challenging, so awareness of alternative, successful strategies is critically important,” adds Bhaskar Vira, co-author of the study also from Cambridge’s Department of Geography. “The management of the Wasatch is one such strategy that should be more widely recognised amongst policymakers and researchers alike seeking effective solutions to water scarcity.”

The economic and instrumental value of the Wasatch watershed was noticed by Salt Lake City’s government as early as the 1850s, when the first legislation to protect the city’s natural resources was passed. Salt Lake City uses two tools to protect its watershed: purchasing land for conservation, and regulating land use by restricting a variety of activities within the watershed such as cattle grazing. Recreation is not altogether restricted, but is negotiated with the local community to allow public use. The Uinta-Wasatch-Cache National Forest is one of the most heavily visited national forests in the US, with 7 million annual visitors.

“Salt Lake City has been able to preserve the natural capital that protects its watershed while allowing recreational use. The preservation of the watershed actually boosts recreation, providing visitors with natural landscapes and unadulterated settings for mountain biking, hiking, skiing, and fly-fishing,” says Blanchard.

The city raises funds to buy land within the watershed through a surcharge on water customers’ monthly bills, which provides around $1.5 million each year to protect watershed lands from development. Since 1907, the city has managed to purchase over 23,000 acres of the watershed.
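As a rough illustration of the pace these purchases imply (assuming, purely for arithmetic’s sake, that the 23,000 acres were accumulated evenly between 1907 and the study’s 2015 publication), the figures above work out to:

```python
# Average annual land acquisition implied by the article's figures,
# under the simplifying assumption of a constant purchase rate.

acres_purchased = 23_000    # total purchased since 1907 (article figure)
years = 2015 - 1907         # span up to the study's publication year

rate = acres_purchased / years
print(f"Average acquisition rate: {rate:.0f} acres per year")
```

That is about 213 acres a year on average; actual purchases will of course have been lumpier, depending on what land came up for sale.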

“Despite the popularity and power of the Catskills narrative to promote the preservation of ecosystems via market-based incentives, we found that this narrative is at best partial, and quite possibly flawed,” says Blanchard.

“The Wasatch’s absence in the ecosystem literature results in an incomplete perspective on interventions to secure watershed ecosystem services and limits policy discussions in relation to alternative conservation approaches. It is vital that such alternatives are given more recognition in order to find effective solutions for the protection of natural capital in the future.”

Reference:

Blanchard, L et al. “The lost narrative: Ecosystem service narratives and the missing Wasatch watershed conservation story” Ecosystem Services, December 2015. DOI: 10.1016/j.ecoser.2015.10.019

Libby Blanchard’s research is funded by the Gates Cambridge Trust.

Inset image: Recreation in the Wasatch watershed (Libby Blanchard).



The Future of Intelligence: Cambridge University Launches New Centre to Study AI and the Future of Humanity

source: www.cam.ac.uk

The University of Cambridge is launching a new research centre, thanks to a £10 million grant from the Leverhulme Trust, to explore the opportunities and challenges to humanity from the development of artificial intelligence.

Machine intelligence will be one of the defining themes of our century, and the challenges of ensuring that we make good use of its opportunities are ones we all face together

Huw Price

Human-level intelligence is familiar in biological “hardware” – it happens inside our skulls. Technology and science are now converging on a possible future where similar intelligence can be created in computers.

While it is hard to predict when this will happen, some researchers suggest that human-level AI will be created within this century. Freed of biological constraints, such machines might become much more intelligent than humans. What would this mean for us? Stuart Russell, a world-leading AI researcher at the University of California, Berkeley, and collaborator on the project, suggests that this would be “the biggest event in human history”. Professor Stephen Hawking agrees, saying that “when it eventually does occur, it’s likely to be either the best or worst thing ever to happen to humanity, so there’s huge value in getting it right.”

Now, thanks to an unprecedented £10 million grant from the Leverhulme Trust, the University of Cambridge is to establish a new interdisciplinary research centre, the Leverhulme Centre for the Future of Intelligence, to explore the opportunities and challenges of this potentially epoch-making technological development, both short and long term.

The Centre brings together computer scientists, philosophers, social scientists and others to examine the technical, practical and philosophical questions artificial intelligence raises for humanity in the coming century.

Huw Price, the Bertrand Russell Professor of Philosophy at Cambridge and Director of the Centre, said: “Machine intelligence will be one of the defining themes of our century, and the challenges of ensuring that we make good use of its opportunities are ones we all face together. At present, however, we have barely begun to consider its ramifications, good or bad”.

The Centre is a response to the Leverhulme Trust’s call for “bold, disruptive thinking, capable of creating a step-change in our understanding”. The Trust awarded the grant to Cambridge for a proposal developed with the Executive Director of the University’s Centre for the Study of Existential Risk (CSER), Dr Seán Ó hÉigeartaigh. CSER investigates emerging risks to humanity’s future including climate change, disease, warfare and technological revolutions.

Dr Ó hÉigeartaigh said: “The Centre is intended to build on CSER’s pioneering work on the risks posed by high-level AI and place those concerns in a broader context, looking at themes such as different kinds of intelligence, responsible development of technology and issues surrounding autonomous weapons and drones.”

The Leverhulme Centre for the Future of Intelligence spans institutions, as well as disciplines. It is a collaboration led by the University of Cambridge with links to the Oxford Martin School at the University of Oxford, Imperial College London, and the University of California, Berkeley. It is supported by Cambridge’s Centre for Research in the Arts, Social Sciences and Humanities (CRASSH). As Professor Price put it, “a proposal this ambitious, combining some of the best minds across four universities and many disciplines, could not have been achieved without CRASSH’s vision and expertise.”

Zoubin Ghahramani, Deputy Director, Professor of Information Engineering and a Fellow of St John’s College, Cambridge, said:

“The field of machine learning continues to advance at a tremendous pace, and machines can now achieve near-human abilities at many cognitive tasks—from recognising images to translating between languages and driving cars. We need to understand where this is all leading, and ensure that research in machine intelligence continues to benefit humanity. The Leverhulme Centre for the Future of Intelligence will bring together researchers from a number of disciplines, from philosophers to social scientists, cognitive scientists and computer scientists, to help guide the future of this technology and study its implications.”

The Centre aims to lead the global conversation about the opportunities and challenges to humanity that lie ahead in the future of AI. Professor Price said: “With far-sighted alumni such as Charles Babbage, Alan Turing, and Margaret Boden, Cambridge has an enviable record of leadership in this field, and I am delighted that it will be home to the new Leverhulme Centre.”



Law in Focus: Parliament’s Role in Voting on the Syrian Conflict


source: www.cam.ac.uk

This video discusses six issues arising out of the recent statement of Prime Minister David Cameron to the House of Commons on the extension of offensive British military operations in Syria.

Following the statement of Prime Minister David Cameron to the House of Commons entitled “Prime Minister’s Response to the Foreign Affairs Select Committee on the Extension of Offensive British Military Operations to Syria”, Dr Veronika Fikfak and Dr Hayley J Hooper discuss the questionable international legality of military action, the strategic use of parliament and its potential impact upon the emerging Consultation Convention, and the responsibility of MPs to hold government to account across a broad range of relevant domestic issues.

They analyse the impact of the way government shares intelligence information with the House of Commons, especially in light of the 2003 Iraq conflict, highlighting several relevant but under-discussed rules. Finally, they discuss the role of party political discipline on armed conflict votes.

Dr Fikfak researches in the fields of public law, human rights and international law. She is particularly interested in the interface between domestic and international law and is currently writing a monograph on the role of national judges in relation to international law. Dr Hooper is currently a Fellow at Homerton College, and her doctoral research at Balliol College, University of Oxford concerned the use of “closed” or “secret” evidence in the context of judicial review of counterterrorism powers, and its extension to civil procedure more broadly.

Drs Fikfak and Hooper are currently co-authoring a monograph on parliament’s involvement in war powers entitled Parliament’s Secret War (forthcoming with Hart Bloomsbury, 2016).

Law in Focus is a series of short videos featuring academics from the University of Cambridge Faculty of Law, addressing legal issues in current affairs and the news. These issues are examples of the many which challenge researchers and students studying undergraduate and postgraduate law at the Faculty. Law in Focus is available on YouTube, or by subscription on iTunes U.

Other collections of video and audio recordings from the Faculty of Law are available at Lectures at Law.



 

Funding Boost For Infrastructure Research at Cambridge

Funding boost for infrastructure research at Cambridge

www.cam.ac.uk

Two new funding initiatives at the University of Cambridge will support the UK’s infrastructure and cities.

Research at the University of Cambridge to support the UK’s infrastructure and cities has received further backing in the form of two major funding initiatives. The Centre for Smart Infrastructure and Construction (CSIC) has secured a further five years of funding from the Engineering and Physical Sciences Research Council (EPSRC) and Innovate UK; while the UK Collaboratorium for Research in Infrastructure and Cities (UKCRIC), of which Cambridge is a partner, has secured £138 million of funding, to be match funded from other sources, as part of last week’s spending review.

The additional funding to CSIC will allow it to build on its significant achievements over the past five years to become a widely recognised hub for the infrastructure and construction industry, bringing together leading academics and industrialists, developing a faster route for innovation adoption, providing an ecosystem for building confidence in new innovations, and enabling their timely implementation and exploitation.

“CSIC will continue to engage with business leaders and decision makers in key markets to ensure that our work continues to meet industry needs, and that industry leaders are well informed of the value that ‘smart’ innovations in infrastructure and construction can bring to their business,” said Jennifer Schooling, Director of CSIC. “CSIC’s ability to deliver value is unrivalled. Our outputs present real opportunities to make major improvements in how we create new infrastructure.”

CSIC’s activities have already had substantial impact through a wide variety of tools and technologies – including fibre optic strain measurement, UtterBerry ultra-low-power wireless sensor motes, vibration energy harvesting devices and the CSattAR photogrammetric monitoring system – recently deployed on some of the largest civil engineering projects, including Crossrail, National Grid, London Underground, CERN and the Staffordshire Alliance.

The application of CSIC’s capability and knowledge is now being broadened to new markets including water infrastructure, highways and power.

“Securing this funding for the next five years offers a wide range of opportunities to take CSIC’s work forward and embed a culture of innovation adoption in the infrastructure and construction industries,” said Schooling. “CSIC cannot achieve this alone – working with industry is the key to our success to date and we always welcome approaches from industry partners seeking to collaborate.”

Professor Philip Nelson, CEO, EPSRC, said: “The Centre will continue its leading role within the UK by increasing the lifetime of ageing infrastructure, making it more resilient, and making construction processes more efficient by using smart sensing technology. This collaborative research between academia and industry will increase the UK’s competitiveness, lead to savings quantified in millions of pounds and provide technology that can be exported for UK-based companies.”

Kevin Baughan, Director of Technology and Innovation at Innovate UK, said: “The work of CSIC has helped to demonstrate the value of smart infrastructure to the construction industry, and this is reflected in the recognition of innovation at the heart of the future plans of the Construction Leadership Council. By extending funding for a further five years, we underline our support for their commitment to raise the commercialisation bar even higher. This will help companies of all sizes grow through leveraging the excellent UK science base in smart infrastructure.”

UKCRIC is a collaboration of 14 UK universities which aims to provide a knowledge base to ensure the long-term functioning of the UK’s transport systems, energy systems, clean water supplies, waste management, flood defences and the development of SMART infrastructures.

Outside national security and medicine, UKCRIC will be one of the largest collaborative research projects in the UK. Current national and international partners include: Bristol City Council, Network Rail, Mott MacDonald, Buro Happold, Atkins, National Grid, DfT, EDF and Thames Water, with many more partners to follow. In order to tap further into the UK’s expertise and creativity UKCRIC’s founding core of 14 universities will be expanded over the coming years.

Cambridge will receive funding through UKCRIC, which will be used to support research in the application of advanced sensor technologies to the monitoring of the UK’s existing and future infrastructure, in order to protect and maintain it.

UKCRIC programmes will integrate research on infrastructure needs, utilisation and performance through experiments, analysis, living labs and modelling. This will provide a new combination of decision support tools to inform infrastructure operators, planners, financiers, regulators, cities, and government on the optimisation of infrastructure capacity, performance and investment.

 



How To Escape a Black Hole

How to escape a black hole

 

source: www.cam.ac.uk

An international team of astrophysicists, including researchers from the University of Cambridge, has observed a new way for gas to escape the gravitational pull of a supermassive black hole.

These jets are a unique tool for probing supermassive black holes

Morgan Fraser

The results, published in the journal Science, are based on new radio observations tracking a star as it gets torn apart by a black hole. Such violent events yield a burst of light which is produced as the bits and pieces of the star fall into the black hole. For the first time, the researchers were able to show that this burst of light is followed by a radio signal from the matter that was able to escape the black hole by travelling away in a jetted outflow at nearly the speed of light.

The discovery of the jet was made possible by a rapid observational response after the stellar disruption (known as ASASSN-14li) was announced in December 2014. The radio data were taken by the 4 PI SKY team at Oxford, using the Arcminute Microkelvin Imager Large Array located in Cambridge.

“Previous efforts to find evidence for these jets, including my own, were late to the game,” said Sjoert van Velzen of Johns Hopkins University, the study’s lead author. Co-author Nicholas Stone added that “even after they got to the game, these earlier attempts were observing from the bleachers, while we were the first to get front row seats.”

In this branch of astronomy, the ‘front row’ means a distance of 300 million light years, while previous observations were based on events occurring at least three times as far away.

Jets are often observed in association with black holes, but their launch mechanism remains a major unsolved problem in astrophysics. Most supermassive black holes are fed a steady diet of gas, leading to jets that live for millions of years and change little on a human timescale. However, the newly discovered jet behaved very differently: the observations show that following a brief injection of energy, it produced short but spectacular radio fireworks.

The observed jet was anticipated by the so-called scale-invariant model of accretion, also known as the Matryoshka-doll theory of astrophysics. It predicts that all compact astrophysical objects (white dwarfs, neutron stars, or black holes) that accrete matter behave and look the same after a simple correction based solely on the mass of the object. In other words, the larger Matryoshka doll (a supermassive black hole) is just a scaled-up version of the smaller doll (a neutron star). Since neutron stars consistently produce radio-emitting jets when they are supplied with a large amount of gas, the theory predicts that supermassive black holes should do the same when they swallow a star.

“I always liked the elegant nature of the scale-invariant theory, but previous observations never found evidence for the new type of jet it predicted,” said van Velzen. “Our new findings suggest that this new type of jet could indeed be common and previous observations were simply not sensitive enough to detect them.”

“These jets are a unique tool for probing supermassive black holes,” said co-author Dr Morgan Fraser of Cambridge’s Institute of Astronomy. “While black holes themselves do not emit light, by observing how a star is torn apart as it falls in we can indirectly study the sleeping monster at the heart of a galaxy.”

The study hypothesises that every stellar disruption leads to a radio flare similar to the one just discovered. Ongoing surveys such as the Gaia Alerts project, led by the University of Cambridge, will find many more of these rare events.

“Gaia has exceptionally sharp eyes, and is ideally suited to find events like this, which occur in the very centres of galaxies,” said co-author Dr Heather Campbell, also from Cambridge’s Institute of Astronomy. “Finding more of these rare events may further our understanding of the processes that allow black holes to launch such spectacular outflows.”

Reference:
Van Velzen, S. et al. ‘A radio jet from the optical and X-ray bright stellar tidal disruption flare ASASSN-14li.’ Science (2015). DOI: 10.1126/science.aad1182

Adapted from a Johns Hopkins press release. 



Two-Thirds of Studies On ‘Psychosocial’ Treatments Fail To Declare Conflicts of Interest

Two-thirds of studies on ‘psychosocial’ treatments fail to declare conflicts of interest

source: www.cam.ac.uk

The creators of commercially sold counselling programmes increasingly profit from public health services across the world. However, a new study of the evidence base for some of the market leaders reveals that serious conflicts of interest habitually go undisclosed across the majority of the research.

Policy makers in public health have a right to expect transparency about conflicts of interest in academic research

Manuel Eisner

Health services in many countries increasingly rely on prescribed ‘psychosocial interventions’: treatments that use counselling techniques to tackle mental health issues, behavioural problems such as substance abuse, and assist parents with new or troubled children.

These highly-regarded therapeutic and educational programmes, devised by senior academics and practitioners, are sold commercially to public health services across the world on the basis that they are effective interventions for people in need of support – with the evidence to back them up.

However, the first study to investigate conflicts of interest in the published evidence for intervention treatments has revealed that the majority of academic studies which assert evidence of effectiveness list authors who profit from the distribution of these programmes, yet few declare a conflict of interest.

In fact, the new research shows that as many as two-thirds of the studies that list an author who financially benefits from sales of said treatment programmes declare no conflict of interest whatsoever.

While major steps have been taken to counter research bias in other fields such as pharmaceuticals, the new study’s authors say that hugely influential psychosocial treatments suffer a distinct lack of transparency from academics who both publish research on treatment effectiveness and stand to gain significantly from any positive findings.

They write that as commercial psychosocial treatments – many of which cost hundreds, even thousands, of dollars per participant – continue to gain traction with national public health services, it is important that “systems for effective transparency are implemented” to ensure clinical commissioning bodies are aware of potential research biases. The findings are published today in the journal PLOS ONE.

“Contrary to some, I have no problem with introducing commercial programmes into a national health service if decision makers and trusts come to the conclusion that a commercially disseminated treatment is more effective than their current psychosocial offerings, but this must be based on fair and transparent evidence,” said the study’s lead author Professor Manuel Eisner, from Cambridge’s Institute of Criminology.

“What you don’t want to see is an intervention system that remains as effective, or becomes less effective, despite buying in expensive programmes, because you have a public goods service competing with research that has a commercial interest to publish overly optimistic findings,” Eisner said.

“Policy makers in public health have a right to expect transparency about conflicts of interest in academic research.”

Four internationally disseminated psychosocial interventions – described by Eisner as “market leaders” – were examined: the Positive Parenting Programme (or Triple P); the Nurse-Family Partnership; the parenting and social skills programme Incredible Years; and the Multi-Systemic Therapy intervention for youth offenders.

The researchers inspected all articles published in academic journals between 2008 and 2014 on these interventions that were co-authored by at least one lead developer of the programme – a total of 136 studies.

Two journal editors refused consent to be included in the research, leaving 134 studies. Of all these studies, researchers found 92 of them – equalling 71% – to have absent, incomplete or partly misleading conflict of interest disclosures.

The research team contacted journal editors about the 92 published studies that assessed the effectiveness of one of these four commercial psychosocial interventions and were co-authored by a primary developer of the self-same therapy, yet listed no conflict of interest or, in a few cases, only an incomplete one.

This led to 65 of the studies being amended with an ‘erratum’, or correction. In 16 cases, the journal editors admitted “mishandling” a disclosure, resulting in the lack of a conflict of interest statement.

In the remaining 49 cases, the journal editors contacted the studies’ authors seeking clarification. In every case the authors submitted a new or revised conflict of interest statement. Eisner and colleagues write that the “substantial variability in disclosure rates suggests that much responsibility seems to lie with the authors”.

The most common reason given by those journals that did not issue a correction was that they did not have a conflict of interest policy in place at the time of the published study’s submission.

While the overall rate of adequate disclosures in clear cases of conflict of interest was less than a third, just 32%, the rates for the four programmes varied significantly. The lowest rate of disclosures was found in academic studies on the Triple P programme, at just 11%.

Triple P is a standardised system of parenting support interventions based on cognitive-behavioural therapy. Initially developed by Professor Matthew Sanders at the University of Queensland, Triple P has sold around seven million copies of its standard programme across 25 countries since it began commercial operations in 1996, with over 62,000 licensed providers – mainly trained psychologists.

In 2001, Queensland ‘spun out’ the licensing contract into a private company, the royalties from which are distributed among three groups of beneficiaries: the University of Queensland itself, Prof Sanders’ Parenting and Family Support Centre (also at Queensland), and the authors of Triple P.

Despite being one of the most widely evaluated parenting programmes worldwide, the evidence for the success of Triple P is controversial, say the researchers.

Several analyses of Triple P – including those by Triple P authors with previously undeclared conflicts of interest – show positive effects. However, at least one independent systematic review cited in the new PLOS ONE study found “no convincing evidence” that the Triple P has any positive effects in the long run.

“Researchers with a conflict of interest should not be presumed to conduct less valid scholarship, and transparency doesn’t necessarily improve the quality of research, but it does make a difference to how those findings are assessed,” said Eisner.

In the Journal of Child and Family Studies in January 2015, Triple P creator Prof Sanders wrote that “[p]artly as a result of these types of criticisms” his research group had “undertaken a comprehensive review of our own quality assurance practices”.

Added Eisner: “The development of standardised, evidence-based programmes such as Triple P is absolutely the right thing to do. If we have comparable interventions providing an evidence base then it promotes innovation and stops us running around in circles. But we need to be able to trust the findings, and that requires transparency when it comes to conflicts of interest.”



 

Moonlighting Molecules: Finding New Uses For Old Enzymes

Moonlighting molecules: finding new uses for old enzymes

source: www.cam.ac.uk

A collaboration between the University of Cambridge and MedImmune, the global biologics research and development arm of AstraZeneca, has led researchers to identify a potentially significant new application for a well-known human enzyme, which may have implications for treating respiratory diseases such as asthma.

MMP8 is well-known to biochemists and we all thought we understood its function, but it’s clear that this – and probably many other enzymes – ‘moonlight’ and have several functions within the body

Florian Hollfelder

Enzymes are biological catalysts – molecules that speed up chemical reactions within living materials. Many enzymes are already well characterised and their functions fairly well understood. For example, the enzyme known as MMP8 is present in the connective tissue of most mammals, where it breaks the chemical bonds found in collagen.

In pre-clinical research published in the journal Chemistry & Biology, Dr Florian Hollfelder from the Department of Biochemistry at Cambridge and Dr Lutz Jermutus, Senior Director, Research and Development at MedImmune, led a study to map a list of human enzymes (proteases) against potential protein drug targets.

Using automation technology at MedImmune, the team then tested each of the enzymes against each target protein in turn, allowing them to identify a significant number of so-far unknown interactions.

Of particular interest was how MMP8 was able to disable a molecule known as IL-13, which is known to play an important role in several inflammatory diseases such as asthma and dermatitis. The researchers believe this may be a previously-unknown way in which the body regulates the action of IL-13, preventing these diseases in the majority of individuals. If so, it could provide an interesting target for new drugs against these common diseases.

“MMP8 is well-known to biochemists and we all thought we understood its function, but it’s clear that this – and probably many other enzymes – ‘moonlight’ and have several functions within the body,” explains Dr Hollfelder. “Because the enzyme already had a ‘name’ and a function, nobody thought to see if it had a promiscuous side.”

Designing new enzymes has proven an extremely difficult technical challenge, hence the drive to find new uses for previously ‘understood’ enzymes. By focusing on human proteases, rather than bacterial proteases – which are actually easier to source – the researchers are confident that their research will be far more applicable to drug discovery.

“Our approach is new: we ‘recycle’ known enzymes and ask whether they can do other things than the ones they are known for,” adds Dr Jermutus. “In fact, we believe we have found other enzymes that could be similarly deployed against other disease-causing proteins, and this approach, if expanded, could provide further leads for new drugs.”

Commenting on the benefits of the collaboration with industry, Dr Hollfelder adds: “Without MedImmune, our work would have stopped after seeing and characterising the interactions. The additional extension to cell and mouse models would have been inconceivable in my basic science group.”

Reference
Urbach, C et al. Combinatorial Screening Identifies Novel Promiscuous Matrix Metalloproteinase Activities that Lead to Inhibition of the Therapeutic Target IL-13. Chemistry & Biology; 19 Nov 2015



Stored Fat Fights Against The Body’s Attempts To Lose Weight

Stored fat fights against the body’s attempts to lose weight

source: www.cam.ac.uk

The fatter we are, the more our body appears to produce a protein that inhibits our ability to burn fat, suggests new research published in the journal Nature Communications. The findings may have implications for the treatment of obesity and other metabolic diseases.

Our discovery may help explain why overweight individuals find it incredibly hard to lose weight. Their stored fat is actively fighting against their efforts to burn it off at the molecular level

Andrew Whittle

Most of the fat cells in the body act to store excess energy and release it when needed, but some types of fat cells, known as brown adipocytes, function primarily for a process known as thermogenesis, which generates heat to keep us warm. However, an international team of researchers from the Wellcome Trust-Medical Research Council Institute of Metabolic Science at the University of Cambridge, UK, and Toho University, Japan, has shown that a protein found in the body, known as sLR11, acts to suppress this process.

Researchers investigated why mice that lacked the gene for the production of this protein were far more resistant to weight gain. All mice – and, in fact, humans – increase their metabolic rate slightly when switched from a lower calorie diet to a higher calorie diet, but mice lacking the gene responded with a much greater increase, meaning that they were able to burn calories faster.

Further examinations revealed that in these mice, genes normally associated with brown adipose tissue were more active in white adipose tissue (which normally stores fat for energy release). In line with this observation, the mice themselves were indeed more thermogenic and had increased energy expenditure, particularly following high fat diet feeding.

The researchers were able to show that sLR11 binds to specific receptors on fat cells – in the same way that a key fits into a lock – to inhibit their ability to activate thermogenesis. In effect, sLR11 acts as a signal to increase the efficiency of fat to store energy and prevents excessive energy loss through unrestricted thermogenesis.

When the researchers examined levels of sLR11 in humans, they found that levels of the protein circulating in the blood correlated with total fat mass – in other words, the greater the levels of the protein, the higher the total fat mass. In addition, when obese patients underwent bariatric surgery, their degree of postoperative weight loss was directly proportional to the reduction in their sLR11 levels, suggesting that sLR11 is produced by fat cells.

In their paper the authors suggest that sLR11 helps fat cells resist burning too much fat during ‘spikes’ in other metabolic signals following large meals or short term drops in temperature. This in turn makes adipose tissue more effective at storing energy over long periods of time.

There is growing interest in targeting thermogenesis with drugs in order to treat obesity, diabetes and other associated conditions such as heart disease. This is because it offers a mechanism for disposing of excess fat in a relatively safe manner. A number of molecules have already been identified that can increase thermogenesis and/or the number of fat cells capable of thermogenesis. However, to date very few molecules have been identified that can decrease thermogenesis.

These findings shed light on one of the mechanisms that the body employs to hold onto stored energy, where sLR11 levels increase in line with the amount of stored fat and act to prevent it being ‘wasted’ for thermogenesis.

Dr Andrew Whittle, joint first author, said: “Our discovery may help explain why overweight individuals find it incredibly hard to lose weight. Their stored fat is actively fighting against their efforts to burn it off at the molecular level.”

Professor Toni Vidal-Puig, who led the team, added: “We have found an important mechanism that could be targeted not just to help increase people’s ability to burn fat, but also help people with conditions where saving energy is important such as anorexia nervosa.”

Jeremy Pearson, Associate Medical Director at the British Heart Foundation (BHF), which helped fund the research, said: “This research could stimulate the development of new drugs that either help reduce obesity, by blocking the action of this protein, or control weight loss by mimicking its action. Based on this promising discovery, we look forward to the Cambridge team’s future findings.

“But an effective medicine to treat obesity, which safely manages weight loss, is still some way off. In the meantime people can find advice on healthy ways to lose weight and boost their heart health on the BHF website.”

The study was part-funded by the British Heart Foundation, the Wellcome Trust, the Medical Research Council and the Biotechnology and Biological Sciences Research Council.

Reference
Whittle, AJ, Jiang, M, et al. Soluble LR11/SorLA represses thermogenesis in adipose tissue and correlates with BMI in humans. Nature Communications; 20 November 2015



Sensor Technology Firm Snapped Up

20th Nov 2015

Sensor technology firm snapped up

source: www.insidermedia.com

A sensor technology business in Cambridge has been snapped up by a subsidiary of US-owned infrastructure company Sensus in a deal worked on by law firm Mills & Reeve.

Sentec, based in Milton Road, is well-known for developing sensor technology and electronics for utilities, meters and appliance manufacturers. The company will maintain its independence, team and operations in Cambridge, and continue “business as normal”.

The sale comes after a ten-year working relationship between the two companies, during which Sensus helped develop some of the key technologies used in Sentec’s water and electric meters.

“This change of ownership will enable Sentec to further invest and develop its IP portfolio and resources to meet the increasing requirements of both companies’ customers as they look to exploit the full potential of connected devices and data,” said Sentec chief executive Chris Shelley.

“Sensus already has a strong commitment to invest in R&D; this tie-up will further bolster that commitment by bringing the complementary capabilities of both companies closer together – Sentec will function as a centre of excellence, continuing to provide its current range of services to its, and Sensus’, existing and future customers.”

Mills & Reeve corporate partner Anthony McGurk added: “The outlooks of these two technology firms are well aligned and we are pleased to have been able to help another Cambridge business realise its potential. The innovative technology coming out of the region is having a real impact globally, which this sale clearly demonstrates.”

The financial value of the transaction was undisclosed.

More or Less Ethical

More or less ethical

source: www.cam.ac.uk

The ethics of a person’s negotiating tactics may differ according to the nationality of the other party to the negotiation, according to a new study.

Business is increasingly global, so ethical concerns are becoming more important in terms of cross-national business and negotiations

David De Cremer

Do the ethics of a person’s negotiating tactics differ when they negotiate with someone from a different country? A new study co-authored at University of Cambridge Judge Business School suggests that they do.

While some prior studies have looked at the relative negotiating ethics of different nationalities, the new study, entitled “How ethically would Americans and Chinese negotiate? The effect of intra-cultural versus inter-cultural negotiations” and published in the Journal of Business Ethics, looks at a significant new factor: it finds that the nationality of the counterparty to negotiations can make people prefer the use of more or less ethical strategies, particularly in areas such as false promises and inappropriate information gathering.

“Business is increasingly global, so ethical concerns are becoming more important in terms of cross-national business and negotiations,” said co-author David De Cremer, KPMG Professor of Management Studies at Cambridge Judge. “This study shows that the other party’s nationality can affect the ethics of negotiating tactics, and this has important implications.”

The study is co-authored by Yu Yang of ShanghaiTech University, David De Cremer of Cambridge Judge Business School, and Chao Wang of the University of Illinois.

The study looks specifically at negotiations between Americans and Chinese, and doesn’t compile data involving other nationalities – but it suggests that the findings are not restricted to negotiations between US and Chinese nationals.

“Our current analysis suggests that people may change their use of ethically questionable tactics when they negotiate with someone from a different country,” the study says. “In negotiations, people adopt different models of what is ethically acceptable for themselves in intra-cultural versus inter-cultural situations.”

Specifically, the study found that American participants were more likely to use “ethically questionable” tactics in negotiations with Chinese (particularly related to dubious information gathering and false promises) than in negotiations with fellow Americans; for their part, Chinese participants were less likely to use ethically questionable tactics in negotiations with Americans (particularly related to false promises and “attacking the opponent’s network,” such as attempting to get the counterparty fired so a new person will take their place) than in intra-cultural negotiations with other Chinese.

“The US and China are currently the two largest economies in the world,” the study says. “Given the importance and complexity of this bilateral relationship, we must address how negotiations in such circumstances are shaped, particularly with respect to the norms and ethics being used when the representatives of both countries approach each other.”

The study is based on 389 American and 421 Chinese participants, all over age 22, with the vast majority employed and with at least some college education.

Participants were presented with a scenario: “You are the lead negotiator for a company that manufactures heavy equipment,” and are about to negotiate a deal to sell expensive excavators; the market is very competitive and your company has not met recent targets, and “if this sale is not secured your company will incur a loss.”

Each person works for a company located either in the US state of Illinois or in Hunan, China; the only variable is the counterparty (and their presumed nationality), who is located either “nearby” in your own country or “far away” (in the US or China), named either “Justin Adams” or “Jia Liu.”

Participants were asked, on a scale of one to seven, their likelihood of using 16 “ethically questionable” (in various degrees) negotiation strategies. In five broad categories, these strategies comprise false promises, misrepresentation to strengthen negotiating position, inappropriate information gathering about the counterparty’s negotiating position, attacking the opponent’s network, and “traditional” competitive bargaining such as inflated opening demands.

The study then calculated participants’ likelihood of overall use of ethically questionable negotiation tactics, as well as a breakdown by category.

“American participants were significantly more likely to use ethically questionable negotiation tactics in inter-cultural negotiations (Mean 3.00) with Chinese counterparts than in intra-cultural negotiations (Mean 2.75) with American counterparts. By contrast, Chinese participants were marginally less likely to use such tactics in inter-cultural negotiations (Mean 3.92) with American counterparts than in intra-cultural negotiations (Mean 4.06) with Chinese counterparts.”
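The aggregation behind these figures is straightforward: each participant rates 16 tactics on a one-to-seven scale, and the ratings are averaged overall and within each of the five categories. A minimal sketch in Python, using invented ratings purely to show the arithmetic (the study's raw data are not reproduced here):

```python
from statistics import mean

# Invented ratings for one hypothetical participant: category -> list of
# 1-7 likelihood ratings for the tactics in that category (16 items total
# across the study's five categories).
responses = {
    "false_promises":         [3, 2, 4],
    "misrepresentation":      [4, 3, 3],
    "information_gathering":  [5, 4],
    "attacking_network":      [2, 1, 2, 1],
    "competitive_bargaining": [6, 5, 5, 6],
}

# Per-category means give the breakdown; pooling all 16 items gives the
# overall likelihood score reported in the study.
category_means = {cat: mean(ratings) for cat, ratings in responses.items()}
overall = mean(r for ratings in responses.values() for r in ratings)

print(category_means)
print(overall)  # -> 3.5 for these invented ratings
```

Group-level figures such as the means of 3.00 versus 2.75 quoted above would then be averages of these per-participant scores across each condition.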

The study concludes: “As current business relationships are increasingly built on a global level, ethical concerns will become an even more important issue in future cross-national business negotiations. As such, we strongly believe that a more nuanced understanding of ethical practices in different countries needs to be developed.”

Reference:
Yu Yang et al. ‘How ethically would Americans and Chinese negotiate? The effect of intra-cultural versus inter-cultural negotiations’. Journal of Business Ethics (2015). DOI: 10.1007/s10551-015-2863-2

Originally published on the Cambridge Judge Business School website. 


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


Evolution Website Sets Out To Tackle Great Scientific Unknowns

Evolution website sets out to tackle great scientific unknowns

source: www.cam.ac.uk

Ever wondered if a fly can ride a bicycle, or whether you could survive only on water? A new website on evolution, created by Cambridge scientists and featuring contributions from luminaries including Sir David Attenborough, has some intriguing answers.

Like all the sciences, evolution is constantly, well, evolving. New insights and unexpected discoveries combine with seeing old things in a completely new light. It is active, dynamic, changing and unpredictable. We wanted to create a website that captures the excitement and thrill of that exploration.

Simon Conway Morris

Are there actually Martians out there? Could life survive in boiling water? And more importantly, what is your dog really thinking?

If these are the sort of questions that keep you awake at night, then help is finally at hand, in the form of a new website created by a team of scientists at Cambridge and featuring contributions from a host of leading academics.

Named Forty Two (after Douglas Adams’ famously cryptic solution to the meaning of life), the site is an online resource dedicated to the subject of evolution, and includes video interviews in which researchers including Sir David Attenborough, Simon Conway Morris, Eugene Koonin, and Carenza Lewis offer their views on topics ranging from the nature of evolution itself, to the future of life as we know it.

Aimed at general readers and, in particular, young people who are just starting to get into science, its aim is to provide an innovative and authoritative source of information about evolution on the web.

Its creators also hope to demonstrate that evolution is a subject that is, in itself, evolving. To prove this, the site uses the study of evolution to attempt to answer a host of knotty problems drawn from the fringes of current scientific understanding – questions such as “Can you have blue blood?”, “Do insects copulate with flowers?” and “Can you see heat?”

The site was created by a team led by Simon Conway Morris, Professor of Evolutionary Palaeobiology and a Fellow of St John’s College at the University of Cambridge. “Evolution is true, and if it didn’t happen, we wouldn’t be here,” he said. “Like all the sciences, evolution is constantly, well, evolving. New insights and unexpected discoveries combine with seeing old things in a completely new light. It is active, dynamic, changing and unpredictable. We wanted to create a website that captures the excitement and thrill of that exploration.”

The site features a unique video archive that collects the thoughts of leading scientists around the world. The most familiar, Sir David Attenborough, is, for example, captured reflecting, with troubling pessimism, on the future of the planet, in response to the question: “Are you optimistic about the human species?” “The truthful answer is that I am not,” he replies. “It seems to me almost inevitable that things are going to get worse before they get better… and the only way that we can stop that is by reducing carbon emissions very, very quickly indeed.”

Around that archive, the website’s designers have constructed a living database of information about evolutionary studies that illustrates the scope and scale of the scientific discussion that Darwin brought to the fore of public debate more than 150 years ago.

Dr Victoria Ling, from the University’s Department of Earth Sciences, said: “When you type ‘evolution’ into Google you get a lot of information, not all of which is very reliable, but even the sources that are reliable can inadvertently give the impression that evolution was ‘solved’ with Darwin. In fact, evolution remains a vibrant area of research and there’s an awful lot left to learn. We wanted to produce a site which showcases that ongoing discussion, one which has plenty of serious content, but also a strong sense of fun.”

The core material is divided into three main subject areas: “Here And Now”, for topics on which the scientific community has reached a rough consensus; “Near Horizons”, for nagging questions that are hotly debated; and “Far Horizons”, for really big issues that sit at the edge of current knowledge.

Users can also play a careers game that takes a light-hearted look at some of the real tasks real scientists can end up doing in the course of their work. This is personalised to the user’s interests based upon their answers to the questions “What am I into?” “How does my mind work?” and “Where is my focus?”. Historians can meanwhile delve into a selection of potted biographies of scientific pioneers, ranging from familiar figures such as Darwin himself, to lesser-sung heroes and heroines, such as the 19th-century fossil collector Mary Anning, and the American cytogeneticist Barbara McClintock.

Perhaps, however, the site’s most revelatory feature is its selection of Q&A topics so bizarre that they periodically sound more like something out of science fiction, as Douglas Adams himself might have hoped.

Who knew, for example, that rattlesnakes really can “see” heat, in a manner of speaking, thanks to evolved pits close to the front of their faces that relay information about thermal contrasts to the same part of the brain that registers information from the eyes? Or that dogs, which have evolved alertness to human gestures but appear to lack self-awareness may simply be part of a greater consciousness that we ourselves have yet to fathom?

As for the Martians, the answer remains similarly unclear, but Conway Morris suggests that we might be looking at the wrong planet for alien life. One theory has it that Venus could technically be inhabited by aerial microbes – something equivalent to the extremophiles found on Earth – eking out their existence amid the sulphuric clouds shrouding the planet.

“Even today maybe Venusian aerial life wafts its way across the 25 million miles or so of space that separate us,” Conway Morris writes. “Unlikely? Most certainly. Impossible? Perhaps not.”

For more, visit: http://www.42evolution.org/

Additional images taken from www.42evolution.org. 




‘Fourth Strand’ of European Ancestry Originated With Hunter-Gatherers Isolated By Ice Age

‘Fourth strand’ of European ancestry originated with hunter-gatherers isolated by Ice Age

source: www.cam.ac.uk

Populations of hunter-gatherers weathered Ice Age in apparent isolation in Caucasus mountain region for millennia, later mixing with other ancestral populations, from which emerged the Yamnaya culture that would bring this Caucasus hunter-gatherer lineage to Western Europe.

This Caucasus pocket is the fourth major strand of ancient European ancestry, one that we were unaware of until now

Andrea Manica

The first sequencing of ancient genomes extracted from human remains that date back to the Late Upper Palaeolithic period over 13,000 years ago has revealed a previously unknown “fourth strand” of ancient European ancestry.

This new lineage stems from populations of hunter-gatherers that split from western hunter-gatherers shortly after the ‘out of Africa’ expansion some 45,000 years ago and went on to settle in the Caucasus region, where southern Russia meets Georgia today.

Here these hunter-gatherers largely remained for millennia, becoming increasingly isolated as the Ice Age culminated in the last ‘Glacial Maximum’ some 25,000 years ago, which they weathered in the relative shelter of the Caucasus mountains until eventual thawing allowed movement and brought them into contact with other populations, likely from further east.

This led to a genetic mixture that resulted in the Yamnaya culture: horse-borne Steppe herders that swept into Western Europe around 5,000 years ago, arguably heralding the start of the Bronze Age and bringing with them metallurgy and animal herding skills, along with the Caucasus hunter-gatherer strand of ancestral DNA – now present in almost all populations from the European continent.

The research was conducted by an international team led by scientists from Cambridge University, Trinity College Dublin and University College Dublin. The findings are published today in the journal Nature Communications.

“The question of where the Yamnaya come from has been something of a mystery up to now,” said one of the lead senior authors Dr Andrea Manica, from Cambridge’s Department of Zoology.

“We can now answer that as we’ve found that their genetic make-up is a mix of Eastern European hunter-gatherers and a population from this pocket of Caucasus hunter-gatherers who weathered much of the last Ice Age in apparent isolation. This Caucasus pocket is the fourth major strand of ancient European ancestry, one that we were unaware of until now,” he said.

Professor Daniel Bradley, leader of the Trinity team, said: “This is a major new piece in the human ancestry jigsaw, the influence of which is now present within almost all populations from the European continent and many beyond.”

Previously, ancient Eurasian genomes had revealed three ancestral populations that contributed to contemporary Europeans in varying degrees, says Manica.

Following the ‘out of Africa’ expansion, some hunter-gatherer populations migrated north-west, eventually colonising much of Europe from Spain to Hungary, while other populations settled around the eastern Mediterranean and Levant, where they would develop agriculture around 10,000 years ago. These early farmers then expanded into and colonised Europe.

Lastly, at the start of the Bronze Age around 5,000 years ago, there was a wave of migration from central Eurasia into Western Europe – the Yamnaya.

However, the sequencing of ancient DNA recovered from two separate burials in Western Georgia – one over 13,000 years old, the other almost 10,000 years old – has enabled scientists to reveal that the Yamnaya owed half their ancestry to previously unknown and genetically distinct hunter-gatherer sources: the fourth strand.

By reading the DNA, the researchers were able to show that the lineage of this fourth Caucasus hunter-gatherer strand diverged from the western hunter-gatherers just after the expansion of anatomically modern humans into Europe from Africa.

The Caucasus hunter-gatherer genome showed a continued mixture with the ancestors of the early farmers in the Levant area, which Manica says makes sense given the relative proximity. This ends, however, around 25,000 years ago – just before the time of the last glacial maximum, or peak Ice Age.

At this point, Caucasus hunter-gatherer populations shrink as the genes homogenise, a sign of breeding between those with increasingly similar DNA. This doesn’t change for thousands of years as these populations remain in apparent isolation in the shelter of the mountains – possibly cut off from other major ancestral populations for as long as 15,000 years – until migrations began again as the Glacial Maximum recedes, and the Yamnaya culture ultimately emerges. 

“We knew that the Yamnaya had this big genetic component that we couldn’t place, and we can now see it was this ancient lineage hiding in the Caucasus during the last Ice Age,” said Manica.

While the Caucasus hunter-gatherer ancestry would eventually be carried west by the Yamnaya, the researchers found it also had a significant influence further east. A similar population must have migrated into South Asia at some point, says Eppie Jones, a PhD student from Trinity College who is the first author of the paper.

“India is a complete mix of Asian and European genetic components. The Caucasus hunter-gatherer ancestry is the best match we’ve found for the European genetic component found right across modern Indian populations,” Jones said. Researchers say this strand of ancestry may have flowed into the region with the bringers of Indo-Aryan languages.

The widespread nature of the Caucasus hunter-gatherer ancestry following its long isolation makes sense geographically, says Professor Ron Pinhasi, a lead senior author from University College Dublin. “The Caucasus region sits almost at a crossroads of the Eurasian landmass, with arguably the most sensible migration routes both west and east in the vicinity.”

He added: “The sequencing of genomes from this key region will have a major impact on the fields of palaeogenomics and human evolution in Eurasia, as it bridges a major geographic gap in our knowledge.”

David Lordkipanidze, Director of the Georgian National Museum and co-author of the paper, said: “This is the first sequence from Georgia – I am sure soon we will get more palaeogenetic information from our rich collections of fossils.”

Inset image: the view from the Satsurblia cave in Western Georgia, where a human right temporal bone dating from over 13,000 years ago was discovered. DNA extracted from this bone was used in the new research.

Reference:
E.R. Jones et al. ‘Upper Palaeolithic genomes reveal deep roots of modern Eurasians.’ Nature Communications (2015). DOI: 10.1038/ncomms9912




Climate Change Sentiment Could Hit Global Investment Portfolios in the Short Term

Climate change sentiment could hit global investment portfolios in the short term

source: www.cam.ac.uk

A new report by the University of Cambridge Institute for Sustainability Leadership (CISL) reveals that global investment portfolios could lose up to 45 per cent as a consequence of short-term shifts in climate change sentiment.

No investor is immune from the risks posed by climate change, even in the short run

Jake Reynolds

The report, “Unhedgeable Risk: How climate change sentiment impacts investment,” concluded that about half of this potential loss could be avoided through portfolio reallocation, while the other half is “unhedgeable”, meaning that investors cannot necessarily protect themselves from losses unless action on climate change is taken at a system level.

“This new research indicates that no investor is immune from the risks posed by climate change, even in the short run,” said Jake Reynolds, Director, Sustainable Economy at the Cambridge Institute for Sustainability Leadership. “However, it is surprisingly difficult to distinguish between risks that can be addressed by an individual investor through smart hedging strategies, and ones that are systemic and require much deeper transformations in the economy to deal with. That’s what this report attempts to do.”

While existing studies have analysed the direct, physical effects of climate change on long-term economic performance, this new report, commissioned by CISL and the Investment Leaders Group, looks at the short-term risks stemming from how investors react to climate-related information, from policy decisions and technology uptake, to market confidence and weather events.

Reynolds continued, “What’s new about this study is its focus on the potential short-term impacts which could surface at any time. Major events, such as the outcome of the upcoming United Nations climate talks in Paris in December, can send signals which drive market sentiment – sometimes slowly, sometimes rapidly – and this study allows us to model the implications.”

The study modelled the impact of three sentiment scenarios on four typical investment portfolios.

The scenarios tested were:

1. Two Degrees, limiting average temperature increase to two degrees Celsius (as recommended by the Intergovernmental Panel on Climate Change [IPCC]) and collectively making relatively good progress towards sustainability, and future socio-economic development goals.

2. Baseline, where past trends continue (i.e. the business-as-usual BAU scenario) and where there is no significant change in the willingness of governments to step up actions on climate change.

3. No Mitigation, oriented towards economic growth without any special consideration of environmental challenges, rather the hope that pursuing self-interest will allow adaptive responses to any climate change impacts as they arise.

The portfolio structures modelled were:

1. High Fixed Income, comprising 84 per cent fixed income, 12 per cent equity, four per cent cash; mimicking the strategies of insurance companies.

2. Conservative, comprising 60 per cent sovereign and corporate bonds, 40 per cent equity; mimicking certain pension funds.

3. Balanced, comprising 50 per cent equity, 47 per cent fixed income, three per cent commodities; mimicking certain pension funds.

4. Aggressive, comprising 60 per cent equity, 35 per cent fixed income, five per cent commodities; mimicking certain pension funds.
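To make the mechanics concrete, the portfolio weights above can be combined with per-asset-class return shocks in a simple weighted sum. This is only a sketch: the report derives its losses from a full macroeconomic model cascaded down through industry sectors, and the shock figures below are invented for illustration.

```python
# The four portfolio structures from the report, as asset-class weights.
PORTFOLIOS = {
    "High Fixed Income": {"fixed_income": 0.84, "equity": 0.12, "cash": 0.04},
    "Conservative":      {"fixed_income": 0.60, "equity": 0.40},
    "Balanced":          {"equity": 0.50, "fixed_income": 0.47, "commodities": 0.03},
    "Aggressive":        {"equity": 0.60, "fixed_income": 0.35, "commodities": 0.05},
}

def portfolio_impact(weights, shocks):
    """Weighted sum of per-asset-class return shocks (decimal returns)."""
    return sum(w * shocks.get(asset, 0.0) for asset, w in weights.items())

# Invented sentiment shock: equities fall 30%, bonds 5%, commodities 10%.
shock = {"equity": -0.30, "fixed_income": -0.05, "commodities": -0.10}
for name, weights in PORTFOLIOS.items():
    print(f"{name}: {portfolio_impact(weights, shock):+.1%}")
```

Even with these made-up numbers, the equity-heavy Aggressive portfolio takes the largest immediate hit, which matches the short-term pattern the report describes under the Two Degrees scenario.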

Each scenario was linked to a series of economic and market confidence factors used to explore macroeconomic effects within a global economic model. In turn these were cascaded down to portfolio level through an industry sector analysis. The factors included alternative levels of carbon taxation, fossil energy investment, green investment, energy and food prices, energy demand, market confidence, and housing prices.

The study found that shifts in climate change sentiment could cause global economic growth to reduce over a 5-10 year period in both the Two Degree and No Mitigation scenarios as a consequence of economic adjustment. In the longer-term, however, the study found that economic growth picks up most quickly along a Two Degrees (low carbon) pathway, with annual growth rates of 3.5 per cent not only exceeding the baseline (2.9 per cent), but significantly exceeding the No Mitigation scenario (2.0 per cent).

This is consistent with recent comments by the Governor of the Bank of England about the risk of “potentially huge” losses to financial markets due to climate change in the short term, and the “merit” of stress testing elements of the financial system to understand and deal with climate risks.

Urban Angehrn, Chief Investment Officer of Zurich Insurance Group and member of the Investment Leaders Group, echoed this view: “As an insurer we understand that the potential human impact and economic cost of extreme weather and climate events are vast. Multiplied by population growth, coastal migration and urbanisation, the threat seems even larger. We see it as our responsibility to help our customers and communities to build resilience against such events. As investors, the tools to help us translate that threat into investment decisions are – at present – limited. This report provides us with a meaningful basis to discuss investment strategies that tackle climate risk. It will help us go beyond the significant commitments that Zurich has already made.”

Under the Two Degrees scenario, the Aggressive portfolio suffers the largest loss in the short term, but it recovers relatively quickly and generates returns above and beyond the baseline projection levels by the end of the modelling period. In contrast, under a No Mitigation scenario, a Conservative portfolio with a 40 per cent weighting to equities (typical of a pension fund) could suffer permanent losses of more than 25 per cent within five years after the shock is experienced.

“Far from being a lost cause, investors can ‘climate proof’ their investments to a significant extent by understanding how climate change sentiment could filter through to returns,” said Scott Kelly, research associate at the Centre for Risk Studies, University of Cambridge Judge Business School, and one of the authors of the report. “However, almost half the risk is ‘unhedgeable’ in the sense that it cannot be addressed by individual investors. System-wide action is necessary to deal with this in the long-term interests of savers.”

The report offers a series of insights for investors, regulators and policy makers including:

  • Seeing climate change as a short-term financial risk as well as a long-term economic threat.
  • Recognising the value of “stress testing” investment portfolios for a wide range of sustainability risks (not only climate risks) to understand their financial impacts, and how to manage them.
  • Pinpointing areas of “unhedgeable” risk where system-wide action is required to address risks that cannot be escaped by individual investors.
  • The importance of using capital flows to improve the resilience and carbon profile of the asset base, especially in emerging markets.
  • Identifying significant gaps in knowledge where new research is required, including of an interdisciplinary nature.

Originally published on the CISL website.




Ketchup and Traffic Jams: the Maths of Soft Matter

Ketchup and traffic jams: the maths of soft matter

source: www.cam.ac.uk

The class of materials known as soft matter – which includes everything from mayonnaise to molten plastic – is the subject of the inaugural lecture by Michael Cates, Cambridge’s Lucasian Professor of Mathematics.

Having now understood what’s going on in these active systems, we hope to design better versions that can be used to create a wide range of new materials

Michael Cates

Good things come to those who wait – according to a marketing slogan for Heinz ketchup from the 1980s. But why is the ketchup so difficult to get out of the bottle? The reason is that ketchup is in two minds: whether to pour like a liquid or stay put like a solid. It is one example of soft matter – a huge class of materials which behave in complex and nonlinear ways.

We interact with soft matter every day: toothpaste, chocolate, shampoo and mayonnaise are all examples, which can behave either as liquids or solids depending on the circumstances. Soft matter can also be found in laptop screens, advanced batteries, and in the processing of functional ceramics and plastic LEDs. Cambridge researchers have developed new mathematical models to describe why these materials behave the way they do, which could help improve them for both domestic and high-tech applications.

Soft matter is the focus of the inaugural lecture by Professor Michael Cates, who was elected as the University of Cambridge’s 19th Lucasian Professor of Mathematics earlier this year. His lecture, which will be held on Wednesday 4 November, will cover how mathematical models can explain how soft materials can suddenly convert from liquid-like to solid-like behaviour, through a process resembling an internal traffic jam.

Cates’ research aims to understand better why these materials behave as they do, allowing improved control for a range of future applications, including the design of entirely new materials with tailored properties.

In his lecture, Cates will discuss the ‘jamming’ behaviour of colloids and dense suspensions. Both are types of soft matter with an internal structure something like tiny ping-pong balls dispersed in a liquid. Recently, researchers have created ‘active’ colloids in which the ping-pong balls are self-propelled, like tiny rockets. When their propulsion is switched on, these particles form tight clusters, despite the fact that there are no attractive forces between them.

“The question in this case is what causes the clustering? More generally, how does the internal structure of various types of soft matter affect the way they behave?” said Cates. After considering other explanations – including the idea that the clusters arise by a process like the flocking of birds – Cates concluded that each cluster is effectively a sort of traffic jam.

As every driver knows, a smooth distribution of moving cars becomes unstable at high density, leading to the formation of traffic jams. These can be triggered by even a single driver lightly tapping the brakes, and the new mathematical model explains the spontaneous ‘clumping’ of active colloids in very similar terms.
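The density instability described here has a classic minimal model: the Rule-184 cellular automaton, in which a car on a ring of cells advances only if the cell ahead is empty. Below half occupancy every car eventually moves freely; above it, jams persist indefinitely. The sketch below illustrates that jamming idea only; it is not the model used in Cates' research:

```python
def step(road):
    """One Rule-184 update on a ring: a car advances iff the cell ahead is empty."""
    n = len(road)
    return [
        1 if (road[i] and road[(i + 1) % n]) or (not road[i] and road[(i - 1) % n]) else 0
        for i in range(n)
    ]

def moving_fraction(road):
    """Fraction of cars free to move on the next step (1.0 means free flow)."""
    n = len(road)
    movers = sum(1 for i in range(n) if road[i] and not road[(i + 1) % n])
    return movers / sum(road)

def relax(density, n=100, steps=200):
    """Start from one solid block of cars and let the system relax."""
    cars = round(n * density)
    road = [1] * cars + [0] * (n - cars)
    for _ in range(steps):
        road = step(road)
    return moving_fraction(road)

print(relax(0.3))  # low density: the initial jam dissolves into free flow (1.0)
print(relax(0.7))  # high density: a jam persists, so the fraction stays below 1
```

The analogy to active colloids is loose but instructive: in both cases, blocking alone, with no attractive force, is enough to make clusters form and persist once the density is high enough.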

“Having now understood what’s going on in these active systems, we hope to design better versions that can be used to create a wide range of new materials,” Cates said.

Cates and his colleagues have also looked at very dense suspensions, such as paints, molten chocolate or wet sand. Previous mathematical models have assumed that the particles in a dense suspension are hard and smooth, like ball bearings.

“The approximation of hard, smooth particles – though it has served us well for 25 years – does not predict the observed behaviour in these cases,” said Cates. “So we needed to figure out what physics was missing. And we’ve found the answer: a better description of friction between the particles.”

When a dense suspension flows in response to stress, the particles have to push past each other. So long as the stress is low, they easily slide past, with little friction between them. But when stress is increased, friction between the particles also increases. This smooth change in friction can trigger another jamming transition: the suspension suddenly gets much thicker when pushed too hard.

“In many dense suspensions, the aim is to maximise the amount of solids they contain without losing the ability to flow,” said Cates. “In paints, for example, this reduces both drying time and solvent vapour emissions. Now that we know how much friction matters, we can think of new ways to improve flow by reducing friction, so that we can pack more particles in. Allowing the particles to glide past each other by reducing friction is like solving the age-old problem of getting the ketchup out of the glass bottle.”

The Lucasian Professorship has an exceptionally long and distinguished history, established in 1663. Previous holders include Isaac Newton (1669-1702), and, more recently, Paul Dirac (1932-1969), James Lighthill (1969-1979), Stephen Hawking (1979-2009) and Michael Green (2009-2013).

Professor Cates’ lecture will take place at 5pm on Wednesday 4 November at the Department of Applied Mathematics and Theoretical Physics.



Bringing Ukraine to the Screen

Bringing Ukraine to the screen

Source: www.cam.ac.uk

Over the past eight years, the University of Cambridge has become Britain’s pre-eminent showcase for documentary and feature films from and about Ukraine.

Documentary cinema fosters an open dialogue about human rights and social justice in Ukraine and around the world.

Rory Finnin

Today and tomorrow (November 6/7), the Annual Cambridge Festival of Ukrainian Film once again offers UK audiences a unique opportunity to experience some of the best of Ukrainian cinema. Free and open to the public, the event is organised by Cambridge Ukrainian Studies, an academic centre in the Department of Slavonic Studies at Cambridge.

Since 2008 the Festival has premiered prize-winning new releases as well as provocative forgotten masterpieces; invigorated silent classics with live piano accompaniments; made world headlines with a documentary about Stalin’s man-made famine of 1932-33; and hosted contemporary Ukrainian filmmakers, film scholars, preservationists and musicians who have educated and engaged with well over a thousand attendees.

This year Cambridge Ukrainian Studies is partnering with the Docudays UA International Documentary Human Rights Film Festival to bring six powerful new documentaries to local audiences. DocuDays UA was launched in Kyiv in 2003 as a non-profit organisation dedicated to the development of documentary cinema and to the flourishing of democratic civil society in Ukraine.

Many of the films in the Festival programme confront the tumult of revolution and war in today’s Ukraine with an uncommon honesty, sensitivity and maturity. They avail the viewer of the perspectives of the volunteer doctor, the wounded veteran, the soldier preparing to leave home for war. Other films in the programme meditate upon the passing of generations in a Ukraine very far from today’s headlines: the village and countryside.

“We are very proud and very honoured to collaborate with DocuDays UA in this year’s Cambridge Festival of Ukrainian Film”, said Dr Rory Finnin, Head of the Department of Slavonic Studies and Director of the Cambridge Ukrainian Studies programme. “We share their passion for documentary cinema and their belief in its ability to foster an open dialogue about human rights and social justice in Ukraine and around the world.”

“For the Cambridge Festival of Ukrainian Film we have chosen both full-length and short documentaries produced during the last two years,” explained Darya Bassel, Docudays Programme Coordinator. “With these screenings we hope to bring Ukraine and its documentary scene closer to international audiences and to create space for a discussion of problems relevant not only for Ukraine but for the whole world.”

Admission to the Eighth Annual Cambridge Festival of Ukrainian Film on 6-7 November 2015 is free and open to the public, but online registration is required. The screenings of Maidan Is Everywhere; The Medic Leaves Last; Living Fire; Post Maidan; This Place We Call Home; and Twilight take place in the Winstanley Theatre of Trinity College, Cambridge. Wine receptions follow both the November 6 and 7 screenings.


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

– See more at: http://www.cam.ac.uk/research/news/bringing-ukraine-to-the-screen

Graphene Means Business – Two-Dimensional Material Moves From the Lab to the UK Factory Floor


source: www.cam.ac.uk

A major showcase of companies developing new technologies from graphene and other two-dimensional materials took place this week at the Cambridge Graphene Centre.

Cambridge is very well-placed in the network of UK, European and global initiatives targeting the development of new products and devices based on graphene and related materials

Andrea Ferrari

More than 40 companies, mostly from the UK, are in Cambridge this week to demonstrate some of the new products being developed from graphene and other two-dimensional materials.

Graphene is a two-dimensional material made up of sheets of carbon atoms. With its combination of exceptional electrical, mechanical and thermal properties, graphene has the potential to revolutionise industries ranging from healthcare to electronics.

On Thursday, the Cambridge Graphene Technology Day – an exhibition of graphene-based technologies organised by the Cambridge Graphene Centre, together with its partner companies – took place, showcasing new products based on graphene and related two-dimensional materials.

Products and prototypes on display included flexible displays, printed electronics and graphene-based heaters, all of which have potential consumer applications. Others included concrete and road surfacing incorporating graphene, which could mean lighter, stronger infrastructure and roads that need resurfacing far less often, greatly lowering costs for local governments.

“At the Cambridge Graphene Technology Day we saw several real examples of graphene making its way from the lab to the factory floor – creating jobs and growth for Cambridge and the UK,” said Professor Andrea Ferrari, Director of the Cambridge Graphene Centre and of the EPSRC Centre for Doctoral Training in Graphene Technology. “Cambridge is very well-placed in the network of UK, European and global initiatives targeting the development of new products and devices based on graphene and related materials.”

Cambridge has a long history of research into carbon-based materials and their applications, stretching from the identification of the graphite structure in 1924 through diamond, diamond-like carbon, conducting polymers and carbon nanotubes, with a proven track record of taking carbon research from the lab to the factory floor.

Cambridge is also one of the leading centres in graphene technology. Dr Krzysztof Koziol from the Department of Materials Science & Metallurgy sits on the management board of the EPSRC Centre for Doctoral Training in Graphene Technology. He is developing hybrid electrical wires made from copper and graphene in order to improve the amount of electric current they can carry, functional graphene heaters, anti-corrosion coatings, and graphene inks which can be used to draw printed circuit boards directly onto paper and other surfaces.

Koziol has established a spin-out company, Cambridge Nanosystems, which produces graphene in large volumes for industrial applications. The company, co-founded by recent Cambridge graduate Catharina Paulkner, has recently established a partnership with a major auto manufacturer to start developing graphene-based applications for cars.

Other researchers affiliated with the Cambridge Graphene Centre include Professor Clare Grey of the Department of Chemistry, who is part of the Cambridge Graphene Centre Management Board. She is incorporating graphene and related materials into next-generation batteries and has recently demonstrated a breakthrough in lithium-air batteries by exploiting graphene. Professor Mete Atature from the Department of Physics is one of the supervisors of the Centre for Doctoral Training in Graphene Technology. He uses two-dimensional materials for research in quantum optics, including the possibility of a quantum-mechanical computer network, which would be far more secure and more powerful than its classical counterpart.

“The Cambridge Graphene Centre is a great addition to the Cambridge technology and academic cluster,” said Chuck Milligan, CEO of FlexEnable, which is developing technology for flexible displays and other electronic components. “We are proud to be a partner of the Centre and support its activities. Graphene and other two-dimensional materials are very relevant to flexible electronics for displays and sensors, and we are passionate about taking technology from labs to the factory floor. Our unique manufacturing processes for flexible electronics, together with the exponential growth expected in the flexible display and Internet of Things sensor markets, provide enormous opportunity for this exciting class of materials. It is for this reason that today we placed a semi-automatic, large-area EVG spray coater in the Cambridge Graphene Centre laboratories. This valuable tool, donated to the University, is a good match between the Centre’s research on solution-processable graphene and FlexEnable’s long-term technological vision.”

FlexEnable is supporting efforts to scale the graphene technology for use in tomorrow’s factories. The company has donated a large area deposition machine to the University, which is used for depositing large amounts of graphene onto various substrates.

“The University is at the heart of the largest, most vibrant technology cluster in Europe,” said Professor Sir Leszek Borysiewicz, the University’s Vice-Chancellor. “Our many partnerships with industry support the continued economic success of the region and the UK more broadly, and the Cambridge Graphene Centre is an important part of that – working with industry to bring these promising materials to market.”

Professor David Cardwell, Head of the Cambridge Engineering Department, indicated the planned development in Cambridge of a scale-up centre, where research will be nurtured towards higher technology readiness levels in collaboration with UK industry. “The Cambridge Graphene Centre is a direct and obvious link to this scale-up initiative, which will offer even more exciting opportunities for industry university collaborations,” he said.

Among the many local companies with an interest in graphene technologies are FlexEnable; the R&D arm of global telecommunications firm Nokia; printed electronics pioneer Novalia; Cambridge Nanosystems; Cambridge Graphene; and Aixtron, which specialises in the large-scale production of graphene powders, inks and films for a variety of applications.

Underpinning this commercial R&D effort in Cambridge and the East of England is public and private investment in the Cambridge Graphene Centre via the Graphene Flagship, part funded by the European Union. The flagship is a pan-European consortium, with a fast-growing number of industrial partners and associate members.



– See more at: http://www.cam.ac.uk/research/news/graphene-means-business-two-dimensional-material-moves-from-the-lab-to-the-uk-factory-floor

First Evidence of ‘Ghost Particles’


source: www.cam.ac.uk

A major international collaboration has seen its first neutrinos – so-called ‘ghost particles’ – in the experiment’s newly built detector.

This is an important step towards the much larger Deep Underground Neutrino Experiment (DUNE)

Mark Thomson

An international team of scientists at the MicroBooNE physics experiment in the US, including researchers from the University of Cambridge, has detected its first neutrino candidates – the elusive particles nicknamed ‘ghost particles’. The detection represents a milestone for the project, the product of years of hard work and a 40-foot-long particle detector filled with 170 tons of liquid argon.

Neutrinos are subatomic, almost weightless particles that interact only via gravity and the weak nuclear force. Because they don’t interact with light, they can’t be seen. Neutrinos carry no electric charge and travel through the universe almost entirely unaffected by matter. They are considered a fundamental building block of matter. The 2015 Nobel Prize in Physics was awarded for the discovery of neutrino oscillations, a phenomenon of great importance to the field of elementary particle physics.

“It’s nine years since we proposed, designed, built, assembled and commissioned this experiment,” said Bonnie Fleming, MicroBooNE co-spokesperson and a professor of physics at Yale University. “That kind of investment makes seeing first neutrinos incredible.”

Following a 13-week shutdown for maintenance, Fermilab’s accelerator complex near Chicago delivered a proton beam – used to make the neutrinos – to the laboratory’s experiments on Thursday. After the beam was turned on, scientists analysed the data recorded by MicroBooNE’s particle detector to find evidence of its first neutrino interactions.

Scientists at the University of Cambridge have been working on advanced image reconstruction techniques that contributed to the ability to identify the rare neutrino interactions in the MicroBooNE data.

The MicroBooNE experiment aims to study how neutrinos interact and change within a distance of 500 metres. The detector will help scientists reconstruct the results of neutrino collisions as finely detailed, three-dimensional images. MicroBooNE findings will also be relevant for the forthcoming Deep Underground Neutrino Experiment (DUNE), which will examine neutrino transitions over longer distances.

“Future neutrino experiments will use this technology,” said Sam Zeller, Fermilab physicist and MicroBooNE co-spokesperson. “We’re learning a lot from this detector. It’s important not just for us, but for the whole physics community.”

“This is an important step towards the much larger Deep Underground Neutrino Experiment (DUNE)”, said Professor Mark Thomson of Cambridge’s Cavendish Laboratory, co-spokesperson of the DUNE collaboration and member of MicroBooNE. “It is the first time that fully automated pattern recognition software has been used to identify neutrino interactions from the complex images in a detector such as MicroBooNE and the proposed DUNE detector.”

Adapted from a Fermilab press release.



– See more at: http://www.cam.ac.uk/research/news/first-evidence-of-ghost-particles

Cambridge Chemists Make Breakthrough With “Ultimate” Battery Which Can Power a Car From London to Edinburgh


source: http://www.independent.co.uk/

Scientists have made a breakthrough at Cambridge University by solving issues related to a battery that, in theory, could enable a car to drive from London to Edinburgh on a single charge.

A research paper published in the journal Science details how the team at Cambridge University overcame obstacles in the development of lithium-air batteries. The batteries, touted as the “ultimate battery”, theoretically have the ability to store ten times more energy than lithium-ion batteries.

But until now, unwanted chemical reactions and problems with efficiency associated with lithium-air batteries have plagued efforts by scientists to develop them.

The researchers at Cambridge claim to have solved a number of these issues; if the team’s laboratory experiment can be turned into a commercial product, it would enable a car to drive from London to Edinburgh on a single charge.

A driver demonstrates a miniature electric car in 1985

Professor Clare Grey, one of the paper’s senior authors, said: “What we’ve achieved is a significant advance for this technology and suggests whole new areas for research – we haven’t solved all the problems inherent to this chemistry, but our results do show routes forward towards a practical device.”

But the report’s authors do warn that a practical lithium-air battery still remains at least a decade away – there are several practical challenges that need to be addressed before the batteries become a viable alternative to gasoline.

Prof Grey added: “While there are still plenty of fundamental studies that remain to be done, to iron out some of the mechanistic details, the current results are extremely exciting – we are still very much at the development stage, but we’ve shown that there are solutions to some of the tough problems associated with this technology.”

Breaking the Mould: Untangling the Jelly-Like Properties of Diseased Proteins


source: www.cam.ac.uk

Scientists at the University of Cambridge have identified a new property of essential proteins which, when it malfunctions, can cause the build-up, or ‘aggregation’, of misshapen proteins and lead to serious diseases.

Our approach shows the importance of considering the mechanisms of diseases as not just biological, but also physical processes

Peter St George-Hyslop

A common characteristic of neurodegenerative diseases – such as Alzheimer’s, Parkinson’s and Huntington’s disease – is the build-up of ‘misfolded’ proteins, which cause irreversible damage to the brain. For example, Alzheimer’s disease sees the build-up of beta-amyloid ‘plaques’ and tau ‘tangles’.

In the case of some forms of motor neurone disease (also known as amyotrophic lateral sclerosis, or ALS) and frontotemporal dementia, it is the build-up of ‘assemblies’ of misshapen FUS protein and several other RNA-binding proteins that is associated with disease. However, the assembly of these RNA-binding proteins differs in several ways from the conventional protein aggregates seen in Alzheimer’s and Parkinson’s disease, and as a result the significance of the build-up of these proteins, and how it occurs, has until now been unclear.

FUS is an RNA-binding protein with a number of important functions in regulating RNA transcription (the first step in gene expression) and splicing in the nucleus of cells. FUS also has functions in the cytoplasm, where it is involved in regulating the translation of RNA into proteins. There are several other similar RNA-binding proteins; a common feature of all of them is that, in addition to domains that bind RNA, they have domains where the protein appears to be unfolded or unstructured.

In a study published today in the journal Neuron, scientists at the University of Cambridge examined FUS’s physical properties to demonstrate how the protein’s unfolded domain enables it to undergo reversible ‘phase transitions’. In other words, it can change back and forth from a fully soluble ‘monomer’ form into distinct localised accumulations that resemble liquid droplets and then further condense into jelly-like structures that are known as hydrogels. During these changes, the protein ‘assemblies’ capture and release RNA and other proteins. In essence this process allows cellular machinery for RNA transcription and translation to be condensed in high concentrations within restricted three-dimensional space without requiring a limiting membrane, thereby helping to easily regulate these vital cellular processes.

Using the nematode worm C. elegans as a model of ALS and frontotemporal dementia, the team was then able to also show that this process can become irreversible. Mutated FUS proteins cause the condensation process to go too far, forming thick gels that are unable to return to their soluble state. As a result, these irreversible gel-like assemblies trap other important proteins, preventing them carrying out their usual functions. One consequence is that it affects the synthesis of new proteins in nerve cell axons (the trunk of a nerve cell).

Importantly, the researchers also showed that by disrupting the formation of these irreversible assemblies (for example, by targeting with particular small molecules), it is possible to rescue the impaired motility and prolong the worm’s lifespan.

Like jelly on a plate

The behaviour of FUS can be likened to that of a jelly, explains Professor Peter St George Hyslop from the Cambridge Institute for Medical Research.

When first made, jelly is runny, like a liquid. As it cools in the fridge, it begins to set, initially becoming slightly thicker than water but still runny, as the gelatin molecules form into longer, fibre-like chains known as fibrils. If you dropped a droplet of this nearly-set jelly into water, it would (at least briefly) remain distinct from the surrounding water – a ‘liquid droplet’ within a liquid.

As the jelly cools further in the fridge, the gelatin fibres condense more, and it eventually becomes a firmly set jelly that can be flipped out of the mould onto a plate. This set jelly is a ‘hydrogel’, a loose meshwork of protein (gelatin) fibrils that is dense enough to hold the water inside the spaces between its fibres. The set jelly holds the water in a constrained 3D space – and depending on the recipe, there may be some other ‘cargo’ suspended within the jelly, such as bits of fruit (in the case of FUS this ‘cargo’ might be ribosomes, other proteins, enzymes or RNA, for example).

When the jelly is stored in a cool room, the fruit is retained in the jelly. This means the fruit (or ribosomes, etc) can be moved around the house and eventually put on the dinner table (or in the case of FUS, be transported to parts of a cell with unique protein synthesis requirements).

If the jelly is re-warmed, it melts and releases its fruit, which then float off‎. But if the liquid molten jelly is put back in the fridge and re-cooled, it re-makes a firm hydrogel again, and the fruit is once again trapped. In theory, this cycle of gel-melt-gel-melt can be repeated endlessly.

However, if the jelly is left out, the water will slowly evaporate and the jelly condenses down, changing from a soft, easily-melted jelly to a thick, rubbery one. (In fact, jelly is often sold as a dense cube like this.) In this condensed jelly, the meshwork of protein fibrils is much closer together, and it becomes increasingly difficult to get the jelly to melt (you would have to pour boiling water on it). Because the condensed jelly is not easily meltable in this state, any cargo (fruit, ribosomes, etc.) within it essentially becomes irreversibly trapped.

In the case of FUS and other RNA binding proteins, the ‘healthy’ proteins only very rarely spontaneously over-condense. However, disease-causing mutations make these proteins much more prone to spontaneously ‎condense down into thick fibrous gels, trapping their cargo (in this case the ribosomes, etc), which then become unavailable for use.

So essentially, this new research shows that the ability of some proteins to self-assemble into liquid droplets and (slightly more viscous) jellies/hydrogel is a useful property that allows cells to transiently concentrate cellular machinery into a constrained 3D space in order to perform key tasks, and then disassemble and disperse the machinery when not needed. It is probably faster and less energy-costly than doing the same thing inside intracellular membrane-bound vesicles – but that same property can go too far, leading to disease.

Professor St George Hyslop says: “We’ve shown that a particular group of proteins can regulate vital cellular processes by their distinct ability to transition between different states. But this essential property also makes them vulnerable to forming more fixed structures if mutated, disrupting their normal function and causing disease.

“The same principles are likely to be at play in other more common forms of these diseases due to mutation in other related binding proteins. Understanding what is in these assemblies should provide further targets for disease treatments.

“Our approach shows the importance of considering the mechanisms of diseases as not just biological, but also physical processes. By bringing together people from the biological and physical sciences, we’ve been able to better understand how misshapen proteins build up and cause disease.”

The research was funded in the UK by the Wellcome Trust, the Medical Research Council and the National Institute for Health Research; in Canada by the Canadian Institutes of Health Research; and in the US by the National Institutes of Health.

Reference
Murakami, T et al. ALS/FTD mutation-induced phase transition of FUS liquid droplets and reversible hydrogels into irreversible hydrogels impairs RNP granule function. Neuron; 29 Oct 2015



New Design Points a Path To The ‘Ultimate’ Battery


source: www.cam.ac.uk

Researchers have successfully demonstrated how several of the problems impeding the practical development of the so-called ‘ultimate’ battery could be overcome.

What we’ve achieved is a significant advance for this technology and suggests whole new areas for research

Clare Grey

Scientists have developed a working laboratory demonstrator of a lithium-oxygen battery which has very high energy density, is more than 90% efficient, and, to date, can be recharged more than 2000 times, showing how several of the problems holding back the development of these devices could be solved.

Lithium-oxygen, or lithium-air, batteries have been touted as the ‘ultimate’ battery due to their theoretical energy density, which is ten times that of a lithium-ion battery. Such a high energy density would be comparable to that of gasoline – and would enable an electric car with a battery that is a fifth the cost and a fifth the weight of those currently on the market to drive from London to Edinburgh on a single charge.

However, as is the case with other next-generation batteries, there are several practical challenges that need to be addressed before lithium-air batteries become a viable alternative to gasoline.

Now, researchers from the University of Cambridge have demonstrated how some of these obstacles may be overcome, and developed a lab-based demonstrator of a lithium-oxygen battery which has higher capacity, increased energy efficiency and improved stability over previous attempts.

Their demonstrator relies on a highly porous, ‘fluffy’ carbon electrode made from graphene (comprising one-atom-thick sheets of carbon atoms), and additives that alter the chemical reactions at work in the battery, making it more stable and more efficient. While the results, reported in the journal Science, are promising, the researchers caution that a practical lithium-air battery still remains at least a decade away.

“What we’ve achieved is a significant advance for this technology and suggests whole new areas for research – we haven’t solved all the problems inherent to this chemistry, but our results do show routes forward towards a practical device,” said Professor Clare Grey of Cambridge’s Department of Chemistry, the paper’s senior author.

Many of the technologies we use every day have been getting smaller, faster and cheaper each year – with the notable exception of batteries. Apart from the possibility of a smartphone which lasts for days without needing to be charged, the challenges associated with making a better battery are holding back the widespread adoption of two major clean technologies: electric cars and grid-scale storage for solar power.

“In their simplest form, batteries are made of three components: a positive electrode, a negative electrode and an electrolyte,” said Dr Tao Liu, also from the Department of Chemistry, and the paper’s first author.

In the lithium-ion (Li-ion) batteries we use in our laptops and smartphones, the negative electrode is made of graphite (a form of carbon), the positive electrode is made of a metal oxide, such as lithium cobalt oxide, and the electrolyte is a lithium salt dissolved in an organic solvent. The action of the battery depends on the movement of lithium ions between the electrodes. Li-ion batteries are light, but their capacity deteriorates with age, and their relatively low energy densities mean that they need to be recharged frequently.

Over the past decade, researchers have been developing various alternatives to Li-ion batteries, and lithium-air batteries are considered the ultimate in next-generation energy storage because of their extremely high energy density. However, previous working demonstrators have suffered from low efficiency, poor rate performance and unwanted chemical reactions, and could only be cycled in pure oxygen.

What Liu, Grey and their colleagues have developed uses a very different chemistry than earlier attempts at a non-aqueous lithium-air battery, relying on lithium hydroxide (LiOH) instead of lithium peroxide (Li2O2). With the addition of water and the use of lithium iodide as a ‘mediator’, their battery showed far fewer of the chemical reactions which can cause cells to die, making it far more stable after multiple charge and discharge cycles.

By precisely engineering the structure of the electrode – changing it to a highly porous form of graphene, adding lithium iodide, and changing the chemical makeup of the electrolyte – the researchers were able to reduce the ‘voltage gap’ between charge and discharge to 0.2 volts. A smaller voltage gap means a more efficient battery: previous versions of the lithium-air battery only managed to get the gap down to 0.5-1.0 volts, whereas 0.2 volts is closer to that of a Li-ion battery and equates to an energy efficiency of 93%.
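The link between voltage gap and efficiency is simple arithmetic: per unit of charge, the cell returns its discharge voltage but must be recharged at the discharge voltage plus the gap. A minimal sketch, assuming an illustrative discharge voltage of about 2.7 V (a typical figure for lithium-oxygen chemistry, not a number taken from the paper):

```python
def voltage_efficiency(v_discharge: float, voltage_gap: float) -> float:
    """Round-trip voltage efficiency: energy recovered on discharge
    divided by energy spent on charge, per unit of charge moved."""
    return v_discharge / (v_discharge + voltage_gap)

# Assumed ~2.7 V discharge voltage, purely for illustration:
print(f"0.2 V gap:  {voltage_efficiency(2.7, 0.2):.0%}")   # close to the reported 93%
print(f"0.75 V gap: {voltage_efficiency(2.7, 0.75):.0%}")  # mid-range of earlier 0.5-1.0 V cells
```

This makes clear why shrinking the gap from around 0.75 V to 0.2 V is such a large gain: the wasted fraction of each charge-discharge cycle falls by roughly two-thirds.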

The highly porous graphene electrode also greatly increases the capacity of the demonstrator, although only at certain rates of charge and discharge. Other issues that still have to be addressed include finding a way to protect the metal electrode so that it doesn’t form spindly lithium metal fibres known as dendrites, which can cause batteries to explode if they grow too much and short-circuit the battery.

Additionally, the demonstrator can only be cycled in pure oxygen, while the air around us also contains carbon dioxide, nitrogen and moisture, all of which are generally harmful to the metal electrode.

“There’s still a lot of work to do,” said Liu. “But what we’ve seen here suggests that there are ways to solve these problems – maybe we’ve just got to look at things a little differently.”

“While there are still plenty of fundamental studies that remain to be done, to iron out some of the mechanistic details, the current results are extremely exciting – we are still very much at the development stage, but we’ve shown that there are solutions to some of the tough problems associated with this technology,” said Grey.

The authors acknowledge support from the US Department of Energy, the Engineering and Physical Sciences Research Council (EPSRC), Johnson Matthey and the European Union via Marie Curie Actions and the Graphene Flagship. The technology has been patented and is being commercialised through Cambridge Enterprise, the University’s commercialisation arm.

Reference:
Liu, T et al. ‘Cycling Li-O2 Batteries via LiOH Formation and Decomposition.’ Science (2015). DOI: 10.1126/science.aac7730



– See more at: http://www.cam.ac.uk/research/news/new-design-points-a-path-to-the-ultimate-battery

Entanglement at Heart of ‘Two-For-One’ Fission in Next-Generation Solar Cells


source: www.cam.ac.uk

The mechanism behind a process known as singlet fission, which could drive the development of highly efficient solar cells, has been directly observed by researchers for the first time.

Harnessing the process of singlet fission into new solar cell technologies could allow tremendous increases in energy conversion efficiencies in solar cells

Alex Chin

An international team of scientists have observed how a mysterious quantum phenomenon in organic molecules takes place in real time, which could aid in the development of highly efficient solar cells.

The researchers, led by the University of Cambridge, used ultrafast laser pulses to observe how a single particle of light, or photon, can be converted into two energetically excited particles, known as spin-triplet excitons, through a process called singlet fission. If singlet fission can be controlled, it could enable solar cells to double the amount of electrical current that can be extracted.

In conventional semiconductors such as silicon, when one photon is absorbed it leads to the formation of one free electron that can be harvested as electrical current. However, certain materials undergo singlet fission instead, where the absorption of a photon leads to the formation of two spin-triplet excitons.

Working with researchers from the Netherlands, Germany and Sweden, the Cambridge team confirmed that this ‘two-for-one’ transformation involves an elusive intermediate state in which the two triplet excitons are ‘entangled’, a feature of quantum theory that causes the properties of each exciton to be intrinsically linked to that of its partner.

By shining ultrafast laser pulses – just a few quadrillionths of a second long – on a sample of pentacene, an organic material which undergoes singlet fission, the researchers were able to directly observe this entangled state for the first time, and showed how molecular vibrations both make it detectable and drive its creation through quantum dynamics. The results are reported today (26 October) in the journal Nature Chemistry.

“Harnessing the process of singlet fission into new solar cell technologies could allow tremendous increases in energy conversion efficiencies in solar cells,” said Dr Alex Chin from the University’s Cavendish Laboratory, one of the study’s co-authors. “But before we can do that, we need to understand how exciton fission happens at the microscopic level. This is the basic requirement for controlling this fascinating process.”

The key challenge in observing real-time singlet fission is that the entangled spin-triplet excitons are essentially ‘dark’ to almost all optical probes, meaning they cannot be directly created or destroyed by light. In materials like pentacene, the first stage – which can be seen – is the absorption of light that creates a single, high-energy exciton, known as a spin-singlet exciton. The subsequent fission of the singlet exciton into two less energetic triplet excitons gives the process its name, but the ability to see what is going on vanishes as the process takes place.

To get around this, the team employed a powerful technique known as two-dimensional spectroscopy, which involves hitting the material with a co-ordinated sequence of ultrashort laser pulses and then measuring the light emitted by the excited sample. By varying the time between the pulses in the sequence, it is possible to follow in real time how energy absorbed by previous pulses is transferred and transformed into different states.

Using this approach, the team found that when the pentacene molecules were vibrated by the laser pulses, certain changes in the molecular shapes cause the triplet pair to become briefly light-absorbing, and therefore detectable by later pulses. By carefully filtering out all but these frequencies, a weak but unmistakable signal from the triplet pair state became apparent.

The authors then developed a model which showed that when the molecules are vibrating, they possess new quantum states that simultaneously have the properties of both the light-absorbing singlet exciton and the dark triplet pairs. These quantum ‘superpositions’, which are the basis of Schrödinger’s famous thought experiment in which a cat is – according to quantum theory – in a state of being both alive and dead at the same time, not only make the triplet pairs visible but also allow fission to occur directly from the moment light is absorbed.
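This borrowing of brightness can be illustrated with a toy two-state model (a sketch of the general mechanism, not the authors' actual Hamiltonian, with made-up parameter values): a bright singlet |S1⟩ and a dark triplet pair |TT⟩ coupled by a vibration-dependent term λ. Diagonalising the Hamiltonian shows each eigenstate is a superposition carrying some singlet character, so the nominally dark triplet pair acquires oscillator strength:

```python
import numpy as np

# Toy parameters (eV-scale, chosen for illustration, not fitted to pentacene data)
E_S1, E_TT = 1.83, 1.75   # diabatic energies: bright singlet, dark triplet pair
lam = 0.05                # vibronic coupling switched on by a molecular vibration

# 2x2 Hamiltonian in the {|S1>, |TT>} basis
H = np.array([[E_S1, lam],
              [lam,  E_TT]])
energies, states = np.linalg.eigh(H)  # columns of `states` are eigenvectors

# Only |S1> couples to light; brightness of each eigenstate = |<S1|psi>|^2
for E, psi in zip(energies, states.T):
    print(f"E = {E:.3f} eV: singlet character = {psi[0]**2:.2f}")

# With lam = 0 the lower state would be purely |TT> and invisible; the
# vibration-induced coupling mixes in singlet character, making the triplet
# pair briefly light-absorbing, exactly the window the experiment exploits.
```

Setting `lam` to zero recovers two unmixed states, one fully bright and one fully dark, which is why the signal only appears while the molecules are vibrating.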

“This work shows that optimised fission in real materials requires us to consider more than just the static arrangements and energies of molecules; their motion and quantum dynamics are just as important,” said Dr Akshay Rao, from the University’s Cavendish Laboratory. “It is a crucial step towards opening up new routes to highly efficient solar cells.”

The research was supported by the European LaserLab Consortium, the Royal Society, and the Netherlands Organization for Scientific Research. The work at Cambridge forms part of a broader initiative to harness high-tech knowledge in the physical sciences to tackle global challenges such as climate change and renewable energy. This initiative is backed by the UK Engineering and Physical Sciences Research Council (EPSRC) and the Winton Programme for the Physics of Sustainability.

Reference:
Bakulin, Artem et al. ‘Real-time observation of multiexcitonic states in ultrafast singlet fission using coherent 2D electronic spectroscopy.’ Nature Chemistry (2015). DOI: 10.1038/nchem.2371


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

 

Social Yeast Cells Prefer to Work With Close Relatives to Make Our Beer, Bread & Wine

Social yeast cells prefer to work with close relatives to make our beer, bread & wine

source: www.cam.ac.uk

Baker’s yeast cells living together in communities help feed each other, but leave incomers from the same species to die from starvation, according to new research from the University of Cambridge.

The cell-cell cooperation we uncovered plays a significant role in allowing yeast to help us to produce our food, beer and wine

Kate Campbell

The findings, published today in the open access journal eLife, could lead to new biotechnological production systems based on metabolic cooperation. They could also be used to inhibit cell growth by blocking the exchange of metabolites between cells. This could be a new strategy to combat fungal pathogens or tumour cells.

“The cell-cell cooperation we uncovered plays a significant role in allowing yeast to help us to produce our food, beer and wine,” says first author Kate Campbell.

“It may also be crucial for all eukaryotic life, including animals, plants and fungi.”

Yeast metabolism has been exploited for thousands of years by mankind for brewing and baking. Yeast metabolizes sugar and secretes a wide array of small molecules during its life cycle, from alcohols and carbon dioxide to antioxidants and amino acids. Although much research has shown yeast to be a robust metabolic workhorse, only more recently has it become clear that these single-celled organisms assemble in communities, in which individual cells may play a specialised function.

For the new study funded by the Wellcome Trust and European Research Council, researchers at the University of Cambridge and the Francis Crick Institute found cells to be highly efficient at exchanging some of their essential building blocks (amino acids and nucleobases, such as the A, T, G and C constituents of DNA) in what they call metabolic cooperation. However, they do not do so with every kind of yeast cell: they share nutrients with cells descendant from the same ancestor, but not with other cells from the same species when they originate from another community.

Using a synthetic biology approach, the team led by Dr Markus Ralser at the Department of Biochemistry started with a metabolically competent yeast mother cell, genetically manipulated so that its daughters progressively lose essential metabolic genes. They used it to grow a heterogeneous population of yeast over multiple generations, in which individual cells are deficient for various nutrients.

Campbell then tested whether cells lacking a metabolic gene can survive by sharing nutrients with their family members. When living within their community setting, these cells could continue to grow and survive. This meant that cells were being kept alive by neighbouring cells, which still had their metabolic activity intact, providing them with a much needed nutrient supply. Eventually, the colony established a composition where the majority of cells did help each other out. When cells of the same species but derived from another community were introduced, social interactions did not establish and the foreign cells died from starvation.

When the successful community was compared to other yeast strains, which had no metabolic deficiencies, the researchers found no pronounced differences in how both communities grew and produced biomass. This implies that sharing was so efficient that any disadvantage was cancelled out.

The implications of these results may therefore be substantial for industries in which yeast are used to produce biomolecules of interest. This includes biofuels, vaccines and food supplements. The research might also help to develop therapeutic strategies against pathogenic fungi, such as the yeast Candida albicans, which form cooperative communities to overcome our immune system.

Reference

Kate Campbell, Jakob Vowinckel, Michael Muelleder, Silke Malmsheimer, Nicola Lawrence, Enrica Calvani, Leonor Miller-Fleming, Mohammad T. Alam, Stefan Christen, Markus A. Keller, and Markus Ralser

‘Self-establishing communities enable cooperative metabolite exchange in a eukaryote.’ eLife (2015). DOI: 10.7554/eLife.09943

