Oldest Genetic Evidence of Hepatitis B Virus Found in Ancient DNA From 4,500 Year-Old Skeletons

source: cam.ac.uk

An extinct strain of the human Hepatitis B virus (HBV) has been discovered in Bronze Age human skeletons found in burial sites across Europe and Asia.

This study could have huge implications for how the virus affects humans today.

Eske Willerslev

A pioneering study has identified the oldest evidence of HBV in the ancient remains and proved that viruses can become extinct. The scientific significance of the research has been described as ‘truly remarkable’ and compared to the discovery of the first fossils.

Today the Hepatitis B virus affects millions of people worldwide. In 2015 it was estimated that approximately 257 million people were chronically infected with HBV and 887,000 died due to associated complications such as liver cancer.

The new research, led by a group of academics at the Centre for Pathogen Evolution in the Department of Zoology at the University of Cambridge and the Centre for GeoGenetics at the University of Copenhagen, took genetic samples from skeletons across Europe and Asia from the Bronze Age to the Medieval period, and found 25 HBV-positive skeletons amongst the remains. In 12 of these skeletons, they found enough of the HBV genome to perform detailed analyses – the oldest of which was 4,500 years old.

From this data they were able to extract the genetic sequences of HBV that infected the individuals thousands of years ago.

The findings, published in the journal Nature, present new insights into the origins and evolution of HBV. The genetic makeup of this strain could have implications for improving vaccines for HBV.

Before this study, the oldest human viruses to be discovered were approximately 450 years old, but most are no more than 50 years old. The research now provides the oldest and largest dataset of ancient human viruses available to scientists.

Barbara Mühlemann, joint first author on the research paper and a graduate student at the University of Cambridge, said: “People have tried to unravel the history of HBV for decades – this study transforms our understanding of the virus and proves it affected people as far back as the Bronze Age. We have also shown that it is possible to recover viral sequences from samples of this age which will have much wider scientific implications.”

Although HBV is a global health issue, little is known about its origin and evolution. As with many human viruses, this is largely due to a lack of historical evidence which has been difficult to locate and identify.

Dr Terry Jones, joint first author who is based at the University of Cambridge’s Department of Zoology, explained: “Scientists mostly study modern virus strains and we have mainly been in the dark regarding ancient sequences – until now. It was like trying to study evolution without fossils. If we only studied the animals living today it would give us a very inaccurate picture of their evolution – it is the same with viruses.”

Understanding more about HBV may now be possible. Showing that the virus has been circulating in humans since at least the Bronze Age is a significant scientific advance, as previous estimates of how long the virus has infected humans have ranged from 400 years to 34,000 years.

The study was led by Professor Eske Willerslev, who holds positions both at St John’s College, University of Cambridge, and the University of Copenhagen.

He said: “This data gives us an idea of how this virus behaves, and it provides us with a better idea of what is biologically possible in the future. Analysis of other ancient DNA samples may reveal further discoveries and this pioneering study could have huge implications for how the virus affects humans today.”

The research also shows the existence of ancient HBV genotypes in locations incompatible with their present-day distribution, contradicting previously suggested geographical origins of the virus.

Professor Willerslev initially suspected that it might be possible to find viruses in human remains, based on previous research carried out during his time at the University of Copenhagen. He approached Mühlemann and Jones, who specialise in identifying and studying the evolution of viruses.

The research approach the group used in the study, called ‘shotgun sequencing’, looks at all the genetic material present in a sample, as opposed to ‘genome bio-capture’, which focuses only on the human genome.
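As a rough illustration of the idea – a toy sketch, not the study’s actual pipeline, and with invented sequences – the snippet below shows how reads from a shotgun-sequenced sample can be compared against both a human and a viral reference, so that viral DNA is not discarded the way it would be by a purely human-targeted capture approach:

```python
# Toy illustration only (invented sequences, not the study's pipeline): shotgun
# sequencing keeps every read in a sample, so each read can later be compared
# against any reference of interest, including a viral genome such as HBV.

def kmers(seq: str, k: int = 8) -> set:
    """Return the set of k-mers (substrings of length k) in a DNA sequence."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def best_match(read: str, references: dict, k: int = 8):
    """Assign a read to the reference with which it shares the most k-mers."""
    read_kmers = kmers(read, k)
    scores = {name: len(read_kmers & kmers(ref, k)) for name, ref in references.items()}
    return max(scores, key=scores.get), scores

references = {
    "human_fragment": "ACGTACGTTAGCCGATCGATCGGATCCGATTACAGGCTA",
    "hbv_fragment":   "TTGACCTAGCAGGCATTTGGAGCTACTGTGGAGTTACTC",
}

shotgun_reads = [
    "TAGCCGATCGATCGGATCCGAT",   # overlaps the 'human' fragment
    "GCAGGCATTTGGAGCTACTGTG",   # overlaps the 'HBV' fragment
]

for read in shotgun_reads:
    name, scores = best_match(read, references)
    print(f"{read} -> {name}  shared k-mers: {scores}")
```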

Professor Willerslev said: “This study is just the start. We’re talking about one virus here, but there are a lot of other viruses we could look for.”

Reference:
Barbara Mühlemann et al. ‘Ancient Hepatitis B viruses from the Bronze Age to the Medieval period.’ Nature (2018). DOI: 10.1038/s41586-018-0097-z




Syria Airstrikes Add Another ‘Exception’ To Beleaguered Parliamentary Convention, Say Experts

source: www.cam.ac.uk

A new book launching in Cambridge today explores the parliamentary convention intended to allow MPs a vote on military action. The authors say that the intervention in Syria provides just the latest of several ‘exceptions’ – chipping away further at a convention that may no longer meaningfully exist.

Our analysis reveals repeated exceptions created by successive governments even prior to the recent unilateral strikes in Syria

Veronika Fikfak

The recent intervention in Syria may add airstrikes to the expanding list of exceptions to the convention established to provide democratic oversight of UK military action through a parliamentary vote, say experts in international and constitutional law.

During research for a new book, launched today, the legal academics found that, in addition to broadly-defined ‘emergency’ or ‘secrecy’ exceptions, two specific types of military activity – the deployment of embedded Special Forces and unmanned drone strikes – had already been exempted from the convention.

Now, by unilaterally authorising the recent intervention in Syria, and justifying the action using language that further narrows the convention’s purview, the current government may have created a further exception for airstrikes – a cornerstone of modern warfare.

Drs Veronika Fikfak and Hayley J. Hooper, who conducted the research for their book at Cambridge’s Faculty of Law, say that “if the War Powers Convention continues to exist, we question whether it exists in any meaningful sense”.

They argue that increasing exemptions from the convention, combined with a flourishing “information asymmetry” between government and parliament, creates a real risk of another ‘Iraq moment’ in the near future.

The book Parliament’s Secret War traces the last century of Westminster decision-making during the build-up to hostilities, with a focus on the legal debates following the establishment of the War Powers Convention in the wake of the Iraq war.

Published by Bloomsbury, the book will be launched at Homerton College, Cambridge, this evening (30 April) with a Q and A session with both authors as part of the College’s 250th anniversary series of events.

“The idea that the War Powers Convention gives parliament political control over whether the UK goes to war has now been hollowed out to the point where any claim that elected MPs have a say on military action is essentially a deception of British civil society,” says Fikfak, a Fellow of Homerton College.

“The War Powers Convention initially looked like it might level the playing field between parliament and government. However, our analysis reveals repeated exceptions created by successive governments even prior to the recent unilateral strikes in Syria.”

The convention has its origins in the House of Commons vote sanctioning the Iraq invasion in 2003, although some argue this was a fait accompli given the thousands of troops already in the region.

Nevertheless, a convention requiring parliamentary support for armed conflict was solidified through a series of votes in the years following Iraq – most significantly with 2013’s decisive vote on Syria, when the government was defeated.

Heralded by the media as a milestone in British democracy, the convention sees a “yes or no vote” put to MPs, rather than the government of the day invoking Royal Prerogative: the traditional legal right to declare war in the name of the Crown.

Plans to enshrine the convention in law were shelved in 2016, although Labour leader Jeremy Corbyn has renewed discussions of a possible War Powers Act since the recent Syria airstrikes.

The convention has, however, been a fixture of the Cabinet Manual – an official guide to the UK’s uncodified constitution – since 2011, with successive Defence Ministers recommitting to it both in principle and, to some extent, in practice.

Yet the recent circumvention of this potential check on power is arguably only the latest, as the convention has already been subject to “a myriad of exceptions” controlled by government – explored in depth by the new book.

For example, in 2015 a British member of Da’esh was killed by drones in Syria, despite parliament making it clear on two previous occasions that it did not support use of force in Syrian territory.

Justified by the then government as a ‘new departure’, and couched in language of ‘immediacy’ and ‘direct threat’, this was interpreted “generously” by the Joint Committee on Human Rights as an ‘emergency’ that didn’t breach convention – a precedent for the exception of drone warfare.

Also in 2015, the British military took part in ground raids on Syrian territory with US forces. The government’s response was to state that the convention apparently “does not apply [to those] embedded in the armed forces of other nations”, despite the non-emergency situation.

The researchers argue that undermining of the convention is compounded by “selective disclosure” of vital information to parliament, often under the guise of state secrecy. This was the current government’s primary justification for disregarding the convention with the recent Syrian strikes.

“In the wake of Iraq, the position that ‘Whitehall knows best’ is constitutionally untenable,” says Hooper, now a Fellow at Christ Church, Oxford. “Sources of intelligence should never be revealed, but reports of the Joint Intelligence Committee could be considered by parliamentarians in secure premises.”

The researchers argue that the nature of war has changed, now limited for the most part to drone and air strikes. “To exclude the majority of military interventions from parliamentary scrutiny risks undermining the accountability of government,” says Hooper.

Adds Fikfak: “In addition to the non-application of the convention to Special Forces deployments, the embedding of British forces in foreign countries’ armies, and the use of drones, there is now room for significant doubt as to whether the War Powers Convention applies to air strikes.”



UK and US Join Forces To Understand How Quickly a Massive Antarctic Glacier Could Collapse

source: www.cam.ac.uk

A Cambridge researcher will lead one of eight projects in a new joint UK-US research programme that is one of the most detailed and extensive examinations of a massive Antarctic glacier ever undertaken.

These margins have so far never been studied directly, due to the logistical challenges of working in such a remote region of Antarctica.

Poul Christoffersen

The collapse of the Thwaites Glacier in West Antarctica could significantly affect global sea levels. It already drains an area roughly the size of Britain or the US state of Florida, accounting for around four percent of global sea-level rise – an amount that has doubled since the mid-1990s.

As part of a new £20 million research collaboration, the UK Natural Environment Research Council and the US National Science Foundation will deploy scientists to gather the data needed to understand whether the glacier’s collapse could begin in the next few decades or centuries.

NERC and NSF have jointly funded eight large-scale projects that will bring together leading polar scientists in one of the most inhospitable regions of the planet. The programme, called the International Thwaites Glacier Collaboration (ITGC), is the largest joint project undertaken by the two nations in Antarctica for more than 70 years – since the conclusion of a mapping project on the Antarctic Peninsula in the late 1940s.

In addition to the £20 million ($25 million) of awards to the research teams, mounting a scientific campaign in one of the most remote places in Antarctica could cost as much again in logistical support. The nearest permanently occupied research station to the Thwaites Glacier is more than 1,600km away, so getting the scientists to where they need to be will take a massive joint effort from both nations. While researchers on the ice will rely on aircraft support from UK and US research stations, oceanographers and geophysicists will approach the glacier from the sea in UK and US research icebreakers.

Dr Poul Christoffersen from the University of Cambridge’s Scott Polar Research Institute is co-leading one of the eight projects with Professor Slawek Tulaczyk from the University of California, Santa Cruz. Their project, Thwaites Interdisciplinary Margin Evolution (TIME), also includes researchers from the University of Leeds, Stanford University, the University of Texas and the University of Oklahoma. The team will investigate how the margins of the drainage basin will evolve and influence ice flow over the coming decades.

“These margins have so far never been studied directly, due to the logistical challenges of working in such a remote region of Antarctica,” said Christoffersen. “The margins, which separate the fast-flowing glacier from the surrounding slow-moving ice, are often thought of as being stationary, but they might not be. The hypothesis that drives our science is that they can move and thereby exert powerful control on the future evolution of ice flow in the whole drainage basin.”

“This international collaboration will lead to a step change in our understanding of ice sheet stability,” said Cambridge’s Dr Marion Bougamont, who will use observational data records gathered in the field to improve computer models needed to predict sea level rise. “The glacier’s response will depend on where the margins are and how they evolve.”

Today’s collaboration involves around 100 scientists from world-leading research institutes in both countries alongside researchers from South Korea, Germany, Sweden, New Zealand and Finland, who will contribute to the various projects. These projects aim to deliver answers to some of the big questions for scientists trying to predict global sea-level rise.

Antarctica’s glaciers contribute to sea-level rise when more ice is lost to the ocean than is replaced by snow. To fully understand the causes of changes in ice flow requires research on the ice itself, the nearby ocean, and the Antarctic climate in the region. The programme will deploy the most up-to-date instruments and techniques available, from drills that can make access holes 1,500 meters into the ice with jets of hot water to autonomous submarines like the Autosub Long Range affectionately known around the world as Boaty McBoatface.

“Rising sea levels are a globally important issue which cannot be tackled by one country alone,” said UK Science Minister, Sam Gyimah. “The Thwaites Glacier already contributes to rising sea levels and understanding its likely collapse in the coming century is vitally important. Science, research and innovation are at the heart of our Industrial Strategy and this UK-U.S. research programme will be the biggest field campaign of its type ever mounted by these countries. I’m delighted that our world-leading scientists will help to lead this work.”

Science and logistics for the five-year programme begin in October 2018 and continue to 2021. The funding covers eight research projects and a co-ordination grant to maximise success.

Adapted from a NERC/NSF press release.



Cambridge Receives £10 Million in Funding For New AI Supercomputer

source: www.cam.ac.uk

The UK’s fastest academic supercomputer, based at the University of Cambridge, will be made available to artificial intelligence (AI) technology companies from across the UK, in support of the government’s industrial strategy.

Cambridge’s supercomputer provides researchers with the fast and affordable supercomputing power they need for AI work.

Paul Calleja

The new AI supercomputer is a £10 million partnership between the Engineering and Physical Sciences Research Council (EPSRC), the Science and Technology Facilities Council (STFC) and the University. Capable of solving the largest scientific and industrial challenges at very high speeds, the supercomputer is supported by Cambridge’s Research Computing Service. The aim is to help companies to create real business value from advanced computing infrastructures.

The supercomputer is part of the UK government’s AI Sector Deal, which involves more than 50 leading technology companies and organisations. The deal is worth almost £1 billion, including almost £300 million of private sector investment into AI.

“AI research requires supercomputing capacity capable of processing huge amounts of data at very high speeds,” said Dr Paul Calleja, Director of the University’s Research Computing Service. “Cambridge’s supercomputer provides researchers with the fast and affordable supercomputing power they need for AI work.”

In addition to computing power, Calleja and his team will provide training, guidance and support to help Cambridge researchers, and the wider UK AI industry, make the most of their data.

“AI projects involving Cambridge researchers are already underway,” said Calleja. “In the life sciences we are working on medical imaging analysis and genomics, and in astronomy, AI is being used as part of the Square Kilometre Array project and research to map exoplanets.”

Cambridge is home to the largest technology cluster in Europe. Over the past decade, start-ups based on AI and machine learning, in Cambridge and elsewhere, have seen explosive growth.

“The UK must be at the forefront of emerging technologies, pushing boundaries and harnessing innovation to change people’s lives for the better,” said Secretary of State for Digital, Culture, Media and Sport Matt Hancock. “Artificial Intelligence is at the centre of our plans to make the UK the best place in the world to start and grow a digital business. We have a great track record and are home to some of the world’s biggest names in AI like Deepmind, Swiftkey and Babylon, but there is so much more we can do. By boosting AI skills and data-driven technologies we will make sure that we continue to build a Britain that is shaping the future.”

Building on the commitment made in the government’s modern Industrial Strategy and its AI Grand Challenge, the AI Sector Deal marks the first phase of a major innovation-focused investment drive in AI which aims to help the UK seize the £232 billion opportunity AI offers the UK economy by 2030 (10% of GDP).

The deal will help establish the UK as a research hotspot, with measures to ensure the innovators and tech entrepreneurs of tomorrow are based in the UK, with investment in the high-level post-graduate skills needed to capitalise on technology’s huge potential.

It includes money for training for 8,000 specialist computer science teachers, 1,000 government-funded AI PhDs by 2025 and a commitment to develop a prestigious global Turing Fellowship programme to attract and retain the best research talent in AI to the UK.

“Artificial intelligence provides limitless opportunities to develop new, efficient and accessible products and services which transform the way we live and work,” said Business and Energy Secretary Greg Clark. “Today’s new deal with industry will ensure we have the right investment, infrastructure and highly-skilled workforce to establish the UK as a driving force in the development and commercial use of artificial intelligence technologies.”



How Landscapes and Landforms ‘Remember’ or ‘Forget’ Their Initial Formations

source: www.cam.ac.uk

Laboratory findings point to what affects the development of nature’s shapes.

The answer to the question ‘What’s in a shape?’ hinges on this memory property.

Megan Davies Wykes

Crescent dunes and meandering rivers can ‘forget’ their initial shapes as they are carved and reshaped by wind and water, while other landforms keep a memory of their past shape, suggests new research.

“Asking how these natural sculptures come to be is more than mere curiosity because locked in their shapes are clues to the history of an environment,” said Leif Ristroph from New York University and the senior author of the paper, which is published in the journal Physical Review Fluids. “We found that some shapes keep a ‘memory’ of their starting conditions as they develop while others ‘forget’ the past entirely and take on new forms.” This understanding is important in geological dating and in understanding how landscapes form.

Shape ‘memory’ and its ‘loss’—or the retention of or departure from earlier formations—are key issues in geomorphology, the field of study that tries to explain landforms and the developing face of the Earth and other celestial surfaces. The morphology, or shape of a landscape, is the first and most direct clue into its history and serves as a scientific window for a range of questions—such as inferring flowing water on Mars in the past as well as present-day erosion channels and river islands.

“The answer to the question ‘What’s in a shape?’ hinges on this memory property,” said first author Dr Megan Davies Wykes, a postdoctoral researcher in Cambridge’s Department of Applied Mathematics and Theoretical Physics, who completed the work while she was based at NYU.

To shed light on these phenomena, the researchers replicated nature’s dissolvable minerals—such as limestone—with a ready-made stand-in: pieces of hard candy. Specifically, they sought to understand how the candy dissolved to take different forms when placed in water.

To mimic different environmental conditions, they cast the candy into different initial shapes, which led to different flow conditions as the surface dissolved. Their results showed that when the candy dissolved most strongly from its lower surface, it tended to retain its overall shape—reflecting a near-perfect memory. By contrast, when dissolved from its upper surface, the candy tended to erase or ‘forget’ any given initial shape and form an upward spike structure.

The key difference, the team found, is the type of water flow that ‘licks’ and reshapes the candy. Turbulent flows on the underside tend to dissolve the candy at a uniform rate and thus preserve the shape. The smooth flow on an upper surface, however, carries the dissolved material from one location to the next, which changes the dissolving rate and leads to changes in shape.
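The geometric point can be made with a deliberately simple numerical sketch (invented rates, not the model from the paper): if every point of a surface recedes at the same rate, the profile keeps its shape, whereas a dissolution rate that varies along the surface deforms it.

```python
# Toy sketch (invented numbers, not the paper's model): a profile keeps its shape
# when every point recedes at the same rate, but deforms when the local rate varies
# along the surface, as when a smooth flow carries dissolved material downstream.

import numpy as np

x = np.linspace(0, 1, 6)
profile = np.sin(np.pi * x)                # initial surface height

uniform_rate = 0.1
uniform = profile - uniform_rate           # uniform dissolution: shape preserved

# position-dependent dissolution: downstream points recede at a different rate
varying_rate = uniform_rate * (1 - 0.8 * x)
nonuniform = profile - varying_rate

print("initial:   ", np.round(profile, 2))
print("uniform:   ", np.round(uniform, 2))     # same shape, shifted down
print("nonuniform:", np.round(nonuniform, 2))  # shape has changed
```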

“Candy in water may seem like a far cry from geology, but there are in fact whole landscapes carved from minerals dissolving in water, their shapes revealed later when the water table recedes,” said Ristroph. “Caves, sinkholes, stone pillars and other types of craggy terrain are born this way.”

Reference:
Megan S. Davies Wykes et al. ‘Self-sculpting of a dissolvable body due to gravitational convection.’ Physical Review Fluids (2018). DOI: 10.1103/PhysRevFluids.3.043801

Adapted from an NYU press release

Video: Side-view photograph of candy body (initially a sphere). The upper surface remains smooth while the undersurface becomes pitted and dissolves several times faster.



Addenbrooke’s To Pioneer Liver Transplant Technology

2 May 2018

source: http://www.bbc.co.uk/news/uk-england-cambridgeshire-43968706
Image caption: The perfusion machine mimics a transplant, keeping the liver warm (image copyright: David Nasralla)

A hospital has become the first in the UK to bring state-of-the-art liver technology into routine use.

The “perfusion machine” at Addenbrooke’s Hospital in Cambridge keeps a donor liver “alive” outside the body by pumping it with blood, nutrients and medicine.

Donor organs are normally stored in ice but many become damaged and unusable as a result, those behind the machine say.

The procedure could mean a further 54 liver transplants over two years.

The decision to acquire the machine comes 50 years to the day since the first liver transplant in Europe was performed at the hospital.

Prof Sir Roy Calne, who led the pioneering operation on 2 May 1968, said it proved how far transplant technology had come.

“It is a big step and it will become a bigger step as more experience is accumulated on how to use the machine, and how much the machine can make a bad liver that’s damaged better for transplantation,” he said.

Image caption: Prof Chris Watson (left) and surgeon Andrew Butler with the machine at Addenbrooke’s (image copyright: Sally Wordle/PA)

The technology was invented and developed over 20 years by scientists Prof Constantin Coussios and Prof Peter Friend at Oxford University.

Prof Chris Watson, a transplant consultant at Addenbrooke’s, said perfusion can “mimic the transplant”, supplying the organ with oxygenated blood and nutrients, and tests can check how it will function.

Image caption: One in eight patients waiting for a liver dies without one, doctors say (image copyright: David Nasralla)

It will be particularly useful for testing livers which may have been deemed too risky and unsuitable for transplant, and enable more patients to undergo the procedure, he said.

“Some livers are potentially usable, we just haven’t got the confidence to know it’s going to work first time. And it’s got to work first time or the recipient dies.

“One in eight patients waiting for a liver dies without one. If we can reduce that, it will make a huge difference.”

Cambridge Uni Graphene Spin-Out Paragraf Gets $3.9M

source: https://techcrunch.com/2018/05/01/cambridge-uni-graphene-spin-out-paragraf-gets-3-9m/

Paragraf, a Cambridge University graphene spin-out, has closed a £2.9M (~$3.9M) seed round. The funding is led by the university’s commercialization arm, Cambridge Enterprise, with Parkwalk Advisors, Amadeus Capital Partners, IQ Capital Partners and angel investors also participating.

Graphene refers to the one-atom-thick latticed carbon material that’s been exciting scientists with its potential for more than a decade, although turning a nanomaterial with transformative promise into practical and robust commercial products has not turned out to be a cakewalk.

Paragraf reckons it’s onto something that can help accelerate developments though, having come up with what it says is a novel (and patent-protected) approach to manufacturing graphene for commercial use cases — in larger quantities and at higher quality than the small flakes that have typically been the rule in production so far.

The team claims their technique overcomes a raft of problems which have stymied graphene developments to date, such as poor uniformity, reproducibility, limited size and material contamination.

Their wider focus is on producing atom-layer-thick 2D materials — with graphene as their starting point — for the development of a new generation of electronic devices. And early claims for the nanomaterial included suggestions that it could enable a new generation of flexible, transparent electronics.

“Harnessing the extremely high conductivity, superb strength, very low weight and ultimate flexibility of graphene, Paragraf’s technology is the first ever commercial-scale method validated to reproducibly deliver functionally active graphene with properties targeted to its final device-specific application, with both high quality and high throughput,” is the team’s claim for their approach.

The business has been spun out of the Centre for Gallium Nitride group of Professor Sir Colin Humphreys in the university’s department of materials science. According to Crunchbase Paragraf was founded in 2015.

So far they say they’ve produced layers of graphene with electrical characteristics optimized for producing “very sensitive detectors at commercial scale”, as well as “improved efficiency contact layers for common technologies such as LEDs”.

Among their targets for graphene devices are transistors, where they reckon the nanomaterial could deliver clock speeds several orders of magnitude faster than silicon-based devices; chemical and electrical sensors, where they say it could increase sensitivity by a factor of >1,000; and novel energy generation devices — arguing graphene could tap into “kinetic and chemical green energy sources yet to be exploited by any other technology” (so chalk up another wonder claim).

Of course those are all yet more miraculous sounding claims being made for graphene — the likes of which have been liberally attached to the substance for years. And Paragraf still faces the hard graft of proving out their claims. So it’s a pretty safe bet that multiple years of R&D are still needed before mass market graphene based devices are within the average consumer’s grasp.

Commenting on the funding in a statement, Hermann Hauser, co-founder of Amadeus Capital Partners, said: “Graphene has demonstrated some remarkable achievements in the lab, showing great promise for many future electronic technologies. However, without a pathway to commercial viability, scaling from proof of concept to end user accessible products remains beyond the horizon. Paragraf’s novel approach to two-dimensional materials fabrication brings the possibility of mass market graphene based devices a step closer to reality.”

In another supporting statement, Dr Simon Thomas, CEO and co-founder of Paragraf, added: “There’s no doubt that the electronic, mechanical and optical properties of two-dimensional materials such as graphene have the potential to significantly increase performance in a multitude of state of the art technologies. However, until materials like graphene can be delivered in commercially viable, device compatible, functionally targeted forms, the achievements demonstrated at lab scale will not be transferred to real-world products. At Paragraf we have developed the first production technique that allows true scaling of graphene based devices.”

Thomas also told us Paragraf is looking to deliver its first product by the end of this year.

“We will be looking to a Series A raise,” he told TechCrunch. “This will be required to drive expansion of development activities and deliver products from our IP pipeline as opposed to delivering our first product to market. We expect this additional capital will be required in Q3 2019.”

Asked how far out he reckons graphene is from finding its way into real world electronics, he added: “While considerable challenges still exist, some great steps forward have been made over the past year or so and I expect high-tech applications of graphene in consumer technologies to appear in the general market within the next 2-3 years.

“Paragraf’s first applications will be in the sensor product space, enabling a new generation of highly sensitive electronic detectors, initially helping increase energy efficiencies in fields such as the automotive and transport industries, then through to consumer smart home and handheld device technologies.”

This report was updated with additional comment

Crescendo Biologics Ltd. Raises $70 Million (€57 Million) in Series B Financing

source:https://www.marketwatch.com/amp/story/guid/

Published: Apr 30, 2018 2:00 am ET

Round led by Andera Partners with significant investments by Quan Capital and existing investors, largest disclosed Series B financing in the European Biotech sector in 2018

Crescendo Biologics Ltd (Crescendo), the developer of multi-functional biologics with a focus on novel targeted T-cell engagers, announced today that it has completed a $70 million (€57 million) Series B financing.

The funds will be used to advance the development of its lead programme, CB307, which stimulates local activation of tumour-specific T-cells, into the clinic and further expand its internal pipeline of products.

Crescendo Biologics is developing potent, multi-functional Humabody® therapeutics in oncology. It is pursuing novel Humabody®-based product opportunities through in-house development and strategic partnerships. To date, it has a collaboration with Takeda Pharma worth up to $790m.

The Series B round was led by Andera Partners (formerly Edmond de Rothschild Investment Partners) with Europe’s largest life science fund Biodiscovery V, and joined by Quan Capital with its leading life sciences fund, Quan Venture Fund I, and Crescendo’s existing investors Sofinnova Partners, IP Group, EMBL and Takeda Ventures. This is the largest disclosed Series B biotech financing in Europe in 2018.

Gilles Nobécourt, Partner at Andera Partners and lead investor said: “We have been very impressed with the high quality of the novel biology behind multi-functional Humabodies and Crescendo’s growing development portfolio. Crescendo is a true pioneer in the development of targeted T-cell engagement and we are looking forward to working with the team.”

Graziano Seghezzi, Managing Partner at Sofinnova Partners, added: “We have been supporting Crescendo since its seed round in 2009, and then through the Series A together with IP Group. The substantial size of this round and the participation of new investors in the Series B underline the potential and success of VH-based Humabodies.”

Marietta Wu, Managing Director of Quan Capital, which invested a significant amount in the round, explained: “We have been especially drawn to the Humabody® platform that offers multiple potential advantages over the current antibody (IgG) approaches and could enable the Company to quickly build a substantial portfolio of impactful therapeutics. We look forward to joining our partners to rapidly advance Crescendo’s portfolio into clinical development where we can improve patient lives.”

 

Peter Pack, CEO of Crescendo, said: “We appreciate the strong support – past and present – from our current investors, who have enabled us to grow the Company to this point. In this round, we are also welcoming two new investors, Andera Partners and Quan Capital. We look forward to taking our lead programme, CB307, into the clinic and to further exploiting our technology platform with new products.”

For information on all parties please visit www.crescendobiologics.com/

View source version on businesswire.com: https://www.businesswire.com/news/home/20180429005039/en/

SOURCE: Crescendo Biologics Ltd (Twitter: @HUMABODY)

Crescendo Biologics
Dr Peter Pack, CEO, + 44 (0)1223 497140
info@crescendobiologics.com
or
Instinctif Partners for Crescendo Biologics
Deborah Bell, Dr Christelle Kerouedan, Melanie Toyne-Sewell, + 44 (0)20 7457 2020
crescendo@instinctif.com

Copyright Business Wire 2018


 

Labelling Alcoholic Drinks As Lower In Strength Could Encourage People To Drink More, Study Suggests

source: www.cam.ac.uk

Wines and beers labelled as lower in alcohol strength may increase the total amount of alcoholic drink consumed, according to a study published in the journal Health Psychology. The study was carried out by the Behaviour and Health Research Unit at the University of Cambridge in collaboration with the Centre for Addictive Behaviours Research at London South Bank University.

For lower strength alcohol products to reduce consumption, consumers will need to select them in place of equal volumes of higher strength products

Milica Vasiljevic

Alcohol is the fifth leading cause of disease and premature death both in the UK and globally. Reducing consumption of alcohol is a public health priority in many countries. In the UK, as part of a range of steps to reduce overall alcohol consumption, policymakers are currently interested in allowing industry to label a wider range of alcohol products as lower in alcohol.

Proposed legislative changes include extending the variety of terms that could be used to denote lower alcohol content, and extending the strength limit to include products lower than the current average on the market (12.9% ABV for wine and 4.2% ABV for beer*).

“For lower strength alcohol products to reduce consumption, consumers will need to select them in place of equal volumes of higher strength products,” says Dr Milica Vasiljevic from the University of Cambridge. “But what if the lower strength products enable people to feel they can consume more?”

In this study, 264 weekly wine and beer drinkers – sampled from a representative panel of the general population of England – were randomised to one of three groups to taste-test drinks in a laboratory designed to mimic a bar environment. The drinks varied only in the label displayed. In one group, participants taste-tested drinks labelled ‘Super Low’ and ‘4%ABV’ for wine or ‘1%ABV’ for beer. In another group, the drinks were labelled ‘Low’ and ‘8%ABV’ for wine or ‘3%ABV’ for beer. In the final group, participants taste-tested drinks labelled with no verbal descriptors of strength, but displaying the average strength on the market – wine (‘12.9%ABV’) or beer (‘4.2%ABV’).

The results showed the total amount of drink consumed increased as the label on the drink denoted successively lower alcohol strength. The mean consumption of drinks labelled ‘Super Low’ was 214ml, compared with 177ml for regular (unlabelled) drinks. Individual differences in drinking patterns and socio-demographic indicators did not affect these results.

“Labelling lower strength alcohol may sound like a good idea if it encourages people to switch drinks, but our study suggests it may paradoxically encourage people to drink more,” says Professor Theresa Marteau, senior author and Director of the Behaviour and Health Research Unit.

While this study shows that people may drink more if drinks are labelled as lower in strength, the researchers do not yet know if this effect is sufficient to result in the consumption of more units of alcohol overall from lower strength alcohol drinks. Furthermore, participants in this study were tested in a bar-laboratory setting. To learn more about the impact of lower strength alcohol labelling, research in real-world settings is needed.

The study was funded by the Department of Health.

*ABV denotes alcohol by volume, the standard measure of how much alcohol is contained in a given volume of an alcoholic drink.
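To make the distinction between volume consumed and alcohol consumed concrete, here is a rough back-of-the-envelope sketch (not from the study; pairing the reported mean volumes with these particular wine strengths is an assumption made purely for the arithmetic). One UK unit is 10ml of pure alcohol, so units = volume (ml) × ABV ÷ 1000.

```python
# Rough illustration only (not from the study): UK alcohol units from volume and ABV.
# One UK unit is 10 ml of pure alcohol, so units = volume_ml * abv_percent / 1000.
# The volumes are the mean consumptions reported in the study; pairing them with
# these particular wine strengths is an assumption for the sake of the arithmetic.

def uk_units(volume_ml: float, abv_percent: float) -> float:
    """Return UK alcohol units for a drink of the given volume and strength."""
    return volume_ml * abv_percent / 1000

super_low = uk_units(214, 4.0)    # wine labelled 'Super Low' (4% ABV): ~0.86 units
regular = uk_units(177, 12.9)     # regular-strength wine (12.9% ABV): ~2.28 units

print(f"Super Low wine: {super_low:.2f} units, regular wine: {regular:.2f} units")
```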

Reference
Vasiljevic M, Couturier DL, Frings D, Moss AC, Albery IP, Marteau TM. Impact of lower strength alcohol labeling on consumption: A randomized controlled trial. Health Psychology. DOI: 10.1037/hea0000622


Researcher profile: Dr Milica Vasiljevic

On the face of it, e-cigarettes and low alcohol seem to be a step in the right direction towards reducing the health impacts of smoking and drinking. But are things really so clear cut? This is one of the questions that social psychologist Dr Milica Vasiljevic is asking.

Vasiljevic investigates the impact that environmental cues have on health behaviours, and how this knowledge can be translated into effective interventions to change our behaviour to improve health and reduce inequalities. “The bulk of my work to date has looked at how cues such as labelling and advertising encourage people to eat unhealthily, drink alcohol, and/or smoke tobacco,” she says.

Her work is of particular interest to policymakers and has informed national and international policies. “My recent work on the impact of e-cigarette adverts on perceived harm of tobacco smoking amongst children has been discussed at the US Food & Drug Administration, the German Bundestag, and the UK House of Lords in relation to legislative changes surrounding the marketing of e-cigarettes,” she explains.

Similarly, her work on lower strength alcohol labelling is currently used by the Department of Health to inform legislative changes to national alcohol labelling rules in England, which are due to come into force after 2018.

The Behaviour and Health Research Unit, where she works, is a multidisciplinary policy research unit including psychologists, economists, medics, sociologists, social scientists, and statisticians.

“This diverse mix is very enriching, and on many occasions has spurred creative solutions to research problems that we have been grappling with. But, most importantly, being in such close contact with stellar researchers with diverse training backgrounds is fun and inspirational; and has helped me develop my research skills and communication style.”

Vasiljevic is a keen communicator, as is appropriate for someone whose work has relevance to all of our lives.  “The most interesting days I’ve had so far are the Cambridge Science Festival days and also the days when I have carried out outreach work in schools,” she says. “These events are always lots of fun, and are an excellent opportunity for children and adults from the local communities to get involved in our research, learn more about what we do, and of course help us shape some of our future studies.”



Gaia Creates Richest Star Map Of Our Galaxy – And Beyond

source: www.cam.ac.uk

The European Space Agency’s Gaia mission has produced the richest star catalogue to date, including high-precision measurements of nearly 1.7 billion stars and revealing previously unseen details of our home Galaxy.

There is hardly a branch of astrophysics which will not be revolutionised by Gaia data.

Gerry Gilmore

A multitude of discoveries are on the horizon after today’s much-awaited release, which is based on 22 months of charting the sky, as part of Gaia’s mission to produce the largest, most precise three-dimensional map of our Galaxy ever created. The new data includes positions, distance indicators and motions of more than one billion stars, along with high-precision measurements of asteroids within our Solar System and stars beyond our own Milky Way Galaxy.

Preliminary analysis of this phenomenal data reveals fine details about the makeup of the Milky Way’s stellar population and about how stars move, essential information for investigating the formation and evolution of our home Galaxy.

“The observations collected by Gaia are redefining the foundations of astronomy,” said Günther Hasinger, ESA Director of Science. “Gaia is an ambitious mission that relies on a huge human collaboration to make sense of a large volume of highly complex data. It demonstrates the need for long-term projects to guarantee progress in space science and technology and to implement even more daring scientific missions of the coming decades.”

This unique mission is reliant on the work of Cambridge researchers who collect the vast quantities of data transmitted by Gaia to a data processing centre at the University, overseen by a team at the Institute of Astronomy.

“There is hardly a branch of astrophysics which will not be revolutionised by Gaia data,” said Cambridge’s Professor Gerry Gilmore, Principal Investigator for the UK participation in the Gaia Data Processing and Analysis Consortium, and one of the original proposers of the mission to ESA. “The global community will advance our understanding of what we see, where it came from, what it is made from, how it is changing. All this is made freely available to everyone, based on the dedicated efforts of hundreds of people.”

Gaia was launched in December 2013 and started science operations the following year. The first data release, based on just over one year of observations, was published in 2016; it contained distances and motions of two million stars. The new data release, which covers the period between 25 July 2014 and 23 May 2016, pins down the positions of nearly 1.7 billion stars, and with a much greater precision. For some of the brightest stars in the survey, the level of precision equates to Earth-bound observers being able to spot a Euro coin lying on the surface of the Moon.

With these accurate measurements it is possible to separate the parallax of stars – an apparent shift on the sky caused by Earth’s yearly orbit around the Sun – from their true movements through the Galaxy. The new catalogue lists the parallax and velocity across the sky, or proper motion, for more than 1.3 billion stars. From the most accurate parallax measurements, about ten percent of the total, astronomers can directly estimate distances to individual stars.
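As a minimal sketch of that last step (illustrative value only, not Gaia pipeline code): a parallax of p arcseconds corresponds to a distance of 1/p parsecs, a simple inversion that is only reliable for the most precisely measured parallaxes, such as the roughly ten percent mentioned above.

```python
# Minimal sketch (illustrative value, not Gaia pipeline code): converting a parallax
# into a distance. A parallax of p arcseconds corresponds to a distance of 1/p parsecs;
# Gaia parallaxes are usually quoted in milliarcseconds (mas).

def distance_parsecs(parallax_mas: float) -> float:
    """Convert a parallax in milliarcseconds to a distance in parsecs."""
    return 1.0 / (parallax_mas / 1000.0)

example_parallax_mas = 7.5   # hypothetical, well-measured star
d_pc = distance_parsecs(example_parallax_mas)
print(f"parallax {example_parallax_mas} mas -> {d_pc:.1f} pc "
      f"(about {d_pc * 3.26:.0f} light years)")
```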

The comprehensive dataset provides a wide range of topics for the astronomy community. As well as positions, the data include brightness information of all surveyed stars and colour measurements of nearly all, plus information on how the brightness and colour of half a million variable stars change over time. It also contains the velocities along the line of sight of a subset of seven million stars, the surface temperatures of about a hundred million and the effect of interstellar dust on 87 million.

Gaia also observes objects in our Solar System: the second data release comprises the positions of more than 14,000 known asteroids, which allows precise determination of their orbits. A much larger asteroid sample will be compiled in Gaia’s future releases.

Further afield, Gaia closed in on the positions of half a million distant quasars, bright galaxies powered by the activity of the supermassive black holes at their cores. These sources are used to define a reference frame for the celestial coordinates of all objects in the Gaia catalogue, something that is routinely done in radio waves but now for the first time is also available at optical wavelengths.

Major discoveries are expected to come once scientists start exploring Gaia’s new release. An initial examination performed by the data consortium to validate the quality of the catalogue has already unveiled some promising surprises – including new insights on the evolution of stars.

The team in Cambridge is led by Dr Floor van Leeuwen, Dr Dafydd Wyn Evans, Dr Francesca De Angeli and Dr Nicholas Walton.

“This data release has proven an exciting challenge to process from spacecraft camera images to science-ready catalogues,” said De Angeli, head of the Cambridge processing centre. “More sophisticated strategies and updated models will be applied to the Gaia data to achieve even more precise and accurate photometric and spectrophotometric information, which will enable even more exciting scientific investigations and results.”

“Gaia has so far observed each of its more than 1.7 billion sources on average about 200 times,” said Evans. “This very large data set has to have all the changing satellite and sky responses removed, and everything converted on to a well-defined scale of brightness and colour. While a huge challenge, it is worth it.”

“Groups of dwarf galaxies, including the Magellanic Clouds, can now be observed to be moving around in very similar orbits, hinting at a shared formation history,” said van Leeuwen, Project Manager for the UK and European photometric processing work. “Similarly, a pair of globular clusters has been observed with very similar orbital characteristics and chemical composition, again pointing towards a shared history of formation. The accurate observed motions and positions of the globular clusters and dwarf galaxies provide tracers of the overall mass distribution of our galaxy in a way that has not been possible with this level of accuracy before.”

“The Gaia data will be a globally accessible resource for astronomical research for decades to come, enabling the future research of today’s young astronomers in the UK, Europe and the World,” said Walton, a member of the ESA Gaia Science Team. “Gaia is raising excitement and opportunity, bringing the next generation of researchers together to tackle many key questions in our understanding of the Milky Way.”

More data releases will be issued in future years, with the final Gaia catalogue to be published in the 2020s. This will be the definitive stellar catalogue for the foreseeable future, playing a central role in a wide range of fields in astronomy.

“This vast step into a new window on the Universe is a revolution in our knowledge of the contents, motions and properties of our local Universe,” said Gilmore. “We look forward to the international astronomical community building on this European project, with its major UK contributions, to interpret these Gaia data to revolutionise our understanding of our Universe. This is a magnificent harvest, but cornucopia awaits. We are all proud to be part of this magnificent project.”

Adapted from an ESA press release.



Gender Inequality Is ‘Drowning Out’ the Voices of Women Scientists

source: www.cam.ac.uk

A University of Cambridge researcher is calling for the voices of women to be given a fairer platform at a leading scientific conference.

We need the majority groups to think about representation, otherwise minority voices will continue to be drowned out.

Heather L. Ford

Dr Heather Ford and her colleagues analysed data from the American Geophysical Union (AGU) Fall Meeting and found that, overall, female scientists are offered fewer opportunities than men to present their research.

The team examined the gender, career stage and type of presentation delivered by each participant from 2014 to 2016. They found that female members are at a disadvantage because the majority of them are students or in the early stages of their careers, groups whose members are typically given fewer chances to present their research. The results are reported in the journal Nature Communications.
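As a purely hypothetical sketch of that kind of tabulation (invented column names and data, not the authors’ analysis code), the breakdown of oral versus poster slots by gender and career stage might be computed like this:

```python
# Hypothetical sketch (invented data and column names, not the authors' code):
# given one row per conference presentation, tabulate how often each gender and
# career-stage group is given an oral slot rather than a poster.

import pandas as pd

presentations = pd.DataFrame({
    "gender":       ["F", "M", "F", "M", "F", "M", "F", "M"],
    "career_stage": ["student", "student", "early", "senior",
                     "senior", "senior", "early", "early"],
    "type":         ["poster", "oral", "poster", "oral",
                     "oral", "oral", "poster", "oral"],
})

oral_rate = (
    presentations
    .assign(is_oral=presentations["type"].eq("oral"))
    .groupby(["gender", "career_stage"])["is_oral"]
    .mean()                  # fraction of each group's presentations that are oral
)
print(oral_rate)
```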

Conference speakers are often at more senior stages of their careers, where there are usually fewer women in Science, Technology, Engineering and Maths (STEM) fields. A further problem is that men are more likely to provide speaking opportunities to other men, potentially limiting women’s career prospects.

“The burden of representation often falls on under-represented groups. We need the majority groups to think about representation, otherwise minority voices will continue to be drowned out,” said Ford, who is a NERC Independent Research Fellow in Cambridge’s Department of Earth Sciences.

However, the research showed some positive signs, as women were invited at a much higher rate than men in the early and mid-career stages.

The researchers are calling for more students and early career researchers to have opportunities to speak at future conferences, in a bid to help some of the many female members who are at the beginning of their careers. They also want to see more women selecting the conference speakers, and suggest that all members may benefit from diversity training before they can invite speakers and assign conference presentations.

Attending and presenting at conferences helps academics at every stage of their careers to build their network, meet potential collaborators and share their research. Conferences are important for career progression, and can be key in helping researchers to find funding and receive job offers. Presenting at academic conferences can also help researchers to gain recognition and awards for their work.

Ford says she and her co-author Petra Dekens from San Francisco State University were motivated to look into this topic after sitting in “too many conference sessions” with either no female speakers, or a single female speaker.

The global context is also an important issue for Ford, particularly the ongoing campaign for gender equality. She said: “A lot of women have been motivated to speak out about gender inequality in the past year – people are much more vocal about how they’ve been treated. I wanted to find a productive way to channel my frustrations.”

The AGU Fall Meeting is the world’s largest geoscience conference, with more than 22,000 presentation proposals each year. The AGU has more than 60,000 members in 137 countries, and around a third of its members are women. Geoscience is one of the least diverse STEM fields.

Reference:
Heather L. Ford et al. ‘Gender inequity in speaking opportunities at the American Geophysical Union Fall Meeting.’ Nature Communications (2018). DOI: 10.1038/s41467-018-03809-5



Sense of Control and Meaning Helps Protect Women From Anxiety, Study Suggests

source: www.cam.ac.uk

People who feel in control of their lives and who find purpose and meaning in life are less likely to have anxiety disorders even when going through the toughest times, according to a study led by the University of Cambridge.

This study sheds light on inner strengths or qualities that we may have which can protect us from anxiety when we’re exposed to hardships, such as living in deprivation

Olivia Remes

The study, published today in BMJ Open, found that women who had these traits did not have anxiety, even if they were living in the most deprived circumstances, but women who did not feel that they were in control of their lives and who lacked purpose and meaning in life had high levels of anxiety when facing the hardships of living in deprivation. The study could help researchers develop new ways of teaching women how to overcome anxiety.

Anxiety disorders can manifest as fear, restlessness, an inability to concentrate on work or school tasks, and difficulty in falling asleep at night.  In some cases, anxiety can arise out of the blue as in a panic attack, when sudden spikes of intense anxiety make the sufferer think they are having a heart attack, ‘going mad’, or even dying.  In other cases, it is triggered by specific situations, such as being on a bus or at a social gathering, and symptoms such as sweating, gastrointestinal discomfort, dizziness, and chest pains may ensue.

Anxiety disorders are some of the most common mental health problems and their annual cost in the United States is estimated to be $42.3 billion.  In the European Union, they affect over 60 million people in a given year.

Despite anxiety disorders being common and costly, few studies have looked at what makes some people have anxiety when going through tough times, while others facing the exact same situations are able to maintain good mental health.  National Institute for Health Research (NIHR)-funded researchers from the Cambridge Institute of Public Health used data from over 10,000 British women who had responded to a structured health and lifestyle questionnaire.  The questionnaire included a measure of Sense of Coherence, which is a personality disposition.

Women living in deprivation but who reported the following traits were less likely to have anxiety: believing they were in control of their lives, believing their lives made sense, and having a purpose and meaning in life.  Women living in deprivation but without these desirable traits had high levels of anxiety.  In fact, women in deprived communities without these traits were almost twice as likely to have anxiety as women living in more affluent communities.

“This study sheds light on inner strengths or qualities that we may have which can protect us from anxiety when we’re exposed to hardships, such as living in deprivation,” says first author and PhD candidate Olivia Remes. “Fostering such strengths or traits may be useful for people who do not respond well to medication or other therapies for anxiety, and further research would be needed on this.”

The researchers say that living in deprivation can lead to a sense of meaninglessness among individuals, and can give rise to poor mental health and suicide.  In deprived communities, people are more fearful of their neighbours, assaults are more likely to happen, and it is difficult to form close relationships with others.  The total number of people living in deprivation worldwide is large; as such, the results of this study are particularly important.

“This study takes a different approach to mental health,” continues Professor Carol Brayne, Director of the Cambridge Institute of Public Health.  “Up until now, most studies have looked at what makes someone prone to disease, and the risk factors for ill health.  But we have taken a different approach.  Instead of looking at risk factors for disease, we are looking at traits or strengths that we have within us that can help us maintain good mental health and overcome adversity.

“The study could help researchers develop new ways to approach how women can be helped to overcome anxiety, and also highlights the key role of context in our mental health.”

Dr Louise Lafortune, Senior Research Associate at the institute, explains: “Anxiety disorders are common, debilitating, and impairing.  Now we know that people who feel that they are in control of their lives, who believe that life makes sense, and who have found purpose and meaning are less likely to have anxiety even if they are going through hardships, such as living in deprivation.”

Reference
Remes, O. et al. Sense of coherence as a coping mechanism for women with anxiety living in deprivation: British population study. BMJ Open; Tuesday 24 April; DOI: 10.1136/bmjopen-2017-018501


Researcher Profile: Olivia Remes

Anxiety is one of the most common mental health problems, and if left untreated, can lead to substance abuse, depression, and risk of suicide. Yet little seems to be known about its causes and consequences.

It is to address this gap in our knowledge that Olivia Remes, originally from Canada, is carrying out research for her PhD. She has been looking at who is most affected by anxiety, some of the factors that can give rise to it, and the impact that untreated illness can have on society.

“Anxiety is not only very costly for society in terms of high health service use, work absenteeism and decreased work productivity, but it can cause much suffering to those affected,” she says.

To carry out her research, Olivia uses data from the European Prospective Investigation of Cancer in Norfolk, one of the largest cohort studies looking at chronic diseases, mental health, and the way people live their lives.

Olivia is keenly aware of the importance of sharing her research with other academics, policy-makers and the public, often through the media. This led to significant interest when she published her findings on the burden of anxiety around the world, with radio and TV interviews across the BBC and other media outlets.

“It was truly exhilarating. Knowing that I had done something to increase awareness about anxiety and that I was able to reach people with key messages from my research was very rewarding,” she says.

“As I started receiving personal messages from people suffering from anxiety, I felt that all the hard work I had done to bring this condition to light was truly worthwhile.  It made me persevere in my research and gave me hope that, through my work, I can have a positive impact on people’s lives.”

Olivia hopes her research will help inform prevention and intervention efforts directed to help those suffering from anxiety, but also that it will lead to greater awareness of the condition. “I hope that, as more studies on anxiety come out, more people will start talking about this condition and will seek help if experiencing symptoms without feeling embarrassed or ashamed.”

Studying at Cambridge has given her the opportunity to work with and learn from some of the brightest minds in the field, she says.

“The postgrad community is also very welcoming – the Colleges organize many events for students throughout the year, providing opportunities to meet many wonderful people from all over the world,” she says. “I have made many friendships here that I will treasure for many years to come. Cambridge is an inspiring place steeped in history, and is dedicated to inspiring innovation.  I have enjoyed and continue to enjoy every minute here.”


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Mechanism Behind Neuron Death in Motor Neurone Disease and Frontotemporal Dementia Discovered

Mechanism behind neuron death in motor neurone disease and frontotemporal dementia discovered

source: www.cam.ac.uk

Scientists have identified the molecular mechanism that leads to the death of neurons in amyotrophic lateral sclerosis (also known as ALS or motor neurone disease) and a common form of frontotemporal dementia.

This was a very exciting set of experiments where we were able to apply cutting edge tools from physics, chemistry and neurobiology to understand how the FUS protein normally works in nerve cells, and how it goes wrong in motor neurone disease and dementia

Peter St George-Hyslop

Writing in Cell, the researchers from the University of Cambridge and University of Toronto also identify potential therapeutic targets for these currently incurable diseases.

ALS is a progressive and terminal disease that damages the function of nerves and muscle, affecting up to 5,000 adults in the UK at any one time. Frontotemporal dementia is a form of dementia that causes changes in personality and behaviour, and language difficulties.

A common characteristic of ALS and frontotemporal dementia is the build-up of clumps of misfolded RNA-binding proteins, including a protein called FUS, in the brain and spinal cord.  This leads to the death of neurons, which stops them from communicating with each other and from reaching the muscles.

FUS proteins can change back and forth from small liquid droplets (resembling oil droplets in water) to small gels (like jelly) inside nerve cells. As the FUS protein condenses (from droplets to gel) it captures RNA and transfers it to remote parts of the neuron that are involved in making connections (known as synapses) with other neurons. Here, the protein ‘melts’ and releases the RNA. The RNA is then used to create new proteins in the synapses, which are essential for keeping the synapses working properly, especially during memory formation and learning.

In frontotemporal dementia and ALS, the proteins become permanently stuck as abnormally dense gels, trapping the RNA and making it unavailable for use. This damages nerve cells by blocking their ability to make the proteins needed for synaptic function and leads to the death of neurons in the brain and spinal cord.

In research funded by Wellcome, scientists used human cells that resembled neurons, as well as neurons from frogs, to investigate how the process by which FUS changes from liquid droplets to small gels is regulated and what makes it go awry. They found that this reversible process was tightly controlled by enzymes which chemically alter FUS, making it able or unable to form droplets and gels. In frontotemporal dementia, the abnormal gelling was found to be caused by defects in the chemical modification of FUS. In motor neurone disease, it was caused by mutations in the FUS protein itself, which meant it was no longer able to change form.

This research provides new ideas and tools for finding ways to prevent or reverse the abnormal gelling of FUS as a treatment for these devastating diseases. Potential therapeutic targets identified by the researchers are the enzymes that regulate the chemical modification of FUS and the molecular chaperones that help FUS proteins change form. Any such treatment would need to allow FUS to continue moving between safe, reversible states (liquid droplets and reversible gels) while preventing it from dropping into the dense, irreversible gel states that cause disease.

Professor Peter St George-Hyslop from the Cambridge Institute for Medical Research said: “This was a very exciting set of experiments where we were able to apply cutting edge tools from physics, chemistry and neurobiology to understand how the FUS protein normally works in nerve cells, and how it goes wrong in motor neurone disease and dementia. It now opens up a new avenue of work to use this knowledge to identify ways to prevent the abnormal gelling of FUS in motor neurone disease and dementia.”

Dr Giovanna Lalli, from Wellcome’s Neuroscience and Mental Health team, said: “Motor neurone disease and frontotemporal dementia are devastating diseases that affect thousands of people across the UK, resulting in severe damage to the brain and spinal cord. By bringing together an interdisciplinary team of researchers, this study provides important new insights into a fundamental process underlying neurodegeneration. Through their research, the team have uncovered promising new ways to tackle these diseases.”

Reference
Qamar, S et al. FUS Phase Separation Is Modulated by a Molecular Chaperone and Methylation of Arginine Cation-π Interactions. Cell; 19 Apr 2018; DOI: 10.1016/j.cell.2018.03.056

Adapted from a press release by Wellcome


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Some Superconductors Can Also Carry Currents Of ‘Spin’

Some superconductors can also carry currents of ‘spin’

source: www.cam.ac.uk

Researchers have shown that certain superconductors – materials that carry electrical current with zero resistance at very low temperatures – can also carry currents of ‘spin’. The successful combination of superconductivity and spin could lead to a revolution in high-performance computing, by dramatically reducing energy consumption.

Spin is a particle’s intrinsic angular momentum, and is normally carried in non-superconducting, non-magnetic materials by individual electrons. Spin can be ‘up’ or ‘down’, and for any given material there is a maximum length over which spin can be carried. In a conventional superconductor, electrons with opposite spins are paired together, so a flow of electrons carries zero spin.

A few years ago, researchers from the University of Cambridge showed that it was possible to create electron pairs in which the spins are aligned: up-up or down-down. The spin current can be carried by up-up and down-down pairs moving in opposite directions with a net charge current of zero. The ability to create such a pure spin supercurrent is an important step towards the team’s vision of creating a superconducting computing technology which could use massively less energy than the present silicon-based electronics.
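
To make the bookkeeping behind a ‘pure spin supercurrent’ concrete, the toy calculation below (an illustration only, not the researchers’ model) tallies charge and spin for spin-aligned pairs moving in opposite directions: the charge flows cancel while the spin flows add.

```python
# Toy bookkeeping for a pure spin supercurrent (illustrative only, not the
# researchers' model). Each spin-aligned pair carries charge 2 (in units of e)
# and spin +1 (up-up) or -1 (down-down), in units of hbar.
# Direction of travel is +1 (right) or -1 (left).

def net_currents(pairs):
    """pairs: list of (spin, direction) tuples, one per pair crossing per unit time."""
    charge_current = sum(2 * direction for spin, direction in pairs)   # units of e
    spin_current = sum(spin * direction for spin, direction in pairs)  # units of hbar
    return charge_current, spin_current

# Equal numbers of up-up pairs moving right and down-down pairs moving left:
pairs = [(+1, +1)] * 5 + [(-1, -1)] * 5
print(net_currents(pairs))  # -> (0, 10): zero net charge current, finite spin current
```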

Now, the same researchers have found a set of materials which encourage the pairing of spin-aligned electrons, so that a spin current flows more effectively in the superconducting state than in the non-superconducting (normal) state. Their results are reported in the journal Nature Materials.

“Although some aspects of normal state spin electronics, or spintronics, are more efficient than standard semiconductor electronics, their large-scale application has been prevented because the large charge currents required to generate spin currents waste too much energy,” said Professor Mark Blamire of Cambridge’s Department of Materials Science and Metallurgy, who led the research. “A fully-superconducting method of generating and controlling spin currents offers a way to improve on this.”

In the current work, Blamire and his collaborators used a multi-layered stack of metal films in which each layer was only a few nanometres thick. They observed that when a microwave field was applied to the films, it caused the central magnetic layer to emit a spin current into the superconductor next to it.

“If we used only a superconductor, the spin current is blocked once the system is cooled below the temperature when it becomes a superconductor,” said Blamire. “The surprising result was that when we added a platinum layer to the superconductor, the spin current in the superconducting state was greater than in the normal state.”

Although the researchers have shown that certain superconductors can carry spin currents, so far these only occur over short distances. The next step for the research team is to understand how to increase the distance and how to control the spin currents.

The research was funded by the Engineering and Physical Sciences Research Council (EPSRC).

Reference:
Kun-Rok Jeon et al. ‘Enhanced spin pumping into superconductors provides evidence for superconducting pure spin currents.’ Nature Materials (2018). DOI: 10.1038/s41563-018-0058-9


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Drinking More Than Five Pints a Week Could Shorten Your Life, Study Finds

Drinking more than five pints a week could shorten your life, study finds

source: www.cam.ac.uk

Regularly drinking more than the recommended UK guidelines for alcohol could take years off your life, according to new research from the University of Cambridge. Part-funded by the British Heart Foundation, the study shows that drinking more alcohol is associated with a higher risk of stroke, fatal aneurysm, heart failure and death.

If you already drink alcohol, drinking less may help you live longer and lower your risk of several cardiovascular conditions

Angela Wood

The authors say their findings challenge the widely held belief that moderate drinking is beneficial to cardiovascular health, and support the UK’s recently lowered guidelines.

The study compared the health and drinking habits of over 600,000 people in 19 countries worldwide and controlled for age, smoking, history of diabetes, level of education and occupation.

The upper safe limit of drinking was about five drinks per week (100g of pure alcohol, 12.5 units, or just over five pints of 4% ABV beer or five 175ml glasses of 13% ABV wine). Drinking above this limit was linked with lower life expectancy. For example, having 10 or more drinks per week was linked with one to two years’ shorter life expectancy, while having 18 or more drinks per week was linked with four to five years’ shorter life expectancy.

The research, published today in the Lancet, supports the UK’s recently lowered guidelines, which since 2016 recommend both men and women should drink no more than 14 units of alcohol each week. This equates to around six pints of beer or six glasses of wine a week.
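
As a rough guide to where these drink equivalences come from (one UK unit is 10ml, or 8g, of pure alcohol), the short sketch below converts a drink’s volume and strength into units; the numbers are approximate conversions for illustration, not values taken from the study.

```python
# Rough UK alcohol-unit arithmetic (illustrative; 1 unit = 10 ml of pure alcohol).
def uk_units(volume_ml, abv_percent):
    return volume_ml * abv_percent / 100 / 10

pint_of_beer = uk_units(568, 4.0)    # ~2.3 units in a pint of 4% ABV beer
glass_of_wine = uk_units(175, 13.0)  # ~2.3 units in a 175 ml glass of 13% ABV wine

# The 14-unit weekly guideline works out at roughly six pints of beer
# or six 175 ml glasses of wine:
print(round(14 / pint_of_beer, 1), round(14 / glass_of_wine, 1))  # ~6.2 and ~6.2
```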

However, the worldwide study carries implications for countries across the world, where alcohol guidelines vary substantially.

The researchers also looked at the association between alcohol consumption and different types of cardiovascular disease. Alcohol consumption was associated with a higher risk of stroke, heart failure, fatal aortic aneurysms and fatal hypertensive disease, and there were no clear thresholds below which drinking less ceased to have a benefit.

By contrast, alcohol consumption was associated with a slightly lower risk of non-fatal heart attacks.

The authors note that the different relationships between alcohol intake and various types of cardiovascular disease may relate to alcohol’s elevating effects on blood pressure and on factors related to elevated high-density lipoprotein cholesterol (HDL-C) (also known as ‘good’ cholesterol). They stress that the lower risk of non-fatal heart attack must be considered in the context of the increased risk of several other serious and often fatal cardiovascular diseases.

The study focused on current drinkers to reduce the risk of bias caused by those who abstain from alcohol due to poor health. However, the study used self-reported alcohol consumption and relied on observational data, so no firm conclusions can be made about cause and effect. The study did not look at the effect of alcohol consumption over the life-course or account for people who may have reduced their consumption due to health complications.

Dr Angela Wood, from the University of Cambridge, lead author of the study said: “If you already drink alcohol, drinking less may help you live longer and lower your risk of several cardiovascular conditions.

“Alcohol consumption is associated with a slightly lower risk of non-fatal heart attacks but this must be balanced against the higher risk associated with other serious – and potentially fatal – cardiovascular diseases.”

Victoria Taylor, Senior dietician at the British Heart Foundation, which part-funded the study, said: “This powerful study may make sobering reading for countries that have set their recommendations at higher levels than the UK, but this does seem to broadly reinforce government guidelines for the UK.

“This doesn’t mean we should rest on our laurels, many people in the UK regularly drink over what’s recommended. We should always remember that alcohol guidelines should act as a limit, not a target, and try to drink well below this threshold.”

The study was funded by the UK Medical Research Council, British Heart Foundation, National Institute for Health Research, European Union Framework 7, and European Research Council.

Reference
Wood, AM et al. Risk thresholds for alcohol consumption: combined analysis of individual-participant data for 599 912 current drinkers in 83 prospective studies. Lancet; 14 April 2018; DOI: 10.1016/S0140-6736(18)30134-X

Adapted from a press release by British Heart Foundation.


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Human Anti-Cancer Drugs Could Help Treat Transmissible Cancers in Tasmanian Devils

Human anti-cancer drugs could help treat transmissible cancers in Tasmanian devils

source: www.cam.ac.uk

Transmissible cancers are incredibly rare in nature, yet have arisen in Tasmanian devils on at least two separate occasions. New research from the University of Cambridge identifies key anti-cancer drugs which could be trialled as a treatment for these diseases, which are threatening Tasmanian devils with extinction.

This study gives us optimism that anti-cancer drugs that are already in use in humans may offer a chance to assist with conservation efforts for this iconic animal

Elizabeth Murchison

The research also found that the two Tasmanian devil transmissible cancers are very similar to each other, and likely both arose due to susceptibilities inherent to the devils themselves.

Tasmanian devils are marsupial carnivores endemic to the Australian island of Tasmania. The species is considered endangered due to devil facial tumour 1 (DFT1), a cancer that is passed between animals through the transfer of living cancer cells when the animals bite each other. DFT1 causes grotesque and disfiguring facial tumours, which usually kill affected individuals.

The DFT1 cancer first arose in a single individual devil several decades ago, but rather than dying together with this devil, the cancer survived by ‘metastasising’ into different devils. Therefore, the DNA of the devils’ tumour cells is not their own DNA, but rather belongs to the individual devil that first gave rise to DFT1 all those years ago. Remarkably, DFT1 cells can escape the devils’ immune systems despite being in essence a foreign body.

The DFT1 cancer was first observed in north-east Tasmania in 1996, but has subsequently spread widely throughout the island, causing significant declines in devil populations.

In 2014, routine diagnostic screening revealed a second transmissible cancer in Tasmanian devils, devil facial tumour 2 (DFT2), which causes facial tumours indistinguishable to the naked eye from those caused by DFT1, and which is probably also spread by biting. However, analysis showed that the two types of cancer differ at a biological level, and whereas DFT1 first arose from the cells of a female devil, DFT2 appears to have first arisen from a male animal. For now, DFT2 appears to be confined to a peninsula in Tasmania’s south-east.

“The discovery of a second transmissible cancer in Tasmanian devils was a huge surprise,” says Dr Elizabeth Murchison from the Department of Veterinary Medicine at the University of Cambridge. “Other than these two cancers, we know of only one other naturally occurring transmissible cancer in mammals – the canine transmissible venereal tumour in dogs, which first emerged several thousand years ago.”

In fact, outside of mammals, only five transmissible cancers have been observed, all of which cause leukaemia-like diseases in clams and other shellfish.

“The scarcity of transmissible cancers suggests that such diseases emerge rarely,” she adds. “Before 1996, no one had observed them in Tasmanian devils, so finding two transmissible cancers in Tasmanian devils in just eighteen years was very surprising.”

In order to see whether the devil transmissible cancers are caused by external factors or whether the animals were just particularly susceptible to developing these cancers, a research team led by Dr Murchison analysed the genetic profiles of DFT1 and DFT2 tumours taken from a number of Tasmanian devils. The results are published today in the journal Cancer Cell.

The team found striking similarities in tissues-of-origin, genetics, how the cancer cells mutate, and possible drug targets. This implies that the two tumours belong to the same cancer type and arose via similar mechanisms.

The team studied the genetic and physical features of the tumours, and compared the two lineages with each other and with human cancers. In doing so, they identified an important role for particular types of molecules known as receptor tyrosine kinases (RTKs) in sustaining the growth and survival of DFT cancers.

Importantly, drugs targeting RTKs have already been developed for human cancer, and the researchers showed that these drugs efficiently stopped the growth of devil cancer cells growing in the lab. This leads to hope that it may be possible to use these drugs to help Tasmanian devils.

First author of the study, Maximilian Stammnitz, adds: “Altogether, our findings suggest that transmissible cancers may arise naturally in Tasmanian devils. We found no DNA-level evidence of these cancers being caused by external factors or infectious agents such as viruses. It seems plausible that similar transmissible cancers may have occurred in the past, but escaped detection, perhaps because they remained in localised populations, or because they existed prior to the arrival of Europeans in Tasmania in the nineteenth century.”

Why Tasmanian devils should be particularly susceptible to the emergence of DFTs is not clear. However, devils bite each other frequently around the facial area, often causing significant tissue injury. Given the important role for RTK molecules in wound healing, the researchers speculate that DFT cancers may arise from errors in the maintenance of proliferative cells involved in tissue repair after injury.

“When fighting, Tasmanian devils often bite their opponent’s face, which may predispose these animals to the emergence of this particular type of cancer via tissue injury,” adds Stammnitz. “As biting occurs on the face, this would simultaneously provide a route of cell transmission.”

The researchers say it is also possible that human activities may have indirectly increased the risk of the emergence or spread of transmissible devil facial tumours (DFTs) in recent years. For instance, it is possible that some modern land use practices may have provided favourable conditions for devils, leading to an increase in local population densities of devils, and to greater competition, interactions and fights between animals, which may in turn have raised probabilities of DFTs arising or spreading. Alternatively, early persecution of devils by European colonists may have additionally contributed to this species’ documented low genetic diversity, a possible risk factor for disease spread and the ability of DFTs to escape the immune system.

The researchers also identified deletions in DFT1 and in DFT2 in genes involved in recognition of cancer cells by the immune system. This may help explain how these cancers escape the immune system.

“The story of Tasmanian devils in recent years has been a very concerning one,” says Dr Murchison. “This study gives us optimism that anti-cancer drugs that are already in use in humans may offer a chance to assist with conservation efforts for this iconic animal.”

The research was funded by Wellcome, the National Science Foundation, Save the Tasmanian Devil Appeal, Leverhulme Trust, Cancer Research UK and Gates Cambridge Trust.

Reference
Stammnitz, M.R., et al. (2018). The origins and vulnerabilities of two transmissible cancers in Tasmanian devils. Cancer Cell 33(4), 607-619.


Researcher profile: Maximilian Stammnitz

One of Maximilian Stammnitz’s best memories at Cambridge has been his encounter with Tasmanian devils on a field trip to Tasmania in 2016. “There is nothing more exciting than examining actual devils in the wild – they are truly majestic animals!” he says.

Stammnitz is a Gates Cambridge Scholar at Cambridge’s Department of Veterinary Medicine. Originally from “Germany’s sunniest spot: Heidelberg”, he came to Cambridge to join the Computational Biology MPhil program at the Department of Applied Mathematics and Theoretical Physics in 2014.

“This course provides fascinating opportunities to study biology through a big data lens, and to learn about vastly emerging genomics technologies from experts in the field,” he says. “The DNA-level expertise and collaboration at Cambridge surrounding topics of genetics, evolution, medicine and computational data analysis is breath-taking.”

It was a seminar by Elizabeth Murchison on transmissible cancers that caught his imagination, however, and he subsequently joined her group at Veterinary Medicine for a summer internship, and then as a PhD student and Gates Cambridge Scholar. The ultimate aim of his work is to save the largest carnivorous marsupial on the planet, but by studying the fundamental processes of cancer development in Tasmanian devils, his work could help us understand better how cancer develops in humans.

“I spend most of my working days behind a computer screen, processing and analysing large volumes of DNA and RNA sequencing data from Tasmanian devil tumour biopsies,” he says. “Occasionally, I also do molecular biology experiments in the ‘wet lab’, to validate our computational results or to establish testing protocols for the devils.”

It isn’t all about work, though. “Over the past year, I have been the captain of our university’s Blues men’s volleyball team and co-founded PuntSeq, a citizen science project aiming at cost-effective pathogen surveillance of our house river Cam’s water,” he says.

“My biggest challenge of living here is to balance truly focused work life and quiet time with the many inspiring distractions that wait behind the corners of Cambridge’s old walls. It’s a luxury problem to have as a PhD student.”

Follow Maximilian Stammnitz on Twitter @DevilsAdvoMax


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

What Makes a Faster Typist?

What makes a faster typist?

source: www.cam.ac.uk

The largest-ever dataset on typing speeds and styles, based on 136 million keystrokes from 168,000 volunteers, finds that the fastest typists not only make fewer errors, but they often type the next key before the previous one has been released.

Crowdsourcing experiments that allow us to analyse how people interact with computers on a large scale are instrumental for identifying solution principles for the design of next-generation user interfaces.

Per Ola Kristensson

The data was collected by researchers from Aalto University in Finland and the University of Cambridge. Volunteers from over 200 countries took the typing test, which is freely available online. Participants were asked to transcribe randomised sentences, and their accuracy and speed were assessed by the researchers.

Unsurprisingly, the researchers found that faster typists make fewer mistakes. However, they also found that the fastest typists performed between 40 and 70 percent of keystrokes using rollover typing, in which the next key is pressed down before the previous key is lifted. The strategy is well known in the gaming community but had not previously been observed in a typing study. The results will be presented later this month at the ACM CHI Conference on Human Factors in Computing Systems in Montréal.
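
For readers curious how rollover shows up in keystroke data, the sketch below illustrates one simple way to detect it from key press and release timestamps; the data format is an assumption for illustration, not the researchers’ analysis code.

```python
# Minimal sketch of detecting rollover typing from keystroke logs (illustrative
# only; the data format is an assumption, not the study's code). A transition
# counts as rollover if the next key goes down before the previous key is released.

def rollover_fraction(keystrokes):
    """keystrokes: list of (press_time, release_time) tuples in typing order,
    with times in milliseconds. Returns the fraction of transitions typed
    with rollover."""
    if len(keystrokes) < 2:
        return 0.0
    rollover = sum(
        1 for prev, curr in zip(keystrokes, keystrokes[1:])
        if curr[0] < prev[1]   # next key pressed before previous key released
    )
    return rollover / (len(keystrokes) - 1)

# The second key goes down at t=180 ms, before the first is released at t=200 ms,
# so one of the two transitions counts as rollover:
print(rollover_fraction([(0, 200), (180, 320), (400, 520)]))  # -> 0.5
```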

“Crowdsourcing experiments that allow us to analyse how people interact with computers on a large scale are instrumental for identifying solution principles for the design of next-generation user interfaces,” said study co-author Dr Per Ola Kristensson from Cambridge’s Department of Engineering.

Most of our knowledge of how people type is based on studies from the typewriter era. Now, decades after the typewriter was replaced by computers, people make different types of mistakes. For example, errors where one letter is replaced by another are now more common, whereas in the typewriter era typists often added or omitted characters.

Another difference is that modern users use their hands differently. “Modern keyboards allow us to type keys with different fingers of the same hand with much less force than what was possible with typewriters,” said co-author Anna Feit from Aalto University. “This partially explains why self-taught typists using fewer than ten fingers can be as fast as touch typists, which was probably not the case in the typewriter era.”

The average user in the study typed 52 words per minute, much slower than the professionally trained typists of the 70s and 80s, who typically reached 60-90 words per minute. However, performance varied widely. “The fastest users in our study typed 120 words per minute, which is amazing given that this is a controlled study with randomised phrases,” said co-author Dr Antti Oulasvirta, also from Aalto. “Many informal tests allow users to practice the sentences, resulting in unrealistically high performance.”

The researchers found that users who had previously taken a typing course had similar typing behaviour to those who had never taken such a course – in terms of how fast they type, how they use their hands and the errors they make – even though the self-taught typists use fewer fingers.

The researchers found that users display different typing styles, characterised by how they use their hands and fingers, the use of rollover, tapping speeds, and typing accuracy.

For example, some users could be classified as ‘careless typists’, who move their fingers quickly but have to correct many mistakes, while others are attentive, error-free typists, who gain speed by moving hands and fingers in parallel, pressing the next key before the previous one is released.

It is now possible to classify users’ typing behaviour based on observed keystroke timings alone, without storing the text that users have typed. Such information could be useful, for example, for spell checkers, or for creating new personalised typing training programmes.

“You do not need to change to the touch typing system if you want to type faster,” said Feit. “A few simple exercises can help you to improve your own typing technique.”

The anonymised dataset is available at the project homepage: http://userinterfaces.aalto.fi/136Mkeystrokes/

Reference: 
Dhakal, V., Feit, A., Kristensson, P.O. and Oulasvirta, A. 2018. ‘Observations on typing from 136 million keystrokes.’ In Proceedings of the 36th ACM Conference on Human Factors in Computing Systems (CHI 2018). ACM Press.

Adapted from an Aalto University press release. 


Want to type faster?
  • Pay attention to errors, as they are costly to correct. Slow down to avoid them and you will be faster in the long run.
  • Learn to type without looking at your fingers; your motor system will automatically pick up very fast ‘trills’ for frequently occurring letter combinations (such as “the”), which will speed up your typing. Being able to look at the screen while typing also allows you to quickly detect mistakes.
  • Practice rollover: use different fingers for successive letter keys instead of moving a single finger from one key to another. Then, when typing a letter with one finger, press the next one with the other finger.
  • Take an online typing test to track performance and identify weaknesses such as high error rates. Make sure that the test requires you to type new sentences so you do not over-practice the same text.
  • Dedicate time to practice deliberately. People may forget the good habits and relapse to less efficient ways of typing.

Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Online Tool Can Measure Individuals’ Likelihood To Fall For Internet Scams

Online tool can measure individuals’ likelihood to fall for internet scams

source: www.cam.ac.uk

Researchers have developed an online questionnaire which measures a range of personality traits to identify individuals who are more likely to fall victim to internet scams and other forms of cybercrime.

Scams have been around for hundreds of years, and over the centuries, they haven’t really changed that much – the only difference now is with the internet, it requires a lot less effort to do it.

David Modic

The psychometric tool, developed by researchers at the Universities of Cambridge and Helsinki, asks participants to answer a range of questions in order to measure how likely they are to respond to persuasive techniques. The test, called Susceptibility to Persuasion II (StP-II), is freely available and consists of the StP-II scale and several other questions to understand persuadability better. A brief, automated interpretation of the results is displayed at the end of the questionnaire.

The results of the test can be used to predict who will be more likely to become a victim of cybercrime, although the researchers say that StP-II could also be used for hiring in certain professions, for the screening of military personnel or to establish the psychological characteristics of criminal hackers. Their results are reported in the journal PLOS One.

“Scams are essentially like marketing offers, except they’re illegal,” said the paper’s first author Dr David Modic from Cambridge’s Department of Computer Science and Technology. “Just like in advertising, elements of consumer psychology and behavioural economics all come into the design of an online scam, which is why it’s useful to know which personality traits make people susceptible to them.”

Modic and his colleagues at the University of Exeter designed an initial version of the test five years ago that yielded solid results but was not sufficiently detailed. The new version is far more comprehensive and robust.

“We are not aware of an existing scale that would measure all the constructs that are part of StP-II,” said Modic, who is also a senior member of King’s College, Cambridge. “There are existing scales that measure individual traits, but when combined, the sheer length of these scales would present the participant with a psychometric tool that is almost unusable.”

The questions in StP-II fall into 10 categories, measuring different traits which might make people more susceptible to fraud: the ability to premeditate, consistency, sensation seeking, self-control, social influence, need for similarity, attitude towards risk, attitude towards advertising, cognition and uniqueness. Participants are given a score out of seven in each of the ten areas.
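
As a purely illustrative sketch of how a multi-category questionnaire of this kind could be scored – the actual StP-II items, groupings and scoring rules are described in the paper, not reproduced here – a score out of seven per category can be computed by averaging a respondent’s answers within each category:

```python
# Illustrative scoring sketch for a multi-category questionnaire (hypothetical
# item ids and category names; not the actual StP-II instrument). Each item is
# answered on a 1-7 scale, and a category score is the mean of its items.

from statistics import mean

def category_scores(responses, categories):
    """responses: dict mapping item id -> answer (1-7).
    categories: dict mapping category name -> list of item ids."""
    return {name: mean(responses[item] for item in items)
            for name, items in categories.items()}

categories = {"premeditation": ["q1", "q2"], "self_control": ["q3", "q4"]}
responses = {"q1": 3, "q2": 5, "q3": 6, "q4": 2}
print(category_scores(responses, categories))  # -> {'premeditation': 4, 'self_control': 4}
```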

Using a large data set obtained from a collaboration with the BBC, the researchers found that the strongest predictor was the ability to premeditate: individuals who fail to consider the possible consequences of a particular action are more likely to engage with a fraudster. However, they found that the likelihood of falling for one of the measured categories of Internet fraud is partially explained by at least one of the mechanisms in StP-II.

“Over the past ten years, crime, like everything else, has moved online,” said co-author Professor Ross Anderson, also from Cambridge’s Department of Computer Science and Technology. “This year, about a million UK households will be the victim of typical household crime, such as burglary, where the average victim is an elderly working-class woman. However, now 2.5 million households will be the victims of an online or electronic scam, where the victims are younger and more educated. Crime is moving upmarket.”

“Scams have been around for hundreds of years, and over the centuries, they haven’t really changed that much – the only difference now is with the internet, it requires a lot less effort to do it,” said Modic.

The researchers say that despite the changing demographics of crime victims, there isn’t a ‘typical’ victim of cybercrime. “Older generations might be seen as less internet-savvy, but younger generations are both more exposed to scams and might be seen as more impulsive,” said co-author Jussi Palomäki, from the University of Helsinki’s Cognitive Science Unit. “There isn’t a specific age range – there are many different risk factors.”

“The immediate benefit of StP-II is that people will get an indication of the sorts of things they should look out for – I’m not saying it’s a sure-fire way that they will not be scammed, but there are things they should be aware of,” said Modic. “StP-II doesn’t just measure how likely you are to fall for scams, it’s how likely you are to change your behaviour.”

Ross Anderson’s blog on the paper can be found at: https://www.lightbluetouchpaper.org/2018/03/16/we-will-make-you-like-our-research/.

 

Reference:
David Modic, Ross Anderson and Jussi Palomäki. ‘We will make you like our research: The development of a susceptibility-to-persuasion scale.’ PLOS ONE (2018). DOI: 10.1371/journal.pone.0194119


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Research Shows First Land Plants Were Parasitised By Microbes

Research shows first land plants were parasitised by microbes

source: www.cam.ac.uk

The relationship between plants and filamentous microbes not only dates back millions of years, but modern plants have also maintained this ancient mechanism to accommodate and respond to microbial invaders.

Why do some plants welcome some microbes with open arms while giving others the cold-shoulder? Like most relationships, it’s complicated, and it all goes back a long way. By studying liverworts – which diverged from other land plants early in the history of plant evolution – researchers from the Sainsbury Laboratory at the University of Cambridge have found that the relationship between plants and filamentous microbes not only dates back millions of years, but that modern plants have maintained this ancient mechanism to accommodate and respond to microbial invaders.

Liverworts

Liverworts are small green plants that don’t have roots, stems, leaves or flowers. They belong to a group of plants called Bryophytes, which also includes mosses and hornworts. Bryophytes diverged from other plant lineages early in the evolution of plants and are thought to be similar to some of the earliest diverging land plant lineages. Liverworts are found all over the world and are often seen growing as a weed in the cracks of paving or soil of potted plants. Marchantia polymorpha, which is also known as the common liverwort or umbrella liverwort, was used in this research.

Published today in the journal Proceedings of the National Academy of Sciences, a new study shows that aggressive filamentous microbial (fungi-like) pathogens can invade liverworts and that some elements of the liverwort’s response are shared with distantly related plants. The first author of the paper, Dr Philip Carella, said the research showed that liverworts could be infected by the common and devastating microorganism Phytophthora: “We know a great deal about microbial infections of modern flowering plants, but until now we haven’t known how distantly related plant lineages dealt with an invasion by an aggressive microbe. To test this, we first wanted to see if Phytophthora could infect and complete its life cycle in a liverwort.”


Above image: ​A healthy Marchantia polymorpha liverwort (left) and one that has been infected by Phytophthora palmivora (right).

“We found that Phytophthora palmivora can colonise the photosynthetic tissues of the liverwort Marchantia polymorpha by invading living cells. Marchantia responds to this by deploying proteins around the invading Phytophthora hyphal structures. These proteins are similar to those that are produced in flowering plants such as tobacco, legumes or Arabidopsis in response to infections by both symbiont and pathogenic microbes.”


Above image: Microscopy image of a cross-section of a Marchantia polymorpha thallus showing the Phytophthora infection (red) in the upper photosynthetic layer of the liverwort plant.

These lineages share a common ancestor that lived over 400 million years ago, and fossils from this time period show evidence that plants were already forming beneficial relationships with filamentous microbes. Dr Carella added: “These findings raise interesting questions about how plants and microbes have interacted and evolved pathogenic and symbiotic relationships. Which mechanisms evolved early in a common ancestor before the plant groups diverged and which evolved independently?”

Phytophthora

Phytophthora is a water mould. Although it looks like a fungus, it is not one; it belongs to the oomycetes, a group of filamentous microbes. Phytophthora pathogens are best known for devastating crops – potato late blight caused the Irish potato famine – as well as for causing many tropical plant diseases. This research used the tropical species Phytophthora palmivora, which causes diseases in cocoa, oil palms, coconut palms and rubber trees.

Dr Sebastian Schornack, who led the research team, says the study indicates that early land plants were already genetically equipped to respond to microbial infections: “This discovery reveals that certain response mechanisms were already in place very early on in plant evolution.”

“Finding that pathogenic filamentous microbes can invade living liverwort cells and that liverworts respond using similar proteins as in flowering plants suggests that the relationship between filamentous pathogens and plants can be considered ancient. We will continue to study whether pathogens are exploiting mechanisms evolved to support symbionts and, hopefully, this will allow us to establish future crop plants that both benefit from symbionts while remaining more resistant to pathogens.

“Liverworts are showing great promise as a model plant system and this discovery that they can be colonised by pathogens of flowering plants makes them a valuable model plant to continue research into plant-microbe interactions.”

This research was funded by the Gatsby Charitable Foundation, the Royal Society, the BBSRC OpenPlant initiative and the Natural Environment Research Council.


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Researchers Develop Infrared-Based System To Read Body Language

Researchers develop infrared-based system to read body language

source: www.cam.ac.uk

Infrared sensors and a marshmallow offer researchers a new way to monitor and assess social interaction.

The ability to use invisible light to determine someone’s role and attitude in social settings has powerful implications for individuals and organisations that are concerned about how they communicate.

Cecilia Mascolo

A joint research team from the University of Cambridge and Dartmouth College has developed a system for using infrared light tags to monitor face-to-face interactions. The technique could lead to a more precise understanding of how individuals interact in social settings and can increase the effectiveness of communications coaching.

The system, named Protractor by the Cambridge-Dartmouth team, uses invisible light to record how people employ body language by measuring body angles and distances between individuals.

Prior studies have revealed that body language can influence many aspects of everyday life, including job interviews, doctor-patient conversations and team projects. Each of these settings involves a specific set of interaction details, such as eye contact and hand gestures, for which accurate monitoring of distance and relative orientation is crucial.

“The ability to use invisible light to determine someone’s role and attitude in social settings has powerful implications for individuals and organisations that are concerned about how they communicate,” said Professor Cecilia Mascolo from Cambridge’s Department of Computer Science and Technology, who led the research.

Body language is already commonly studied through video sessions, audio recordings and paper questionnaires. Compared to the new, light-based system, these approaches can require invasive cameras, necessitate complex infrastructure support, and impose high burdens on users.

“Our system is a key departure from existing approaches,” said co-author Xia Zhou from Dartmouth. “The ability to sense both body distance and relative angle with fine accuracy using only infrared light offers huge advantages and can deepen the understanding of how body language plays a role in social interactions.”

Protractor is a lightweight, wearable tag resembling an access badge worn with a lanyard or clip. The device measures non-verbal behaviour with fine granularity by using near-infrared light from photodiodes. The light technology operates at a wavelength commonly used in television remote controls.

Before settling on infrared light for the unit, the research team also considered ultrasound and radio frequency. In addition to the overall accuracy, infrared was favourable because light cannot penetrate human bodies, ensuring the accurate sensing of face-to-face interaction. Near-infrared light is also imperceptible to human eyes and keeps the sensing unobtrusive.

Although infrared light is well suited to measuring body language, the research team needed to correct for moments when a user’s hand or clothing temporarily blocked the light channel. They did so by designing algorithms that exploit inertial sensors to work around the absence of light-tracking results.

In demonstrating the system, the researchers also had to devise a way for the sensors to accurately identify participants and to limit power consumption.

“By modulating the light from each Protractor tag to encode the tag ID, each tag can then figure out which individuals are participating. To increase energy efficiency, we also adapt the frequency of emitting light signals based on the specific context,” said co-author Zhao Tian, a PhD candidate at Dartmouth.
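
The quote above describes encoding each tag’s ID in the emitted light. One generic way to do this is simple on-off keying, sketched below; this is an illustration of the general idea under assumed parameters, not the Protractor team’s actual modulation scheme.

```python
# Illustrative on-off-keying sketch for sending a tag ID as light pulses
# (a generic scheme, not the Protractor implementation). Each bit of the ID
# becomes one LED-on (1) or LED-off (0) time slot.

def encode_tag_id(tag_id, n_bits=8):
    """Return a list of 0/1 LED states, most significant bit first."""
    return [(tag_id >> bit) & 1 for bit in reversed(range(n_bits))]

def decode_tag_id(slots):
    """Recover the integer ID from the received on/off slot pattern."""
    tag_id = 0
    for s in slots:
        tag_id = (tag_id << 1) | s
    return tag_id

slots = encode_tag_id(42)     # [0, 0, 1, 0, 1, 0, 1, 0]
print(decode_tag_id(slots))   # -> 42
```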

To study the technique’s effectiveness, the team used the Protractor tags to track non-verbal behaviours during a problem-solving group task known as “The Marshmallow Challenge.” In this task, teams of four members were given 18 minutes to build a structure that could support a marshmallow using tape, string and a handful of spaghetti.

“Beyond simply observing body language with the tags, we identified the task role each group member was performing and delineated each stage in the building process through the recorded body angle and distance measurements,” said Alessandro Montanari, a researcher at the University of Cambridge.

In the study of 64 participants, Protractor achieved 1- to 2-inch mean error in estimating interaction distance and less than 6 degrees error 95 percent of the time for measuring relative body orientation. The system also allowed researchers to assess an individual’s task role within the challenge with close to 85 percent accuracy while identifying stages in the building process with over 93 percent accuracy.

According to the research team, the system will not only support social research, but it can also potentially provide real-time feedback during interviews and other interactions. Trainers, supervisors and team facilitators can use these findings to better understand team dynamics and intervene during intense problem-focused discussions to achieve higher creativity.

Protractor can also help study the impact of culture on body language, in light of research showing that cultural backgrounds can affect the way people think, feel and act while working with others – an important consideration in today’s highly internationalised workplaces.

Researchers at Maastricht University and the University of Nottingham also contributed to this study.

The research was published in Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies and will be presented during UbiComp’18.

Adapted from a Dartmouth press release


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Citizen Science Experiment Predicts Massive Toll of Flu Pandemic On The UK

Citizen science experiment predicts massive toll of flu pandemic on the UK

source: www.cam.ac.uk

How fast could a new flu epidemic spread? The results of the UK’s largest citizen science project of its kind ever attempted, carried out by thousands of volunteers, predict that 43 million people in the UK could be infected in an influenza pandemic, with up to 886,000 deaths among those infected.

We don’t know of any studies that join up the movement and survey data so comprehensively.

Julia Gog

The numbers are frightening, but even more daunting is the very real danger of a major flu pandemic emerging at any moment. Experts around the world agree that it is a question of when, not if, the next deadly pandemic will strike, which is why it sits at the top of the UK government’s civil risk register.

When it happens the pandemic will almost certainly reach the UK and the government will be faced with a series of life-saving decisions. Should we close schools or public transport? Who should be given priority when the first doses of vaccine become available? How will we cope if there is a high mortality rate? Having the right answers to these and many other crucial pandemic response questions depends on mathematical models.

The model behind the results, designed by researchers at the University of Cambridge and the London School of Hygiene and Tropical Medicine, is based on data from nearly 30,000 volunteers and represents the largest and most comprehensive dataset of its kind. The results will be broadcast on Contagion! The BBC Four Pandemic, tonight (22 March) at 9pm on BBC Four, presented by Dr Hannah Fry and Dr Javid Abdelmoneim. The results are also published in the journal Epidemics.

“The value of predictions hinges completely on the quality of the model,” said Professor Julia Gog from Cambridge’s Department of Applied Mathematics and Theoretical Physics, who heads the disease dynamics group. “Up to now, the picture of how the population in the UK move around has been surprisingly limited, and existing studies use relatively small samples of the population. Getting a handle on how people move and interact day to day is vital to understanding how a virus will actually spread from person to person and place to place. The BBC Pandemic project has aimed to address this gap, with volunteers using an app to track movements and record who they encounter day to day, creating the biggest dataset for UK pandemic research ever collected.”

“The BBC Pandemic experiment has been hugely successful in recruiting study participants,” said Dr Petra Klepac, the lead author of the paper. “The resulting dataset is incredibly rich and will become a new gold standard in modelling contact and movement patterns that shape the spread of infectious diseases. For the programme, we were able to create a detailed UK model based on data from almost 30,000 users.”

The BBC Four programme will show how a pandemic might spread in the UK, starting from Haslemere in Surrey, where the team modelled in detail an introduction starting from a hypothetical patient zero.

“We don’t know of any studies that join up the movement and survey data so comprehensively,” said Gog. “And this experiment is just huge already, an order of magnitude bigger than anything even similar. The BBC Pandemic experiment sets a new benchmark for other future studies around the world.”

The study remains open during all of 2018, and anyone in the UK can volunteer by using the app (available via App Store or Google Play). Once the project is complete, the anonymised dataset will be made available to all researchers, enabling more accurate prediction in future. “Our focus so far has been on a prospective influenza pandemic, but this dataset will be valuable in our efforts to understand and control a variety of infectious diseases, both in the UK and in extrapolating to other countries,” said Gog.

“While these preliminary results are eye-opening there’s a lot more this data can be used for,” said programme host Dr Hannah Fry. “Scientists around the country will be using it for years to come.”

The BBC Pandemic app was launched in September 2017. Once downloaded, app users enter some basic anonymous demographic information about themselves such as age and gender, and then are asked to be tracked via the GPS on their phone once an hour for 24 hours. The app also records the people they come into close contact with. This is the first time tracking, demographic and contact data have been combined, making it an unrivalled tool for pandemic research.

The headline results of the simulation shown in the programme are based on a moderately transmissible influenza pandemic virus with a high fatality rate, in accordance with a ‘reasonable worst case’. The details of assumptions and limitations are discussed in detail in the paper.
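
For readers unfamiliar with how epidemic models produce figures of this kind, the sketch below runs a deliberately simple SIR (susceptible-infected-recovered) simulation. It is emphatically not the BBC Pandemic model, which is built on the detailed movement and contact data described above; the parameter values here are placeholders for illustration, not estimates from the study.

```python
# Deliberately simple discrete-time SIR sketch with homogeneous mixing
# (illustrative only; NOT the BBC Pandemic model, and the parameters below
# are placeholders rather than study estimates).

def sir(population, r0=1.8, infectious_days=3.0, initially_infected=10, days=365):
    beta = r0 / infectious_days          # transmissions per infected person per day
    gamma = 1.0 / infectious_days        # recoveries per infected person per day
    s, i, r = population - initially_infected, float(initially_infected), 0.0
    for _ in range(days):
        new_infections = beta * s * i / population
        new_recoveries = gamma * i
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
    return r   # roughly the total ever infected once the epidemic has run its course

print(f"{sir(66_000_000):,.0f} people infected in this toy scenario")
```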

Reference
Petra Klepac, Stephen Kissler and Julia Gog. ‘Contagion! The BBC Four Pandemic – the model behind the documentary.’ Epidemics (2018). DOI: 10.1016/j.epidem.2018.03.003

 


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Potassium Gives Perovskite-Based Solar Cells An Efficiency Boost

Potassium gives perovskite-based solar cells an efficiency boost

source: www.cam.ac.uk

A simple potassium solution could boost the efficiency of next-generation solar cells, by enabling them to convert more sunlight into electricity.

Perovskites are very tolerant to additives – you can add new components and they’ll perform better.

Mojtaba Abdi-Jalebi

An international team of researchers led by the University of Cambridge found that the addition of potassium iodide ‘healed’ the defects and immobilised ion movement, which to date have limited the efficiency of cheap perovskite solar cells. These next-generation solar cells could be used as an efficiency-boosting layer on top of existing silicon-based solar cells, or be made into stand-alone solar cells or coloured LEDs. The results are reported in the journal Nature.

The solar cells in the study are based on metal halide perovskites – a promising group of ionic semiconductor materials that in just a few short years of development now rival commercial thin film photovoltaic technologies in terms of their efficiency in converting sunlight into electricity. Perovskites are cheap and easy to produce at low temperatures, which makes them attractive for next-generation solar cells and lighting.

Despite the potential of perovskites, some limitations have hampered their efficiency and consistency. Tiny defects in the crystalline structure of perovskites, called traps, can cause electrons to get ‘stuck’ before their energy can be harnessed. The easier that electrons can move around in a solar cell material, the more efficient that material will be at converting photons, particles of light, into electricity. Another issue is that ions can move around in the solar cell when illuminated, which can cause a change in the bandgap – the colour of light the material absorbs.
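The relationship between the bandgap and the colour of light absorbed can be made concrete with the standard photon-energy relation E = hc/λ. The snippet below is a simple illustration; the 1.6 eV figure is an assumed example of a typical perovskite bandgap, not a value from the study.

```python
# Convert a bandgap energy to the longest wavelength of light the material
# can absorb, using E = hc / lambda (hc is roughly 1240 eV·nm).

def bandgap_to_wavelength_nm(bandgap_ev: float) -> float:
    return 1239.84 / bandgap_ev

# Assumed example: a perovskite with a 1.6 eV bandgap absorbs light up to
# about 775 nm, i.e. into the deep red.
print(f"{bandgap_to_wavelength_nm(1.6):.0f} nm")
```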

“So far, we haven’t been able to make these materials stable with the bandgap we need, so we’ve been trying to immobilise the ion movement by tweaking the chemical composition of the perovskite layers,” said Dr Sam Stranks from Cambridge’s Cavendish Laboratory, who led the research. “This would enable perovskites to be used as versatile solar cells or as coloured LEDs, which are essentially solar cells run in reverse.”

In the study, the researchers altered the chemical composition of the perovskite layers by adding potassium iodide to perovskite inks, which then self-assemble into thin films. The technique is compatible with roll-to-roll processes, which means it is scalable and inexpensive. The potassium iodide formed a ‘decorative’ layer on top of the perovskite which had the effect of ‘healing’ the traps so that the electrons could move more freely, as well as immobilising the ion movement, which makes the material more stable at the desired bandgap.

The researchers demonstrated promising performance with the perovskite bandgaps ideal for layering on top of a silicon solar cell or with another perovskite layer – so-called tandem solar cells. Silicon tandem solar cells are the most likely first widespread application of perovskites. By adding a perovskite layer, light can be more efficiently harvested from a wider range of the solar spectrum.

“Potassium stabilises the perovskite bandgaps we want for tandem solar cells and makes them more luminescent, which means more efficient solar cells,” said Stranks, whose research is funded by the European Union’s Horizon 2020 Programme and the European Research Council. “It almost entirely manages the ions and defects in perovskites.”

“We’ve found that perovskites are very tolerant to additives – you can add new components and they’ll perform better,” said first author Mojtaba Abdi-Jalebi, a PhD candidate at the Cavendish Laboratory who is funded by Nava Technology Limited. “Unlike other photovoltaic technologies, we don’t need to add an additional layer to improve performance, the additive is simply mixed in with the perovskite ink.”

The perovskite and potassium devices showed good stability in tests, and were 21.5% efficient at converting light into electricity, which is similar to the best perovskite-based solar cells and not far below the practical efficiency limit of silicon-based solar cells, which is around 29%. Tandem cells made of two perovskite layers with ideal bandgaps have a theoretical efficiency limit of 45% and a practical limit of 35% – both of which are higher than the current practical efficiency limits for silicon. “You get more power for your money,” said Stranks.
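As a back-of-the-envelope comparison of the efficiency figures quoted above, the sketch below converts them into output power for a fixed panel area; the irradiance and panel size are assumed values used purely for illustration.

```python
# Back-of-the-envelope comparison of the efficiency figures quoted above.
# The irradiance (1000 W per square metre, standard test conditions) and
# the panel area are assumed values for illustration only.

irradiance_w_per_m2 = 1000.0
panel_area_m2 = 1.6

for label, efficiency in [("This perovskite device", 0.215),
                          ("Practical limit for silicon", 0.29),
                          ("Practical limit for perovskite tandems", 0.35)]:
    power = irradiance_w_per_m2 * panel_area_m2 * efficiency
    print(f"{label}: {power:.0f} W from a {panel_area_m2} m^2 panel")
```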

The research has also been supported in part by the Royal Society and the Engineering and Physical Sciences Research Council. The international team included researchers from Cambridge, Sheffield University, Uppsala University in Sweden and Delft University of Technology in the Netherlands.

Reference:
Mojtaba Abdi-Jalebi et al. ‘Maximising and Stabilising Luminescence from Halide Perovskites with Potassium Passivation.’ Nature (2018). DOI: 10.1038/nature25989


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Volcanic Eruption Influenced Iceland’s Conversion To Christianity

Volcanic eruption influenced Iceland’s conversion to Christianity

source: www.cam.ac.uk

Memories of the largest lava flood in the history of Iceland, recorded in an apocalyptic medieval poem, were used to drive the island’s conversion to Christianity, new research suggests.

With a firm date for the eruption, many entries in medieval chronicles snap into place as likely consequences.

Clive Oppenheimer

A team of scientists and medieval historians, led by the University of Cambridge, has used information contained within ice cores and tree rings to accurately date a massive volcanic eruption, which took place soon after the island was first settled.

Having dated the eruption, the researchers found that Iceland’s most celebrated medieval poem, which foretells the end of the pagan gods and the coming of a new, singular god, describes the eruption and uses memories of it to stimulate the Christianisation of Iceland. The results are reported in the journal Climatic Change.

The eruption of the Eldgjá in the tenth century is known as a lava flood: a rare type of prolonged volcanic eruption in which huge flows of lava engulf the landscape, accompanied by a haze of sulphurous gases. Iceland specialises in this type of eruption – the last example occurred in 2015, and it affected air quality 1400 kilometres away in Ireland.

The Eldgjá lava flood affected southern Iceland within a century of the island’s settlement by Vikings and Celts around 874, but until now the date of the eruption has been uncertain, hindering investigation of its likely impacts. It was a colossal event with around 20 cubic kilometres of lava erupted – enough to cover all of England up to the ankles.
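That comparison can be checked roughly as follows; England’s land area (about 130,000 square kilometres) is an assumed figure not given in the article.

```python
# Rough check of the 'England up to the ankles' comparison. England's land
# area (~130,000 km^2) is an assumed figure, not stated in the article.

lava_volume_km3 = 20.0
england_area_km2 = 130_000.0

depth_m = (lava_volume_km3 / england_area_km2) * 1000.0  # convert km to m
print(f"Average depth: {depth_m * 100:.0f} cm")  # roughly 15 cm, about ankle height
```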

The Cambridge-led team pinpointed the date of the eruption using ice core records from Greenland that preserve the volcanic fallout from Eldgjá. Using the clues contained within the ice cores, the researchers found that the eruption began around the spring of 939 and continued at least through the autumn of 940.

“This places the eruption squarely within the experience of the first two or three generations of Iceland’s settlers,” said first author Dr Clive Oppenheimer of Cambridge’s Department of Geography. “Some of the first wave of migrants to Iceland, brought over as children, may well have witnessed the eruption.”

Once they had a date for the Eldgjá eruption, the team then investigated its consequences. First, a haze of sulphurous dust spread across Europe, recorded as sightings of an exceptionally blood-red and weakened Sun in Irish, German and Italian chronicles from the same period.

Then the climate cooled as the dust layer reduced the amount of sunlight reaching the surface, which is evident from tree rings from across the Northern Hemisphere. The evidence contained in the tree rings suggests the eruption triggered one of the coolest summers of the last 1500 years. “In 940, summer cooling was most pronounced in Central Europe, Scandinavia, the Canadian Rockies, Alaska and Central Asia, with summer average temperatures 2°C lower,” said co-author Professor Markus Stoffel from the University of Geneva’s Department of Earth Sciences.

The team then looked at medieval chronicles to see how the cooling climate impacted society. “It was a massive eruption, but we were still amazed just how abundant the historical evidence is for the eruption’s consequences,” said co-author Dr Tim Newfield, from Georgetown University’s Departments of History and Biology. “Human suffering in the wake of Eldgjá was widespread. From northern Europe to northern China, people experienced long, hard winters and severe spring-summer drought. Locust infestations and livestock mortalities occurred. Famine did not set in everywhere, but in the early 940s we read of starvation and vast mortality in parts of Germany, Iraq and China.”

“The effects of the Eldgjá eruption must have been devastating for the young colony on Iceland – very likely, land was abandoned and famine severe,” said co-author Professor Andy Orchard from the University of Oxford’s Faculty of English. “However, there are no surviving texts from Iceland itself during this time that provide us with direct accounts of the eruption.”

But Iceland’s most celebrated medieval poem, Vǫluspá (‘The prophecy of the seeress’) does appear to give an impression of what the eruption was like. The poem, which can be dated as far back as 961, foretells the end of Iceland’s pagan gods and the coming of a new, singular god: in other words, the conversion of Iceland to Christianity, which was formalised around the turn of the eleventh century.

Part of the poem describes a terrible eruption with fiery explosions lighting up the sky, and the Sun obscured by thick clouds of ash and steam:

“The sun starts to turn black, land sinks into sea; the bright stars scatter from the sky.
Steam spurts up with what nourishes life, flame flies high against heaven itself.”

The poem also depicts cold summers that would be expected after a massive eruption, and the researchers link these descriptions to the spectacle and impacts of the Eldgjá eruption, the largest in Iceland since its settlement.

The poem’s apocalyptic imagery marks the fiery end to the world of the old gods. The researchers suggest that these lines in the poem may have been intended to rekindle harrowing memories of the eruption to stimulate the massive religious and cultural shift taking place in Iceland in the last decades of the tenth century.

“With a firm date for the eruption, many entries in medieval chronicles snap into place as likely consequences – sightings in Europe of an extraordinary atmospheric haze; severe winters; cold summers; poor harvests; and food shortages,” said Oppenheimer. “But most striking is the almost eyewitness style in which the eruption is depicted in Vǫluspá. The poem’s interpretation as a prophecy of the end of the pagan gods and their replacement by the one, singular god, suggests that memories of this terrible volcanic eruption were purposefully provoked to stimulate the Christianisation of Iceland.”

Reference:
Clive Oppenheimer et al. ‘The Eldgjá eruption: timing, long-range impacts and influence on the Christianisation of Iceland.’ Climatic Change (2018). DOI: 10.1007/s10584-018-2171-9

Inset image: Codex Regius, which contains a version of the Vǫluspá.


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Chain Reaction of Fast-Draining Lakes Poses New Risk For Greenland Ice Sheet

Chain reaction of fast-draining lakes poses new risk for Greenland ice sheet

source: www.cam.ac.uk

A growing network of lakes on the Greenland ice sheet has been found to drain in a chain reaction that speeds up the flow of the ice sheet, threatening its stability.

Researchers from the UK, Norway, US and Sweden have used a combination of 3D computer modelling and real-world observations to show the previously unknown, yet profound dynamic consequences tied to a growing number of lakes forming on the Greenland ice sheet.

Lakes form on the surface of the Greenland ice sheet each summer as the weather warms. Many exist for weeks or months, but drain in just a few hours through more than a kilometre of ice, transferring huge quantities of water and heat to the base of the ice sheet. The affected areas include sensitive regions of the ice sheet interior where the impact on ice flow is potentially large.

Previously, it had been thought that these ‘drainage events’ were isolated incidents, but the new research, led by the University of Cambridge, shows that the lakes form a massive network and become increasingly interconnected as the weather warms. When one lake drains, the water quickly spreads under the ice sheet, which responds by flowing faster. The faster flow opens new fractures on the surface and these fractures act as conduits for the drainage of other lakes. This starts a chain reaction that can drain many other lakes, some as far as 80 kilometres away.

These cascading events – including one case where 124 lakes drained in just five days – can temporarily accelerate ice flow by as much as 400%, which makes the ice sheet less stable, and increases the rate of associated sea level rise. The results are reported in the journal Nature Communications.

The study demonstrates how forces within the ice sheet can change abruptly from one day to the next, causing solid ice to fracture suddenly. The model developed by the international team shows that lakes forming in stable areas of the ice sheet drain when distant lakes drain: the water flowing beneath the ice sheet transmits a high tensile shock force along its drainage paths, opening fractures beneath the previously stable lakes.
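As a toy illustration of this cascade mechanism (not the team’s 3D ice-sheet model), the sketch below lets each drained lake add a stress increment to every undrained lake within 80 kilometres, and any lake pushed past its fracture threshold drains in turn; all numbers are arbitrary.

```python
import random

# Toy cascade of lake drainage, not the team's 3D ice-sheet model.
# Each drained lake adds a stress increment to every undrained lake within
# 80 km; any lake pushed past its fracture threshold drains in the same
# cascade. All numbers are arbitrary.

random.seed(1)
N_LAKES = 124
positions = [random.uniform(0, 300) for _ in range(N_LAKES)]      # km along a flow line
thresholds = [random.uniform(0.4, 1.4) for _ in range(N_LAKES)]   # arbitrary stress units
stress = [0.0] * N_LAKES
drained = [False] * N_LAKES

def drain(first_lake, increment=0.5, reach_km=80.0):
    queue = [first_lake]
    drained[first_lake] = True
    while queue:
        i = queue.pop()
        for j in range(N_LAKES):
            if not drained[j] and abs(positions[i] - positions[j]) <= reach_km:
                stress[j] += increment
                if stress[j] >= thresholds[j]:
                    drained[j] = True
                    queue.append(j)

drain(0)
print(f"{sum(drained)} of {N_LAKES} lakes drained in the cascade")
```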

“This growing network of melt lakes, which currently extends more than 100 kilometres inland and reaches elevations as high as 2,000 metres above sea level, poses a threat for the long-term stability of the Greenland ice sheet,” said lead author Dr Poul Christoffersen, from Cambridge’s Scott Polar Research Institute. “This ice sheet, which covers 1.7 million square kilometres, was relatively stable 25 years ago, but now loses one billion tonnes of ice every day. This causes one millimetre of global sea level rise per year, a rate which is much faster than what was predicted only a few years ago.”
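Those two figures are mutually consistent, as a quick check shows; the ocean surface area used below (roughly 361 million square kilometres) is an assumed value not given in the article.

```python
# Quick consistency check: one billion tonnes of ice lost per day versus
# one millimetre of global sea level rise per year. The ocean surface area
# (~3.61e8 km^2) is an assumed value.

ice_loss_tonnes_per_day = 1e9
ocean_area_m2 = 3.61e8 * 1e6           # km^2 -> m^2
meltwater_density_kg_per_m3 = 1000.0

annual_volume_m3 = ice_loss_tonnes_per_day * 365 * 1000.0 / meltwater_density_kg_per_m3
rise_mm = annual_volume_m3 / ocean_area_m2 * 1000.0
print(f"Implied sea level rise: {rise_mm:.2f} mm per year")
```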

The study departs from the current consensus that lakes forming at high elevations on the Greenland ice sheet have only a limited potential to influence the flow of the ice sheet as the climate warms. Whereas the latest report by the Intergovernmental Panel on Climate Change concluded that surface meltwater, although abundant, does not impact the flow of the ice sheet, the study suggests that meltwater delivered to the base of the ice sheet through draining lakes in fact drives episodes of sustained acceleration extending much farther into the interior of the ice sheet than previously thought.

“Transfer of water and heat from surface to the bed can escalate extremely rapidly due to a chain reaction,” said Christoffersen. “In one case we found all but one of 59 observed lakes drained in a single cascading event. Most of the melt lakes drain in this dynamic way.”

Although the delivery of small amounts of meltwater to the base of the ice sheet only increases the ice sheet’s flow locally, the study shows that the response of the ice sheet can intensify through knock-on effects.

When a single lake drains, the ice flow temporarily accelerates along the path taken by water flowing along the bottom of the ice sheet. Lakes situated in stable basins along this path drain when the loss of friction along the bed temporarily transfers forces to the surface of the ice sheet, causing fractures to open up beneath other lakes, which then also drain.

“The transformation of forces within the ice sheet when lakes drain is sudden and dramatic,” said co-author Dr Marion Bougamont, also from the Scott Polar Research Institute. “Lakes that drain in one area produce fractures that cause more lakes to drain elsewhere. It all adds up when you look at the pathways of water underneath the ice.”

The study used high-resolution satellite images to confirm that fractures on the surface of the ice sheet open up when cascading lake drainage occurs. “This aspect of our work is quite worrying,” said Christoffersen. “We found clear evidence of these crevasses at 1,800 metres above sea level and as far as 135 kilometres inland from the ice margin. This is much farther inland than previously considered possible.”

While complete loss of all ice in Greenland remains extremely unlikely this century, the highly dynamic manner in which the ice sheet responds to Earth’s changing climate clearly underscores the urgent need for a global agreement that will reduce the emission of greenhouse gases.

The work was funded by the Natural Environment Research Council (NERC) and the European Research Council (ERC).

Reference: 
Poul Christoffersen et al. ‘Cascading lake drainage on the Greenland Ice Sheet triggered by tensile shock and fracture.’ Nature Communications (2018). DOI: 10.1038/s41467-018-03420-8


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Ultra-White Coating Modelled On Beetle Scales

Ultra-white coating modelled on beetle scales

source: www.cam.ac.uk

Researchers have developed a super-thin, non-toxic, lightweight, edible ultra-white coating that could be used to make brighter paints and coatings, for use in the cosmetic, food or pharmaceutical industries.

These cellulose-based materials have a structure that’s almost like spaghetti, which is how they are able to scatter light so well.

Silvia Vignolini

The material – which is 20 times whiter than paper – is made from non-toxic cellulose and achieves such bright whiteness by mimicking the structure of the ultra-thin scales of certain types of beetle. The results are reported in the journal Advanced Materials.

Bright colours are usually produced using pigments, which absorb certain wavelengths of light and reflect others, which our eyes then perceive as colour.

To appear as white, however, all wavelengths of light need to be reflected with the same efficiency. Most commercially-available white products – such as sun creams, cosmetics and paints – incorporate highly refractive particles (usually titanium dioxide or zinc oxide) to reflect light efficiently. These materials, while considered safe, are not fully sustainable or biocompatible.

In nature, the Cyphochilus beetle, which is native to Southeast Asia, produces its ultra-white colouring not through pigments, but by exploiting the geometry of a dense network of chitin – a molecule which is also found in the shells of molluscs, the exoskeletons of insects and the cell walls of fungi. Chitin has a structure which scatters light extremely efficiently – resulting in ultra-white coatings which are very thin and light.

“White is a very special type of structural colour,” said paper co-author Olimpia Onelli, from Cambridge’s Department of Chemistry. “Other types of structural colour – for example butterfly wings or opals – have a specific pattern in their structure which results in vibrant colour, but to produce white, the structure needs to be as random as possible.”

The Cambridge team, working with researchers from Aalto University in Finland, mimicked the structure of chitin using cellulose, which is non-toxic, abundant, strong and bio-compatible. Using tiny strands of cellulose, or cellulose nanofibrils, they were able to achieve the same ultra-white effect in a flexible membrane.

By using a combination of nanofibrils of varying diameters, the researchers were able to tune the opacity, and therefore the whiteness, of the end material. The membranes made from the thinnest fibres were more transparent, while adding medium and thick fibres resulted in a more opaque membrane. In this way, the researchers were able to fine-tune the geometry of the nanofibrils so that they reflected the most light.

“These cellulose-based materials have a structure that’s almost like spaghetti, which is how they are able to scatter light so well,” said senior author Dr Silvia Vignolini, also from Cambridge’s Department of Chemistry. “We need to get the mix just right: we don’t want it to be too uniform, and we don’t want it to collapse.”

Like the beetle scales, the cellulose membranes are extremely thin: just a few millionths of a metre thick, although the researchers say that even thinner membranes could be produced by further optimising their fabrication process. The membranes scatter light 20 to 30 times more efficiently than paper and could be used to produce next-generation efficient, bright, sustainable and biocompatible white materials.

The research was funded in part by the UK Biotechnology and Biological Sciences Research Council and the European Research Council. The technology has been patented by Cambridge Enterprise, the University’s commercialisation arm.

Reference:
Matti S. Toivonen et al. ‘Anomalous-Diffusion-Assisted Brightness in White Cellulose Nanofibril Membranes.’ Advanced Materials (2018). DOI: 10.1002/adma.201704050


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.