
Volcanic Eruption Influenced Iceland’s Conversion To Christianity


source: www.cam.ac.uk

Memories of the largest lava flood in the history of Iceland, recorded in an apocalyptic medieval poem, were used to drive the island’s conversion to Christianity, new research suggests.

With a firm date for the eruption, many entries in medieval chronicles snap into place as likely consequences.

Clive Oppenheimer

A team of scientists and medieval historians, led by the University of Cambridge, has used information contained within ice cores and tree rings to accurately date a massive volcanic eruption, which took place soon after the island was first settled.

Having dated the eruption, the researchers found that Iceland’s most celebrated medieval poem, which describes the end of the pagan gods and the coming of a new, singular god, describes the eruption and uses memories of it to stimulate the Christianisation of Iceland. The results are reported in the journal Climatic Change.

The eruption of the Eldgjá in the tenth century is known as a lava flood: a rare type of prolonged volcanic eruption in which huge flows of lava engulf the landscape, accompanied by a haze of sulphurous gases. Iceland specialises in this type of eruption – the last example occurred in 2015, and it affected air quality 1400 kilometres away in Ireland.

The Eldgjá lava flood affected southern Iceland within a century of the island’s settlement by Vikings and Celts around 874, but until now the date of the eruption has been uncertain, hindering investigation of its likely impacts. It was a colossal event with around 20 cubic kilometres of lava erupted – enough to cover all of England up to the ankles.
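As a rough back-of-the-envelope check (assuming an area for England of roughly 130,000 square kilometres, a figure not given in the article), the quoted volume spread evenly would reach a depth of

\[
\frac{20\ \mathrm{km^3}}{1.3\times10^{5}\ \mathrm{km^2}} \approx 1.5\times10^{-4}\ \mathrm{km} \approx 15\ \mathrm{cm},
\]

which is indeed roughly ankle height.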

The Cambridge-led team pinpointed the date of the eruption using ice core records from Greenland that preserve the volcanic fallout from Eldgjá. Using the clues contained within the ice cores, the researchers found that the eruption began around the spring of 939 and continued at least through the autumn of 940.

“This places the eruption squarely within the experience of the first two or three generations of Iceland’s settlers,” said first author Dr Clive Oppenheimer of Cambridge’s Department of Geography. “Some of the first wave of migrants to Iceland, brought over as children, may well have witnessed the eruption.”

Once they had a date for the Eldgjá eruption, the team then investigated its consequences. First, a haze of sulphurous dust spread across Europe, recorded as sightings of an exceptionally blood-red and weakened Sun in Irish, German and Italian chronicles from the same period.

Then the climate cooled as the dust layer reduced the amount of sunlight reaching the surface, which is evident from tree rings from across the Northern Hemisphere. The evidence contained in the tree rings suggests the eruption triggered one of the coolest summers of the last 1500 years. “In 940, summer cooling was most pronounced in Central Europe, Scandinavia, the Canadian Rockies, Alaska and Central Asia, with summer average temperatures 2°C lower,” said co-author Professor Markus Stoffel from the University of Geneva’s Department of Earth Sciences.

The team then looked at medieval chronicles to see how the cooling climate impacted society. “It was a massive eruption, but we were still amazed just how abundant the historical evidence is for the eruption’s consequences,” said co-author Dr Tim Newfield, from Georgetown University’s Departments of History and Biology. “Human suffering in the wake of Eldgjá was widespread. From northern Europe to northern China, people experienced long, hard winters and severe spring-summer drought. Locust infestations and livestock mortalities occurred. Famine did not set in everywhere, but in the early 940s we read of starvation and vast mortality in parts of Germany, Iraq and China.”

“The effects of the Eldgjá eruption must have been devastating for the young colony on Iceland – very likely, land was abandoned and famine severe,” said co-author Professor Andy Orchard from the University of Oxford’s Faculty of English. “However, there are no surviving texts from Iceland itself during this time that provide us with direct accounts of the eruption.”

But Iceland’s most celebrated medieval poem, Vǫluspá (‘The prophecy of the seeress’) does appear to give an impression of what the eruption was like. The poem, which can be dated as far back as 961, foretells the end of Iceland’s pagan gods and the coming of a new, singular god: in other words, the conversion of Iceland to Christianity, which was formalised around the turn of the eleventh century.

Part of the poem describes a terrible eruption with fiery explosions lighting up the sky, and the Sun obscured by thick clouds of ash and steam:

“The sun starts to turn black, land sinks into sea; the bright stars scatter from the sky.
Steam spurts up with what nourishes life, flame flies high against heaven itself.”

The poem also depicts cold summers that would be expected after a massive eruption, and the researchers link these descriptions to the spectacle and impacts of the Eldgjá eruption, the largest in Iceland since its settlement.

The poem’s apocalyptic imagery marks the fiery end to the world of the old gods. The researchers suggest that these lines in the poem may have been intended to rekindle harrowing memories of the eruption to stimulate the massive religious and cultural shift taking place in Iceland in the last decades of the tenth century.

“With a firm date for the eruption, many entries in medieval chronicles snap into place as likely consequences – sightings in Europe of an extraordinary atmospheric haze; severe winters; and cold summers, poor harvests; and food shortages,” said Oppenheimer. “But most striking is the almost eyewitness style in which the eruption is depicted in Vǫluspá. The poem’s interpretation as a prophecy of the end of the pagan gods and their replacement by the one, singular god, suggests that memories of this terrible volcanic eruption were purposefully provoked to stimulate the Christianisation of Iceland.”

Reference:
Clive Oppenheimer et al “The Eldgjá eruption: timing, long-range impacts and influence on the Christianisation of Iceland.” Climatic Change (2018). DOI: 10.1007/s10584-018-2171-9

Inset image: Codex Regius, which contains a version of the Vǫluspá.


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Chain Reaction of Fast-Draining Lakes Poses New Risk For Greenland Ice Sheet


source: www.cam.ac.uk

A growing network of lakes on the Greenland ice sheet has been found to drain in a chain reaction that speeds up the flow of the ice sheet, threatening its stability.

Researchers from the UK, Norway, US and Sweden have used a combination of 3D computer modelling and real-world observations to show the previously unknown, yet profound dynamic consequences tied to a growing number of lakes forming on the Greenland ice sheet.

Lakes form on the surface of the Greenland ice sheet each summer as the weather warms. Many exist for weeks or months, but drain in just a few hours through more than a kilometre of ice, transferring huge quantities of water and heat to the base of the ice sheet. The affected areas include sensitive regions of the ice sheet interior where the impact on ice flow is potentially large.

Previously, it had been thought that these ‘drainage events’ were isolated incidents, but the new research, led by the University of Cambridge, shows that the lakes form a massive network and become increasingly interconnected as the weather warms. When one lake drains, the water quickly spreads under the ice sheet, which responds by flowing faster. The faster flow opens new fractures on the surface and these fractures act as conduits for the drainage of other lakes. This starts a chain reaction that can drain many other lakes, some as far as 80 kilometres away.

These cascading events – including one case where 124 lakes drained in just five days – can temporarily accelerate ice flow by as much as 400%, which makes the ice sheet less stable, and increases the rate of associated sea level rise. The results are reported in the journal Nature Communications.

The study demonstrates how forces within the ice sheet can change abruptly from one day to the next, causing solid ice to fracture suddenly. The model developed by the international team shows that lakes forming in stable areas of the ice sheet drain when fractures open in response to a high tensile shock force acting along drainage paths of water flowing beneath the ice sheet when other lakes drain far away.

“This growing network of melt lakes, which currently extends more than 100 kilometres inland and reaches elevations as high as 2,000 metres above sea level, poses a threat for the long-term stability of the Greenland ice sheet,” said lead author Dr Poul Christoffersen, from Cambridge’s Scott Polar Research Institute. “This ice sheet, which covers 1.7 million square kilometres, was relatively stable 25 years ago, but now loses one billion tonnes of ice every day. This causes one millimetre of global sea level rise per year, a rate which is much faster than what was predicted only a few years ago.”
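As a rough consistency check on those two figures (assuming the commonly used conversion of about 360 gigatonnes of ice per millimetre of global mean sea level, which is not stated in the article), one billion tonnes per day corresponds to

\[
1\ \mathrm{Gt/day} \times 365\ \mathrm{days} \approx 365\ \mathrm{Gt/yr} \approx \frac{365\ \mathrm{Gt/yr}}{360\ \mathrm{Gt/mm}} \approx 1\ \mathrm{mm/yr}
\]

of sea level rise, consistent with the rate quoted above.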

The study departs from the current consensus that lakes forming at high elevations on the Greenland ice sheet have only a limited potential to influence the flow of the ice sheet as the climate warms. Whereas the latest report by the Intergovernmental Panel on Climate Change concluded that surface meltwater, although abundant, does not impact the flow of the ice sheet, the study suggests that meltwater delivered to the base of the ice sheet through draining lakes in fact drives episodes of sustained acceleration extending much farther into the interior of the ice sheet than previously thought.

“Transfer of water and heat from surface to the bed can escalate extremely rapidly due to a chain reaction,” said Christoffersen. “In one case we found all but one of 59 observed lakes drained in a single cascading event. Most of the melt lakes drain in this dynamic way.”

Although the delivery of small amounts of meltwater to the base of the ice sheet only increases the ice sheet’s flow locally, the study shows that the response of the ice sheet can intensify through knock-on effects.

When a single lake drains, the ice flow temporarily accelerates along the path taken by water flowing along the bottom of the ice sheet. Lakes situated in stable basins along this path drain when the loss of friction along the bed temporarily transfers forces to the surface of the ice sheet, causing fractures to open up beneath other lakes, which then also drain.

“The transformation of forces within the ice sheet when lakes drain is sudden and dramatic,” said co-author Dr Marion Bougamont, also from the Scott Polar Research Institute. “Lakes that drain in one area produce fractures that cause more lakes to drain elsewhere. It all adds up when you look at the pathways of water underneath the ice.”

The study used high-resolution satellite images to confirm that fractures on the surface of the ice sheet open up when cascading lake drainage occurs. “This aspect of our work is quite worrying,” said Christoffersen. “We found clear evidence of these crevasses at 1,800 metres above sea level and as far as 135 kilometres inland from the ice margin. This is much farther inland than previously considered possible.”

While complete loss of all ice in Greenland remains extremely unlikely this century, the highly dynamic manner in which the ice sheet responds to Earth’s changing climate clearly underscores the urgent need for a global agreement that will reduce the emission of greenhouse gases.

The work was funded by the Natural Environment Research Council (NERC) and the European Research Council (ERC).

Reference: 
Poul Christoffersen et al. ‘Cascading lake drainage on the Greenland Ice Sheet triggered by tensile shock and fracture.’ Nature Communications (2018). DOI: 10.1038/s41467-018-03420-8


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Ultra-White Coating Modelled On Beetle Scales


source: www.cam.ac.uk

Researchers have developed a super-thin, non-toxic, lightweight, edible ultra-white coating that could be used to make brighter paints and coatings, for use in the cosmetic, food or pharmaceutical industries.

These cellulose-based materials have a structure that’s almost like spaghetti, which is how they are able to scatter light so well.

Silvia Vignolini

The material – which is 20 times whiter than paper – is made from non-toxic cellulose and achieves such bright whiteness by mimicking the structure of the ultra-thin scales of certain types of beetle. The results are reported in the journal Advanced Materials.

Bright colours are usually produced using pigments, which absorb certain wavelengths of light and reflect others, which our eyes then perceive as colour.

To appear as white, however, all wavelengths of light need to be reflected with the same efficiency. Most commercially-available white products – such as sun creams, cosmetics and paints – incorporate highly refractive particles (usually titanium dioxide or zinc oxide) to reflect light efficiently. These materials, while considered safe, are not fully sustainable or biocompatible.

In nature, the Cyphochilus beetle, which is native to Southeast Asia, produces its ultra-white colouring not through pigments, but by exploiting the geometry of a dense network of chitin – a molecule which is also found in the shells of molluscs, the exoskeletons of insects and the cell walls of fungi. Chitin has a structure which scatters light extremely efficiently – resulting in ultra-white coatings which are very thin and light.

“White is a very special type of structural colour,” said paper co-author Olimpia Onelli, from Cambridge’s Department of Chemistry. “Other types of structural colour – for example butterfly wings or opals – have a specific pattern in their structure which results in vibrant colour, but to produce white, the structure needs to be as random as possible.”

The Cambridge team, working with researchers from Aalto University in Finland, mimicked the structure of chitin using cellulose, which is non-toxic, abundant, strong and bio-compatible. Using tiny strands of cellulose, or cellulose nanofibrils, they were able to achieve the same ultra-white effect in a flexible membrane.

By using a combination of nanofibrils of varying diameters, the researchers were able to tune the opacity, and therefore the whiteness, of the end material. The membranes made from the thinnest fibres were more transparent, while adding medium and thick fibres resulted in a more opaque membrane. In this way, the researchers were able to fine-tune the geometry of the nanofibrils so that they reflected the most light.

“These cellulose-based materials have a structure that’s almost like spaghetti, which is how they are able to scatter light so well,” said senior author Dr Silvia Vignolini, also from Cambridge’s Department of Chemistry. “We need to get the mix just right: we don’t want it to be too uniform, and we don’t want it to collapse.”

Like the beetle scales, the cellulose membranes are extremely thin: just a few millionths of a metre thick, although the researchers say that even thinner membranes could be produced by further optimising their fabrication process. The membranes scatter light 20 to 30 times more efficiently than paper and could be used to produce next-generation efficient, bright, sustainable and biocompatible white materials.

The research was funded in part by the UK Biotechnology and Biological Sciences Research Council and the European Research Council. The technology has been patented by Cambridge Enterprise, the University’s commercialisation arm.

Reference:
Matti S. Toivonen et al. ‘Anomalous-Diffusion-Assisted Brightness in White Cellulose Nanofibril Membranes.’ Advanced Materials (2018). DOI: 10.1002/adma.201704050


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

A Stray Sumerian Tablet: Unravelling the Story Behind Cambridge University Library’s Oldest Written Object


source: www.cam.ac.uk

The story surrounding the oldest written document at one of the world’s great research libraries has been unravelled in a new film.

This little piece of clay is packed full of information from 4,200 years ago

Nicholas Postgate

A Stray Sumerian Tablet has been published today by Cambridge University Library and focuses on a diminutive clay tablet, written by a scribe in ancient Iraq, some 4,200 years ago. A description of the tablet along with high-resolution images and a 3D model can also be seen on Cambridge Digital Library.

Containing six lines of cuneiform script, and roughly the size of an adult thumb, it was donated to the University Library in 1921 but then lost to sight for many years before its rediscovery in 2016, during research for the Curious Objects exhibition, held as part of the University Library’s 600th anniversary.

The full translation of the laconic text runs as follows: “18 jars of pig fat – Balli. 4 jars of pig fat – Nimgir-ab-lah. Fat dispensed (at ?) the city of Zabala. Ab-kid-kid, the scribe. 4th year 10th month.”

The man named Balli turns up regularly in other texts from the same area during the same period of history, and seems to be an official in charge of a wide range of oils: from pig fat and butter to sesame oil and almond oil.

Professor Nicholas Postgate, a Senior Fellow at the McDonald Institute for Archaeological Research at Cambridge, who has studied the tablet, said: “This little piece of clay is packed full of information from 4,200 years ago. The language is Sumerian, the oldest written language, and there are six professionally written lines of cuneiform script on it.

 

“In the early years of the 20th century, the antiquity market in the west was flooded, disastrously, with thousands of cuneiform tablets which had been ripped out of their original context from sites where illicit robbers were working. These tablets were then distributed across the world, from Moscow to London to Chicago.

“The content is very simple, it mentions a large quantity (22 jars) of lard or pig fat and gives the name of the responsible official (Balli). It states that this fat was dispensed in the city of Zabala. We think these jars were eighty litres each, so that means we’re talking about hundreds of litres of lard.”

Since it was displayed as part of Curious Objects, Professor Postgate has conducted further research on the tablet and plans to publish an academic paper on both the tablet and its text later this year.

“We may be able to reconstruct what’s going on in individual tablets, but we can never reconstruct the physical archaeological context from which they came, so there’s a great loss of information there,” he added.

“Since the 1920s, many other tablets from the same archive have surfaced all over the world and our own small tablet makes its own contribution to the reconstruction of a government office more than 4,000 years ago.”

 


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

One in Ten Stroke Survivors Need More Help With Taking Medication


source: www.cam.ac.uk

Over half of stroke patients require a degree of help with taking medicine and a sizeable minority say they do not receive as much assistance as they need, according to a study published today in the journal BMJ Open.

Because of the risk of a second stroke, it’s important that stroke survivors take their medication, but our study has shown that this can present challenges

Anna De Simoni

According to the Stroke Association, as many as four in ten people who have had a stroke go on to have another one within ten years. As a second stroke carries a greater risk of disability and death than first-time strokes, it is important that survivors take medicine daily to lower their risk. There are around 1.2 million stroke survivors in the UK, and at least a third suffer from severe impairments, potentially making adherence to their medicine difficult.

Half of survivors of stroke are dependent on others for everyday activities, though the proportion dependent on others for medicine taking or needing more practical help with tablets is not known.

To examine the practical support stroke survivors living in the community need and receive with taking their medicines, researchers at the University of Cambridge and Queen Mary University of London carried out a postal questionnaire study. The researchers developed the questionnaire together with stroke survivors and caregivers. The questionnaire was completed by 600 participants across 18 GP practices in the UK.

More than half (56%) of respondents needed help with taking medication. This included help with prescriptions and collection of medicines (50%), getting medicines out of the packaging (28%), being reminded to take medicines (36%), swallowing medicines (20%) and checking that medicines have been taken (34%). Being dependent on others was linked to experiencing more unmet needs with daily medicine taking.

Around one in ten (11%) of respondents answered yes to the question “Do you feel you need more help?” The most commonly reported areas where respondents said they needed more assistance were being reminded to take medicines, dealing with prescriptions and collection of medicines, and getting medicines out of the packaging. Around one in three (35%) of respondents said they had missed taking their medicine in the previous 30 days.

Stroke survivors taking a higher number of daily medicines and experiencing a greater number of unmet needs with practical aspects of medicine-taking were more likely to miss medications.

Interestingly, the researchers found that younger stroke survivors were more likely to miss their medicines, possibly because they are less likely to receive help from a caregiver.

“Because of the risk of a second stroke, it’s important that stroke survivors take their medication, but our study has shown that this can present challenges,” says Dr Anna De Simoni from the University of Cambridge and Queen Mary University of London. “In the majority of cases, they receive the help they need, but there is still a sizeable minority who don’t receive all the assistance they need.”

James Jamison at the Department of Public Health and Primary Care, University of Cambridge, who led the study as part of his PhD, adds: “Our study has shown us some of the barriers that people face to taking their medication regularly. We also learned that stroke survivors who are dependent on others are most likely to need more assistance than they currently receive.

“Our response rate was relatively low – just over one in three – so we need more research to find out if what we’ve heard from our respondents is widespread among stroke survivors. If so, this will have implications for the care provided.”

The team point to the need to develop new interventions focused on the practicalities of taking medicines and aimed at improving stroke survivors’ adherence to treatment. “Advances in technology have the potential to help improve adherence, such as electronic devices prompting medication taking times,” says Jamison. “Efforts to improve medication taking among survivors of stroke using technology are already underway and have shown promise.”

The research was supported by the Royal College of General Practitioners, National Institute for Health Research (NIHR), the Stroke Association and the British Heart Foundation.

Reference
Jamison J, Ayerbe L, Di Tanna GL, Sutton S, Mant J, De Simoni A. ‘Evaluating practical support stroke survivors get with medicines and unmet needs in primary care: a survey.’ BMJ Open (2018). DOI: 10.1136/bmjopen-2017-019874


Researcher profile: James Jamison

James’ research seeks to understand why people sometimes do not take the medications prescribed by their GP – and then to use this to inform interventions aimed at improving their medication taking practices.

His day-to-day activities are very varied, he says. “They can involve anything from the development of research proposals, liaising with GP practices and pharmacies in the East of England to set up research studies, training health care professionals in research procedures, conducting interviews with patients, collecting questionnaire data or writing up research for publications in health care journals.”

However, the most rewarding and interesting part is coming face-to-face with stroke survivors and their caregivers to talk about their condition and the daily challenges they face, he adds.

“Working at Cambridge provides the opportunity to be part of a leading department conducting research in primary care and offers the potential to work with esteemed colleagues in the field,” he says. “The opportunity to deliver high quality research outputs and undertake collaborations will hopefully help further my career as a successful health care researcher.”


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Study Finds That Genes Play a Role In Empathy


source: www.cam.ac.uk

A new study published today suggests that how empathic we are is not just a result of our upbringing and experience but also partly a result of our genes.

This is an important step towards understanding the small but important role that genetics plays in empathy

Varun Warrier

Empathy has two parts: the ability to recognize another person’s thoughts and feelings, and the ability to respond with an appropriate emotion to someone else’s thoughts and feelings. The first part is called ‘cognitive empathy’ and the second part ‘affective empathy’.

Fifteen years ago, a team of scientists at the University of Cambridge developed the Empathy Quotient (EQ), a brief self-report measure of empathy. The EQ measures both parts of empathy.

Previous research showed that some of us are more empathetic than others, and that on average, women are slightly more empathetic than men. It also showed that, on average, autistic people score lower on the EQ, and that this was because they struggle with cognitive empathy, even though their affective empathy may be intact.

In a new study published in the journal Translational Psychiatry, the Cambridge team, working with the genetics company 23andMe and a team of international scientists, report the results of the largest genetic study of empathy using information from more than 46,000 23andMe customers. The customers all completed the EQ online and provided a saliva sample for genetic analysis.

The study was led by Varun Warrier, a Cambridge PhD student, and Professors Simon Baron-Cohen, Director of the Autism Research Centre at Cambridge University, Thomas Bourgeron, of the University Paris Diderot and the Institut Pasteur, and David Hinds, Principal Scientist at 23andMe.

The new study has three important results. First, it found that how empathetic we are is partly due to genetics. Indeed, a tenth of this variation is due to genetic factors. This confirms previous research examining empathy in identical versus non-identical twins.

Second, the new study confirmed that women are on average more empathetic than men. However, this difference is not due to our DNA as there were no differences in the genes that contribute to empathy in men and women.

This implies that the sex difference in empathy is the result of other non-genetic biological factors, such as prenatal hormone influences, or non-biological factors such as socialisation, both of which also differ between the sexes.

Finally, the new study found that genetic variants associated with lower empathy are also associated with higher risk for autism.

Varun Warrier said: “This is an important step towards understanding the small but important role that genetics plays in empathy. But keep in mind that only a tenth of individual differences in empathy in the population are due to genetics. It will be equally important to understand the non-genetic factors that explain the other 90%.”

Professor Thomas Bourgeron added: “This new study demonstrates a role for genes in empathy, but we have not yet identified the specific genes that are involved. Our next step is to gather larger samples to replicate these findings, and to pin-point the precise biological pathways associated with individual differences in empathy.”

Dr David Hinds said: “These are the latest findings from a series of studies that 23andMe have collaborated on with researchers at Cambridge. Together these are providing exciting new insights into the genetic influences underlying human behaviour.”

Professor Simon Baron-Cohen added: “Finding that even a fraction of why we differ in empathy is due to genetic factors helps us understand people such as those with autism who struggle to imagine another person’s thoughts and feelings. This can give rise to disability no less challenging than other kinds of disability, such as dyslexia or visual impairment. We as a society need to support those with disabilities, with novel teaching methods, work-arounds, or reasonable adjustments, to promote inclusion.”

This study also benefitted from support from the Medical Research Council, the Wellcome Trust, the Institut Pasteur, the CNRS, the University Paris Diderot, the Bettencourt-Schueller Foundation, the Cambridge Commonwealth Trust, and St John’s College, Cambridge.

Reference
V Warrier, R Toro, B Chakrabarti, iPSYCH-Broad Autism Group, J Grove, AD Borglum, D Hinds, T Bourgeron and S Baron-Cohen. ‘Genome-wide analyses of self-reported empathy: correlations with autism, schizophrenia, and anorexia nervosa.’ Translational Psychiatry. DOI: 10.1038/s41398-017-0082-6


Researcher profile: Varun Warrier

Varun Warrier is a PhD student at the Autism Research Centre, where he studies the genetics of autism and related traits. He moved to Cambridge in 2013 from India because of the Centre’s world-leading reputation.

There are several key challenges in the field, he says. “First, we have identified only a fraction of the genes associated with autism. Second, no two autistic people are alike. Third, within the spectrum autistic people have different strengths and difficulties. Finally, those with a clinical diagnosis blend seamlessly into those in the population who don’t have a diagnosis but simply have a lot of autistic traits. We all have some autistic traits – this spectrum runs right through the population on a bell curve.”

Although much of his work is computational, developing statistical tools to interrogate complex datasets that will enable him to answer biological questions, he also gets to meet many people with autism. “When I meet autistic people, I truly understand what’s often said – no two autistic people are alike.”

Warrier hopes his research will lead to a better understanding of the biology of autism, and that this will enable quicker and more accurate diagnosis. “But that’s only one part of the challenge,” he says. “Understanding the biology has its limits, and I hope that, in parallel, there will be better social policies to support autistic people.”

Cambridge is an exciting place to be a researcher, he says. “In Cambridge, there’s always a local expert, so if you have a particular problem there usually is someone who can help you out. People here are not just thinking about what can be done to address the problems of today; they are anticipating problems that we will face in 20 years’ time, and are working to solve those.”


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Method To Predict Drug Stability Could Lead To More Effective Medicines


source: www.cam.ac.uk

Researchers from the UK and Denmark have developed a new method to predict the physical stability of drug candidates, which could help with the development of new and more effective medicines for patients. The technology has been licensed to Cambridge spin-out company TeraView, who are developing it for use in the pharmaceutical industry in order to make medicines that are more easily released in the body.

This is a very old problem, and for pharmaceutical companies, it’s often too big of a risk.

Axel Zeitler

The researchers, from the Universities of Cambridge and Copenhagen, have developed a new method to solve an old problem: how to predict when and how a solid will crystallise. Using optical and mechanical measuring techniques, they found that localised movement of molecules within a solid is ultimately responsible for crystallisation.

This solution to the problem was first proposed in 1969, but it has only now become possible to prove the hypothesis. The results are reported in two papers in Physical Chemistry Chemical Physics and The Journal of Physical Chemistry B.

Solids behave differently depending on whether their molecular structure is ordered (crystal) or disordered (glass). Chemically, the crystal and glass forms of a solid are exactly the same, but they have different properties.

One of the desirable properties of glasses is that they are more soluble in water, which is especially useful for medical applications. To be effective, medicines need to be water-soluble, so that they can be dissolved within the body and reach their target via the bloodstream.

“Most of the medicines in use today are in the crystal form, which means that they need extra energy to dissolve in the body before they enter the bloodstream,” said study co-author Professor Axel Zeitler from Cambridge’s Department of Chemical Engineering & Biotechnology. “Molecules in the glass form are more readily absorbed by the body because they can dissolve more easily, and many glasses that can cure disease have been discovered in the past 20 years, but they’re not being made into medicines because they’re not stable enough.”

After a certain time, all glasses will undergo spontaneous crystallisation, at which point the molecules will not only lose their disordered structure, but they will also lose the properties that made them effective in the first place. A long-standing problem for scientists has been how to predict when crystallisation will occur, which, if solved, would enable the widespread practical application of glasses.

“This is a very old problem,” said Zeitler. “And for pharmaceutical companies, it’s often too big of a risk. If they develop a drug based on the glass form of a molecule and it crystallises, they will not only have lost a potentially effective medicine, but they would have to do a massive recall.”

In order to determine when and how solids will crystallise, most researchers had focused on the glass transition temperature, which is the temperature above which molecules can move in the solid more freely and which can be measured easily. Using a technique called dynamic mechanical analysis as well as terahertz spectroscopy, Zeitler and his colleagues showed that it is not the glass transition temperature but the molecular motions that occur above a lower temperature threshold that are responsible for crystallisation.

These motions are constrained by localised forces in the molecular environment and, in contrast to the relatively large motions that happen above the glass transition temperature, the molecular motions above the lower temperature threshold are much subtler. While the localised movement is tricky to measure, it is a key part of the crystallisation process.

Given the advance in measurement techniques developed by the Cambridge and Copenhagen teams, drug molecules that were previously discarded at the pre-clinical stage can now be tested to determine whether they can be brought to the market in a stable glass form that overcomes the solubility limitations of the crystal form.

“If we use our technique to screen molecules that were previously discarded, and we find that the temperature associated with the onset of the localised motion is sufficiently high, we would have high confidence that the material will not crystallise following manufacture,” said Zeitler. “We could use the calibration curve that we describe in the second paper to predict the length of time it will take the material to crystallise.”

The research has been patented and is being commercialised by Cambridge Enterprise, the University’s commercialisation arm. The research was funded by the Engineering and Physical Sciences Research Council (EPSRC).

References:
Michael T. Ruggiero et al. ‘The significance of the amorphous potential energy landscape for dictating glassy dynamics and driving solid-state crystallisation’ Physical Chemistry Chemical Physics, 19, 30039-30047 (2017). DOI: 10.1039/c7cp06664c

Eric Ofosu Kissi et al. ‘The glass transition temperature of the β-relaxation as the single predictive parameter for recrystallization of neat amorphous drugs.’ The Journal of Physical Chemistry B (2018). DOI: 10.1021/acs.jpcb.7b10105


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Rare Mineral Discovered In Plants For First Time


source: www.cam.ac.uk

A rare mineral with potential industrial and medical applications has been discovered on alpine plants at Cambridge University Botanic Garden.

Biochemists are working to synthetically manufacture vaterite as it has potential for use in drug delivery, but it is not easy to make

Raymond Wightman

Scientists at Sainsbury Laboratory Cambridge University have found that the mineral vaterite, a form (polymorph) of calcium carbonate, is a dominant component of the protective silvery-white crust that forms on the leaves of a number of alpine plants, which are part of the Garden’s national collection of European Saxifraga species.

Naturally occurring vaterite is rarely found on Earth. Small amounts of vaterite crystals have been found in some sea and freshwater crustaceans, bird eggs, the inner ears of salmon, meteorites and rocks. This is the first time that the rare and unstable mineral has been found in such a large quantity and the first time it has been found to be associated with plants.

The discovery was made through a University of Cambridge collaboration between the Sainsbury Laboratory Cambridge University microscopy facility and Cambridge University Botanic Garden, as part of an ongoing research project that is probing the inner workings of plants in the Garden using new microscopy technologies. The research findings have been published in the latest edition of Flora.

The laboratory’s Microscopy Core Facility Manager, Dr Raymond Wightman, said vaterite was of interest to the pharmaceutical industry: “Biochemists are working to synthetically manufacture vaterite as it has potential for use in drug delivery, but it is not easy to make. Vaterite has special properties that make it a potentially superior carrier for medications due to its high loading capacity, high uptake by cells and its solubility properties that enable it to deliver a sustained and targeted release of therapeutic medicines to patients. For instance, vaterite nanoparticles loaded with anti-cancer drugs appear to offload the drug slowly only at sites of cancers and therefore limit the negative side-effects of the drug.”

Other potential uses of vaterite include improving the cements used in orthopaedic surgery and, as an industrial application, improving the quality of paper for inkjet printing by reducing the lateral spread of ink.

Dr Wightman said vaterite was often associated with outer space and had been detected in planetary objects in the Solar System and meteorites: “Vaterite is not very stable in the Earth’s humid atmosphere as it often reverts to more common forms of calcium carbonate, such as calcite. This makes it even more remarkable that we have found vaterite in such large quantities on the surface of plant leaves.”

Botanic Garden Alpine and Woodland Supervisor, Paul Aston, and colleague Simon Wallis, are pioneering studies into the cellular-level structures of these alpine plants with Dr Wightman. Mr Wallis, who is also Chairman of the international Saxifrage Society, said: “We started by sampling as wide a range of saxifrage species as possible from our collection. The microscope analysis of the plant material came up with the exciting discovery that some plants were exuding vaterite from “chalk glands” (hydathodes) on the margins of their leaves.

“We then noticed a pattern emerging. The plants producing vaterite were from the section of Saxifraga called Porphyrion. Further to this, it appears that although many species in this section produced vaterite along with calcite, there was at least one species, Saxifraga sempervivum, that was producing pure vaterite.”

Dr Wightman said two new pieces of equipment at the microscopy facility were being used to reveal the inner workings of the plants and uncovering cellular structures never before described: “Our cryo-scanning electron microscope allows us to view, in great detail, cells and plant tissues in their “native” fully hydrated state by freezing samples quickly and maintaining cold under a vacuum for electron microscopy.

“We are also using a Raman microscope to identify and map molecules. In this case, the microscope not only identified signatures corresponding to calcium carbonate as forming the crust, but was also able to differentiate between the calcite and vaterite forms when it was present as a mixture while still attached to the leaf surface.”

So why do these species produce a calcium carbonate crystal crust and why are some crusts calcite and others vaterite?

The Cambridge University Botanic Garden team is hoping to answer this question through further analysis of the leaf anatomy of the Saxifraga group. They suspect that vaterite may be present on more plant species, but that the unstable mineral is being converted to calcite when exposed to wind and rain. This may also be the reason why some plants have both vaterite and calcite present at the same time.

The microscopy research has also turned up some novel cell structures. Mr Aston added: “As well as producing vaterite, Saxifraga scardica has a special tissue surrounding the leaf edge that appears to deflect light from the edge into the leaf. The cells appear to be producing novel cell wall structures to achieve this deflection. This may be to help the plant to collect more light, particularly if it is growing in partly shaded environments.”

The team believes the novel cell wall structures of Saxifrages could one day help inform the manufacture of new bio-inspired optical devices and photonic structures for industrial applications such as communication cables and fibre optics.

Mr Aston said these initial discoveries were just the start: “We expect that there may be other plants that also produce vaterite and have special leaf anatomies that have evolved in harsh environments like alpine regions. The next species we will be looking to study is Saxifraga lolaensis, which has super tiny leaves with an organisation of cell types not seen in a leaf before, and which we think will reveal more fascinating secrets about the complexity of plants.”

There is a risk that some of these tiny but amazing alpine plants could potentially disappear due to climate change, damage from alpine recreation sports and over-collecting. There is still much to learn about these plants, but the collaborative work of the Sainsbury Laboratory and Cambridge University Botanic Garden team is revealing fascinating insights into leaf anatomy and biochemistry as well as demonstrating the potential for Saxifrages to supply a new range of biomaterials.

Story by Kathy Grube, Communications Manager, Sainsbury Laboratory.


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Identification of Brain Region Responsible For Alleviating Pain Could Lead To Development of Opioid Alternatives


source: www.cam.ac.uk

Researchers from the UK and Japan have identified how the brain’s natural painkilling system could be used as a possible alternative to opioids for the effective relief of chronic pain, which affects as many as one in three people at some point in their lives.

Pain can actually help us recover by removing our drive to do unnecessary things – in a sense, this can be considered ‘healthy pain’.

Ben Seymour

The team, led by the University of Cambridge, have pinpointed an area of the brain that is important for endogenous analgesia – the brain’s intrinsic pain relief system. Their results, published in the open access journal eLife, could lead to the development of pain treatments that activate the painkilling system by stimulating this area of the brain, but without the dangerous side-effects of opioids.

Opioid drugs such as oxycodone, hydrocodone and fentanyl hijack the endogenous analgesia system, which is what makes them such effective painkillers. However, they are also highly addictive, which has led to the opioid crisis in the United States, where drug overdose is now the leading cause of death for those under 50, with opioid overdoses representing two-thirds of those deaths.

“We’re trying to understand exactly what the endogenous analgesia system is: why we have it, how it works and where it is controlled in the brain,” said Dr Ben Seymour of Cambridge’s Department of Engineering, who led the research. “If we can figure this out, it could lead to treatments that are much more selective in terms of how they treat pain.”

Pain, while unpleasant, evolved to serve an important survival function. After an injury, for instance, the persistent pain we feel saps our motivation, and so forces us towards rest and recuperation which allows the body to use as much energy as possible for healing.

“Pain can actually help us recover by removing our drive to do unnecessary things – in a sense, this can be considered ‘healthy pain’,” said Seymour. “So why might the brain want to turn down the pain signal sometimes?”

Seymour and his colleagues thought that sometimes this ‘healthy pain’ could be a problem, especially if we could actively do something that might help – such as try and find a way to cool a burn.

In these situations, the brain might activate the pain-killing system to actively look for relief. To prove this, and to try and identify where in the brain this system was activated, the team designed a pair of experiments using brain scanning technology.

In the first experiment, the researchers attached a metal probe to the arms of a group of healthy volunteers and heated it up to a level that was painful, but not enough to physically burn them. The volunteers then played a type of gambling game where they had to find which button on a small keypad cooled down the probe. The level of difficulty was varied over the course of the experiments – sometimes it was easy to turn the probe off, and sometimes it was difficult. Throughout the task, the volunteers frequently rated their pain, and the researchers constantly monitored their brain activity.

The results showed that the level of pain the volunteers experienced was related to how much information there was to learn in the task. When the subjects were actively trying to work out which button they should press, pain was reduced. But when the subjects knew which button to press, it wasn’t. The researchers found that the brain was actually computing the benefits of actively looking for and remembering how they got relief, and using this to control the level of pain.

Knowing what this signal should look like, the researchers then searched the brain to see where it was being used. The second experiment identified the signal in a single region of the prefrontal cortex, called the pregenual cingulate cortex.

“These results build a picture of why and how the brain decides to turn off pain in certain circumstances, and identify the pregenual cingulate cortex as a critical ‘decision centre’ controlling pain in the brain,” said Seymour.

This decision centre is a key place to focus future research efforts. In particular, the researchers are now trying to understand what the inputs are to this brain region, if it is stimulated by opioid drugs, what other chemical messenger systems it uses, and how it could be turned on as a treatment for patients with chronic pain.

Reference
Suyi Zhang et al. ‘The control of tonic pain by active relief learning.’ eLife (2018). DOI: 10.7554/eLife.31949


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

New Evidence Suggests Nutritional Labelling On Menus May Reduce Our Calorie Intake


source: www.cam.ac.uk

New evidence published in the Cochrane Library today shows that adding calorie labels to menus and next to food in restaurants, coffee shops and cafeterias, could reduce the calories that people consume, although the quality of evidence is low.

There is no ‘magic bullet’ to solve the obesity problem, so while calorie labelling may help, other measures to reduce calorie intake are also needed

Theresa Marteau

Eating too many calories contributes to people becoming overweight and increases the risks of heart disease, diabetes and many cancers, which are among the leading causes of poor health and premature death.

Several studies have looked at whether putting nutritional labels on food and non-alcoholic drinks might have an impact on their purchasing or consumption, but their findings have been mixed. Now, a team of Cochrane researchers has brought together the results of studies evaluating the effects of nutritional labels on purchasing and consumption in a systematic review.

The team reviewed the evidence to establish whether and by how much nutritional labels on food or non-alcoholic drinks affect the amount of food or drink people choose, buy, eat or drink. They considered only studies of labels that included information on the nutritional or calorie content of the food or drink, and excluded labels that used only logos (e.g. ticks or stars) or interpretative colours (e.g. ‘traffic light’ labelling) to indicate healthier and unhealthier foods. In total, the researchers included evidence from 28 studies, of which 11 assessed the impact of nutritional labelling on purchasing and 17 assessed the impact of labelling on consumption.

The team combined results from three studies where calorie labels were added to menus or put next to food in restaurants, coffee shops and cafeterias. For a typical lunch with an intake of 600 calories, such as a slice of pizza and a soft drink, labelling may reduce the energy content of food purchased by about 8% (48 calories). The authors judged the studies to have potential flaws that could have biased the results.
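For reference, the quoted saving is simply 8% of the 600-calorie example meal:

\[
0.08 \times 600\ \mathrm{kcal} = 48\ \mathrm{kcal}.
\]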

Combining results from eight studies carried out in artificial or laboratory settings could not show with certainty whether adding labels would have an impact on calories consumed. However, when the studies with potential flaws in their methods were removed, the three remaining studies showed that such labels could reduce calories consumed by about 12% per meal. The team noted that there was still some uncertainty around this effect and that further well conducted studies are needed to establish the size of the effect with more precision.

The Review’s lead author, Professor Theresa Marteau, Director of the Behaviour and Health Research Unit at the University of Cambridge, UK, says: “This evidence suggests that using nutritional labelling could help reduce calorie intake and make a useful impact as part of a wider set of measures aimed at tackling obesity.” She adds: “There is no ‘magic bullet’ to solve the obesity problem, so while calorie labelling may help, other measures to reduce calorie intake are also needed.”

Study author Professor Susan Jebb, from the University of Oxford, commented: “Some outlets are already providing calorie information to help customers make informed choices about what to purchase. This review should provide policymakers with the confidence to introduce measures to encourage or even require calorie labelling on menus and next to food and non-alcoholic drinks in coffee shops, cafeterias and restaurants.”

The researchers were unable to reach firm conclusions about the effect of labelling on calories purchased from grocery stores or vending machines because of the limited evidence available. They added that future research would also benefit from a more diverse consideration of the possible wider impacts of nutritional labelling, including impacts on those producing and selling food, as well as on consumers.

Professor Ian Caterson, President of the World Obesity Federation, commented: “Energy labelling has been shown to be effective: people see it and read it and there is a resulting decrease in calories purchased. This is very useful to know – combined with a suite of other interventions, such changes will help slow and eventually turn around the continuing rise in body weight.”

Reference
Crockett RA, et al. Nutritional labelling for healthier food or non-alcoholic drink purchasing and consumption. Cochrane Database of Systematic Reviews 2018, Issue 2. Art. No.: CD009315.


Making sense of our unhealthy behaviour

Professor Marteau will be speaking at the 2018 Cambridge Science Festival about why we sometimes make ‘unhealthy’ choices and how we might encourage people to change.

Friday 16 March: 5:30pm – 6:30pm

Babbage Lecture Theatre, New Museums Site, Downing Street, CB2 3RS


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Scientists Link Genes To Brain Anatomy In Autism


source: www.cam.ac.uk

A team of scientists at the University of Cambridge has discovered that specific genes are linked to individual differences in brain anatomy in autistic children.

This takes us one step closer to understanding why the brains of people with and without autism may differ from one another

Richard Bethlehem

Previous studies have reported differences in brain structure of autistic individuals. However, until now, scientists have not known which genes are linked to these differences.

The team at the Autism Research Centre analysed magnetic resonance imaging (MRI) brain scans from more than 150 autistic children and compared them with MRI scans from similarly aged children who did not have autism. They looked at variation in the thickness of the cortex, the outermost layer of the brain, and linked this to gene activity in the brain.

They discovered a set of genes linked to differences in the thickness of the cortex between autistic and non-autistic children. Many of these genes are involved in how brain cells (or neurons) communicate with each other. Interestingly, many of the genes identified in this study have been shown to have lower gene activity at the molecular level in post-mortem brain tissue samples from autistic individuals.

The study was led by two postdoctoral scientists, Dr Rafael Romero-Garcia and Dr Richard Bethlehem, and Varun Warrier, a PhD student. The study is published in the journal Molecular Psychiatry and provides the first evidence linking differences in the autistic brain to genes with atypical gene activity in autistic brains.

Dr Richard Bethlehem said: “This takes us one step closer to understanding why the brains of people with and without autism may differ from one another. We have long known that autism itself is genetic, but by combining these different data sets (brain imaging and genetics) we can now identify more precisely which genes are linked to how the autistic brain may differ. In essence, we are beginning to link molecular and macroscopic levels of analysis to better understand the diversity and complexity of autism.”

Varun Warrier added: “We now need to confirm these results using new genetic and brain scan data so as to understand how exactly gene activity and thickness of the cortex are linked in autism.”

“The identification of genes linked to brain changes in autism is just the first step,” said Dr Rafael Romero-Garcia. “These promising findings reveal how important multidisciplinary approaches are if we want to better understand the molecular mechanisms underlying autism. The complexity of this condition requires a joint effort from a wide scientific community.”

The research was supported by the Medical Research Council, the Autism Research Trust, the Wellcome Trust, and the Templeton World Charity Foundation, Inc.

Reference
Romero-Garcia, R et al. Synaptic and transcriptionally downregulated genes are associated with cortical thickness differences in autism. Molecular Psychiatry; 26 Feb; DOI: 10.1038/s41380-018-0023-7


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Young Children Use Physics, Not Previous Rewards, To Learn About Tools

Young children use physics, not previous rewards, to learn about tools

source: www.cam.ac.uk

Children as young as seven apply basic laws of physics to problem-solving, rather than learning from what has previously been rewarded, suggests new research from the University of Cambridge.

Remarkably, children begin to emphasise information about physics over information about previous rewards from as young as seven years of age, even when these two types of information are in direct conflict

Lucy Cheke

The findings of the study, based on the Aesop’s fable The Crow and the Pitcher, help solve a debate about whether children learning to use tools are genuinely learning about physical causation or are just driven by what action previously led to a treat.

Learning about causality – about the physical rules that govern the world around us – is a crucial part of our cognitive development. From our observations and the outcome of our own actions, we build an idea – a model – of which tools are functional for particular jobs, and which are not.

However, the information we receive isn’t always as straightforward as it should be. Sometimes outside influences mean that things that should work, don’t. Similarly, sometimes things that shouldn’t work, do.

Dr Lucy Cheke from the Department of Psychology at the University of Cambridge says: “Imagine a situation where someone is learning about hammers. There are two hammers that they are trying out – a metal one and an inflatable one. Normally, the metal hammer would successfully drive a nail into a plank of wood, while the inflatable hammer would bounce off harmlessly.

“But what if your only experience of these two hammers was trying to use the metal hammer and missing the nail, but using the inflatable hammer to successfully push the nail into a large pre-drilled hole? If you’re then presented with another nail, which tool would you choose to use? The answer depends on what type of information you have taken from your learning experience.”

In this situation, explains Cheke, a learner concerned with the outcome (a ‘reward’ learner) would learn that the inflatable hammer was the successful tool and opt to use it for later hammering. However, a learner concerned with physical forces (a ‘functionality’ learner) would learn that the metal hammer produced a percussive force, albeit in the wrong place, and that the inflatable hammer did not, and would therefore opt for the metal hammer.

Now, in a study published in the open access journal PLOS ONE, Dr Cheke and colleagues investigated what kind of information children extract from situations where the relevant physical characteristics of a potential tool are observable, but often at odds with whether the use of that tool in practice achieved the desired goal.

The researchers presented children aged 4-11 with a task in which they had to retrieve a floating token to earn sticker rewards. Each time, the children were presented with a container of water and a set of tools to use to raise the water level. The experiment is based on one of the most famous of Aesop’s fables, in which a thirsty crow drops stones into a pitcher to get to the water.

In this test, some of the tools were ‘functional’ and some ‘non-functional’. Functional tools were those that, if dropped into a standard container, would sink, raising the water level and bringing the token within reach; non-functional tools were those that would not do so, for example because they floated.

However, sometimes the children used functional tools to attempt to raise the level in a leaking container – in this context, the water would never rise high enough to bring the token within reach, no matter how functional the tool used.

At other times, the children were successful in retrieving the reward despite using a non-functional tool; for example, when using a water container that self-fills through an inlet pipe, it doesn’t matter whether the tool is functional as the water is rising anyway.

After these learning sessions, the researchers presented the children with a ‘standard’ water container and a series of choices between different tools. From the pattern of these choices the researchers could calculate what type of information was most influential on children’s decision-making: reward or function.
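
As a loose illustration of how such choices might be scored, the short Python sketch below tallies, for younger and older children, how often a choice on a hypothetical ‘conflict’ trial follows the functional tool rather than the previously rewarded one. The ages and responses are invented; the published analysis is more sophisticated.

    import numpy as np

    # Hypothetical conflict trials: the tool favoured by reward history differs from
    # the tool favoured by physics. chose_functional = 1 means the child picked the
    # functional tool, 0 means the previously rewarded one.
    ages             = np.array([4, 4, 5, 5, 6, 7, 7, 8, 9, 10, 11, 11])
    chose_functional = np.array([0, 1, 0, 1, 0, 1, 1, 1, 1, 1, 1, 1])

    # Proportion of function-consistent choices in younger and older children
    for lo, hi in [(4, 6), (7, 11)]:
        in_band = (ages >= lo) & (ages <= hi)
        share = chose_functional[in_band].mean()
        print(f"ages {lo}-{hi}: {share:.0%} of conflict-trial choices followed functionality")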

“A child doesn’t have to know the precise rules of physics that allow a tool to work to have a feeling of whether or not it should work,” says Elsa Loissel, co-first author of the study. “So, we can look at whether a child’s decision making is guided by principles of physics without requiring them to explicitly understand the physics itself.

“We expected older children, who might have a rudimentary understanding of physical forces, to choose according to function, while younger children would be expected to use the simpler learning approach and base their decisions on what had been previously rewarded,” adds co-first author Dr Cheke. “But this wasn’t what we found.”

In fact, the researchers found that information about reward was never a reliable predictor of children’s choices. Instead, the influence of functionality information increased with age – by the age of seven, it was the dominant influence in their decision making.

“This suggests that, remarkably, children begin to emphasise information about physics over information about previous rewards from as young as seven years of age, even when these two types of information are in direct conflict.”

This research was funded by the European Research Council under the European Union’s Seventh Framework Programme.

Reference
Elsa Loissel, Lucy Cheke & Nicola Clayton. Exploring the Relative Contributions of Reward-History and Functionality Information to Children’s Acquisition of The Aesop’s Fable Task. PLOS ONE; 23 Feb 2018; DOI: 10.1371/journal.pone.0193264


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Study in Mice Suggests Personalised Stem Cell Treatment May Offer Relief For Progressive MS

Study in mice suggests personalised stem cell treatment may offer relief for progressive MS

source: www.cam.ac.uk

Scientists have shown in mice that skin cells re-programmed into brain stem cells, transplanted into the central nervous system, help reduce inflammation and may be able to help repair damage caused by multiple sclerosis (MS).

Our mouse study suggests that using a patient’s reprogrammed cells could provide a route to personalised treatment of chronic inflammatory diseases, including progressive forms of MS

Luca Peruzzotti-Jametti

The study, led by researchers at the University of Cambridge, is a step towards developing personalised treatments based on a patient’s own skin cells for diseases of the central nervous system (CNS).

In MS, the body’s own immune system attacks and damages myelin, the protective sheath around nerve fibres, causing disruption to messages sent around the brain and spinal cord. Symptoms are unpredictable and include problems with mobility and balance, pain, and severe fatigue.

Key immune cells involved in causing this damage are macrophages (literally ‘big eaters’), which ordinarily serve to attack and rid the body of unwanted intruders. A particular type of macrophage known as microglia are found throughout the brain and spinal cord – in progressive forms of MS, they attack the CNS, causing chronic inflammation and damage to nerve cells.

Recent advances have raised expectations that diseases of the CNS may be improved by the use of stem cell therapies. Stem cells are the body’s ‘master cells’, which can develop into almost any type of cell within the body. Previous work from the Cambridge team has shown that transplanting neural stem cells (NSCs) – stem cells that are part-way to developing into nerve cells – reduces inflammation and can help the injured CNS heal.

However, even if such a therapy could be developed, it would be hindered by the fact that such NSCs are sourced from embryos and therefore cannot be obtained in large enough quantities. Also, there is a risk that the body will see them as an alien invader, triggering an immune response to destroy them.

A possible solution to this problem would be the use of so-called ‘induced neural stem cells (iNSCs)’ – these cells can be generated by taking an adult’s skin cells and ‘re-programming’ them back to become neural stem cells. As these iNSCs would be the patient’s own, they are less likely to trigger an immune response.

Now, in research published in the journal Cell Stem Cell, researchers at the University of Cambridge have shown that iNSCs may be a viable option to repairing some of the damage caused by MS.

Using mice that had been manipulated to develop MS, the researchers discovered that chronic MS leads to significantly increased levels of succinate, a small metabolite that sends signals to macrophages and microglia, tricking them into causing inflammation, but only in cerebrospinal fluid, not in the peripheral blood.

Transplanting NSCs and iNSCs directly into the cerebrospinal fluid reduces the amount of succinate, reprogramming the macrophages and microglia – in essence, turning ‘bad’ immune cells ‘good’. This leads to a decrease in inflammation and subsequent secondary damage to the brain and spinal cord.

“Our mouse study suggests that using a patient’s reprogrammed cells could provide a route to personalised treatment of chronic inflammatory diseases, including progressive forms of MS,” says Dr Stefano Pluchino, lead author of the study from the Department of Clinical Neurosciences at the University of Cambridge.

“This is particularly promising as these cells should be more readily obtainable than conventional neural stem cells and would not carry the risk of an adverse immune response.”

The research team was led by Dr Pluchino, together with Dr Christian Frezza from the MRC Cancer Unit at the University of Cambridge, and brought together researchers from several university departments.

Dr Luca Peruzzotti-Jametti, the first author of the study and a Wellcome Trust Research Training Fellow, says: “We made this discovery by bringing together researchers from diverse fields including regenerative medicine, cancer, mitochondrial biology, inflammation and stroke and cellular reprogramming. Without this multidisciplinary collaboration, many of these insights would not have been possible.”

The research was funded by Wellcome, European Research Council, Medical Research Council, Italian Multiple Sclerosis Association, Congressionally-Directed Medical Research Programs, the Evelyn Trust and the Bascule Charitable Trust.

Reference
Peruzzotti-Jametti, L et al. Macrophage-derived extracellular succinate licenses neural stem cells to suppress chronic neuroinflammation. Cell Stem Cell; 2018; 22: 1-14; DOI: 10.1016/j.stem.2018.01.20


Researcher profile: Dr Luca Peruzzotti-Jametti

It isn’t every day that you find yourself invited to play croquet with a Nobel laureate, but then Cambridge isn’t every university, as Dr Luca Peruzzotti-Jametti discovered when he was fortunate enough to be invited to the house of Professor Sir John Gurdon.

“It was an honour to meet a Nobel laureate who has influenced my studies so much, and to meet the man behind the science,” he says. “I was moved by how kind he is and extremely impressed by his endless passion for science.”

Dr Peruzzotti-Jametti began his career studying medicine at the University Vita-Salute San Raffaele, Milan. His career took him across Europe, to Switzerland, Denmark, Sweden and now to Cambridge. After completing a PhD in Clinical Neurosciences here, he is now a Wellcome Trust Research Training Fellow.

His work focuses on multiple sclerosis (MS), an autoimmune disease that affects around 100,000 people in the UK alone. Although several therapies help during the initial (or ‘relapsing remitting’) phase of MS, the majority of people with MS will develop a chronic worsening of disability within 15 years of diagnosis. This late form of MS is called secondary progressive MS and, unlike relapsing remitting MS, it has no effective treatment.

“My research sets out to understand how progression works in MS by studying how inflammation is maintained in the brains of patients, and to develop new treatments aimed at preventing disease progression,” he explains. Among his approaches is the use of neural stem cells and induced neural stem cells, as in the above study. “My hope is that using a patient’s reprogrammed cells could provide a route to personalised treatment of chronic inflammatory diseases, including progressive forms of MS.”

Dr Peruzzotti-Jametti is based on the Cambridge Biomedical Campus where he works closely with clinicians at Addenbrooke’s Hospital and with basic scientists, a community he describes as “vibrant”.

“Cambridge has been the best place to do my research due to the incredible concentration of scientists who pursue novel therapeutic approaches using cutting-edge technologies,” he says. “I am very thankful for the support I received in the past years from top-notch scientists. Being in Cambridge has also helped me compete for major funding sources, and my work would not have been possible without the support of the Wellcome Trust.

“I wish to continue working in this exceptional environment where so many minds and efforts are put together in a joint cause for the benefit of those who suffer.”


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Stroke Survivors and Caregivers Feel Abandoned By Health Services, Study Finds

Stroke survivors and caregivers feel abandoned by health services, study finds

source: www.cam.ac.uk

A systematic review of studies focused on stroke survivors’ and carers’ experiences of primary care and community healthcare services has found that they feel abandoned because they have become marginalised by services and do not have the knowledge or skills to re-engage.

Stroke survivors and their caregivers can feel abandoned because they struggle to access the appropriate health services, leading to marginalisation

Lisa Lim

The study, by researchers at the University of Cambridge, suggests that primary care and community health care interventions which focus on improving active follow-up and information provision to patients and caregivers, especially in the first year after stroke, could help improve patient self-management and increase stroke-specific health literacy.

Globally, stroke is the second leading cause of death.  Stroke-related disability burden is on the rise with a 12% increase worldwide since 1990, and contributes to the large economic burden of stroke due to healthcare use, informal care and the loss of productivity. The annual cost of stroke, including health care cost, medicines and missed days of work, is estimated at $33 billion in the USA and £8.9 billion in the UK.

Primary care could play an important role in the care of stroke survivors and their caregivers, supporting access to community services and facilitating transfer back to specialist services when new problems emerge. It could also help provide training, and identify and address health needs of caregivers. However, the feeling of abandonment that people with stroke experience following hospital discharge suggests this role is not being fulfilled.

To better understand the possible reasons behind this feeling of abandonment, a team at Cambridge’s Department of Public Health and Primary Care carried out a systematic review of qualitative evidence in the field. In total, they analysed 51 studies (encompassing 566 stroke survivors and 593 caregivers). Their results are published today in the journal PLOS ONE.

The analysis found an unaddressed need for continued support in a quarter of the studies. Survivors and caregivers felt frustrated and dissatisfied with a lack of proactive follow-up from primary care, the hospital, or allied healthcare professionals. This led to feelings of dissatisfaction and uncertainty, and to a sense that the stroke survivor had been “forgotten and written off” and that their general practice did not care about them.

Lack of support for caregivers was reported in more than one in five studies (22%), even though they felt healthcare professionals assumed that they would provide the majority of care needed. They felt ill prepared and pressured to “become experts” in caring for stroke survivors. In addition, both survivors and caregivers felt emotional support was lacking, even though they are at risk of anxiety and depression.

Long waiting times for assessment and rehabilitation and little or no help from social services left survivors feeling “left in the lurch”. Caregivers felt that access to rehabilitation was not provided early enough, causing survivors to “go backwards”.

More than two in five studies (41%) highlighted gaps in information provision. Opportunities for support could be missed due to the lack of knowledge of what services were available. The lack of information about local services and how to find them was confusing and prevented access. Many caregivers and survivors had to find out information by themselves from the internet, friends and other caregivers. When information was provided, it was often inconsistent and covered only some services.

A quarter (23%) of the studies highlighted inadequate information on stroke, its consequences, and recovery. Information presented too early after stroke disempowered stroke survivors and caregivers, leading to feelings of confusion, fear and powerlessness. Survivors and caregivers wanted specific information on the significance of post-stroke symptoms and how to manage them. Lack of information led to unrealistic expectations of “getting back to normal”, leading to disappointment and tensions between the survivor and caregiver.

Ineffective communication between survivors, caregivers and healthcare services as well as within healthcare services resulted in feelings of frustration and having “to battle the system”. Gaps in the transfer of knowledge within the healthcare system and the use of medical jargon sometimes caused confusion and were construed as indifference to survivors’ needs.

“Patients and caregivers would benefit from active follow up and information provision about stroke that is tailored to their specific needs, which change over time,” says Professor Jonathan Mant, who led the study. “People take active efforts to find information for themselves, but navigating and appraising it can be challenging. What is needed is trustworthy information written in an accessible language and format, which could support better self-management.”

The study found that many stroke survivors and caregivers felt marginalised due to the misalignment between how healthcare access in primary care is organised and survivors’ and caregivers’ competencies. For example, individuals felt that in order to access services they needed an awareness of what services were available, plus the ability to communicate effectively with healthcare professionals. This situation can be compounded by cognitive, speech and language problems that can further affect a patient’s ability to negotiate healthcare access.

“Stroke survivors and their caregivers can feel abandoned because they struggle to access the appropriate health services, leading to marginalisation,” says Dr Lisa Lim, one of the study authors. “This arises because of a number of factors, including lack of continuity of care, limited and delayed access to community services, and inadequate information about stroke, recovery and healthcare services.

“We need mechanisms to encourage better communication and collaboration between generalist services, which tend to provide the longer term care after stroke, and specialist services, which provide the care in the immediate phase post-stroke.”

The researchers argue that providing support from healthcare professionals within the first year after stroke would increase patients’ ability to self-manage their chronic condition. This can be achieved by providing timely and targeted information about stroke and the resources available, and by regular follow-ups that foster supportive long-term relationships with healthcare professionals.

“Giving the right information at the right time will help stroke survivors and their caregivers become more self-reliant over time and better able to self-manage living with stroke,” adds Dr Lim.

The team identified two key areas of improvement to address patients’ and caregivers’ marginalisation: increasing stroke-specific health literacy by targeted and timely information provision, and improving continuity of care and providing better access to community healthcare services.

Reference
Pindus, DM et al. Stroke survivors’ and informal caregivers’ experiences of primary care and community healthcare services – a systematic review and meta-ethnography. PLOS ONE; 21 Feb 2018; DOI: 10.1371/journal.pone.0192533


Researcher profile: Dr Lisa Lim

As well as being a researcher in the Department of Public Health and Primary Care, Dr Lisa Lim is also a GP. Her experience with patients helps inform her work.

“My research is with stroke survivors, looking at how we can improve things for them after stroke as well as preventing further strokes,” she says. “We know that stroke survivors and their carers often struggle after they have been discharged from specialist services and their needs are not always identified or addressed by healthcare services; this is what we want to change. This is a problem I see in my clinical practice and I know how important it is to these patients.”

Working in collaboration with researchers at the University of Leicester, Dr Lim and the team at Improving Primary Care after Stroke (IPCAS) have spent the past two years developing and piloting a primary care intervention for stroke survivors. The intervention is now ready to be trialled and they are currently recruiting GP practices and patients.

Dr Lim says she hopes her work will demonstrate how important it is that we continue to invest in primary care research and how primary care can help people to live well with a chronic problem like stroke – “It can make a massive difference to peoples’ lives,” she says.

“It may not be considered by some to be the most glamorous research,” she adds. “We will not be ‘curing’ stroke, but what we are trying to do is make a big impact on the day-to-day lives of people affected by stroke.”


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

International Experts Sound The Alarm On The Malicious Use of AI in Unique Report

International experts sound the alarm on the malicious use of AI in unique report

source: www.cam.ac.uk

Twenty-six experts on the security implications of emerging technologies have jointly authored a ground-breaking report – sounding the alarm about the potential malicious use of artificial intelligence (AI) by rogue states, criminals, and terrorists.

For many decades hype outstripped fact in terms of AI and machine learning. No longer.

Seán Ó hÉigeartaigh


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Many Highly-Engaged Employees Suffer From Burnout

Many highly-engaged employees suffer from burnout

source: www.cam.ac.uk

Underlining the danger of job burnout, a new study of more than 1,000 US workers finds that many employees who are highly engaged in their work are also exhausted and ready to leave their organisations.

These findings are a big challenge to organisations and their management.

Jochen Menges

Whereas lack of engagement is commonly seen as leading to employee turnover due to boredom and disaffection, the study finds that companies, in fact, risk losing some of their most motivated and hard-working employees due to high stress and burnout – a symptom of the “darker side” of workplace engagement.

The study, by academics working in the UK, US and Germany, concludes that it is concerning that many engaged employees suffer from stress and burnout symptoms, which may be the beginning of a pathway leading into disengagement.

“Nearly half of all employees were moderately to highly engaged in their work but also exhausted and ready to leave their organisations,” said co-author Dr Jochen Menges from the University of Cambridge. “This should give managers a lot to think about.”

The study, published in the journal Career Development International, examined multiple workplace factors that divide employees into various engagement-burnout profiles. These include low engagement-low burnout (“apathetic”), low engagement-high burnout (“burned-out”), high engagement-low burnout (“engaged”), “moderately engaged-exhausted”, and “highly engaged-exhausted”.

While the largest group, at 41 percent, fitted the healthy “engaged” profile, 19 percent experienced high levels of both engagement and burnout (“highly engaged-exhausted”) and another 35.5 percent were “moderately engaged-exhausted”.

The highest turnover intentions were reported by the “highly engaged-exhausted” group – higher than even the unengaged group that might be commonly expected to be eyeing an exit.

“These findings are a big challenge to organisations and their management,” said Menges, who is a Lecturer in Organisational Behaviour at Cambridge Judge Business School. “By shedding light on some of the factors in both engagement and burnout, the study can help organisations identify workers who are motivated but also at risk of burning out and leaving.”

While previous studies had looked at engagement-burnout profiles, the new study – conducted at the Yale Center for Emotional Intelligence, in collaboration with the Faas Foundation – also focuses on demands placed on employees and resources provided to them in the workplace, and how these affect engagement and burnout.

The study is based on an online survey of 1,085 employees in all 50 US states. It measured engagement, burnout, demands and resources on a six-point scale ranging from such responses as “never” to “almost always” or “strongly agree” to “strongly disagree”.

For engagement, questions included “I strive as hard as I can to complete my job” and “I feel energetic at my job”. For burnout, participants were asked how often at work they feel “disappointed with people” or “physically weak/sickly”. Demand questions included “I have too much work to do”, while resources were measured by questions such as “my supervisor provides me with the support I need to do my job well”.

The researchers then examined overlap of these various factors, and how they interact and influence each other, in order to draw conclusions about the different profile groups.
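
The sketch below (Python, with invented scores and arbitrary cut-offs) shows one simple way responses on six-point engagement and burnout scales could be sorted into profiles like those named above and compared on turnover intention. It is only meant to convey the shape of such a computation, not the statistical method the authors actually used.

    import numpy as np

    rng = np.random.default_rng(1)
    n = 1085                                  # respondents, as in the survey

    # Hypothetical per-respondent mean scores on 1-6 scales
    engagement = rng.uniform(1, 6, n)
    burnout    = rng.uniform(1, 6, n)
    turnover   = 1 + 0.6 * burnout + rng.normal(0, 0.5, n)   # invented turnover-intention score

    def profile(e, b):
        # Illustrative cut-offs only; the study derived its profiles statistically
        if e < 3.5 and b < 3.5:
            return "apathetic"
        if e < 3.5:
            return "burned-out"
        if b < 3.5:
            return "engaged"
        return "engaged-exhausted"

    labels = np.array([profile(e, b) for e, b in zip(engagement, burnout)])
    for name in ("apathetic", "burned-out", "engaged", "engaged-exhausted"):
        group = turnover[labels == name]
        print(f"{name:18s} n={group.size:4d}  mean turnover intention = {group.mean():.2f}")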

“High engagement levels in the workplace can be a double-edged sword for some employees,” said Menges. “Engagement is very beneficial to workers and organisations when burnout symptoms are low, but engagement coupled with high burnout symptoms can lead to undesired outcomes including increased intentions to leave an organisation. So managers need to look carefully at high levels of engagement and help those employees who may be headed for burnout, or they risk higher turnover levels and other undesirable outcomes.”

Reference:
Julia Moeller et al. ‘Highly engaged but burned out: intra-individual profiles in the US workforce.’ Career Development International (2018). DOI: 10.1108/CDI-12-2016-0215


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

In Living Colour: Brightly-Coloured Bacteria Could Be Used To ‘Grow’ Paints and Coatings

In living colour: Brightly-coloured bacteria could be used to ‘grow’ paints and coatings

source: www.cam.ac.uk

Researchers have unlocked the genetic code behind some of the brightest and most vibrant colours in nature. The paper, published in the journal PNAS, is the first study of the genetics of structural colour – as seen in butterfly wings and peacock feathers – and paves the way for genetic research in a variety of structurally coloured organisms.

This is the first systematic study of the genes underpinning structural colours — not only in bacteria but in any living system.

Villads Egede Johansen

The study is a collaboration between the University of Cambridge and Dutch company Hoekmine BV and shows how genetics can change the colour, and appearance, of certain types of bacteria. The results open up the possibility of harvesting these bacteria for the large-scale manufacturing of nanostructured materials: biodegradable, non-toxic paints could be ‘grown’ and not made, for example.

Flavobacterium is a type of bacteria that packs together in colonies producing striking metallic colours. These colours come not from pigments but from the colonies’ internal structure, which reflects light at certain wavelengths. Scientists are still puzzled, however, as to how these intricate structures are genetically engineered by nature.

“It is crucial to map the genes responsible for the structural colouration for further understanding of how nanostructures are engineered in nature,” said first author Villads Egede Johansen, from Cambridge’s Department of Chemistry. “This is the first systematic study of the genes underpinning structural colours — not only in bacteria but in any living system.”

The researchers compared the genetic information with the optical properties and anatomy of wild-type and mutated bacterial colonies to understand how genes regulate the colour of the colony.

By genetically mutating the bacteria, the researchers changed their dimensions or their ability to move, which altered the geometry of the colonies. Changing the geometry changed the colour: the colony’s original metallic green shifted across the entire visible range, from blue to red. The researchers were also able to create duller colouration or make the colour disappear entirely.

“We mapped several genes with previously unknown functions and we correlated them to the colonies’ self-organisational capacity and their colouration,” said senior author Dr Colin Ingham, CEO of Hoekmine BV.

“From an applied perspective, this bacterial system allows us to achieve tuneable living photonic structures that can be reproduced in abundance, avoiding traditional nanofabrication methods,” said co-senior author Dr Silvia Vignolini from Cambridge’s Department of Chemistry. “We see a potential in the use of such bacterial colonies as photonic pigments that can be readily optimised for changing colouration under external stimuli and that can interface with other living tissues, thereby adapting to variable environments. The future is open for biodegradable paints on our cars and walls — simply by growing exactly the colour and appearance we want!”

Reference: 
Villads Egede Johansen et al. ‘Living colors: Genetic manipulation of structural color in bacterial colonies.’ PNAS (2018). DOI: 10.1073/pnas.1716214115


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Fake News ‘vaccine’: Online Game May ‘Inoculate’ By Simulating Propaganda Tactics

Fake news ‘vaccine’: online game may ‘inoculate’ by simulating propaganda tactics

source: www.cam.ac.uk

A new experiment, launching today online, aims to help ‘inoculate’ against disinformation by providing a small dose of perspective from a “fake news tycoon”. A pilot study has shown some early success in building resistance to fake news among teenagers.

We try to let players experience what it is like to create a filter bubble so they are more likely to realise they may be living in one

Sander van der Linden

A new online game puts players in the shoes of an aspiring propagandist to give the public a taste of the techniques and motivations behind the spread of disinformation – potentially “inoculating” them against the influence of so-called fake news in the process.

Researchers at the University of Cambridge have already shown that briefly exposing people to tactics used by fake news producers can act as a “psychological vaccine” against bogus anti-science campaigns.

While the previous study focused on disinformation about climate science, the new online game is an experiment in providing “general immunity” against the wide range of fake news that has infected public debate.

The game encourages players to stoke anger, mistrust and fear in the public by manipulating digital news and social media within the simulation.

Players build audiences for their fake news sites by publishing polarising falsehoods, deploying Twitter bots, photoshopping evidence, and inciting conspiracy theories in the wake of public tragedy – all while maintaining a “credibility score” to remain as persuasive as possible.

A pilot study conducted with teenagers in a Dutch high school used an early paper-and-pen trial of the game, and showed the perceived “reliability” of fake news to be diminished in those who played compared with a control group.

The research and education project, a collaboration between Cambridge researchers and Dutch media collective DROG, is launching an English version of the game online today at www.fakenewsgame.org.

The psychological theory behind the research is called “inoculation”:

“A biological vaccine administers a small dose of the disease to build immunity. Similarly, inoculation theory suggests that exposure to a weak or demystified version of an argument makes it easier to refute when confronted with more persuasive claims,” says Dr Sander van der Linden, Director of Cambridge University’s Social Decision-Making Lab.

“If you know what it is like to walk in the shoes of someone who is actively trying to deceive you, it should increase your ability to spot and resist the techniques of deceit. We want to help grow ‘mental antibodies’ that can provide some immunity against the rapid spread of misinformation.”

Based in part on existing studies of online propaganda, and taking cues from actual conspiracy theories about organisations such as the United Nations, the game is set to be translated for countries such as Ukraine, where disinformation casts a heavy shadow.

There are also plans to adapt the framework of the game for anti-radicalisation purposes, as many of the same manipulation techniques – using false information to provoke intense emotions, for example – are commonly deployed by recruiters for religious extremist groups.

“You don’t have to be a master spin doctor to create effective disinformation. Anyone can start a site and artificially amplify it through Twitter bots, for example. But recognising and resisting fake news doesn’t require a PhD in media studies either,” says Jon Roozenbeek, a researcher from Cambridge’s Department of Slavonic Studies and one of the game’s designers.

“We aren’t trying to drastically change behavior, but instead trigger a simple thought process to help foster critical and informed news consumption.”

Roozenbeek points out that some efforts to combat fake news are seen as ideologically charged. “The framework of our game allows players to lean towards the left or right of the political spectrum. It’s the experience of misleading through news that counts,” he says.

The pilot study in the Netherlands using a paper version of the game involved 95 students with an average age of 16, randomly divided into treatment and control.

This version of the game focused on the refugee crisis, and all participants were randomly presented with fabricated news articles on the topic at the end of the experiment.

The treatment group were assigned roles – alarmist, denier, conspiracy theorist or clickbait monger – and tasked with distorting a government fact sheet on asylum seekers using a set of cards outlining common propaganda tactics consistent with their role.

Those in the treatment group rated the fake news as significantly less reliable than did the control group, who had not produced their own fake article. Researchers describe the results of this small study as limited but promising. The study has been accepted for publication in the Journal of Risk Research.
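
For a rough sense of the kind of group comparison reported, here is a minimal Python sketch running an independent-samples t-test on invented reliability ratings, with group sizes loosely matching the 95-student pilot. It is not the authors’ analysis and the numbers are made up.

    import numpy as np
    from scipy.stats import ttest_ind

    rng = np.random.default_rng(2)
    # Hypothetical 7-point reliability ratings of the fabricated articles
    treatment = rng.normal(3.2, 1.0, 48)   # students who played the propaganda game
    control   = rng.normal(4.1, 1.0, 47)   # students who did not

    t_stat, p_value = ttest_ind(treatment, control)
    print(f"mean rating (treatment) = {treatment.mean():.2f}")
    print(f"mean rating (control)   = {control.mean():.2f}")
    print(f"t = {t_stat:.2f}, p = {p_value:.4f}")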

The team are aiming to take their “fake news vaccine” trials to the next level with today’s launch of the online game.

With content written mostly by the Cambridge researchers along with Ruurd Oosterwoud, founder of DROG, the game only takes a few minutes to complete. The hope is that players will then share it to help create a large anonymous dataset of journeys through the game.

The researchers can then use this data to refine techniques for increasing media literacy and fake news resilience in a ‘post-truth’ world. “We try to let players experience what it is like to create a filter bubble so they are more likely to realise they may be living in one,” adds van der Linden.


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Calcium May Play a Role In The Development of Parkinson’s Disease

Calcium may play a role in the development of Parkinson’s disease

source: www.cam.ac.uk

Researchers have found that excess levels of calcium in brain cells may lead to the formation of toxic clusters that are the hallmark of Parkinson’s disease.

This is the first time we’ve seen that calcium influences the way alpha-synuclein behaves.

Janin Lautenschläger

The international team, led by the University of Cambridge, found that calcium can mediate the interaction between small membranous structures inside nerve endings, which are important for neuronal signalling in the brain, and alpha-synuclein, the protein associated with Parkinson’s disease. Excess levels of either calcium or alpha-synuclein may be what starts the chain reaction that leads to the death of brain cells.

The findings, reported in the journal Nature Communications, represent another step towards understanding how and why people develop Parkinson’s. According to the charity Parkinson’s UK, one in every 350 adults in the UK – an estimated 145,000 in all – currently has the condition, but as yet it remains incurable.

Parkinson’s disease is one of a number of neurodegenerative diseases caused when naturally occurring proteins fold into the wrong shape and stick together with other proteins, eventually forming thin filament-like structures called amyloid fibrils. These amyloid deposits of aggregated alpha-synuclein, also known as Lewy bodies, are the hallmark of Parkinson’s disease.

Curiously, it hasn’t been clear until now what alpha-synuclein actually does in the cell: why it’s there and what it’s meant to do. It is implicated in various processes, such as the smooth flow of chemical signals in the brain and the movement of molecules in and out of nerve endings, but exactly how it behaves is unclear.

“Alpha-synuclein is a very small protein with very little structure, and it needs to interact with other proteins or structures in order to become functional, which has made it difficult to study,” said senior author Dr Gabriele Kaminski Schierle from Cambridge’s Department of Chemical Engineering and Biotechnology.

Thanks to super-resolution microscopy techniques, it is now possible to look inside cells to observe the behaviour of alpha-synuclein. To do so, Kaminski Schierle and her colleagues isolated synaptic vesicles, part of the nerve cells that store the neurotransmitters which send signals from one nerve cell to another.

In neurons, calcium plays a role in the release of neurotransmitters. The researchers observed that when calcium levels in the nerve cell increase, such as upon neuronal signalling, the alpha-synuclein binds to synaptic vesicles at multiple points causing the vesicles to come together. This may indicate that the normal role of alpha-synuclein is to help the chemical transmission of information across nerve cells.

“This is the first time we’ve seen that calcium influences the way alpha-synuclein interacts with synaptic vesicles,” said Dr Janin Lautenschläger, the paper’s first author. “We think that alpha-synuclein is almost like a calcium sensor. In the presence of calcium, it changes its structure and how it interacts with its environment, which is likely very important for its normal function.”

“There is a fine balance of calcium and alpha-synuclein in the cell, and when there is too much of one or the other, the balance is tipped and aggregation begins, leading to Parkinson’s disease,” said co-first author Dr Amberley Stephens.

The imbalance can be caused by a genetic doubling of the amount of alpha-synuclein (gene duplication), by an age-related slowing of the breakdown of excess protein, by an increased level of calcium in neurons that are sensitive to Parkinson’s, or an associated lack of calcium buffering capacity in these neurons.

Understanding the role of alpha-synuclein in physiological or pathological processes may aid in the development of new treatments for Parkinson’s disease. One possibility is that drug candidates developed to block calcium, for use in heart disease for instance, might also have potential against Parkinson’s disease.

The research was funded in part by the Wellcome Trust, the Medical Research Council, Alzheimer’s Research UK, and the Engineering and Physical Sciences Research Council.

Reference
Janin Lautenschläger, Amberley D. Stephens et al. ‘C-terminal calcium binding of Alpha-synuclein modulates synaptic vesicle interaction.’ Nature Communications (2018). DOI: 10.1038/s41467-018-03111-4


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Scientists Discover The Secrets Behind The Cuttlefish’s 3D ‘Invisibility Cloak’

Scientists discover the secrets behind the cuttlefish’s 3D ‘invisibility cloak’

source: www.cam.ac.uk

An international team of scientists has identified the neural circuits that enable cuttlefish to change their appearance in just the blink of an eye – and discovered that this is similar to the neural circuit that controls iridescence in squids.

The sea is full of strange and wondrous creatures, but there are few as bizarre and intelligent as octopuses and cuttlefish. We’ve seen dozens of examples of these animals suddenly appearing from nowhere, as if they have thrown off an invisibility cloak

Trevor Wardill

Cuttlefish and octopuses are remarkable creatures. They have the ability to change their appearance in a matter of seconds, camouflaging themselves from predators and enabling them to surprise their prey. However, unlike a number of reptiles and amphibians which merely change colour to blend into their surroundings, these cephalopods are also able to change the physical texture of their skin to match the coarseness of surrounding rocks, coral or seaweed.

“The sea is full of strange and wondrous creatures, but there are few as bizarre and intelligent as octopuses and cuttlefish,” says Dr Trevor Wardill from the Department of Physiology, Development and Neuroscience at the University of Cambridge. “We’ve seen dozens of examples of these animals suddenly appearing from nowhere, as if they have thrown off an invisibility cloak. How they do this has long remained a mystery.”

The skin of these animals is covered in tiny muscular organs known as ‘chromatophores’ that change colour in response to a signal from the brain. It also has a second set of muscular organs that can be activated to create bumps known as ‘papillae’. When stimulated, each papilla can change the texture of the skin from flat to three dimensional. The papillae can serve several functions, including disguise.

Understanding the nervous system of these creatures and how they manipulate their skin has proved challenging, but now a team of scientists from the Marine Biological Laboratory and University of Cambridge has begun to understand how this happens. Their results are published today in the journal iScience.

Image: European cuttlefish (Sepia officinalis). Credit: Roger Hanlon

The researchers found that the instruction signal from the cuttlefish’s brain is routed through the stellate ganglion, a peripheral nerve centre. The stellate ganglion houses the giant axon system, so called because it is large enough to see with the naked eye. It also houses particular motor neurons that control the papillae on the mantle (the cuttlefish’s outer surface). This nerve circuitry is similar to that by which squids control skin iridescence.

The giant axon system, due to its large size of up to 1mm, helped Nobel prize-winning Cambridge scientists Alan Hodgkin and Andrew Huxley, along with Australian scientist John Eccles, figure out how nerve impulses (action potentials) work.

Dr Paloma Gonzalez-Bellido, also from the University of Cambridge, adds: “This discovery is really interesting from an evolutionary point of view. It opens up the question of which came first: was the common ancestor to cuttlefish and squid able to camouflage themselves using papillae or express iridescence, or possibly both?”

The research team – including Lexi Scaros of Dalhousie University and Roger Hanlon of the Marine Biological Laboratory – also looked in greater detail at the papillae to find out how they manage to hold their shape over a long period of time without a signal. They found that the papillae use a mechanism which they describe as being ‘catch-like’. It resembles the ‘catch’ mechanism found in bivalves, such as oysters, mussels, and scallops, which enables the bivalve shell to remain closed without expending much energy.

“There is still a big mystery, however, which is how these animals interpret the world around them and translate this into signals that change their appearance,” says Dr Wardill.

The researchers say that understanding how cephalopods’ skin changes from a smooth, flat surface to a textured, 3D structure could help in the design of biologically-inspired materials that can themselves be assembled from flat materials.

“This research on neural control of flexible skin, combined with anatomical studies of the novel muscle groups that enable such shape-shifting skin, has applications for the development of new classes of soft materials that can be engineered for a wide array of uses in industry, society, and medicine,” adds Professor Roger Hanlon of the Marine Biological Laboratory.

The research was largely funded by the US Air Force Office of Scientific Research and the UK Biotechnology and Biological Sciences Research Council.

Reference
Neural control of dynamic 3-dimensional skin papillae for cuttlefish camouflage. iScience; 15 Feb 2018; DOI: 10.1016/j.isci.2018.01.001


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

SYMPOSIUM: Perspectives on Oncology Drug Discovery; Immuno-Oncology and the Next Generation

The 4th Annual Crown Bioscience Cambridge symposium this year focusses on immuno-oncology and future advances in cancer treatment. Don’t miss out on the chance to hear from AZ, Janssen, Merus, Kymab, Cantargia, Oxford BioMedica, Crescendo Biologics, and CrownBio UK, as well as network with peers.

 

Immuno-oncology has been a game changer in cancer research, providing many novel agents which are already clinically approved. With these agents modulating the host immune system rather than targeting tumor cells, evaluating immunotherapies has brought many challenges for drug developers and created the need for a wide range of immunocompetent preclinical models.

 

This symposium will bring together oncology and immuno-oncology R&D and preclinical teams to facilitate cross discipline discussions on current techniques, models and experiences in preclinical immuno-oncology drug development, as well as discussing next generation treatments/models and how to move forward with future approaches.

 

Discussion topics include:

  • How to assess long term antitumor immunotherapeutic efficacy, including combination regimen effects
  • The novel targets for next generation immuno-oncology agents, including CAR-T cell therapy
  • The evolution of preclinical models for improved evaluation of immunotherapeutics

 

This free, whole-day event is most suited to researchers from pharmaceutical and biotech companies, and will take place at The Cambridge Building, Babraham Research Campus, on March 15th. Attendee spaces are limited; you can reserve your seat by registering today here.

Newly-Developed Image Guidelines Will Improve Mobile Shopping Experience Worldwide

Newly-developed image guidelines will improve mobile shopping experience worldwide

source: www.cam.ac.uk

A new type of online product image, developed by researchers at the University of Cambridge in collaboration with global consumer goods company Unilever, could improve the mobile shopping experience for the world’s 2.5 billion smartphone users.

We want to improve the e-commerce images used for every product, at every retailer, in every country in the world.

Sam Waller

The concept, known as ‘mobile ready hero images’, was designed to make shopping for grocery products faster, by making it easier to quickly spot key information about a product, such as size, type or flavour.

For example, searching for ‘soap’ on Amazon or other retail websites will bring up hundreds of images, and most customers will scroll quickly through the list on their phone in order to find the particular item they want. However, based on product images alone, it can be difficult to quickly spot the differences between items: whether an item contains one, three or ten individual bars of soap, for instance.

“While traditional pack photographs can be effective on desktop screens, different flavours and sizes of products can look identical when these photographs are displayed on mobiles, reduced to the size of a postage stamp,” said Dr Sam Waller from Cambridge’s Engineering Design Centre, who led the project. “This is especially problematic for older consumers with age-related long-sightedness.”

To date, mobile ready hero images have been adopted by over 80 retailers in more than 40 countries. India – where 65% of all online shopping transactions take place on mobiles – has been one of the fastest countries to adopt these images.

In addition to making the mobile shopping experience easier for customers, mobile ready hero images have also been shown to have a positive impact on sales. “Magnum ice cream is one of our billion dollar global brands that has adopted hero images,” said Oliver Bradley, e-commerce director at Unilever. “During an eight-week A/B split test with a retailer, Magnum’s hero images led to a sales increase of 24%.”
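
For readers unfamiliar with how such an uplift figure is calculated, the small sketch below shows the arithmetic of an A/B split test in Python, using entirely invented weekly sales figures (chosen so the result lands near the quoted 24%). It is not Unilever’s or the retailer’s actual data or analysis.

    # Hypothetical weekly unit sales during an eight-week split test
    control_sales = [1000, 980, 1020, 995, 1010, 1005, 990, 1000]     # traditional pack shots
    hero_sales    = [1230, 1260, 1240, 1245, 1255, 1238, 1242, 1250]  # mobile ready hero images

    control_total = sum(control_sales)
    hero_total = sum(hero_sales)

    # Percentage uplift of the hero-image arm over the control arm
    uplift = (hero_total - control_total) / control_total * 100
    print(f"Sales uplift: {uplift:.1f}%")   # about 24.5% with these invented numbers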

In order to meet retailers’ demands for consistent product images across all brands, Unilever commissioned Cambridge to develop a website for hero image guidelines, with freely available templates to help brands create improved product images.

To date, some brands have created mobile ready hero images using the Cambridge templates, while others have developed hero images in a different way. Some retailers have chosen to accept all kinds of hero images, while others will only accept some kinds of hero images, resulting in an inconsistent experience for consumers.

GS1, a global non-profit organisation which sets standards for consumer goods, has recently established a working group to focus on mobile ready hero images.

“We spotted the opportunity to improve the current situation using our Global Standards Management Process,” said Paul Reid, head of standards at GS1 in the UK. “The aim of the working group is to get agreement between competing brands and retailers, leading to a single, globally applicable set of guidelines for mobile ready hero images. These guidelines will help brands and retailers make the shopping experience better and more consistent.”

“Inclusive design can help improve the visual clarity of hero images, making them more accessible to a wider range of consumers,” said Waller. “In particular, our SEE-IT method can estimate the proportion of the population who would be unable to discern the important information from e-commerce images. We have joined the GS1 working group in an advisory capacity, and we are looking forward to contributing our expertise to help inform the critical decisions.

“Grocery products are just the start: we want to improve the e-commerce images used for every product, at every retailer, in every country in the world.”

Inset image: Examples of mobile-ready hero images. Walkers is a trademark owned and designed by PepsiCo and used with permission.


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Shoals of Sticklebacks Differ In Their Collective Personalities

Shoals of sticklebacks differ in their collective personalities

source: www.cam.ac.uk

Research from the University of Cambridge has revealed that, among schooling fish, groups can have different collective personalities, with some shoals sticking closer together, being better coordinated, and showing clearer leadership than others.

For centuries, scientists and non-scientists alike have been fascinated by the beautiful and often complex collective behaviour of animal groups, such as the highly synchronised movements of flocks of birds and schools of fish. Often, those spectacular collective patterns emerge from individual group members using simple rules in their interactions, without requiring global knowledge of their group.

In recent years it has also become apparent that, across the animal kingdom, individual animals often differ considerably and consistently in their behaviour, with some individuals being bolder, more active, or more social than others.

New research conducted at the University of Cambridge’s Department of Zoology suggests that observations of different groups of schooling fish could provide important insights into how the make-up of groups can drive collective behaviour and performance.

In the study, published today in the journal Proceedings of the Royal Society B, the researchers created random groups of wild-caught stickleback fish and subjected them repeatedly to a range of environments that included open spaces, plant cover, and patches of food.

Dr Jolle Jolles, lead author of the study, now based at the Max Planck Institute for Ornithology, said: “By filming the schooling fish from above and tracking the groups’ movements in detail, we found that the randomly composed shoals showed profound differences in their collective behaviour that persisted across different ecological contexts. Some groups were consistently faster, better coordinated, more cohesive, and showed clearer leadership structure than others.

“That such differences existed among the groups is remarkable as individuals were randomly grouped with others that were of similar age and size and with which they had very limited previous social contact.”
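
The cohesion, coordination and leadership measures mentioned here are standard quantities in collective-behaviour research rather than anything specific to this paper. As a rough illustration of how two of them can be computed from tracked footage (this is not the study's own analysis code), the sketch below takes the 2D positions and heading angles of the fish in a single video frame and returns the mean nearest-neighbour distance as a cohesion measure and the group polarisation as a coordination measure.

```python
import numpy as np

def cohesion(positions):
    """Mean nearest-neighbour distance (smaller means a more cohesive shoal).

    positions: (n_fish, 2) array of x, y coordinates for one video frame.
    """
    diffs = positions[:, None, :] - positions[None, :, :]
    dists = np.linalg.norm(diffs, axis=-1)
    np.fill_diagonal(dists, np.inf)  # ignore each fish's distance to itself
    return dists.min(axis=1).mean()

def polarisation(headings):
    """Alignment of headings, from 0 (random) to 1 (perfectly aligned).

    headings: (n_fish,) array of heading angles in radians.
    """
    unit_vectors = np.column_stack([np.cos(headings), np.sin(headings)])
    return np.linalg.norm(unit_vectors.mean(axis=0))

# Toy frame: four fish swimming close together and roughly aligned
positions = np.array([[0.00, 0.00], [0.10, 0.05], [0.20, 0.00], [0.15, 0.10]])
headings = np.array([0.05, 0.00, -0.05, 0.10])
print(cohesion(positions), polarisation(headings))
```

Repeating such calculations frame by frame, and averaging over repeated trials, is what makes it possible to detect consistent group-level differences across contexts.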

This research shows for the first time that, even among animals such as schooling fish or flocking birds, where group membership changes frequently over time and individuals are not strongly related to one another, stable differences can emerge in the collective performance of groups.

Such behavioural variability among groups may directly affect the survival and reproductive success of the individuals within them and influence how they associate with one another. Ultimately, these findings may therefore help us understand the selective pressures that have shaped social behaviour.

Dr Andrea Manica, co-author of the paper from the University of Cambridge, added: “Our research reveals that the collective performance of groups is strongly driven by their composition, suggesting that consistent behavioural differences among groups could be a widespread phenomenon in animal societies.”

These research findings provide important new insights that may help explain and predict the performance of social groups, which could be beneficial in building human teams or constructing automated robot swarms.

The research was supported by the Biotechnology and Biological Sciences Research Council.

Reference
Jolles, JW et al. Repeatable group differences in the collective behaviour of stickleback shoals across ecological contexts. Proceedings of the Royal Society B; 7 Feb 2018; DOI: 10.1098/rspb.2017.2629


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Zero Gravity Graphene Promises Success In Space

Zero gravity graphene promises success in space

source: www.cam.ac.uk

In a series of experiments conducted last month, Cambridge researchers experienced weightlessness while testing graphene's potential applications in space.

This is the first time that graphene has been tested in space-like applications.

Andrea Ferrari

Working as part of a collaboration between the Graphene Flagship and the European Space Agency, researchers from the Cambridge Graphene Centre tested graphene in microgravity conditions for the first time while aboard a parabolic flight – often referred to as the ‘vomit comet’. The experiments they conducted were designed to test graphene’s potential in cooling systems for satellites.

“One of graphene’s potential uses, recognised early on, is space applications, and this is the first time that graphene has been tested in space-like applications,” said Professor Andrea Ferrari, who is Director of the Cambridge Graphene Centre, as well as Science and Technology Officer and Chair of the Management Panel for the Graphene Flagship.

Graphene – a form of carbon just a single atom thick – has a unique combination of properties that make it useful for applications from flexible electronics and fast data communication, to enhanced structural materials and water treatments. It is highly electrically and thermally conductive, as well as strong and flexible.

In this experiment, the researchers aimed to improve the performance of cooling systems in use in satellites, making use of graphene’s excellent thermal properties. “We are using graphene in what are called loop-heat pipes. These are pumps that move fluid without the need for any mechanical parts, so there is no wear and tear, which is very important for space applications,” said Ferrari.

“We are aiming at an increased lifetime and an improved autonomy of the satellites and space probes,” said Dr Marco Molina, Chief Technical Officer of the Space line of business at industry partner Leonardo. “By adding graphene, we will have a more reliable loop heat pipe that can operate autonomously in space.”

In a loop-heat pipe, evaporation and condensation of a fluid are used to transport heat from hot electronic systems out into space. The pressure generated by the evaporation-condensation cycle forces fluid through the closed system, providing continuous cooling.

The main element of the loop-heat pipe is the metallic wick, where the fluid is evaporated into gas. In these experiments, the metallic wick was coated in graphene, improving the efficiency of the heat pipe in two ways. Firstly, graphene’s excellent thermal properties improve the heat transfer from the hot systems into the wick. Secondly, the porous structure of the graphene coating increases the interaction of the wick with the fluid, and improves the capillary pressure, meaning the liquid can flow through the wick faster.
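
The capillary pumping described here is commonly modelled with the Young-Laplace relation, in which the pressure a wick can develop scales with the fluid's surface tension and inversely with the effective pore radius. The sketch below is purely illustrative (the fluid properties and pore sizes are assumed rather than measured values from these experiments), but it shows why a finer-pored coating raises the available capillary pressure.

```python
import math

def capillary_pressure(surface_tension, contact_angle_deg, pore_radius):
    """Young-Laplace capillary pressure for a cylindrical pore, in pascals."""
    return 2 * surface_tension * math.cos(math.radians(contact_angle_deg)) / pore_radius

# Assumed values for illustration only
gamma = 0.020  # surface tension of the working fluid, N/m
theta = 30.0   # contact angle between fluid and wick, degrees

uncoated = capillary_pressure(gamma, theta, pore_radius=5e-6)  # 5 micron pores
coated = capillary_pressure(gamma, theta, pore_radius=2e-6)    # finer effective pores

print(f"uncoated wick: {uncoated:.0f} Pa, coated wick: {coated:.0f} Pa")
```

A higher capillary pressure means the liquid can be drawn through the wick faster, the second of the two improvements described above.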

After promising results in laboratory tests, the graphene-coated wicks were tested in space-like conditions onboard a Zero-G parabolic flight. To create weightlessness, the plane flies a series of parabolic manoeuvres, each providing up to 23 seconds of weightlessness.
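
The duration of each weightless phase follows from simple ballistics: between pull-up and pull-out the aircraft is essentially in free fall, so the time over the top of the parabola is roughly twice the vertical speed at pull-up divided by the acceleration due to gravity. The figures below are assumed for illustration rather than taken from the flight campaign.

```python
G = 9.81  # acceleration due to gravity, m/s^2

def weightless_time(vertical_speed_at_pullup):
    """Approximate free-fall duration over the top of a parabola, in seconds."""
    return 2 * vertical_speed_at_pullup / G

# To approach the ~23 seconds quoted above, the climb rate at pull-up would
# need to be roughly 9.81 * 23 / 2, i.e. about 113 m/s (assumed value).
print(round(weightless_time(113), 1))  # about 23 seconds
```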

“It was truly a wonderful experience to feel weightlessness, but also the hyper-gravity moments in the plane. I was very excited but at the same time a bit nervous. I couldn’t sleep the night before,” said Dr Yarjan Samad, a Research Associate at the Cambridge Graphene Centre.

During the flight, the graphene-coated wicks again demonstrated excellent performance, with more efficient heat and fluid transfer compared to the untreated wicks. Based on these results, the researchers are continuing to develop and optimise the coatings for applications in real space conditions. “The next step will be to start working on a prototype that could go either on a satellite or on the space station,” said Ferrari.

The research was supported by the Graphene Flagship and the European Space Agency, as a collaboration between researchers from Université libre de Bruxelles, Belgium; the University of Cambridge, UK; the National Research Council of Italy (CNR), Italy; and industry partner Leonardo Spa, Italy.

Inset image: Professor Andrea Ferrari onboard the parabolic flight.


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

£42m New Research Institute To Boost Evidence On Improving Care In The NHS

£42m new research institute to boost evidence on improving care in the NHS

source: www.cam.ac.uk

A new research institute launching today is seeking to create a world-leading asset for the NHS by improving the science behind healthcare organisation and delivery.

The Healthcare Improvement Studies Institute (THIS Institute), led by the University of Cambridge, is made possible by the largest single grant ever made by the Health Foundation, an independent charity. The new institute is founded on the principle that efforts to improve care should always be based on the best-quality evidence. Some of that evidence will be created by NHS patients and staff themselves, using innovative citizen science methods in large-scale research projects.

Director of THIS Institute Professor Mary Dixon-Woods, said: “If you ask people to describe the future of healthcare, they might describe a shiny vision of new treatments and technologies. These kinds of innovations are important. But how healthcare is organised and delivered, including its basic systems and processes, has perhaps just as much impact, and sometimes more, on patient outcomes and experience.”

Dr Jennifer Dixon, Chief Executive of the Health Foundation, said: “The UK population clearly wants a high quality and sustainable NHS into the future. Understanding what works, in which contexts and why, is crucial, as is obtaining that evidence fast so it can be acted on. There couldn’t be a more important time to do this, and that is why the Health Foundation has put its money where its mouth is.”

One way the institute will create the evidence base is through citizen science. Drawing on methods already used in fields such as biology and astronomy, THIS Institute is building a digital platform to crowdsource research ideas and collect research data from NHS staff and patients, including their opinions on the right indicators of quality of care and their views on equipment design.

Professor Dixon-Woods added: “Tackling healthcare challenges needs to involve a greater variety of people with diverse experience: the institute is looking for expertise in new places. Some of this expertise will come directly from patients – us, you, me – working alongside healthcare staff and other professionals such as engineers and designers.”

The institute will be based at the Cambridge Biomedical Campus, alongside Cambridge University Hospitals NHS Foundation Trust and world-leading research institutes. It is made possible by a ten-year grant from the Health Foundation, whose mission is to bring about better health and healthcare for people in the UK.

Press release from The Healthcare Improvement Studies Institute.


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.