
‘Multiplying’ Light Could Be Key To Ultra-Powerful Optical Computers

Artist's impression of light pulses inside an optical computer
source: www.cam.ac.uk

 

A new type of optical computing could solve highly complex problems that are out of reach for even the most powerful supercomputers.

 

An important class of challenging computational problems, with applications in graph theory, neural networks, artificial intelligence and error-correcting codes, can be solved by multiplying light signals, according to researchers from the University of Cambridge and the Skolkovo Institute of Science and Technology in Russia.

In a paper published in the journal Physical Review Letters, they propose a new type of computation that could revolutionise analogue computing by dramatically reducing the number of light signals needed while simplifying the search for the best mathematical solutions, allowing for ultra-fast optical computers.

Optical or photonic computing uses photons produced by lasers or diodes for computation, as opposed to classical computers, which use electrons. Since photons have no mass and can travel faster than electrons, an optical computer would be superfast, energy-efficient and able to process information simultaneously through multiple temporal or spatial optical channels.

The computing element in an optical computer – an alternative to the ones and zeroes of a digital computer – is represented by the continuous phase of the light signal, and the computation is normally achieved by adding two light waves coming from two different sources and then projecting the result onto ‘0’ or ‘1’ states.

However, real life presents highly nonlinear problems, where multiple unknowns simultaneously change the values of other unknowns while interacting multiplicatively. In this case, the traditional approach to optical computing that combines light waves in a linear manner fails.

Now, Professor Natalia Berloff from Cambridge’s Department of Applied Mathematics and Theoretical Physics and PhD student Nikita Stroev from the Skolkovo Institute of Science and Technology have found that optical systems can combine light by multiplying the wave functions describing the light waves instead of adding them, and in doing so can represent a different type of connection between the light waves.
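To see why multiplication is powerful here, note that a binary variable s = ±1 can be encoded in the phase of a light signal, with s = e^(iθ) and θ equal to 0 or π. Multiplying two wave functions then adds their phases, so the product of two signals directly represents the product of the spins they encode – exactly the kind of multiplicative interaction that linear superposition cannot capture. The following is a minimal, illustrative sketch of this encoding (not the authors' polariton model; the values are chosen purely for demonstration):

```python
import numpy as np

# Encode binary spins s = +1 / -1 as light phases 0 / pi: s = exp(i*theta).
theta1, theta2 = 0.0, np.pi                # s1 = +1, s2 = -1
a1, a2 = np.exp(1j * theta1), np.exp(1j * theta2)

# Traditional (linear) optical computing: superpose the two waves.
linear = a1 + a2                           # amplitudes simply add

# Multiplicative coupling: the product's phase is the SUM of the phases,
# so exp(i*theta1) * exp(i*theta2) encodes the spin product s1 * s2.
product = a1 * a2
s_product = int(np.sign(product.real))     # recovers s1 * s2 = -1

print(linear, product, s_product)
```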

They illustrated this phenomenon with quasi-particles called polaritons – which are half-light and half-matter – while extending the idea to a larger class of optical systems such as light pulses in a fibre. Tiny pulses or blobs of coherent, superfast-moving polaritons can be created in space and overlap with one another in a nonlinear way, due to the matter component of polaritons.

“We found the key ingredient is how you couple the pulses with each other,” said Stroev. “If you get the coupling and light intensity right, the light multiplies, affecting the phases of the individual pulses, giving away the answer to the problem. This makes it possible to use light to solve nonlinear problems.”

The multiplication of the wave functions to determine the phase of the light signal in each element of these optical systems comes from the nonlinearity that occurs naturally or is externally introduced into the system.

“What came as a surprise is that there is no need to project the continuous light phases onto ‘0’ and ‘1’ states necessary for solving problems in binary variables,” said Stroev. “Instead, the system tends to bring about these states at the end of its search for the minimum energy configuration. This is the property that comes from multiplying the light signals. On the contrary, previous optical machines require resonant excitation that fixes the phases to binary values externally.”

The authors have also suggested and implemented a way to guide the system trajectories towards the solution by temporarily changing the coupling strengths of the signals.

“We should start identifying different classes of problems that can be solved directly by a dedicated physical processor,” said Berloff, who also holds a position at Skolkovo Institute of Science and Technology. “Higher-order binary optimisation problems are one such class, and optical systems can be made very efficient in solving them.”
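To make that problem class concrete, the sketch below brute-forces a tiny higher-order binary optimisation instance: minimising a polynomial over spins s_i = ±1 that contains a third-order (multiplicative) term of the kind a purely linear machine cannot represent. The couplings are invented for illustration; a dedicated optical processor would search this energy landscape physically rather than by enumeration:

```python
import itertools

# A toy higher-order binary optimisation (HOBO) problem: minimise a cubic
# polynomial over spins s_i in {-1, +1}. Coefficients are illustrative only.
J2 = {(0, 1): 1.0, (1, 2): -0.5}   # pairwise (second-order) couplings
J3 = {(0, 1, 2): 0.8}              # third-order, multiplicative term

def energy(s):
    e = sum(c * s[i] * s[j] for (i, j), c in J2.items())
    e += sum(c * s[i] * s[j] * s[k] for (i, j, k), c in J3.items())
    return e

# Exhaustive search over all 2^3 spin configurations.
best = min(itertools.product([-1, 1], repeat=3), key=energy)
print(best, energy(best))
```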

There are still many challenges to be met before optical computing can demonstrate its superiority over modern electronic computers in solving hard problems: among them are noise reduction, error correction, improved scalability and guiding the system to the true best solution.

“Changing our framework to directly address different types of problems may bring optical computing machines closer to solving real-world problems that cannot be solved by classical computers,” said Berloff.

 

Reference:
Nikita Stroev and Natalia G. Berloff. ‘Discrete Polynomial Optimization with Coherent Networks of Condensates and Complex Coupling Switching.’ Physical Review Letters (2021). DOI: 10.1103/PhysRevLett.126.050504

 


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. Images, including our videos, are Copyright ©University of Cambridge and licensors/contributors as identified.  All rights reserved. We make our image and video content available in a number of ways – as here, on our main website under its Terms and conditions, and on a range of channels including social media that permit your use and sharing of our content under their respective Terms.

Study Highlights Risk of New SARS-CoV-2 Mutations Emerging During Chronic Infection

3D print of Spike protein
source: www.cam.ac.uk

 

SARS-CoV-2 mutations similar to those in the B.1.1.7 UK variant could arise in cases of chronic infection, where treatment over an extended period can give the virus multiple opportunities to evolve, say scientists.

 

Given that both vaccines and therapeutics are aimed at the spike protein, which we saw mutate in our patient, our study raises the worrying possibility that the virus could mutate to outwit our vaccines

Ravi Gupta

Writing in Nature, a team led by Cambridge researchers report how they were able to observe SARS-CoV-2 mutating in the case of an immunocompromised patient treated with convalescent plasma. In particular, they saw the emergence of a key mutation also seen in the new variant that led to the UK being forced once again into strict lockdown, though there is no suggestion that the variant originated from this patient.

Using a synthetic version of the virus spike protein created in the lab, the team showed that specific changes to its genetic code – the mutation seen in the B.1.1.7 variant – made the virus twice as infectious on cells as the more common strain.

SARS-CoV-2, the virus that causes COVID-19, is a betacoronavirus. Its RNA – its genetic code – is composed of a series of nucleotides (chemical structures represented by the letters A, C, G and U). As the virus replicates itself, this code can be miscopied, leading to errors known as mutations. Coronaviruses have a relatively modest mutation rate, at around 23 nucleotide substitutions per year.
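A back-of-envelope calculation shows why a months-long infection matters at this rate; the 90-day duration below is an assumed, illustrative figure:

```python
# At ~23 nucleotide substitutions per year, a chronic infection lasting
# around 90 days (an illustrative duration) would be expected to accumulate:
subs_per_year = 23
days = 90
print(f"{subs_per_year * days / 365:.1f} expected substitutions")  # ~5.7
```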

Of particular concern are mutations that might change the structure of the ‘spike protein’, which sits on the surface of the virus, giving it its characteristic crown-like shape. The virus uses this protein to attach to the ACE2 receptor on the surface of the host’s cells, allowing it entry into the cells where it hijacks their machinery to allow it to replicate and spread throughout the body. Most of the current vaccines in use or being trialled target the spike protein and there is concern that mutations may affect the efficacy of these vaccines.

UK researchers within the Cambridge-led COVID-19 Genomics UK (COG-UK) Consortium have identified a particular variant of the virus that includes important changes that appear to make it more infectious: the ΔH69/ΔV70 amino acid deletion in part of the spike protein is one of the key changes in this variant.

Although the ΔH69/ΔV70 deletion has been detected multiple times, until now scientists had not seen it emerge within an individual. However, in a study published today in Nature, Cambridge researchers document how these mutations appeared in a COVID-19 patient admitted to Addenbrooke’s Hospital, part of Cambridge University Hospitals NHS Foundation Trust.

The individual concerned was a man in his seventies who had previously been diagnosed with marginal B cell lymphoma and had recently received chemotherapy, meaning that his immune system was seriously compromised. After admission, the patient was given a number of treatments, including the antiviral drug remdesivir and convalescent plasma – that is, plasma containing antibodies taken from the blood of a patient who had successfully cleared the virus from their system. Although his condition initially stabilised, he later began to deteriorate. He was admitted to the intensive care unit and received further treatment, but later died.

During the patient’s stay, 23 viral samples were available for analysis, the majority from his nose and throat. These were sequenced as part of COG-UK. It was in these sequences that the researchers observed the virus’s genome mutating.

Between days 66 and 82, following the first two administrations of convalescent sera, the team observed a dramatic shift in the virus population, with a variant bearing the ΔH69/ΔV70 deletions, alongside a mutation in the spike protein known as D796H, becoming dominant. Although this variant initially appeared to die away, it re-emerged when the third course of remdesivir and convalescent plasma therapy was administered.

Professor Ravi Gupta from the Cambridge Institute of Therapeutic Immunology & Infectious Disease, who led the research, said: “What we were seeing was essentially a competition between different variants of the virus, and we think it was driven by the convalescent plasma therapy.

“The virus that eventually won out – which had the D796H mutation and ΔH69/ΔV70 deletions – initially gained the upper hand during convalescent plasma therapy before being overtaken by other strains, but re-emerged when the therapy was resumed. One of the mutations is in the new UK variant, though there is no suggestion that our patient was where they first arose.”

Under strictly controlled conditions, the researchers created and tested a synthetic version of the virus with the ΔH69/ΔV70 deletions and the D796H mutation, both individually and together. The combined mutations made the virus less sensitive to neutralisation by convalescent plasma, though it appears that the D796H mutation alone was responsible for the reduction in susceptibility to the antibodies in the plasma. The D796H mutation alone also led to a loss of infectivity in the absence of plasma, typical of mutations that viruses acquire in order to escape immune pressure.

The researchers found that the ΔH69/ΔV70 deletion by itself made the virus twice as infectious as the previously dominant variant. The researchers believe the role of the deletion was to compensate for the loss of infectiousness caused by the D796H mutation. This paradigm is classic for viruses, whereby escape mutations are followed or accompanied by compensatory mutations.

“Given that both vaccines and therapeutics are aimed at the spike protein, which we saw mutate in our patient, our study raises the worrying possibility that the virus could mutate to outwit our vaccines,” added Professor Gupta.

“This effect is unlikely to occur in patients with functioning immune systems, where viral diversity is likely to be lower due to better immune control. But it highlights the care we need to take when treating immunocompromised patients, where prolonged viral replication can occur, giving greater opportunity for the virus to mutate.”

The research was largely supported by Wellcome, the Medical Research Council, the National Institute for Health Research, and the Bill and Melinda Gates Foundation.

Reference:
Kemp, SA et al. ‘SARS-CoV-2 evolution during treatment of chronic infection.’ Nature (2021). DOI: 10.1038/s41586-021-03291-y



Climate Change May Have Driven the Emergence of SARS-CoV-2

Forest landscape in Yunnan province of China
source: www.cam.ac.uk

 

Global greenhouse gas emissions over the last century have made southern China a hotspot for bat-borne coronaviruses, by driving growth of forest habitat favoured by bats.

 

Governments must seize the opportunity to reduce health risks from infectious diseases by taking decisive action to mitigate climate change.

Andrea Manica

A new study published today in the journal Science of the Total Environment provides the first evidence of a mechanism by which climate change could have played a direct role in the emergence of SARS-CoV-2, the virus that caused the COVID-19 pandemic.

The study has revealed large-scale changes in the type of vegetation in the southern Chinese Yunnan province, and adjacent regions in Myanmar and Laos, over the last century. Climatic changes including increases in temperature, sunlight, and atmospheric carbon dioxide – which affect the growth of plants and trees – have changed natural habitats from tropical shrubland to tropical savannah and deciduous woodland. This created a suitable environment for many bat species that predominantly live in forests.

The number of coronaviruses in an area is closely linked to the number of different bat species present. The study found that an additional 40 bat species have moved into the southern Chinese Yunnan province in the past century, harbouring around 100 more types of bat-borne coronavirus. This ‘global hotspot’ is the region where genetic data suggests SARS-CoV-2 may have arisen.

“Climate change over the last century has made the habitat in the southern Chinese Yunnan province suitable for more bat species,” said Dr Robert Beyer, a researcher in the University of Cambridge’s Department of Zoology and first author of the study, who has recently taken up a European research fellowship at the Potsdam Institute for Climate Impact Research, Germany.

He added: “Understanding how the global distribution of bat species has shifted as a result of climate change may be an important step in reconstructing the origin of the COVID-19 outbreak.”

To get their results, the researchers created a map of the world’s vegetation as it was a century ago, using records of temperature, precipitation, and cloud cover. Then they used information on the vegetation requirements of the world’s bat species to work out the global distribution of each species in the early 1900s. Comparing this to current distributions allowed them to see how bat ‘species richness’, the number of different species, has changed across the globe over the last century due to climate change.
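In outline, that comparison reduces to stacking per-species presence/absence maps for each era and counting species in every grid cell. The sketch below shows the shape of the calculation with random stand-in data (the study derived presence from each species' vegetation requirements and climate records, not from random numbers):

```python
import numpy as np

rng = np.random.default_rng(0)
n_species, ny, nx = 100, 40, 60     # hypothetical species count and grid size

# Boolean presence/absence grids per bat species for the two eras;
# random placeholders here, climate-derived distributions in the study.
presence_1900 = rng.random((n_species, ny, nx)) < 0.3
presence_now = rng.random((n_species, ny, nx)) < 0.3

# Species richness = number of species present in each grid cell.
richness_1900 = presence_1900.sum(axis=0)
richness_now = presence_now.sum(axis=0)

# Positive cells gained bat species over the century; negative cells lost them.
richness_change = richness_now - richness_1900
print(richness_change.min(), richness_change.max())
```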

“As climate change altered habitats, species left some areas and moved into others – taking their viruses with them. This not only altered the regions where viruses are present, but most likely allowed for new interactions between animals and viruses, causing more harmful viruses to be transmitted or evolve,” said Beyer.

The world’s bat population carries around 3,000 different types of coronavirus, with each bat species harbouring an average of 2.7 coronaviruses – most without showing symptoms. An increase in the number of bat species in a particular region, driven by climate change, may increase the likelihood that a coronavirus harmful to humans is present, transmitted, or evolves there.

Most coronaviruses carried by bats cannot jump into humans. But several coronaviruses known to infect humans are very likely to have originated in bats, including three that can cause human fatalities: Middle East Respiratory Syndrome (MERS) CoV, and Severe Acute Respiratory Syndrome (SARS) CoV-1 and CoV-2.

The region identified by the study as a hotspot for a climate-driven increase in bat species richness is also home to pangolins, which are suggested to have acted as intermediate hosts to SARS-CoV-2. The virus is likely to have jumped from bats to these animals, which were then sold at a wildlife market in Wuhan – where the initial human outbreak occurred.

The researchers echo calls from previous studies that urge policy-makers to acknowledge the role of climate change in outbreaks of viral diseases, and to address climate change as part of COVID-19 economic recovery programmes.

“The COVID-19 pandemic has caused tremendous social and economic damage. Governments must seize the opportunity to reduce health risks from infectious diseases by taking decisive action to mitigate climate change,” said Professor Andrea Manica in the University of Cambridge’s Department of Zoology, who was involved in the study.

“The fact that climate change can accelerate the transmission of wildlife pathogens to humans should be an urgent wake-up call to reduce global emissions,” added Professor Camilo Mora at the University of Hawai‘i at Manoa, who initiated the project.

The researchers emphasised the need to limit the expansion of urban areas, farmland, and hunting grounds into natural habitat to reduce contact between humans and disease-carrying animals.

The study showed that over the last century, climate change has also driven increases in the number of bat species in regions around Central Africa, and scattered patches in Central and South America.

This research was supported by the European Research Council.

Reference:
Beyer, R.M. et al. ‘Shifts in global bat diversity suggest a possible role of climate change in the emergence of SARS-CoV-1 and SARS-CoV-2.’ Science of the Total Environment (2021). DOI: 10.1016/j.scitotenv.2021.145413


Hear from other University of Cambridge researchers who are investigating how to reduce the risk of animal viruses jumping to humans.



Free Online Tool Calculates Risk of COVID-19 Transmission In Poorly-Ventilated Spaces

People sitting inside a restaurant wearing face masks
source: www.cam.ac.uk

 

The vital role of ventilation in the spread of COVID-19 has been quantified by researchers, who have found that in poorly-ventilated spaces, the virus can spread further than two metres in seconds, and is far more likely to spread through prolonged talking than through coughing.

 

The tool can help people use fluid mechanics to make better choices, and adapt their day-to-day activities and surroundings in order to suppress risk, both for themselves and for others

Savvas Gkantonas

The results, reported in the journal Proceedings of the Royal Society A, show that social distancing measures alone do not provide adequate protection from the virus, and further emphasise the vital importance of ventilation and face masks in order to slow the spread of COVID-19.

The researchers, from the University of Cambridge and Imperial College London, used mathematical models to show how SARS-CoV-2 – the virus which causes COVID-19 – spreads in different indoor spaces, depending on the size, occupancy, ventilation and whether masks are being worn. These models are also the basis of a free online tool, Airborne.cam, which helps users understand how ventilation and other measures affect the risk of indoor transmission, and how that risk changes over time.

The researchers found that when two people are in a poorly-ventilated space and neither is wearing a mask, prolonged talking is far more likely to spread the virus than a short cough. When speaking, we exhale smaller droplets, or aerosols, which spread easily around a room, and accumulate if ventilation is not adequate. In contrast, coughing expels more large droplets, which are more likely to settle on surfaces after they are emitted.

It only takes a matter of seconds for aerosols to spread over two metres when masks are not worn, implying that physical distancing in the absence of ventilation is not sufficient to provide safety for long exposure times. When masks of any kind are worn, however, they slow the breath’s momentum and filter a portion of the exhaled droplets, in turn reducing the amount of virus in aerosols that can spread through the space.

The scientific consensus is that the vast majority of COVID-19 cases are spread through indoor transmission – whether via aerosols or droplets. And as was predicted in the summer and autumn, now that winter has arrived in the northern hemisphere and people are spending more time indoors, there has been a corresponding rise in the number of COVID-19 cases.

“Our knowledge of airborne transmission of SARS-CoV-2 has evolved at an incredible pace, when you consider that it’s been just a year since the virus was identified,” said Dr Pedro de Oliveira from Cambridge’s Department of Engineering, and the paper’s first author. “There are different ways to approach this problem. In our work, we consider the wide range of respiratory droplets humans exhale to demonstrate different scenarios of airborne viral transmission – the first being the quick spread of small infectious droplets over several metres in a matter of a few seconds, which can happen both indoors and outdoors. Then, we show how these small droplets can accumulate in indoor spaces in the long term, and how this can be mitigated with adequate ventilation.”

The researchers used mathematical models to calculate the amount of virus contained in exhaled particles, and to determine how these evaporate and settle on surfaces. In addition, they used characteristics of the virus, such as its decay rate and viral load in infected individuals, to estimate the risk of transmission in an indoor setting due to normal speech or a short cough by an infectious person. For instance, they show that the infection risk after speaking for one hour in a typical lecture room was high, but the risk could be decreased significantly with adequate ventilation.
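The paper itself models the evaporation and settling of individual respiratory droplets, but the accumulation effect described above can be illustrated with a textbook well-mixed-room balance, in which aerosol concentration rises with emission and is removed by ventilation, viral decay and settling. The sketch below is that generic balance, not the authors' model, and every parameter value is an assumption for demonstration only:

```python
# Well-mixed-room sketch (NOT the Airborne.cam model): aerosol concentration
# C follows dC/dt = E_eff / V - (ventilation + decay + settling) * C.
V = 150.0     # room volume, m^3 (assumed)
ach = 3.0     # air changes per hour from ventilation (assumed)
decay = 0.6   # viral decay rate, 1/h (assumed)
settle = 0.3  # gravitational settling rate, 1/h (assumed)
E = 50.0      # aerosol emission rate while talking, arbitrary units/h (assumed)
mask = 0.5    # fraction of exhaled aerosol filtered by a mask (assumed)

loss = ach + decay + settle
dt, hours = 1.0 / 60.0, 2.0
C, dose = 0.0, 0.0
for _ in range(int(hours / dt)):
    C += ((1 - mask) * E / V - loss * C) * dt  # Euler step for concentration
    dose += C * dt                             # exposure ~ time-integral of C

print(f"concentration after {hours} h: {C:.3f}, cumulative exposure: {dose:.2f}")
```

Raising the ventilation rate in this toy balance lowers both the steady-state concentration and the accumulated exposure, which is the intuition behind the tool's emphasis on air changes.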

Based on their models, the researchers have now built Airborne.cam, a free, open-source tool which can be used by those managing public spaces, such as shops, workplaces and classrooms, to determine whether ventilation is adequate. The tool is already in use in several academic departments at the University of Cambridge, and is now a requirement for any higher-risk spaces there, enabling departments to easily identify hazards and any changes to control measures needed to ensure aerosols do not become a risk to health.

“The tool can help people use fluid mechanics to make better choices, and adapt their day-to-day activities and surroundings in order to suppress risk, both for themselves and for others,” said co-author Savvas Gkantonas, who led the development of the app with Dr de Oliveira.

“We’re looking at all sides of aerosol and droplet transmission to understand, for example, the fluid mechanics involved in coughing and speaking,” said senior author Professor Epaminondas Mastorakos, also from the Department of Engineering. “The role of turbulence and how it affects which droplets settle by gravity and which remain afloat in the air is, in particular, not well understood. We hope these and other new results will be implemented as safety factors in the app as we continue to investigate.”

The continuing development of Airborne.cam, which will soon be available for mobile platforms, is supported in part by Cambridge Enterprise and Churchill College.

 

Reference:
P M de Oliveira et al. ‘Evolution of spray and aerosol from respiratory releases: theoretical estimates for insight on viral transmission.’ Proceedings of the Royal Society A (2021). DOI: 10.1098/rspa.2020.0584



New Starfish-Like Fossil Reveals Evolution In Action

source: www.cam.ac.uk

 

Researchers from the University of Cambridge have discovered a fossil of the earliest starfish-like animal, which helps us understand the origins of the nimble-armed creature.

 

If you went back in time and put your head under the sea in the Ordovician then you wouldn’t recognize any of the marine organisms – except the starfish, they are one of the first modern animals

Aaron Hunter

The prototype starfish, which has features in common with both sea lilies and modern-day starfish, is a missing link for scientists trying to piece together its early evolutionary history.

The exceptionally preserved fossil, named Cantabrigiaster fezouataensis, was discovered in Morocco’s Anti-Atlas mountain range. Its intricate design – with feathery arms akin to lacework – has been frozen in time for roughly 480 million years.

The new species is unusual because it doesn’t have many of the key features of its contemporary relatives, lacking roughly 60% of a modern starfish’s body plan.

The fossil’s features are instead a hybrid between those of a starfish and a sea lily or crinoid – not a plant but a wavy-armed filter feeder which fixes itself to the seabed via a cylindrical ‘stem’.

The discovery, reported in Biology Letters, captures the early evolutionary steps of the animal at a time in Earth’s history when life suddenly expanded, a period known as the Ordovician Biodiversification Event.

The find also gives scientists a template for working out how the animal evolved from this more basic form to the complexity of its modern relatives.

“Finding this missing link to their ancestors is incredibly exciting. If you went back in time and put your head under the sea in the Ordovician then you wouldn’t recognize any of the marine organisms – except the starfish, they are one of the first modern animals,” said lead author Dr Aaron Hunter, a visiting postdoctoral researcher in the Department of Earth Sciences.

Modern starfish and brittle stars are part of a family of spiny-skinned animals called the echinoderms, which, although they don’t have a backbone, are one of the closest groups of animals to vertebrates. Crinoids, and otherworldly creatures such as sea urchins and sea cucumbers, are all echinoderms.

The origin of starfish has eluded scientists for decades. But the new species is so well preserved that its body can finally be mapped in detail and its evolution understood. “The level of detail in the fossil is amazing – its structure is so complex that it took us a while to unravel its significance,” said Hunter.

It was Hunter’s work on both living and fossil echinoderms that helped him spot its hybrid features. “I was looking at a modern crinoid in one of the collections at the Western Australian Museum and I realised the arms looked really familiar, they reminded me of this unusual fossil that I had found years earlier in Morocco but had found difficult to work with,” he said.

Fezouata in Morocco is something of a holy grail for palaeontologists – the new fossil is just one of the many remarkably well preserved soft-bodied animals uncovered from the site.

Hunter and co-author Dr Javier Ortega-Hernández, who was previously based at Cambridge’s Department of Zoology and is now based at Harvard University, named the species Cantabrigiaster in honour of the long history of echinoderm research at their respective institutions.

Hunter and Ortega-Hernández examined their new species alongside a catalogue of hundreds of starfish-like animals. They indexed all of their body structures and features, building a road map of the echinoderm skeleton which they could use to assess how Cantabrigiaster was related to other family members.

Modern echinoderms come in many shapes and sizes, so it can be difficult to work out how they are related to one another. The new analysis, based on the extraxial-axial theory – a biological model usually applied only to living species – meant that Hunter and Ortega-Hernández could identify similarities and differences between the body plans of modern echinoderms and then figure out how each family member was linked to its Cambrian ancestors.
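Conceptually, such an analysis rests on a character matrix: each taxon is scored for the presence or absence of skeletal features, and shared features indicate how the animals are related. The sketch below illustrates the idea with invented characters and scores, not the authors' actual matrix:

```python
# Hypothetical character matrix: rows are taxa, columns are skeletal
# features (1 = present, 0 = absent). Characters are invented examples.
characters = ["food_groove", "stem", "extraxial_plates", "flexible_arms"]
matrix = {
    "Cantabrigiaster": [1, 0, 0, 1],
    "modern_starfish": [1, 0, 1, 1],
    "crinoid":         [1, 1, 1, 0],
}

def similarity(a, b):
    """Fraction of characters on which two taxa agree (simple matching)."""
    return sum(x == y for x, y in zip(matrix[a], matrix[b])) / len(characters)

print(similarity("Cantabrigiaster", "modern_starfish"))
print(similarity("Cantabrigiaster", "crinoid"))
```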

They found that only the key, or axial, part of the body – the food groove, which funnels food along each of the starfish’s arms – was present in Cantabrigiaster. Everything outside this, the extraxial body parts, was added later.

The authors plan to expand their work in search of early echinoderms. “One thing we hope to answer in the future is why starfish developed their five arms,” said Hunter. “It seems to be a stable shape for them to adopt – but we don’t yet know why. We still need to keep searching for the fossil that gives us that particular connection, but by going right back to the early ancestors like Cantabrigiaster we are getting closer to that answer.”

Reference:
Aaron W. Hunter and Javier Ortega-Hernández. ‘A new somasteroid from the Fezouata Lagerstätte in Morocco and the Early Ordovician origin of Asterozoa.’ Biology Letters (2021).



Male Butterflies Mark Their Mates With a Repulsive Smell During Sex To ‘Turn Off’ Other Suitors

Two butterflies mating in captivity. Heliconius cydno (left) and Heliconius melpomene (right).
source: www.cam.ac.uk

 

Butterflies have evolved to produce a strongly scented chemical in their genitals, which they leave behind after sex to deter other males from pursuing their mates.

 

The males want to pass their genes onto the next generation, and they don’t want the females to have babies with other fathers so they use this scent to make them unsexy.

Chris Jiggins

Led by Professor Chris Jiggins in the University of Cambridge’s Department of Zoology, the team mapped production of the scented chemical compound to the genome of a species of butterfly called Heliconius melpomene, and discovered a new gene. They also discovered that the chemical, made in the sex glands of the males, is identical to a chemical produced by flowers to attract butterflies. The study, published today in the journal PLOS Biology, shows that butterflies and flowers independently evolved to make the same chemical for different purposes.

Dr Kathy Darragh, lead author of the paper and previously a member of Jiggins’ research group, said: “We identified the gene responsible for producing this powerful anti-aphrodisiac pheromone called ocimene in the genitals of male butterflies. This shows that the evolution of ocimene production in male butterflies is independent of the evolution of ocimene production in plants.

“For a long time it was thought insects took the chemical compounds from plants and then used them, but we have shown butterflies can make the chemicals themselves – but with very different intentions. Male butterflies use it to repulse competitors and flowers use the same smell to entice butterflies for pollination.”

There are around 20,000 species of butterfly worldwide. Some live for only a month, but the Heliconius melpomene butterflies from Panama that were studied live for around six months. The females typically have few sexual partners and, after a single mating, store the sperm and use it to fertilise their eggs over a number of months.

Male butterflies mate with as many females as they can, and each time they transfer the anti-aphrodisiac chemical, because each wants to be the one to fertilise the offspring. This chemical, however, is not produced by all Heliconius butterflies. Whilst Heliconius melpomene does produce ocimene, another closely related species that was analysed – Heliconius cydno – does not produce the strong-smelling pheromone.

If the smell has such a powerful effect, how do the butterflies know when to be attracted or when to steer clear?

Darragh, now based at the University of California, Davis, explained: “The visual cues the butterflies get will be important – when the scent is detected in the presence of flowers it will be attractive but when it is found on another butterfly it is repulsive to the males – context is key.”

This new analysis of the power of smell – also called chemical signalling – sheds new light on the importance of scent as a form of communication.

Jiggins said: “The butterflies presumably adapted to detect this chemical to find flowers, and then evolved to use it in this very different way. The males want to pass their genes onto the next generation and they don’t want the females to have babies with other fathers, so they use this scent to make them unsexy.

“Male butterflies pester the females a lot so it might benefit the females too if the smell left behind means they stop being bothered for sex after they have already mated.”

Reference:
Darragh, K. et al. ‘A novel terpene synthase controls differences in anti-aphrodisiac pheromone production between closely related Heliconius butterflies.’ PLOS Biology (2021). DOI: 10.1371/journal.pbio.3001022

Adapted from a press release by St John’s College, Cambridge



Successive Governments’ Approaches to Obesity Policies Have Destined Them To Fail, Say Researchers

Silhouettes of three women running
source: www.cam.ac.uk

 

Government obesity policies in England over the past three decades have largely failed because of problems with implementation, lack of learning from past successes or failures, and a reliance on trying to persuade individuals to change their behaviour rather than tackling unhealthy environments.

 

In almost 30 years, successive UK governments have proposed hundreds of wide-ranging policies to tackle obesity in England, but these are yet to have an impact on levels of obesity or reduce inequality

Dolly Theis

This is the conclusion of new research by a team at the University of Cambridge funded by the NIHR School for Public Health Research.

The researchers say their findings may help to explain why, after nearly thirty years of government obesity policies, obesity prevalence in England has not fallen and substantial inequalities persist. According to a report by NHS Digital in May 2020, 67% of men and 60% of women live with overweight or obesity, including 26% of men and 29% of women who suffer clinical obesity. More than a quarter of children aged two to 15 years live with obesity or overweight and the gap between the least and most deprived children is growing.

Successive governments have tried to tackle the obesity problem: in research published today in The Milbank Quarterly, Dolly Theis and Martin White in the Centre for Diet and Activity Research (CEDAR) at the University of Cambridge identified 14 government-led obesity strategies in England from 1992 to 2020. They analysed these strategies – which contained 689 wide-ranging policies – to determine whether they have been fit for purpose in terms of their strategic focus, content, basis in theory and evidence, and implementation viability.

Seven of the strategies were broad public health strategies containing obesity as well as non-obesity policies such as on tobacco smoking and food safety. The other seven contained only obesity-related policies, such as on diet and/or physical activity. Twelve of the fourteen strategies contained obesity reduction targets. However, only five of these were specific, numerical targets rather than statements such as ‘aim to reduce obesity’.

Theis said: “In almost 30 years, successive UK governments have proposed hundreds of wide-ranging policies to tackle obesity in England, but these are yet to have an impact on levels of obesity or reduce inequality. Many of these policies have largely been flawed from the outset and proposed in ways that make them difficult to implement. What’s more, there’s been a fairly consistent failure to learn from past mistakes. Governments appear more likely to publish another strategy containing the same, recycled policies than to implement policies already proposed.

“If we were to produce a report card, overall we might only give them 4 out of 10: could do much better.”

Theis and White identified seven criteria necessary for effective implementation, but found that only 8% of policies fulfilled all seven criteria, while the largest proportion of policies (29%) did not fulfil a single one of the criteria. Fewer than a quarter (24%) included a monitoring or evaluation plan, just 19% cited any supporting scientific evidence, and less than one in ten (9%) included details of likely costs or an allocated budget.
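Percentages like these come from scoring every policy against each of the seven criteria and tallying the results. The sketch below shows the shape of that tally; the criterion labels and the random scores are placeholders, not the study's data:

```python
import random

random.seed(1)
# Illustrative criterion labels; the paper's seven criteria may differ.
CRITERIA = ["target", "timeframe", "responsible_body", "monitoring",
            "evidence", "cost", "theory_of_change"]

# Stand-in for the 689 coded policies: one boolean per criterion fulfilled.
policies = [{c: random.random() < 0.3 for c in CRITERIA} for _ in range(689)]

share_all = sum(all(p.values()) for p in policies) / len(policies)
share_none = sum(not any(p.values()) for p in policies) / len(policies)
share_monitoring = sum(p["monitoring"] for p in policies) / len(policies)
print(f"all criteria: {share_all:.0%}, none: {share_none:.0%}, "
      f"monitoring plan: {share_monitoring:.0%}")
```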

The lack of such basic information as the cost of implementing policies was highlighted in a recent National Audit Office report on the UK Government’s approach to tackling childhood obesity in England, which found that the Department of Health and Social Care did not know how much central government spent tackling childhood obesity.

“No matter how well-intended and evidence-informed a policy, if it is nebulously proposed without a clear plan or targets it makes implementation difficult and it is unlikely the policy will be deemed successful,” added Theis. “One might legitimately ask, what is the purpose of proposing policies at all if they are unlikely to be implemented?”

Thirteen of the 14 strategies explicitly recognised the need to reduce health inequality, including one strategy that was fully focused on reducing inequality in health. Yet the researchers say that only 19% of policies proposed were likely to be effective in reducing inequalities because of the measures proposed.

UK governments have to date largely favoured a less interventionist approach to reducing obesity, regardless of political party, prioritising provision of information to the public in their obesity strategies, rather than more directly shaping the choices available to individuals in their living environments through regulation or taxes. The researchers say that governments may have avoided a more deterrence-based, interventionist approach for fear of being perceived as ‘nannying’ – or because they lacked knowledge about what more interventionist measures are likely to be effective.

There is, however, evidence to suggest that policymaking is changing. Even though the current UK government still favours a less interventionist approach, more recent strategies have contained some fiscal and regulatory policies, such as banning price promotions of unhealthy products, banning unhealthy food advertisements and the Soft Drinks Industry Levy. This may be because the government has come under increasing pressure and recognises that previous approaches have not been effective, that more interventionist approaches are increasingly acceptable to the public, and because evidence to support regulatory approaches is mounting.

The researchers found little attempt to evaluate the strategies and build on their successes and failures. As a result, many policies proposed were similar or identical over multiple years, often with no reference to their presence in a previous strategy. Only one strategy (Saving Lives, published in 1999) commissioned a formal independent evaluation of the previous government’s strategy.

“Until recently, there seems to have been an aversion to conducting high quality, independent evaluations, perhaps because they risk demonstrating failure as well as success,” added White. “But this limits a government’s ability to learn lessons from past policies. This may be potentially compounded by the often relatively short timescales for putting together a strategy or implementing policies.

“Governments need to accompany policy proposals with information that ensures they can be successfully implemented, and with built-in evaluation plans and time frames. Important progress has been made with commissioning evaluations in the last three years. But we also need to see policies framed in ways that make them readily implementable. We also need to see a continued move away from interventions that rely on individuals changing their diet and activity, and towards policies that change the environments that encourage people to overeat and to be sedentary in the first place.”

Living with obesity or excess weight is associated with long-term physical, psychological and social problems. Related health problems, such as type-2 diabetes, cardiovascular disease and cancers, are estimated to cost NHS England at least £6.1 billion per year and the overall cost of obesity to wider society in England is estimated to be £27 billion per year. The COVID-19 pandemic has brought to light additional risks for people living with obesity, such as an increased risk of hospitalisation and more serious disease.

The research was funded by the NIHR School for Public Health Research, with additional support by the British Heart Foundation, Cancer Research UK, Economic & Social Research Council, Medical Research Council, and Wellcome Trust.

Reference:
Dolly R Z Theis, Martin White. ‘Is obesity policy in England fit for purpose? Analysis of government strategies and policies, 1992–2020.’ Milbank Quarterly (2021). DOI: 10.1111/1468-0009.12498



Low-Carbon Policies Can Be ‘Balanced’ To Benefit Small Firms and Average Households – Study

source: www.cam.ac.uk

 

A review of ten types of policy used to reduce carbon suggests that some costs fall on those less able to bear them – but it also shows these policies can form the bedrock of a ‘green recovery’ if specifically designed and used in tandem.

 

Unless low-carbon policies are fair, affordable and economically competitive, they will struggle to secure public support

Cristina Peñasco

Some of the low-carbon policy options currently used by governments may be detrimental to households and small businesses less able to manage added short-term costs from energy price hikes, according to a new study.

However, it also suggests that this menu of decarbonising policies, from quotas to feed-in tariffs, can be designed and balanced to benefit local firms and lower-income families – vital for achieving ‘Net Zero’ carbon and a green recovery.

University of Cambridge researchers combed through thousands of studies to create the most comprehensive analysis to date of widely used types of low-carbon policy, and compared how they perform in areas such as cost and competitiveness.

The findings are published today in the journal Nature Climate Change. The researchers also poured all their data into an interactive online tool that allows users to explore evidence around carbon-reduction policies from across the globe.

“Preventing climate change cannot be the only goal of decarbonisation policies,” said study lead author Dr Cristina Peñasco, a public policy expert from the University of Cambridge.

“Unless low-carbon policies are fair, affordable and economically competitive, they will struggle to secure public support – and further delays in decarbonisation could be disastrous for the planet.”

Around 7,000 published studies were whittled down to over 700 individual findings. These results were coded to allow comparison – with over half the studies analysed “blind” by different researchers to avoid bias.

The ten policy “instruments” covered in the study include forms of investment – targeted R&D funding, for example – as well as financial incentives including different kinds of subsidies, taxes, and the auctioning of energy contracts.

The policies also include market interventions – e.g. emissions permits; tradable certificates for clean or saved energy – and efficiency standards, such as those for buildings.

Researchers looked at whether each policy type had a positive or negative effect in various environmental, industrial and socio-economic areas.

When it came to “distributional consequences” – the fairness with which the costs and benefits are spread – the mass of evidence suggests that the impacts of five of the ten policy types are far more negative than positive.

“Small firms and average households have less capacity to absorb increases in energy costs,” said co-author Laura Diaz Anadon, Professor of Climate Change Policy.

“Some of the investment and regulatory policies made it harder for small and medium-size firms to participate in new opportunities or adjust to changes.

“If policies are not well designed and vulnerable households and businesses experience them negatively, it could increase public resistance to change – a major obstacle in reaching net zero carbon,” said Anadon.

For example, feed-in tariffs pay renewable electricity producers above market rates. But these costs may push up energy prices for all if they get passed on to households – leaving the less well-off spending a larger portion of their income on energy.

Renewable electricity traded as ‘green certificates’ can redistribute wealth from consumers to energy companies – with 83% of the available evidence suggesting they have a “negative impact”, along with 63% of the evidence for energy taxes, which can disproportionately affect rural areas.
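Shares like the 83% and 63% above come from aggregating the coded findings by policy instrument and by the sign of the evidence. A minimal sketch of that aggregation, using a handful of invented findings, looks like this:

```python
from collections import Counter

# Invented coded findings: (policy_instrument, sign_of_evidence).
findings = [
    ("green_certificates", "negative"), ("green_certificates", "negative"),
    ("green_certificates", "positive"), ("energy_tax", "negative"),
    ("energy_tax", "positive"), ("feed_in_tariff", "positive"),
]

counts = Counter(findings)
for inst in sorted({inst for inst, _ in counts}):
    neg = counts[(inst, "negative")]
    total = neg + counts[(inst, "positive")]
    print(f"{inst}: {neg / total:.0%} of evidence negative ({total} findings)")
```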

However, the vast tranche of data assembled by the researchers reveals how many of these policies can be designed and aligned to complement each other, boost innovation, and pave the way for a fairer transition to zero carbon.

For example, tailoring feed-in tariffs (FiTs) to be “predictable yet adjustable” can benefit smaller and more dispersed clean energy projects – improving market competitiveness and helping to mitigate local NIMBY-ism.

Moreover, revenues from environmental taxes could go towards social benefits or tax credits – for example, reducing corporate tax for small firms and lowering income taxes – providing what researchers call a “double dividend”: stimulating economies while reducing emissions.

The researchers argue that creating a “balance” of well-designed and complementary policies can benefit different renewable energy producers and “clean” technologies at various stages.

Government funding for research and development (R&D) that targets small firms can help attract other funding streams – boosting both eco-innovation and competitiveness. When combined with R&D tax credits, it predominantly supports innovation in startups rather than corporations.

Government procurement, using tiered contracts and bidding, can also improve innovation and market access for smaller businesses in “economically stressed” areas. This could aid the “levelling up” between richer and poorer regions as part of any green recovery.

“There is no one-size-fits-all solution,” said Peñasco. “Policymakers should deploy incentives for innovation, such as targeted R&D funding, while also adapting tariffs and quotas to benefit those across income distributions.

“We need to spur the development of green technology at the same time as achieving public buy-in for the energy transition that must start now to prevent catastrophic global heating,” she said.

Peñasco and Anadon contributed to the recent report from Cambridge Zero – the University’s climate change initiative. In it, they argue for piloting a UK government research programme akin to ARPA in the US, but focused on new net-zero technologies.

Prof Laura Diaz Anadon is Director of Cambridge’s Centre for Environment, Energy and Natural Resource Governance (C-EENRG). The review was also co-authored by Prof Elena Verdolini from the RFF-CMCC European Institute on Economics and the Environment (EIEE), part of the Euro-Mediterranean Centre on Climate Change, and the University of Brescia. Anadon and Verdolini lead part of the EU project INNOPATHS, which funded the research.



Likelihood of Severe and ‘Long’ COVID May Be Established Very Early on Following Infection

source: www.cam.ac.uk

New research provides important insights into the role played by the immune system in preventing – and in some cases increasing the severity of – COVID-19 symptoms in patients. It also finds clues to why some people experience ‘long COVID’.

 

Our evidence suggests that the journey to severe COVID-19 may be established immediately after infection, or at the latest around the time that they begin to show symptoms

Paul Lyons

Among the key findings, which have not yet been peer-reviewed, are:

  • Individuals who have asymptomatic or mild disease show a robust immune response early on during infection.
  • Patients requiring admission to hospital have impaired immune responses and systemic inflammation (that is, chronic inflammation that may affect several organs) from the time of symptom onset.
  • Persistent abnormalities in immune cells and a change in the body’s inflammatory response may contribute to ‘long COVID’.

The immune response associated with COVID-19 is complex. Most people who get infected by SARS-CoV-2 mount a successful antiviral response, resulting in few if any symptoms. In a minority of patients, however, there is evidence that the immune system over-reacts, leading to a flood of immune cells (a ‘cytokine storm’) and to chronic inflammation and damage to multiple organs, often resulting in death.

To better understand the relationship between the immune response and COVID-19 symptoms, scientists at the University of Cambridge and Addenbrooke’s Hospital, Cambridge University Hospitals NHS Foundation Trust, have been recruiting individuals who test positive for SARS-CoV-2 to the COVID-19 cohort of the NIHR BioResource. These individuals range from asymptomatic healthcare workers in whom the virus was detected on routine screening, through to patients requiring assisted ventilation. The team take blood samples from patients over several months, as well as continuing to measure their symptoms.

In research published today, the team analysed samples from 207 COVID-19 patients with a range of disease severities, taken at regular intervals over three months following the onset of symptoms, and compared them against samples taken from 45 healthy controls.

Because of the urgent need to share information relating to the pandemic, the researchers have published their report on medRxiv. It has not yet been peer-reviewed.

Professor Ken Smith, senior co-author and Director of the Cambridge Institute of Therapeutic Immunology & Infectious Disease (CITIID), said: “The NIHR BioResource has allowed us to address two important questions regarding SARS-CoV-2. Firstly, how does the very early immune response in patients who recovered from disease with few or no symptoms, compare with those who experienced severe disease? And secondly, for those patients who experience severe disease, how rapidly does their immune system recover and how might this relate to ‘long COVID’?”

Listen to Professor Ken Smith discuss the findings with the Naked Scientists

The team found evidence of an early, robust adaptive immune response in those infected individuals whose disease was asymptomatic or mildly symptomatic. An adaptive immune response is one in which the immune system identifies an infection and then produces T cells, B cells and antibodies specific to the virus to fight back. These individuals produced the immune components in larger numbers than patients with more severe COVID-19, and did so within the first week of infection – after which the numbers rapidly returned to normal. There was no evidence in these individuals of the systemic inflammation that can lead to damage in multiple organs.

In patients requiring admission to hospital, the early adaptive immune response was delayed, and profound abnormalities in a number of white cell subsets were present. Also present in the first blood sample taken from these patients was evidence of increased inflammation, something not seen in those with asymptomatic or mild disease. This suggests that an abnormal inflammatory component to the immune response is present even around the time of diagnosis in individuals who progress to severe disease.

The team found that key molecular signatures produced in response to inflammation were present in patients admitted to hospital. They say that these signatures could potentially be used to predict the severity of a patient’s disease, as well as correlating with their risk of COVID-19 associated death.

Dr Paul Lyons, senior co-author, also from CITIID, said: “Our evidence suggests that the journey to severe COVID-19 may be established immediately after infection, or at the latest around the time that they begin to show symptoms. This finding could have major implications as to how the disease needs to be managed, as it suggests we need to begin treatment to stop the immune system causing damage very early on, and perhaps even pre-emptively in high risk groups screened and diagnosed before symptoms develop.”

The researchers found no evidence of a relationship between viral load and progression to inflammatory disease. However, once inflammatory disease was established, viral load was associated with subsequent outcome.

The study also provides clues to the biology underlying cases of ‘long COVID’ – where patients report experiencing symptoms of the disease, including fatigue, for several months after infection, even when they no longer test positive for SARS-CoV-2.

The team found that profound alterations in many immune cell types often persisted for weeks or even months after SARS-CoV-2 infection, and these problems resolved themselves very differently depending on the type of immune cell. Some recover as systemic inflammation itself resolves, while others recover even in the face of persistent systemic inflammation. However, some cell populations remain markedly abnormal, or show only limited recovery, even after systemic inflammation has resolved and patients have been discharged from hospital.

Dr Laura Bergamaschi, the study’s first author, said: “It’s these populations of immune cells that still show abnormalities even when everything else seems to have resolved itself that might be of importance in long COVID. For some cell types, it may be that they are just slow to regenerate, but for others, including some types of T and B cells, it appears something is continuing to drive their activity. The more we understand about this, the more likely we will be able to better treat patients whose lives continue to be blighted by the after-effects of COVID-19.”

Professor John Bradley, Chief Investigator of the NIHR BioResource, said: “The NIHR BioResource is a unique resource made possible by the strong links that exist in the UK between doctors and scientists in the NHS and at our universities. It’s because of collaborations such as this that we have learnt so much in such a short time about SARS-CoV-2.”

The research was supported by CVC Capital Partners, the Evelyn Trust, UK Research & Innovation COVID Immunology Consortium, Addenbrooke’s Charitable Trust, the NIHR Cambridge Biomedical Research Centre and Wellcome.

Reference
Bergamaschi, L et al. Early immune pathology and persistent dysregulation characterise severe COVID-19. medRxiv; 15 Jan 2021; DOI: 10.1101/2021.01.11.20248765



DNA Test Can Quickly Identify Pneumonia in Patients With Severe COVID-19, Aiding Faster Treatment

Doctor checks on patient connected to a ventilator
source: www.cam.ac.uk

 

Researchers have developed a DNA test to quickly identify secondary infections in COVID-19 patients, who have twice the risk of developing pneumonia while on ventilation compared with non-COVID-19 patients.

 

Using this test, we found that patients with COVID-19 were twice as likely to develop secondary pneumonia as other patients in the same intensive care unit

Andrew Conway Morris

For patients with the most severe forms of COVID-19, mechanical ventilation is often the only way to keep them alive, as doctors use anti-inflammatory therapies to treat their inflamed lungs. However, these patients are susceptible to further infections from bacteria and fungi that they may acquire while in hospital – so-called ‘ventilator-associated pneumonia’.

Now, a team of scientists and doctors at the University of Cambridge and Cambridge University Hospitals NHS Foundation Trust, led by Professor Gordon Dougan, Dr Vilas Navapurkar and Dr Andrew Conway Morris, have developed a simple DNA test to quickly identify these infections and target antibiotic treatment as needed.

The test, developed at Addenbrooke’s Hospital in collaboration with Public Health England, gives doctors the information they need to start treatment within hours rather than days, fine-tuning treatment as required and reducing the inappropriate use of antibiotics. This approach, based on higher-throughput DNA testing, is being rolled out at Cambridge University Hospitals and offers a route towards better treatments for infection more generally. The results are reported in the journal Critical Care.

Patients who need mechanical ventilation are at significant risk of developing secondary pneumonia while they are in intensive care. These infections are often caused by antibiotic-resistant bacteria, are hard to diagnose, and need targeted treatment.

“Early on in the pandemic we noticed that COVID-19 patients appeared to be particularly at risk of developing secondary pneumonia, and started using a rapid diagnostic test that we had developed for just such a situation,” said co-author Dr Andrew Conway Morris from Cambridge’s Department of Medicine and an intensive care consultant. “Using this test, we found that patients with COVID-19 were twice as likely to develop secondary pneumonia as other patients in the same intensive care unit.”

COVID-19 patients are thought to be at increased risk of infection for several reasons. Because of the extent of their lung damage, severe COVID-19 cases tend to spend more time on a ventilator than patients without COVID-19. In addition, many of these patients have a poorly regulated immune system, in which immune cells damage the organs but also have impaired anti-microbial functions, increasing the risk of infection.

Normally, confirming a pneumonia diagnosis is challenging, as bacterial samples from patients need to be cultured and grown in a lab, which is time-consuming. The Cambridge test takes an alternative approach by detecting the DNA of different pathogens, which allows for faster and more accurate testing.

The test uses multiplex polymerase chain reaction (PCR), which detects the DNA of the bacteria and can be done in around four hours, meaning there is no need to wait for the bacteria to grow. “Often, patients have already started to receive antibiotics before the bacteria have had time to grow in the lab,” said Morris. “This means that results from cultures are often negative, whereas PCR doesn’t need viable bacteria to detect the pathogens – making this a more accurate test.”

The test – which was developed with Dr Martin Curran, a specialist in PCR diagnostics from Public Health England’s Cambridge laboratory – runs multiple PCR reactions in parallel, and can simultaneously pick up 52 different pathogens, which often infect the lungs of patients in intensive care. At the same time, it can also test for antibiotic resistance.

“We found that although patients with COVID-19 were more likely to develop secondary pneumonia, the bacteria that caused these infections were similar to those in ICU patients without COVID-19,” said lead author Mailis Maes, also from the Department of Medicine. “This means that standard antibiotic protocols can be applied to COVID-19 patients.”

This is one of the first times that this technology has been used in routine clinical practice, and it has now been approved for use by the hospital. The researchers anticipate that similar approaches would benefit patients if used more broadly.

This study was funded by the National Institute for Health Research Cambridge Biomedical Research Centre.

 

Reference:
Mailis Maes et al. ‘Ventilator-associated pneumonia in critically ill patients with COVID-19.’ Critical Care (2021). DOI: 10.1186/s13054-021-03460-5



Following the Hops of Disordered Proteins Could Lead To Future Treatments of Alzheimer’s Disease

Beta-Amyloid Plaques and Tau in the Brain
source: www.cam.ac.uk

Study shows how to determine the elusive motions of proteins that remain disordered.

 

The constant motion of amyloid-beta is one of the reasons it’s been so difficult to target – it’s almost like trying to catch smoke in your hands

Michele Vendruscolo

Researchers from the University of Cambridge, Google Research and the University of Milan have used machine learning techniques to predict how proteins, particularly those implicated in neurological diseases, completely change their shapes in a matter of microseconds.

They found that when amyloid-beta, a key protein implicated in Alzheimer’s disease, adopts a collection of disordered shapes, it actually becomes less likely to stick together and form the toxic clusters which lead to the death of brain cells.

The results, reported in the journal Nature Computational Science, could aid in the future development of treatments for diseases involving disordered proteins, such as Alzheimer’s disease and Parkinson’s disease.

“We are used to thinking of proteins as molecules that fold into well-defined structures: finding out how this process happens has been a major research focus over the last 50 years,” said Professor Michele Vendruscolo from Cambridge’s Centre for Misfolding Diseases, who led the research. “However, about a third of the proteins in our body do not fold, and instead remain in disordered shapes, sort of like noodles in a soup.”

We do not know much about the behaviour of these disordered proteins, since traditional methods tend to address the problem of determining static structures, not structures in motion. The approach developed by the researchers harnesses the power of Google’s cloud computing infrastructure to generate large numbers of short trajectories. “Extensive computer simulations allow us to capture the molecular-level motions of thousands of copies of a protein in parallel, and play them back like a movie,” said co-author Dr Kai Kohlhoff from Google Research.

The most common types of motions show up multiple times in these movies, making it possible to define the frequencies by which disordered proteins jump between different states.

“By counting these motions, we can predict which states the protein occupies and how quickly it transitions between them,” said first author Thomas Löhr from Cambridge’s Yusuf Hamied Department of Chemistry.
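
To make this counting idea concrete, here is a minimal sketch of the bookkeeping involved, assuming each simulated trajectory has already been discretised into a sequence of state labels. The trajectories, state count and lag below are invented for illustration and are not the study’s data or code:

```python
import numpy as np

def transition_counts(trajectories, n_states, lag=1):
    """Count observed hops between discrete states separated by `lag` frames."""
    counts = np.zeros((n_states, n_states))
    for traj in trajectories:
        for a, b in zip(traj[:-lag], traj[lag:]):
            counts[a, b] += 1
    return counts

# Invented example: three short trajectories over four discrete states.
trajs = [np.array([0, 0, 1, 2, 1, 1, 3]),
         np.array([2, 2, 1, 0, 0, 1, 1]),
         np.array([3, 1, 1, 2, 2, 0, 0])]

C = transition_counts(trajs, n_states=4)
T = C / C.sum(axis=1, keepdims=True)           # row-normalise: transition probabilities
occupancy = np.linalg.matrix_power(T, 500)[0]  # approximate long-run state populations
print(T)
print(occupancy)
```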

The researchers focused their attention on the amyloid-beta peptide, a protein fragment associated with Alzheimer’s disease, which aggregates to form amyloid plaques in the brains of affected individuals. They found that amyloid-beta hops between widely different states millions of times per second without ever stopping in any particular state. This is the hallmark of disorder, and the main reason why amyloid-beta has so far been deemed ‘undruggable’.

“The constant motion of amyloid-beta is one of the reasons it’s been so difficult to target – it’s almost like trying to catch smoke in your hands,” said Vendruscolo.

However, by studying a variant of amyloid-beta, in which one of the amino acids is modified by oxidation, the researchers obtained a glimpse of how to make it resistant to aggregation. They found that oxidised amyloid-beta changes shape even faster than its unmodified counterpart, providing a rationale for the decreased tendency of the oxidised version to aggregate.

“From a chemical perspective, this modification is a minor change. But the effect on the states and transitions between them is drastic,” said Löhr.

“By making disordered proteins even more disordered, we can prevent them from self-associating in aberrant manners,” said Vendruscolo.

The approach provides a powerful tool to investigate a class of proteins with fast and disordered motions, which have remained elusive so far despite their importance in biology and medicine.

 

Reference:
Thomas Löhr et al. ‘A kinetic ensemble of the Alzheimer’s Aβ peptide.’ Nature Computational Science (2021). DOI: 10.1038/s43588-020-00003-w



Mathematics Explains How Giant ‘Whirlpools’ Form In Developing Egg Cells

source: www.cam.ac.uk

 

The swirling currents occur when the rodlike structures that extend inward from the cells’ membranes bend in tandem, like stalks of wheat caught in a strong breeze, according to a study from the University of Cambridge and the Flatiron Institute.

 

The mechanism of the swirling instability is disarmingly simple, and the agreement between our calculations and experimental observations supports the idea that this is indeed the process at work in fruit fly egg cells

Raymond Goldstein

Egg cells are among the largest cells in the animal kingdom. Unpropelled, a protein could take hours or even days to drift from one side of a forming egg cell to the other. Luckily, nature has developed a faster way: scientists have spotted cell-spanning whirlpools in the immature egg cells of animals such as mice, zebrafish and fruit flies. These vortices make cross-cell commutes take just a fraction of the time. But scientists didn’t know how these crucial flows formed.

Using mathematical modeling, researchers say they now have an answer. The gyres result from the collective behavior of rodlike molecular tubes called microtubules that extend inward from the cells’ membranes. Their results are reported in the journal Physical Review Letters.

“While much is not understood about the biological function of these flows, they distribute nutrients and other factors that organise the body plan and guide development,” said study co-lead author David Stein, a research scientist at the Flatiron Institute’s Center for Computational Biology (CCB) in New York City. And given how widely they have been observed, “they are probably even in humans.”

Scientists have studied cellular flows since the late 18th century, when Italian physicist Bonaventura Corti peered inside cells using his microscope. What he found were fluids in constant motion. However, scientists didn’t understand the mechanisms driving these flows until the 20th century.

The culprits, they found, are molecular motors that walk along the microtubules. Those motors haul large biological payloads such as lipids. Carrying the cargo through a cell’s relatively thick fluids is like dragging a beach ball through honey. As the payloads move through the fluid, the fluid moves too, creating a small current.

Sometimes those currents aren’t so small. In certain developmental stages of a common fruit fly’s egg cell, scientists spotted whirlpool-like currents that spanned the entire cell. In these cells, microtubules extend inward from the cell’s membrane like stalks of wheat. Molecular motors climbing these microtubules push downward on the microtubule as they ascend. That downward force bends the microtubule, redirecting the resulting flows.

Previous studies looked at this bending mechanism, but only for isolated microtubules. Those studies predicted that the microtubules would wave around in circles, but their behavior didn’t match the observations.

“The mechanism of the swirling instability is disarmingly simple, and the agreement between our calculations and the experimental observations by various groups lends support to the idea that this is indeed the process at work in fruit fly egg cells,” said Professor Raymond Goldstein from Cambridge’s Department of Applied Mathematics and Theoretical Physics. “Further experimental tests should be able to probe details of the transition between disordered and ordered flows, where there is still much to be understood.”

In the new study, the researchers added a key factor to their model: the influence of neighboring microtubules. That addition showed that the fluid flows generated by the payload-ferrying motors bend nearby microtubules in the same direction. With enough motors and a dense enough packing of microtubules, the authors found that all the microtubules eventually lean together like wheat stalks caught in a strong breeze. This collective alignment orients all the flows in the same direction, creating the cell-wide vortex seen in real fruit fly cells.
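
The logic of this collective instability can be caricatured with a deliberately stripped-down toy model – purely illustrative, and not the hydrodynamic model used in the study; the parameter names and values are invented. Each rod relaxes back towards upright but is also tilted towards the crowd’s mean orientation by its neighbours’ flows; once the coupling exceeds the restoring force, a shared lean grows:

```python
import numpy as np

rng = np.random.default_rng(0)
n, steps, dt = 200, 1000, 0.01
k_restore = 1.0   # elastic restoring torque pulling each rod back upright
g_couple = 1.5    # invented strength of the flow-mediated coupling to neighbours
theta = rng.normal(0.0, 0.1, n)   # small random initial tilt of each microtubule

for _ in range(steps):
    mean_tilt = theta.mean()      # mean flow direction set by the crowd
    theta += dt * (-k_restore * theta + g_couple * mean_tilt)

# With g_couple > k_restore the shared tilt grows while individual scatter decays:
# the rods end up leaning together, like wheat stalks in a breeze.
print(f"mean tilt {theta.mean():.3f}, scatter {theta.std():.5f}")
```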

While grounded in reality, the new model is stripped down to the bare essentials to make clearer the conditions responsible for the swirling flows. The researchers are now working on versions that more realistically capture the physics behind the flows to understand better the role the currents play in biological processes.

Stein serves as the co-lead author of the new study along with Gabriele De Canio, a researcher at the University of Cambridge. They co-authored the study with CCB director and New York University professor Michael Shelley and University of Cambridge professors Eric Lauga and Raymond Goldstein.

This work was supported by the US National Science Foundation, the Wellcome Trust, the European Research Council, the Engineering and Physical Sciences Research Council, and the Schlumberger Chair Fund.

 

Reference:
D.B. Stein, G. De Canio, E. Lauga, M.J. Shelley, and R.E. Goldstein, “Swirling Instability of the Microtubule Cytoskeleton”, Physical Review Letters (2021). DOI: 10.1103/PhysRevLett.126.028103



High Insulin Levels During Childhood a Risk For Mental Health Problems Later In Life, Study Suggests

Children sitting on park bench
source: www.cam.ac.uk

 

Researchers have shown that the link between physical and mental illness is closer than previously thought. Certain changes in physical health, which are detectable in childhood, are linked with the development of mental illness in adulthood.

 

For some individuals, physical health problems detectable from childhood might be risk factors for adult psychosis and depression

Benjamin Perry

The researchers, led by the University of Cambridge, used a sample of over 10,000 people to study how insulin levels and body mass index (BMI) in childhood may be linked with depression and psychosis in young adulthood.

They found that persistently high insulin levels from mid-childhood were linked with a higher chance of developing psychosis in adulthood. In addition, they found that an increase in BMI around the onset of puberty was linked with a higher chance of developing depression in adulthood, particularly in girls. The results were consistent after adjusting for a range of other possible factors.

The findings, reported in the journal JAMA Psychiatry, suggest that early signs of developing physical health problems could be present long before the development of symptoms of psychosis or depression and show that the link between physical and mental illness is more complex than previously thought.

However, the researchers caution that these risk factors are among many, both genetic and environmental, and that their results do not suggest that one could predict the likelihood of developing adult mental disorders from these physical health measures alone.

The researchers recommend that healthcare professionals carry out robust physical assessments of young people presenting with symptoms of psychosis or depression, so that early signs of physical illness can be diagnosed and treated early. It is well established that people with depression and psychosis can have a life expectancy up to 20 years shorter than the general population, mostly because physical health problems such as diabetes and obesity are more common in adults with those mental disorders.

While psychosis and depression in adulthood are already known to be associated with significantly higher rates of diabetes and obesity than the general population, these links are often attributed to the symptoms of the mental disorder itself.

“The general assumption in the past has been that some people with psychosis and depression might be more likely to have a poor diet and lower levels of physical exercise, so any adverse physical health problems are a result of the mental disorder, or the treatment for it,” said first author Dr Benjamin Perry from Cambridge’s Department of Psychiatry. “In essence, the received wisdom is that the mental disorder comes first. But we’ve found that this isn’t necessarily the case, and for some individuals, it may be the other way around, suggesting that physical health problems detectable from childhood might be risk factors for adult psychosis and depression.”

Using data from the Avon Longitudinal Study of Parents and Children (ALSPAC), a long-term population-representative birth cohort study set in the west of England, Perry and his colleagues found that disruption to insulin levels can be detected in childhood, long before the onset of psychosis, suggesting that some people with psychosis may have an inherent susceptibility to developing diabetes.

They used a statistical method to group individuals based on similar trajectories of change in insulin levels and BMI from age one to 24, and examined how the different groups related to risks of depression and psychosis in adulthood. About 75% of study participants had normal insulin levels, between 15% and 18% had insulin levels which increased gradually over adolescence, and around 3% had relatively high insulin levels. This third group had a higher chance of developing psychosis in adulthood compared with the average group.
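
The grouping was done with a dedicated latent-class trajectory method; as a rough, hedged stand-in for readers who want to see the shape of the analysis, clustering standardised trajectories gives the flavour. The cohort below is synthetic and all numbers are invented to loosely echo the reported split:

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
ages = np.array([1.0, 5, 9, 13, 17, 21, 24])   # measurement ages in years

# Invented synthetic cohort: mostly flat profiles, a gradually rising group,
# and a small persistently high group.
flat = 60 + rng.normal(0, 5, (750, ages.size))
rising = 60 + 2.0 * (ages - ages[0]) + rng.normal(0, 5, (180, ages.size))
high = 95 + rng.normal(0, 5, (30, ages.size))
insulin = np.vstack([flat, rising, high])

z = (insulin - insulin.mean(axis=0)) / insulin.std(axis=0)  # standardise per age
groups = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(z)
print(np.bincount(groups))   # sizes of the three trajectory groups
```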

The researchers did not find that the group with persistently high BMI through childhood and adolescence had a significantly increased risk of depression in adulthood. Instead, they suggest that whatever factors cause BMI to increase around the age of puberty may themselves be important risk factors for depression in adulthood. The study could not determine what those factors might be; identifying them will require future research, and they could prove important targets for reducing the risk of depression in adulthood.

“These findings are an important reminder that all young people presenting with mental health problems should be offered a full and comprehensive assessment of their physical health in tandem with their mental health,” said Perry. “Intervening early is the best way to reduce the mortality gap sadly faced by people with mental disorders like depression and psychosis.

“The next step will be to work out exactly why persistently high insulin levels from childhood increase the risk of psychosis in adulthood, and why increases in BMI around the age of puberty increase the risk of depression in adulthood. Doing so could pave the way for better preventative measures and the potential for new treatment targets.”

 

Reference:
Benjamin I. Perry et al: ‘Longitudinal Trends in Insulin Levels and BMI From Childhood and Their Associations with Risks of Psychosis and Depression in Young Adults.’ JAMA Psychiatry (2021). DOI: 10.1001/jamapsychiatry.2020.4180



Quantum Projects Launched To Solve Universe’s Mysteries

New Simulation Sheds Light on Spiraling Supermassive Black Holes
source: www.cam.ac.uk

 

Researchers will use cutting-edge quantum technologies to transform our understanding of the universe and answer key questions such as the nature of dark matter and black holes.

 

UK Research and Innovation (UKRI) is supporting seven projects with a £31 million investment to demonstrate how quantum technologies could solve some of the greatest mysteries in fundamental physics. Researchers from the University of Cambridge have been awarded funding on four of the seven projects.

Just as quantum computing promises to revolutionise traditional computing, technologies such as quantum sensors have the potential to radically change our approach to understanding our universe.

The projects are supported through the Quantum Technologies for Fundamental Physics programme, delivered by the Science and Technology Facilities Council (STFC) and the Engineering and Physical Sciences Research Council (EPSRC) as part of UKRI’s Strategic Priorities Fund. The programme is part of the National Quantum Technologies Programme.

AION: A UK Atom Interferometer Observatory and Network has been awarded £7.2 million in funding and will be led by Imperial College London. The project will develop and use technology based on quantum interference between atoms to detect ultra-light dark matter and sources of gravitational waves, such as collisions between massive black holes far away in the universe and violent processes in the very early universe. The team will design a 10m atom interferometer, preparing the construction of the instrument in Oxford and paving the way for larger-scale future experiments to be located in the UK. Members of the AION consortium will also contribute to MAGIS, a partner experiment in the US.

The Cambridge team on AION is led by Professor Valerie Gibson and Dr Ulrich Schneider from the Cavendish Laboratory, alongside researchers from the Kavli Institute for Cosmology, the Institute of Astronomy and the Department of Applied Mathematics and Theoretical Physics. Dr Tiffany Harte will co-lead the development of the cold atom transport and final cooling sequences for AION, and Dr Jeremy Mitchell will co-lead the data readout and network capabilities for AION and MAGIS, and undertake data analysis and theoretical interpretation.

“This announcement from STFC to fund the AION project, alongside some seed funding from the Kavli Foundation, will allow us to target key open questions in fundamental physics and bring new interdisciplinary research to the University for the foreseeable future,” said Gibson.

“Every physical effect, known or unknown, leaves its fingerprint on the phase evolution of a coherent quantum system such as cold atoms; it only requires sufficiently sensitive detectors,” said Schneider. “We are excited to contribute our cold-atom technology to this interdisciplinary endeavour and to develop atom interferometry into a powerful detector for fundamental physics.”

The Quantum Sensors for the Hidden Sector (QSHS) project, led by the University of Sheffield, has been awarded £4.8 million in funding. The project aims to contribute to the search for axions, low-mass ‘hidden’ particles that are candidates to solve the mystery of dark matter. The researchers will develop new quantum measurement technology for inclusion in the US ADMX experiment, which can then be used to search for axions in parts of our galaxy’s dark matter halo that have never been explored before.

“The team will develop new electronic technology to a high level of sophistication and deploy it to search for the lowest-mass particles detected to date,” said Professor Stafford Withington from the Cavendish Laboratory, Co-Investigator and Senior Project Scientist on QSHS. “These particles are predicted to exist theoretically, but have not yet been discovered experimentally. Our ability to probe the particulate nature of the physical world with sensitivities that push at the limits imposed by quantum uncertainty will open up a new frontier in physics.

“This new window will allow physicists to explore the nature of physical reality at the most fundamental level, and it is extremely exciting that the UK will be playing a major international role in this new generation of science.”

Professor Withington is also involved in the Determination of Absolute Neutrino Mass using Quantum Technologies project, which will be led by UCL. The project aims to harness recent breakthroughs in quantum technologies to solve one of the most important outstanding challenges in particle physics – determining the absolute mass of neutrinos. Among the universe’s most abundant particles, neutrinos are a by-product of nuclear fusion within stars, making them key to our understanding of the processes within stars and the makeup of the universe. Moreover, knowing the value of the neutrino mass is critical to our understanding of the origin of matter and the evolution of the universe. Neutrinos remain poorly understood, however, and the researchers aim to develop pioneering new spectroscopy technology capable of precisely measuring the mass of this elusive but important particle.

Professor Zoran Hadzibabic has received funding as part of the Quantum Simulators for Fundamental Physics project, led by the University of Nottingham. The project aims to develop quantum simulators capable of providing insights into the physics of the very early universe and black holes. The goals include simulating aspects of quantum black holes and testing theories of the quantum vacuum that underpin ideas on the origin of the universe.



Family Court Decisions Distorted By Misuse of Key Research, Say Experts

Mother and child at sunset

 

Family courts are misunderstanding and misusing research around how children form close relationships with their caregivers, say an international group of experts.

 

The decisions reached by family courts can have a major impact on a child’s life, but as we’ve seen, these decisions may be based on incorrect understanding and assumptions

Robbie Duschinsky

Seventy experts from across the globe argue that widespread misunderstandings around attachment research have hampered its accurate implementation, with potentially negative consequences for decisions in family courts.

In response, they have published an international consensus statement in Attachment & Human Development that aims “to counter misinformation and help steer family court applications of attachment theory in a supportive, evidence-based direction on matters related to child protection and custody decisions”.

In the statement, the group sets out three principles from attachment research which they say should guide decision-making: the child’s need for familiar, non-abusive caregivers; the value of continuity of good-enough care; and the benefits of networks of familiar relationships.

Attachment research investigates the strong affectional bonds – ‘attachments’ – that individuals form to others in order to achieve comfort and protection. Children are born with a predisposition to develop these bonds with ‘attachment figures’ in their lives. This often includes the child’s parents, but many children develop attachment relationships with additional caregivers, such as grandparents. Children wish to turn to their attachment figures when upset.

The quality of an attachment relationship – how readily a child will turn to their caregiver and accept comfort – is indicated by behaviour suggestive of whether or not they expect their attachment figures to respond sensitively to their signals in times of need. Indeed, the most important predictor of children’s attachment quality is caregiver ‘sensitivity’: the ability to perceive, interpret and respond in a timely manner and appropriately to children’s signals.

Attachment research is applied in many settings, including in family court decision-making regarding child custody and child protection. Court practice needs to follow the best interests of the child, but this can be difficult to determine. There is an increasing focus on the interactions and relationships between children and their caregivers, which in turn has led to interest in using attachment theory and measures to help guide decision-making.

Dr Robbie Duschinsky from the University of Cambridge said: “The decisions reached by family courts can have a major impact on a child’s life, but as we’ve seen, these decisions may be based on incorrect understanding and assumptions. By outlining potential issues and presenting principles to guide the decision-making process, we hope to better inform and hence empower courts to act in a child’s best interests.”

One example is the mistaken assumption that attachment quality equals relationship quality, and that it is possible to judge attachment quality by looking at isolated behaviours. In fact, there are many other important aspects of child-caregiver relationships, such as play, supervision and teaching, and specific behaviours such as crying can depend on largely constitutional factors such as temperament.

There are also misunderstandings regarding the importance of developing attachment to one particular caregiver rather than to more than one, with the theory misinterpreted as placing an emphasis on one ‘psychological parent’, typically the mother. In this line of reasoning, it is often assumed that an attachment relationship with one person is at the expense of other attachment relationships, and that best-interest decisions should maximise the likelihood of secure attachment with one primary caregiver. However, children can develop and maintain secure attachment relationships to multiple caregivers simultaneously, and a network of attachment relationships may well constitute a protective factor in child development.

In other cases, attachment theory has been held to categorically prescribe joint physical custody, with equal time allocation regardless of child age, including overnights and transitions between family homes every day or every other day. Yet, there is a notable scarcity of empirical research on attachment in relation to child custody, time allocation, and overnight arrangements.

Dr Tommie Forslund from Stockholm University said: “Misunderstandings can have important consequences for children and their caregivers. In some cases, they can lead to an ill-informed dismissal of the relevance of attachment by court professionals or, conversely, to the overuse of attachment ideas and measures, with practice unmoored from evidence.

“We need to make sure that courts are aware of the limits of current understanding as well as the nuances of attachment theory and research before seeking to apply it to their decision-making.”

The researchers have also advised caution in using assessments of attachment quality in the family courts.

Professor Pehr Granqvist from Stockholm University added: “Courts need to bear in mind that while assessments of attachment quality may be suitable for helping target supportive interventions, there are different opinions even among those of us who specialise in attachment research regarding the potential usefulness of these assessments when it comes to decision-making regarding child protection.

“Validated in group-level research, attachment measures have insufficient precision for individual level prediction. If used at all, assessments of attachment quality should never be used in isolation but only as part of a larger assessment battery that assigns more weight to direct assessments of caregiving behaviour. Importantly, attachment assessments must only be used by formally trained observers who follow standardised protocols.”

The experts propose three fundamental principles, based on more than half a century of research, which they argue can be used as a basis for court practitioners:

  • The need for familiar, non-abusive caregivers – For child protection practice, for example, this implies that all non-abusive and non-neglecting family-based care is likely to be better than institutional care.
  • The value of continuity of good-enough care – ‘Good-enough’ care signifies an adequate level of meeting the child’s needs over time. The group urges family courts to examine and support caregivers’ abilities to provide ‘good-enough’ caregiving, rather than placing children in out-of-home custody with the hope of ‘optimal’ care. Major separations from caregivers constitute risk factors in child development that should be prevented whenever possible.
  • The benefits of networks of attachment relationships – Decision-making concerning child custody should assign weight to supporting children’s ability to develop and maintain attachment relationships with both their caregivers, except when there is threat to the child’s welfare and safety or one of the parents wants to ‘opt out’.

Reference
Attachment Goes to Court: Child Protection and Custody Issues. Attachment & Human Development; 11 Jan 2021; DOI: 10.1080/14616734.2020.1840762



One in Three Adults Drank More Alcohol During First Lockdown

Wine glass and bottle
source: www.cam.ac.uk

 

COVID-19 and lockdown measures drove some individuals more than others to use alcohol to cope with stress, a new study has revealed. While overall alcohol consumption appeared to fall, a study published in BMJ Open found that more than one in three adults (36%) increased their consumption during the first lockdown.

 

As COVID-19 remains part of daily life, many of us are turning to alcohol to cope with stress. For many people, drinking in moderation can help with stress relief, but for others it can be more problematic

Valerie Voon

In March 2020, the World Health Organization declared COVID-19 a pandemic and many countries put in place drastic safety measures to control the spread of the virus, including extended lockdown periods.

In the UK, the first nationwide lockdown started on 23 March 2020 and lasted until 1 June, when restrictions began to be eased. Since then more localised lockdowns have been implemented where necessary.

A team of researchers at the University of Cambridge has explored whether the stress of the pandemic and lockdown measures affected people’s alcohol consumption. Between 14 and 28 May 2020, 1,346 people around the world completed an online survey about their drinking habits before and during lockdown. The researchers used their responses to compare the amount of alcohol consumed during lockdown against that in November 2019, as well as their drinking severity (occurrences of problem drinking such as drinking to the point of memory loss or neglecting personal responsibilities due to drinking). They also assessed mental health factors such as depression and anxiety.

The survey revealed that while the units of alcohol consumed per week decreased during lockdown – down from a mean of 8.32 units in November to 8.03 during lockdown – a substantial percentage of individuals (36%) increased their drinking during lockdown. In the UK, the units of alcohol consumed per week increased from 10.94 to 11.25 units.
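
For readers who want to see the underlying comparison, here is a minimal sketch of how the per-respondent change and the share who drank more might be computed from such survey data. The column names and values are hypothetical, not the study’s data:

```python
import pandas as pd

# Invented survey extract: weekly alcohol units reported for November 2019
# and for the lockdown period.
df = pd.DataFrame({
    "units_nov2019": [4.0, 12.0, 8.0, 0.0, 20.0, 6.0],
    "units_lockdown": [6.0, 10.0, 8.0, 2.0, 18.0, 9.0],
})

df["change"] = df["units_lockdown"] - df["units_nov2019"]
print(df[["units_nov2019", "units_lockdown"]].mean())  # cohort-level means
print((df["change"] > 0).mean())   # proportion who drank more during lockdown
```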

Samantha N Sallie, the study’s first author and a PhD student at the Department of Psychiatry, said: “While in countries such as Canada and the USA people drank less during lockdown, in the UK there was a small increase in alcohol consumption.”

Older individuals tended to increase their alcohol consumption more than younger people during lockdown, from 10 to 11 units weekly. Age may play a distinctive role in the context of COVID-19: older people faced more stringent isolation requirements, with potentially fewer support mechanisms and hence a risk of greater isolation and loneliness, as well as concern about the impact of COVID-19 on their personal health.

Respondents with children reported a greater increase in alcohol consumption during lockdown, of between 0.54 and 2.02 units, though their depression and anxiety scores were lower than for those without children. The researchers say this suggests the additional burden of childcare and home schooling contributed to the tendency towards drinking, possibly in the context of stress relief, but the presence of children may also be protective against depression and anxiety.

“For parents having to take on extra childcare responsibilities during lockdown, possibly at the same time as having to manage changes to their work routine, it’s possible that the extra stress increased their tendency to drink,” said Sallie. “On the other hand, having children may mitigate against loneliness that has been highlighted as a major issue during the isolation of lockdown.”

The team found that essential workers – specifically healthcare workers responsible for taking care of individuals with COVID-19 – showed an increase in drinking amount of between 0.45 and 1.26 units, while those whose loved ones became severely ill or died from COVID-19 showed an increase in problem drinking during the lockdown.

“This demonstrates how the virus itself has affected alcohol consumption in those who have had close contact with the very real and devastating effects of COVID-19,” added Sallie.

Although men consumed more alcohol than women, they showed a decrease in both drinking amount and severity during lockdown, while women demonstrated the opposite trend, with women consuming an extra unit of alcohol a week during lockdown. This finding corroborates evidence that indicates women are more likely than men to consume alcohol in order to cope with stress.

Individuals who reported a change in their employment status or were isolating alone were more likely to have higher depression scores, but showed no change in their drinking behaviour. Those individuals isolating with others but reporting a poor relationship were more likely to have higher depression and anxiety scores.

Dr Valerie Voon, senior author of the study from the University of Cambridge, said: “As COVID-19 remains part of daily life, many of us are turning to alcohol to cope with stress. For many people, drinking in moderation can help with stress relief, but for others it can be more problematic.

“Alcohol misuse is a major public health issue in the United Kingdom, costing £21–52 billion, with NHS costs of £3.5 billion per year. Our findings highlight a need to identify those individuals who are at risk of problem drinking so we can offer them greater support during the ongoing pandemic.”

The researchers say there may be a number of reasons for the overall decrease in alcohol use and problematic use, including stringent lockdown measures leading to a decrease in the availability of alcoholic drinks within the immediate household and because people tend to consume alcohol in social situations, such as at the pub or when eating out.

Reference
Sallie, SN et al. Assessing International Alcohol Consumption Patterns During Isolation from the COVID-19 Pandemic Using an Online Survey: Highlighting Negative Emotionality Mechanisms. BMJ Open; 26 Nov 2020; DOI: 10.1136/bmjopen-2020-044276



Common Drug for Build-up of Blood Following Head Injury Worse Than Placebo, Study Finds

Closeup of brain MRI scan result
source: www.cam.ac.uk

 

A commonly-used treatment for chronic subdural haematoma – the build-up of ‘old’ blood in the space between the brain and the skull, usually as a result of minor head injury – could lead to a worse outcome than receiving no medication, suggests new research from the University of Cambridge.

 

Our trial sought to determine if dexamethasone should be offered routinely to all patients with chronic subdural haematoma or if its use should be abandoned. Based on our findings, we believe that dexamethasone should not be used in patients with chronic subdural haematoma anymore

Peter Hutchinson

Chronic subdural haematoma is one of the most common neurological disorders and mainly affects older people. Those affected often have headaches, deteriorating memory, confusion, balance problems or limb weakness. Surgery to drain the collection of fluid is effective, with the majority of patients improving.

The steroid dexamethasone has been used alongside surgery, or instead of it, since the 1970s. However, consensus on its use has been lacking, especially since no high-quality studies confirming its effectiveness had been conducted until now.

With funding from the UK National Institute for Health Research, a group of doctors and researchers from 23 neurosurgical units in the United Kingdom enrolled 748 patients with chronic subdural haematoma in the “Dexamethasone in Chronic Subdural Haematoma (Dex-CSDH)” randomised trial. A total of 375 patients were randomised to receive a two-week tapering course of dexamethasone and were compared with 373 patients randomised to an identical matching placebo.

The results of the study, published today in the New England Journal of Medicine, show that patients who received dexamethasone had a lower chance of favourable recovery at six months than patients who received placebo. In both groups, the vast majority of patients had an operation to drain the haematoma and had experienced significant functional improvement at six months compared with their initial admission to hospital.

Fewer patients in the dexamethasone group required repeat surgery for a recurrent haematoma than in the placebo group. However, only 84% of patients who received dexamethasone had recovered well at six months, compared with 90% of patients who received placebo.
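
As a hedged illustration of the kind of two-proportion comparison behind these figures, the counts below are approximated from the rounded percentages and group sizes reported above; they are not the paper’s exact analysis or numbers:

```python
from scipy.stats import chi2_contingency

# Favourable / unfavourable counts approximated from 84% of 375 (dexamethasone)
# versus 90% of 373 (placebo).
dexamethasone = [315, 60]
placebo = [336, 37]

chi2, p, dof, expected = chi2_contingency([dexamethasone, placebo])
risk_diff = dexamethasone[0] / 375 - placebo[0] / 373   # about -6 percentage points
print(f"risk difference = {risk_diff:.3f}, p = {p:.3f}")
```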

Peter Hutchinson, Professor of Neurosurgery at the University of Cambridge and the trial’s Chief Investigator, said: “Chronic subdural haematoma has been steadily increasing in frequency over the past decades. Patients affected are often frail and have other co-existing medical conditions. Since the 1970s, dexamethasone has been used as a drug alongside or instead of surgery with a few studies reporting good results.

“Our trial sought to determine if dexamethasone should be offered routinely to all patients with chronic subdural haematoma or if its use should be abandoned. Based on our findings, we believe that dexamethasone should not be used in patients with chronic subdural haematoma anymore.”

Angelos Kolias, Lecturer in Neurosurgery at the University of Cambridge and the trial’s Co-chief Investigator, added: “The results of the study were surprising given that dexamethasone seemed to help reduce the number of repeat surgeries. However, this simply reinforces the importance of conducting high-quality trials with patient-reported outcomes as the main outcomes of interest.”

Ellie Edlmann, the trial’s research fellow, currently a Clinical Lecturer at the University of Plymouth, concluded: “Credit is due to all doctors and researchers from across the NHS who worked tirelessly in order to enrol all eligible patients in the trial; in particular, the role of trainee neurosurgeons, members of the British Neurosurgical Trainee Research Collaborative, needs to be highlighted. We sincerely thank all patients and their carers, as without their altruistic participation, this trial would not have been possible.”

The trial was funded by the National Institute for Health Research (NIHR), with further support from the NIHR Cambridge Biomedical Research Centre, the NIHR Brain Injury MedTech Co-operative, the Royal College of Surgeons of England, and the Rosetrees Trust.

Reference
Hutchinson, PJ et al. Trial of Dexamethasone for Chronic Subdural Hematoma. NEJM; 16 Dec 2020; DOI: 10.1056/NEJMoa2020473



Cambridge Launches Regulatory Genome Project

Red bars on black background
source: www.cam.ac.uk

 

The project will use machine learning to sequence the world’s regulatory text and create an open-source repository of machine-readable regulatory information.

 

An open access repository of regulatory information will serve to level the regulatory playing field for those who develop and comply with regulation, particularly in emerging markets

Robert Wardrop

The University of Cambridge has launched the Regulatory Genome Project, a transformational initiative to sequence the world’s vast amount of regulatory text to create a comprehensive open repository of machine-readable regulatory information for use by regulatory agencies and businesses around the world.

This multi-year project – which includes an expanding collaboration network of regulatory agencies, companies, and academic researchers – has been inspired by the scientific and commercial innovation that followed the collective effort to code the human genome.

The project aims to provide an open-source infrastructure for all countries, particularly in developing regions, to have the same digital capabilities as advanced economies to identify regulatory obligations around the world. This is particularly important as consumers have increasingly adopted digital financial services, which has opened up new areas of risk that require regulation.

Drawing on research from the Cambridge Centre for Alternative Finance (CCAF) at Cambridge Judge Business School and the Department of Computer Science and Technology, the project uses machine learning and natural language processing to ‘sequence’ huge amounts of regulatory text, starting with financial regulation. This offers regulators and firms the opportunity to acquire unprecedented capabilities in the development, processing and analysis of regulation.
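
The project’s pipeline is not spelled out here; as a minimal, hedged illustration of the kind of NLP step involved, one might classify passages of regulatory text against taxonomy labels like this. The passages, labels and model choice are invented for illustration:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Invented training passages labelled with invented taxonomy categories.
passages = [
    "A firm must verify the identity of each customer before onboarding.",
    "Payment institutions shall safeguard funds received from users.",
    "Licence applicants must submit audited financial statements.",
    "Providers shall report suspicious transactions to the authority.",
]
labels = ["kyc", "safeguarding", "licensing", "reporting"]

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                    LogisticRegression(max_iter=1000))
clf.fit(passages, labels)
print(clf.predict(["Applicants shall provide audited accounts with the filing."]))
```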

“The Regulatory Genome Project combines the University’s research with its convening influence to deliver societal and economic impact with the potential to be global, transformational and enduring,” said Professor Eilís Ferran, Pro-Vice-Chancellor for Institutional and International Relations at the University of Cambridge.

As part of the project’s structure, a new University-affiliated company called Regulatory Genome Development Ltd has been formed to provide support to third parties building applications using the code and data in the Regulatory Genome.

The Regulatory Genome originated out of a research project led by the CCAF with initial funding provided by the Omidyar Network to create a solution – named RegSimple – that simplifies the comparison of regulations across different jurisdictions. Additional funding has been provided by the UK’s Foreign, Commonwealth & Development Office to expand the scope and functionality of RegSimple to serve the needs of regulators in developing and emerging economies. Regulators from more than 20 jurisdictions are already contributing to the project.

“We are excited about the vast potential of this project to benefit both public and private sector interests,” said Dr Robert Wardrop, Director of the Cambridge Centre for Alternative Finance. “An open access repository of regulatory information will serve to level the regulatory playing field for those who develop and comply with regulation, particularly in emerging markets, and serve as a key resource for researchers in deepening their understanding of how the regulatory landscape for digital financial services is evolving.”

Originally published on the Cambridge Judge Business School website.



Driving Force Behind Cellular ‘Protein Factories’ Could Have Implications For Neurodegenerative Disease

Inducing lysosome motion with light leads to a rapid and significant extension of ER network.
source: www.cam.ac.uk

 

Researchers have identified the driving force behind a cellular process linked to neurodegenerative disorders such as Parkinson’s and motor neurone disease.

 

There is still so much to learn about this system, which is incredibly important to fundamental biomedical science

Clemens Kaminski

In a study published today in Science Advances, researchers from the University of Cambridge show that tiny components within the cell are the biological engines behind effective protein production.

The endoplasmic reticulum (ER) is the cell’s protein factory, producing and modifying the proteins needed to ensure healthy cell function. It is the cell’s biggest organelle and exists in a web-like structure of tubes and sheets. The ER moves rapidly and constantly changes shape, extending across the cell to wherever it is needed at any given moment.

Using super-resolution microscopy techniques, researchers from Cambridge’s Department of Chemical Engineering and Biotechnology (CEB) have discovered the driving force behind these movements – a breakthrough that could have significant impact on the study of neurodegenerative diseases.

“It has been known that the endoplasmic reticulum has a very dynamic structure – constantly stretching and extending its shape inside the cell,” said Dr Meng Lu, research associate in the Laser Analytics Group, led by Professor Clemens Kaminski.

“The ER needs to be able to reach all places efficiently and quickly to perform essential housekeeping functions within the cell, whenever and wherever the need arises. Impairment of this capability is linked to diseases including Parkinson’s, Alzheimer’s, Huntington’s and ALS. So far there has been limited understanding of how the ER achieves these rapid and fascinating changes in shape and how it responds to cellular stimuli.”

Lu and colleagues discovered that another cell component holds the key: small membrane-bound structures that look like tiny droplets, called lysosomes.

Lysosomes can be thought of as the cell’s recycling centres: they capture damaged proteins, breaking them down into their original building blocks so that they can be reused in the production of new proteins. Lysosomes also act as sensing centres – picking up on environmental cues and communicating these to other parts of the cell, which adapt accordingly.

There can be 1,000 or so lysosomes zipping around the cell at any one time and, with them, the ER appears to change its shape and location in an apparently orchestrated fashion.

What surprised the Cambridge scientists was their discovery of a causal link between the movement of the tiny lysosomes within the cell and the reshaping process of the large ER network.

“We could show that it is the movement of the lysosomes themselves that forces the ER to reshape in response to cellular stimuli,” said Lu. “When the cell senses that there is a need for lysosomes and ER to travel to distal corners of the cell, the lysosomes pull the ER web along with them, like tiny locomotives.”

From a biological point of view, this makes sense: the lysosomes act as a sensor inside the cell, and the ER as a response unit; co-ordinating their synchronous function is critical to cellular health.

To discover this surprising bond between two very different organelles, Kaminski’s research team made use of new imaging technologies and machine learning algorithms, which gave them unprecedented insights into the inner workings of the cell.

“It is fascinating that we are now able to look inside living cells and see the marvellous speed and dynamics of the cellular machinery at such detail and in real time,” said Kaminski. “Only a few years ago, watching organelles going about their business inside the cell would have been unthinkable.”

The researchers used illumination patterns projected onto living cells at high speed, and advanced computer algorithms to recover information on a scale more than one hundred times smaller than the width of a human hair. To capture such information at video rates has only recently become possible.

The researchers also used machine learning algorithms to extract the structure and movement of the ER networks and lysosomes in an automated fashion from thousands of datasets.
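As an illustration of the kind of automated extraction involved, here is a minimal sketch in Python of two basic steps: reducing a segmented ER fluorescence image to a one-pixel-wide skeleton, and linking lysosome positions between consecutive frames by nearest-neighbour matching. This is not the authors' pipeline; the inputs, function names and the `max_step` threshold are illustrative assumptions.

```python
# A minimal sketch (not the authors' pipeline) of extracting ER network
# structure and lysosome movement from fluorescence frames.
import numpy as np
from scipy.spatial import cKDTree
from skimage.filters import threshold_otsu
from skimage.morphology import skeletonize

def er_skeleton(er_frame: np.ndarray) -> np.ndarray:
    """Binarise an ER fluorescence frame and thin it to a 1-pixel skeleton."""
    mask = er_frame > threshold_otsu(er_frame)
    return skeletonize(mask)

def link_lysosomes(prev_pts: np.ndarray, curr_pts: np.ndarray, max_step: float = 5.0):
    """Greedily link lysosome centroids between frames by nearest neighbour,
    rejecting jumps larger than max_step pixels."""
    dist, idx = cKDTree(curr_pts).query(prev_pts)
    return [(i, int(j)) for i, (d, j) in enumerate(zip(dist, idx)) if d <= max_step]

# Toy demonstration on synthetic data
rng = np.random.default_rng(0)
print(er_skeleton(rng.random((64, 64))).sum(), "skeleton pixels")
links = link_lysosomes(rng.random((10, 2)) * 64, rng.random((12, 2)) * 64, max_step=20)
print(len(links), "lysosomes tracked between frames")
```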

The team extended their research to look at neurons or nerve cells – specialised cells with long protrusions called axons along which signals are transmitted. Axons are extremely thin tubular structures and it was not known how the movement of the very large ER network is orchestrated inside these structures.

The study shows how lysosomes travel easily along the axons and drag the ER along behind them. The researchers also show how impairing this process is detrimental to the development of growing neurons.

Frequently, the researchers saw events where the lysosomes acted as repair engines for disconnected or broken pieces of ER structure, merging and fusing them into an intact network again. The work is therefore relevant for an understanding of disorders of the nervous system and its repair.

The team also studied the biological significance of this coupled movement, providing a stimulus – in this case nutrients – for the lysosomes to sense. The lysosomes were seen to move towards this signal, dragging the ER network behind so that the cell can elicit a suitable response.

“So far, little was known about how ER structure is regulated in response to metabolic signals,” said Lu. “Our research establishes lysosomes as sensor units that actively steer the local ER response.”

The team hopes that their insights will prove invaluable to those studying links between disease and cellular response, and their own next steps are focused on studying ER function and dysfunction in diseases such as Parkinson’s and Alzheimer’s.

Neurodegenerative disorders are associated with aggregation of damaged and misfolded proteins, so understanding the underlying mechanisms of ER function is critical to research into their treatment and prevention.

“The discoveries of the ER and lysosomes were awarded the Nobel Prize many years ago – they are key organelles essential for healthy cellular function,” said Kaminski. “It is fascinating to think that there is still so much to learn about this system, which is incredibly important to fundamental biomedical science looking to find the cause and cures of these devastating diseases.”

Reference:
Meng Lu et al. ‘The structure and global distribution of the endoplasmic reticulum network is actively regulated by lysosomes.’ Science Advances (2020). DOI: 10.1126/sciadv.abc7209



AI Could Help Cut Waiting Times For Cancer By Automating Mark-Up of Patient Scans Prior To Radiotherapy

 

Doctors at Addenbrooke’s Hospital in Cambridge aim to drastically cut cancer waiting times by using artificial intelligence (AI) to automate lengthy radiotherapy preparations.

 

As clinicians we want to start radiotherapy promptly to improve survival rates and reduce anxiety. Using machine learning tools can save time for busy clinicians and help get our patients onto treatment as quickly as possible

Raj Jena

The AI technology, known as InnerEye, is the result of an eight-year collaboration between researchers at Cambridge-based Microsoft Research, Addenbrooke’s Hospital and the University of Cambridge.

InnerEye aims to save clinicians many hours of time laboriously marking up patient scans prior to radiotherapy. The team has demonstrated how machine learning (ML) models built using the InnerEye open-source technology can cut this preparation time by up to 90% – meaning that waiting times for starting potentially life-saving radiotherapy treatment can be dramatically reduced.

Health and Social Care Secretary Matt Hancock said: “New innovations like this can make all the difference to patients and I am proud to see we are once again leading the way in new cancer treatments.

“Helping people receive treatment faster is incredibly important and will not only improve recovery rates but will save clinicians precious time so they can focus on caring for patients.

“Embracing new technologies will help save lives and is vital for the sustainability of the NHS, and our NHS Long Term Plan will continue to deliver the best possible care for patients so that we can offer faster, more personalised and effective cancer treatment for all.”

Dr Raj Jena from the Department of Oncology at the University of Cambridge and an oncologist at Addenbrooke’s, who co-leads InnerEye, said: “These results are a game-changer. To be diagnosed with a tumour of any kind is an incredibly traumatic experience for patients. So as clinicians we want to start radiotherapy promptly to improve survival rates and reduce anxiety. Using machine learning tools can save time for busy clinicians and help get our patients onto treatment as quickly as possible.”

Dr Yvonne Rimmer, also from Addenbrooke’s, said: “There is no doubt that InnerEye is saving me time. It’s very good at understanding where tumours and healthy organs are. It’s speeding up the process so I can concentrate on looking at a patient’s diagnostic images and tailoring treatment to them.

“But it’s important for patients to know that the AI is helping me do my job; it’s not replacing me in the process. I double check everything the AI does and can change it if I need to. The key thing is that most of the time, I don’t need to change anything.”

Up to half of the population in the UK will be diagnosed with cancer at some point in their lives. Of those, half will be treated with radiotherapy, often in combination with other treatments such as surgery, chemotherapy, and increasingly immunotherapy.

Radiotherapy involves focusing high-intensity radiation beams to damage the DNA of solid cancerous tumours while avoiding the surrounding healthy organs. It is a critical tool: around 40% of successfully treated patients undergo some form of radiotherapy.

Planning radiotherapy treatment can be a lengthy process. It starts with a 3D CT (computed tomography) scan of the part of the body to be targeted. These CT images come as stacks of 2D images, dozens of images deep, each of which must be examined and marked up by a radiation oncologist or specialist technician. This process is called contouring: in each image, an expert must manually draw a contour line around the tumours and the key healthy organs at risk in the target area using dedicated computer software. For complex cases, contouring the scans for a single patient’s treatment plan can take several hours.

This image segmentation task is a rate-limiting step in the radiotherapy treatment pathway, adding to the burden of time on clinicians and the financial cost to hospitals. Because the task is subjective, there can be significant variability between experts and between institutions, where acquisition protocols and patient demographics differ. This limits the use of imaging in clinical trials and can introduce variability into patient care.

Research published by the team in JAMA Network Open confirms that the InnerEye ML models can accurately and rapidly carry out this image segmentation, which would otherwise require hours of expert clinicians’ time.
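The evaluation compared model-generated contours against expert-drawn ones. Below is a minimal sketch of the Dice similarity coefficient, a standard overlap metric for this kind of segmentation comparison; the 3D masks here are synthetic placeholders, not InnerEye outputs.

```python
# Dice similarity coefficient between two binary masks: 2|A∩B| / (|A| + |B|).
# The masks below are synthetic stand-ins for a model contour and an expert one.
import numpy as np

def dice(pred: np.ndarray, truth: np.ndarray) -> float:
    pred, truth = pred.astype(bool), truth.astype(bool)
    denom = pred.sum() + truth.sum()
    return 1.0 if denom == 0 else 2.0 * np.logical_and(pred, truth).sum() / denom

truth = np.zeros((32, 64, 64), dtype=bool)
truth[10:20, 20:40, 20:40] = True          # "expert" organ contour
pred = np.zeros_like(truth)
pred[11:21, 22:42, 20:40] = True           # "model" contour, slightly offset
print(f"Dice = {dice(pred, truth):.3f}")   # 1.0 would be perfect overlap
```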

Head of Health Intelligence at Microsoft Research, Aditya Nori, said: “This is, we believe, the first time that an NHS Trust has implemented its own deep learning solution, trained on its own data, for use on its own patients. It paves the way for more NHS Trusts to take advantage of open-source AI tools to help reduce cancer treatment times.”

The InnerEye Deep Learning Toolkit has been made freely available as open-source software by Microsoft.

While ML models developed using the tool need to be tested and validated in each individual healthcare setting, doctors at Cambridge University Hospitals (CUH) have demonstrated how the technology can be applied in clinical settings.

Reference
Ozan Oktay, et al. Evaluation of Deep Learning to Augment Image-Guided Radiotherapy for Head and Neck and Prostate Cancers. JAMA Network Open; 30 Nov 2020; DOI: 10.1001/jamanetworkopen.2020.27426

Adapted from press releases from Microsoft and Cambridge University Hospitals NHS Foundation Trust. 



Aroma Diffuser and Plastic Bag Offer Inexpensive Method To Test Fit of Face Masks At Home

Medical personnel putting on PPE
source: www.cam.ac.uk

 

Researchers have developed a way to use a simple home aroma diffuser to test whether N95 and other types of sealing masks, such as KN95 and FFP2 masks, are properly fitted, a result which could be used to help protect healthcare workers and the public from contracting or transmitting COVID-19.

 

Given the importance of masks in slowing the spread of COVID-19 and other airborne viruses, it’s essential that they fit properly, especially in healthcare settings

Eugenia O’Kelly

The researchers, from the University of Cambridge, tested a variety of materials to construct a new inexpensive and reliable method for assessing the fit of masks. Commercial testing equipment has been in extremely short supply since the outbreak of the COVID-19 pandemic, forcing many healthcare institutions to abandon regular fit-testing of their staff.

Their results, published in the journal Disaster Medicine and Public Health Preparedness, found that widely available alternatives, such as aroma diffusers and extra-large freezer bags, can be used to build a qualitative fit-testing setup that performs at a similar level to commercial solutions.

While commercial kits typically cost several hundred pounds, the Cambridge setup can be made for under £35. In addition to its potential benefits to the healthcare industry, this inexpensive setup can be used by anyone who wants to test the fit of their mask at home.

The researchers caution, however, that their setup will only test the fit of sealing masks with high filtration ability, such as N95, FFP3, KN95 or FFP2 masks. The method cannot be used to test the fit of surgical or fabric masks, as these do not typically offer the fit or filtration necessary to pass a qualitative fit-test.

Sealing masks offer the wearer a high level of protection, but only if they fit properly, with no gaps between the mask and the wearer’s face. Previous studies have found that even if the mask material is highly efficient at filtering fine particles, the effectiveness of the mask is hampered by an imperfect seal.

“So far, there has not been an inexpensive, accessible and reliable way of testing the fit of sealing masks,” said Eugenia O’Kelly from Cambridge’s Department of Engineering, the study’s first author. “Shortages of the fit-testing equipment that healthcare facilities normally use have left some of them unable to test their workers. And those who do not work in healthcare have had no reliable way to ensure their masks fit.”

Most healthcare facilities use qualitative fit-testing methods on their staff, as these are faster and cheaper than quantitative methods. Qualitative fit-testing requires three key pieces of equipment: a testing solution, a diffuser to atomise the solution, and a testing hood.

To carry out a typical fit-test, a user places the hood over their head while wearing a mask, and the solution is aerosolised into the enclosure as a fine mist. The solution is usually sweet or bitter. The fit of the mask is assessed by how well the user can taste the solution while nodding their head or speaking. If the mask fits the wearer, they will not be able to taste the solution.

When COVID-19 struck, the increase in demand for fit-testing supplies, combined with breakdowns in manufacturing and supply chains, meant it became very difficult to get qualitative test equipment, with wait times extending weeks or even months.

“Solving the fit-testing supply crisis is critical to enable hospitals and businesses to properly protect their workers,” said O’Kelly.

Meanwhile, those outside healthcare facilities who use sealing face masks have had no reliable way to determine the fit of their masks. “Many people are using KN95 or FFP2 masks,” said O’Kelly. “While these masks can offer high levels of protection, they do not fit everyone. We also wanted to offer a way for the public, particularly those who are at high risk, to evaluate the fit of these masks for themselves.”

Previous research has assessed the safety and efficacy of homemade testing solutions; however, no effective alternatives to the atomising equipment or enclosures had yet been identified.

Now, the researchers have identified alternatives to these pieces of the testing apparatus which are around a quarter of the cost of commercial equipment and are readily available from many retailers, including Amazon.

To diffuse the solution, the researchers tested an aroma diffuser, a humidifier, a mist maker and a spray bottle. For the enclosure, they tested a plastic bag, a testing hood, a clear storage cube and no enclosure. Testers first underwent quantitative fit-testing to establish how well each mask fitted their face before trying the qualitative methods. Quantitative testing measures the number of particles inside and outside the mask and is highly accurate; however, it is also time-consuming and expensive, which is why qualitative testing is more frequently used in healthcare settings.

Using an N95 mask from 3M and a KN95 mask from a Chinese manufacturer, the testers then assessed the alternative devices and enclosures. A solution of sodium saccharin – an artificial sweetener – was aerosolised for 60 seconds at a time, and testers were asked whether they could taste the sweetener or not. The test was then repeated with the tester causing an intentional gap in the fit by placing the tip of a finger between the mask and their face.

They found that the combination of an aroma diffuser and a small container, such as a large plastic bag, provided the most accurate and most sensitive setup, with results comparable to commercial qualitative fit-testing solutions.
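To quantify how well each combination performed, one can score paired trials: with a well-sealed mask the tester should not taste the sweetener, and with a deliberate gap they should. The sketch below shows this arithmetic; the trial outcomes are invented placeholders, not data from the study.

```python
# Score a fit-testing setup from paired trials (placeholder outcomes, not study data).
def score_setup(sealed_tasted: list, leak_tasted: list) -> dict:
    # Specificity: fraction of sealed-mask trials with no taste (no false alarms).
    specificity = sealed_tasted.count(False) / len(sealed_tasted)
    # Sensitivity: fraction of deliberate-leak trials where the taste was detected.
    sensitivity = leak_tasted.count(True) / len(leak_tasted)
    return {"sensitivity": sensitivity, "specificity": specificity}

print(score_setup(sealed_tasted=[False, False, False, True],
                  leak_tasted=[True, True, True, False]))
# -> {'sensitivity': 0.75, 'specificity': 0.75}
```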

“Our homemade replacement requires further testing for safety and efficacy: in particular, the use of a plastic bag to concentrate the vapour remains a safety concern,” said O’Kelly. “However, we were happy to find an inexpensive setup to assess the fit of masks when used in combination with homemade fit-testing solution. Given the importance of masks in slowing the spread of COVID-19 and other airborne viruses, it’s essential that they fit properly, especially in healthcare settings.”

More information is available at www.facemaskresearch.com.

 

Reference:
Eugenia O’Kelly et al. ‘Performing Qualitative Mask Fit Testing Without a Commercial Kit: Fit Testing Which Can Be Performed at Home and at Work.’ Disaster Medicine and Public Health Preparedness (2020). DOI: 10.1017/dmp.2020.352



Remdesivir Likely To Be Highly Effective Antiviral Against SARS-CoV-2 For Some Patients

Creative rendition of SARS-COV-2 virus particles
source: www.cam.ac.uk

 

The drug remdesivir is likely to be a highly effective antiviral against SARS-CoV-2, according to a new study by a team of UK scientists. Writing in Nature Communications, the researchers describe giving the drug to a patient with COVID-19 and a rare immune disorder, and observing a dramatic improvement in his symptoms and the disappearance of the virus.

 

Our patient’s unusual condition gave us a rare insight into the effectiveness of remdesivir as a treatment for coronavirus infection

Nicholas Matheson

The response to the COVID-19 pandemic has been hampered by the lack of effective antiviral drugs against SARS-CoV-2, the coronavirus that causes the disease. Scientists had pinned their hopes on the drug remdesivir, originally developed to treat hepatitis C and subsequently tested against Ebola. However, results from large clinical trials have been inconclusive, and in early October the World Health Organization (WHO) announced that the drug did not significantly reduce mortality rates. The picture is more complicated than a single mortality figure, though, and a clinical team has now used a different approach to determine the effects of the drug on COVID-19 in a closely monitored patient.

Dr James Thaventhiran from the MRC Toxicology Unit at the University of Cambridge said: “There have been different studies supporting or questioning remdesivir’s effectiveness, but some of those conducted during the first wave of infection may not be optimal for assessing its antiviral properties.

“Mortality is due to a combination of factors, likely including unchecked viral replication and, importantly, the response of the immune system. A clinical trial that looks only at remdesivir’s impact on mortality will have difficulty distinguishing between these two factors. This limits our ability to ask the simple question: how good is remdesivir as an antiviral?”

To answer this question, a team led by scientists at the University of Cambridge and Barts Health NHS Trust examined the case of a 31-year-old man with X-linked agammaglobulinaemia (XLA), a rare genetic condition that affects the body’s ability to produce antibodies and hence fight infection.

The patient’s illness began with fever, cough, nausea and vomiting, and on day 19 he tested positive for SARS-CoV-2. His symptoms persisted and on day 30 he was admitted to hospital, where he was given supplemental oxygen due to breathing difficulties.

Unusually, his fever and inflammation of the lungs persisted for longer than 30 days, but without causing severe breathing problems or spreading to other organs. The researchers say this may have been due to his inability to produce antibodies – although antibodies fight infection, they can also cause damage to the body and even lead to severe disease.

At first, the patient was treated with hydroxychloroquine and azithromycin, which had little effect, and the treatments were stopped on day 34. The patient then commenced a ten-day course of remdesivir. Within 36 hours, his fever and shortness of breath had improved and his nausea and vomiting ceased. Rising oxygen saturation allowed him to be taken off supplemental oxygen.

This dramatic clinical response was accompanied by a progressive decrease in levels of C-reactive protein (CRP), a substance produced by the liver in response to inflammation. At the same time, doctors saw an increase in the number of his immune cells known as lymphocytes, and chest scans showed that his lung inflammation was clearing. The patient was discharged on day 43.

A week after discharge, the patient’s fever, shortness of breath and nausea returned. He was readmitted to hospital on day 54 and given supplemental oxygen. He again tested positive for SARS-CoV-2, was found to have lung inflammation, and his CRP levels had increased and his lymphocyte count fallen.

On day 61, the patient began treatment with a further ten-day course of remdesivir. Once again, his symptoms improved rapidly, his fever dropped and he was taken off supplemental oxygen. His CRP and lymphocyte count normalised. Following additional treatment with convalescent plasma on days 69 and 70, he was discharged three days later and is no longer symptomatic.

The team found that the patient’s virus levels fell progressively during his first course of remdesivir, corresponding with the improvement in his symptoms. His virus levels increased again, as did his symptoms, when the first course of the treatment ceased, but the effect of the second course of remdesivir was even more rapid and complete. By day 64, he was no longer testing positive for the coronavirus.

The patient’s inability to clear his infection without antiviral medication is very likely to be due to his lack of antibodies, say the researchers. However, there are other immune cells that contribute to fighting infection, including those known as CD8+ T cells. The team observed that the patient was able to produce CD8+ T cells that responded to the ‘spike protein’ on the surface of the virus – spike proteins give the virus its characteristic crown profile (hence the name coronavirus). While insufficient to clear the infection spontaneously, this likely contributed to the clearance of virus during the second course of remdesivir.

Dr Nicholas Matheson from the Cambridge Institute of Therapeutic Immunology and Infectious Disease (CITIID) at the University of Cambridge added: “Our patient’s unusual condition gave us a rare insight into the effectiveness of remdesivir as a treatment for coronavirus infection. The dramatic response to the drug – on repeated challenge – suggests that it can be a highly effective treatment, at least for some patients.”

The team further suspect that remdesivir is likely to be most beneficial when administered early in infection, before the virus is able to trigger a potentially catastrophic immune response. They say that the course of their patient’s disease also underscores the important – but often conflicting – roles that antibodies play in protecting us from infection.

“The fact that our patient was unable to fight off the disease without treatment suggests that antibodies contribute to the control of SARS-CoV-2,” explained Dr Matthew Buckland from the Department of Clinical Immunology, Barts Health, London. “But this lack of antibodies may also have prevented his COVID-19 from becoming life-threatening, because he had no antibodies to trigger a damaging immune response.

“All of this suggests that treatments will need to be tailored for individual patients, depending on their underlying condition – for example, whether it is the virus that is causing the symptoms, or the immune response. The extended viral monitoring in our study was clinically necessary because in April 2020 we didn’t know if this drug would be effective. Adopting this approach more widely could further clarify how best to use remdesivir for clinical benefit.”

The research was supported by the Medical Research Council, the NIHR Bioresource, NHS Blood and Transplant, Wellcome and the European Union’s Horizon 2020 programme.

Reference
Buckland, MS et al. Successful treatment of COVID-19 with remdesivir in the absence of humoral immunity, a case report. Nat Comms; 14 Dec 2020; DOI: 10.1038/s41467-020-19761-2





No Deal Brexit Could Have Detrimental Impact For Four Million People In UK Living With a Rare Disease

Union Flag and EU Flag
source: www.cam.ac.uk

 

Experts have warned that a ‘no deal’ Brexit will result in the exclusion of the UK from the 24 European Reference Networks (ERNs) that were established to improve the care of patients bearing the lifelong burden of rare diseases, which require highly specialised diagnosis and treatment.

 

Rare diseases are rare, and experts are rarer still. European Reference Networks were set up because no single country has the expertise or resources to cover all of the known rare diseases, which number in the thousands

Marc Tischkowitz

One in 17 UK citizens lives with a rare disease – defined as a condition that affects fewer than one in 2,000 people in the general population. A group of experts has written to The Lancet highlighting their concerns about the detrimental impact a no deal Brexit will have on these individuals.

“Rare diseases are rare, and experts are rarer still,” said Dr Marc Tischkowitz from the University of Cambridge, who helped coordinate the letter. “European Reference Networks were set up because no single country has the expertise or resources to cover all of the known rare diseases, which number in the thousands. They’ve played a pivotal role in harnessing the collective knowledge across the continent and in developing sustainable healthcare to treat those affected.”

The UK has been at the forefront of the creation and development of these virtual networks, which involve healthcare providers across Europe. As a result, write the experts, it has been able “to reap the benefits of closer collaboration with experts and patient advocates throughout Europe”.

The ERNs have made it much easier to develop guidelines, create disease registries, build research collaborations, and create new education and training programmes. Crucially, they have directly improved patient care by establishing a pan-European platform that brings international experts together to advise on patient-specific complex problems and therapeutic options where insufficient expertise exists in any one country alone.

Dr Tischkowitz added: “Leaving the EU without an agreement on UK participation in the Networks means we potentially write off years of progress made by UK clinicians, researchers and patient advocates, while also reducing access to clinical trials and funding. Most importantly, it will diminish our ability to provide the best care for the millions of children and adults with rare diseases and complex conditions in the future.”

The letter has 73 signatories in total: 20 representing patient support groups, and 53 senior clinicians and researchers who are currently members of a European Reference Network and who will be removed from the networks as of 1 January if no agreement is reached.

Allison Watson co-founded Ring20, a charity that supports people living with ring chromosome 20 Syndrome, an ultra-rare disease that affects her young adult son. She is also a co-lead for the EpiCARE ERN for rare and complex epilepsies.

“I have been hugely encouraged by the change that being part of an ERN can bring, for people like my son and many others living with ultra-rare diseases,” said Watson. “I believe we would not have managed this working with just UK rare disease organisations.”

Initiatives already delivered through the EpiCARE ERN include heightened awareness of rare epilepsies (including ring chromosome 20 Syndrome) across the 28 EpiCARE centres, long-overdue updates to Orphanet, more information and education for healthcare practitioners and patient families in the form of leaflets and patient journeys, and updated Clinical Practice Guidelines that aim to simplify and speed up diagnosis and improve care by addressing unmet needs.

Watson added: “With thousands of rare diseases, many of them ultra-rare and affecting only a handful of people living in the UK, is it cost-effective, or even possible, for the UK to deliver effective services and research for these people alone? I believe only through collaboration with our European partners and others around the world can we truly meet the needs of those affected and ultimately improve their outcomes and quality of life.”

Beverley Power, chair of CDH UK, the congenital diaphragmatic hernia support charity, says that one of the main barriers to research within the field of rare diseases is access to patients and patient data.

“Since joining the ERNICA European Reference Network, the access to patients and data has become broader for the UK and the rest of Europe,” she explained. “It has enabled charities like CDH UK to better understand other healthcare settings and to be able to signpost newly diagnosed parents and patients with ongoing medical needs in a much better direction. It has also introduced new and innovative ways to collaborate in order to effect better outcomes and quality of life for patients and their families, which ultimately can potentially impact the economic implications of treating rare diseases in the UK and overseas.”

Reference
Tischkowitz, M et al. A no-deal Brexit will be detrimental to people with rare diseases. Lancet; 12 Dec 2020; DOI: 10.1016/S0140-6736(20)32631-3

Correspondence pieces represent the views of the authors and not necessarily the views of The Lancet or any Lancet specialty journal. Unlike Articles containing original research, this Correspondence was not externally peer reviewed.



Gene Therapy Injection In One Eye Surprises Scientists By Improving Vision In Both

A young man's eye
source: www.cam.ac.uk

 

Injecting a gene therapy vector into one eye of someone suffering from LHON, the most common cause of mitochondrial blindness, significantly improves vision in both eyes, scientists have found.

 

Saving sight with gene therapy is now a reality

Patrick Yu-Wai-Man

In a landmark phase 3 clinical trial, the international team, coordinated by Dr Patrick Yu-Wai-Man from the University of Cambridge and Dr José-Alain Sahel from the University of Pittsburgh and Institut de la Vision, Paris, successfully treated 37 patients suffering from Leber hereditary optic neuropathy (LHON). Subject to further trials, the treatment could help thousands of people across the world to regain and retain some of their sight.

The study, published today in the journal Science Translational Medicine, indicates that 78% of treated patients experienced significant visual improvement in both eyes. It suggests that the improvement in vision in untreated eyes could be due to the transfer of viral vector DNA from the injected eye.

LHON affects a specific type of retinal cell, known as retinal ganglion cells, causing optic nerve degeneration and rapidly worsening vision in both eyes. Within a few weeks of disease onset, the vision of most people affected deteriorates to levels at which they are considered legally blind. Visual recovery occurs in less than 20% of cases, and few achieve vision better than 20/200 (the level at which only the largest letter on a standard eye chart can be read). LHON affects approximately 1 in 30,000 people, mostly men, with symptoms usually emerging in their 20s and 30s. The majority of patients carry the m.11778G>A mutation in the MT-ND4 gene. Existing treatment for this blinding optic neuropathy remains limited.

“As someone who treats these young patients, I get very frustrated about the lack of effective therapies,” said senior investigator Dr Sahel, a professor of ophthalmology at the University of Pittsburgh. “These patients rapidly lose vision in the course of a few weeks to a couple of months. Our study provides a big hope for treating this blinding disease in young adults.”

The researchers injected rAAV2/2-ND4 – a viral vector containing modified cDNA – into the vitreous cavity at the back of one eye of 37 patients who had suffered vision loss for between 6 and 12 months. Their other eye received a sham injection. The technology, called mitochondrial targeting, was developed by the Institut de la Vision in Paris and licensed to GenSight Biologics.

International coordinating investigator and neuro-ophthalmologist Dr Yu-Wai-Man, from Cambridge’s Department of Clinical Neurosciences and Moorfields Eye Hospital, London, said: “We expected vision to improve in the eyes treated with the gene therapy vector only. Rather unexpectedly, both eyes improved for 78% of patients in the trial following the same trajectory over 2 years of follow-up.”

Treated eyes showed a mean improvement in best-corrected visual acuity (BCVA) of 15 letters on an ETDRS chart, representing three lines of vision, while a mean improvement of 13 letters was observed in the sham treated eyes. As some patients were still in the dynamic phase of the disease process upon enrolment, the visual gain from the nadir (worst BCVA for each eye) was even larger, reaching 28.5 letters for the treated eyes and 24.5 letters for sham-treated eyes.

Dr Yu-Wai-Man said: “By replacing the defective MT-ND4 gene, this treatment rescues the retinal ganglion cells from the destructive effects of the m.11778G>A mutation, preserving function and improving the patient’s visual prognosis. The outcomes can be life-changing.”

The researchers found that treated eyes were around three times more likely to achieve vision better than or equal to 20/200. Patient-reported outcome measures evaluated using the National Eye Institute Visual Function Questionnaire-25 (NEI VFQ-25) also confirmed the positive impact of treatment on quality of life and psychosocial well-being.

The researchers then conducted a study in cynomolgus macaques to investigate how the treatment of one eye could bring about improvement in the other. Macaques have a visual system similar to that of humans, which allows the distribution and effects of the gene therapy vector to be studied in much greater detail. A unilateral injection of rAAV2/2-ND4 was administered and after three months, tissues from various parts of the eye and the brain were analysed to detect and quantify the presence of viral vector DNA using a transgene-specific quantitative PCR assay.

Viral vector DNA was detected in the anterior segment, retina and optic nerve of the untreated eye. The unexpected visual improvement observed in the untreated eyes could therefore reflect interocular diffusion of rAAV2/2-ND4. Further investigations are needed to confirm these findings and to establish whether other mechanisms also contribute to the bilateral improvement.
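For readers unfamiliar with absolute quantification by qPCR, the underlying arithmetic is an inversion of a fitted standard curve, Ct = slope × log10(copies) + intercept. The sketch below shows this calculation; the slope and intercept are illustrative placeholders, not values from the study.

```python
# Recover copy number from a measured Ct via an assumed standard curve.
# A slope of -3.32 corresponds to ~100% amplification efficiency.
def copies_from_ct(ct: float, slope: float = -3.32, intercept: float = 38.0) -> float:
    return 10 ** ((ct - intercept) / slope)

print(f"Ct 30.0 -> {copies_from_ct(30.0):.0f} vector genome copies")  # ≈ 257
```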

Dr Yu-Wai-Man said: “Saving sight with gene therapy is now a reality. The treatment has been shown to be safe and we are currently exploring the optimal therapeutic window.”

“Our approach isn’t just limited to vision restoration,” added Dr Sahel. “Other mitochondrial diseases could be treated using the same technology.”

 

Notes

rAAV2/2-ND4 (GS010) is a recombinant replication-defective adeno-associated virus, serotype 2, which contains a modified cDNA encoding the human wild-type mitochondrial ND4 protein and a specific mitochondrial targeting sequence (MTS) for directing the protein to the mitochondrial compartment.

The ETDRS chart consists of rows of 5 letters each and it is used to measure visual acuity.
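A small worked conversion makes the reported letter scores concrete: with 5 letters per ETDRS line and the conventional 0.02 logMAR per letter, a 15-letter gain corresponds to three lines of vision. A minimal sketch:

```python
# Convert ETDRS letter gains to lines of vision and logMAR change
# (5 letters per line; each letter = 0.02 logMAR by convention).
def letters_to_lines(letters: float) -> float:
    return letters / 5.0

def letters_to_logmar_change(letters: float) -> float:
    return -0.02 * letters  # gaining letters lowers (improves) logMAR

for gain in (15, 13, 28.5, 24.5):  # gains reported in the trial
    print(f"{gain} letters = {letters_to_lines(gain):.1f} lines "
          f"({letters_to_logmar_change(gain):+.2f} logMAR)")
```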

 

Funding

The study was fully funded and sponsored by GenSight Biologics.

 

Reference

Patrick Yu-Wai-Man et al., ‘Bilateral Visual Improvement with Unilateral Gene Therapy Injection for Leber Hereditary Optic Neuropathy’; Science Translational Medicine (9 December 2020). DOI: 10.1126/scitranslmed.aaz7423



Cambridge Researchers Awarded European Research Council Funding

European flags outside EU in Belgium
source: www.cam.ac.uk

 

Five researchers at the University of Cambridge have won Consolidator Grants from the European Research Council (ERC), Europe’s premier funding organisation for frontier research.

 

Three hundred and twenty-seven mid-career researchers were today awarded Consolidator Grants by the ERC, totalling €655 million. The UK has 50 grantees in this year’s funding round. The funding is part of the EU’s current research and innovation programme, Horizon 2020.

The ERC Consolidator Grants are awarded to outstanding researchers of any nationality and age, with at least seven and up to 12 years of experience after PhD, and a scientific track record showing great promise.

The research projects proposed by the new grantees cover a wide range of topics in physical sciences and engineering, life sciences, as well as social sciences and humanities.

From the University of Cambridge, the following researchers were named as grantees:

 

Vasco Carvalho, Professor of Macroeconomics and Director of Cambridge-INET

Project title: Micro Structure and Macro Outcomes.

 

Professor Tuomas Knowles of the Yusuf Hamied Department of Chemistry

Project title: Digital Protein Biophysics of Aggregation.

 

Dr Neel Krishnaswami of the Computer Laboratory

Project title: Foundations of Type Inference for Modern Programming Languages.

 

Dr Kaisey Mandel of the Institute of Astronomy and the Statistical Laboratory of the Department of Pure Mathematics and Mathematical Statistics

Project title: Next-Generation Data-Driven Probabilistic Modelling of Type Ia Supernova SEDs in the Optical to Near-Infrared for Robust Cosmological Inference.

 

Professor Silvia Vignolini of the Yusuf Hamied Department of Chemistry

Project title: Sym-Bionic Matter: developing symbiotic relationships for light-matter interaction.

