
Farmed Carnivores May Become ‘Disease Reservoirs’ Posing Human Health Risk


 

Carnivorous animals lack key genes needed to detect and respond to infection by pathogens, a study has found.

 

We’ve found that a whole cohort of inflammatory genes is missing in carnivores

Clare Bryant

Farming large numbers of carnivores, like mink, could allow the formation of undetected ‘disease reservoirs’, in which a pathogen could spread to many animals and mutate to become a risk to human health.

Research led by the University of Cambridge has discovered that carnivores have a defective immune system, which makes them likely to be asymptomatic carriers of disease-causing pathogens.

Three key genes in carnivores that are critical for gut health were found to have lost their function. If these genes were working, they would produce protein complexes called inflammasomes to activate inflammatory responses and fight off pathogens. The study is published today in the journal Cell Reports.

The researchers say that the carnivorous diet, which is high in protein, is thought to have antimicrobial properties that could compensate for the loss of these immune pathways in carnivores – any gut infection is expelled by the production of diarrhoea. But the immune deficiency means that other pathogens can reside undetected elsewhere in these animals.

“We’ve found that a whole cohort of inflammatory genes is missing in carnivores – we didn’t expect this at all,” said Professor Clare Bryant in the University of Cambridge’s Department of Veterinary Medicine, senior author of the paper.

She added: “We think that the lack of these functioning genes contributes to the ability of pathogens to hide undetected in carnivores, to potentially mutate and be transmitted becoming a human health risk.”

Zoonotic pathogens are those that live in animal hosts before jumping to infect humans. The COVID-19 pandemic, thought to have originated in a wild animal, has shown the enormous damage that can be wrought by a novel human disease. Carnivores – including mink, dogs, and cats – are the biggest carriers of zoonotic pathogens.

Two of the three genes appear to be in the process of being lost entirely: the DNA is still present but it is not expressed, meaning they have become ‘pseudogenes’ and no longer function. The third gene has developed a unique mutation that causes two proteins called caspases to be fused together, changing their function so that they can no longer respond to some pathogens in the animal’s body.

“When you have a large population of farmed carnivorous animals, like mink, they can harbour a pathogen – like SARS-CoV-2 and others – and it can mutate because the immune system of the mink isn’t being activated. This could potentially spread into humans,” said Bryant.

The researchers say that the results are not a reason to be concerned about COVID-19 being spread by dogs and cats. There is no evidence that these domestic pets carry or transmit COVID-19. It is when large numbers of carnivores are kept together in close proximity that a large reservoir of the pathogen can build up amongst them, and potentially mutate.

This research was funded by Wellcome.

Reference
Digby, Z. et al: ‘Evolutionary loss of inflammasomes in the Carnivora and implications for the carriage of zoonotic infections.’ Cell Reports, August 2021. DOI: 10.1016/j.celrep.2021.109614



Female Scientists Lead Cambridge Success In Royal Society Awards


 

Professor Dame Jocelyn Bell Burnell has become only the second woman to be awarded the Royal Society’s prestigious Copley Medal, the world’s oldest scientific prize.

 

I hope there will be many more female Copley winners in the near future

Jocelyn Bell Burnell

Bell Burnell is one of twelve former and current Cambridge researchers, including six women, to be recognised in 2021 for their exceptional research and outstanding contributions to science.

Dame Jocelyn has been honoured for her work on the discovery of pulsars in the 1960s while she was a postgraduate student at New Hall (now Murray Edwards College) carrying out research at Cambridge’s Cavendish Laboratory.

Past winners of the Copley Medal have included Charles Darwin, Albert Einstein and Dorothy Crowfoot Hodgkin. Dame Jocelyn said: “I am delighted to be the recipient of this year’s Copley Medal, a prize which has been awarded to so many incredible scientists.

“With many more women having successful careers in science, and gaining recognition for their transformational work, I hope there will be many more female Copley winners in the near future.

“My career has not fitted a conventional – male – pattern. Being the first person to identify pulsars would be the highlight of any career; but I have also swung sledgehammers and built radio telescopes; set up a successful group of my own studying binary stars; and was the first female president of the Institute of Physics and of the Royal Society of Edinburgh.

“I hope that my work and presence as a senior woman in science continues to encourage more women to pursue scientific careers”.

The Copley Medal award includes a £25,000 gift which Dame Jocelyn will add to the Institute of Physics’ Bell Burnell Graduate Scholarship Fund, which provides grants to graduate students from under-represented groups in physics.

Three female scientists currently working at Cambridge have been recognised in 2021. Professor Sadaf Farooqi from the MRC Metabolic Diseases Unit receives the Croonian Medal and Lecture, together with Sir Stephen O’Rahilly, for their seminal discoveries regarding the control of human body weight, resulting in novel diagnostics and therapies, which improve human health.

Dr Serena Nik-Zainal from the MRC Cancer Unit has been awarded the Francis Crick Medal and Lecture, for her contributions to understanding the aetiology of cancers by her analyses of mutation signatures in cancer genomes, which is now being applied to cancer therapy.

Professor Anne Ferguson-Smith from the Department of Genetics and currently the University’s Pro-Vice-Chancellor for Research receives the Buchanan Medal, for her pioneering work in epigenetics, her interdisciplinary work on genomic imprinting, the interplay between the genome and epigenome, and how genetic and environmental influences affect development and human diseases.

Former Cavendish Laboratory Research Fellow, Professor Michelle Simmons, has won the Bakerian Medal and Lecture, for her seminal contributions to our understanding of nature at the atomic-scale by creating a sequence of world-first quantum electronic devices in which individual atoms control device behaviour.

Professor Frances Kirwan, alumna and Honorary Fellow of Clare College, received the Sylvester Medal, for her research on quotients in algebraic geometry, including links with symplectic geometry and topology, which has had many applications.

Other current Cambridge researchers honoured include Dr Sjors Scheres from the MRC Laboratory of Molecular Biology. Scheres has been awarded the Leeuwenhoek Medal and Lecture for his ground-breaking contributions and innovations in image analysis and reconstruction methods in electron cryo-microscopy, enabling the structure determination of complex macromolecules of fundamental biological and medical importance to atomic resolution.

Emeritus Professor Michael Green from the Department of Applied Mathematics and Theoretical Physics has been awarded Royal Medal A for crucial and influential contributions to the development of string theory over a long period, including the discovery of anomaly cancellation.

The Royal Society’s President, Sir Adrian Smith, said: “Through its medals and awards the Royal Society recognises those researchers and science communicators who have played a critical part in expanding our understanding of the world around us.”

“From advancing vaccine development to catching the first glimpses of distant pulsars, these discoveries shape our societies, answer fundamental questions and open new avenues for exploration.”

The full list of 2021 Royal Society medal, award and prize winners is available on the Royal Society website.



Hospital-Acquired COVID-19 Tends To Be Picked Up From Other Patients, Not From Healthcare Workers

 

The majority of patients who contracted COVID-19 while in hospital did so from other patients rather than from healthcare workers, concludes a new study from researchers at the University of Cambridge and Addenbrooke’s Hospital.

 

The fact that the vast majority of infections were between patients suggests that measures taken by hospital staff to prevent staff transmitting the virus to patients, such as the wearing of masks, were likely to have been effective

Chris Illingworth

The study provides unprecedented detail on how infections might spread in a hospital context, showing that a minority of individuals can cause most of the transmission.

The researchers analysed data from the first wave of the pandemic, between March and June 2020.  While a great deal of effort is made to prevent the spread of viruses within hospital by keeping infected and non-infected individuals apart, this task is made more difficult during times when the number of infections is high. The high level of transmissibility of the virus and the potential for infected individuals to be asymptomatic both make this task particularly challenging.

Looking back at data from the first wave, researchers identified five wards at Addenbrooke’s Hospital, part of Cambridge University Hospitals (CUH) NHS Foundation Trust, where multiple individuals, including patients and healthcare workers, tested positive for COVID-19 within a short space of time, suggesting that a local outbreak might have occurred.

Using new statistical methods that combine viral genome sequence data with clinical information about the locations of individuals, the researchers identified cases where the data were consistent with transmission occurring between individuals in the hospital. Looking in detail at these transmission events highlighted patterns in the data.
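
To make the general approach concrete – this is an illustrative sketch, not the study’s actual algorithm – one can flag pairs of cases whose genomes are near-identical and who overlapped on a ward at compatible times. The case records, SNP sets and both thresholds below are invented for illustration.

```python
# Illustrative sketch: flag plausible transmission pairs by combining genomic
# similarity with location and timing overlap. All data and thresholds are
# invented; the study's actual statistical method is more sophisticated.
from itertools import combinations

cases = [
    # (case_id, ward, symptom-onset day, set of SNPs vs a reference genome)
    ("P1", "ward_A", 3, {"C241T", "A23403G"}),
    ("P2", "ward_A", 7, {"C241T", "A23403G", "G25563T"}),
    ("P3", "ward_B", 6, {"C241T", "T445C"}),
]

MAX_SNP_DISTANCE = 2   # assumed: genomes this close are 'consistent'
MAX_DAY_GAP = 10       # assumed: onsets this close are 'compatible'

def snp_distance(snps_a, snps_b):
    """Count SNPs present in one genome but not the other."""
    return len(snps_a ^ snps_b)

plausible = [
    (a[0], b[0])
    for a, b in combinations(cases, 2)
    if a[1] == b[1]                                   # shared ward
    and abs(a[2] - b[2]) <= MAX_DAY_GAP               # compatible timing
    and snp_distance(a[3], b[3]) <= MAX_SNP_DISTANCE  # compatible genomes
]
print(plausible)  # [('P1', 'P2')]
```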

The results of the study, published today in eLife, showed that patients who were infected in the hospital were mostly infected by other patients, rather than by hospital staff. Out of 22 cases where patients were infected in hospital, 20 were the result of the virus spreading from patients to other patients.

Dr Chris Illingworth, a lead author on the study, who carried out his research while at Cambridge’s MRC Biostatistics Unit, said: “The fact that the vast majority of infections were between patients suggests that measures taken by hospital staff to prevent staff transmitting the virus to patients, such as the wearing of masks, were likely to have been effective.

“But it also highlights why it is important that patients themselves are screened for COVID-19 regularly, even if asymptomatic, and wear face masks where possible.”

The study found contrasting results among healthcare workers, who were almost as likely to be infected by patients as they were by other healthcare workers. This was one piece of evidence that motivated the decision to upgrade the respiratory protection worn by healthcare workers in COVID-19 wards at CUH. A recent Cambridge study indicated that this resulted in staff being better protected against catching COVID-19.

The researchers also found a trend towards individuals either infecting no one else, or infecting multiple other people – just over a fifth of patients (21%) caused 80% of the infections. This phenomenon is sometimes called ‘superspreading’ and can make infection control very challenging. Whether or not an individual can be identified in advance as being more or less likely to pass on the virus is an ongoing topic of research.
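
This concentration of transmission is characteristic of an overdispersed (‘heavy-tailed’) offspring distribution. The short simulation below illustrates the effect; the reproduction number R0 and dispersion parameter k are assumed values for illustration, not estimates from this study.

```python
# Illustrative only: with an overdispersed (negative binomial) offspring
# distribution, a small share of cases generates most onward infections.
# R0 and dispersion k are assumed values, not estimates from the study.
import numpy as np

rng = np.random.default_rng(1)
R0, k, n_cases = 1.5, 0.2, 10_000
p = k / (k + R0)                       # NB success probability giving mean R0
offspring = rng.negative_binomial(k, p, size=n_cases)

top_fifth = np.sort(offspring)[::-1][: n_cases // 5]
share = top_fifth.sum() / offspring.sum()
print(f"Top 20% of cases cause {share:.0%} of infections")
# With these parameters the top fifth typically accounts for ~80% or more.
```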

Dr William Hamilton, an infectious diseases clinician at CUH and co-lead author on the study said: “Preventing new cases of hospital-based infection is a critical part of our work.  Here we have shown that analysing clinical and viral genome sequence data can produce insights that inform infection control measures, which are so important for protecting patients and healthcare workers alike.”

The research was funded by COG-UK, Wellcome, the Academy of Medical Sciences, the Health Foundation and the NIHR Cambridge Biomedical Research Centre.

Reference
Illingworth, CJR & Hamilton, WL et al. Superspreaders drive the largest outbreaks of hospital onset COVID-19 infections. eLife; 24 Aug 2021; DOI: 10.7554/eLife.67308



10,000 Autistic People To Take Part in the UK’s Largest Study of Autism


An ambitious new research project, Spectrum 10K, launches today and will recruit 10,000 autistic individuals, as well as their relatives, living in the UK.

 

There is an urgent need to better understand the wellbeing of autistic individuals. Spectrum 10K hopes to answer questions such as why some autistic people have epilepsy or poor mental health outcomes and others do not

Simon Baron-Cohen

Spectrum 10K is led by researchers at the world-leading Autism Research Centre (ARC) at the University of Cambridge, together with the Wellcome Sanger Institute and the University of California Los Angeles (UCLA), and will study how biological and environmental factors affect the wellbeing of autistic individuals.

In the UK, there are approximately 700,000 autistic individuals. The level of support needed by autistic individuals varies considerably. Many autistic people have additional physical health conditions such as epilepsy, or mental health conditions such as anxiety or depression.

It is unclear what gives rise to the diversity within the autism spectrum or why some autistic people have better outcomes than others. The project aims to answer this question and to identify what support works best for each individual.

Professor Simon Baron-Cohen, leading Spectrum 10K and Director of the ARC, explained: “There is an urgent need to better understand the wellbeing of autistic individuals. Spectrum 10K hopes to answer questions such as why some autistic people have epilepsy or poor mental health outcomes and others do not.”

Individuals of all ages, genders, ethnicities and intellectual capacities will take part in Spectrum 10K. Eligible participants join by completing an online questionnaire and providing a DNA saliva sample by post. Autistic participants involved in Spectrum 10K can also invite their biological relatives (autistic or otherwise) to participate. Information collected from the questionnaire and DNA saliva sample, and information from health records will be used to increase knowledge and understanding of wellbeing in autism.

Dr James Cusack, CEO of the autism research charity Autistica and an autistic person, said: “We are delighted to support Spectrum 10K. This project enables autistic people to participate in and shape autism research to build a future where support is tailored to every individual’s needs.”

The Spectrum 10K team views autism as an example of neurodiversity and is opposed to eugenics and to any search for a cure to prevent or eradicate autism itself. Instead, their research aims to identify types of support and treatment which alleviate unwanted symptoms and co-occurring conditions that cause autistic people distress.

The Spectrum 10K team collaborates with an Advisory Panel consisting of autistic individuals, parents of autistic children, clinicians, and autism charity representatives to ensure Spectrum 10K is designed in a way that best serves the autistic community. 27 specialist NHS sites around the UK are also helping with recruitment for Spectrum 10K.

Dr Venkat Reddy, Consultant Neurodevelopmental Paediatrician in the Community Child Health Services at Cambridgeshire and Peterborough NHS Foundation Trust, said: “There is a need to conduct further research into autism and co-occurring conditions to enable researchers and clinicians to build a better understanding of autism. I would encourage autistic individuals and their families to consider taking part in Spectrum 10K.”

Chris Packham, naturalist and TV presenter who is also autistic, said: “I’m honoured to be an ambassador of Spectrum 10K because I believe in the value of science to inform the support services that autistic kids and adults will need.”

Paddy McGuinness, actor, comedian, television presenter, and father of three autistic children, said: “As a parent of three autistic children, I am really excited to support Spectrum 10K. This research is important to help us understand what makes every autistic person different, and how best to support them.”

Dr Anna and Alastair Gadney, parents of a teenager with autism and learning difficulties, said: “We have been exploring, over many years, how to implement the best support for our son. We wholeheartedly endorse Spectrum 10K and hope our involvement can help increase understanding of autism and in turn support many families out there.”

Recruitment for Spectrum 10K is now open. Autistic children under the age of 16 must be registered by their parent or legal guardian. Autistic adults who lack the capacity to consent for themselves must be registered by a carer or family member. To register, participants should visit www.spectrum10k.org



Europe-Wide Political Divide Emerging Between Cities and Countryside – Study


 

“Geography of disillusion” poses a major challenge for democratic countries across the continent, according to researchers.

 

As disenchantment rises in European hinterlands, democratic politics risks being eroded from within

Davide Luca

A new study reveals the extent of the political divide opening up between city and countryside right across Europe, with the research suggesting that political polarisation in the 21st century may have a great deal to do with place.

University of Cambridge researchers analysed survey data collected between 2002 and 2018 to gauge the social and civic attitudes of people across the cities, towns and rural areas of 30 European countries.

The findings show that political division throughout the continent runs on a ‘gradient’ of disenchantment and distrust in democracy that increases as it moves from urban centres through suburbs, towns, villages and out into open country.

People in the more rural parts of Europe have the lowest levels of trust in their nation’s current political system – and yet are significantly more likely than their urban counterparts to actually vote in elections.

Those in suburbs, followed by towns and then the countryside, are increasingly more likely to see themselves as politically conservative, and hold anti-immigration and anti-EU views, while city dwellers lean towards the left.

However, it’s not the poorest rural areas where disillusion is strongest, and small town and countryside dwellers report much higher levels of life satisfaction while voicing dissatisfaction with democratic institutions.

Researchers from Cambridge’s Bennett Institute for Public Policy and Department of Land Economy say the study suggests a ‘deepening geographical fracture’ in European societies that could see a return to the stark urban-rural political divides of the early 20th century.

“Those living outside of Europe’s major urban centres have much less faith in politics,” said study co-author Professor Michael Kenny from the Bennett Institute.

“The growth of disenchantment in more rural areas has provided fertile soil for nationalist and populist parties and causes – a trend that looks set to continue.”

“Mainstream politicians seeking to re-engage residents of small towns and villages must provide economic opportunities, but they also need to address feelings of disconnection from mainstream politics and the changes associated with a more globalised economy,” he said.

Across Western Europe, residents of rural areas are on average 33.5% more likely to vote than those in inner cities, but 16% less likely to report a one-unit increase in their trust of political parties on a scale of 0-10. They are also far less likely to engage in political actions such as protests and boycotts.

Conservatism incrementally increases as locations shift from suburb to town to the countryside. Europeans in rural places are an average of 57% more likely to feel one point closer to the right on the political spectrum (on a ten-point scale where five is the centre ground) than a city dweller.

When asked if migration and the EU ‘enrich the national culture’, rural Europeans are 55% more likely than those in cities to disagree by one unit on a ten-unit scale.

However, on issues of the welfare state and trust in police – both iconic in post-war rhetorical battles between left and right – no urban-rural divisions were detected. “Worries about law and welfare may no longer be key to Europe’s political geography in our new populist age,” said Kenny.

Last year, research from the Bennett Institute revealed a global decline in satisfaction with democracy, and the latest study suggests that – in Europe, at least – this is most acute in rural locations.

After discounting characteristics typically thought to influence political attitudes, from education to age, the researchers still found that people in rural housing were 10% more likely than urbanites to report a one unit drop in democratic satisfaction (on a scale of 0-10).

“We find that there is a geography to current patterns of political disillusion,” said Dr Davide Luca of the Land Economy Department, co-author of the study now published in the Cambridge Journal of Regions, Economy and Society.

“As disenchantment rises in European hinterlands, democratic politics risks being eroded from within by people who engage with elections yet distrust the system and are drawn to populist, anti-system parties.”

Of the 30 nations they looked at – the EU27 plus Norway, Switzerland and the UK – France had the sharpest urban-rural divide in political attitudes. “Large cities such as Paris and Lyon are seen to be highly globalised and full of bohemians nicknamed the ‘bobos’, while small towns and rural areas are primarily inhabited by long-term immigrants and the indigenous working classes,” Luca said.

While less pronounced across the Channel, the trend is still very much in evidence in the UK. “Cambridge is a prime example,” explains Luca. “The centre hosts the world’s leading labs and companies, yet greater Cambridge is one of the UK’s least equal cities – and the fenland market towns are even more disconnected from the city’s hyper-globalised core.”

Added Luca: “Ageing populations in small towns and villages combined with years of austerity have put pressure on public services in rural areas – services that are often central to the social connections needed for a community to thrive.

“Reviving these services may be key to reducing the political divides emerging between urban and rural populations across Europe.”



Ageing Cuttlefish Can Remember the Details of Last Week’s Dinner

 

Cuttlefish can remember what, where, and when specific things happened – right up to their last few days of life, researchers have found.

 

The old cuttlefish were just as good as the younger ones in the memory task

Alexandra Schnell

The results, published today in the journal Proceedings of the Royal Society B, are the first evidence of an animal whose memory of specific events does not deteriorate with age.

Researchers from the University of Cambridge, the Marine Biological Laboratory in Woods Hole, Massachusetts, and the University of Caen conducted memory tests on 24 common cuttlefish, Sepia officinalis. Half of these were 10-12 months old – not quite adult – and the other half were in old age at 22-24 months, equivalent to humans in their 90s.

“Cuttlefish can remember what they ate, where and when, and use this to guide their feeding decisions in the future. What’s surprising is that they don’t lose this ability with age, despite showing other signs of ageing like loss of muscle function and appetite,” said Dr Alexandra Schnell in the University of Cambridge’s Department of Psychology, first author of the paper.

As humans age, we gradually lose the ability to remember experiences that happened at particular times and places – for example, what we had for dinner last Tuesday. This is termed ‘episodic memory’, and its decline is thought to be due to deterioration of a part of the brain called the hippocampus.

Cuttlefish do not have a hippocampus, and their brain structure is dramatically different to ours. The ‘vertical lobe’ of the cuttlefish brain is associated with learning and memory. This does not deteriorate until the last two to three days of the animal’s life, which the researchers say could explain why episodic-like memory is not affected by age in cuttlefish.

To conduct the experiment, the cuttlefish were first trained to approach a specific location in their tank marked with a black and white flag. Then they were trained to learn that two foods they commonly eat were available at specific flag-marked locations and after specific delays. At one spot, the flag was waved and a piece of king prawn, their less preferred food, was provided. Live grass shrimp, which they like more, was provided at a different spot where another flag was also waved – but only every three hours. This was repeated for four weeks.

Then the cuttlefishes’ recall of which food would be available, where, and when was tested. To make sure they hadn’t just learned a pattern, the two feeding locations were unique each day. All the cuttlefish – regardless of age – watched which food first appeared at each flag and used that to work out which feeding spot was best at each subsequent flag-waving. This suggests that episodic-like memory does not decline with age in cuttlefish, unlike in humans.

“The old cuttlefish were just as good as the younger ones in the memory task – in fact, many of the older ones did better in the test phase. We think this ability might help cuttlefish in the wild to remember who they mated with, so they don’t go back to the same partner,” said Schnell.

Cuttlefish only breed at the end of their life. The researchers think that by remembering who they mated with, where, and how long ago, cuttlefish can spread their genes widely by mating with as many partners as possible.

Cuttlefish have short lifespans – most live until around two years old – making them a good subject to test whether memory declines with age. Since it is impossible to test whether animals are consciously remembering things, the authors used the term ‘episodic-like memory’ to refer to the ability of cuttlefish to remember what, where and when specific things happened.

This research was funded by the Royal Society and the Grass Foundation.

Reference
Schnell, AK et al: ‘Episodic-like memory is preserved with age in cuttlefish.’ Proceedings of the Royal Society B, August 2021. DOI: 10.1098/rspb.2021.1052



Worsening GP Shortages In Disadvantaged Areas Likely To Widen Health Inequalities


 

Areas of high socioeconomic disadvantage are being worst hit by shortages of GPs, a trend that is only worsening with time and is likely to widen pre-existing health inequalities, say researchers at the University of Cambridge.

 

The government has made reducing health inequalities a core commitment, but this will be challenging with the increasing shortage of GPs in areas of high socioeconomic disadvantage, where health needs are greatest

Claire Nussbaum

In a study published today in BJGP Open, a team from the University of Cambridge looked at the relationship between shortages in the healthcare workforce and levels of deprivation. The team found significantly fewer full-time equivalent (FTE) GPs per 10,000 patients in practices within areas of higher levels of deprivation. This inequality has widened slightly over time. By December 2020, there were on average 1.4 fewer FTE GPs per 10,000 patients in the most deprived areas compared to the least deprived areas.

The same was the case for total direct patient care staff (all patient-facing general practice staff excluding GPs and nurses), with 1.5 fewer FTE staff per 10,000 patients in the most deprived areas compared to the least deprived areas.

The lower GP numbers in deprived areas were compensated for, in part, by higher numbers of nurses.

The analysis used data captured between September 2015 and December 2020 from the NHS Digital General Practice Workforce collection. The team compared this workforce data against practice population sizes and levels of deprivation across England.
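
As a rough sketch of the kind of comparison involved – with hypothetical practice records standing in for the NHS Digital data – the FTE-per-10,000-patients measure can be computed and grouped by deprivation like this:

```python
# A hypothetical miniature of the analysis above: FTE GPs per 10,000 patients,
# grouped by practice-area deprivation quintile (1 = most deprived).
# Column names and all figures are invented; the study used the NHS Digital
# General Practice Workforce collection.
import pandas as pd

practices = pd.DataFrame({
    "practice":     ["A", "B", "C", "D"],
    "fte_gps":      [4.0, 3.0, 6.5, 5.0],
    "patients":     [9000, 11000, 10500, 8000],
    "imd_quintile": [1, 1, 5, 5],
})

practices["gps_per_10k"] = practices["fte_gps"] / practices["patients"] * 10_000
print(practices.groupby("imd_quintile")["gps_per_10k"].mean())
# The quintile 1 vs quintile 5 difference in this mean is the kind of gap the
# study reports (1.4 fewer FTE GPs per 10,000 patients by December 2020).
```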

In addition to their report, the team have today launched an interactive dashboard that maps local-level primary care workforce inequalities to accompany the national-level analysis done in the paper. Clear local-level inequalities in GP distribution can be seen within West, North and East Cumbria, Humber, Coast and Vale, and Coventry and Warwickshire STP (Sustainability and Transformation Plan) areas, among others.

Workforce shortages, especially in primary care, have been a problem for health care systems for some time now, and the gap between the growing demand for services and sufficient staff has been widening. Although the number of consultations in general practice has been increasing, staff numbers have not kept up with demand. The number of GPs relative to the size of population has been decreasing since 2009, and the GP workforce is ageing. Doctors are increasingly working part-time, which suggests that shortages will grow steadily worse.

In 2015, then-Secretary of State for Health Jeremy Hunt promised an additional 5,000 GPs for the NHS by 2020, but this was not achieved. Instead, it is predicted that there will be a shortage of 7,000 GPs by 2024.

Dr John Ford from the Department of Public Health and Primary Care at the University of Cambridge, the study’s senior author, said: “People who live in disadvantaged regions of England are not only more likely to have long-term health problems, but are likely to find it even more difficult to see a GP and experience worse care when they see a GP. This is just one aspect of how disadvantage accumulates for some people leading to poor health and early death.

“There may be some compensation due to the increasing number of other health professionals, which may partially alleviate the undersupply of GPs in more socioeconomically disadvantaged areas. But this is not a like-for-like replacement and it is unlikely to be enough.”

The researchers say there are a number of reasons that may account for why GP workforce shortages disproportionately affect practices in areas of higher deprivation. Previous studies have suggested that the primary driver of GP inequality was the opening and closing of practices in more disadvantaged areas, with practice closures increasing in recent years.

Claire Nussbaum, the study’s first author, added: “The government has made reducing health inequalities a core commitment, but this will be challenging with the increasing shortage of GPs in areas of high socioeconomic disadvantage, where health needs are greatest. The primary care staffing inequalities we observed are especially concerning, as they suggest that access to care is becoming increasingly limited where health needs are greatest.

“Addressing barriers to health care access is even more urgent in the context of COVID-19, which has widened pre-existing health and social inequities.”

The researchers say that the imbalance in recruitment of staff within primary care must be addressed by policymakers, who will need to consider why practices and networks in disadvantaged areas are relatively under-staffed, and how this can be reversed. Potential options include increased recruitment to medical school from disadvantaged areas, incentivisation of direct patient care posts in under-staffed areas, enhanced training offers for these roles, and offering practices and networks in under-staffed areas additional recruitment support.

Expanded use of additional roles under the Additional Roles Reimbursement Scheme, designed to provide financial reimbursement for Primary Care Networks to build workforce capacity, may partially alleviate GP workload in overstretched practices, but the report’s authors argue that there is a risk that additional workforce will gravitate to more affluent areas, further perpetuating inequity in primary care staffing.

Dr James Matheson, a GP at Hill Top Surgery in Oldham, said: “People living in socioeconomically disadvantaged areas shoulder a much higher burden of physical and mental health problems but have less access to the GPs who could support them towards better health. For the primary care teams looking after them this means a greater workload with fewer resources – a burnout risk which can further exacerbate the problem.

“General Practice in disadvantaged areas is challenging but also enjoyable and professionally rewarding but now, more than ever, we need to see a more equitable distribution of workforce and resources to ensure it is sustainable.”

Reference
Nussbaum, C et al. Inequalities in the distribution of the general practice workforce in England. BJGP Open; 18 Aug 2021; DOI: 10.3399/BJGPO.2021.0066


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. Images, including our videos, are Copyright ©University of Cambridge and licensors/contributors as identified.  All rights reserved. We make our image and video content available in a number of ways – as here, on our main website under its Terms and conditions, and on a range of channels including social media that permit your use and sharing of our content under their respective Terms.

Mathematical Model Predicts Best Way To Build Muscle


 

Researchers have developed a mathematical model that can predict the optimum exercise regime for building muscle.

 

Surprisingly, not very much is known about why or how exercise builds muscles: there’s a lot of anecdotal knowledge and acquired wisdom, but very little in the way of hard or proven data

Eugene Terentjev

The researchers, from the University of Cambridge, used methods of theoretical biophysics to construct the model, which can tell how much a specific amount of exertion will cause a muscle to grow and how long it will take. The model could form the basis of a software product, where users could optimise their exercise regimes by entering a few details of their individual physiology.

The model is based on earlier work by the same team, which found that a component of muscle called titin is responsible for generating the chemical signals which affect muscle growth.

The results, reported in the Biophysical Journal, suggest that there is an optimal weight at which to do resistance training for each person and each muscle growth target. Muscles can only be near their maximal load for a very short time, and it is the load integrated over time which activates the cell signalling pathway that leads to synthesis of new muscle proteins. But below a certain value, the load is insufficient to cause much signalling, and exercise time would have to increase exponentially to compensate. The value of this critical load is likely to depend on the particular physiology of the individual.
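
A toy numerical sketch can make this logic concrete. Assuming, purely for illustration, a Bell-type rate law in which the force sensor opens exponentially faster under higher load – this is not the paper’s fitted model, and all constants are invented – the exercise time needed to accumulate a fixed amount of growth signal explodes as the load drops:

```python
# A toy model of the argument above, NOT the paper's fitted model: assume a
# Bell-type law in which the titin kinase opening rate grows exponentially
# with load, and that the growth signal is this rate integrated over time.
# All constants are illustrative assumptions.
import numpy as np

def opening_rate(load_fraction, f_scale=0.12, base_rate=1e-4):
    """Per-second opening rate of the force sensor at a fraction of max load."""
    return base_rate * np.exp(load_fraction / f_scale)

def time_needed(load_fraction, target_signal=1.0):
    """Exercise time (seconds) to accumulate a fixed amount of growth signal."""
    return target_signal / opening_rate(load_fraction)

for load in (0.3, 0.5, 0.7):
    print(f"load {load:.0%}: ~{time_needed(load):,.0f} s under tension")
# The required time collapses as load rises: at low loads the signal is so
# weak that exercise time must grow enormously to compensate.
```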

We all know that exercise builds muscle. Or do we? “Surprisingly, not very much is known about why or how exercise builds muscles: there’s a lot of anecdotal knowledge and acquired wisdom, but very little in the way of hard or proven data,” said Professor Eugene Terentjev from Cambridge’s Cavendish Laboratory, one of the paper’s authors.

During exercise, a higher load, more repetitions or a greater frequency all lead to a greater increase in muscle size. However, even when looking at the whole muscle, why and by how much this happens isn’t known. The answers to both questions get even trickier as the focus goes down to a single muscle or its individual fibres.

Muscles are made up of individual filaments, which are only 2 micrometres long and less than a micrometre across, smaller than the size of the muscle cell. “Because of this, part of the explanation for muscle growth must be at the molecular scale,” said co-author Neil Ibata. “The interactions between the main structural molecules in muscle were only pieced together around 50 years ago. How the smaller, accessory proteins fit into the picture is still not fully clear.”

This is because the data is very difficult to obtain: people differ greatly in their physiology and behaviour, making it almost impossible to conduct a controlled experiment on muscle size changes in a real person. “You can extract muscle cells and look at those individually, but that then ignores other problems like oxygen and glucose levels during exercise,” said Terentjev. “It’s very hard to look at it all together.”

Terentjev and his colleagues started looking at the mechanisms of mechanosensing – the ability of cells to sense mechanical cues in their environment – several years ago. The research was noticed by the English Institute of Sport, who were interested in whether it might relate to their observations in muscle rehabilitation. Together, they found that muscle hypertrophy and atrophy were directly linked to the Cambridge work.

In 2018, the Cambridge researchers started a project on how the proteins in muscle filaments change under force. They found that the main muscle constituents, actin and myosin, lack binding sites for signalling molecules, so it had to be the third-most abundant muscle component – titin – that was responsible for signalling the changes in applied force.

Whenever part of a molecule is under tension for a sufficiently long time, it toggles into a different state, exposing a previously hidden region. If this region can then bind to a small molecule involved in cell signalling, it activates that molecule, generating a chemical signal chain. Titin is a giant protein, a large part of which is extended when a muscle is stretched, but a small part of the molecule is also under tension during muscle contraction. This part of titin contains the so-called titin kinase domain, which is the one that generates the chemical signal that affects muscle growth.

The molecule will be more likely to open if it is under more force, or when kept under the same force for longer. Both conditions will increase the number of activated signalling molecules. These molecules then induce the synthesis of more messenger RNA, leading to production of new muscle proteins, and the cross-section of the muscle cell increases.

This realisation led to the current work, started by Ibata, himself a keen athlete. “I was excited to gain a better understanding of both the why and how of muscle growth,” he said. “So much time and resources could be saved in avoiding low-productivity exercise regimes, and maximising athletes’ potential with regular higher value sessions, given a specific volume that the athlete is capable of achieving.”

Terentjev and Ibata set out to construct a mathematical model that could give quantitative predictions on muscle growth. They started with a simple model that kept track of titin molecules opening under force and starting the signalling cascade. They used microscopy data to determine the force-dependent probability that a titin kinase unit would open or close under force and activate a signalling molecule.

They then made the model more complex by including additional information, such as metabolic energy exchange, as well as repetition length and recovery. The model was validated using past long-term studies on muscle hypertrophy.

“While there is experimental data showing similar muscle growth with loads as little as 30% of maximum load, our model suggests that loads of 70% are a more efficient method of stimulating growth,” said Terentjev, who is a Fellow of Queens’ College. “Below that, the opening rate of titin kinase drops precipitously and precludes mechanosensitive signalling from taking place. Above that, rapid exhaustion prevents a good outcome, which our model has quantitatively predicted.”

“One of the challenges in preparing elite athletes is the common requirement for maximising adaptations while balancing associated trade-offs like energy costs,” said Fionn MacPartlin, Senior Strength & Conditioning Coach at the English Institute of Sport. “This work gives us more insight into the potential mechanisms of how muscles sense and respond to load, which can help us more specifically design interventions to meet these goals.”

The model also addresses the problem of muscle atrophy, which occurs during long periods of bed rest or for astronauts in microgravity, showing both how long a muscle can afford to remain inactive before it starts to deteriorate, and what the optimal recovery regime could be.

Eventually, the researchers hope to produce a user-friendly software-based application that could give individualised exercise regimes for specific goals. The researchers also hope to improve their model by extending their analysis with detailed data for both men and women, as many exercise studies are heavily biased towards male athletes.

Reference:
Neil Ibata and Eugene M. Terentjev. ‘Why exercise builds muscles: Titin mechanosensing controls skeletal muscle growth under load.’ Biophysical Journal (2021). DOI: 10.1016/j.bpj.2021.07.023



Minor Volcanic Eruptions Could ‘Cascade’ Into Global Catastrophe, Experts Warn


 

Researchers call for a shift in focus away from risks of “super-volcanic” eruptions and towards likelier scenarios of smaller eruptions in key global “pinch points” creating devastating domino effects.

 

We need to move away from thinking in terms of colossal eruptions destroying the world, as portrayed in Hollywood films

Lara Mani

Currently, much of the thinking around risks posed by volcanoes follows a simple equation: the bigger the eruption, the worse it will be for society and human welfare.

However, a team of experts now argues that too much focus is on the risk of massive yet rare volcanic explosions, while far too little attention is paid to the potential domino effects of moderate eruptions in key parts of the planet.

Researchers led by the University of Cambridge’s Centre for the Study of Existential Risk (CSER) have identified seven “pinch points” where clusters of relatively small but active volcanoes sit alongside vital infrastructure that, if paralysed, could have catastrophic global consequences.

These regions include volcano groups in Taiwan, North Africa, the North Atlantic, and the northwestern United States. The report is published today in the journal Nature Communications.

“Even a minor eruption in one of the areas we identify could erupt enough ash or generate large enough tremors to disrupt networks that are central to global supply chains and financial systems,” said Dr Lara Mani from CSER, lead author of the latest report.

“At the moment, calculations are too skewed towards giant explosions or nightmare scenarios, when the more likely risks come from moderate events that disable major international communications, trade networks or transport hubs. This is true of earthquakes and extreme weather as well as volcanic eruption.”

Mani and colleagues say that smaller eruptions ranking up to 6 on the “volcanic explosivity index” – rather than the 7s and 8s that tend to occupy catastrophist thinking – could easily produce ash clouds, mudflows and landslides that scupper undersea cables, leading to financial market shutdowns, or devastate crop yields, causing food shortages that result in political turmoil.

As an example from recent history, the team point to events of 2010 in Iceland, where a magnitude 4 eruption from the Eyjafjallajökull volcano, close to the major “pinch point” of mainland Europe, saw plumes of ash carried on northwesterly winds close European airspace at a cost of US$5 billion to the global economy.

Yet when Mount Pinatubo in the Philippines erupted in 1991 – a magnitude 6 eruption some 100 times greater in scale than the Icelandic event – its distance from vital infrastructure meant that overall economic damage was less than a fifth of that caused by Eyjafjallajökull. (Pinatubo would have had a global economic impact of around US$740 million had it occurred in 2021.)
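
The “100 times greater” comparison follows from the volcanic explosivity index being roughly logarithmic: each step up the scale corresponds to approximately a tenfold increase in erupted volume, so a magnitude 6 event is about 100 times the scale of a magnitude 4 one.

```python
# The VEI is roughly logarithmic in erupted volume: each step ~10x more ejecta.
def vei_volume_ratio(vei_a: int, vei_b: int) -> float:
    """Approximate ratio of erupted volumes between two VEI magnitudes."""
    return 10.0 ** (vei_a - vei_b)

print(vei_volume_ratio(6, 4))  # 100.0 – Pinatubo (VEI 6) vs Eyjafjallajokull (VEI 4)
```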

The seven “pinch point” areas identified by the experts – within which relatively small eruptions could inflict maximum global mayhem – include the volcanic group on the northern tip of Taiwan, home to one of the largest producers of electronic chips. If this area – along with the Port of Taipei – were indefinitely incapacitated, the global tech industry could grind to a halt.

Another pinch point is the Mediterranean, where legends of the classical world such as Vesuvius and Santorini could induce tsunamis that smash submerged cable networks and seal off the Suez Canal. “We saw what a six-day closure to the Suez Canal did earlier this year, when a single stuck container ship cost up to ten billion dollars a week in global trade,” said Mani.

Eruptions in the US state of Washington in the Pacific Northwest could trigger mudflows and ash clouds that blanket Seattle, shutting down airports and seaports. Scenario modelling for a magnitude 6 eruption from Mount Rainier predicts potential economic losses of more than US$7 trillion over the ensuing five years.

The highly active volcanic centres along the Indonesian archipelago – from Sumatra to Central Java – also line the Strait of Malacca: one of the busiest shipping passages in the world, with 40% of global trade traversing the narrow route each year.

The Luzon Strait in the South China Sea, another key shipping route, is the crux of all the major submerged cabling that connects China, Hong Kong, Taiwan, Japan and South Korea. It is also encircled by the Luzon Volcanic Arc.

The researchers also identify the volcanic region straddling the Chinese-North-Korean border, from which plumes of ash would disrupt the busiest air routes in the east, and point out that a reawakening of Icelandic volcanoes would do the same in the west.

“It’s time to change how we view extreme volcanic risk,” added Mani. “We need to move away from thinking in terms of colossal eruptions destroying the world, as portrayed in Hollywood films. The more probable scenarios involve lower-magnitude eruptions interacting with our societal vulnerabilities and cascading us towards catastrophe.”



Heads Reveal How ‘Overwhelming’ Government Guidance Held Schools Back As COVID Hit


 

In a new study, headteachers and school leaders describe how an ‘avalanche’ of confused and shifting Government guidance severely impeded schools during the critical first months of COVID lockdown.

 

It was uncanny how often the term ‘avalanche’ was used to describe the ridiculous amount of information they were getting

Peter Fotheringham

The research compiles data gathered from almost 300 heads and other school leaders in June 2020, as schools were beginning to reopen after the first wave of closures. It documents leadership teams’ struggles with overwhelming and disorganised information dumps by Government and the Department for Education (DfE), which were often issued with barely any notice and then continually updated.

The researchers, from the University of Cambridge and University College London, calculate that between 18 March and 18 June 2020, DfE released 201 policy updates for schools. This included 12 cases in which five or more documents were published in a single day for immediate interpretation and implementation.

Asked about the main challenges they faced, heads repeatedly cited ‘changing updates’, ‘clarity’ and ‘time’. 77% of executive heads and 71% of headteachers complained about “too many inputs and too much information”. In follow-up interviews, participants referred to being “inundated” with Government updates, which often contradicted earlier guidance.

Peter Fotheringham, a doctoral researcher at the University’s Faculty of Education and the study’s lead author, said: “We expected the biggest challenge for school leaders during lockdown would be student welfare. In fact, time and again, the message we got was: ‘I don’t know what’s going to happen tomorrow, nothing is being shared in advance, and it’s overwhelming.’”

“It was uncanny how often the term ‘avalanche’ was used to describe the ridiculous amount of information they were getting. Policy measures were also typically announced to the public before official guidance even arrived, so parents were on the phone before heads even had a chance to read it. We think that with some simple fixes, a lot of this could be avoided in the future.”

The study invited a random sample of heads and other school leaders in England to complete a simple, anonymous questionnaire about what information had informed their schools’ responses to the pandemic, and any associated challenges and opportunities. 298 leaders responded, 29 of whom were later randomly selected for follow-up interviews.

Asked to rate the importance of different information sources on a scale of one to five, school leaders perceived guidance from the DfE (average score 4.1) and Government (4.0) as most important – ahead of sources such as Multi-Academy Trusts (MATs), unions, or the media.

Many, however, expressed deep frustration with the lack of notice that preceded new Government guidance, which they often heard about first through televised coronavirus briefings or other public announcements. “Society at large is being given information at the same time as schools,” one head told the researchers. “There is no time to put our thoughts in place before parents start calling.”

Follow-up guidance, either from DfE, Local Education Authorities, or MATs, tended to lag behind. The study finds this meant heads had to interpret key policies – such as those concerning safety measures, social distancing, in-person tuition for the children of key workers, or schools reopening – before further information arrived which sometimes contradicted their judgements.

One survey response read: “It is quite clear that cabinet does not communicate with the DfE before making announcements, leaving everyone scrabbling to develop policies in the dark, while parents and students look to the College for immediate guidance.”

The sheer volume of information being released also represented a major challenge. During the three-month period concerned, DfE published 74 unique guidance documents, each of which was updated three times on average. The net result was that school leaders received an average of three policy updates per day, for 90 days, including at weekends.

“A critical problem was that there was no way of telling what had changed from one update to the next,” Fotheringham said. “Leadership teams literally had to print off different versions and go through them with a highlighter, usually in hastily-organised powwows at 7am.”

“These things are very, very time-consuming to read, but have highly technical consequences. Even a small change to distancing rules, for example, affects how you manage classrooms, corridors and play areas. The release process made the translation of such policies into action incredibly difficult.”

The study concludes that introducing simple measures, such as signalling in-line changes to policy updates, ‘would have a high impact’ on school leaders’ ability to implement policy during any future disruption. Fotheringham added that “numerous mechanisms” were available to DfE to sharpen its communications with heads – not least a direct-line email system to school leaders, which could have been used to give them advance warning about new guidance.

The findings also underline the value of schools’ wider networks within their communities and of the professional connections of school leaders themselves. Heads repeatedly described, in particular, the benefits they experienced from having opportunities to collaborate and share ideas with other school leaders as they tried to steer their schools through the crisis. Investing in further opportunities to do this beyond the ‘traditional’ structures offered by local authorities or MATs would, the authors suggest, prove beneficial.

The study warns that the challenges faced by school leaders in the spring of 2020 appear to echo those encountered both internationally and in the UK during previous school closures – for example, amid the 2009 H1N1 swine flu pandemic, when 74 UK schools had to close.

“We frequently describe COVID as unprecedented, but school closures are a common public health measure,” Fotheringham said. “Previous cases have provided plentiful evidence that Government communications with schools can be a problem. The findings of this study would suggest we haven’t yet learned those lessons.”

The study is published in the British Educational Research Journal.



Lab-Grown Beating Heart Cells Identify Potential Drug To Prevent COVID-19-Related Heart Damage

 

Cambridge scientists have grown beating heart cells in the lab and shown how they are vulnerable to SARS-CoV-2 infection. In a study published in Communications Biology, they used this system to show that an experimental peptide drug called DX600 can prevent the virus entering the heart cells.

 

Using stem cells, we’ve managed to create a model which, in many ways, behaves just like a heart does, beating in rhythm. This has allowed us to look at how the coronavirus infects cells and, importantly, helps us screen possible drugs that might prevent damage to the heart

Sanjay Sinha

The heart is one of the major organs damaged by infection with SARS-CoV-2, particularly the heart cells, or ‘cardiomyocytes’, which contract to circulate blood. It is also thought that damage to heart cells may contribute to the symptoms of long COVID.

Patients with underlying heart problems are more than four times as likely to die from COVID-19, the disease caused by SARS-CoV-2 infection. The case fatality rate in patients with COVID-19 rises from 2.3% to 10.5% in these individuals.

To gain entry into our cells, SARS-CoV-2 hijacks a protein on the surface of the cells, a receptor known as ACE2.  Spike proteins on the surface of SARS-CoV-2 – which give it its characteristic ‘corona’-like appearance – bind to ACE2. Both the spike protein and ACE2 are then cleaved, allowing genetic material from the virus to enter the host cell. The virus manipulates the host cell’s machinery to allow itself to replicate and spread.

A team of scientists at the University of Cambridge has used human embryonic stem cells to grow clusters of heart cells in the lab and shown that these cells mimic the behaviour of the cells in the body, beating as if to pump blood. Crucially, these model heart cells also contained the key components necessary for SARS-CoV-2 infection – in particular, the ACE2 receptor.

Working in special biosafety laboratories and using a safer, modified synthetic (‘pseudotyped’) virus decorated with the SARS-CoV-2 spike protein, the team mimicked how the virus infects the heart cells. They then used this model to screen for potential drugs to block infection.

Dr Sanjay Sinha from the Wellcome-MRC Cambridge Stem Cell Institute said: “Using stem cells, we’ve managed to create a model which, in many ways, behaves just like a heart does, beating in rhythm. This has allowed us to look at how the coronavirus infects cells and, importantly, helps us screen possible drugs that might prevent damage to the heart.”

The team showed that some drugs that targeted the proteins involved in SARS-CoV-2 viral entry significantly reduced levels of infection. These included an ACE2 antibody that has been shown previously to neutralise pseudotyped SARS-CoV-2 virus, and DX600, an experimental drug.

DX600 is an ACE2 peptide antagonist – that is, a molecule that specifically targets ACE2 and inhibits the activity of peptides that play a role in allowing the virus to break into the cell.

DX600 was around seven times more effective at preventing infection than the antibody, though the researchers say this may be because it was used at higher concentrations. The drug did not affect the number of heart cells, suggesting that it would be unlikely to be toxic.

Professor Anthony Davenport from the Department of Medicine and a fellow at St Catharine’s College, Cambridge said: “The spike protein is like a key that fits into the ‘lock’ on the surface of the cells – the ACE2 receptor – allowing it entry. DX600 acts like gum, jamming the lock’s mechanism, making it much more difficult for the key to turn and unlock the cell door.

“We need to do further research on this drug, but it could provide us with a new treatment to help reduce harm to the heart in patients recently infected with the virus, particularly those who already have underlying heart conditions or who have not been vaccinated. We believe it may also help reduce the symptoms of long COVID.”

The research was largely supported by Wellcome, Addenbrooke’s Charitable Trust, Rosetrees Trust Charity and British Heart Foundation.

Reference
Williams, TL et al. Human embryonic stem cell-derived cardiomyocyte platform screens inhibitors of SARS-CoV-2 infection. Communications Biology; 29 Jul 2021; DOI: 10.1038/s42003-021-02453-y



Artificial Pancreas Trialled For Outpatients With Type 2 Diabetes For First Time

Patient using the artificial pancreas
source: www.cam.ac.uk

 

An artificial pancreas could soon help those people living with type 2 diabetes who also require kidney dialysis. Tests led by the University of Cambridge and Inselspital, University Hospital of Bern, Switzerland, show that the device can help patients safely and effectively manage their blood sugar levels and reduce the risk of low blood sugar levels.

 

Patients living with type 2 diabetes and kidney failure are a particularly vulnerable group and managing their condition can be a challenge. There’s a real unmet need for new approaches to help them manage their condition safely and effectively

Charlotte Boughton

Diabetes is the most common cause of kidney failure, accounting for just under a third (30%) of cases. As the number of people living with type 2 diabetes increases, so too does the number of people requiring dialysis or a kidney transplant. Kidney failure increases the risk of hypoglycaemia and hyperglycaemia – abnormally low or high levels of blood sugar respectively – which in turn can cause complications from dizziness to falls and even to coma.

Managing diabetes in patients with kidney failure is challenging for both patients and healthcare professionals. Many aspects of their care are poorly understood, including targets for blood sugar levels and treatments. Most oral diabetes medications are not recommended for these patients, so insulin injections are the most commonly used diabetes therapy – though optimal insulin dosing regimens are difficult to establish.

A team at the University of Cambridge and Cambridge University Hospitals NHS Foundation Trust has previously developed an artificial pancreas with the aim of replacing insulin injections for patients living with type 1 diabetes. In research published today in Nature Medicine, the team – working with researchers at Bern University Hospital and University of Bern, Switzerland – has shown that the device can be used to support patients living with both type 2 diabetes and kidney failure.

Unlike the artificial pancreas used for type 1 diabetes, this version is a fully closed-loop system: whereas patients with type 1 diabetes need to tell their device that they are about to eat so that it can adjust insulin delivery, with this new version they can leave the device to function entirely automatically.

Dr Charlotte Boughton from the Wellcome-MRC Institute of Metabolic Science at the University of Cambridge, who led the study, said: “Patients living with type 2 diabetes and kidney failure are a particularly vulnerable group and managing their condition – trying to prevent potentially dangerous highs or lows of blood sugar levels – can be a challenge. There’s a real unmet need for new approaches to help them manage their condition safely and effectively.”

The artificial pancreas is a small, portable medical device designed to carry out the function of a healthy pancreas in controlling blood glucose levels, using digital technology to automate insulin delivery. The system is worn externally on the body, and is made up of three functional components: a glucose sensor, a computer algorithm to calculate the insulin dose, and an insulin pump. Software in the user’s smartphone sends a signal to an insulin pump to adjust the level of insulin the patient receives. The glucose sensor measures the patient’s blood sugar levels and sends these back to the smartphone to enable it to make further adjustments.
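To make the loop concrete, here is a minimal sketch of the sensor-algorithm-pump cycle in Python. The proportional dosing rule, the adaptation step and all parameter values are illustrative assumptions for exposition only, not the adaptive algorithm used in the trial:

```python
import random

TARGET_RANGE = (5.6, 10.0)  # mmol/L, the trial's target blood sugar range

def read_glucose_sensor() -> float:
    """Stand-in for the glucose sensor; a real sensor streams readings."""
    return random.uniform(4.0, 14.0)

def deliver_via_pump(dose_units: float) -> None:
    """Stand-in for the insulin pump interface."""
    print(f"pump: delivering {dose_units:.2f} units")

def compute_dose(glucose: float, gain: float) -> float:
    """Toy proportional rule: dose in proportion to how far the reading
    sits above the top of the target range (zero dose when in range)."""
    return max(0.0, gain * (glucose - TARGET_RANGE[1]))

gain = 0.5  # illustrative controller gain
for _ in range(5):  # fully closed loop: no meal announcements needed
    glucose = read_glucose_sensor()                # sensor -> smartphone
    deliver_via_pump(compute_dose(glucose, gain))  # smartphone -> pump
    gain *= 1.02  # crude stand-in for the adaptive tuning described below
```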

The team recruited 26 patients requiring dialysis between October 2019 and November 2020. Thirteen participants were randomised to receive the artificial pancreas first and 13 to receive standard insulin therapy first. The researchers compared how long patients spent in the target blood sugar range (5.6 to 10.0mmol/L) over a 20-day period as outpatients.

Patients using the artificial pancreas spent on average 53% of their time in the target range, compared to 38% when they used the control treatment. This equated to around 3.5 additional hours every day spent in the target range compared with the control therapy.
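The ‘around 3.5 hours’ figure follows directly from those percentages, as a quick check shows:

```python
# 53% vs 38% time-in-range, expressed as extra hours per 24-hour day.
extra_hours = (0.53 - 0.38) * 24
print(f"{extra_hours:.1f} additional hours per day")  # 3.6, i.e. ~3.5 hours
```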

Mean blood sugar levels were lower with the artificial pancreas (10.1 vs. 11.6 mmol/L). The artificial pancreas reduced the amount of time patients spent with potentially dangerously low blood sugar levels, or ‘hypos’.

The efficacy of the artificial pancreas improved considerably over the study period as the algorithm adapted, and the time spent in the target blood sugar range increased from 36% on day one to over 60% by the twentieth day. This finding highlights the importance of using an adaptive algorithm, which can adjust in response to an individual’s changing insulin requirements over time.

When asked about their experiences of using the artificial pancreas, everyone who responded said they would recommend it to others. Nine out of ten (92%) reported that they spent less time managing their diabetes with the artificial pancreas than during the control period, and similar numbers (87%) were less worried about their blood sugar levels when using it.

Other benefits of the artificial pancreas reported by study participants included less need for finger-prick blood sugar checks, less time required to manage their diabetes resulting in more personal time and freedom, and improved peace of mind and reassurance. Downsides included discomfort wearing the insulin pump and carrying the smartphone.

Senior author Professor Roman Hovorka, also from the Wellcome-MRC Institute of Metabolic Science, said: “Not only did the artificial pancreas increase the amount of time patients spent within the target range for the blood sugar levels, it also gave the users peace of mind. They were able to spend less time having to focus on managing their condition and worrying about their blood sugar levels, and more time getting on with their lives.”

Dr Boughton added: “Now that we’ve shown the artificial pancreas works in one of the more difficult-to-treat groups of patients, we believe it could prove useful in the wider population of people living with type 2 diabetes.”

The team is currently trialling the artificial pancreas for outpatient use in people living with type 2 diabetes who do not need dialysis and exploring the system in complex medical situations such as perioperative care.

Dr Lia Bally, who co-led the study in Bern, said: “The artificial pancreas has the potential to become a key feature of integrated personalised care for people with complex medical needs.”

The research was supported by the NIHR Cambridge Biomedical Research Centre, the Novo Nordisk UK Research Foundation, the Swiss Society for Endocrinology and Diabetes, the Swiss Diabetes Foundation, and the Swiss Kidney Foundation.

Reference
Boughton, CK et al.  Fully automated closed-loop glucose control compared with standard insulin therapy in adults with type 2 diabetes requiring dialysis: an open-label, randomised crossover trial. Nat Med; 4 Aug 2021; DOI: 10.1038/s41591-021-01453-z



UK and Ireland Among Five Nations Most Likely to Survive a Collapse of Global Civilisation, Study Suggests

Researchers say a worldwide breakdown could happen “within a few decades” and have identified five countries most likely to withstand future threats.

Aerial view of rural Ireland
source: https://news.sky.com/

The UK and Ireland were among five countries identified by researchers as best placed to maintain civilisation within their own borders

The UK and Ireland are among five nations most likely to survive a collapse of global civilisation, researchers have said.

A study has suggested a combination of ecological destruction, limited resources and population growth could trigger a worldwide breakdown “within a few decades”, with climate change making things worse.

A “very likely” collapse would be characterised by the disintegration of supply chains, international agreements and global financial structures, according to researchers at the Global Sustainability Institute at Anglia Ruskin University.

Aerial view of New Zealand
Islands like New Zealand are considered to be most resilient to future threats

They said problems could spread quickly because of how connected and economically dependent countries are on one another.

Five countries were identified as best placed to maintain civilisation within their own borders, with New Zealand topping the list and followed by Iceland, the United Kingdom, Ireland and Australia.

All of them are islands or island continents, which have fewer temperature extremes and varied rainfall due to their proximity to oceans.

Wind turbines at Whitelee Windfarm in East Renfrewshire
Researchers said the UK could increase its use of wind turbines to secure its future

Researchers said this makes them most likely to have relatively stable conditions in the future, despite the effects of climate change – which is expected to hit subtropics and tropics the hardest.


New Zealand’s ability to produce geothermal and hydroelectric energy, its abundant agricultural land and its low population would allow it to survive relatively unscathed.

Although the UK has generally fertile soils and varied agricultural output, it does not have as much agricultural land available because of its population density, raising questions about future self-sufficiency.

Britain’s reliance on fossil fuels and nuclear energy was considered to be a risk as power sources could be “rendered at least partly inoperable” if global supply chains collapse.

Researchers said this could be mitigated by the nation’s manufacturing capabilities.

Meeting the large population’s energy demands through renewables alone would require very extensive infrastructure, they said, but the UK could increase its resilience by harnessing more energy from wind and water bodies like lagoons or barrages in the Severn Estuary.

Professor Aled Jones, director of the Global Sustainability Institute at Anglia Ruskin University, said “significant changes are possible in the coming years and decades”.

He said: “The impact of climate change, including increased frequency and intensity of drought and flooding, extreme temperatures, and greater population movement, could dictate the severity of these changes.”

Researchers identified pandemics as another risk to societal stability, citing the United Nations’ warning that future pandemics could be even more severe than COVID-19.

Twenty countries were analysed in the report.

Cambridge Researcher Named as Turing AI World-Leading Researcher Fellow

Professor Zoubin Ghahramani
source: www.cam.ac.uk

 

Five internationally-recognised researchers, including Cambridge’s Professor Zoubin Ghahramani, have been appointed as the first Turing AI World-Leading Researcher Fellows to conduct work on artificial intelligence’s (AI) biggest challenges.

 

The other new Fellows are Professor Samuel Kaski from the University of Manchester, Professor Mirella Lapata from the University of Edinburgh, Professor Philip Torr from the University of Oxford, and Professor Michael Wooldridge from the University of Oxford.

The fellowships, named after AI pioneer Alan Turing, are part of the UK’s commitment to further strengthen its position as a global leader in the field.

Retaining and attracting some of the best research talent in a highly competitive international environment will increase the UK’s competitive advantage and capability in AI.

The fellows’ research will have a transformative effect on the international AI research and innovation landscape by tackling some of the fundamental challenges in the field.

It could also deliver major societal impact in areas including decision-making in personalised medicine, synthetic biology and drug design, financial modelling, and autonomous vehicles.

Professor Ghahramani, from Cambridge’s Department of Engineering, is Senior Director and Distinguished Researcher at Google, former Chief Scientist at Uber and a Fellow of the Royal Society.

In his fellowship, which he will hold jointly while continuing to work at Google, he aims to develop the new algorithms and applications needed to address limitations faced by the AI systems that underpin technologies such as speech recognition and autonomous vehicles. This includes ensuring they can better adapt to new data and apply data-driven machine learning approaches to simulators to understand complex systems.

“The Turing AI Fellowships provide a fantastic opportunity to grow the UK’s research talent in AI, and to build stronger relationships between industry and academia,” said Ghahramani. “Most modern AI systems are based on machine learning technology that learns from patterns in data. This research programme aims to improve such systems by making them more robust and reliable, so that they can better respond to changing circumstances, and better incorporate prior knowledge, symbolic reasoning and data.”

The fellows are supported with an £18 million investment by UK Research and Innovation (UKRI).

In addition to this, 39 different collaborators including IBM, AstraZeneca and Facebook are making contributions worth £15.7 million to the fellows’ research programmes.

The fellowships are being delivered by UKRI’s Engineering and Physical Sciences Research Council.

“The Turing AI World-Leading Researcher Fellowships recognise internationally-leading researchers in AI, and provide the support needed to tackle some of the biggest challenges and opportunities in AI research,” said EPSRC Executive Chair Professor Dame Lynn Gladden. “These fellowships enable the UK to attract top international talent to the UK as well as retaining our own world-leaders. Attracting and retaining top talent is essential to keep the UK at the leading edge of AI research and innovation.”

The Turing AI Fellowships investment is delivered in partnership by UKRI, the Office for AI, and The Alan Turing Institute, the national institute for data science and AI.



Astronomers Show How Planets Form In Binary Systems Without Getting Crushed

Artist’s impression of the planet around Alpha Centauri B
source: www.cam.ac.uk

 

Astronomers have developed the most realistic model to date of planet formation in binary star systems.

 

Planet formation in binary systems is more complicated, because the companion star acts like a giant eggbeater, dynamically exciting the protoplanetary disc

Roman Rafikov

The researchers, from the University of Cambridge and the Max Planck Institute for Extra-terrestrial Physics, have shown how exoplanets in binary star systems – such as the ‘Tatooine’ planets spotted by NASA’s Kepler Space Telescope – came into being without being destroyed in their chaotic birth environment.

They studied a type of binary system where the smaller companion star orbits the larger parent star approximately once every 100 years – our nearest neighbour, Alpha Centauri, is an example of such a system.

“A system like this would be the equivalent of a second Sun where Uranus is, which would have made our own solar system look very different,” said co-author Dr Roman Rafikov from Cambridge’s Department of Applied Mathematics and Theoretical Physics.

Rafikov and his co-author Dr Kedron Silsbee from the Max Planck Institute for Extra-terrestrial Physics found that for planets to form in these systems, the planetesimals – planetary building blocks which orbit around a young star – need to start off at least 10 kilometres in diameter, and the disc of dust, ice and gas surrounding the star, within which the planets form, needs to be relatively circular.

The research, which is published in Astronomy and Astrophysics, brings the study of planet formation in binaries to a new level of realism and explains how such planets, a number of which have been detected, could have formed.

Planet formation is believed to begin in a protoplanetary disc – made primarily of hydrogen, helium, and tiny particles of ices and dust – orbiting a young star. According to the current leading theory on how planets form, known as core accretion, the dust particles stick to each other, eventually forming larger and larger solid bodies. If the process stops early, the result can be a rocky Earth-like planet. If the planet grows bigger than Earth, then its gravity is sufficient to trap a large quantity of gas from the disc, leading to the formation of a gas giant like Jupiter.

“This theory makes sense for planetary systems formed around a single star, but planet formation in binary systems is more complicated, because the companion star acts like a giant eggbeater, dynamically exciting the protoplanetary disc,” said Rafikov.

“In a system with a single star, the particles in the disc are moving at low velocities, so they easily stick together when they collide, allowing them to grow,” said Silsbee. “But because of the gravitational ‘eggbeater’ effect of the companion star in a binary system, the solid particles there collide with each other at much higher velocity. So, when they collide, they destroy each other.”

Many exoplanets have been spotted in binary systems, so the question is how they got there. Some astronomers have even suggested that these planets might have been floating in interstellar space before being sucked in by the gravity of a binary.

Rafikov and Silsbee carried out a series of simulations to help solve this mystery. They developed a detailed mathematical model of planetary growth in a binary that uses realistic physical inputs and accounts for processes that are often overlooked, such as the gravitational effect of the gas disc on the motion of planetesimals within it.

“The disc is known to directly affect planetesimals through gas drag, acting like a kind of wind,” said Silsbee. “A few years ago, we realised that in addition to the gas drag, the gravity of the disc itself dramatically alters the dynamics of the planetesimals, in some cases allowing planets to form even despite the gravitational perturbations due to the stellar companion.”

“The model we’ve built pulls together this work, as well as other previous work, to test the planet formation theories,” said Rafikov.

Their model found that planets can form in binary systems such as Alpha Centauri, provided that the planetesimals start out at least 10 kilometres across in size, and that the protoplanetary disc itself is close to circular, without major irregularities. When these conditions are met, the planetesimals in certain parts of the disc end up moving slowly enough relative to each other that they stick together instead of destroying each other.
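One way to see why a minimum size matters (a toy calculation, not taken from the paper): a planetesimal’s escape velocity is a rough proxy for the collision speed it can absorb and still grow, and it scales linearly with diameter. Assuming a uniform density of 2,000 kg/m³:

```python
import math

G = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
DENSITY = 2000.0  # kg/m^3, an assumed rock/ice mix

def escape_velocity(diameter_km: float) -> float:
    """Escape velocity (m/s) of a uniform sphere of the given diameter."""
    radius_m = diameter_km * 500.0  # km diameter -> m radius
    mass_kg = DENSITY * (4.0 / 3.0) * math.pi * radius_m**3
    return math.sqrt(2.0 * G * mass_kg / radius_m)

for d in (1, 10, 100):
    print(f"{d:3d} km body: v_esc ~ {escape_velocity(d):.1f} m/s")
# 1 km: ~0.5 m/s; 10 km: ~5 m/s; 100 km: ~53 m/s. A body that starts
# ~10 km across can hold onto debris from noticeably faster collisions
# than a ~1 km body, despite the companion star stirring up the disc.
```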

These findings lend support to a particular mechanism of planetesimal formation, called the streaming instability, being an integral part of the planet formation process. This instability is a collective effect, involving many solid particles in the presence of gas, that is capable of concentrating pebble-to-boulder sized dust grains to produce a few large planetesimals, which would survive most collisions.

The results of this work provide important insights for theories of planet formation around both binary and single stars, as well as for the hydrodynamic simulations of protoplanetary discs in binaries. In future, the model could also be used to explain the origin of the ‘Tatooine’ planets – exoplanets orbiting both components of a binary – about a dozen of which have been identified by NASA’s Kepler Space Telescope.

 

Reference:
Kedron Silsbee and Roman R. Rafikov. ‘Planet Formation in Stellar Binaries: Global Simulations of Planetesimal Growth.’ Astronomy and Astrophysics (2021). DOI:10.1051/0004-6361/20214113



Earth’s Interior is Swallowing Up More Carbon Than Thought

Alaska’s Pavlof Volcano: NASA’s View from Space
source: www.cam.ac.uk

 

Scientists from Cambridge University and NTU Singapore have found that slow-motion collisions of tectonic plates drag more carbon into Earth’s interior than previously thought.

 

We currently have a relatively good understanding of the surface reservoirs of carbon and the fluxes between them, but know much less about Earth’s interior carbon stores, which cycle carbon over millions of years

Stefan Farsang

They found that the carbon drawn into Earth’s interior at subduction zones – where tectonic plates collide and dive into Earth’s interior – tends to stay locked away at depth, rather than resurfacing in the form of volcanic emissions.

Their findings, published in Nature Communications, suggest that only about a third of the carbon drawn down beneath volcanic chains returns to the surface via volcanic emissions, in contrast to previous theories that what goes down mostly comes back up.

One of the solutions to tackle climate change is to find ways to reduce the amount of CO2 in Earth’s atmosphere. By studying how carbon behaves in the deep Earth, which houses the majority of our planet’s carbon, scientists can better understand the entire lifecycle of carbon on Earth, and how it flows between the atmosphere, oceans and life at the surface.

The best-understood parts of the carbon cycle are at or near Earth’s surface, but deep carbon stores play a key role in maintaining the habitability of our planet by regulating atmospheric CO2 levels. “We currently have a relatively good understanding of the surface reservoirs of carbon and the fluxes between them, but know much less about Earth’s interior carbon stores, which cycle carbon over millions of years,” said lead author Stefan Farsang, who conducted the research while a PhD student at Cambridge’s Department of Earth Sciences.

There are a number of ways for carbon to be released back to the atmosphere (as CO2) but there is only one path in which it can return to the Earth’s interior: via plate subduction. Here, surface carbon, for instance in the form of seashells and micro-organisms which have locked atmospheric CO2 into their shells, is channelled into Earth’s interior. Scientists had thought that much of this carbon was then returned to the atmosphere as CO2 via emissions from volcanoes. But the new study reveals that chemical reactions taking place in rocks swallowed up at subduction zones trap carbon and send it deeper into Earth’s interior – stopping some of it coming back to Earth’s surface.

The team conducted a series of experiments at the European Synchrotron Radiation Facility (ESRF). “The ESRF has world-leading facilities and the expertise that we needed to get our results,” said co-author Simon Redfern, Dean of the College of Science at NTU Singapore. “The facility can measure very low concentrations of these metals at the high pressure and temperature conditions of interest to us.” To replicate the high pressures and temperatures of subduction zones, the team used a heated ‘diamond anvil’, in which extreme pressures are achieved by pressing two tiny diamond anvils against the sample.

The work supports growing evidence that carbonate rocks, which have the same chemical makeup as chalk, become less calcium-rich and more magnesium-rich when channelled deeper into the mantle. This chemical transformation makes carbonate less soluble – meaning it doesn’t get drawn into the fluids that supply volcanoes. Instead, the majority of the carbonate sinks deeper into the mantle where it may eventually become diamond.

“There is still a lot of research to be done in this field,” said Farsang. “In the future, we aim to refine our estimates by studying carbonate solubility across a wider temperature and pressure range, and in several fluid compositions.”

The findings are also important for understanding the role of carbonate formation in our climate system more generally. “Our results show that these minerals are very stable and can certainly lock up CO2 from the atmosphere into solid mineral forms that could result in negative emissions,” said Redfern. The team have been looking into the use of similar methods for carbon capture, which moves atmospheric CO2 into storage in rocks and the oceans.

“These results will also help us understand better ways to lock carbon into the solid Earth, out of the atmosphere. If we can accelerate this process faster than nature handles it, it could prove a route to help solve the climate crisis,” said Redfern.

 

Reference:
Farsang, S., Louvel, M., Zhao, C. et al. Deep carbon cycle constrained by carbonate solubility. Nature Communications (2021). DOI: 10.1038/s41467-021-24533-7

Adapted from a news release by the ESRF



Scientists Can Detect Brain Tumours Using a Simple Urine or Blood Plasma Test

source: www.cam.ac.uk

 

Researchers from the Cancer Research UK Cambridge Institute have developed two tests that can detect the presence of glioma, a type of brain tumour, in patient urine or blood plasma.

 

The team say that a test for detecting glioma using urine is the first of its kind in the world.

Although the research, published in EMBO Molecular Medicine, is in its early stages and only a small number of patients were analysed, the team say their results are promising.

The researchers suggest that, in the future, these tests could be used by GPs to monitor patients at high risk of brain tumours, as a more convenient alternative to the standard method of an MRI scan every three months.

When people have a brain tumour removed, the likelihood of it returning can be high, so they are monitored with an MRI scan every three months, which is followed by a biopsy.

Blood tests for detecting different cancer types are a major focus of research for teams across the world, and there are some in use in the clinic. These tests are mainly based on finding mutated DNA, shed by tumour cells when they die, known as cell-free DNA (cfDNA).

However, detecting brain tumour cfDNA in the blood has historically been difficult because of the blood-brain barrier, which separates blood from the cerebrospinal fluid (CSF) that surrounds the brain and spinal cord, preventing the passage of cells and other particles, such as cfDNA.

Researchers have previously looked at detecting cfDNA in CSF, but the spinal taps needed to obtain it can be dangerous for people with brain tumours so are not appropriate for patient monitoring.

Scientists have known that cfDNA with similar mutations to the original tumour can be found in blood and other bodily fluids such as urine in very low levels, but the challenge has been developing a test sensitive enough to detect these specific mutations.

The researchers, led by Dr Florent Mouliere, who is based at the Rosenfeld Lab of the Cancer Research UK Cambridge Institute and at the Amsterdam UMC, and Dr Richard Mair, who is based at the Cancer Research UK Cambridge Institute and the University of Cambridge, developed two approaches in parallel to overcome the challenge of detecting brain tumour cfDNA.

The first approach works for patients who have previously had glioma removed and biopsied. The team designed a tumour-guided sequencing test that was able to look for the mutations found in the tumour tissue within the cfDNA in the patient’s urine, CSF, and blood plasma.

A total of eight patients who had suspected brain tumours based on MRIs were included in this part of the study. Samples were taken at their initial brain tumour biopsies, alongside CSF, blood and urine samples.

By knowing where in the DNA strand to look, the researchers found that it was possible to find mutations even in the tiny amounts of cfDNA found in the blood plasma and urine.

The test was able to detect cfDNA in 7 out of 8 CSF samples, 10 out of 12 blood plasma samples, and 10 out of 16 urine samples.

For the second approach the researchers looked for other patterns in the cfDNA that could also indicate the presence of a tumour, without having to identify the mutations.

They analysed 35 samples from glioma patients, 27 from people with non-malignant brain disorders, and 26 from healthy people. They used whole genome sequencing, in which all the cfDNA in a sample is analysed, not just the mutated fragments.

They found that the cfDNA fragments in the blood plasma and urine samples of patients with brain tumours were different sizes from those in samples from patients without tumours. They then fed this data into a machine learning algorithm, which was able to successfully differentiate between the urine samples of people with and without glioma.
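As an illustration of that final step (a minimal sketch using synthetic data and a generic classifier; the study’s actual features and model are described in the paper):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic stand-in data: one row per patient, summarising a cfDNA
# fragment-length profile as relative abundance in 20 size bins.
n_glioma, n_control, n_bins = 35, 53, 20
glioma = rng.normal(1.0, 0.3, size=(n_glioma, n_bins))
glioma[:, :5] += 0.5  # pretend tumours enrich the shortest-fragment bins
control = rng.normal(1.0, 0.3, size=(n_control, n_bins))

X = np.vstack([glioma, control])
y = np.array([1] * n_glioma + [0] * n_control)  # 1 = glioma, 0 = no glioma

# Cross-validate a classifier that separates groups by fragment sizes.
clf = RandomForestClassifier(n_estimators=200, random_state=0)
print(f"mean CV accuracy: {cross_val_score(clf, X, y, cv=5).mean():.2f}")
```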

The researchers say that while the machine learning test is cheaper and easier, and a tissue biopsy from the tumour is not needed, it is not as sensitive and is less specific than the first tumour-guided sequencing approach.

MRIs are not invasive or expensive, but they do require a trip to the hospital, and the three-month gap between checks can be a regular source of anxiety for patients.

The researchers suggest that their tests could be used between MRI scans, and could ultimately be able to detect a returning brain tumour earlier.

The next stage of this research will see the team comparing both tests against MRI scans in a trial with patients with brain tumours who are in remission, to see whether the tests can detect returning tumours at the same time as, or earlier than, the MRI. If the tests prove they can detect brain tumours earlier than an MRI, the researchers will look at how to adapt the tests so they could be offered in the clinic, which could be within the next ten years.

“We believe the tests we’ve developed could in the future be able to detect a returning glioma earlier and improve patient outcomes,” said Mair. “Talking to my patients, I know the three-month scan becomes a focal point for worry. If we could offer a regular blood or urine test, not only will you be picking up recurrence earlier, you can also be doing something positive for the patient’s mental health.”

Michelle Mitchell, Chief Executive of Cancer Research UK, said: “While this is early research, it’s opened up the possibility that within the next decade we could be able to detect the presence of a brain tumour with a simple urine or blood test. Liquid biopsies are a huge area of research interest right now because of the opportunities they create for improved patient care and early diagnosis. It’s great to see Cancer Research UK researchers making strides in this important field.”

Sue Humphreys, from Walsall, a brain tumour patient, said: “If these tests are found to be as accurate as the standard MRI for monitoring brain tumours, it could be life-changing.

“If patients can be given a regular and simple test by their GP, it may help not only detect a returning brain tumour in its earliest stages, it can also provide the quick reassurance that nothing is going on, which is the main problem we all suffer from – the dreaded Scanxiety.

“The problem with three-monthly scans is that these procedures can get disrupted by other things going on, such as what we have seen with the Covid pandemic. As a patient, this causes worry as there is a risk that things may be missed or delayed, and early intervention is the key to any successful treatment.”

 

Reference:
Florent Mouliere et al. ‘Fragmentation patterns and personalized sequencing of cell-free DNA in urine and plasma of glioma patients.’ EMBO Molecular Medicine (2021). DOI: 10.15252/emmm.202012881

Adapted from a Cancer Research UK press release.



Blushing Plants Reveal When Fungi Are Growing In Their Roots

Cells of roots colonised by fungi turn red
source: www.cam.ac.uk

 

Scientists have created plants whose cells and tissues ‘blush’ with beetroot pigments when they are colonised by fungi that help them take up nutrients from the soil.

 

We can now follow how the relationship between the fungi and plant root develops, in real-time, from the moment they come into contact.

Sebastian Schornack

This is the first time this vital, 400-million-year-old process has been visualised in real time in the full root systems of living plants. Understanding the dynamics of plant colonisation by fungi could help to make food production more sustainable in the future.

Almost all crop plants form associations with a particular type of fungi – called arbuscular mycorrhiza fungi – in the soil, which greatly expand their root surface area. This mutually beneficial interaction boosts the plant’s ability to take up nutrients that are vital for growth.

The more nutrients plants obtain naturally, the less artificial fertilisers are needed. Understanding this natural process, as the first step towards potentially enhancing it, is an ongoing research challenge. Progress is likely to pay huge dividends for agricultural productivity.

In a study published in the journal PLOS Biology, researchers used the bright red pigments of beetroot – called betalains – to visually track soil fungi as they colonised plant roots in a living plant.

“We can now follow how the relationship between the fungi and plant root develops, in real-time, from the moment they come into contact. We previously had no idea about what happened because there was no way to visualise it in a living plant without the use of elaborate microscopy,” said Dr Sebastian Schornack, a researcher at the University of Cambridge’s Sainsbury Laboratory and joint senior author of the paper.

To achieve their results, the researchers engineered two model plant species – a legume and a tobacco plant – so that they would produce the highly visible betalain pigments when arbuscular mycorrhiza fungi were present in their roots. This involved combining the control regions of two genes activated by mycorrhizal fungi with genes that synthesise red-coloured betalain pigments.

The plants were then grown in a transparent structure so that the root system was visible, and images of the roots could be taken with a flatbed scanner without disturbing the plants.

Using their technique, the researchers could select red pigmented parts of the root system to observe the fungus more closely as it entered individual plant cells and formed elaborate tree-like structures – called arbuscules – which grow inside the plant’s roots. Arbuscules take up nutrients from the soil that would otherwise be beyond the reach of the plant.

Other methods exist to visualise this process, but these involve digging up and killing the plant and the use of chemicals or expensive microscopy. This work makes it possible for the first time to watch by eye and with simple imaging how symbiotic fungi start colonising living plant roots, and inhabit parts of the plant root system over time.

“This is an exciting new tool to visualise this, and other, important plant processes. Beetroot pigments are a distinctive colour, so they’re very easy to see. They also have the advantage of being natural plant pigments, so they are well tolerated by plants,” said Dr Sam Brockington, a researcher in the University of Cambridge’s Department of Plant Sciences, and joint senior author of the paper.

Mycorrhiza fungi are attracting growing interest in agriculture. This new technique provides the ability to ‘track and trace’ the presence of symbiotic fungi in soils from different sources and locations. The researchers say this will enable the selection of fungi that colonise plants fastest and provide the biggest benefits in agricultural scenarios.

Understanding and exploiting the dynamics of plant root system colonisation by fungi has potential to enhance future crop production in an environmentally sustainable way. If plants can take up more nutrients naturally, this will reduce the need for artificial fertilisers – saving money and reducing associated water pollution.

This research was funded by the Biotechnology and Biological Sciences Research Council, Gatsby Charitable Foundation, Royal Society, and Natural Environment Research Council.

Reference
Timoneda, A. & Yunusov, T. et al: ‘MycoRed: Betalain pigments enable in vivo real-time visualisation of arbuscular mycorrhizal colonisation.’ PLOS Biology, July 2021. DOI: 10.1371/journal.pbio.3001326



Smartphone Screens Effective Sensors For Soil or Water Contamination

 

The touchscreen technology used in billions of smartphones and tablets could also be used as a powerful sensor, without the need for any modifications.

 

Instead of interpreting a signal from your finger, what if we could get a touchscreen to read electrolytes, since these ions also interact with the electric fields?

Ronan Daly

Researchers from the University of Cambridge have demonstrated how a typical touchscreen could be used to identify common ionic contaminants in soil or drinking water by dropping liquid samples on the screen, the first time this has been achieved. The sensitivity of the touchscreen sensor is comparable to typical lab-based equipment, which would make it useful in low-resource settings.

The researchers say their proof of concept could one day be expanded for a wide range of sensing applications, including for biosensing or medical diagnostics, right from the phone in your pocket. The results are reported in the journal Sensors and Actuators B.

Touchscreen technology is ubiquitous in our everyday lives: the screen on a typical smartphone is covered in a grid of electrodes, and when a finger disrupts the local electric field of these electrodes, the phone interprets the signal.

Other teams have used the computational power of a smartphone for sensing applications, but these have relied on the camera or peripheral devices, or have required significant changes to be made to the screen.

“We wanted to know if we could interact with the technology in a different way, without having to fundamentally change the screen,” said Dr Ronan Daly from Cambridge’s Institute of Manufacturing, who co-led the research. “Instead of interpreting a signal from your finger, what if we could get a touchscreen to read electrolytes, since these ions also interact with the electric fields?”

The researchers started with computer simulations, and then validated them using stripped-down, standalone touchscreens – provided by two UK manufacturers – similar to those used in phones and tablets.

The researchers pipetted different liquids onto the screen to measure a change in capacitance and recorded the measurements from each droplet using the standard touchscreen testing software. Ions in the fluids all interact with the screen’s electric fields differently depending on the concentration of ions and their charge.

“Our simulations showed where the electric field interacts with the fluid droplet. In our experiments, we then found a linear trend for a range of electrolytes measured on the touchscreen,” said first author Sebastian Horstmann, a PhD candidate at IfM. “The sensor saturates at an anion concentration of around 500 micromolar, which can be correlated to the conductivity measured alongside. This detection window is ideal to sense ionic contamination in drinking water.”
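A sketch of how such a measurement could be turned into a calibration curve (the numbers below are invented; the study reports a linear trend below the ~500 micromolar saturation point):

```python
import numpy as np

# Invented calibration data: anion concentration (micromolar) against the
# touchscreen's change in capacitance (arbitrary units), linear regime only.
concentration_uM = np.array([0.0, 50.0, 100.0, 200.0, 300.0, 400.0, 500.0])
delta_capacitance = np.array([0.0, 0.9, 2.1, 4.0, 6.2, 7.9, 10.1])

# Least-squares fit of the linear region gives the calibration line.
slope, intercept = np.polyfit(concentration_uM, delta_capacitance, deg=1)

def estimate_concentration(signal: float) -> float:
    """Invert the calibration: screen signal -> estimated concentration.
    Only meaningful below the ~500 micromolar saturation point."""
    return (signal - intercept) / slope

print(f"{estimate_concentration(5.0):.0f} micromolar")  # for a new droplet
```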

One early application for the technology could be to detect arsenic contamination in drinking water. Arsenic is a common contaminant found in groundwater in many parts of the world, but most municipal water systems screen for it and filter it out before it reaches a household tap. However, in parts of the world without water treatment plants, arsenic contamination is a serious problem.

“In theory, you could add a drop of water to your phone before you drink it, in order to check that it’s safe,” said Daly.

At the moment, the sensitivity of phone and tablet screens is tuned for fingers, but the researchers say the sensitivity could be changed in a certain part of the screen by modifying the electrode design in order to be optimised for sensing.

“The phone’s software would need to communicate with that part of the screen to deliver the optimum electric field and be more sensitive for the target ion, but this is achievable,” said Professor Lisa Hall from Cambridge’s Department of Chemical Engineering and Biotechnology, who co-led the research. “We’re keen to do much more on this – it’s just the first step.”

While it’s now possible to detect ions using a touchscreen, the researchers hope to further develop the technology so that it can detect a wide range of molecules. This could open up a huge range of potential health applications.

“For example, if we could get the sensitivity to a point where the touchscreen could detect heavy metals, it could be used to test for things like lead in drinking water. We also hope in the future to deliver sensors for home health monitoring,” said Daly.

“This is a starting point for broader exploration of the use of touchscreen sensing in mobile technologies and the creation of tools that are accessible to everyone, allowing rapid measurements and communication of data,” said Hall.

 

Reference:
Sebastian Horstmann, Cassi J Henderson, Elizabeth A H Hall, Ronan Daly. ‘Capacitive touchscreen sensing – a measure of electrolyte conductivity.’ Sensors and Actuators B (2021). DOI: 10.1016/j.snb.2021.130318



Biological ‘Fingerprints’ of Long COVID in Blood Could Lead To Diagnostic Test, Say Cambridge Scientists

Tired looking woman
source: www.cam.ac.uk

Markers in our blood – ‘fingerprints’ of infection – could help identify individuals who have been infected by SARS-CoV-2, the coronavirus that causes COVID-19, several months after infection even if the individual had only mild symptoms or showed no symptoms at all, say Cambridge researchers.

 

Because we currently have no reliable way of diagnosing long COVID, the uncertainty can cause added stress to people who are experiencing potential symptoms. If we can say to them ‘yes, you have a biomarker and so you have long COVID’, we believe this will help allay some of their fears and anxieties

Nyarie Sithole

The team has received funding from the National Institute for Health Research to develop a test that could complement existing antibody tests. They also aim to use similar biological signatures to develop a test and monitor for long COVID.

While most people recover from COVID-19 in a matter of days or weeks, around one in ten people go on to develop symptoms that can last for several months. This can be the case irrespective of the severity of their COVID-19 – even individuals who were asymptomatic can experience so-called ‘long COVID’.

Diagnosing long COVID can be a challenge, however. A patient with asymptomatic or mild disease may not have taken a PCR test – the gold standard for diagnosing COVID-19 – at the time of infection, and so has never had a confirmed diagnosis. Even antibody tests – which look for antibodies produced in response to infection – are estimated to miss around 30% of cases, particularly among those who have had only mild disease or whose initial illness was more than six months ago.

A team at the University of Cambridge and Cambridge University Hospital NHS Foundation Trust has received £370,000 from the National Institute for Health Research to develop a COVID-19 diagnostic test that would complement existing antibody tests and a test that could objectively diagnose and monitor long COVID.

The research builds on a pilot project supported by the Addenbrooke’s Charitable Trust. The team has been recruiting patients from the Long COVID Clinic established in May 2020 at Addenbrooke’s Hospital, part of Cambridge University Hospital NHS Foundation Trust.

During the pilot, the team recruited 85 patients to the Cambridge NIHR COVID BioResource, which collects blood samples from patients when they are first diagnosed and then at follow-up intervals over several months. They now hope to expand their cohort to 500 patients, recruited from Cambridgeshire and Peterborough.

In their initial findings, the team identified a biomarker – a biological fingerprint – in the blood of patients who had previously had COVID-19. This biomarker is a molecule known as a cytokine produced by T cells in response to infection. As with antibodies, this biomarker persists in the blood for a long time after infection. The team plans to publish their results shortly.

Dr Mark Wills from the Department of Medicine at the University of Cambridge, who co-leads the team, said: “We need a reliable and objective way of saying whether someone has had COVID-19. Antibodies are one sign we look for, but not everyone makes a very strong response and this can wane over time and become undetectable.

“We’ve identified a cytokine that is also produced in response to infection by T cells and is likely to be detectable for several months – and potentially years – following infection. We believe this will help us develop a much more reliable diagnostic for those individuals who did not get a diagnosis at the time of infection.”

By following patients for up to 18 months post-infection, the team hopes to address several questions, including whether immunity wanes over time. This will be an important part of helping understand whether people who have been vaccinated will need to receive boosters to keep them protected.

As part of their pilot study, the team also identified a particular biomarker found in patients with long COVID. Their work suggests these patients produce a second type of cytokine, which persists in patients with long COVID compared to those who recover quickly, and which might be one of the drivers behind the many symptoms that patients experience. This might therefore prove useful for diagnosing long COVID.

Dr Nyarie Sithole, also from the Department of Medicine at the University of Cambridge, who co-leads the team and helps to manage long COVID patients, said:  “Because we currently have no reliable way of diagnosing long COVID, the uncertainty can cause added stress to people who are experiencing potential symptoms. If we can say to them ‘yes, you have a biomarker and so you have long COVID’, we believe this will help allay some of their fears and anxieties.

“There is anecdotal evidence that patients see an improvement in symptoms of long COVID once they have been vaccinated – something that we have seen in a small number of patients in our clinic. Our study will allow us to see how this biomarker changes over a longer period of time in response to vaccination.”

At the moment, the team is using the tests for research purposes, but by increasing the size of their study cohort and carrying out further work, they hope to adapt and optimise the tests so that they can be scaled up and sped up, ready for use by clinical diagnostic labs.

As well as developing a reliable test, the researchers hope their work will help provide an in-depth understanding of how the immune system responds to coronavirus infection – and why it triggers long COVID in some people.

Dr Sithole added: “One of the theories of what’s driving long COVID is that it’s a hyperactive immune response – in other words, the immune system switches on at the initial infection and for some reason never switches off or never goes back to the baseline. As we’ll be following our patients for many months post-infection, we hope to better understand whether this is indeed the case.”

In addition, having a reliable biomarker could help in the development of new treatments against COVID. Clinical trials require an objective measure of whether a drug is effective. Changes in – or the disappearance of – long-COVID-related cytokine biomarkers with corresponding symptom improvement in response to drug treatment would suggest that a treatment intervention is working.
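To illustrate what such an objective measure might look like in practice, here is a minimal sketch of a biomarker endpoint analysis. Everything in it – the numbers, the assumed treatment effect and the choice of a simple paired test – is a hypothetical stand-in, not the team’s assay or statistical plan:

    # Illustrative sketch with made-up numbers; not the Cambridge team's
    # assay or statistical analysis plan.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)

    # Simulated cytokine levels (arbitrary units) for 30 trial participants,
    # measured before and after a course of a hypothetical candidate drug.
    before = rng.normal(loc=12.0, scale=3.0, size=30)
    after = before - rng.normal(loc=4.0, scale=2.0, size=30)  # assumed effect

    # A paired test asks whether the within-patient fall in the biomarker is
    # larger than chance variation alone would explain.
    t_stat, p_value = stats.ttest_rel(before, after)
    print(f"mean fall: {np.mean(before - after):.1f} units, p = {p_value:.2g}")

A real trial would also compare against a placebo arm, since long COVID symptoms – and potentially the biomarker – may improve over time without any intervention.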



University of Cambridge Launches Roadmap To Support Future Growth of Life Sciences Cluster

Cambridge Biomedical Campus
source: www.cam.ac.uk

 

Connect: Health Tech, the University of Cambridge Enterprise Zone, has today launched a roadmap, ‘Creating a University Enterprise Zone for Cambridge across the life and physical sciences’, which examines the challenges of futureproofing and sustaining the growth of the life sciences cluster so that Cambridge remains a global centre of excellence for health tech.

 

Cambridge has a deep and rich history of discovery and collaboration, and its interdisciplinary environment is the perfect testbed for new models of innovation in the life sciences

Andy Neely

The roadmap sets out a clear plan to create a bridge between two of Cambridge’s historical strengths – biomedical research and cutting-edge technology – and bring these specialisms together to develop new treatments and health tech with real-world applications. The solutions in the roadmap are scalable beyond Cambridge and applicable to other disciplines and sectors.

Professor Andy Neely, Pro-Vice-Chancellor for Enterprise and Business Relations at the University of Cambridge, said: “Cambridge has a deep and rich history of discovery and collaboration, and its interdisciplinary environment is the perfect testbed for new models of innovation in the life sciences. Our roadmap sets out a plan to do just that and will ensure that Cambridge remains a global leader in health technology into the next generation.

“This will require us to pioneer new ways of working and creating connections between different institutions across disciplines, be they academic or private enterprise. Such a model has been proven to work at a small scale – our proposal in the roadmap is to scale this up and apply it across the cluster and beyond.”

The University sits at the heart of the so-called ‘Cambridge cluster’, in which more than 5,300 knowledge-intensive firms employ more than 67,000 people and generate £18 billion in turnover. Cambridge has the highest number of patent applications per 100,000 residents in the UK.

The mission of the University is to contribute to society through the pursuit of education, learning and research at the highest international levels of excellence. This includes cultivating and delivering excellent research and world-leading innovation, and training the next generation of highly skilled researchers and entrepreneurs, thereby underpinning the UK’s economic growth and competitiveness.

Professor Tony Kouzarides, Director of the Milner Therapeutics Institute at the University of Cambridge, said: “The pandemic has clearly shown the importance of rapid innovation in healthcare. We are determined to harness the power of innovation, creativity and collaboration in Cambridge, and apply this towards solving some of the biggest medical challenges facing the country, and the world.”

The Connect: Health Tech roadmap is a result of consultation with major stakeholders and a series of road-mapping workshops with the Cambridge community. It aims to shape the future success of the Cambridge cluster in health tech through a supportive and dynamic ecosystem that aligns with the needs of the community.

The roadmap includes ambitious steps to build strong foundations for the Cambridge cluster over the next 20 years. It will support the region’s post-pandemic economic recovery and bring cutting-edge research, businesses and innovators together so that they are better prepared and connected for the future. Connect: Health Tech will also widen access to the Cambridge ecosystem, extending its reach and helping to level up growth and investment across the East of England and the Oxford-Cambridge Arc.

One of the major recommendations in the report is to create and foster connectivity at the interface between medicine and technology and across sectors. This recommendation has been piloted by expanding the Cambridge cluster from a physical community to a digital one.

The COVID-19 pandemic has required an innovative new model for accessing and navigating Cambridge. The digital platform simplifies navigation of the Cambridge research community and enables companies based all over the world to access expertise and knowledge across the University, with the aim of increasing inward investment in the life sciences. It also pilots an approach to navigation and connectivity that can be scaled up across the Arc and the UK. This new way of working will speed up the development of the healthcare innovations and technologies that the NHS will use in years to come.

Connect: Health Tech is a University of Cambridge initiative funded by Research England. The Connect: Health Tech UEZ has been created to build a highly effective interdisciplinary bridge between two Cambridge research hubs and beyond: the west science and technology hub anchored at the Maxwell Centre, and the south biomedical hub anchored at the Milner Therapeutics Institute. The bridge will bring together and integrate a community from across the University, research institutes, the NHS, industry, investors, and local and national government, with a focus on medtech, digital health and therapeutics, to create opportunities that will transform ideas at the interface between medicine and technology into reality.

Read ‘Creating a University Enterprise Zone for Cambridge across the life and physical sciences’



Top UK Organisations Release Annual Statistics For Use Of Animals In Research


source: www.cam.ac.uk

 

The ten organisations in Great Britain that carry out the highest number of animal procedures – those used in medical, veterinary and scientific research – have today released their annual statistics.

 

We always aim to use as few animals as possible, refining our research and actively looking for ways of replacing their use.

Martin Vinnell

The release coincides with the Home Office’s publication of Great Britain’s statistics for animals used in research in 2020.

These ten organisations carried out 1,343,893 procedures – 47%, or nearly half, of the 2,883,310 procedures carried out in Great Britain in 2020. More than 99% of these procedures were carried out in rodents or fish.

The statistics are freely available on the organisations’ websites as part of their ongoing commitment to transparency and openness around the use of animals in research.

The ten organisations are listed below alongside the total number of procedures that they carried out in 2020. This is the sixth consecutive year organisations have come together to publicise their collective statistics and examples of their research.

Organisation – Number of procedures
The Francis Crick Institute – 183,811
University of Cambridge – 177,219
Medical Research Council – 173,637
University of Oxford – 169,511
University of Edinburgh – 151,669
UCL – 142,988
University of Glasgow – 102,526
University of Manchester – 93,448
King’s College London – 85,414
Imperial College London – 63,670
Total – 1,343,893

A further breakdown of Cambridge’s numbers, including the number of procedures by species and detail of the levels of severity, can be found on our animal research pages.

Animal research has been essential for developing lifesaving vaccines and treatments for COVID-19. Ferrets and macaque monkeys were used to test the safety and efficacy of COVID-19 vaccines, including the successful Oxford / AstraZeneca vaccine. Hamsters are being used to develop COVID-19 treatment strategies as they display a more severe form of the disease than ferrets and monkeys. Guinea pigs have also been used in regulatory research to batch test vaccine potency.

Despite all this research to develop vaccines and treatments for COVID-19, the majority of UK research facilities carried out significantly less research than usual due to the various national lockdowns. Therefore, the 2020 figures cannot reasonably be compared with previous statistics.

All organisations are committed to the ‘3Rs’ of replacement, reduction and refinement. This means avoiding or replacing the use of animals where possible, minimising the number of animals used per experiment, and optimising the experience of the animals to improve animal welfare. However, as institutions expand and conduct more research, the total number of animals used can rise even if fewer animals are used per study.

All organisations listed are signatories to the Concordat on Openness on Animal Research in the UK, a commitment to be more open about the use of animals in scientific, medical and veterinary research in the UK. More than 120 organisations have signed the Concordat including UK universities, medical research charities, research funders, learned societies and commercial research organisations.

Wendy Jarrett, Chief Executive of Understanding Animal Research, which developed the Concordat on Openness, said:

“Animal research has been essential to the development and safety testing of lifesaving COVID-19 vaccines and treatments. Macaque monkeys and ferrets have been used to develop vaccines, including the Oxford / AstraZeneca vaccine, hamsters are being used to develop treatments, and guinea pigs are used to quality-check each batch of vaccines.

“Animal testing provided scientists with initial data that the vaccines were effective and safe enough to move into human clinical trials. During these trials, thousands more humans than animals were used to test how effective and safe the vaccines were in people. The pandemic has led to increased public interest in the way vaccines and medicines are developed and UAR has worked with research institutions and funding bodies throughout the UK to develop resources that explain to the public how animals have been used in this critical research.”

University of Cambridge Establishment Licence Holder Dr Martin Vinnell said:

“Animal research currently plays an essential role in our understanding of health and disease and in the development of modern medicines and surgical techniques. Without the use of animals, we would not have many of the modern medicines, antibiotics, vaccines and surgical techniques we take for granted in both human and veterinary medicine.

“We always aim to use as few animals as possible, refining our research and actively looking for ways of replacing their use, for example in the development of ‘mini-organs’ grown from human cells, which can be used to model disease.”

Adapted from a press release by Understanding Animal Research.

Find out more

A team in the University of Cambridge’s Department of Engineering is developing implantable devices to bypass nerve damage and restore movement to paralysed limbs.

“Our aim is to make muscles wireless by intercepting electrical signals from the brain before they enter the damaged nerve and sending them directly to the target muscles via radio waves,” says Sam Hilton, a Research Assistant in the team.

The procedure has been tested and refined in computer simulations and on cells grown in the lab. But before it can be tested in humans there is another important step: testing its safety in living rats. Avoiding animal testing entirely would place an untenable risk on the first human recipients of this new device. All the experiments are carefully designed to ensure that just enough animals are used to produce convincing data, and no more.

By working out how complex microelectronics can interface with living tissue in a precise and controlled way, this work has the potential to improve or restore movement in patients with severe nerve damage – improving their quality of life and easing the burden on our healthcare services.



Women Economists Underrepresented ‘At Every Level’ In UK Academia – Report

 

New research shows the gender gap in the teaching and study of economics is still dramatic and actually getting worse. Economists argue that this is not just a problem for the discipline, but for society as a whole.

 

Unless economists are diverse, we cannot hope to build a complete understanding of the economy, and, with it, formulate the right kinds of policies

Victoria Bateman

Women are underrepresented “at almost every level” within the discipline of economics at UK universities, according to a new report co-authored by a Cambridge economist.

Dr Victoria Bateman says that her research for the Royal Economics Society (RES) found signs of “stagnation and retreat” in the closing of gender gaps across the study of economics – with female intake (relative to male) actually falling at both undergraduate and master’s levels over the last two decades.

Published today, the report ‘Gender Imbalance in UK Economics’ marks 25 years since the establishment of the RES Women’s Committee, which was set up to monitor and advance the representation of women in UK economics.

“The economy affects everyone, and economists need to represent us all,” said Bateman, an Economics Fellow at Gonville and Caius College. “If they don’t, that’s a major barrier to building a solid understanding of the economy.”

“Across all students, from undergraduate to PhD, there are twice as many men studying economics as there are women in UK universities. While in many respects the discipline of economics has come a long way in the 21st century, the gender gap is clearly still real, persistent and in some ways getting worse.”

Bateman and colleagues argue that attracting, retaining and promoting female economists is a “particular problem” within UK academia when compared to areas of government and third sector organisations such as think tanks.

Only a quarter (26%) of economists working in UK academia are female, and only 15% of economics professors are women, compared to 38% of the economists at the UK Treasury and 44% of researchers at economic think tanks.

Among UK students entering the discipline, the gender gap has actually widened since 2002, when 31% of economics undergraduates and 37% of master’s students were women. By 2018, this had fallen to 27% and 31% respectively. Bateman says these statistics show that the closure of the gender gap in economics “isn’t simply a matter of time”.

“Only a third of economics lecturers in the UK are women, and just 15% of economics professors,” said report co-author Dr Erin Hengel, who received her PhD in economics from Cambridge before going on to lecture at the University of Liverpool.

“While these figures are better than they were 25 years ago, the improving trend has levelled off. It appears that progress is starting to slow far before we reach any kind of gender parity.”

When the report’s authors factored in ethnicity, the percentage of female students was higher. In 2018, a third (33%) of Black economics undergraduates and 31% of undergraduates of Asian ethnicity were women, compared to a quarter (25%) of White students.

However, women from ethnic minority backgrounds are not staying in academic economics. The report also found that at PhD level, the proportion of women is ten percentage points lower among minority candidates than among white candidates.

Perhaps startlingly, the report found that between 2012 and 2018 there was not a single Black woman employed as a professor of economics anywhere in the UK.

Bateman says she hopes the new report will serve as a “call to arms” for the discipline of economics. “We are calling on universities to ask themselves why so few UK women are attracted to studying and researching the economy and why, even when they are, they do not stay,” she said.

Bateman’s 2019 book The Sex Factor showed how the status and freedom of women are central to prosperity, and that ‘gender blindness’ in economics has left the discipline wide of the mark on everything from poverty and inequality to understanding cycles of boom and bust.

“Unless economists are diverse, we cannot hope to build a complete understanding of the economy, and, with it, formulate the right kinds of policies,” Bateman added.

 



Trinity Challenge Announces Inaugural Winners


Collage of Trinity Challenge finalists

 

The Trinity Challenge has announced the winners of its inaugural competition, and is investing a £5.7 million (US$8 million) pledged charitable prize fund in one grand prize winner, two second prize winners and five third prize winners.

 

While others talked, we took action. The solutions we have discovered in the course of the Challenge will be a link between systems and countries

Dame Sally Davies

The eight winners have been selected by an international panel of expert judges, out of a total of 340 applications from 61 countries. The competition has seen unprecedented collaborations between the private, public, charitable and academic sectors, and will drive a step-change in using data and analytics for pandemic preparedness.

The University of Cambridge joined a coalition of some of the world’s leading businesses and academic and tech institutions to launch The Trinity Challenge in September 2020. The global challenge, convened by Dame Sally Davies, Master of Trinity College, provides a £10m prize fund for breakthrough solutions to make sure one billion more people are better protected against health emergencies.

Participatory One Health Disease Detection (PODD), which empowers farmers to identify and report zoonotic diseases that could potentially pass from animals to humans, has been named the grand prize winner at the inaugural awards ceremony. The organisation is being awarded £1.3 million (US$1.8 million) in pledged funding.

Led by Patipat Susumpow, General Director at OpenDream, PODD has developed a platform for livestock owners to report suspected animal illness, and in return receive veterinary care to improve animal health. If a disease outbreak appears likely, local health officials will quarantine the sick animals, saving the remaining livestock and possibly preventing the next COVID-19-type outbreak.

Having already achieved significant success in Thailand, with a network of 20,000 farmers helping to detect and control disease outbreaks, PODD is looking to expand its operations to Cambodia, India, Indonesia, Laos, Uganda and Vietnam over the next three years.

BloodCounts! – an international consortium of scientists led by Professor Carola-Bibiane Schönlieb from Cambridge’s Department of Applied Mathematics and Theoretical Physics (DAMTP) that has developed an innovative infectious disease outbreak detection system – was one of two second prize winners, each awarded £1 million in pledged funding.

Developed by Dr Michael Roberts and Dr Nicholas Gleadall, the BloodCounts! solution uses data from routine blood tests and powerful AI-based techniques to provide a ‘tsunami-like’ early warning system for new disease outbreaks.

“Since the beginning of the pandemic I have been developing AI-based methods to aid in medical decision making for COVID-19 patients, starting with analysis of chest X-ray data,” said Roberts, who is affiliated with DAMTP and the Cambridge Mathematics of Information in Healthcare (CMIH) Hub. “Echoing the observations made by the clinical teams, we saw profound and unique differences in the medical measurements of infected individuals, particularly in their full blood count data. It is these changes that we can train models to detect at scale.”

Unlike many current test methods, their approach doesn’t require any prior knowledge of a specific pathogen to work. Instead, it uses full blood count data to exploit the pathogen-detecting abilities of the human immune system, observing changes in the blood measurements associated with infection.

As the full blood count is the world’s most common medical laboratory test, with over 3.6 billion performed worldwide each year, the BloodCounts! team can rapidly apply their methods to scan for abnormal changes in the blood cells of large populations – alerting public health agencies to potential outbreaks of pathogen infection.
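The surveillance logic described here – learn what routine full blood counts look like in a reference period, then watch for a population-level rise in atypical results – can be sketched with standard tools. The snippet below is a toy illustration under simplified assumptions (three invented blood count features and a generic off-the-shelf anomaly detector), not the BloodCounts! models:

    # Toy illustration of pathogen-agnostic blood count surveillance;
    # invented feature values, not the BloodCounts! models or data.
    import numpy as np
    from sklearn.ensemble import IsolationForest

    rng = np.random.default_rng(0)

    # Reference period: simulated [haemoglobin, platelets, lymphocytes] values.
    baseline = rng.normal(loc=[140.0, 250.0, 2.0],
                          scale=[12.0, 60.0, 0.5], size=(10_000, 3))

    # Fit an anomaly detector with no knowledge of any specific pathogen.
    detector = IsolationForest(random_state=0).fit(baseline)

    # A new week of tests: simulate an outbreak signature (here, lymphopenia).
    current = rng.normal(loc=[138.0, 245.0, 1.2],
                         scale=[12.0, 60.0, 0.5], size=(10_000, 3))

    # Surveillance signal: the fraction of tests flagged as atypical. A
    # sustained rise above the reference rate would trigger an alert.
    flagged = (detector.predict(current) == -1).mean()
    print(f"fraction of blood counts flagged as atypical: {flagged:.1%}")

Note that nothing in the sketch assumes a signature of any particular pathogen; that pathogen-agnostic property is what the team exploit at far larger scale, with richer models and real analyser data.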

This solution is a demonstration of how the application of AI-based methods can lead to healthcare benefits. It also highlights the importance of strong collaboration between leading organisations, as the development of these algorithms was only possible due to the EpiCov data-sharing initiative pioneered by Cambridge University Hospitals.

“Hundreds of millions of full blood count tests are being performed every day worldwide, and this meant that we could apply our AI methods at population scale,” said Gleadall, from the University of Cambridge and NHS Blood and Transplant. “Usually the rich measurement data are discarded after summary results have been reported, but by working with Cambridge University, Barts Health London, and University College London NHS Hospitals we have rescued the rich data from 2.8 million full blood count tests carried out throughout the pandemic.”

The Sentinel Forecasting System is the other second prize winner, and will explore the emergence of new infectious diseases in West Africa, beginning with Lassa fever. The system will combine data from ecology, social science, genomics and epidemiology to provide real-time disease risk estimates for haemorrhagic fevers such as Lassa and Ebola.

Lassa is a virus usually passed to humans through exposure to food or household items contaminated by infected rats. It is endemic in West African countries including Benin, Ghana, Guinea, Liberia, Mali, Sierra Leone, Togo and Nigeria.

Around 80% of people who become infected with Lassa virus have no symptoms, and the overall case-fatality rate is 1%. However, 1 in 5 infections results in severe disease affecting the liver, spleen and kidneys.

The UCL team will partner with the African Centre of Excellence for Genomics of Infectious Diseases in Nigeria, Nigeria Centre for Disease Control, Zoological Society of London, London School of Hygiene and Tropical Medicine, Microsoft, and Cambridge’s Laboratory of Viral Zoonotics (LVZ) to produce the system.

“This Trinity Challenge project brings new multidisciplinary technologies together to anticipate the climatic, human, animal-population and agricultural impacts on the likelihood of spillovers of infections from animals to humans,” said Professor Jonathan Heeney, who leads the LVZ at Cambridge’s Department of Veterinary Medicine.

Additionally, the five third prize winners are each being awarded £480,000 (US$660,000) in pledged funding.

Dame Sally Davies said: “It was crystal clear at the beginning of this pandemic that the world had a lack of data, a lack of access to data, and a lack of interoperability of data, presenting a challenge. While others talked, we took action. The solutions we have discovered in the course of the Challenge will be a link between systems and countries.”

In addition to financial support, The Trinity Challenge will provide connections to the right organisations to maximise the impact of these solutions. Since its inception nine months ago, The Trinity Challenge has united early applicants with partners from the private, academic and social sectors, giving them access to digital platforms, data and technical advice to scale up the use of data and analytics to protect the world from future health emergencies. It has helped form over 200 connections between applicants and its members.



Marmoset Study Identifies Brain Region Linking Actions To Their Outcomes

Marmoset

 

Researchers have discovered a specific brain region underlying ‘goal-directed behaviour’ – that is, when we consciously do something with a particular goal in mind, for example going to the shops to buy food.

 

This is a first step towards identifying suitable molecular targets for future drug treatments, or other forms of therapy, for devastating mental health disorders such as OCD and addiction.

Trevor Robbins

The study, published today in the journal Neuron, found that marmoset monkeys could no longer make an association between their behaviour and a particular outcome when a region of their brain called the anterior cingulate cortex was temporarily switched off.

This finding is important because the compulsive behaviours seen in obsessive-compulsive disorder (OCD) and addiction are thought to result from impairments in the ‘goal-directed system’ in the brain. In these conditions, worrying, obsessions or compulsive behaviour such as drug seeking may reflect an alternative, habit-based system at work in the brain, in which behaviours are not correctly linked with their outcomes.

It also sheds more light on how healthy people behave in a goal-directed way, which is needed to respond to changing environments and goals.

“We have identified the very specific region of the brain involved in goal-directed behaviour. When we temporarily turned this off, behaviour became more habitual – like when we go onto autopilot,” said Lisa Duan in the University of Cambridge’s Department of Psychology, first author of the report.

Marmosets were used because their brains share important similarities with human brains, and it is possible to manipulate specific brain regions to understand causal effects.

In the experiment, marmosets were first taught a goal-directed behaviour: by tapping a coloured cross when it appeared on a touchscreen, they were rewarded with their favourite juice to drink. But this connection between action and reward was randomly uncoupled so that they sometimes received the juice without having to respond to the image. They quickly detected this change and stopped responding to the image, because they saw they could get juice without doing anything.

Using drugs, the researchers temporarily switched off the anterior cingulate cortex, including its connections with another brain region called the caudate nucleus. Repeating the experiment, they found that when the connection between tapping the cross and receiving juice was randomly uncoupled, the marmosets did not change their behaviour but kept tapping the cross when it appeared.

Such habitual responding to the coloured cross was not observed when several other neighbouring regions of the brain’s prefrontal cortex – known to be important for other aspects of decision-making – were switched off. This shows the specificity of the anterior cingulate region for goal-directed behaviour.
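The behavioural logic of the task can be captured in a toy simulation. The sketch below is purely illustrative – an invented learning rule, not the study’s analysis code. A ‘goal-directed’ agent taps only while tapping clearly raises its estimated chance of juice, whereas a ‘habitual’ agent keeps emitting the trained response regardless of outcome:

    # Toy model of contingency degradation; not the study's analysis code.
    import random
    random.seed(1)

    def simulate(p_juice_if_tap, p_juice_if_no_tap, goal_directed, n=500):
        """Return the fraction of trials on which the agent taps the cross."""
        est = {True: 0.5, False: 0.5}   # estimated juice rate after tap / no tap
        alpha, explore, taps = 0.1, 0.1, 0
        for _ in range(n):
            if goal_directed:
                # Tap only while tapping is estimated to clearly pay off more.
                tap = est[True] - est[False] > 0.2 or random.random() < explore
            else:
                tap = True   # habit: respond to the cue whatever the outcome
            taps += tap
            juice = random.random() < (p_juice_if_tap if tap else p_juice_if_no_tap)
            est[tap] += alpha * (juice - est[tap])   # update the sampled action
        return taps / n

    # Coupled phase: juice depends on tapping, so a goal-directed agent responds.
    print("coupled, goal-directed: ", simulate(0.9, 0.0, goal_directed=True))
    # Degraded phase: juice arrives anyway. Goal-directed responding collapses,
    # as in the intact marmosets; the habitual agent keeps tapping, as in the
    # animals with the anterior cingulate cortex switched off.
    print("degraded, goal-directed:", simulate(0.9, 0.9, goal_directed=True))
    print("degraded, habitual:     ", simulate(0.9, 0.9, goal_directed=False))

Once the action–outcome contingency is degraded, the goal-directed agent’s tap rate collapses while the habitual agent’s does not – the same dissociation the marmosets showed with the anterior cingulate cortex intact versus switched off.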

A similar effect has been observed in computer-based tests on patients with OCD or addiction – when the relationship between an action and an outcome is uncoupled, the patients continue to respond as though the connection were still there.

Previous evidence from patients suffering brain damage, and from brain imaging in healthy volunteers, shows that part of the brain called the prefrontal cortex is involved in goal-directed behaviour. However, the prefrontal cortex is a complex structure with many regions, and it has not previously been possible to identify the specific part responsible for goal-directed behaviour from human studies alone.

“We think this is the first study to have established the specific brain circuitry that controls goal-directed behaviour in primates, whose brains are very similar to human brains,” said Professor Angela Roberts in the University of Cambridge’s Department of Physiology, Development and Neuroscience, joint senior author of the report.

“This is a first step towards identifying suitable molecular targets for future drug treatments, or other forms of therapy, for devastating mental health disorders such as OCD and addiction,” added Professor Trevor Robbins in the University of Cambridge’s Department of Psychology, joint senior author of the report.

This research was conducted in the University of Cambridge’s Behavioural and Clinical Neuroscience Institute, and was funded by Wellcome.

Reference

Duan, L.Y. et al. ‘Controlling one’s world: identification of sub-regions of primate PFC underlying goal-directed behaviour.’ Neuron, June 2021. DOI: 10.1016/j.neuron.2021.06.003

