All posts by Adam Brinded

Cambridge is the UK’s most innovation-intensive city, says report

Hand holding test tubes in a lab

A new report by Dealroom shows that Cambridge is, for its size, the most innovative city in the UK. Globally, it ranks fourth behind US innovation powerhouses San Francisco, Boston and New York. 

Dealroom’s Global Tech Ecosystem Index analyses and compares start-up ecosystems in 288 cities across 69 countries. To measure innovation intensity, it looks for ecosystems that are performing well relative to their population size. These hubs typically have high start-up activity, research intensity and strong links with local universities.

Diarmuid O’Brien, Pro-Vice-Chancellor for Innovation at the University of Cambridge, said: “It’s great to see that, as a relatively small city, Cambridge continues to lead the UK in innovation intensity but it’s no accident that we punch above our weight. In recent years, the University and the wider ecosystem have put in place a range of initiatives to ensure that we realise our potential and are able to bring transformative science and technologies out of the lab and into the real world.”

Gerard Grech, Head of Founders at the University of Cambridge, which supports new ventures emerging from the University, added: “Cambridge is proof of what happens when world-class research meets relentless ambition. While global venture capital funding in 2024 pulled back, Cambridge doubled investment – a powerful signal that deep tech innovation is increasingly leading the way in shaping our future economies.

“What makes Cambridge unique is its cutting-edge science, an increasing flywheel of people who have successfully scaled ventures, and a culture built to turn ground-breaking ideas into transformative companies.”



The text in this work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License. Images, including our videos, are Copyright ©University of Cambridge and licensors/contributors as identified. All rights reserved. We make our image and video content available in a number of ways – on our main website under its Terms and conditions, and on a range of channels including social media that permit your use and sharing of our content under their respective Terms.

source: cam.ac.uk

Cambridge researchers named as 2025 Academy of Medical Sciences Fellows

Academy of Medical Sciences plaque
Credit: Big T Images for Academy of Medical Sciences

Four Cambridge biomedical and health researchers are among those announced today as newly-elected Fellows of the Academy of Medical Sciences.

The new Fellows have been recognised for their remarkable contributions to advancing medical science, groundbreaking research discoveries and translating developments into benefits for patients and the wider public. Their work exemplifies the Academy’s mission to create an open and progressive research sector that improves health for everyone.

They join an esteemed Fellowship of 1,450 researchers who are at the heart of the Academy’s work, which includes nurturing the next generation of scientists and shaping research and health policy in the UK and worldwide.

One of Cambridge’s new Fellows, Professor Sam Behjati, is a former recipient of the Academy’s prestigious Foulkes Foundation medal, which recognises rising stars within biomedical research. Sam is Clinical Professor of Paediatric Oncology at the University and an Honorary Consultant Paediatric Oncologist at Addenbrooke’s Hospital, as well as Group Leader at the Wellcome Sanger Institute. His research is rooted in cancer genomics, phylogenetics, and single cell transcriptomics and spans a wide range of diseases and biological problems. More recently, his work has focused on the origin of cancers, in particular of childhood cancer. In addition, he explores how to use genomic data to improve the treatment of children. Sam is a Fellow at Corpus Christi College, Cambridge.

Also elected to the Academy of Medical Sciences Fellowship are:

Professor Clare Bryant, Departments of Medicine and Veterinary Medicine

Clare Bryant is Professor of Innate Immunity. She studies innate immune cell signalling during bacterial infection to answer fundamental questions about host-pathogen interactions and to search for new drugs to modify them. She also applies these approaches to study inflammatory signalling in chronic diseases of humans and animals. Clare has extensive collaborations with many pharmaceutical companies, is on the scientific advisory board of several biotech companies, and helped found the natural product company Polypharmakos. Clare is a Fellow of Queens’ College, Cambridge.

Professor Frank Reimann, Institute of Metabolic Science-Metabolic Research Laboratories

Frank Reimann is Professor of Endocrine Signalling. The main focus of his group, run in close partnership with Fiona Gribble, is the enteroendocrine system within the gut, which helps regulate digestion, metabolism, and how full we feel. Their work has included the use of animal models and human cellular models to understand how these cells function. One of the hormones these cells produce, glucagon-like peptide-1 (GLP-1), is the target of therapies now widely used in the treatment of diabetes mellitus and obesity. How these cells shape feeding behaviour has become a major focus of the lab in recent years.

Professor Mina Ryten, UK Dementia Research Institute

Mina Ryten is a clinical geneticist and neuroscientist, and Director of the UK Dementia Research Institute at Cambridge since January 2024. She also holds the Van Geest Professorship and leads a lab focused on understanding molecular mechanisms driving neurodegeneration. Mina’s research looks at how genetic variation influences neurological diseases, particularly Lewy body disorders. Her work has advanced the use of single cell and long-read RNA sequencing to map disease pathways and identify potential targets for new treatments. Her expertise in clinical care and functional genomics has enabled her to bridge the gap between patient experience and scientific discovery.

Professor Andrew Morris CBE FRSE PMedSci, President of the Academy of Medical Sciences, said: “The breadth of disciplines represented in this year’s cohort – from mental health and infectious disease to cancer biology and respiratory medicine – reflects the rich diversity of medical science today. Their election comes at a crucial time when scientific excellence and collaboration across disciplines are essential for addressing global health challenges both now and in the future. We look forward to working with them to advance biomedical research and create an environment where the best science can flourish for the benefit of people everywhere.”

The new Fellows will be formally admitted to the Academy at a ceremony on Wednesday 9 July 2025.




Enhanced breast cancer screening in the UK could detect an extra 3,500 cancers per year, trial shows

Woman undergoing mammogram procedure – stock photo
Credit: Tom Werner (Getty Images)

Researchers in Cambridge are calling for additional scans to be added to breast screening for women with very dense breasts. This follows a large-scale trial, which shows that extra scans could treble cancer detection for these women, potentially saving up to 700 lives a year in the UK.

We need to change our national screening programme so we can make sure more cancers are diagnosed early, giving many more women a much better chance of survival – Fiona Gilbert

Around 10% of women have very dense breasts. Between the ages of 50 and 70, these women are up to four times more likely to develop breast cancer than women with low breast density.

Over 2.2 million women receive breast screening in the UK each year. For women with very dense breasts, the mammograms (breast X-rays) used for breast screening can be less effective at detecting cancer. This is because denser breasts look whiter on mammograms, making it harder to spot small early-stage cancers, which also appear white.

Published today in The Lancet, a trial of more than 9,000 women across the UK with dense breasts and a negative (no cancer) mammogram result found 85 cancers.

The trial, called BRAID, tested different scanning methods that could be used in addition to mammograms to detect cancers in dense breasts. Per 1,000 women screened, two of the methods detected 17-19 cancers that were not seen in mammograms.

The two methods are known as CEM (contrast enhanced mammography) and AB-MRI (abbreviated magnetic resonance imaging).

The researchers who ran the trial estimate that adding either of these methods to existing breast screening could detect 3,500 more cancers per year in the UK. Estimates suggest that screening reduces mortality for about 20% of cancers detected, so this could mean an extra 700 lives saved each year.

BRAID also included a third scanning method, ABUS (automated whole breast ultrasound), which also detected cancers not seen in mammograms but was three times less effective than CEM and AB-MRI.

Each of the three methods was used to scan around 2,000 women. Per 1,000 women scanned, CEM detected 19 cancers, AB-MRI found 17 and ABUS found four.

Mammograms already detect approximately eight cancers per 1,000 women with dense breasts. This means additional scans could more than treble breast cancer detection in this group of women.

BRAID is the first trial to directly compare supplemental imaging methods and to demonstrate their value for early cancer detection as part of widespread screening. The team hope their results will be used to enhance screening programmes in the UK and globally to diagnose more cancers early.

More work is needed to confirm whether additional scans will reduce the number of deaths, as cancers detected through screening are not always life-threatening.

The trial was led from Cambridge. It recruited across 10 UK sites, including over 2,000 women at Addenbrooke’s Hospital, Cambridge.

The research was led by Professor Fiona Gilbert, Department of Radiology, University of Cambridge and honorary consultant radiologist at Addenbrooke’s Hospital, part of Cambridge University Hospitals NHS Foundation Trust (CUH). The trial was funded by Cancer Research UK with support from the National Institute for Health and Care Research (NIHR) Cambridge Biomedical Research Centre (BRC).

Professor Gilbert said: “Getting a cancer diagnosis early makes a huge difference for patients in terms of their treatment and outlook. We need to change our national screening programme so we can make sure more cancers are diagnosed early, giving many more women a much better chance of survival.”

Professor Stephen Duffy, Emeritus Professor at Queen Mary University of London, trial statistician and screening programme expert, said: “The NHS Breast Screening Programme has made a huge difference to many lives. Thanks to these results we can see that the technology exists to make screening even better, particularly for the 10% of women with dense breast tissue.”

Dr David Crosby, head of prevention and early detection at Cancer Research UK, said: “Breast cancer screening is for people without symptoms and helps to spot the disease at an early stage, when treatment is more likely to be successful. But having dense breasts can make it harder to detect cancer.

“This study shows that making blood vessels more visible during mammograms could make it much easier for doctors to spot signs of cancer in women with dense breasts. More research is needed to fully understand the effectiveness of these techniques, but these results are encouraging.

“Remember, having dense breasts is not something you can check for yourself or change, but if you’re concerned at all, you can speak to your GP.”

Reference
Gilbert, FJ et al. Comparison of supplemental imaging techniques – interim results of BRAID (Breast Screening: risk adapted imaging for density) randomized controlled trial. Lancet; 22 May 2025; DOI: 10.1016/S0140-6736(25)00582-3 

Press release from Cambridge University Hospitals NHS Foundation Trust


Louise’s story

Louise Duffield, age 60, a grandmother of four from Ely, was diagnosed with early-stage breast cancer as a result of the BRAID trial.

Louise works in local government. She spends her free time knitting, and visiting 1940s events around the UK with her husband, Fred, and their two restored wartime Jeeps. She is enthusiastic about clinical research and has previously participated as a healthy participant in several studies.

In 2023, Louise was invited to participate in the BRAID trial following her regular mammogram screening, which showed that she had very dense breasts. As part of the trial, Louise had an AB-MRI scan which identified a small lump deep inside one of her breasts.

“When they rang to say they’d found something, it was a big shock. You start thinking all sorts of things but, in the end, I just thought, at least if they’ve found something, they’ve found it early. The staff were brilliant, and so supportive.”

Soon after the MRI, Louise had a biopsy that confirmed she had stage 0 (very early) breast cancer within the ducts of one of her breasts. Six weeks later, Louise underwent surgery to remove the tumour; in that time, the tumour had already grown larger than it appeared on the scans.

“It’s been a stressful time and it’s a huge relief to have it gone. The team have been fantastic throughout. The tumour was deep in the breast so, if I hadn’t been on the trial, it could have gone unnoticed for years.

“I feel very lucky, it almost doesn’t feel like I’ve really had cancer. Without this research I could have had a very different experience.”

The location of Louise’s tumour meant it would have been difficult for her to find it through self-examination, and since it was not detected during her regular mammogram it would have been at least three years before she was invited for another.

Following a short course of radiotherapy, Louise is now cancer free. She will continue to be monitored for several years and will continue to attend her regular mammograms every three years as part of the national breast cancer screening programme.

“This experience has highlighted to me how important screening is. If I hadn’t had the mammogram, I wouldn’t have been invited to the trial. Getting treated was so quick because they found the cancer early.”




Life, death and mowing

Britain’s poetic obsession with the humble lawnmower revealed and explained

By Tom Almeroth-Williams

Over the last half-century, British poets including Philip Larkin and Andrew Motion have driven a ‘lawnmower poetry microgenre’, using the machine to explore childhood, masculinity, violence, addiction, mortality and much more, new research shows.

The study, published in Critical Quarterly, argues that the tradition goes back to the 17th-century poet Andrew Marvell who used mowing – with a scythe – to comment on the violence of the English Civil War.

“Lawnmower poetry had its highpoint in the late 20th century but now would be a good moment for a revival,” says the study’s author, Francesca Gardner, from Cambridge’s English Faculty and St Catharine’s College.

“It might seem random to write poetry about mowing but it’s a great vehicle for exploring our relationship with nature and with each other. Andrew Marvell wrote about mowing with scythes after the English Civil War and modern poets continue to use lawnmowers to think about their own ups and downs.

“In a time of eco-crisis, conflict and societal problems, perhaps another poet will be inspired to write one soon. They might reflect the growing anti-lawn movement or something else entirely.”

In 1651, Andrew Marvell wrote a poem in which a mower accidentally kills a bird crouched in the grass. In ‘Upon Appleton House’, he wrote that the ‘Edge’ of the scythe was left ‘all bloody from its Breast’.

Gardner argues that the poem makes us think about ‘the Flesh untimely mow’d’ as a result of powerful undeviating cycles including the seasons and warfare which dominate our lives and determine our actions.

In 1979, another poet from Hull, Philip Larkin, described killing a hedgehog with his own motorised machine. In ‘The Mower’, Larkin wrote that his mower had ‘stalled, twice’ and that he found ‘A hedgehog jammed up against the blades, / Killed.’

Inspired by ‘Upon Appleton House’, Larkin also admired Marvell’s four mower poems, ‘The Mower’s Song’, ‘Damon the Mower’, ‘The Mower Against Gardens’, and ‘The Mower to the Glow-Worms’, describing them as ‘charming and exquisite in the pastoral tradition’, and Gardner points out numerous similarities between the two poets.

“Larkin had a deep awareness of pastoral and georgic poetry and this makes his poem more unsettling. While he felt terrible about killing the hedgehog, which really happened, his poem is disturbing because it presents an uneasy affinity between the natural and the mechanical.”

“Every time Larkin cuts the grass, it grows back so he’s forced to use a machine that completes the job efficiently and repeatedly. By mirroring nature’s cruel, relentless forces, mowers like Larkin commit their own acts of cruelty and violence.”

And yet, Gardner argues, it is often through their violence that the human mowers in these poems discover a capacity to be careful, sensitive and empathetic.

Larkin’s is one of the best-known lawnmower poems from the UK and USA discussed by Gardner, but not the only one to tackle traumatic events.

In 2007, Andrew Motion based a moving elegy for his father on happy memories of him mowing the lawn. By contrast, Michael Laskey’s 1999 ‘The Lawnmower’ uses the machine to describe fatherly ‘despotism and neglect’, Gardner argues.

“Mowing a lawn is often viewed as a victory over nature but these poems reflect an increasing sense that this is a pyrrhic or ignoble victory,” Gardner says. “The father in Michael Laskey’s poem is so intent on mowing straight lines that he misses out on the joyful messiness of life with his children.”

Laskey’s poem ends: ‘We keep back, / do as we’re told, don’t touch. / It must be overgrown now, the grave’.

Gardner says: “British poets are very interested in the lawn as a nostalgic space so lawnmowers are often associated with childhood memories, especially of fathers working. The lawn is a safe domestic, often suburban, space in which unexpected violence can occur, as when Larkin kills a hedgehog.”

Gardner’s favourite lawnmower poem is Mark Waldron’s 2017 ‘I wish I loved lawnmowers’ which explores alienation, obsession and drug addiction. The speaker tells us that, if he loved lawnmowers, he would take a trip to the British Lawnmower Museum in Southport. But he doesn’t and the poem ends: ‘Now crack cocaine — that I loved’.

Most of the poems Gardner studied were written by recognised poets but she also found examples written by lawnmower enthusiasts. In 2013, Grassbox, the Old Lawnmower Club’s magazine, published Tony Hopwood’s parody of the hymn ‘Morning Has Broken’ which laments: ‘Mower has broken, / Gardener’s in mourning. / Missus has spoken, / Had the last word.’

“Lawnmowers draw people to poetry as much as poetry draws people to lawnmowers,” Gardner says.

Gardner points out that, to date, most British lawnmower poems have been written by men, but she has found examples of women poking fun at mower-obsessed men. In 2002, Grassbox published ‘A Lawnmower Widow’s Lament’, a poem by Peggy Miller, which opens: ‘I once was loved and cherished by a man who was quite handsome / But now I’m second fiddle to a Dennis or Ransomes’.

Francesca Gardner, a Harding Distinguished Scholar, is an expert on early modern pastoral, georgic, and ‘nature’ writing. She explains that British and American lawnmower poetry is rooted in two forms of ancient nature poetry.

The pastoral form presents an idyllic form of nature in which shepherds stroll through fields and the land yields things up to them. By contrast, georgic poetry involves people having to work hard and use tools because nature isn’t so generous.

Gardner points out that Andrew Marvell’s ‘Upon Appleton House’ is an unusual mixture of both pastoral and georgic.

“Poets inspired by Marvell appreciate that clash between idyllic nature and what it takes to maintain the lawn as an ideal space, the georgic conception of work,” Gardner says.

The final lines of Larkin’s poem were widely quoted during the Covid-19 pandemic: ‘we should be kind / While there is still time.’

“That remains a useful lesson whether we’re mowing or not,” Gardner says.

Reference

Francesca Gardner, ‘Lawnmower Poetry and the Poetry of Lawnmowers’,
Critical Quarterly (2025). DOI: 10.1111/criq.12818

Find out more from Francesca in this short film:

Francesca Gardner on the lawn at St Catharine's College, Cambridge, in front of a lawnmower
Francesca Gardner at St Catharine’s College, Cambridge
Portrait of Andrew Marvell attributed to Godfrey Kneller in the collection of Marvell’s alma mater, Trinity College, Cambridge
Francis Place (1647-1728), A Study for the Pilkington Crest, a mower with a scythe (undated). Yale Center for British Art, Paul Mellon Collection
Francis Place (1647-1728), A mower with a scythe (undated)
A black and white photo of a man wearing a suit mowing a lawn
Photograph of a man mowing a lawn in the mid-20th Century
Horse-drawn mowing of the meadow at King's College, Cambridge
Mowing the meadow at King’s College, Cambridge in 2022

Published 17th May 2025



Image credits

University of Cambridge: Title image; Francesca Gardner
Sarah Laval via Flickr: mower in motion (banner)

Trinity College, Cambridge: Portrait of Andrew Marvell
Yale Center for British Art, Paul Mellon Collection: Mower with a scythe
Mike Finn via Flickr: Hedgehog
Dave’s Archive via Flickr: two archive photographs of men mowing
Lloyd Mann: Meadow mowing at King’s College, Cambridge


Removing ovaries and fallopian tubes linked to lower risk of early death among certain breast cancer patients

Doctor and patient making a mammography
Doctor and patient making a mammography
Credit: pixelfit (Getty Images)

Women diagnosed with breast cancer who carry particular BRCA1 and BRCA2 genetic variants are offered surgery to remove the ovaries and fallopian tubes as this dramatically reduces their risk of ovarian cancer. Now, Cambridge researchers have shown that this procedure – known as bilateral salpingo-oophorectomy (BSO) – is associated with a substantial reduction in the risk of early death among these women, without any serious side-effects.

Our findings will be crucial for counselling women with cancer linked to one of the BRCA1 and BRCA2 variants, allowing them to make informed decisions about whether or not to opt for this operation – Antonis Antoniou

Women with certain variants of the genes BRCA1 and BRCA2 have a high risk of developing ovarian and breast cancer. These women are recommended to have their ovaries and fallopian tubes removed at a relatively early age – between the ages of 35 and 40 for BRCA1 carriers, and between 40 and 45 for BRCA2 carriers.

BSO has previously been shown to reduce the risk of developing ovarian cancer among these women by 80%, but there is concern about unintended consequences of removing the body’s main source of oestrogen, which brings on early menopause. This can be especially challenging for BRCA1 and BRCA2 carriers with a history of breast cancer, as they typically cannot be offered hormone replacement therapy to manage symptoms. The overall impact of BSO in BRCA1 and BRCA2 carriers with a prior history of breast cancer remains uncertain.

Ordinarily, researchers would assess the benefits and risks associated with BSO through randomised controlled trials, the ‘gold standard’ for testing how well treatments work. However, to do so in women who carry the BRCA1 and BRCA2 variants would be unethical as it would put them at substantially greater risk of developing ovarian cancer.

To work around this problem, a team at the University of Cambridge, in collaboration with the National Disease Registration Service (NDRS) in NHS England, turned to electronic health records and data from NHS genetic testing laboratories, collected and curated by NDRS, to examine the long-term outcomes of BSO among BRCA1 and BRCA2 pathogenic variant carriers diagnosed with breast cancer. The results of their study, the first large-scale study of its kind, are published today in The Lancet Oncology.

The team identified a total of 3,400 women carrying one of the BRCA1 and BRCA2 cancer-causing variants (around 1,700 women for each variant). Around 850 of the BRCA1 carriers and 1,000 of the BRCA2 carriers had undergone BSO surgery.

Women who underwent BSO were around half as likely to die from cancer or any other cause over the follow-up period (a median of 5.5 years). This reduction was more pronounced in BRCA2 carriers than in BRCA1 carriers (a 56% reduction versus 38%, respectively). These women were also at around a 40% lower risk of developing a second cancer.

Although the team say it is impossible to say with 100% certainty that BSO causes this reduction in risk, they argue that the evidence points strongly towards this conclusion.

Importantly, the researchers found no link between BSO and increased risk of other long-term outcomes such as heart disease and stroke, or with depression. This is in contrast to previous studies that found evidence in the general population of an association between BSO and increased risk of these conditions.

First author Hend Hassan, a PhD student at the Centre for Cancer Genetic Epidemiology, Department of Public Health and Primary Care, and Wolfson College, Cambridge, said: “We know that removing the ovaries and fallopian tubes dramatically reduces the risk of ovarian cancer, but there’s been a question mark over the potential unintended consequences that might arise from the sudden onset of menopause that this causes.

“Reassuringly, our research has shown that for women with a personal history of breast cancer, this procedure brings clear benefits in terms of survival and a lower risk of other cancers without the adverse side effects such as heart conditions or depression.”

Most women undergoing BSO were white. Black and Asian women were around half as likely to have BSO compared to white women. Women who lived in less deprived areas were more likely to have BSO compared to those in the most-deprived category.

Hassan added: “Given the clear benefits that this procedure provides for at-risk women, it’s concerning that some groups of women are less likely to undergo it. We need to understand why this is and encourage uptake among these women.”

Professor Antonis Antoniou, from the Department of Public Health and Primary Care, the study’s senior author, said: “Our findings will be crucial for counselling women with cancer linked to one of the BRCA1 and BRCA2 variants, allowing them to make informed decisions about whether or not to opt for this operation.”

Professor Antoniou, who is also Director of the Cancer Data-Driven Detection programme, added: “The study also highlights the power of exceptional NHS datasets in driving impactful, clinically relevant research.”

The research was funded by Cancer Research UK, with additional support from the National Institute for Health and Care Research (NIHR) Cambridge Biomedical Research Centre.

The University of Cambridge is fundraising for a new hospital that will transform how we diagnose and treat cancer. Cambridge Cancer Research Hospital, a partnership with Cambridge University Hospitals NHS Foundation Trust, will treat patients across the East of England, but the research that takes place there promises to change the lives of cancer patients across the UK and beyond. Find out more here.

Reference

Hassan, H et al. Long-term health outcomes of bilateral salpingo-oophorectomy in BRCA1 and BRCA2 pathogenic variant carriers with personal history of breast cancer: a retrospective cohort study using linked electronic health records. Lancet Oncology; 7 May 2025; DOI: 10.1016/S1470-2045(25)00156-1




Significant gaps in NHS care for patients who are deaf or have hearing loss, study finds

A male doctor sits next to a male patient in a waiting room while holding a digital tablet. In the background a nurse chats to a family while holding a digital tablet.
Credit: sturti via Getty Images

A majority of individuals who are deaf or have hearing loss face significant communication barriers when accessing care through the National Health Service (NHS), with nearly two-thirds of patients missing half or more of vital information shared during appointments.

Better communication for deaf patients benefits everyone. We’re not just pointing out problems – we’re providing practical solutions. – Bhavisha Parmar

A team of patients, clinicians, researchers and charity representatives, led by the University of Cambridge and the British Society of Audiology, surveyed over 550 people who are deaf or have hearing loss about their experiences with the NHS – making it the largest study of its kind. Their findings, reported in the journal PLOS One, highlight systemic failures and suggest changes and recommendations for improving deaf-aware communication in the NHS.

“The real power of this study lies in the stories people shared,” said lead author Dr Bhavisha Parmar from Cambridge’s Department of Clinical Neurosciences (Sound Lab) and UCL Ear Institute. “Patients weren’t just rating their experiences – they were telling us how these barriers affect every part of their healthcare journey, and in many cases, why they avoid healthcare altogether.”

The study found that, despite it being a legal requirement under the Accessible Information Standard, NHS patients have inadequate and inconsistent access to British Sign Language (BSL) interpreters and other accessibility accommodations, such as hearing loop systems.

Nearly two-thirds (64.4%) of respondents reported missing at least half of the important information during appointments, and only a third (32%) expressed satisfaction with NHS staff communication skills. Respondents said they had to rely on family members or advocates to communicate with healthcare workers, raising privacy and consent concerns.

The research found that communication barriers extend across the entire patient journey – from booking appointments to receiving results. Simple actions, like calling a patient’s name in a waiting room or giving instructions during a scan, become anxiety-inducing when basic accommodations are lacking. Respondents noted that hearing aids often must be removed for X-rays or MRI scans, leaving them struggling or unable to follow verbal instructions.

“We heard over and over that patients fear missing their name being called, or avoid making appointments altogether,” said Parmar. “These aren’t isolated experiences – this is a systemic issue.”

The idea for the study was sparked by real-life experiences shared online by NHS patients, particularly audiology patients – a field Parmar believes should lead by example. “We’re audiologists: we see more patients with hearing loss than anyone else in the NHS,” she said. “If we’re not deaf-aware, then how can we expect other parts of the NHS to be?”

The research team included NHS patients with deafness or hearing loss, who contributed to study design, data analysis, and report writing. As part of the study, they received training in research methods, ensuring the work was grounded in and reflective of lived experiences.

Co-author Zara Musker, current England Deaf Women’s futsal captain and winner of Deaf Sports Personality of the Year 2023, said her disappointing experiences with the NHS in part motivated her to qualify as an audiologist.

“The research is extremely important as I have faced my own experiences of inadequate access, and lack of deaf awareness in NHS healthcare not just in the appointment room but the whole process of booking appointments, being in the waiting room, interacting with clinicians and receiving important healthcare information,” said Musker. “I really hope that the results will really highlight that NHS services are still not meeting the needs of patients. Despite this, the study also highlights ways that the NHS can improve, and recommendations are suggested by those who face these barriers within healthcare.”

The researchers have also released a set of recommendations that could improve accessibility in the NHS, such as:

  • Mandatory deaf awareness and communication training for NHS staff
  • Consistent provision of interpreters and alert systems across all NHS sites
  • Infrastructure improvements, such as text-based appointment systems and visual waiting room alerts
  • The creation of walk-through assessments at hospitals to ensure accessibility across the full patient journey

“This is a legal obligation, not a luxury,” said Parmar. “No one should have to write down their symptoms in a GP appointment or worry they’ll miss their name being called in a waiting room. These are simple, solvable issues.”

A practice guidance resource – developed in consultation with patients and driven by this research – is open for feedback until 15 June and will be made publicly available as a free tool to help clinicians and NHS services improve deaf awareness. People can submit feedback at the British Society of Audiology website.

“Ultimately, better communication for deaf patients benefits everyone,” Parmar said. “We’re not just pointing out problems – we’re providing practical solutions.”

Reference:
Bhavisha Parmar et al. ‘“I always feel like I’m the first deaf person they have ever met.” Deaf Awareness, Accessibility and Communication in the United Kingdom’s National Health Service (NHS): How can we do better?’ PLOS One (2025). DOI: 10.1371/journal.pone.0322850

https://youtube.com/watch?v=gEsEZDBsEQY



source: cam.ac.uk

Adolescents with mental health conditions use social media differently than their peers, study suggests

Teenage boy with smart phone
Credit: D-Keine via Getty

One of the first studies in this area to use clinical-level diagnoses reveals a range of differences between young people with and without mental health conditions when it comes to social media – from changes in mood to time spent on sites.

Young people with a diagnosable mental health condition report differences in their experiences of social media compared to those without a condition, including greater dissatisfaction with online friend counts and more time spent on social media sites.

This is according to a new study led by the University of Cambridge, which suggests that adolescents with ‘internalising’ conditions such as anxiety and depression report feeling particularly affected by social media.

Young people with these conditions are more likely to report comparing themselves to others on social media, feeling a lack of self-control over time spent on the platforms, as well as changes in mood due to the likes and comments received.

Researchers found that adolescents with any mental health condition report spending more time on social media than those without a mental health condition, amounting to an average of roughly 50 minutes extra on a typical day.*

The study, led by Cambridge’s Medical Research Council Cognition and Brain Sciences Unit (MRC CBU), analysed data from a survey of 3,340 adolescents in the UK aged between 11 and 19 years old, conducted by NHS Digital in 2017.**

It is one of the first studies on social media use among adolescents to utilise multi-informant clinical assessments of mental health. These were produced by professional clinical raters interviewing young people, along with their parents and teachers in some cases.***

“The link between social media use and youth mental health is hotly debated, but hardly any studies look at young people already struggling with clinical-level mental health symptoms,” said Luisa Fassi, a researcher at Cambridge’s MRC CBU and lead author of the study, published in the journal Nature Human Behaviour.

“Our study doesn’t establish a causal link, but it does show that young people with mental health conditions use social media differently than young people without a condition.

“This could be because mental health conditions shape the way adolescents interact with online platforms, or perhaps social media use contributes to their symptoms. At this stage, we can’t say which comes first – only that these differences exist,” Fassi said.

The researchers set a high evidential bar for the study, benchmarking against existing research into sleep, physical activity and mental health: only differences in social media use that were at least as strongly associated with mental health conditions as the known differences in sleep and exercise between people with and without such conditions were deemed significant.

While mental health was measured with clinical-level assessments, social media use came from questionnaires completed by study participants, who were not asked about specific platforms.****

As well as time spent on social media, all mental health conditions were linked to greater dissatisfaction with the number of online friends. “Friendships are crucial during adolescence as they shape identity development,” said Fassi.

“Social media platforms assign a concrete number to friendships, making social comparisons more conspicuous. For young people struggling with mental health conditions, this may increase existing feelings of rejection or inadequacy.”

Researchers looked at differences in social media use between young people with internalising conditions, such as anxiety, depression and PTSD, and externalising conditions, such as ADHD or conduct disorders.

The majority of differences in social media use were reported by young people with internalising conditions. For example, ‘social comparison’ – comparing themselves to others online – was twice as high in adolescents with internalising conditions (48%, around one in two) as in those without a mental health condition (24%, around one in four).

Adolescents with internalising conditions were also more likely to report mood changes in response to social media feedback (28%, around one in four) compared to those without a mental health condition (13%, around one in eight). They also reported lower levels of self-control over time spent on social media and a reduced willingness to be honest about their emotional state when online.*****

“Some of the differences in how young people with anxiety and depression use social media reflect what we already know about their offline experiences. Social comparison is a well-documented part of everyday life for these young people, and our study shows that this pattern extends to their online world as well,” Fassi said.

By contrast, other than time spent on social media, researchers found few differences between young people with externalising conditions and those without a condition.

“Our findings provide important insights for clinical practice, and could help to inform future guidelines for early intervention,” said Cambridge’s Dr Amy Orben, senior author of the study.

“However, this study has only scratched the surface of the complex interplay between social media use and mental health. The fact that this is one of the first large-scale and high-quality studies of its kind shows the lack of systemic investment in this space.”

Added Fassi: “So many factors can be behind why someone develops a mental health condition, and it’s very hard to get at whether social media use is one of them.”

“A huge question like this needs lots of research that combines experimental designs with objective social media data on what young people are actually seeing and doing online.”

“We need to understand how different types of social media content and activities affect young people with a range of mental health conditions such as those living with eating disorders, ADHD, or depression. Without including these understudied groups, we risk missing the full picture.”

Notes:

*Study participants were asked to rate their social media use on a typical school day and a typical weekend or holiday day. This was given as a nine-point scale, ranging from less than 30 minutes to over seven hours. Responses from adolescents with any mental health condition approached on average ‘three to four hours’, compared to adolescents without a condition, who averaged between ‘one to two hours’ and ‘two to three hours’.
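As an illustration of how such banded self-reports are often summarised, one common approach is to assign each band its midpoint in minutes and average across respondents. The sketch below is illustrative only: the intermediate band boundaries and the respondent data are assumptions, not taken from the study’s actual coding scheme.

```python
# Hypothetical midpoint coding for a nine-band screen-time scale running from
# 'less than 30 minutes' to 'over seven hours'. The intermediate bands below
# are assumed for illustration; the study's exact bands may differ.
BAND_MINUTES = {
    1: 15,   # < 30 min           -> midpoint ~15 min
    2: 45,   # 30 min - 1 hour
    3: 90,   # 1 - 2 hours
    4: 150,  # 2 - 3 hours
    5: 210,  # 3 - 4 hours
    6: 270,  # 4 - 5 hours
    7: 330,  # 5 - 6 hours
    8: 390,  # 6 - 7 hours
    9: 450,  # > 7 hours          -> nominal 7.5 hours
}

def mean_minutes(band_responses):
    """Average self-reported screen time, in minutes, for a list of band codes."""
    return sum(BAND_MINUTES[b] for b in band_responses) / len(band_responses)

# Invented respondents: one group skewing toward '3-4 hours', one lower.
print(mean_minutes([5, 5, 4, 6]))  # 210.0 -> within the '3-4 hours' band
print(mean_minutes([3, 4, 3, 4]))  # 120.0 -> around two hours
```

Under a coding like this, a between-group gap of roughly one band corresponds to tens of minutes per day – the scale of difference the study reports.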

The category of all mental health conditions in the study includes several conditions that are classed as neither internalising nor externalising, such as sleep disorders and psychosis. However, the numbers of adolescents with these conditions are comparatively small.

**The survey was conducted as part of NHS Digital’s Mental Health of Children and Young People Survey (MHCYP) and is nationally representative of this age group in the UK. The researchers only used data from those who provided answers on social media use (50% male, 50% female).

*** Previous studies have mainly used self-reported questionnaires (e.g. a depression severity questionnaire) to capture mental health symptoms and conditions in participants.

**** The researchers point out that, as responses on social media use were self-reported, those with mental health conditions may be perceiving they spend more time on social media rather than actually doing so. They say that further research with objective data is required to provide definitive answers.

***** For data on social media use, study participants were asked to rate the extent to which they agree with a series of statements on a five-point Likert scale. The statements ranged from ‘I compare myself to others on social media’ to ‘I am happy with the number of friends I have on social media’.

Researchers divided responses into ‘disagree’ (responses 1 to 3) and ‘agree’ (responses 4 and 5) and then calculated the proportion of adolescents agreeing separately for each diagnostic group to aid with public communication of the findings.
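The dichotomisation described above is straightforward to express in code. The sketch below is illustrative only – the data, group labels and statement are invented, and this is not the study’s analysis pipeline. It collapses 1–5 Likert scores into agree/disagree and computes the proportion agreeing in each diagnostic group.

```python
from collections import defaultdict

def agreement_by_group(responses):
    """responses: iterable of (group, score) pairs, score on a 1-5 Likert scale.
    Returns {group: proportion who 'agree'}, where scores 4-5 count as 'agree'
    and 1-3 as 'disagree', mirroring the split described in the notes."""
    agree, total = defaultdict(int), defaultdict(int)
    for group, score in responses:
        total[group] += 1
        if score >= 4:
            agree[group] += 1
    return {g: agree[g] / total[g] for g in total}

# Invented example responses to a statement such as
# 'I compare myself to others on social media':
data = [
    ("internalising", 5), ("internalising", 4), ("internalising", 2),
    ("none", 1), ("none", 4), ("none", 2), ("none", 3),
]
print(agreement_by_group(data))  # internalising: ~0.67, none: 0.25
```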




source: cam.ac.uk

Adolescents who sleep longer perform better at cognitive tasks

Teenager asleep and wrapped in a blanket
Credit: harpazo_hope (Getty Images)

Adolescents who sleep for longer – and from an earlier bedtime – than their peers tend to have improved brain function and perform better at cognitive tests, researchers from the UK and China have shown.

Even though the differences in the amount of sleep that each group got was relatively small, we could still see differences in brain structure and activity and in how well they did at tasks
Barbara Sahakian

But the study of adolescents in the US also showed that even those with better sleeping habits were not reaching the amount of sleep recommended for their age group.

Sleep plays an important role in helping our bodies function. It is thought that while we are asleep, toxins that have built up in our brains are cleared out, and brain connections are consolidated and pruned, enhancing memory, learning, and problem-solving skills. Sleep has also been shown to boost our immune systems and improve our mental health.

During adolescence, our sleep patterns change. We tend to start going to bed later and sleeping less, which affects our body clocks. All of this coincides with a period of important development in our brain function and cognitive development. The American Academy of Sleep Medicine recommends between eight and 10 hours of sleep a night for this age group.

Professor Barbara Sahakian from the Department of Psychiatry at the University of Cambridge said: “Regularly getting a good night’s sleep is important in helping us function properly, but while we know a lot about sleep in adulthood and later life, we know surprisingly little about sleep in adolescence, even though this is a crucial time in our development. How long do young people sleep for, for example, and what impact does this have on their brain function and cognitive performance?”

Studies looking at how much sleep adolescents get usually rely on self-reporting, which can be inaccurate. To get around this, a team led by researchers at Fudan University, Shanghai, and the University of Cambridge turned to data from the Adolescent Brain Cognitive Development (ABCD) Study, the largest long-term study of brain development and child health in the United States.

As part of the ABCD Study, more than 3,200 adolescents aged 11-12 years old had been given Fitbits, allowing the researchers to look at objective data on their sleep patterns and to compare it against brain scans and results from cognitive tests. The team double-checked their results against two additional groups of 13-14-year-olds, totalling around 1,190 participants. The results are published today in Cell Reports.

The team found that the adolescents could be divided broadly into one of three groups:

Group One, accounting for around 39% of participants, slept an average (mean) of 7 hours 10 mins. They tended to go to bed and fall asleep the latest and wake up the earliest.

Group Two, accounting for 24% of participants, slept an average of 7 hours 21 mins. They had average levels across all sleep characteristics.

Group Three, accounting for 37% of participants, slept an average of 7 hours 25 mins. They tended to go to bed and fall asleep the earliest and had lower heart rates during sleep.

Although the researchers found no significant differences in school achievement between the groups, when it came to cognitive tests looking at aspects such as vocabulary, reading, problem solving and focus, Group Three performed better than Group Two, which in turn performed better than Group One.

Group Three also had the largest brain volume and best brain function, while Group One had the smallest volume and poorest brain function.

Professor Sahakian said: “Even though the differences in the amount of sleep that each group got was relatively small, at just over a quarter-of-an-hour between the best and worst sleepers, we could still see differences in brain structure and activity and in how well they did at tasks. This drives home to us just how important it is to have a good night’s sleep at this important time in life.”

First author Dr Qing Ma from Fudan University said: “Although our study can’t answer conclusively whether young people have better brain function and perform better at tests because they sleep better, there are a number of studies that would support this idea. For example, research has shown the benefits of sleep on memory, especially on memory consolidation, which is important for learning.”

The researchers also assessed the participants’ heart rates, finding that Group Three had the lowest heart rates across the sleep states and Group One the highest. Lower heart rates are usually a sign of better health, whereas higher rates often accompany poorer-quality sleep, such as restless sleep, frequent awakenings and excessive daytime sleepiness.

Because the ABCD Study is a longitudinal study – that is, one that follows its participants over time – the team was able to show that the differences in sleep patterns, brain structure and function, and cognitive performance tended to be present two years before and two years after the snapshot that they looked at.

Senior author Dr Wei Cheng from Fudan University added: “Given the importance of sleep, we now need to look at why some children go to bed later and sleep less than others. Is it because of playing videogames or smartphones, for example, or is it just that their body clocks do not tell them it’s time to sleep until later?”

The research was supported by the National Key R&D Program of China, National Natural Science Foundation of China, National Postdoctoral Foundation of China and Shanghai Postdoctoral Excellence Program. The ABCD Study is supported by the National Institutes of Health.

Reference

Ma, Q et al. Neural correlates of device-based sleep characteristics in adolescents. Cell Reports; 22 Apr 2025; DOI: 10.1016/j.celrep.2025.115565




source: cam.ac.uk

Charles Darwin Archive recognised by UNESCO

Two of Charles Darwin’s pocket notebooks in Cambridge University Library
Credit: Cambridge University Library

Documentary heritage relating to the life and work of Charles Darwin has been recognised on the prestigious UNESCO International Memory of the World Register, highlighting its critical importance to global science and the necessity of its long-term preservation and accessibility.

We could not be prouder of UNESCO’s recognition of this remarkable documentary heritage
Jessica Gardner

The UNESCO Memory of the World Programme serves as the documentary heritage equivalent of UNESCO World Heritage Sites, protecting invaluable records that tell the story of human civilisation.

A collaboration between Cambridge University Library, the Natural History Museum, the Linnean Society of London, English Heritage’s Down House, the Royal Botanic Gardens, Kew and the National Library of Scotland, the Charles Darwin documentary heritage archive provides a unique window into the life and work of one of the world’s most influential natural scientists.

The complete archive, comprising over 20,000 items across the six major institutions, includes Darwin’s records illustrating the development of his ground-breaking theory of evolution and extensive global travels.

At Cambridge University Library, the Darwin Archive is a significant collection of Darwin’s books, experimental notes, correspondence, and photographs, representing his scientific and personal activities throughout his life.

The collection in Cambridge includes Darwin’s pocket notebooks recording early statements of key ideas contributing to his theory of evolution, notably that species are not stable. These provide important insights into the development of his thought and feature the iconic ‘Tree of Life’ diagram which he drew on his return from the voyage of the HMS Beagle.

The Linnean Society of London holds several of Darwin’s letters, manuscripts and books. It is also home to John Collier’s original iconic portrait of Charles Darwin, commissioned by the Society and painted in 1883 to commemorate the first reading of the theory of evolution by natural selection at a Linnean Society meeting in 1858.

At the Natural History Museum, a letter written to his wife Emma in 1844 provides insight into the significance Darwin attached to his species theory research and contains instructions on what she should do in the case of his sudden death. It sits alongside other letters to Museum staff and family members which demonstrate the broad scope of his scientific thinking, research and communication, ranging from caterpillars to volcanoes, dahlias to ants, and the taking of photographs for his third publication, Expression of the Emotions in Man and Animals.

Correspondence with Darwin’s publisher John Murray, held at the National Library of Scotland, documents the transformation of his research into print, including the ground-breaking On the Origin of Species.

At the Royal Botanic Gardens, Kew, documents include a highly significant collection of 44 letters from Darwin to Professor John Stevens Henslow, sent during the HMS Beagle expedition, detailing his travels and the genesis of his theory of evolution as he came into contact with new plants, wildlife and fossils, as well as a rare sketch of the orchid Gavilea patagonica made by Darwin. Other items include a letter from Darwin to his dear friend Joseph Hooker, Director of Kew, in which he requests cotton seeds from Kew’s collections for his research.

Down House (English Heritage) in Kent was both a family home and a place of work where Darwin pursued his scientific interests, carried out experiments, and researched and wrote his many ground-breaking publications until his death in 1882.

The extensive collection amassed by Darwin during his 40 years at Down paints a picture of his professional and personal life and the intersection of the two. The archive here includes over 200 books from Darwin’s personal collection, account books, diaries, the manuscripts of the Journal of the Voyage of the Beagle, and Beagle notebooks and letters. More personal items include scrapbooks, Emma Darwin’s photograph album and Charles Darwin’s will. The collection at Down House has been assembled mainly through the generous donations of Darwin’s descendants.

This inscription marks a significant milestone in recognising Darwin’s legacy, as it brings together materials held by multiple institutions across the UK for the first time, ensuring that his work’s scientific, cultural, and historical value is preserved for future generations.

In line with the ideals of the UNESCO Memory of the World Programme, much of the Darwin archive can be viewed by the public at the partner institutions and locations.

The UNESCO International Memory of the World Register includes some of the UK’s most treasured documentary heritage, such as the Domesday Book and the Shakespeare Documents, alongside more contemporary materials including the personal archive of Sir Winston Churchill. The Charles Darwin archive now joins this esteemed list, underscoring its historical, scientific, and cultural significance.

The inscription of the Charles Darwin archive comes as part of UNESCO’s latest recognition of 75 archives worldwide onto the International Memory of the World Register.

These newly inscribed collections include a diverse range of documents, such as the Draft of the International Bill of Human Rights, the papers of Friedrich Nietzsche, and the Steles of Shaolin Temple (566-1990) in China.

Baroness Chapman of Darlington, Minister of State for International Development, Latin America and the Caribbean at the Foreign, Commonwealth & Development Office (FCDO), said: “The recognition of the Charles Darwin archive on UNESCO’s International Memory of the World Register is a proud moment for British science and heritage.

“Darwin’s work fundamentally changed our understanding of the natural world and continues to inspire scientific exploration to this day. By bringing together extraordinary material from our world class British institutions, this archive ensures that Darwin’s groundbreaking work remains accessible to researchers, students, and curious minds across the globe.”

Ruth Padel, FRSL, FZS, poet, conservationist, great-great-granddaughter of Charles Darwin and King’s College London Professor of Poetry Emerita, said: “How wonderful to see Darwin’s connections to so many outstanding scientific and cultural institutions in the UK reflected in the recognition of his archive on the UNESCO Memory of the World International Register. All these institutions are open to the public so everyone will have access to his documentary heritage.”

Dr Jessica Gardner, University Librarian and Director of Library Services at Cambridge University Libraries (CUL) said: “For all Charles Darwin gave the world, we are delighted by the UNESCO recognition in the Memory of the World of the exceptional scientific and heritage significance of his remarkable archive held within eminent UK institutions.

“Cambridge University Library is home to over 9,000 letters to and from Darwin, as well as his handwritten experimental notebooks, publications, and photographs which have together fostered decades of scholarship and public enjoyment through exhibition, education for schools, and online access.

“We could not be prouder of UNESCO’s recognition of this remarkable documentary heritage at the University of Cambridge, where Darwin was a student at Christ’s College and where his family connections run deep across the city, and are reflected in his namesake, Darwin College.”

Read the full, illustrated version of this story on the University Library’s site.




source: cam.ac.uk

Throwing a ‘spanner in the works’ of our cells’ machinery could help fight cancer, fatty liver disease… and hair loss

Bald young man, front view
Credit: bob_bosewell (Getty Images)

Fifty years since its discovery, scientists have finally worked out how a molecular machine found in mitochondria, the ‘powerhouses’ of our cells, allows us to make the fuel we need from sugars, a process vital to all life on Earth.

Drugs inhibiting the function of the carrier can remodel how mitochondria work, which can be beneficial in certain conditions
Edmund Kunji

Scientists at the Medical Research Council (MRC) Mitochondrial Biology Unit, University of Cambridge, have worked out the structure of this machine and shown how it operates like the lock on a canal to transport pyruvate – a molecule generated in the body from the breakdown of sugars – into our mitochondria.

Known as the mitochondrial pyruvate carrier, this molecular machine was first proposed to exist in 1971, but it has taken until now for scientists to visualise its structure at the atomic scale using cryo-electron microscopy, a technique used to magnify an image of an object to around 165,000 times its real size. Details are published today in Science Advances.

Dr Sotiria Tavoulari, a Senior Research Associate from the University of Cambridge, who first determined the composition of this molecular machine, said: “Sugars in our diet provide energy for our bodies to function. When they are broken down inside our cells they produce pyruvate, but to get the most out of this molecule it needs to be transferred inside the cell’s powerhouses, the mitochondria. There, it helps increase 15-fold the energy produced in the form of the cellular fuel ATP.”

Maximilian Sichrovsky, a PhD student at Hughes Hall and joint first author of the study, said: “Getting pyruvate into our mitochondria sounds straightforward, but until now we haven’t been able to understand the mechanism of how this process occurs. Using state-of-the-art cryo-electron microscopy, we’ve been able to show not only what this transporter looks like, but exactly how it works. It’s an extremely important process, and understanding it could lead to new treatments for a range of different conditions.”

Mitochondria are surrounded by two membranes. The outer one is porous, and pyruvate can easily pass through, but the inner membrane is impermeable to pyruvate. To transport pyruvate into the mitochondrion, first an outer ‘gate’ of the carrier opens, allowing pyruvate to enter the carrier. This gate then closes, and the inner gate opens, allowing the molecule to pass through into the mitochondrion.

“It works like the locks on a canal but on the molecular scale,” said Professor Edmund Kunji from the MRC Mitochondrial Biology Unit, and a Fellow at Trinity Hall, Cambridge. “There, a gate opens at one end, allowing the boat to enter. It then closes and the gate at the opposite end opens to allow the boat smooth transit through.”

Because of its central role in controlling the way mitochondria operate to produce energy, this carrier is now recognised as a promising drug target for a range of conditions, including diabetes, fatty liver disease, Parkinson’s disease, specific cancers, and even hair loss.

Pyruvate is not the only energy source available to us. Our cells can also take their energy from fats stored in the body or from amino acids in proteins. Blocking the pyruvate carrier would force the body to look elsewhere for its fuel – creating opportunities to treat a number of diseases. In fatty liver disease, for example, blocking pyruvate entry into mitochondria could encourage the body to use potentially dangerous fat that has been stored in liver cells.

Likewise, there are certain tumour cells that rely on pyruvate metabolism, such as in some types of prostate cancer. These cancers tend to be very ‘hungry’, producing excess pyruvate transport carriers to ensure they can feed more. Blocking the carrier could then starve these cancer cells of the energy they need to survive, killing them.

Previous studies have also suggested that inhibiting the mitochondrial pyruvate carrier may reverse hair loss. Activation of human follicle cells, which are responsible for hair growth, relies on metabolism and, in particular, the generation of lactate. When the mitochondrial pyruvate carrier is blocked, pyruvate cannot enter the mitochondria in these cells and is instead converted to lactate.

Professor Kunji said: “Drugs inhibiting the function of the carrier can remodel how mitochondria work, which can be beneficial in certain conditions. Electron microscopy allows us to visualise exactly how these drugs bind inside the carrier to jam it – a spanner in the works, you could say. This creates new opportunities for structure-based drug design in order to develop better, more targeted drugs. This will be a real game changer.”

The research was supported by the Medical Research Council and was a collaboration with the groups of Professors Vanessa Leone at the Medical College of Wisconsin, Lucy Forrest at the National Institutes of Health, and Jan Steyaert at the Free University of Brussels.

Reference

Sichrovsky, M, Lacabanne, D, Ruprecht, JJ, Rana, JJ, et al. ‘Molecular basis of pyruvate transport and inhibition of the human mitochondrial pyruvate carrier.’ Science Advances; 18 Apr 2025; DOI: 10.1126/sciadv.adw1489



The text in this work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License. Images, including our videos, are Copyright ©University of Cambridge and licensors/contributors as identified. All rights reserved. We make our image and video content available in a number of ways – on our main website under its Terms and conditions, and on a range of channels including social media that permit your use and sharing of our content under their respective Terms.

source: cam.ac.uk

Extreme drought contributed to barbarian invasion of late Roman Britain, tree-ring study reveals

Milecastle 39 on Hadrian's Wall
Milecastle 39 on Hadrian’s Wall
Credit: Adam Cuerden

Three consecutive years of drought contributed to the ‘Barbarian Conspiracy’, a pivotal moment in the history of Roman Britain, a new Cambridge-led study reveals. Researchers argue that Picts, Scotti and Saxons took advantage of famine and societal breakdown caused by an extreme period of drought to inflict crushing blows on weakened Roman defences in 367 CE. While Rome eventually restored order, some historians argue that the province never fully recovered.

Our findings provide an explanation for the catalyst of this major event.
Charles Norman

The ‘Barbarian Conspiracy’ of 367 CE was one of the most severe threats to Rome’s hold on Britain since the Boudiccan revolt three centuries earlier. Contemporary sources indicate that components of the garrison on Hadrian’s Wall rebelled and allowed the Picts to attack the Roman province by land and sea. Simultaneously, the Scotti from modern-day Ireland invaded broadly in the west, and Saxons from the continent landed in the south.

Senior Roman commanders were captured or killed, and some soldiers reportedly deserted and joined the invaders. Throughout the spring and summer, small groups roamed and plundered the countryside. Britain’s descent into anarchy was disastrous for Rome, and it took two years for generals dispatched by Valentinian I, Emperor of the Western Roman Empire, to restore order. The final remnants of official Roman administration left Britain some 40 years later, around 410 CE.

The University of Cambridge-led study, published today in Climatic Change, used oak tree-ring records to reconstruct temperature and precipitation levels in southern Britain during and after the ‘Barbarian Conspiracy’ in 367 CE. Combining this data with surviving Roman accounts, the researchers argue that severe summer droughts in 364, 365 and 366 CE were a driving force in these pivotal events.

First author Charles Norman, from Cambridge’s Department of Geography, said: “We don’t have much archaeological evidence for the ‘Barbarian Conspiracy’. Written accounts from the period give some background, but our findings provide an explanation for the catalyst of this major event.”

The researchers found that southern Britain experienced an exceptional sequence of remarkably dry summers from 364 to 366 CE. In the period 350 to 500 CE, average monthly reconstructed rainfall in the main growing season (April–July) was 51 mm. But in 364 CE, it fell to just 29 mm. 365 CE was even worse, with 28 mm, and 37 mm the following year kept the area in crisis.

Professor Ulf Büntgen, from Cambridge’s Department of Geography, said: “Three consecutive droughts would have had a devastating impact on the productivity of Roman Britain’s most important agricultural region. As Roman writers tell us, this resulted in food shortages with all of the destabilising societal effects this brings.”

Between 1836 and 2024 CE, southern Britain experienced droughts of a similar magnitude only seven times – mostly in recent decades, and none of them consecutive – emphasising how exceptional the Roman-era droughts were. The researchers identified no other major droughts in southern Britain in the period 350–500 CE, and found that other parts of northwestern Europe escaped these conditions.

Roman Britain’s main crops were spelt wheat and six-row barley. Because the province had a wet climate, sowing these crops in spring was more viable than in winter, but this made them vulnerable to late spring and early summer moisture deficits, and early summer droughts could lead to total crop failure.

The researchers point to surviving accounts written by Roman chroniclers to corroborate these drought-driven grain deficits. By 367 CE, Ammianus Marcellinus described the population of Britain as being in the ‘utmost conditions of famine’.

“Drought from 364 to 366 CE would have impacted spring-sown crop growth substantially, triggering poor harvests,” Charles Norman said. “This would have reduced the grain supply to Hadrian’s Wall, providing a plausible motive for the rebellion there which allowed the Picts into northern Britain.”

The study suggests that, given the crucial role of grain in the contract between soldiers and the army, grain deficits may have contributed to other desertions in this period, and therefore to a general weakening of the Roman army in Britain. In addition, the geographic isolation of Roman Britain likely combined with the severity of the prolonged drought to reduce Rome’s ability to alleviate the deficits.

Ultimately the researchers argue that military and societal breakdown in Roman Britain provided an ideal opportunity for peripheral tribes, including the Picts, Scotti and Saxons, to invade the province en masse with the intention of raiding rather than conquest. Their finding that the most severe conditions were restricted to southern Britain undermines the idea that famines in other provinces might have forced these tribes to invade.

Andreas Rzepecki, from the Generaldirektion Kulturelles Erbe Rheinland-Pfalz, said: “Our findings align with the accounts of Roman chroniclers and the seemingly coordinated nature of the ‘Conspiracy’ suggests an organised movement of strong onto weak, rather than a more chaotic assault had the invaders been in a state of desperation.”

“The prolonged and extreme drought seems to have occurred during a particularly poor period for Roman Britain, in which food and military resources were being stripped for the Rhine frontier, while immigratory pressures increased.”

“These factors limited resilience, and meant a drought induced, partial-military rebellion and subsequent external invasion were able to overwhelm the weakened defences.”

The researchers expanded their climate-conflict analysis to the entire Roman Empire for the period 350–476 CE. They reconstructed the climate conditions immediately before and after 106 battles and found that a statistically significant number of battles were fought following dry years.

Tatiana Bebchuk, from Cambridge’s Department of Geography, said: “The relationship between climate and conflict is becoming increasingly clear in our own time so these findings aren’t just important for historians. Extreme climate conditions lead to hunger, which can lead to societal challenges, which eventually lead to outright conflict.”

Charles Norman, Ulf Büntgen, Paul Krusic and Tatiana Bebchuk are based at the Department of Geography, University of Cambridge; Lothar Schwinden and Andreas Rzepecki are from the Generaldirektion Kulturelles Erbe Rheinland-Pfalz in Trier. Ulf Büntgen is also affiliated with the Global Change Research Institute, Czech Academy of Sciences and the Department of Geography, Masaryk University in Brno.

Reference

C Norman, L Schwinden, P Krusic, A Rzepecki, T Bebchuk, U Büntgen, ‘Droughts and conflicts during the late Roman period’, Climatic Change (2025). DOI: 10.1007/s10584-025-03925-4

Funding

Charles Norman was supported by Wolfson College, University of Cambridge (John Hughes PhD Studentship). Ulf Büntgen received funding from the Czech Science Foundation (# 23-08049S; Hydro8), the ERC Advanced Grant (# 882727; Monostar), and the ERC Synergy Grant (# 101118880; Synergy-Plague).




source: cam.ac.uk

Mouse study suggests a common diabetes drug may prevent leukaemia

Brown lab mouse on blue gloved hand
Brown lab mouse on blue gloved hand
Credit: University of Cambridge

Metformin, a widely used and affordable diabetes drug, could prevent a form of acute myeloid leukaemia in people at high risk of the disease, a study in mice has suggested. Further research in clinical trials will be needed to confirm this works for patients.

We’ve done the extensive research all the way from cell-based studies to human data, so we’re now at the point where we have made a strong case for moving ahead with clinical trials
Brian Huntly

Around 3,100 people are diagnosed with acute myeloid leukaemia (AML) each year in the UK. It is an aggressive form of blood cancer that is very difficult to treat. Thanks to recent advances, individuals at high risk of AML can be identified years in advance using blood tests and blood DNA analysis, but there’s no suitable treatment that can prevent them from developing the disease.

In this study, Professor George Vassiliou and colleagues at the University of Cambridge investigated how to prevent abnormal blood stem cells with genetic changes from progressing to become AML. The work focused on the most common genetic change, which affects a gene called DNMT3A and is responsible for starting 10-15% of AML cases.

Professor Vassiliou, from the Cambridge Stem Cell Institute at the University of Cambridge and Honorary Consultant Haematologist at Cambridge University Hospitals NHS Foundation Trust (CUH), co-led the study. He said: “Blood cancer poses unique challenges compared to solid cancers like breast or prostate, which can be surgically removed if identified early. With blood cancers, we need to identify people at risk and then use medical treatments to stop cancer progression throughout the body.”

The research team examined blood stem cells from mice with the same changes in DNMT3A as seen in the pre-cancerous cells in humans. Using a genome-wide screening technique, they showed that these cells depend more on mitochondrial metabolism than healthy cells, making this a potential weak spot. The researchers went on to confirm that metformin, and other mitochondria-targeting drugs, substantially slowed the growth of mutation-bearing blood cells in mice. Further experiments also showed that metformin could have the same effect on human blood cells with the DNMT3A mutation.

Dr Malgorzata Gozdecka, Senior Research Associate at the Cambridge Stem Cell Institute and first author of the research, said: “Metformin is a drug that impacts mitochondrial metabolism, and these pre-cancerous cells need this energy to keep growing. By blocking this process, we stop the cells from expanding and progressing towards AML, whilst also reversing other effects of the mutated DNMT3A gene.”

In addition, the study looked at data from over 412,000 UK Biobank volunteers and found that people taking metformin were less likely to have changes in the DNMT3A gene. This link remained even after accounting for factors that could have confounded the results, such as diabetes status and BMI.

Professor Brian Huntly, Head of the Department of Haematology at the University of Cambridge, Honorary Consultant Haematologist at CUH, and joint lead author of the research, added: “Metformin appears highly specific to this mutation rather than being a generic treatment. That specificity makes it especially compelling as a targeted prevention strategy.

“We’ve done the extensive research all the way from cell-based studies to human data, so we’re now at the point where we have made a strong case for moving ahead with clinical trials. Importantly, metformin’s lack of toxicity will be a major advantage as it is already used by millions of people worldwide with a well-established safety profile.”

The results of the study, funded by Blood Cancer UK with additional support from Cancer Research UK, the Leukemia & Lymphoma Society (USA) and the Wellcome Trust, are published in Nature.

Dr Rubina Ahmed, Director of Research at Blood Cancer UK, said: “Blood cancer is the third biggest cancer killer in the UK, with over 280,000 people currently living with the disease. Our Blood Cancer Action plan shed light on the shockingly low survival for acute myeloid leukaemia, with only around 2 in 10 surviving for 5 years, and we urgently need better strategies to save lives. Repurposing safe, widely available drugs like metformin means we could potentially get new treatments to people faster, without the need for lengthy drug development pipelines.”

The next phase of this research will focus on clinical trials to test metformin’s effectiveness in people with changes in DNMT3A who are at increased risk of developing AML. With metformin already approved and widely used for diabetes, this repurposing strategy could dramatically reduce the time it takes to bring a new preventive therapy to patients.

Tanya Hollands, Research Information Manager at Cancer Research UK, who contributed funding for the lab-based screening in mice, said: “It’s important that we work to find new ways to slow down or prevent AML in people at high risk. Therefore, it’s positive that the findings of this study suggest a possible link between a commonly-used diabetes drug and prevention of AML progression in some people. While this early-stage research is promising, clinical trials are now needed to find out if this drug could benefit people. We look forward to seeing how this work progresses.”

Reference
Gozdecka, M et al. Mitochondrial metabolism sustains DNMT3A-R882-mutant clonal haematopoiesis. Nature; 16 Apr 2025; DOI: 10.1038/s41586-025-08980-6

Adapted from a press release from Blood Cancer UK




source: cam.ac.uk

Growing wildflowers on disused urban land can damage bee health

Chicory growing on unused land in Cleveland, USA.
Chicory growing in a vacant lot
Credit: Sarah Scott

Wildflowers growing on land previously used for buildings and factories can accumulate lead, arsenic and other metal contaminants from the soil, which are consumed by pollinators as they feed, a new study has found.

Our results should not discourage people from planting wildflowers in towns and cities. But… it’s important to consider the history of the land and what might be in the soil.
Sarah Scott

The metals have previously been shown to damage the health of pollinators, which ingest them in nectar as they feed, leading to reduced population sizes and death. Even low nectar metal levels can have long-term effects, by affecting bees’ learning and memory – which impacts their foraging ability.

Researchers have found that common plants including white clover and bindweed, which are vital forage for pollinators in cities, can accumulate arsenic, cadmium, chromium and lead from contaminated soils.

Metal contamination is an issue in the soils of cities worldwide, with the level of contamination usually increasing with the age of a city. The metals come from a huge range of sources including cement dust and mining.

The researchers say soils in cities should be tested for metals before sowing wildflowers and, if necessary, polluted areas should be cleaned up before new wildflower habitats are established.

The study highlights the importance of growing the right species of wildflowers to suit the soil conditions.

Reducing the risk of metal exposure is critical for the success of urban pollinator conservation schemes. The researchers say it is important to manage wildflower species that self-seed on contaminated urban land, for example by frequent mowing to limit flowering – which reduces the transfer of metals from the soil to the bees.

The results are published today in the journal Ecology and Evolution.

Dr Sarah Scott in the University of Cambridge’s Department of Zoology and first author of the report, said: “It’s really important to have wildflowers as a food source for the bees, and our results should not discourage people from planting wildflowers in towns and cities.

“We hope this study will raise awareness that soil health is also important for bee health. Before planting wildflowers in urban areas to attract bees and other pollinators, it’s important to consider the history of the land and what might be in the soil – and if necessary find out whether there’s a local soil testing and cleanup service available first.”

The study was carried out in the post-industrial US city of Cleveland, Ohio, which has over 33,700 vacant lots left as people have moved away from the area. In the past, iron and steel production, oil refining and car manufacturing went on there. But any land that was previously the site of human activity may be contaminated with traces of metals.

To get their results, the researchers extracted nectar from a range of self-seeded flowering plants that commonly attract pollinating insects, found growing on disused land across the city. They tested this for the presence of arsenic, cadmium, chromium and lead. Lead was consistently found at the highest concentrations, reflecting the state of the soils in the city.

The researchers found that different species of plant accumulate different amounts, and types, of the metals. Overall, the bright blue-flowered chicory plant (Cichorium intybus) accumulated the largest total metal concentration, followed by white clover (Trifolium repens), wild carrot (Daucus carota) and bindweed (Convolvulus arvensis). These plants are all vital forage for pollinators in cities – including cities in the UK – providing a consistent supply of nectar across locations and seasons.

There is growing evidence that wild pollinator populations have dropped by over 50% in the last 50 years, driven primarily by the loss of flower-rich habitat through changes in land use and management across the globe. Climate change and pesticide use also play a role.

Pollinators play a vital role in food production: many plants, including apple and tomato, require pollination in order to develop fruit. Natural ‘pollination services’ are estimated to add billions of dollars to global crop productivity.

Scott said: “Climate change feels so overwhelming, but simply planting flowers in certain areas can help towards conserving pollinators, which is a realistic way for people to make a positive impact on the environment.”

The research was funded primarily by the USDA National Institute of Food and Agriculture.

Reference
Scott, SB and Gardiner, MM: ‘Trace metals in nectar of important urban pollinator forage plants: A direct exposure risk to pollinators and nectar-feeding animals in cities.’ Ecology and Evolution, April 2025. DOI: 10.1002/ece3.71238




source: cam.ac.uk

Complete clean sweep for Cambridge at The Boat Race 2025

Credit: Row360

Cambridge is celebrating a complete clean sweep at The Boat Race 2025, with victories in all four openweight races and both lightweight races.

Thousands of spectators lined the banks of the River Thames on 13 April to witness a dramatic afternoon of action, with millions more following live on the BBC.

Cambridge Women secured their eighth consecutive win in the 79th Women’s Boat Race, extending their overall record to 49 victories to Oxford’s 30. The Men’s crew, too, were victorious in defending their title in the 170th edition of the event, notching up their 88th win, with Oxford sitting on 81.

Goldie, the Cambridge Men’s Reserve Crew, won the Men’s Reserve Race, while Blondie, the Cambridge Women’s Reserve Crew, won the Women’s Reserve Race. And the day before, the 2025 Lightweight Boat Race also saw two wins for Cambridge.

Cambridge’s Claire Collins said it was an incredible feeling to win the race. 

“This is so cool, it’s really an incredible honour to share this with the whole club,” she said.

The Women’s Race was initially stopped after an oar clash, but Umpire Sir Matthew Pinsent ordered a restart. Claire said the crew had prepared for eventualities such as this, and so were able to lean on their training when it happened.

“I had total confidence in the crew to regroup. Our focus was to get back on pace and get going as soon as possible and that’s what we did.”

For Cambridge Men’s President Luca Ferraro, it was his final Boat Race campaign, having raced in the Blue Boat for the last three years, winning the last two.

He said: “It was a great race. The guys really stepped up. That’s something that our Coach Rob Baker said to us before we went out there, that each of us had to step up individually and come together and play our part in what we were about to do. I couldn’t be prouder of the guys, they really delivered today.”

Professor Deborah Prentice, Vice-Chancellor of the University of Cambridge, congratulated all the crews following the wins.

“I am in awe of these students and what they have achieved, and what Cambridge University Boat Club has been able to create,” she said.

“These students are out in the early hours of the morning training and then trying to make it to 9am lectures. It’s so inspiring. And a complete clean sweep – this was an incredibly impressive showing by Cambridge, I am so proud of them.”

The Cambridge Blue Boats featured student athletes drawn from Christ’s College, Downing College, Emmanuel College, Gonville & Caius, Hughes Hall, Jesus College, Pembroke College, Peterhouse, St Edmund’s, and St John’s.




source: cam.ac.uk

Harmful effects of digital tech – the science ‘needs fixing’, experts argue

Illustration representing potential online harms
Illustration representing potential online harms
Credit: Nuthawut Somsuk via Getty

From social media to AI, online technologies are changing too fast for the scientific infrastructure used to gauge their public health harms, say two leaders in the field.

The scientific methods and resources we have for evidence creation at the moment simply cannot deal with the pace of digital technology development
Dr Amy Orben

Scientific research on the harms of digital technology is stuck in a “failing cycle” that moves too slowly to allow governments and society to hold tech companies to account, according to two leading researchers in a new report published in the journal Science.

Dr Amy Orben from the University of Cambridge and Dr J. Nathan Matias from Cornell University say the pace at which new technology is deployed to billions of people has put unbearable strain on the scientific systems trying to evaluate its effects.

They argue that big tech companies effectively outsource research on the safety of their products to independent scientists at universities and charities who work with a fraction of the resources – while firms also obstruct access to essential data and information. This is in contrast to other industries where safety testing is largely done “in house”.

Orben and Matias call for an overhaul of “evidence production” assessing the impact of technology on everything from mental health to discrimination.  

Their recommendations include accelerating the research process, so that policy interventions and safer designs are tested in parallel with initial evidence gathering, and creating registries of tech-related harms informed by the public.    

“Big technology companies increasingly act with perceived impunity, while trust in their regard for public safety is fading,” said Orben, of Cambridge’s MRC Cognition and Brain Sciences Unit. “Policymakers and the public are turning to independent scientists as arbiters of technology safety.”

“Scientists like ourselves are committed to the public good, but we are asked to hold to account a billion-dollar industry without appropriate support for our research or the basic tools to produce good quality evidence quickly.”

“We must urgently fix this science and policy ecosystem so we can better understand and manage the potential risks posed by our evolving digital society,” said Orben.

‘Negative feedback cycle’

In the latest Science paper, the researchers point out that technology companies often follow policies of rapidly deploying products first and then looking to “debug” potential harms afterwards. This includes distributing generative AI products to millions before completing basic safety tests, for example.

When tasked with understanding potential harms of new technologies, researchers rely on “routine science” which – having driven societal progress for decades – now lags the rate of technological change to the extent that it is becoming at times “unusable”.  

With many citizens pressuring politicians to act on digital safety, Orben and Matias argue that technology companies use the slow pace of science and lack of hard evidence to resist policy interventions and “minimize their own responsibility”.

Even if research gets appropriately resourced, they note that researchers will be faced with understanding products that evolve at an unprecedented rate.

“Technology products change on a daily or weekly basis, and adapt to individuals. Even company staff may not fully understand the product at any one time, and scientific research can be out of date by the time it is completed, let alone published,” said Matias, who leads Cornell’s Citizens and Technology (CAT) Lab.

“At the same time, claims about the inadequacy of science can become a source of delay in technology safety when science plays the role of gatekeeper to policy interventions,” Matias said.

“Just as oil and chemical industries have leveraged the slow pace of science to deflect the evidence that informs responsibility, executives in technology companies have followed a similar pattern. Some have even allegedly refused to commit substantial resources to safety research without certain kinds of causal evidence, which they also decline to fund.” 

The researchers lay out the current “negative feedback cycle”:

Tech companies do not adequately resource safety research, shifting the burden to independent scientists who lack data and funding. This means high-quality causal evidence is not produced in required timeframes, which weakens government’s ability to regulate – further disincentivising safety research, as companies are let off the hook.

Orben and Matias argue that this cycle must be redesigned, and offer ways to do it.

Reporting digital harms

To speed up the identification of harms caused by online technologies, policymakers or civil society could construct registries for incident reporting, and encourage the public to contribute evidence when they experience harms.

Similar methods are already used in fields such as environmental toxicology where the public reports on polluted waterways, or vehicle crash reporting programs that inform automotive safety, for example.

“We gain nothing when people are told to mistrust their lived experience due to an absence of evidence when that evidence is not being compiled,” said Matias.

Existing registries, from mortality records to domestic violence databases, could also be augmented to include information on the involvement of digital technologies such as AI.

The paper’s authors also outline a “minimum viable evidence” system, in which policymakers and researchers adjust the “evidence threshold” required to show potential technological harms before starting to test interventions.

These evidence thresholds could be set by panels made up of affected communities, the public, or “science courts”: expert groups assembled to make rapid assessments.   

“Causal evidence of technological harms is often required before designers and scientists are allowed to test interventions to build a safer digital society,” said Orben. 

“Yet intervention testing can be used to scope ways to help individuals and society, and pinpoint potential harms in the process. We need to move from a sequential system to an agile, parallelised one.”

Under a minimum viable evidence system, if a company obstructs or fails to support independent research, and is not transparent about their own internal safety testing, the amount of evidence needed to start testing potential interventions would be decreased.

Orben and Matias also suggest learning from the success of “Green Chemistry”, which sees an independent body hold lists of chemical products ranked by potential for harm, to help incentivise markets to develop safer alternatives.

“The scientific methods and resources we have for evidence creation at the moment simply cannot deal with the pace of digital technology development,” Orben said.  

“Scientists and policymakers must acknowledge the failures of this system and help craft a better one before the age of AI further exposes society to the risks of unchecked technological change.”

Added Matias: “When science about the impacts of new technologies is too slow, everyone loses.”



The text in this work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License. Images, including our videos, are Copyright ©University of Cambridge and licensors/contributors as identified. All rights reserved. We make our image and video content available in a number of ways – on our main website under its Terms and conditions, and on a range of channels including social media that permit your use and sharing of our content under their respective Terms.

source: cam.ac.uk

Cambridge research: First global bond index to address fossil fuel expansion

University of Cambridge researchers based at the Department of Land Economy have selected index provider Bloomberg Index Services Limited to launch the first global corporate bond index to cover fossil fuel producers, utilities, insurance, and financing, with the aim of driving investment to reduce real-economy emissions.

“This is an enormously impactful project which showcases the high-quality research undertaken at Cambridge” – Anthony Odgers, University of Cambridge Chief Financial Officer

This is a critical – and hugely challenging – moment for climate action. Legal and political pressures have paralysed asset managers and other financial service providers, leading to a recent wave of actors leaving investor climate coalitions. However, asset owners are increasingly seeing the need to take a leadership role in addressing climate change, which threatens the long-term future of their portfolios and the wider economy.

That’s why we are delighted to announce that we have selected index provider Bloomberg Index Services Limited to launch this first-of-its-kind global corporate bond index, covering fossil fuel producers, utilities, insurance, and financing, with the aim of driving investment to reduce real-economy emissions.


“We are delighted that this project has reached such a key milestone,” said Professor Martin Dixon, Head of the Department of Land Economy. “As a multidisciplinary department with a focus on outstanding academic publication and teaching, this project has the potential to serve as a ‘systems demonstrator’ for ongoing research in this important area.”

Why a bond index?

The launch of a bond index by an 816-year-old institution is an unusual undertaking and a tale worth telling. It began with a peer-reviewed paper by Dr Ellen Quigley, Principal Research Associate at Land Economy, exploring the case for evidence-based climate impact by institutional investors. This was followed by an internal feasibility study based at Jesus College, Cambridge (which continues to co-host the project), and supported by several other parts of the University.

With feasibility assessed, the team went out to global index providers to explore their interest. All of the leading players were interested in building this index, yet all grappled with a lack of access to data and the complexity of assessing companies based on their activities (e.g., whether they were building new fossil fuel infrastructure), not their business classification. An extensive Request for Proposals process resulted in naming Bloomberg Index Services Limited as our provider. The project aims to provide a genuine solution for asset owners looking to align their corporate debt instruments with their climate targets and to avoid both ineffective blanket interventions and greenwashing.

The central problem, on which the industry has faltered for decades, is how to manage the risk presented by a fossil fuel industry that continues to grow. Leading climate scenarios such as the International Energy Agency’s Net Zero by 2050 scenario are clear that fossil fuel expansion is inconsistent with the transition to a decarbonised economy.  With approximately 90% of new financing for fossil fuel expansion coming from bonds and bank loans, debt markets must be the focus of investor efforts to transition away from fossil fuel expansionism. Bonds offer a larger pool of capital than equities, and a greater proportion are purchased in the primary market, where companies gain access to new capital.

The past decade has seen a significant rise in passive investment strategies and therefore an increase in financial flows into index funds, which have as a consequence become significant ‘auto-allocators’ of capital. This research project aims to study the extent to which the new bond index influences cost, volume, and access to capital among companies who are seeking to build new fossil fuel infrastructure and delaying the phase-down of their operations. Bond markets are not just a key part of investor action on climate change: they are the very coalface of fossil fuel expansion, i.e. new gas, oil, and coal extraction and infrastructure.

“This is an enormously impactful project which showcases the high-quality research undertaken at Cambridge,” University of Cambridge Chief Financial Officer Anthony Odgers said.  “The index is a game-changer for the growing number of asset owners who invest in corporate debt and understand its impact on fossil fuel expansion, particularly the construction of new fossil fuel infrastructure such as coal- and gas-fired power plants which risk locking in fossil fuel usage for decades.”

“Once the index launches, Cambridge expects to invest some of its own money against financial products referencing it. This will enable us to align our fixed income holdings with our institution-wide objectives,” Odgers said.

There are currently no off-the-shelf products that allow for passive investment in global corporate bond markets without financing fossil fuel expansion, whether through fossil fuel production, through utilities building new coal- and gas-fired power plants, or through the banks and insurers that continue to finance and underwrite these activities. By supporting the development of this ‘systems demonstrator’, we will be able to conduct essential research on the efficacy of such a lever.

“Instead of linear year-on-year reductions or blanket bans by business classification, the index methodology identifies companies that present the greatest systemic risks to investors, while ensuring that those companies that meet the criteria can rejoin the bond index,” said project leader Lily Tomson, a Senior Research Associate at Jesus College, Cambridge. 

Several years of close collaboration with leading global asset owners such as the California State Teachers’ Retirement System (CalSTRS), the Universities Superannuation Scheme (USS), the Swiss Federal Pension Fund PUBLICA and the United Nations Joint Staff Pension Fund (UNJSPF) provided the input and technical market expertise that underpin the index. Alongside the University of Cambridge, the UNJSPF will invest against the index at launch.

“Finally, large asset owners around the world have an index for this market that aims to discourage the expansion of fossil fuels,” said Pedro Guazo, Representative of the Secretary-General (RSG) for the investment of the UNJSPF assets.

Rules-based engagement: a lever for behaviour change

Debt benchmarks have a key role to play in any real efforts to tackle the expansion of fossil fuels. This project is innovative because it focuses on exclusions and weightings of companies based on their current corporate activity, instead of using an approach that relies on blanket exclusions by business classification (which does not generate incentives to change behaviour). For example, a company might be classed as a fossil fuel company, but if it stops expanding new fossil fuel operations and aligns with an appropriate phase-down pathway, it has the opportunity to be included in the index and, as a result, to gain access to capital via funds that track the index.
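As a rough illustration of this activity-based logic, the sketch below decides eligibility by what a company is currently doing rather than by its business classification. Every field name and the eligibility rule itself are invented for illustration; the actual index methodology, developed with Bloomberg, is far more detailed.

```python
# Toy sketch of rules-based index inclusion: classification does not decide
# membership, current activity does, and a company that changes behaviour can
# (re-)enter the index. Field names and thresholds are illustrative only.
from dataclasses import dataclass

@dataclass
class Issuer:
    name: str
    sector: str                    # business classification (NOT the deciding factor)
    expanding_fossil_fuels: bool   # current activity, e.g. building new plants
    on_phase_down_pathway: bool    # aligned with an appropriate phase-down

def eligible(issuer: Issuer) -> bool:
    """Include or exclude by what the company is doing now, not what it is
    called: a 'fossil fuel' issuer that stops expanding and aligns with a
    phase-down pathway regains eligibility, and with it access to capital."""
    return (not issuer.expanding_fossil_fuels) and issuer.on_phase_down_pathway

utility_a = Issuer("UtilityA", "utilities",
                   expanding_fossil_fuels=True, on_phase_down_pathway=False)
utility_b = Issuer("UtilityB", "utilities",
                   expanding_fossil_fuels=False, on_phase_down_pathway=True)
```

Note that both issuers share the same classification; only their behaviour separates them, which is what creates the incentive to change.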

Across the project, we are using data sources that have never previously been used to build an index – for example, the Global Coal Exit List (GCEL) and Global Oil and Gas Exit List (GOGEL) from Urgewald. We are taking a novel approach that focuses investor attention on those actors that our framework considers ‘edge cases’: companies close to reaching, or moving away from, alignment with the index. Companies have the option of being (re-)included in the index if they change their behaviour to align with the rules of the index. Academic literature suggests this is a lever for behaviour change in equities, but as an approach it is new to debt market indices. This is one of many key hypotheses that this project tests. We are convening a community of leading global academics who will support the creation of this new form of rules-based bondholder engagement.

This bond index project is one of a suite of actions rooted in academic research and collaboration that have been developed by the collegiate University. Alongside 74 other higher education institutions, Cambridge is delivering a parallel project focused on cash deposits and money market funds. We will continue to conduct research as the associated new products begin to operate through 2025.

At a time when climate damage is growing rapidly and is visible in news stories around the world, many actors across investment markets are looking for a clear path to take necessary action. As an academic institution and a long-term investor, the University of Cambridge is committed to supporting evidence-based research and action on climate change.

The bond index will be launched later this year. If you are interested in finding out more about the project or the team’s research, contact us here: bondindex@landecon.cam.ac.uk.




source: cam.ac.uk

Scientists create ‘metal detector’ to hunt down tumours

Serena Nik-Zainal at the Early Cancer Institute
Credit: University of Cambridge

Cambridge researchers have created a ‘metal detector’ algorithm that can hunt down vulnerable tumours, in a development that could one day revolutionise the treatment of cancer.

“Genomic sequencing is now far faster and cheaper than ever before. We are getting closer to the point where getting your tumour sequenced will be as routine as a scan or blood test” – Serena Nik-Zainal

In a paper published today in Nature Genetics, scientists at the University of Cambridge and NIHR Cambridge Biomedical Research Centre analysed the full DNA sequence of 4,775 tumours from seven types of cancer. They used data from Genomics England’s 100,000 Genomes Project to create an algorithm capable of identifying tumours with faults in their DNA that make them easier to treat.

The algorithm, called PRRDetect, could one day help doctors work out which patients are more likely to have successful treatment. That could pave the way for more personalised treatment plans that increase people’s chances of survival.

The research was funded by Cancer Research UK and the National Institute for Health and Care Research (NIHR).

Professor Serena Nik-Zainal from the Early Cancer Institute at the University of Cambridge, lead author of the study, said: “Genomic sequencing is now far faster and cheaper than ever before. We are getting closer to the point where getting your tumour sequenced will be as routine as a scan or blood test.

“To use genomics most effectively in the clinic, we need tools which give us meaningful information about how a person’s tumour might respond to treatment. This is especially important in cancers where survival is poorer, like lung cancer and brain tumours.

“Cancers with faulty DNA repair are more likely to be treated successfully. PRRDetect helps us better identify those cancers and, as we sequence more and more cancers routinely in the clinic, it could ultimately help doctors better tailor treatments to individual patients.”

The research team looked for patterns in DNA created by so-called ‘indel’ mutations, in which letters are inserted or deleted from the normal DNA sequence.  

They found unusual patterns of indel mutations in cancers that had faulty DNA repair mechanisms – known as ‘post-replicative repair dysfunction’ or PRRd. Using this information, the scientists developed PRRDetect to allow them to identify tumours with this fault from a full DNA sequence.
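The core idea, as described above, is to tally indel mutation patterns in a tumour's genome and flag genomes whose profile is dominated by PRRd-associated patterns. The sketch below is a deliberately simplified toy version of that idea; the pattern names, weighting and cut-off are invented, and the published PRRDetect classifier is far more sophisticated.

```python
# Toy sketch of pattern-based tumour flagging: count indel mutation calls by
# signature and flag genomes dominated by PRRd-linked signatures.
# Signature names and the 0.5 cut-off are invented for illustration.
from collections import Counter

PRRD_PATTERNS = {"ID_A", "ID_B"}  # hypothetical PRRd-linked indel signatures

def prrd_score(indel_calls: list) -> float:
    """Fraction of a tumour's indel calls attributed to PRRd-linked patterns."""
    counts = Counter(indel_calls)
    total = sum(counts.values())
    if total == 0:
        return 0.0
    return sum(counts[p] for p in PRRD_PATTERNS) / total

def looks_prrd(indel_calls: list, cutoff: float = 0.5) -> bool:
    """Flag a genome whose indel profile is dominated by PRRd patterns."""
    return prrd_score(indel_calls) >= cutoff
```

In this toy version a genome with six of ten calls in PRRd-linked patterns would be flagged, while one with a single such call would not.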

PRRd tumours are more sensitive to immunotherapy, a type of cancer treatment that uses the body’s own immune system to attack cancer cells. The scientists hope that the PRRd algorithm could act like a ‘metal detector’ to allow them to identify patients who are more likely to have successful treatment with immunotherapy.

The study follows from a previous ‘archaeological dig’ of cancer genomes carried out by Professor Nik-Zainal, which examined the genomes of tens of thousands of people and revealed previously unseen patterns of mutations which are linked to cancer.

This time, Professor Nik-Zainal and her team looked at cancers which have a higher proportion of tumours with PRRd. These include bowel, brain, endometrial, skin, lung, bladder and stomach cancers. Whole genome sequences of these cancers were provided by the 100,000 Genomes Project – a pioneering study led by Genomics England and NHS England which sequenced 100,000 genomes from around 85,000 NHS patients affected by rare diseases or cancer.

The study identified 37 different patterns of indel mutations across the seven cancer types included in this study. Ten of these patterns were already linked to known causes of cancer, such as smoking and exposure to UV light. Eight of these patterns were linked to PRRd. The remaining 19 patterns were new and could be linked to causes of cancer that are not fully understood yet or mechanisms within cells that can go wrong when a cell becomes cancerous.

Executive Director of Research and Innovation at Cancer Research UK, Dr Iain Foulkes, said: “Genomic medicine will revolutionise how we approach cancer treatment. We can now get full readouts of tumour DNA much more easily, and with that comes a wealth of information about how an individual’s cancer can start, grow and spread.

“Tools like PRRDetect are going to make personalised treatment for cancer a reality for many more patients in the future. Personalising treatment is much more likely to be successful, ensuring more people can live longer, better lives free from the fear of cancer.”

NIHR Scientific Director, Mike Lewis, said: “Cancer is a leading cause of death in the UK so it’s impressive to see our research lead to the creation of a tool to determine which therapy will lead to a higher likelihood of successful cancer treatment.”

Chief Scientific Officer at Genomics England, Professor Matt Brown, said: “Genomics is playing an increasingly important role in healthcare and these findings show how genomic data can be used to drive more predictive, preventative care leading to better outcomes for patients with cancer.

“The creation of this algorithm showcases the immense value of whole genome sequencing not only in research but also in the clinic across multiple diverse cancer types in advancing cancer care.”

The University of Cambridge is fundraising for a new hospital that will transform how we diagnose and treat cancer. Cambridge Cancer Research Hospital, a partnership with Cambridge University Hospitals NHS Foundation Trust, will treat patients across the East of England, but the research that takes place there promises to change the lives of cancer patients across the UK and beyond. Find out more here.

Reference

Koh, GCC et al. Redefined indel taxonomy reveals insights into mutational signatures. Nat Genet; 10 Apr 2025; DOI:

Adapted from a press release from Cancer Research UK




source: cam.ac.uk

Handheld device could transform heart disease screening

Person demonstrating use of a handheld device for heart disease screening
Credit: Acoustics Lab

Researchers have developed a handheld device that could potentially replace stethoscopes as a tool for detecting certain types of heart disease.

“This device could become an affordable and scalable solution for heart health screening, especially in areas with limited medical resources” – Anurag Agarwal

The researchers, from the University of Cambridge, developed a device that makes it easy for people with or without medical training to record heart sounds accurately. Unlike a stethoscope, the device works well even if it’s not placed precisely on the chest: its larger, flexible sensing area helps capture clearer heart sounds than traditional stethoscopes.

The device can also be used over clothing, making it more comfortable for patients – especially women – during routine check-ups or community heart health screening programmes.

The heart sound recordings can be saved on the device, which can then be used to detect signs of heart valve disease. The researchers are also developing a machine learning algorithm which can detect signs of valve disease automatically. The results are reported in the IEEE Journal of Biomedical and Health Informatics.

Heart valve disease (valvular heart disease or VHD) has been called the ‘next cardiac epidemic,’ with a prognosis worse than many forms of cancer. Up to 50% of patients with significant VHD remain undiagnosed, and many patients only see their doctor when the disease has advanced and they are experiencing significant complications.

In the UK, the NHS and NICE have identified early detection of heart valve disease as a key goal, both to improve quality of life for patients, and to decrease costs.

An examination with a stethoscope, or auscultation, is the way that most diagnoses of heart valve disease are made. However, just 38% of patients who present to their GP with symptoms of valve disease receive an examination with a stethoscope.

“The symptoms of VHD can be easily confused with certain respiratory conditions, which is why so many patients don’t receive a stethoscope examination,” said Professor Anurag Agarwal from Cambridge’s Department of Engineering, who led the research. “However, the accuracy of stethoscope examination for diagnosing heart valve disease is fairly poor, and it requires a GP to conduct the examination.”

In addition, a stethoscope examination requires patients to partially undress, which is both time consuming in short GP appointments, and can be uncomfortable for patients, particularly for female patients in routine screening programmes.

The ‘gold standard’ for diagnosing heart valve disease is an echocardiogram, but this can only be done in a hospital and NHS waiting lists are extremely long – between six and nine months at many hospitals.

“To help get waiting lists down, and to make sure we’re diagnosing heart valve disease early enough that simple interventions can improve quality of life, we wanted to develop an alternative to a stethoscope that is easy to use as a screening tool,” said Agarwal.

Agarwal and his colleagues have developed a handheld device, about the diameter of a drinks coaster, that could be a solution. Their device can be used by any health professional to accurately record heart sounds, and can be used over clothes.

While a regular or electronic stethoscope has a single sensor, the Cambridge-developed device has six, meaning it is easier for the doctor or nurse – or even someone without any medical training – to get an accurate reading, simply because the surface area is so much bigger.

The device contains materials that can transmit vibration so that it can be used over clothes, which is particularly important when conducting community screening programmes to protect patient privacy. Between each of the six sensors is a gel that absorbs vibration, so the sensors don’t interfere with each other.
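The advantage of six sensors over one, as described above, is that at least one channel is likely to be well placed even with imprecise positioning, so a simple quality metric can pick the best recording. The sketch below illustrates that idea only; the quality measure used here (signal energy) is an assumption for illustration, not the device's actual method.

```python
# Illustrative sketch: with multiple sensor channels covering a larger area,
# select the channel with the strongest signal. Using raw signal energy as
# the quality metric is an invented simplification.

def best_channel(channels: list) -> int:
    """Return the index of the channel (a list of samples) with the
    highest signal energy (sum of squared samples)."""
    energies = [sum(s * s for s in ch) for ch in channels]
    return max(range(len(channels)), key=energies.__getitem__)
```

With six recordings taken simultaneously, even an untrained user holding the device roughly in place should produce at least one usable channel, which is the intuition behind the larger sensing area.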

The researchers tested the device on healthy participants with different body shapes and sizes and recorded their heart sounds. Their next steps will be to test the device in a clinical setting on a variety of patients, against results from an echocardiogram.

In parallel with the development of the device, the researchers have developed a machine learning algorithm that can use the recorded heart sounds to detect signs of valve disease automatically. Early tests of the algorithm suggest that it outperforms GPs in detecting heart valve disease.  

“If successful, this device could become an affordable and scalable solution for heart health screening, especially in areas with limited medical resources,” said Agarwal.

The researchers say that the device could be a useful tool to triage patients who are waiting for an echocardiogram, so that those with signs of valve disease can be seen in a hospital sooner.

A patent has been filed on the device by Cambridge Enterprise, the University’s commercialisation arm. Anurag Agarwal is a Fellow of Emmanuel College, Cambridge.

Reference:
Andrew McDonald et al. ‘A flexible multi-sensor device enabling handheld sensing of heart sounds by untrained users.’ IEEE Journal of Biomedical and Health Informatics (2025). DOI: 10.1109/JBHI.2025.3551882




source: cam.ac.uk

One in 3,000 people at risk of punctured lung from faulty gene – almost 100 times higher than previous estimate

Person clutching their chest in pain
Credit: wildpixel (Getty Images)

As many as one in 3,000 people could be carrying a faulty gene that significantly increases their risk of a punctured lung, according to new estimates from Cambridge researchers. Previous estimates had put this risk closer to one in 200,000 people.

“If an individual has Birt-Hogg-Dubé syndrome, then it’s very important that we’re able to diagnose it, because they and their family members may also be at risk of kidney cancer” – Stefan Marciniak

The gene in question, FLCN, is linked to a condition known as Birt-Hogg-Dubé syndrome, symptoms of which include benign skin tumours, lung cysts, and an increased risk of kidney cancer.

In a study published today in the journal Thorax, a team from the University of Cambridge examined data from UK Biobank, the 100,000 Genomes Project, and East London Genes & Health – three large genomic datasets encompassing more than 550,000 people.

They discovered that between one in 2,710 and one in 4,190 individuals carries the particular variant of FLCN that underlies Birt-Hogg-Dubé syndrome. But curiously, whereas patients with a diagnosis of Birt-Hogg-Dubé syndrome have a lifetime risk of punctured lung of 37%, in the wider cohort of carriers of the genetic mutation this was lower, at 28%. Even more striking, while patients with Birt-Hogg-Dubé syndrome have a 32% chance of developing kidney cancer, in the wider cohort this was only 1%.

Punctured lung – known as pneumothorax – is caused by an air leak in the lung, resulting in painful lung deflation and shortness of breath. Not every case of punctured lung is caused by a fault in the FLCN gene, however. Around one in 200 tall, thin young men in their teens or early twenties will experience a punctured lung, and for many of them the condition will resolve itself, or doctors will remove air or fluid from their lungs while treating the individual as an outpatient; many will not even know they have the condition.

If an individual experiences a punctured lung and doesn’t fit the common characteristics – for example, if they are in their forties – doctors will look for tell-tale cysts in the lower lungs, visible on an MRI scan. If these are present, then the individual is likely to have Birt-Hogg-Dubé syndrome.

Professor Stefan Marciniak is a researcher at the University of Cambridge and an honorary consultant at Cambridge University Hospitals NHS Foundation Trust and Royal Papworth Hospital NHS Foundation Trust. He co-leads the UK’s first Familial Pneumothorax Rare Disease Collaborative Network with Professor Kevin Blyth at the Queen Elizabeth University Hospital and the University of Glasgow. The aim of the Network is to optimise the care and treatment of patients with rare, inherited forms of familial pneumothorax, and to support research into the condition.

Professor Marciniak said: “If an individual has Birt-Hogg-Dubé syndrome, then it’s very important that we’re able to diagnose it, because they and their family members may also be at risk of kidney cancer.

“The good news is that the punctured lung usually happens 10 to 20 years before the individual shows symptoms of kidney cancer, so we can keep an eye on them, screen them every year, and if we see the tumour it should still be early enough to cure it.”

Professor Marciniak says he was surprised to discover that the risk of kidney cancer was so much lower in carriers of the faulty FLCN gene who have not been diagnosed with Birt-Hogg-Dubé syndrome.

“Even though we’ve always thought of Birt-Hogg-Dubé syndrome as being caused by a single faulty gene, there’s clearly something else going on,” Professor Marciniak said. “The Birt-Hogg-Dubé patients that we’ve been caring for and studying for the past couple of decades are not representative of when this gene is broken in the wider population. There must be something else about their genetic background that’s interacting with the gene to cause the additional symptoms.”

The finding raises the question of whether, if an individual is found to have a faulty FLCN gene, they should be offered screening for kidney cancer. However, Professor Marciniak does not believe this will be necessary.

“With increasing use of genetic testing, we will undoubtedly find more people with these mutations,” he said, “but unless we see the other tell-tale signs of Birt-Hogg-Dubé syndrome, our study shows there’s no reason to believe they’ll have the same elevated cancer risk.”

The research was funded by the Myrovlytis Trust, with additional support from the National Institute for Health and Care Research Cambridge Biomedical Research Centre.

Katie Honeywood, CEO of the Myrovlytis Trust, said: “The Myrovlytis Trust are delighted to have funded such an important project. We have long believed that the prevalence of Birt-Hogg-Dubé syndrome is far higher than previously reported. It highlights the importance of genetic testing for anyone who has any of the main symptoms associated with BHD, including a collapsed lung – and even more so the importance of the medical world being aware of this condition for anyone who presents at an emergency department or clinic with these symptoms. We look forward to seeing the impact this project’s outcome has on the Birt-Hogg-Dubé and wider community.”

Reference
Yngvadottir, B et al. Inherited predisposition to pneumothorax: Estimating the frequency of Birt-Hogg-Dubé syndrome from genomics and population cohorts. Thorax; 8 April 2025; DOI: 10.1136/thorax-2024-221738




source: cam.ac.uk

Researchers demonstrate the UK’s first long-distance ultra-secure communication over a quantum network

Digital abstract background
Credit: MR.Cole_Photographer via Getty Images

Researchers have successfully demonstrated the UK’s first long-distance ultra-secure transfer of data over a quantum communications network, including the UK’s first long-distance quantum-secured video call.

The team, from the Universities of Bristol and Cambridge, created the network, which uses standard fibreoptic infrastructure, but relies on a variety of quantum phenomena to enable ultra-secure data transfer.

The network uses two types of quantum key distribution (QKD) scheme: ‘unhackable’ encryption keys hidden inside particles of light, and distributed entanglement, a phenomenon that causes quantum particles to be intrinsically linked.
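The first scheme, keys carried by single particles of light, is typically a prepare-and-measure protocol in the BB84 family. The toy simulation below illustrates only the classical "sifting" step of such a protocol, under the assumption of an ideal, noiseless, eavesdropper-free channel; real QKD additionally involves photon transmission hardware, error-rate estimation and privacy amplification.

```python
# Toy BB84-style sifting simulation (no photons, no eavesdropper, no noise):
# Alice sends bits in random bases, Bob measures in random bases, and after
# publicly comparing bases they keep only the positions where bases matched.
import random

def bb84_sift(n_bits: int, seed: int = 0) -> list:
    rng = random.Random(seed)
    alice_bits  = [rng.randint(0, 1) for _ in range(n_bits)]
    alice_bases = [rng.randint(0, 1) for _ in range(n_bits)]  # 0: rectilinear, 1: diagonal
    bob_bases   = [rng.randint(0, 1) for _ in range(n_bits)]
    # With matching bases (and an ideal channel) Bob measures Alice's bit
    # exactly; mismatched-basis positions are discarded during sifting.
    return [bit for bit, a, b in zip(alice_bits, alice_bases, bob_bases)
            if a == b]

key = bb84_sift(1000)  # roughly half the positions survive sifting
```

Security in the real protocol comes from physics: measuring a photon in the wrong basis disturbs it, so an eavesdropper inevitably raises the error rate that Alice and Bob check for before trusting the key.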

The researchers demonstrated the capabilities of the network via a live, quantum-secure video conference link, the transfer of encrypted medical data, and secure remote access to a distributed data centre. The data was successfully transmitted between Bristol and Cambridge – a fibre distance of over 410 kilometres.

This is the first time that a long-distance network, encompassing different quantum-secure technologies such as entanglement distribution, has been successfully demonstrated. The researchers presented their results at the 2025 Optical Fiber Communications Conference (OFC) in San Francisco.

Quantum communications offer unparalleled security advantages compared to classical telecommunications solutions. These technologies are immune to future cyber-attacks, even those mounted with quantum computers, which – once fully developed – will have the potential to break even the strongest cryptographic methods currently in use.

In the past few years, researchers have been working to build and use quantum communication networks. China recently set up a massive network that covers 4,600 kilometres by connecting five cities using both fibreoptics and satellites. In Madrid, researchers created a smaller network with nine connection points that use different types of QKD to securely share information.

In 2019, researchers at Cambridge and Toshiba demonstrated a metro-scale quantum network operating at record key rates of millions of key bits per second. And in 2020, researchers in Bristol built a network that could share entanglement between multiple users. Similar quantum network trials have been demonstrated in Singapore, Italy and the USA.

Despite this progress, until now no one had built a large, long-distance network that could handle both types of QKD, entanglement distribution, and regular data transmission all at once.

The experiment demonstrates the potential of quantum networks to accommodate different quantum-secure approaches simultaneously with classical communications infrastructure. It was carried out using the UK’s Quantum Network (UKQN), established over the last decade by the same team, supported by funding from the Engineering and Physical Sciences Research Council (EPSRC), and as part of the Quantum Communications Hub project.

“This is a crucial step toward building a quantum-secured future for our communities and society,” said co-author Dr Rui Wang, Lecturer for Future Optical Networks in the Smart Internet Lab’s High Performance Network Research Group at the University of Bristol. “More importantly, it lays the foundation for a large-scale quantum internet—connecting quantum nodes and devices through entanglement and teleportation on a global scale.”

“This marks the culmination of more than ten years of work to design and build the UK Quantum Network,” said co-author Adrian Wonfor from Cambridge’s Department of Engineering. “Not only does it demonstrate the use of multiple quantum communications technologies, but also the secure key management systems required to allow seamless end-to-end encryption between us.”

“This is a significant step in delivering quantum security for the communications we all rely upon in our daily lives at a national scale,” said co-author Professor Richard Penty, also from Cambridge and who headed the Quantum Networks work package in the Quantum Communications Hub. “It would not have been possible without the close collaboration of the two teams at Cambridge and Bristol, the support of our industrial partners Toshiba, BT, Adtran and Cisco, and our funders at UKRI.”

“This is an extraordinary achievement which highlights the UK’s world-class strengths in quantum networking technology,” said Gerald Buller, Director of the IQN Hub, based at Heriot-Watt University. “This exciting demonstration is precisely the kind of work the Integrated Quantum Networks Hub will support over the coming years, developing the technologies, protocols and standards which will establish a resilient, future-proof, national quantum communications infrastructure.”

The current UKQN comprises two metropolitan quantum networks around Bristol and Cambridge, connected via a ‘backbone’ of four long-distance optical fibre links spanning 410 kilometres with three intermediate nodes.

The network uses single-mode fibre over the EPSRC National Dark Fibre Facility (which provides dedicated fibre for research purposes), and low-loss optical switches allowing network reconfiguration of both classical and quantum signal traffic.

The team will pursue this work further through a newly funded EPSRC project, the Integrated Quantum Networks Hub, whose vision is to establish quantum networks at all distance scales, from local networking of quantum processors to national-scale entanglement networks for quantum-safe communication, distributed computing and sensing, all the way to intercontinental networking via low-earth orbit satellites.

Reference:
R. Yang et al. ‘A UK Nationwide Heterogeneous Quantum Network.’ Paper presented at the 2025 Optical Fiber Communications Conference and Exhibition (OFC): https://www.ofcconference.org/en-us/home/schedule/



The text in this work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License. Images, including our videos, are Copyright ©University of Cambridge and licensors/contributors as identified. All rights reserved. We make our image and video content available in a number of ways – on our main website under its Terms and conditions, and on a range of channels including social media that permit your use and sharing of our content under their respective Terms.

source: cam.ac.uk

Turbocharging the race to protect NATURE and CLIMATE with AI

By Jacqueline Garget

“We’re in a time of unprecedented change. We must accelerate progress towards equitably rebalancing how humans and nature coexist across the world. AI is our chance to do it!”

Anil Madhavapeddy, Professor of Planetary Computing, Department of Computer Science and Technology

Across Cambridge, researchers are using AI to transform climate and biodiversity research.

Improving land use decisions to benefit nature

“Around one third of the world’s land surface has been transformed for human use in the last 60 years. It’s mad just how terrible local decision-making can be for preserving global biodiversity. By protecting one place we’re often just outsourcing the impact to somewhere else,” says Anil Madhavapeddy, Professor of Planetary Computing in the Department of Computer Science and Technology.

Madhavapeddy is part of the team building a new AI tool called Terra – a predictive model of all terrestrial life on Earth. It’s a hugely complex task. The aim is to understand, then help decision-makers try to reverse, ecosystem deterioration and biodiversity loss while also producing enough food, energy and water for our needs.

Terra will combine extensive terrestrial data with earth observation data from satellites and drones, predictively filling in the blanks to build accurate global maps of biodiversity and human activity, and to reveal the world’s biodiversity hotspots.

The tool will help governments and businesses predict the global biodiversity impact of land-use decisions about vital human activities, like farming. It will vastly improve analysis of the effects of climate change, and the accuracy of biodiversity monitoring.

“Every bit of land on the planet is used for something – either by humans, or by nature. With Terra we’re trying to map it all.”

“We’ll use this to ask which land is most valuable for nature, and which for humanity, on a global scale, and show the potential impact of any land-use decision – aiming to protect the highly biodiverse areas,” says Madhavapeddy.

“We’re also working with Bill Sutherland’s Conservation Evidence project, which is gathering all human knowledge of biodiversity conservation to see which interventions are most effective for each species. The Holy Grail is to combine the observational data with this knowledge-based data. Then we can build really accurate maps of the world.”

“We can’t just rewild everything or humans will starve, and we can’t expand agricultural land so everyone can eat low-intensity organic food because there won’t be enough wildlife left. Terra could help us find the solutions.”

Anil Madhavapeddy holds a map representing the extinction impact as a result of converting natural land to arable use.

Speeding up effective biodiversity conservation

Professor Bill Sutherland has an ambitious goal: “to change the culture so it becomes unthinkable not to use evidence in conservation projects.” He wants to stop time and money being wasted on nature conservation projects that don’t work – something he’s seen surprisingly often.

Twenty years, and the equivalent of 75 years of researchers’ time, after he began the project, his freely available Conservation Evidence database is being used by a wide range of conservation organisations and policy makers, and resulting in better outcomes for nature.

But with up to one million species still facing extinction, things need to happen faster.

The team is using Large Language Models (LLMs) – a type of AI that can understand and process text, learning as it goes – to help comb through the vast conservation literature. They’re training it using their existing database, aiming for an automatic review system where AI can keep adding evidence from the ever-growing number of published papers: 300 million at last count.

“The problem is that it takes a long time to summarise evidence of what works in conservation. The team has read 1.6 million papers in 17 languages, and new science is being generated all the time,” says Sutherland, Professor of Conservation Biology in the Department of Zoology, adding:

“AI can make us much more efficient. It can find papers it thinks are suitable for our database, summarise them, classify them, and explain its decisions. It’s just incredible!”
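A minimal sketch of such a screening pipeline might look like the following. The keyword scoring here is a deliberately crude stand-in for the team’s LLM classifier, and all names and thresholds are illustrative:

```python
# Toy paper-screening pipeline: score abstracts for relevance, keep likely
# candidates for the evidence database. The real system replaces
# relevance_score() with an LLM that also summarises, classifies and
# explains its decisions.
KEYWORDS = {"conservation": 2, "intervention": 2, "species": 1, "habitat": 1}

def relevance_score(abstract: str) -> int:
    """Sum the weights of keywords that appear in the abstract."""
    words = set(abstract.lower().split())
    return sum(w for kw, w in KEYWORDS.items() if kw in words)

def screen(abstracts: list[str], threshold: int = 3) -> list[str]:
    """Keep abstracts whose score clears the relevance threshold."""
    return [a for a in abstracts if relevance_score(a) >= threshold]

papers = [
    "A conservation intervention restoring habitat for a wading bird species",
    "Electron transport in quantum dot arrays",
]
print(screen(papers))  # keeps only the first paper
```

The value of the LLM version is that the same pass can also extract and summarise the evidence, rather than merely filtering.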

Planetary Computing Fellow Dr Sadiq Jaffer, in the Department of Computer Science and Technology, is part of the interdisciplinary team building a ‘Conservation Co-Pilot’ based on the system. This will guide decision-makers towards the best way to manage specific land types to conserve particular species – and should be available within a year.

“The Conservation Co-Pilot will enable people to get answers to specific questions using the Conservation Evidence database almost instantaneously – quite a contrast to a traditional systematic review, which might take a year and cost £100,000,” says Jaffer. “Humans will still make the decisions, but the Co-Pilot will suggest the best course of action for nature, and can massively increase productivity.”

More: A man with a big idea. AI-Driven Conservation Co-Pilot is supported by ai@cam.

Professor Bill Sutherland

Understanding climate complexity for better forecasting

If hurricane warnings were taken more seriously because they were highly accurate, could more lives be saved? If we’d foreseen the current state of our climate fifty years ago, could more have been done to curb global temperature rise?

As the climate warms, the Earth’s natural systems are starting to behave in increasingly unpredictable ways. The models behind both short- and long-term climate forecasts are getting more complex, and huge amounts of data are being gathered, as scientists scramble to work out what’s going on. Machine learning, and software engineers, are becoming vital.

“Reliable forecasts of future climate trends – like temperature rise and sea-level change – are crucial for policy-makers trying to plan for the impacts of climate change,” says Dr Joe Wallwork, a Research Software Engineer at Cambridge’s Institute of Computing for Climate Science (ICCS), adding:

“We need better early warning systems to accurately forecast when and where extreme events will occur.”

“A lot of climate models are limited in resolution,” adds Dr Jack Atkinson, also a Research Software Engineer at ICCS. “Climate processes can happen at very small scales, and machine learning can help us improve the representation of these processes in models.”

Atkinson is lead developer of FTorch, a software library that bridges the gap between traditional climate models and the latest machine learning tools. This breakthrough is improving climate predictions by better representing small-scale processes that are challenging to capture in models. FTorch is now used by major institutions, including the National Center for Atmospheric Research, to enhance climate simulations and inform global policy.
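A common pattern for this kind of coupling (sketched here in Python; the tiny network is purely illustrative and not taken from the project) is to export a trained PyTorch parameterisation as TorchScript, a frozen form that compiled-language code such as a Fortran climate model can then invoke through bindings like FTorch:

```python
import torch

class ToyParameterisation(torch.nn.Module):
    """Illustrative stand-in for an ML parameterisation: maps four column
    variables (say, temperature, humidity, wind, pressure) to one tendency."""
    def __init__(self) -> None:
        super().__init__()
        self.net = torch.nn.Sequential(
            torch.nn.Linear(4, 16), torch.nn.ReLU(), torch.nn.Linear(16, 1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

model = ToyParameterisation().eval()
scripted = torch.jit.script(model)   # freeze the Python model into TorchScript
scripted.save("toy_param.pt")        # the climate model loads this file at runtime
```

Keeping the exported model self-contained means the host climate code never needs a Python interpreter in the loop, which matters on HPC systems.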

Models the team is developing with FTorch rely on fewer assumptions and more real climate data. They’re powering the science of many climate studies – from sea-ice change, to cloud behaviour, to greenhouse gases in the atmosphere.

They’re helping to make climate predictions faster, which has important climate implications too. “It’s an unfortunate irony that climate models require quite a bit of energy to run,” says Wallwork. “With machine learning we can make the models more efficient, running faster so we can potentially lower our emissions.”

Towards more energy-efficient homes

Dr Ronita Bardhan, Associate Professor of Sustainable Built Environment in the Department of Architecture, and her team have developed an algorithm, built on open-source data, that uses satellite images to reveal the heat loss from almost every home in the UK.

This is helping identify the homes at risk of losing the most heat, which could be prioritised to improve their energy performance. Bardhan has identified almost 700 homes in Cambridge alone that are particularly vulnerable to heat loss and hard to decarbonise.
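Once per-building thermal readings exist, one crude way to prioritise homes is simple outlier flagging. The sketch below is an illustrative stand-in only, not Bardhan’s model, which is an AI pipeline over satellite imagery and open data:

```python
from statistics import median

def flag_heat_loss(roof_temps_c: dict[str, float],
                   margin_c: float = 2.0) -> list[str]:
    """Flag homes whose winter roof temperature exceeds the local median by
    margin_c degrees – a rough proxy for above-average heat escaping."""
    area_median = median(roof_temps_c.values())
    return [home for home, t in roof_temps_c.items()
            if t - area_median >= margin_c]

# Hypothetical readings for three neighbouring homes:
readings = {"12 Mill Rd": 4.1, "14 Mill Rd": 4.8, "16 Mill Rd": 9.3}
print(flag_heat_loss(readings))  # flags 16 Mill Rd
```

Comparing against the local median rather than a fixed threshold keeps the flag meaningful across areas with different outdoor temperatures.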

The UK government is currently consulting on whether private rental homes should meet a minimum energy efficiency standard (an EPC rating of C) – with the outcome due in May 2025. A limited number of homes have an EPC rating, but according to Bardhan’s research, over half of the houses in England and Wales would fail this requirement.

“Our aim is to inform policy discussions in support of decarbonisation strategies. We’re using AI to unearth the hidden layers affecting population health, including the energy efficiency of our homes.”

Bardhan adds: “Prioritising the risks faced by individual households must become a central focus in climate change discussions. By capturing thermal images, we can distinctly visualise rooftops, walls, and other structural elements to precisely identify how each building loses heat.”

“This allows us to ask critical questions: How much of this heat could be retained through retrofitting? How can we empower households to become more resilient in the face of a changing climate? It allows us to identify poorly performing homes, and those whose owners most need government support to increase the property’s energy efficiency. To reach net zero by 2050 this must be done as a priority.”

Household heat loss isn’t the only issue that can affect health – indoor overheating can too. In the summer of 2022, thousands of people died in the UK due to extreme heat.

“We want to take maximum advantage of our algorithm – scaling it up to look not only at heat loss from homes for the winter months, but also what happens during the summer months when outside temperatures rise,” she says, adding:

“As the climate warms, buildings need to become more energy-efficient to help combat climate change and avoid excessive energy consumption. We shouldn’t need to use air conditioning in the UK!”

More: https://www.sustainabledesign.arct.cam.ac.uk/projects/decarbonisation. Bardhan’s research is supported by the UK and European Space Agencies. She contributes to the Health-Driven Design for Cities (HD4) project based at CARES in Singapore.

Enhancing forest monitoring and carbon tracking

The immense value of forests for capturing atmospheric carbon and supporting biodiversity is now clear. As market initiatives like carbon and biodiversity credits gain momentum to help offset our environmental impact, the science behind them is evolving too. Two new Cambridge projects are harnessing the power of AI to improve forest monitoring and carbon tracking, by diving into remarkable levels of detail.

“Our AI model will provide stronger evidence to support programmes for carbon accounting and biodiversity credits, by validating large-scale satellite images with extensive, high-quality data collected on the ground,” says Dr Emily Lines in the Department of Geography.

She’s working with Dr Harry Owen, also in the Department of Geography, to develop an AI model trained on data collected across European forests. This includes billions of terrestrial laser scans, drone images, and even manual measurements with a tape. While the sheer volume of data has previously been difficult to manage, AI is now speeding up the process and making the data easier to interpret.

“AI allows us to create high-resolution, high-quality and large-scale estimates of ecosystem properties – including their species composition and ecological function. This opens up new opportunities to rigorously test and validate credit-based valuations of ecosystems,” says Lines.

In a complementary project, researcher Frank Feng in the Department of Computer Science and Technology has developed an app called GreenLens to measure trees on the ground much more quickly and accurately than a person with a tape measure. Estimating tree trunk diameter is key to understanding a tree’s health and its ability to store carbon. The freely available app is a user-friendly tool that provides reliable data to support reforestation projects and advance sustainability efforts.

“GreenLens uses AI-powered computer vision and depth detection to measure tree trunk diameters on affordable Android devices. This makes it easier and faster for researchers, landowners, and communities to collect data and monitor forests – without the need for expensive equipment,” says Feng.
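The geometry behind a depth-based measurement is straightforward: under a pinhole camera model, a trunk’s real width is its width in pixels times its distance from the camera, divided by the focal length in pixels. A minimal sketch (names and numbers illustrative, not from the GreenLens code):

```python
def trunk_diameter_m(width_px: float, depth_m: float, focal_px: float) -> float:
    """Pinhole-camera estimate: real width = pixel width * depth / focal length.
    width_px would come from computer-vision segmentation of the trunk,
    depth_m from the phone's depth sensor, focal_px from camera calibration."""
    return width_px * depth_m / focal_px

# A trunk 300 px wide, seen 2 m away with a 1500 px focal length:
print(trunk_diameter_m(300, 2.0, 1500))  # 0.4 (metres)
```

The appeal is that every quantity on the right-hand side is available on an ordinary phone, which is what removes the need for expensive survey equipment.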

Together these projects demonstrate the growing role of AI and mobile technology in improving forest monitoring, accelerating effective climate action, and supporting biodiversity conservation worldwide.

More: Harnessing AI for Forest Monitoring is supported by ai@cam.

source: cam.ac.uk

AI can be good for our HEALTH and WELLBEING

By Craig Brierley

“If we get things right, the possibilities for AI to transform health and medicine are endless. It can be of massive public benefit. But more than that, it has to be.”

Professors Andres Floto, Mihaela van der Schaar and Eoin McKinney, Cambridge Centre for AI in Medicine

Cambridge researchers are looking at ways that AI can transform everything from drug discovery to Alzheimer’s diagnoses to GP consultations.

Tackling dementia

In 2024, Professor Zoe Kourtzi in the Department of Psychology showed that an AI tool developed by her team could outperform clinical tests at predicting whether people with early signs of dementia will remain stable or develop Alzheimer’s disease. 

At a time of intense pressure on the NHS, tools such as this could help doctors prioritise care for those patients who need it most, while removing the need for invasive and costly diagnostic tests for those whose condition will remain stable. They can also give patients peace of mind that their condition is unlikely to worsen, or, for those less fortunate, it can help them and their families prepare. 

These tools could also be transformational in the search for new drugs, making clinical trials more effective, faster and cheaper, says Kourtzi. 

Recently, two dementia drugs – lecanemab and donanemab – have shown promise in slowing the disease, but the benefits compared to the costs were judged insufficient to warrant approval for use within the NHS. Beyond these, there’s been limited progress in the field. 

Part of the problem is that clinical trials often focus on the wrong people, which is where AI may help to better decide who to include in trials. 

“If you have people that the AI models say will not develop pathology, you won’t want to put them in your trial. They’ll only mess up the statistics, and then [the trials] will never show an effect, no matter if you have the best drug in the world. And if you include people who will progress really fast, it might be already too late for the drug to show benefit.” 

Kourtzi is leading one of ai@cam’s AI-deas projects to create a ‘BrainHealth hub’ to tackle the global brain and mental health crisis. It will bridge the gap between engineers, mathematicians and computer scientists who have the tools but lack the data, and clinicians and neuroscientists who have the data but lack advanced tools to mine them. 

“Our idea is to create a ‘hothouse’ of ideas where people can come together to ask and answer challenging questions.”

University researchers, industry partners, the charity sector and policymakers will explore questions such as: how can we use AI for drug discovery, to accelerate clinical trials and develop new treatments, and how can we build interpretable AI models that can be translated to clinical tools?

The need for such AI to be reliable and responsible is a theme that comes up frequently when Kourtzi speaks to patient groups. 

“When doctors are using a complex diagnostic tool like an MRI machine, patients don’t query whether they understand what’s in this machine, why it works this way. What they want to know is that it’s gone through regulatory standards, it’s safe to use and can be trusted. It’s exactly the same with AI.”

Elderly patient speaking to a healthcare worker

Making GP practices more efficient

Professor Niels Peek from The Healthcare Improvement Studies (THIS) Institute believes that AI could have a major impact on primary care services, such as GP practices, by tackling some of their most mundane tasks.

One such application involves the use of ‘digital scribes’ to record, transcribe, and summarise conversations between GPs and patients.

“If you look at the amount of time that clinicians spend on that type of work, it’s just incredible,” he says.

“Considering that clinician time is probably the most precious commodity within the NHS, this is technology that could be transformational.”

It is likely that the NHS will increasingly adopt digital scribe technology, so it will be important to ensure the summaries are accurate and do not omit key points or add things that were not mentioned (a ‘hallucination’). With support from The Health Foundation, Peek is asking whether the technology actually saves time. “If you have to spend a lot of time correcting its outputs, then it’s no longer a given that it actually does save you time.”

Peek believes that in the future, every clinical consultation will be recorded digitally, stored as part of a patient’s record, and summarised with AI. But the existing technology environment, particularly in primary care, presents a challenge.

“GPs use electronic health records that have evolved over time and often look outdated. Any new technology must fit within these systems. Asking people to log into a different system is not feasible.”

Peek is also involved in evaluating Patchs, a tool that applies AI to the process of booking GP appointments and conducting online consultations. It was designed by GP staff and patients, in collaboration with The University of Manchester (where Peek was formerly based) and commercialised by the company Patchs Health. It is now used by around one in 10 GP practices across England.

Working with end users – patients, GPs, and particularly the administrative staff who use these systems on a day-to-day basis – is crucial.  “You have to make sure they fit both with the systems people are already using, and also with how they do things, with their workflows. Only then will you see differences that translate into real benefits to people.”

GP speaking to a patient

Addressing mental health among young people

Over recent years, there has been a significant increase in the prevalence of mental health disorders among young people. But with stretched NHS resources, it can be difficult to access Child and Adolescent Mental Health Services (CAMHS).

Not every child recommended for a referral will need to see a mental health specialist, says Dr Anna Moore from the Department of Psychiatry, but the bottleneck means they can be on the waiting list for up to two years only to be told they don’t meet the criteria for treatment. The quality of advice they get about alternative options that do meet their needs varies a lot.

Moore is interested in whether AI can help manage this bottleneck by identifying those children in greatest need of support, and helping those who don’t need specialist CAMHS to find suitable support elsewhere. One way to do so is by using data collected routinely on children.

“The kinds of data that help us do this can be some of the really sensitive data about people,” she says. “It might be health information, how they’re doing at school, but it could also be information such as they got drunk last weekend and ended up in A&E.”

For this reason, she says, it’s essential that they work closely with members of the public when designing such a system to make sure people understand what they are doing, the kinds of data they are considering using and how it might be used, but also how it might improve the care of young people with mental health problems.

One of the questions that often comes up from ethicists is whether, given the difficulties in accessing CAMHS, it is necessarily a good thing to identify children if they cannot then access services.

“Yes, we can identify those kids who need help, but we need to ask, ‘but so what?’,” she says. The tool will need to suggest a referral to CAMHS for the children who need it; for those whose problems could be better supported in other, more flexible ways, can it signpost them to helpful, evidence-based, age-appropriate information?

Moore is designing the tool to help find those children who might otherwise get missed. In the most extreme cases, these might be children such as Victoria Climbié and Baby P, who were tortured and murdered by their guardians. The serious case reviews highlighted multiple missed opportunities for action, often because systems were not joined up, meaning no one was able to see the full picture.

“If we’re able to look at all of the data across the system relating to a child, then it might well be possible to bring that together and say, actually there’s enough information here that we can do something about it.”

From womb to world

Across the world, fertility rates are falling, while families are choosing to have children later in life. To help them conceive, many couples turn to assisted reproductive technologies such as IVF; however, success rates remain low and the process can be expensive. In the UK, treatment at a private clinic can cost more than £5,000 per cycle – in the US, around $20,000 – with no guarantee of success.

Mo Vali and Dr Staci Weiss hope that AI can change this. They are leading From Womb to World, one of ai@cam’s flagship AI-deas projects, which aims to improve prospective parents’ chances of having a baby by diagnosing fertility conditions early on and personalising fertility treatments.

“We’re trying to democratise access to IVF outcomes and tackle a growing societal problem of declining fertility rates.”
Mo Vali

They are working with Professor Yau Thum at The Lister Fertility Clinic, one of the largest standalone private IVF clinics in the UK, to develop cheaper, less invasive and more accurate AI-assisted tests that can be used throughout the patient’s IVF journey. To do this, they are making use of the myriad different samples and datasets collected during the fertility process, from blood tests and ultrasound images to follicular fluid, as well as data encompassing demographic and cultural factors.

Building the AI tools was the easy bit, says Vali. The bigger challenge has been generating the datasets, clearing ethical and regulatory hurdles, and importantly, ensuring that sensitive data is properly anonymised and de-identified – vital for patient privacy and building public trust.

The team also hopes to use AI to improve, and make more accessible, 4D ultrasound scans that let the parents see their baby moving in the womb, capturing movements like thumb-sucking and yawning. This is important for strengthening the maternal bond during a potentially stressful time, says Weiss.

“Seeing their baby’s face and watching it move creates a very different kind of physical, embodied reality and a bond between the mother and her child,” she says.

Consulting with women who have experienced first-hand the challenges of fertility treatments is providing valuable insights, while The Lister Fertility Clinic – a private clinic – is an ideal platform in which to test their ideas before providing tools for the wider public. It offers a smaller, more controlled environment where they can engage directly with senior clinicians.

“We want to ensure that the research that we are doing and the AI models that we’re building work seamlessly before we go at scale,” says Vali.

Pregnant women looking at a fertility app

Preventing cancer

Antonis Antoniou, Professor of Cancer Risk Prediction at Cambridge, has spent most of his career developing models that predict our risk of developing cancers. Now, AI promises to take his work to an entirely new level.

Antoniou has recently been announced as Director of the Cancer Data-Driven Detection Programme, a £10 million initiative that promises to transform how we detect, diagnose – and even prevent – cancer in the future. It’s a multi-institutional project, with partners across the UK, that will build infrastructure and create a multidisciplinary research community, with funding for 30 PhD places and early career research positions in cancer data sciences to train the next generation of researchers.

The programme will enable scientists to access and link a vast array of diverse health data sets – from GP clinics, cancer screening programmes and large cohort studies, through to data generated through interactions with public services, such as records of occupation and educational attainment, and geospatial data on air pollution, housing quality and access to services. These will be used in combination with AI and state-of-the-art analytics.

“The funding will allow us to use these data sets to develop models that help us predict individual cancer risk and greatly improve our understanding of who is most at risk of developing cancer,” he says. “It will hopefully help us transform how we detect and prevent and diagnose cancer in the future.”

One of the key considerations of their work will be to ensure that the AI tools they develop do not inadvertently exacerbate inequalities.

“We have to be careful not to develop models that only work for people who are willing to participate in research studies or those who frequently interact with the healthcare sector, for example, and ensure we’re not ignoring those who can’t easily access healthcare services, perhaps because they live in areas of deprivation.”

Key to their programme has been the involvement of patients and members of the public, who, alongside clinical practitioners, have helped them from the outset to shape their programme.

“They were involved in co-developing our proposals from the planning phase, and going forward, they’ll continue to play a key role, helping guide how we work and to make sure that the data are used responsibly and safely,” he says.

The Cancer Data-Driven Detection programme is jointly supported by Cancer Research UK, the National Institute for Health & Care Research, the Engineering & Physical Sciences Research Council, Health Data Research UK, and Administrative Data Research UK.

Read more about AI and cancer here

Female patient undergoing a mammogram

Innovations in drug discovery

It’s just over 20 years since the first human genome was sequenced, opening up a new scientific field – genomics – and helping us understand how our bodies function. Since then, the number of so-called ‘omics’ – complete readouts of particular types of molecules in our bodies, such as proteins (proteomics) and metabolites (metabolomics) – has blossomed.

Dr Namshik Han from the Milner Therapeutics Institute is interested in how AI can mine this treasure trove to help discover new drugs.

“We’re applying AI approaches to dissect those really big data sets and try to identify meaningful, actionable drug targets,” he says. 

His team works with partners who can take these targets to the next stage, such as by developing chemical compounds to act on these targets, testing them in cells and animals, and then taking them through clinical trials.

The Milner Institute acts as a bridge between academia and industry to accelerate this process, partnering with dozens of academic institutes, industry partners, biotech, pharma and venture capitalists. But at the ‘bleeding edge’ of Han’s work are his collaborations with tech companies.

Han is interested in how quantum computers, which use principles of quantum mechanics to enable much faster and more powerful calculations, can address problems such as the complex chemistry underpinning drug development.

“We’ve shown that quantum algorithms see things that conventional AI algorithms don’t,” Han says.

His lab has used quantum algorithms to explore massive networks comprising tens of thousands of human proteins. When conventional AI explores these networks, it only looks at certain areas, whereas Han showed that quantum algorithms cover the entire network.

AI has the potential to improve every aspect of drug discovery – from identifying targets, as Han is doing, to optimising clinical trials, potentially reducing the cost of new medications and ensuring patients benefit faster. But that’s not what really excites Han.

“Take cancer, for example,” he says. “There are many different types, and for some of them we don’t have specific drugs to treat them. Instead, we have to use a drug for a related cancer and give that to the patient, which is not ideal. 

“Quantum-based AI will open up a completely new door to find truly innovative drugs which we’ve never thought of before. That’s where the real impact has to be.”


source: cam.ac.uk

Opinion: AI can democratise weather forecasting

Professor Richard Turner

AI will give us the next leap forward in forecasting the weather, says Richard Turner, and make it available to all countries, not just those with access to high-quality data and computing resources.

From farmers planting crops to emergency responders preparing for natural disasters, the ability to predict the weather is fundamental to societies all across the globe.

The modern approach to forecasting was invented a century ago. Lewis Fry Richardson, a former student of King’s College, Cambridge, was working as an ambulance driver during the First World War when he realised that being able to predict the weather could help save lives. This led him to develop the first mathematical approach to forecasting the weather.

Richardson’s method was a breakthrough, but to say that it was time-consuming is an understatement: he calculated it would require 64,000 people working with slide rules to produce a timely forecast for the following day. It was the development of supercomputers in the 1950s that made Richardson’s approach practical.

Since then, weather forecasting methods have become more sophisticated and more accurate, driven by advances in computing and by the increased amount of information we have about the weather from satellites and other instruments. But now, we are poised to make another big leap forward, thanks to AI.

The last few years have seen an AI revolution in weather forecasting, and my group has recently taken this to the next level. Working with colleagues at The Alan Turing Institute, Microsoft Research and the European Centre for Medium-Range Weather Forecasts, we’ve developed Aardvark Weather, a fully AI-driven weather prediction system that can deliver accurate forecasts tens of times faster, using thousands of times less computing power than both physics-based forecasting systems and previous AI-based approaches.

We believe that Aardvark could democratise access to accurate forecasts, since it can be run and trained on a regular desktop computer, not the powerful supercomputers that run most of today’s weather forecasting technology. In developing countries where access to high-quality data and computing resources is limited, platforms like Aardvark could be transformational.


AI is a game changer

The need for improved forecasting systems is more crucial than ever. Extreme weather events – from the recent wildfires in Los Angeles to last year’s flash floods in Spain – are becoming more frequent. Predicting other parts of the Earth system is equally important. For example, 1.5 million people die each year in India due to poor air quality, and changes in sea and land ice at the poles have huge implications.

AI could help mitigate these risks by delivering timely, hyper-local forecasts, even in regions with limited observational data. These AI systems have the potential to dramatically improve public safety, food security, supply chain management, and energy planning in an increasingly volatile climate.

AI-driven forecasting is also well-placed to play a crucial role in our transition to a net-zero future. If we can better predict fluctuations in supply from wind and solar energy sources, we can optimise energy grids, reducing reliance on fossil fuels and making clean energy more viable on a global scale.

Richardson’s weather forecasting approach relied on numerical models – mathematical representations of the Earth’s atmosphere, land, and oceans that require massive computing power. These models, though incredibly advanced, have limitations: they are expensive, slow to run, time-consuming to improve, and often struggle to deliver accurate predictions in areas like the tropics or the poles. The arrival of AI is changing the game entirely.
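The idea behind such numerical models can be sketched in miniature. The toy below advances the one-dimensional advection equation by a single first-order upwind finite-difference step; real models solve far richer coupled equations on global three-dimensional grids, which is where the enormous computational cost comes from. All values here are illustrative.

```python
# One grid step of the 1-D advection equation du/dt + c * du/dx = 0,
# the kind of discretised physics a numerical weather model solves
# millions of times per forecast. Toy values throughout.
c, dx, dt = 1.0, 1.0, 0.5          # wind speed, grid spacing, time step
u = [0.0, 0.0, 1.0, 0.0, 0.0]      # initial field: a single "blob"

def upwind_step(u, c, dx, dt):
    """First-order upwind finite-difference update (periodic domain)."""
    n = len(u)
    return [u[i] - c * dt / dx * (u[i] - u[(i - 1) % n]) for i in range(n)]

for _ in range(2):                  # two time steps: the blob drifts right
    u = upwind_step(u, c, dx, dt)
print(u)
```

After two steps the blob has moved downwind (and diffused slightly, a known artefact of the first-order scheme) while the total "mass" is conserved, exactly the behaviour the discretised physics is meant to reproduce.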


Achieving its potential

Results from Aardvark and other AI-driven systems have demonstrated that they can perform weather forecasting tasks with excellent speed and accuracy. These models, trained on vast amounts of historical data, can learn patterns and generate forecasts in a fraction of the time that traditional methods require. Through the Aurora project with Microsoft Research, I’ve also shown that the same approaches can transform forecasts of air quality, ocean waves, and hurricanes.
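The data-driven idea can be sketched in a few lines: fit a statistical relationship to historical observations, then use it to predict the next step. This is a deliberately tiny stand-in for what systems like Aardvark do with deep networks trained on decades of global data; the numbers below are made up.

```python
# Toy illustration of data-driven forecasting: predict tomorrow's
# temperature from today's using ordinary least squares on a short
# "historical" record. Not Aardvark's architecture -- just the idea.
history = [10.0, 12.0, 11.0, 13.0, 12.5, 14.0, 13.5]  # made-up daily temps

xs, ys = history[:-1], history[1:]                # (today, tomorrow) pairs
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
intercept = my - slope * mx

forecast = slope * history[-1] + intercept        # predict the next day
print(f"next-day forecast: {forecast:.2f}")
```

Once the relationship is learned, producing a forecast is a single cheap evaluation rather than a massive simulation, which is why learned forecasters can run on modest hardware.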

Companies like Google DeepMind, Microsoft, and various research institutions – including my team at Cambridge – are achieving results that rival or even surpass conventional numerical models at a fraction of the computational cost.

Of course, this transformation comes with challenges.

Ensuring trust and transparency in AI weather forecasting technologies is paramount. Weather forecasting has long been a domain where public institutions – like the UK Met Office and the European Centre for Medium-Range Weather Forecasts – play a critical role in ensuring reliability and accountability. AI models, though promising, must be rigorously tested and validated to build public confidence. These systems should be implemented alongside existing methods, rather than replacing them outright, and continuous retraining and re-evaluation will likely be needed due to the changing climate.

National weather services and universities like Cambridge must step up to ensure that AI-driven forecasting remains a public good, not a commercial commodity. The rise of AI weather forecasting has opened the door to more commercial involvement in an area that would previously have been dominated by public institutions and international centres. While start-ups and big tech companies are making significant strides in AI weather prediction and are a valuable part of the forecasting ecosystem, business interests are not necessarily aligned with societal need. The risk is that critical forecasting capabilities could become privatised, limiting access for those who need it most.

Universities are uniquely positioned to act as a balancing force, driving research that prioritises long-term societal benefit. However, traditional academic structures are often ill-equipped to handle the scale and speed required for AI research. If we are to compete with industry, we must rethink how AI research is conducted: embracing interdisciplinary collaboration, investing in large-scale computational infrastructure, rethinking funding models so that they are faster and more agile, and fostering partnerships that ensure AI development aligns with the public good.

The future of weather forecasting will not be decided solely in the labs of tech giants or the halls of government. It will be shaped by the choices we make now – how we invest in research, how we regulate AI deployment, and how we ensure that life-saving technology remains accessible to all.

Richard Turner is Professor of Machine Learning in the Machine Learning Group of the Department of Engineering, a Research Lead at the Alan Turing Institute, and previously a Visiting Researcher at Microsoft Research. He is also a Bye Fellow of Christ’s College.

source: cam.ac.uk

Opinion: Humans should be at the heart of AI

Professor Anna Korhonen

With the right development and application, AI could become a transformative force for good. What’s missing in current technologies is human insight, says Anna Korhonen.

AI has immense potential to transform human life for the better. But to deliver on this promise it must be equipped with a better understanding of human intelligence, values and needs.

AI could help tackle some of the world’s most pressing challenges – advancing climate science, improving healthcare, making education more accessible, and reducing inequalities.

In the public sector, AI could enhance decision-making, optimise service delivery, and ensure that resources reach the people and places where they are most needed. With the right development and application, it could become a transformative force for good.

But today’s AI technologies struggle to grasp the nuances of human behaviour, social dynamics and the complex realities of our world.

They lack the flexibility and contextual understanding of human intelligence. Their limitations in communication, reasoning and judgment mean that they fall short of supporting us in many critical tasks. Meanwhile, concerns around bias, misinformation, safety and job displacement continue to grow.


Achieving its potential

To unlock AI’s potential for good, we need a fundamental shift in how it is developed.

That starts with designing technologies to work in harmony with people – to be more human-centric. Rather than replacing us, AI should enhance our capabilities, support our intelligence and creativity, and reflect our values and priorities. To truly benefit everyone, it should be designed to be trustworthy, inclusive, and accessible, serving diverse communities worldwide – not just a privileged few.

To enable this, we need to move beyond viewing AI as a purely technical field. Building technologies that genuinely understand and support people requires insights from the diverse range of disciplines that explore the human condition – social, behavioural, cognitive, clinical and environmental sciences, the arts and more. Universities are uniquely positioned to lead this shift by promoting interdisciplinary research and connecting technical fields with human-centred perspectives.

We must also take AI research beyond the lab and into the real world by collaborating across sectors – bringing together academia, industry, policymakers, NGOs, and civil society to understand the needs, ensure technologies are fit for purpose, and test them in real-world settings. These partnerships are crucial to building systems that are robust, scalable, and socially beneficial.

Finally, AI education must evolve. The next generation of AI practitioners needs more than technical expertise – they must also understand the wider social, ethical, environmental, and industrial contexts of their work. At Cambridge, we are launching new MPhil and PhD programmes in Human-Inspired Artificial Intelligence to help meet this need. These programmes, starting in October 2025, will equip students with the interdisciplinary and cross-sector knowledge needed to innovate AI that is not only powerful, but also aligned with human values and needs.

The opportunity is vast – but progress depends on the choices we make today. By rethinking how AI is developed, embedding human insight at every stage, working collaboratively across sectors, and reimagining how we educate the next generation, we can ensure that AI becomes a force for public good – one that helps shape a more just, inclusive and equitable future.

Anna Korhonen is Professor of Natural Language Processing, Director of the Centre for Human-Inspired Artificial Intelligence (CHIA) and Co-Director of the Institute for Technology and Humanity at the University of Cambridge.

source: cam.ac.uk

Opinion: AI belongs in classrooms

Jill Duffy

AI in education has transformative potential for students, teachers and schools but only if we harness it in the right way – by keeping people at the heart of the technology, says Jill Duffy.

When you think about AI and education, the first thing that comes to mind is probably students using ChatGPT to write their essays and coursework. But, important as this issue is, the debate about AI in education should go way beyond it.

As head of an exam board (OCR), I am well aware of how serious this issue is. Deciphering whether a piece of work was AI-generated was not part of the job description for educators a decade ago, and I’m sure not many appreciate this new addition to their workload.

ChatGPT writing essays may be the most noticeable phenomenon right now, but it is far from the only way that this technology will transform how we teach and assess young people. Crucially, AI offers opportunities as well as threats. But only if we harness it in the right way – by keeping people at the heart of education.

What does that mean in practice? Let’s look again at the concerns over AI and coursework. As I’ve previously argued, we cannot put generative AI back in its box. Demanding that students never use it in any capacity is obviously not enforceable, and I would also argue is not desirable: the proper use of this technology will be a vital skill in their working lives.

In future, instead of asking students “did you use AI?” teachers will be asking them “how did you use AI?” It’s about accepting where this technology can help students – finessing arguments, helping with research – while protecting the human skills they will still need – fact checking, rewriting, thinking analytically.

The same human-centric approach is needed when it comes to teaching and AI. We can’t afford to ignore the obvious benefits of this technology, but we cannot embrace it blindly at the cost of real, human teaching. At OCR we are looking into various tools that could help teachers who are struggling with ever-increasing workloads. This could be about helping them with lesson planning, or searching through subject specifications or guidance materials.

So, we don’t expect AI to replace the very human skills of intelligently questioning a student to guide their learning, or safeguarding their wellbeing, or passing on a passion for their subject. Instead, AI can take care of some of the time-consuming admin, giving teachers more time to actually teach.

This human-centred approach guides everything we are doing at Cambridge and OCR. We have been developing digital exams for the past few years, for Cambridge’s international exams and for OCR’s popular Computer Science GCSE. What we are not doing here is simply transferring the paper exam onto a screen. We have been testing and monitoring how students perform in these on-screen exams, using mocks and trials, to make sure there is no advantage or disadvantage to a particular method.


Achieving its potential

But keeping humans at the heart of education while getting the most out of new technology will take more than the efforts of one exam board.

As OCR recently warned in its report Striking the Balance, there is a risk that the move towards digital exacerbates existing inequalities in the system. If digital learning can be more effective, what happens to schools that can’t afford the required technology?

A national strategy is required – involving the government, regulators, and other stakeholders – to ensure every school can benefit from the transformative potential of this technology.

Jill Duffy leads OCR and is managing director for UK Education at Cambridge University Press and Assessment.

source: cam.ac.uk