Specimens in a Cambridge museum will be brought to life through the power of Artificial Intelligence, by a team aiming to strengthen our connection with the natural world and reverse apathy towards biodiversity loss.
From Tuesday 15 October the University of Cambridge’s Museum of Zoology is offering visitors a unique experience: the chance to chat with the animals on display – whether skeletal, taxidermy, or extinct.
In a collaboration with the company Nature Perspectives, the Museum’s Assistant Director Jack Ashby has chosen a range of animal specimens to bring back to life using generative Artificial Intelligence.
Visitors can pose their questions to thirteen specimens – including dodo and whale skeletons, a taxidermied red panda, and a preserved cockroach – by scanning QR codes that open a chat-box on their mobile phone. In two-way conversations, which can be voice- or text-based, the specimens will answer as if they are still alive.
This is believed to be the first time a museum has used generative Artificial Intelligence to enable visitors to chat with objects on display in this way.
By analysing data from the conversations, the team hopes that the month-long experiment will help them learn more about how AI can help the public to better engage with nature, and about the potential for AI in museums. It will also provide the museum with new insights into what visitors really want to know about the specimens on display.
Nature Perspectives uses AI to enable cultural institutions like the Museum of Zoology to engage the public through these unique conversational experiences. The company aims to reverse a growing apathy towards biodiversity loss by enabling new ways to engage with the natural world.
“This is an amazing opportunity for people to test out an emerging technology in our inspiring Museum setting, and we also hope to learn something about how our visitors see the animals on display,” said Jack Ashby, Assistant Director of the University of Cambridge’s Museum of Zoology.
He added: “Our whole purpose is to get people engaged with the natural world. So we’re curious to see whether this will work, and whether chatting to the animals will change people’s attitudes towards them – will the cockroach be better liked, for example, as a result of having its voice heard?”
“By using AI to simulate non-human perspectives, our technology offers a novel way for audiences to connect with the natural world,” said Gal Zanir, co-founder of the company Nature Perspectives, which developed the AI technology for the experience.
He added: “One of the most magical aspects of the simulations is that they’re age-adaptive. For the first time, visitors of all ages will be able to ask the specimens anything they like.”
The technology brings together all available information on each animal involved – including details particular to the individual specimens such as where they came from and how they were prepared for display in the museum. This is all repackaged from a first-person perspective, so that visitors can experience realistic, meaningful conversations.
The animals will adjust their tone and language to suit the age of the person they’re talking to. And they’re multi-lingual – speaking over 20 languages including Spanish and Japanese so that visitors can chat in their native languages.
The team has chosen a range of specimens that include skeletons, taxidermy, models, and whole preserved animals. The specimens are: dodo skeleton, narwhal skeleton, brain coral, red admiral butterfly, fin whale skeleton, American cockroach, huia taxidermy (a recently extinct bird from New Zealand), red panda taxidermy, freeze-dried platypus, giant sloth fossil skeleton, giant deer skull and antlers, mallard taxidermy, and Ichthyostega model (an extinct ancestor of all animals with four legs).
Nature Perspectives was created by a team of graduates from the University of Cambridge’s Masters in Conservation Leadership programme, who noticed that people seem to feel more connected to machines when they can talk to them. This inspired the team to apply the same principle to nature – giving nature a voice to promote its agency and foster deeper, more personal connections between people and the natural world.
“Artificial Intelligence is opening up exciting new opportunities to connect people with non-human life, but the impacts need to be carefully studied. I’m delighted to be involved in exploring how the Nature Perspectives pilot affects the way people feel about and understand the species they ‘meet’ in the Museum of Zoology,” said Professor Chris Sandbrook, Director of the University of Cambridge’s Masters in Conservation Leadership programme.
“Enabling museums to engage visitors with the simulated perspectives of exhibits is only the first step for Nature Perspectives. We aim to apply this transformative approach widely, from public engagement and education to scientific research, to representing nature in legal processes, policy-making and beyond,” said Zanir.
The Nature Perspectives AI experiment runs for one month, from 15 October to 15 November 2024. For visiting times see www.museum.zoo.cam.ac.uk/visit-us
An iron meteorite from the core of a melted planetesimal (left) and a chondrite meteorite, derived from a ‘primitive’, unmelted planetesimal (right). Credit: Rayssa Martins/Ross Findlay
Researchers have used the chemical fingerprints of zinc contained in meteorites to determine the origin of volatile elements on Earth. The results suggest that without ‘unmelted’ asteroids, there may not have been enough of these compounds on Earth for life to emerge.
Volatiles are elements or compounds that change into vapour at relatively low temperatures. They include the six most common elements found in living organisms, as well as water. The zinc found in meteorites has a unique composition, which can be used to identify the sources of Earth’s volatiles.
The researchers, from the University of Cambridge and Imperial College London, have previously found that Earth’s zinc came from different parts of our Solar System: about half came from beyond Jupiter and half originated closer to Earth.
“One of the most fundamental questions on the origin of life is where the materials we need for life to evolve came from,” said Dr Rayssa Martins from Cambridge’s Department of Earth Sciences. “If we can understand how these materials came to be on Earth, it might give us clues to how life originated here, and how it might emerge elsewhere.”
Planetesimals are the main building blocks of rocky planets, such as Earth. These small bodies are formed through a process called accretion, where particles around a young star start to stick together, and form progressively larger bodies.
But not all planetesimals are made equal. The earliest planetesimals that formed in the Solar System were exposed to high levels of radioactivity, which caused them to melt and lose their volatiles. But some planetesimals formed after these sources of radioactivity were mostly extinct, which helped them survive the melting process and preserved more of their volatiles.
In a study published in the journal Science Advances, Martins and her colleagues looked at the different forms of zinc that arrived on Earth from these planetesimals. The researchers measured the zinc from a large sample of meteorites originating from different planetesimals and used this data to model how Earth got its zinc, by tracing the entire period of the Earth’s accretion, which took tens of millions of years.
Their results show that while these ‘melted’ planetesimals contributed about 70% of Earth’s overall mass, they only provided around 10% of its zinc.
According to the model, the rest of Earth’s zinc came from materials that didn’t melt and lose their volatile elements. Their findings suggest that unmelted, or ‘primitive’ materials were an essential source of volatiles for Earth.
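Those two percentages imply a strong enrichment of zinc in the unmelted material, which a back-of-envelope mass balance makes explicit (a sketch of the headline numbers only, not the authors' isotopic model):

```python
# Mass-balance sketch of the study's headline figures (illustrative only;
# the real model traces zinc isotopes over tens of millions of years of accretion).
melted_mass_frac = 0.70      # melted planetesimals: share of Earth's mass
melted_zinc_frac = 0.10      # ...but only this share of Earth's zinc

primitive_mass_frac = 1 - melted_mass_frac   # 0.30
primitive_zinc_frac = 1 - melted_zinc_frac   # 0.90

# Zinc delivered per unit of mass contributed, for each source:
melted_conc = melted_zinc_frac / melted_mass_frac           # ~0.14
primitive_conc = primitive_zinc_frac / primitive_mass_frac  # ~3.0

enrichment = primitive_conc / melted_conc   # ~21x more zinc-rich per unit mass
```

On these figures, the 'primitive' material delivered roughly 21 times more zinc per unit of mass than the melted planetesimals did.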
“We know that the distance between a planet and its star is a determining factor in establishing the necessary conditions for that planet to sustain liquid water on its surface,” said Martins, the study’s lead author. “But our results show there’s no guarantee that planets incorporate the right materials to have enough water and other volatiles in the first place – regardless of their physical state.”
The ability to trace elements through millions or even billions of years of evolution could be a vital tool in the search for life elsewhere, such as on Mars, or on planets outside our Solar System.
“Similar conditions and processes are also likely in other young planetary systems,” said Martins. “The roles these different materials play in supplying volatiles is something we should keep in mind when looking for habitable planets elsewhere.”
The research was supported in part by Imperial College London, the European Research Council, and UK Research and Innovation (UKRI).
Reference: Rayssa Martins et al. ‘Primitive asteroids as a major source of terrestrial volatiles.’ Science Advances (2024). DOI: 10.1126/sciadv.ado4121
Nyobolt, a University of Cambridge spin-out company, has demonstrated its ultra-fast charging batteries in an electric sportscar prototype, going from 10% to 80% charge in under five minutes, twice the speed of the fastest-charging vehicles currently on the road.
In addition to ultra-fast charging times, the batteries developed by Nyobolt – which was spun out of Professor Dame Clare Grey’s lab in the Yusuf Hamied Department of Chemistry in 2019 – do not suffer from the degradation issues associated with lithium-ion batteries.
Tests of the first running Nyobolt EV prototype will be used to validate the company’s battery performance in a high-performance environment.
Cambridge-based Nyobolt has used its patented carbon and metal oxide anode materials, low-impedance cell design, integrated power electronics and software controls to create power-dense battery and charging systems. These support the electrification of applications such as heavy-duty off-highway trucks, EVs, robotics and consumer devices that demand high power and quick recharge cycles.
Initial in-vehicle testing using 350kW (800V) DC fast chargers confirmed that the Nyobolt EV’s battery can be charged from 10 per cent to 80 per cent in 4 minutes 37 seconds – with a full charge enabling the prototype to achieve a range of 155 miles. That is twice the speed of most of the fastest-charging vehicles today.
Independent testing of the technology confirmed that Nyobolt’s longer-lasting and more sustainable batteries can achieve over 4,000 fast charge cycles, or 600,000 miles, maintaining over 80 per cent battery capacity retention. This is many multiples higher than the warranties of much larger EV batteries on the road today.
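The charging and lifetime figures quoted above are mutually consistent, as a little arithmetic shows (a sketch; it assumes range scales linearly with state of charge, which real packs only approximate):

```python
# Rough arithmetic behind the article's figures (illustrative sketch).
full_range_miles = 155
charge_time_s = 4 * 60 + 37          # 10% -> 80% in 4 min 37 s
soc_gained = 0.80 - 0.10             # 70% of capacity added

miles_added = full_range_miles * soc_gained            # ~108.5 miles of range
miles_per_minute = miles_added / (charge_time_s / 60)  # ~23.5 miles per minute

# Lifetime claim: 4,000 fast-charge cycles over 600,000 miles
miles_per_cycle = 600_000 / 4_000    # 150 miles per cycle, close to the full range
```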
“Nyobolt’s low impedance cells ensure we can offer sustainability, stretching out the battery’s usable life for up to 600,000 miles in the case of our technology demonstrator,” said company co-founder and CEO Dr Sai Shivareddy.
The battery pack in the Nyobolt EV prototype not only adds miles faster; its compact size also enables energy-efficient electric vehicles that are cheaper to buy and run, and that crucially use fewer resources to manufacture.
“Nyobolt is removing the obstacle of slow and inconvenient charging, making electrification appealing and accessible to those who don’t have the time for lengthy charging times or space for a home charger,” said Shane Davies, Nyobolt’s director of vehicle battery systems.
Nyobolt’s battery packs could be in low-volume production within a year, ramping to 1,000 packs in 2025. The company’s flexible manufacturing model enables volumes of up to 2 million cells per year.
Nyobolt’s technology builds on a decade of battery research led by Grey and Shivareddy, who invented cutting-edge supercapacitors. Key to the company’s ability to offer ultra-fast charging without impacting battery life is its low-impedance cells that generate less heat, making it easier to manage such high-power levels during charging. Its anode materials in lithium-ion battery cells allow for a faster transfer of electrons between the anode and cathode.
Nyobolt is in conversation with a further eight vehicle manufacturers about adopting its technology. Alongside automotive applications, Nyobolt’s fast-charging technology is set to be used this year in robotics.
“Our extensive research here in the UK and in the US has unlocked a new battery technology that is ready and scalable right now,” said Shivareddy. “We are enabling the electrification of new products and services currently considered inviable or impossible. Creating real-world demonstrators, such as the Nyobolt EV, underlines both our readiness and commitment to making the industries see change is possible.”
Published: October 2024. Story: Sarah Collins and Nyobolt. Design: Jessica Keating. Photography credits: Nyobolt.
The text in this work is licensed under a Creative Commons Attribution 4.0 International License.
Astronomers have used the NASA/ESA James Webb Space Telescope (JWST) to observe the ‘inside-out’ growth of a galaxy in the early universe, only 700 million years after the Big Bang.
This galaxy is one hundred times smaller than the Milky Way, but is surprisingly mature for so early in the universe. Like a large city, this galaxy has a dense collection of stars at its core but becomes less dense in the galactic ‘suburbs’. And like a large city, this galaxy is starting to sprawl, with star formation accelerating in the outskirts.
This is the earliest-ever detection of inside-out galactic growth. Until Webb, it had not been possible to study galaxy growth so early in the universe’s history. Although the images obtained with Webb represent a snapshot in time, the researchers, led by the University of Cambridge, say that studying similar galaxies could help us understand how they transform from clouds of gas into the complex structures we observe today. The results are reported in the journal Nature Astronomy.
“The question of how galaxies evolve over cosmic time is an important one in astrophysics,” said co-lead author Dr Sandro Tacchella from Cambridge’s Cavendish Laboratory. “We’ve had lots of excellent data for the last ten million years and for galaxies in our corner of the universe, but now with Webb, we can get observational data from billions of years back in time, probing the first billion years of cosmic history, which opens up all kinds of new questions.”
The galaxies we observe today grow via two main mechanisms: either they pull in, or accrete, gas to form new stars, or they grow by merging with smaller galaxies. Whether different mechanisms were at work in the early universe is an open question which astronomers are hoping to address with Webb.
“You expect galaxies to start small as gas clouds collapse under their own gravity, forming very dense cores of stars and possibly black holes,” said Tacchella. “As the galaxy grows and star formation increases, it’s sort of like a spinning figure skater: as the skater pulls in their arms, they gather momentum, and they spin faster and faster. Galaxies are somewhat similar, with gas accreting later from larger and larger distances spinning the galaxy up, which is why they often form spiral or disc shapes.”
This galaxy, observed as part of the JADES (JWST Advanced Extragalactic Survey) collaboration, is actively forming stars in the early universe. It has a highly dense core, which despite its relatively young age, is of a similar density to present-day massive elliptical galaxies, which have 1000 times more stars. Most of the star formation is happening further away from the core, with a star-forming ‘clump’ even further out.
The star formation activity is strongly rising toward the outskirts, as the star formation spreads out and the galaxy grows. This type of growth had been predicted with theoretical models, but with Webb, it is now possible to observe it.
“One of the many reasons that Webb is so transformational to us as astronomers is that we’re now able to observe what had previously been predicted through modelling,” said co-author William Baker, a PhD student at the Cavendish. “It’s like being able to check your homework.”
Using Webb, the researchers extracted information from the light emitted by the galaxy at different wavelengths, which they used to estimate the ratio of younger to older stars; this was then converted into estimates of the galaxy’s stellar mass and star formation rate.
Because the galaxy is so compact, the individual images of the galaxy were ‘forward modelled’ to take into account instrumental effects. Using stellar population modelling that includes prescriptions for gas emission and dust absorption, the researchers found older stars in the core, while the surrounding disc component is undergoing very active star formation. This galaxy doubles its stellar mass in the outskirts roughly every 10 million years, which is very rapid: the Milky Way galaxy doubles its mass only every 10 billion years.
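The contrast between those two doubling times can be made concrete with a quick exponential-growth comparison (illustrative only; real stellar-mass growth is not cleanly exponential):

```python
# Compare stellar-mass growth over 100 million years at the two doubling
# rates quoted above (a simplification: treats growth as exponential).
t_double_early = 10e6    # years: outskirts of the early galaxy
t_double_mw = 10e9       # years: the present-day Milky Way

window = 100e6           # years
factor_early = 2 ** (window / t_double_early)   # 2**10 = 1024x growth
factor_mw = 2 ** (window / t_double_mw)         # 2**0.01, barely ~0.7% growth
```

Over the same 100-million-year window, the early galaxy's outskirts would grow a thousandfold while the Milky Way barely changes.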
The density of the galactic core, as well as the high star formation rate, suggest that this young galaxy is rich with the gas it needs to form new stars, which may reflect different conditions in the early universe.
“Of course, this is only one galaxy, so we need to know what other galaxies at the time were doing,” said Tacchella. “Were all galaxies like this one? We’re now analysing similar data from other galaxies. By looking at different galaxies across cosmic time, we may be able to reconstruct the growth cycle and demonstrate how galaxies grow to their eventual size today.”
Colleagues from across the University were recognised for their contributions to research culture at the inaugural Research Culture Celebration event on 30 September.
Cambridge aspires to create a positive research culture where all staff working in research, whether in academic, technical or support roles, feel welcomed, supported and able to give of their best. The Research Culture Celebration event aims to recognise and celebrate the good practice that is already happening, and to inspire further efforts across the University.
The original idea for the nominations and event (where those honoured are put forward by their colleagues) was part of the Action Research on Research Culture (ARRC) project’s study on researcher development. The ARRC project is one of several initiatives to nurture and promote positive research culture at Cambridge.
The event coincides with the launch of a wider programme of work being led by the University’s Research Culture Team. Four priority areas have been identified. These are:
– Precarity: how do we address the issues created by fixed-term contracts in early research careers?
– Access & Participation: who gets to do research? Can everyone fully participate as is expected of them?
– Challenging interpersonal and group dynamics: how do we support researchers who are struggling with difficult research dynamics? How do we support leaders to change?
– Time & space: how do we ensure people have the time and space to embody and enact good research culture?
This year the Research Steering Committee, which oversees the work, is expecting to allocate between £600,000 and £700,000 to facilitate research culture activities around the University. It will also contact individual departments to better understand the concerns they have around research culture. If you would like to be involved, please contact the research culture team.
For more about the event, including a gallery of images, see the Staff Hub (Cambridge users only; University login required).
The first wiring diagram of every neuron in an adult brain and the 50 million connections between them has been produced for a fruit fly.
This landmark achievement has been conducted by the FlyWire Consortium, a large international collaboration including researchers from the University of Cambridge, the MRC Laboratory of Molecular Biology in Cambridge, Princeton University, and the University of Vermont. It is published today in two papers in the journal Nature.
The diagram of all 139,255 neurons in the adult fly brain is the first of an entire brain for an animal that can walk and see. Previous efforts have completed the whole brain diagrams for much smaller brains, for example a fruit fly larva which has 3,016 neurons, and a nematode worm which has 302 neurons.
The researchers say the whole fly brain map is a key first step to completing larger brains. Since the fruit fly is a common tool in research, its brain map can be used to advance our understanding of how neural circuits work.
Dr Gregory Jefferis, from the University of Cambridge and the MRC Laboratory of Molecular Biology, one of the co-leaders of the research, said: “If we want to understand how the brain works, we need a mechanistic understanding of how all the neurons fit together and let you think. For most brains we have no idea how these networks function.
“Flies can do all kinds of complicated things like walk, fly, navigate, and the males sing to the females. Brain wiring diagrams are a first step towards understanding everything we’re interested in – how we control our movement, answer the telephone, or recognise a friend.”
Dr Mala Murthy from Princeton University, one of the co-leaders of the research, said: “We have made the entire database open and freely available to all researchers. We hope this will be transformative for neuroscientists trying to better understand how a healthy brain works. In the future we hope that it will be possible to compare what happens when things go wrong in our brains, for example in mental health conditions.”
Dr Marta Costa from the University of Cambridge, who was also involved in the research, said: “This brain map, the biggest so far, has only been possible thanks to technical advances that didn’t seem possible ten years ago. It is a true testament to the way that innovation can drive research forward. The next steps will be to generate even bigger maps, such as a mouse brain, and ultimately, a human one.”
The scientists found that there were substantial similarities between the wiring in this map and previous smaller-scale efforts to map out parts of the fly brain. This led the researchers to conclude that there are many similarities in wiring between individual brains – that each brain isn’t a unique structure.
When comparing their brain diagram to previous diagrams of small areas of the brain, the researchers also found that about 0.5% of neurons have developmental variations that could cause connections between neurons to be mis-wired. The researchers say it will be important to understand, through future research, if these changes are linked to individuality or brain disorders.
3D rendering of all ~140k neurons in the fruit fly brain. Credit: Data source FlyWire.ai; Rendering by Philipp Schlegel (University of Cambridge/MRC LMB).
A whole fly brain is less than one millimetre wide. The researchers started with one female brain cut into seven thousand slices, each only 40 nanometres thick, that were previously scanned using high resolution electron microscopy in the laboratory of project co-leader Davi Bock at Janelia Research Campus in the US.
Analysing over 100 terabytes of image data (equivalent to the storage in 100 typical laptops) to extract the shapes of about 140,000 neurons and 50 million connections between them is too big a challenge for humans to complete manually. The researchers built on AI developed at Princeton University to identify and map neurons and their connections to each other.
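The scale of the reconstruction problem can be checked with quick arithmetic from the figures above (a rough sketch; the per-neuron figures are averages over a highly non-uniform dataset):

```python
# Back-of-envelope scale of the FlyWire dataset, using figures from the article.
neurons = 139_255
connections = 50_000_000
data_bytes = 100e12                        # ~100 terabytes of image data

conns_per_neuron = connections / neurons   # ~359 synaptic connections per neuron
bytes_per_neuron = data_bytes / neurons    # ~0.7 GB of raw imagery per neuron

# The brain was cut into 7,000 slices, each 40 nanometres thick:
slice_stack_mm = 7_000 * 40e-9 * 1_000     # ~0.28 mm, within the <1 mm brain width
```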
However, the AI still makes many errors in datasets of this size. The Princeton University researchers established the FlyWire Consortium – made up of teams in more than 76 laboratories and 287 researchers around the world, as well as volunteers from the general public – which spent an estimated 33 person-years painstakingly proofreading all the data.
Dr Sebastian Seung, from Princeton University, who was one of the co-leaders of the research, said: “Mapping the whole brain has been made possible by advances in AI computing – it would have not been possible to reconstruct the entire wiring diagram manually. This is a display of how AI can move neuroscience forward. The fly brain is a milestone on our way to reconstructing a wiring diagram of a whole mouse brain.”
The researchers also annotated many details on the wiring diagram, such as classifying more than 8,000 cell types across the brain. This allows researchers to select particular systems within the brain for further study, such as the neurons involved in sight or movement.
Dr Philipp Schlegel, the first author of one of the studies, from the MRC Laboratory of Molecular Biology, said: “This dataset is a bit like Google Maps but for brains: the raw wiring diagram between neurons is like knowing which structures on satellite images of the Earth correspond to streets and buildings. Annotating neurons is like adding the names for streets and towns, business opening times, phone numbers and reviews to the map – you need both for it to be really useful.”
Simulating brain function
This is also the first whole brain wiring map – often called a connectome – to predict the function of all the connections between neurons.
Neurons use electrical signals to send messages. Each neuron can have hundreds of branches that connect it to other neurons. The points where these branches meet and transmit signals between neurons are called synapses. There are two main ways that neurons communicate across synapses: excitatory (which promotes the continuation of the electrical signal in the receiving neuron), or inhibitory (which reduces the likelihood that the next neuron will transmit signals).
Researchers from the team used AI image scanning technology to predict whether each synapse was inhibitory or excitatory.
Dr Gregory Jefferis added: “To begin to simulate the brain digitally, we need to know not only the structure of the brain, but also how the neurons function to turn each other on and off.”
“Using our data, which has been shared online as we worked, other scientists have already started trying to simulate how the fly brain responds to the outside world. This is an important start, but we will need to collect many different kinds of data to produce reliable simulations of how a brain functions.”
Associate Professor Davi Bock, one of the co-leaders of the research from the University of Vermont, said: “The hyper-detail of electron microscopy data creates its own challenges, especially at scale. This team wrote sophisticated software algorithms to identify patterns of cell structure and connectivity within all that detail.
“We now can make precise synaptic level maps and use these to better understand cell types and circuit structure at whole-brain scale. This will inevitably lead to a deeper understanding of how nervous systems process, store and recall information. I think this approach points the way forward for the analysis of future whole-brain connectomes, in the fly as well as in other species.”
This research was conducted using a female fly brain. Since there are differences in neuronal structure between male and female fly brains, the researchers also plan to characterise a male brain in the future.
The principal funders were the National Institutes of Health BRAIN Initiative, Wellcome, Medical Research Council, Princeton University and National Science Foundation.
Every other person will experience a mental health difficulty at some point in their life. The causes are complex, but treatment options are not – and in half of patients they just don’t work.
A new network of researchers at Cambridge aims to revolutionise mental healthcare by probing the processes underlying the symptoms.
Nobody needs reminding that just a few years ago we were all plunged into a state of maximal uncertainty.
We didn’t know what was going on, we couldn’t predict what would happen next, and the lockdowns were completely disruptive to normal life.
“Difficulty in responding to uncertainty lies at the core of many mental health difficulties, and it’s very telling that since the pandemic there’s been a twenty-five percent global increase in people diagnosed with depression and anxiety,” says Rebecca Lawson, Professor of Neuroscience and Computational Psychiatry in the University of Cambridge’s Department of Psychology.
Drug treatments and therapies for depression and anxiety do exist – but they’re only effective in 50% of people so it’s a ‘try and see’ approach, often with side-effects along the way. The problem, says Lawson, is that mental health conditions are diagnosed by the symptoms people experience, because there’s no other way.
“We don’t have blood tests or brain scans that will give us an indication of whether you have a mental health condition or not,” says Lawson. “We just have a reference manual listing behavioural symptoms – if you have enough of them, you get your diagnosis and a plan of action.”
This approach to mental health considers the symptoms to be the condition – low mood as part of depression, for example. This is very different from physical health conditions where a symptom like a cough could be caused by many things, from a common cold, to asthma, to cancer – each needing a very different treatment.
“We’re lacking that mechanistic understanding of the different routes to causing the symptoms of mental health conditions,” says Lawson, adding: “That means we can’t predict who treatments will be effective for – and for the most part we don’t actually know how the treatments work.”
Personalising the approach
Fingerprint. Credit: Andriy Onufriyenko/Getty
Mental health problems are complex, and the idea of a ‘one size fits all’ treatment is outdated. Lawson compares the current state of understanding to cancer research twenty years ago: “We had a very poor understanding of the different mechanisms that could cause breast cancer, for example,” she says.
With greater understanding, cancer treatment has moved to a precision medicine approach where treatment is often tailored to the individual. Lawson wants to achieve the same for mental health.
She’s creating computational models of the behaviour of people with mental health conditions – breaking it down into its constituent parts.
The aim is to be able to assess someone to produce their unique ‘computational fingerprint’ – resulting in a personalised approach to treatment based on the underlying cause of their symptoms.
“People with depression have a tendency to put negative interpretations on things, and there are lots of reasons why this might be happening,” she says.
“Maybe at the visual end you see things differently, or maybe you have difficulty perceiving positive events in the world, or maybe you’re not updating your beliefs in response to your experiences. The idea is that by trying to get closer to the mechanisms that drive the behaviour, we might be able to actually understand how the treatments work, and who they work for.”
Professor Rebecca Lawson. Credit: Jacqueline Garget
A large, competitively-won Wellcome Mental Health Award is now allowing Lawson and her team to investigate the mechanisms underlying depression and anxiety – in particular, how people process uncertainty.
She wants to see how two different treatment approaches – antidepressant medication and Cognitive Behavioural Therapy – change a person’s computational fingerprint, and change their intolerance of uncertainty.
“We’ll do a head-to-head trial of these two different treatments to understand how they’re different mechanistically – the hope being that we could then use knowledge of the underlying mechanism to guide the most effective treatment approach on a personalised basis.”
Targeting memories
Brainwaves. Credit: Sean Gladwell/Getty
We tend to talk about anxiety and depression much more openly since the pandemic, but this openness doesn’t yet extend to all mental health conditions – and that can mean many people still don’t seek the support they need.
“There’s much less stigmatisation now around saying that you have an anxiety disorder or depression than there used to be,” says Amy Milton, Professor of Behavioural Neuroscience in the University of Cambridge’s Department of Psychology, “but there’s still a strong stigma attached to drug addiction, which in my opinion is unfair, because it’s also largely driven by biological mechanisms.”
Professor Amy Milton. Credit: Jacqueline Garget
Milton is studying disorders including drug addiction, post-traumatic stress disorder (PTSD) and obsessive-compulsive disorder, all of which seem to be driven, at least in part, by malfunctions in emotional memories – those where our brain links an emotional response to an experience.
We’re more likely to remember emotionally charged images, like someone shouting at us, than neutral ones. But when emotional memories are formed under very traumatic circumstances, this can leave lasting damage – even years later, when there’s no longer any danger, particular stimuli can trigger the memories and cause the same strong emotional response. This is what happens in PTSD; Milton is trying to work out what’s going on in a bid to stop it. She says:
“Our memories aren’t fixed – we know they drift and change, and under the right conditions they can be updated. If emotional memories contribute to the persistence of PTSD, can we target them in some way?”
Memories are very stable when they’re in an inactive state not being used, and they switch to an active state as we do something that uses them. As they move between the two, they destabilise.
“Our idea is that if we give a person with PTSD a reminder of their trigger stimulus to activate the associated emotional memory, together with a drug that blocks that memory from restabilising, then the memory will disappear,” says Milton. “We already know it works in rats. If we could get rid of a person’s distressing emotional reaction without them forgetting the event, that’s really exciting.”
Milton has found that problems with emotional memory are also involved, in different ways, in drug addiction and obsessive-compulsive disorder. While they are just one component of all three disorders, she says that targeting the mechanism of the emotional memories could become an important part of wider treatment packages.
“It may be that the same mechanisms are affected in different mental health disorders,” she says. “Our ideal approach would be to try and work out which symptoms are causing problems for any individual patient, and treat the processes that give rise to those symptoms. Similar to Rebecca’s approach this is more flexible and personalised, and should have much better outcomes.”
Strength in numbers
Credit: Yuichiro Chino/Getty
With mental health issues projected to be one of the world’s biggest causes of ill health by 2030, there’s no time to lose. Lawson and Milton are co-leads of a new mental health research network at the University, bringing together experts across disciplines to address the challenge from all angles.
Animal models are vital to this, because using them allows complex processes to be modelled in much simpler ways, with findings then translated to humans.
The network will bring in people with experience of the mental health conditions being studied, so that their perspectives can inform research as ideas are being developed.
This ‘lived experience’ is considered so vital to making progress that Lawson worked with the University’s Bioscience Impact Team to develop new practical guidelines, and set up a funding scheme, to enable researchers across the network to incorporate the approach.
The aim is to turbocharge basic biomedical research like theirs, to drive a vastly improved approach to tackling mental health.
“I genuinely believe that we need a paradigm shift to make progress in mental health, and it feels like a tractable problem,” says Lawson, adding: “By taking a step back to focus on the basic science from this mechanistic angle, I think we can do this.” Milton agrees:
“With our whole network focused on the challenge from a huge diversity of perspectives, I genuinely think we can move towards a future of precision psychiatry and vastly improved treatment options.”
A rare collection of 17th-century petitions gives voice to England’s early foster carers as they fought for their rights
Today, the UK faces a major retention and recruitment crisis in foster care, and carers in different parts of the country continue to campaign for higher funding.
Having studied the experiences of foster carers in the 17th century, Cambridge historian Emily Rhodes argues that these struggles have a long history and that England’s early foster carers had more authority than we might expect.
Rhodes, a researcher at Christ’s College, Cambridge, studied a rare collection of surviving petitions submitted to the Lancashire quarter sessions courts between 1660 and 1720.
In a study published in The History of the Family journal, Rhodes reveals the experiences of thirty-eight women who cared for non-kin children for their parish. Traditionally, this work has been called ‘boarding’ or ‘tabling’ but Rhodes says:
“There are very clear similarities between then and now and we should view these women as early foster carers. People in authority looked at family situations and judged whether it was appropriate for a child. When they decided it wasn’t, they sought to place them in a new home, ideally with somebody from their local community, and they compensated this person to look after the child.”
“These women provided such a vital role that when they weren’t paid enough or at all, they had enough authority to approach their county justices, powerful men, and successfully argue their case.”
“Today’s foster carers and the rest of society should know that even 350 years ago this role was essential and respected in society, and that women had power in the system. Every social safety net relies on determined individuals – we all need to remember that.”
Most of the women Rhodes encountered in the petitions would have been entitled to poor relief in their own right. In the 17th century, the Old Poor Laws supported a system of relief in England which saw parishioners contributing to a local pot of funds which churchwardens and overseers of the poor allocated to the needy in the parish.
Some needy or orphaned children were made apprentices but others were placed with a woman in the community, typically a widow or a mother, but sometimes unmarried women. For this work, women expected to receive payment from the parish. As a result, they were both recipients of poor relief and administrators of the poor laws.
Emily Rhodes said: “These petitions give voice to some of the most inaccessible women in history. They’ve left a very small footprint but they played a crucial role in society.”
17th-century parish orphans
Emily Rhodes
Taking on authority
In their petitions, women often accused their local authorities of mismanagement and dishonesty. Three quarters petitioned because they were not receiving the promised rate and nearly one third requested an increase in pay. None of the petitions were rejected but Rhodes cautions that fewer failed petitions may have survived.
Rhodes said: “The state needed to keep these carers satisfied so the justices, the higher authority, overwhelmingly sided with them and opposed mistreatment by local authorities.”
In 1690s Preston, Alice Brewer of Lea battled her parish for years as they cut and withheld payments to care for Anne Helme, ‘a poor distressed child’ who had lived with her for 14 years. Alice complained that ‘the town was pleased to differ & wrangle with your poor petitioner and to lessen and altogether deny the payment’.
In one petition she argued that the overseers’ refusal ‘to provide clothes or other necessaries’ for Anne had caused her to become lame. By 1700, the parish owed Alice for three years of care, leaving her ‘very poor’.
The justices repeatedly ordered the overseers to pay their debts but they repeatedly failed to do so. How the battle ended we do not know.
More authority than biological mothers
Rhodes, who has just completed a PhD on petitioning by mothers in England and Wales from 1660–1720, found that foster mothers had significant advantages over biological mothers when dealing with the authorities.
“When birth mothers petitioned they had to prove, in a grovelling and pitiful tone, that they were among the deserving poor,” Rhodes said.
“They had to describe the impact of being a widow, of having a disabled husband or having a very sick child. But for foster mothers, it was enough to say ‘I’m supposed to be paid for this and you’re not fulfilling your part of the deal’.”
A poor woman surrounded by her children, engraving after Jean-Baptiste Marie Pierre (1746)
Paul Sandby, An old woman (18th century)
For love or money?
Fostering in the 17th century provided poor women and their own families with vital income. The standard rate of payment for one child was around 40 shillings a year, but sums ranged from 12 to 78 shillings. This far exceeded average poor relief payments at the time.
Caring for these children was work and some women may have viewed the role in mostly or purely financial terms. In many petitions, however, female carers did express a strong sense of benevolence and compassion towards the children they were caring for.
In 1671, Anne Beesley told the justices that she had taken in three destitute children from Barton out of pity, fearing ‘they should be starved to death’. Anne claimed that she had expected the authorities to ‘provide for them’ within three weeks but this turned into eight weeks and Anne was only reimbursed after petitioning.
Petitioners often pointed out that they had continued to care for children despite not being paid for months. In the 1670s, Elizabeth Drinkwater reported that the overseers of Great Bolton had failed to pay her for 9 months to care for Ann Reade, but that she had ‘kept the said child with all things necessary’ and had spent 6 shillings on clothes so she was ‘much impoverished’.
“It’s hard to imagine that some women didn’t feel some regard for these children,” said Rhodes. “Many would have known them before they took them in. But petitions were carefully crafted arguments and might not necessarily record true feelings.”
In some petitions, carers threatened to end care and withdraw their services if they did not obtain their desired outcome.
One of the most distressing cases concerns Ellen Fell. In 1665, Ellen told the justices they needed to ‘confirm the said yearly annuity or otherwise the child is very like to be famished & starved’. She told them she had children of her own and had submitted several other petitions about her own family’s neediness. Ellen presented herself as a selfless maternal provider but by the time the court considered her petition, the child had been ‘already turned out of doors and lays in the streets’.
“It is very easy to see disorder in the past,” Rhodes said. “The records show us when things weren’t functioning properly. When a carer was being paid properly, we’re unlikely to find her.”
“Look at the news in 2024 and you will see stories of foster carers not receiving enough support and leaving the system. We still face issues with bureaucracy and people in authority not doing their jobs properly.”
York Museums Trust: Jan Steen, Woman feeding a child (17th-century painting). Yale Center for British Art, Yale Art Gallery Collection: William Baillie after Mathieu Le Nain, Parish Orphans (1770); Paul Sandby, An old market woman (undated watercolour). The Metropolitan Museum of Art, New York: Stefano della Bella, Women and children (1649). Wellcome Collection: N. de Larmessin III, after Jean-Baptiste Marie Pierre, A poor Savoyard woman surrounded by her children (1746). Emily Rhodes: Emily Rhodes
Cambridge has once again been named as the most intensive science and technological cluster in the world, according to a new report ranking innovation around the globe.
It’s great to see this continued recognition of Cambridge as the world’s most intensive science and technological cluster. With its exceptional research and science, people and partners, companies and commitment, Cambridge drives innovation that fuels local, national, and global growth, tackling global challenges and delivering life-changing impact.Diarmuid O’Brien
The Global Innovation Index (GII) 2024 – which captures the innovation ecosystem performance of 133 economies and tracks the global innovation trends – has ranked Cambridge as the world’s leading science and technological (S&T) cluster by intensity, in relation to its size, for the third consecutive year. San Jose–San Francisco (USA) was named second, unchanged from the 2023 Index, with Eindhoven (Kingdom of the Netherlands) third.
S&T clusters are established by analysing patent-filing activity and scientific article publication relative to population, and documenting the geographical areas around the world with the highest density of inventors and scientific authors.
According to the Index, the Cambridge cluster filed 6,379 Patent Cooperation Treaty (PCT) patent applications and published 35,000 scientific articles, both per 1 million inhabitants, over the past 5 years.
The University of Cambridge sits at the heart of this cluster, powering world-leading research, driving a vibrant innovation ecosystem, and cultivating a thriving environment for collaboration, services and investment. The University contributes nearly £30 billion to the UK economy annually, including over £23 billion from commercialisation and innovation activities.
According to the Global Innovation Index 2024: “S&T clusters – which can be entire regions or cities – serve as the backbone of a robust national innovation ecosystem. Situated in areas such as San Francisco’s Silicon Valley, Cambridge, Munich and Paris in Europe, or Bengaluru, Seoul, Shenzhen and Tokyo in Asia, these S&T clusters are home to renowned universities, brilliant scientists, R&D-intensive companies, and prolific inventors. It is the collaboration among these entities that results in the groundbreaking scientific advancements.”
Earlier this year, a report by Dealroom revealed that the Cambridge tech ecosystem has a combined value of $191 billion, representing 18% of the entire UK’s tech ecosystem and reinforcing Cambridge’s reputation as Europe’s deep tech leader.
Dr Diarmuid O’Brien, Pro-Vice-Chancellor for Innovation, University of Cambridge, commented:
“It’s great to see this continued recognition of Cambridge as the world’s most intensive science and technological cluster. With its exceptional research and science, people and partners, companies and commitment, Cambridge drives innovation that fuels local, national, and global growth, tackling global challenges and delivering life-changing impact.”
Sir David Attenborough spoke of how he feels during visits to the Cambridge Conservation Initiative (CCI) when he stopped by the CCI conservation campus at the University of Cambridge this week.
Sir David said of visiting CCI that he felt “an undercurrent of joy” whenever he came to the conservation campus, which is housed in the building bearing his own name.
The campus was opened in 2016 and is the first of its kind, with over 500 conservation professionals and researchers, from 10 different organisations and the University of Cambridge, all collaborating to stop the biodiversity crisis and build more hopeful futures for people and nature.
Multi-disciplinary archaeological survey at the site of Oued Beht, Morocco, reveals a previously unknown 3400–2900 BC farming society, shedding new light on North Africa’s role in Mediterranean prehistory.
For over thirty years I have been convinced that Mediterranean archaeology has been missing something fundamentalProf Cyprian Broodbank
Archaeological fieldwork in Morocco has discovered a previously unknown farming society, dated to 3400–2900 BC, from a poorly understood period of north-west African prehistory. It is the earliest and largest agricultural complex yet found in Africa beyond the Nile.
This study, published in the journal Antiquity, reveals for the first time the importance of the Maghreb (north-west Africa) in the emergence of complex societies in the wider Mediterranean during the fourth and third millennia BC.
With a Mediterranean environment, a border with the Sahara desert and the shortest maritime crossing between Africa and Europe, the Maghreb is perfectly located as a hub for major cultural developments and intercontinental connections in the past.
Whilst the region’s importance during the Palaeolithic, Iron Age and Islamic periods is well known, there is a significant gap in knowledge of the archaeology of the Maghreb between c. 4000 and 1000 BC, a period of dynamic change across much of the Mediterranean.
To tackle this, a team of archaeologists led by Prof Cyprian Broodbank from the University of Cambridge, Prof Youssef Bokbot from INSAP, and Prof Giulio Lucarini from CNR-ISPC and ISMEO, have carried out collaborative, multidisciplinary archaeological fieldwork at Oued Beht, Morocco.
“For over thirty years I have been convinced that Mediterranean archaeology has been missing something fundamental in later prehistoric north Africa,” said Broodbank. “Now, at last, we know that was right, and we can begin to think in new ways that acknowledge the dynamic contribution of Africans to the emergence and interactions of early Mediterranean societies.”
“For more than a century the last great unknown of later Mediterranean prehistory has been the role played by the societies of the Mediterranean’s southern, African shores west of Egypt,” say the authors of the new study. “Our discoveries prove that this gap has been due not to any lack of major prehistoric activity, but to the relative lack of investigation and publication. Oued Beht now affirms the central role of the Maghreb in the emergence of both Mediterranean and wider African societies.”
These results reveal that the site was the largest agricultural complex from this period in Africa outside of the Nile region. All of the evidence points to the presence of a large-scale farming settlement—similar in size to Early Bronze Age Troy.
The team recovered unprecedented domesticated plant and animal remains, pottery and lithics, all dating to the Final Neolithic period. Excavation also revealed extensive evidence for deep storage pits.
Importantly, contemporaneous sites with similar pits have been found on the other side of the Strait of Gibraltar in Iberia, where finds of ivory and ostrich egg have long pointed to African connections. This suggests that the Maghreb was instrumental in wider western Mediterranean developments during the fourth millennium BC.
Oued Beht and the north-west Maghreb were clearly integral parts of the wider Mediterranean region. As such, these discoveries significantly change our understanding of the later prehistory of the Mediterranean and Africa.
As the authors of the Antiquity article state: “It is crucial to consider Oued Beht within a wider co-evolving and connective framework embracing peoples on both sides of the Mediterranean-Atlantic gateway during the later fourth and third millennia BC – and, for all the likelihood of movement in both directions, to recognise it as a distinctively African-based community that contributed substantially to the shaping of that social world.”
A species of tropical tree snail is no longer extinct in the wild following a successful reintroduction project.
Very few animal species have been re-established back in the wild so this is a fantastic achievement for the programme – the fruit of a vast amount of work.Justin Gerlach
A global conservation effort to reintroduce a tiny snail to the wild is celebrating a momentous milestone: for the first time in 40 years, conservationists have found born-in-the-wild adult Partula tohiveana – meaning the precious molluscs have successfully established themselves in French Polynesia.
This year Cambridge’s Dr Justin Gerlach helped restore over 6,000 of the snails to Moorea, their French Polynesian island home, as part of an annual reintroduction of zoo-bred ‘Extinct in the Wild’ and ‘Critically Endangered’ snail species – carried out through collaboration with zoos around the world.
During their work the team found unmarked Partula tohiveana: proof that previously reintroduced snails have successfully bred in the area.
The momentous discovery means Partula tohiveana can now be considered established – an incredibly rewarding result for 40 years of dedication and collaboration. Conservationists will now begin the process of downlisting the snails from ‘Extinct-in-the-Wild’ to ‘Critically Endangered’ on the IUCN’s Red List.
Very few species have been successfully reintroduced after becoming completely extinct in the wild, and this is the first invertebrate species for which it has been achieved.
Ten species and sub-species of the tropical snails, reared at London Zoo, Bristol Zoological Society, Detroit Zoological Society, Marwell Wildlife, the Royal Zoological Society of Scotland, Saint Louis Zoo, Sedgwick County Zoo, Woodland Park Zoo and Zoo Schwerin, travelled more than 15,000km to Tahiti at the beginning of September. Before making the two-day journey to the islands of Tahiti, Moorea and Huahine, the incredibly rare snails, which each measure a tiny 1-2cm in length, were individually counted and marked with a dot of red UV reflective paint. The ‘snail varnish’ glows under UV torchlight, helping conservationists in the field to spot and monitor the nocturnal snails at night, when they’re most active.
London Zoo’s Senior Curator of Invertebrates, Paul Pearce-Kelly, who leads the Partula conservation programme, said: “Though little, these snails have great cultural, scientific and conservation value. Partula snails have always been part of Polynesia’s rich cultural heritage and play an important role in the ecological health of their forest habitats. They’ve also been studied for over a century for the insights they give into how species evolve in isolated environments. Most recently, they’re providing a valuable conservation model for helping hundreds of endangered island species.”
He added: “This collaborative conservation effort is playing a crucial role in saving these species from extinction. It’s a powerful example of how conservation zoos can combat biodiversity loss. At a time when nature faces unprecedented challenges, these small snails are a symbol of hope for global wildlife.”
Partula snails – also known as Polynesian tree snails – eat decaying plant tissue and fungi, so play an important role in maintaining forest health. Returning these rare snails back to the wild helps to restore the ecological balance in these islands.
Dr Justin Gerlach of Peterhouse, University of Cambridge and an Academic Associate at the University’s Museum of Zoology, said: “Discovering wild-born adult snails was a great moment. Very few animal species have been re-established back in the wild so this is a fantastic achievement for the programme – the fruit of a vast amount of work.”
Conservation zoos are working with the French Polynesian Government’s Direction de l’environnement, to save Partula snails from extinction. In the 1980s and early 1990s, these snails faced a critical threat after the invasive rosy wolf snail (Euglandina rosea) was introduced to control the African giant land snail (Lissachatina fulica). Unfortunately, the predatory species targeted the native snails instead, leading to the extinction or near-extinction of many Partula species across the region.
In the early 1990s, the last remaining individuals of several Partula species were rescued by London and Edinburgh Zoos, launching an international conservation breeding programme. This collaboration between 15 zoos cares for 15 species and subspecies, most of which are classified as ‘Extinct-in-the-Wild’. These rescued snails, along with those already being studied at universities in the UK and North America, became the foundation for reintroducing the species back onto their native island homes.
Paul said: “After decades of caring for these species in conservation zoos and working with the Direction de l’environnement to prepare the islands, we started reintroducing Partula snails back into their lowland tropical forests almost 10 years ago. Since then, we’ve reintroduced over 30,000 snails, including 10 Extinct-in-the-Wild species and subspecies, with this year’s release being the largest so far, thanks to our international team and collaborators, including mollusc specialist Dr Justin Gerlach of Peterhouse, University of Cambridge.”
London Zoo’s coordination of the Partula snail reintroduction project is made possible due to funding from supporters including the Players of the People’s Postcode Lottery, who have enabled London Zoo to continue bringing species back from the brink of extinction.
Adapted from a press release by the Zoological Society of London.
Arup is a global engineering and sustainable development firm with designers, consultants and experts working across 140 countries.
Arup and Cambridge have been working together since the 1960s.
Today, the partnership is focused on harnessing new digital technologies to build a more sustainable world.
Together we have:
worked on 30+ research collaborations, in engineering, architecture, geography, land economy, earth sciences and maths
co-authored 60+ academic papers
changed the way professionals are taught
and developed new technologies that make our world safer and more sustainable.
Some of this work has been supported by the Ove Arup Foundation, a charity dedicated to promoting its founder’s philosophy of ‘total design’.
Arup and Cambridge have been working together for more than 60 years and the relationship between Cambridge and the Ove Arup Foundation goes back to the early 1990s.
Talk to anyone involved and it quickly becomes clear why these partnerships have proved so fruitful and enduring.
Both Arup and Cambridge attract what Dame Jo da Silva, Arup’s Global Director of Sustainable Development, describes as “incredibly smart people”, all of whom are intent on bringing about positive change across the sector through new technologies and a better understanding of how they can be applied.
Collaboration and a multidisciplinary approach underpin everything the partners undertake, whether it’s solving research problems or transforming how professionals are taught – in a way that has been hugely influential around the world.
Here are just a few of the ways in which the Cambridge-Arup partnership is helping to build a more sustainable world.
“At the time, sensors were being widely used in sectors such as aerospace, so that engineers knew how every bit of their assets were performing at all times.”
“However, the same could not be said of the world of infrastructure. Kenichi Soga (now a professor at University of California, Berkeley) and I could see that these sensing technologies would allow us to understand how vital assets are performing – and if remedial action needs to be taken.”
In 2011, CSIC was established in Cambridge with funding from Innovate UK, EPSRC and 28 industry partners, including prime mover and supporter, Arup.
From it emerged new methods and techniques to use distributed fibre optic sensing (DFOS) for monitoring civil infrastructure. DFOS transformed the way changes in strain and temperature are measured, making it possible to assess the engineering performance of structures such as foundation piles, tunnels, retaining walls, pipelines and bridges.
The development of DFOS was a key pillar of the new Centre’s activities, and one in which Arup had a key role. Mair said: “As a leader in the field, Arup has always been wedded to the importance of measurement and they have been hugely instrumental in both the development and deployment of the DFOS technology.”
Jennifer Schooling, former Director of CSIC (now at Anglia Ruskin University), explained why working with Arup was so instrumental in the development of DFOS.
“Doing good science is one thing, but we needed help to turn it into something that’s useable by industry. That’s where Arup came in, helping us to codify how DFOS should be used in practice and working with their clients to show them its benefits. The intellectual ‘oomph’ was very much a joint effort between Arup and Cambridge.”
Bank Station
Bank is one of London’s busiest underground stations. By 2016, around 98 million passengers a year were using it and not finding it a pleasant experience. To alleviate the problem, Transport for London decided to increase the station’s capacity by 40%. This meant building new tunnels under some of the capital’s busiest streets.
Mair explained: “It became clear that the tunnelling team would need to cut through some of the foundation piles supporting a large office block.”
“DFOS meant we could tell what was going on as the construction team worked to sever the piles and replace them with concrete supports that would ensure the building stayed safe.”
Exposed under-ream pile prior to cutting
Sure enough, it worked. The team was able to monitor the strain induced in the piles, as well as the settlement of the building – which was minimal. The work was carried out safely, with no structural repercussions.
A mathematical bridge
In 2021, the world’s first 3D-printed steel bridge was unveiled in Amsterdam by Queen Máxima of the Netherlands.
Mark Girolami, who became the Sir Kirby Laing Professor in Civil Engineering on the retirement of Lord Mair and is chief scientist of the Alan Turing Institute, explained how he got involved.
“Arup was brought in to do the structural design and it soon became clear that because the engineers were working with what was effectively a new material and a new way of building, the standard design tools weren’t working.”
“We realised we were going to need new mathematics, new statistics and new computing. This led to an accelerated research programme that resulted in the redesign of these tools from a theoretical level all the way through to actual deployment.”
“Following on from that, the Royal Academy of Engineering awarded a fellowship to an Arup scientist, Ramaseshan Kanan, so that he could come to Cambridge and continue work on the development of tools that will be needed as 3D printing is used more widely for building and to monitor the performance of existing structures.”
As well as being an exciting testbed for new materials and construction techniques, the other ambition for the 3D bridge was to show that it is possible to measure the performance of a structure over time.
Girolami and his team created a digital twin of the bridge, fed by a hundred sensors attached to the actual structure. This enables engineers to monitor strain, movement, vibrations and weather conditions as people cross the bridge, assess how it is faring, and be alerted if problems occur or maintenance is required.
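In spirit, the alerting side of such a digital twin boils down to comparing live sensor readings against safe operating bounds. The sketch below is purely illustrative – the sensor names and thresholds are invented, and the real system used far richer statistical models than simple limit checks.

```python
# Minimal sketch of digital-twin style alerting, assuming hypothetical
# sensor names and safety thresholds (not the real bridge's values).

STRAIN_LIMIT = 500.0   # microstrain, hypothetical alert threshold
VIBRATION_LIMIT = 2.5  # m/s^2, hypothetical alert threshold

def check_readings(readings):
    """Return alert messages for any readings outside their safe bounds."""
    alerts = []
    for sensor, value in readings.items():
        if sensor.startswith("strain") and value > STRAIN_LIMIT:
            alerts.append(f"{sensor}: strain {value} exceeds {STRAIN_LIMIT}")
        if sensor.startswith("vib") and value > VIBRATION_LIMIT:
            alerts.append(f"{sensor}: vibration {value} exceeds {VIBRATION_LIMIT}")
    return alerts

# One strain sensor over its limit, one vibration sensor within bounds.
print(check_readings({"strain_01": 620.0, "vib_01": 1.1}))
```

In practice the monitoring would run continuously over streamed data, but the core pattern – map each reading to a bound, collect violations, notify engineers – is the same.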
For Kanan, the collaboration with Professor Girolami, and the connection into wider Cambridge and Turing communities has been invaluable for his RAEng Industrial Fellowship.
“We have been able to develop and apply algorithmic techniques to a range of problems within decarbonisation and resilience in the built environment.
“The engagement has shown a deep alignment between Cambridge and Arup to work on these themes and has given us a solid foundation on which we plan to build further collaborations.”
Understanding flood risk
“We’re facing a losing battle against rising sea-levels,” says Tom Spencer, Emeritus Professor of Coastal Dynamics, and Director of Cambridge’s Coastal Research Unit. “And we’re going to see more extremes as we move towards the end of the century.”
Being able to anticipate floods accurately is going to be increasingly important for governments and communities around the world.
“But until relatively recently”, explained Arup alumnus Mike Dobson (now the new energy sector lead for the marine environment at Crown Estate), “although we were trying to make good decisions, the tools at our disposal were pretty static. Lots of modelling, lots of economics presented in a report. We wanted to see what else was possible and bring it to life in a way that policymakers in particular could understand. Cambridge University managed to unblock the complex modelling which allowed us to do that.”
Hull was chosen as a pilot site for the new approach because of its geography and susceptibility, with around 100,000 properties potentially at risk of flooding.
Spencer explained, “Arup had been working there for decades, so they knew what data sets were around, and they had a very strong link to the Environment Agency in Hull.”
“The real power of the Cambridge–Arup approach here,” he continued, “is that it considers a huge number of potential options. So you can really play with all sorts of different things, such as the height of your flood defences, your storm surge conditions, your sea level. It just runs thousands of combinations of these things really pretty quickly. The computing power behind it is huge.”
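The combinatorial sweep Spencer describes can be sketched simply: enumerate every combination of defence height, storm surge and sea-level rise, and flag those where the combined water level overtops the defence. All figures here are hypothetical, for illustration only; the real model is far richer.

```python
# Illustrative scenario sweep (hypothetical numbers, not the Cambridge model).
from itertools import product

defence_heights = [4.0, 4.5, 5.0]        # metres
storm_surges    = [1.0, 1.5, 2.0, 2.5]   # metres above normal tide
sea_level_rise  = [0.0, 0.3, 0.6, 1.0]   # metres, by scenario year

BASE_TIDE = 2.5                          # assumed base tide level, metres

overtopped = [
    (h, s, r)
    for h, s, r in product(defence_heights, storm_surges, sea_level_rise)
    if s + r + BASE_TIDE > h             # water level exceeds the defence
]
# each entry is a (defence, surge, rise) combination at risk of flooding
```

The real tool runs thousands of such combinations through full hydrodynamic and economic models, but the exhaustive "play with all the options" structure is the same.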
The Cambridge modelling and the sea-level rise insights tool have given the Arup team new levels of data and new opportunities to improve insight for decision-makers.
Can it be applied in places beyond Hull? “Absolutely,” said Dobson, “wherever local levels of sea and land are known. We’ve worked with Cambridge to design the whole process to be replicable. We want it to be globally applicable.”
New skills for a changing world
Back in 1991, the Ove Arup Foundation organised what its former chair of trustees (and former Arup Group Chairman) Terry Hill described as “a landmark day-and-night workshop” at Madingley Hall in Cambridge to discuss the future of engineering education.
Out of that exhaustive discussion emerged a proposal from Cambridge for a new part-time master’s degree offered jointly by the Departments of Architecture and Engineering.
Hill explained: “Back then, architects and engineers tended not to work closely with one another. We wanted to change all that, so we funded the development of what became the Master’s in Interdisciplinary Design for the Built Environment (IDBE).”
When it started, it was the only course of its kind, teaching engineers alongside architects. Over time this expanded to bring in the whole gamut of disciplines involved in the built environment, including people with finance and investment backgrounds, surveyors, project managers, designers and planners.
As the environmental impact of the way we live our lives has become an increasingly stark reality, the course has evolved to focus on sustainability and resilience, under a new title, Sustainability Leadership for the Built Environment.
The Ove Arup Foundation is an educational charity set up in memory of Sir Ove Arup with the express purpose of bringing together – and advancing the knowledge of – the different disciplines involved in the creation of our built environment.
Now run by the Cambridge Institute for Sustainability Leadership (still in partnership with the Departments of Engineering and Architecture) it remains true to its original vision of bringing together different disciplines to develop and share their knowledge to build a better world.
Digital Cities for Change
“In 2015, everyone was talking about digitalisation and smart cities,” said Faith Wainwright, one of the Ove Arup Foundation’s current trustees. “At the time, we could see there were huge opportunities coming down the line, thanks to these new technologies, which could help us understand what really makes a city liveable and sustainable.
“But we wanted to understand how they could transform the way we develop and manage the built environment, not just from an engineering ‘hard-wired’ perspective, but also from the point of view of the people whose lives they would affect.
“The goal was always to establish an education programme for practitioners where this new knowledge could be taught in an interdisciplinary way, just like IDBE. But before we could get to that point we needed to understand what we were going to be teaching, so we started by funding research at CSIC, led by Jennifer Schooling.”
“One of the big challenges we were all very aware of,” said Schooling, “is that a lot of smart city initiatives seem to founder. A common factor is that they tend to be driven by technology rather than need, which means the companies implementing them weren’t properly considering how people would use them or how they would be delivered and managed over the long term.”
“Of course you need to understand what the digital technologies can and, crucially, can’t do but you also need to know how they will land.
“As a result of our research, we developed a digital innovation process map that guides you through planning and testing, while creating an enabling environment and embedding the outcomes.”
“One thing we have learnt is that you need to start with understanding the public value of what it is you are trying to deliver, then work out what the role of digital is in delivering that (if any), and then pilot it.”
All of Schooling and her team’s work has gone into designing a suite of new Digital Cities for Change education programmes, ‘Leadership of Urban Digital Innovation for Public Value’ (LeadUp) – a one-year certificate, a two-year diploma or a three-year master’s – aimed at a mix of people from the public and private sectors.
Developing sustainability leadership at Arup
Another path to driving change in the industry is through equipping leaders at Arup with the capabilities they need to deliver change through their work with clients.
This is where another Arup-Cambridge partnership comes into play. Around five years ago, Arup recognised the need for a bespoke programme that would give its senior leaders access to some of the latest thinking on sustainability in the built environment across a range of disciplines and help them apply that thinking in their work.
Dame Jo da Silva, Arup’s Global Director of Sustainable Development, explained: “When I took on this role I was trying to create transformative change in the firm, so that everything we do contributes to a sustainable future and a key part of that is strengthening Arup’s leadership on this issue.”
“The programme we developed with the Cambridge Institute for Sustainability Leadership was predicated on really rigorous research and an impressive diversity of contributors. We have been running the course for three years now and it has been genuinely game-changing.”
For CISL’s Programme Director, Elodie Cameron, the relationship with Arup is very much a two-way street. “Being able to work with a like-minded client to co-create and co-deliver a programme enables us to really drive change.”
Empowering the next generation of industry leaders
CSIC’s Early-Career Academics and Professionals Panel (ECAPP)
Ever mindful of the need to move the sector forward, Arup and Cambridge (and other key industry players) have collaborated on an initiative designed to give younger researchers and practitioners more of a voice.
Dr Lizzy Moyce, Research Development Manager at Arup, explained: “The idea is that between eight and ten of us representing different parts of the smart infrastructure and construction sector are able to bring a fresh perspective – as well as giving us the opportunity to network and develop our careers.”
Now in its second year, ECAPP has made a strong start. The panel is intended to complement CSIC’s steering committee, which provides mentoring for the various ECAPP workstreams.
ECAPP has run a series of workshops with senior leaders, at which Moyce says: “We’re challenging the views of those people who are holding the senior positions and saying ‘we don’t think that’s quite right from our perspective coming up through the industry’.”
The first ECAPP cohort
And the panel is already having an impact. Moyce and colleagues have submitted a paper to the Joint Board of Moderators, which oversees engineering courses in the UK. “We highlighted that a lot of digital content is not yet making its way into the curriculum, and on the back of that we’ve been invited to give a talk to the Institution of Civil Engineers (ICE).”
A call to action: addressing the climate emergency
Prior to COP26 in 2021, Jo da Silva was looking for a way to communicate to the industry as a whole the need to address the climate emergency not only by reducing carbon emissions but by tackling the entire system.
One of the ways she set about this was to publish Reduce, Restore, Remove: a call to action, written by Arup experts with a foreword by Professor Shaun Fitzgerald, Director of the Centre for Climate Repair. For Fitzgerald, working with companies such as Arup is key to bringing about change:
“It’s fantastic that Arup is taking a leadership role in addressing this kind of systemic change. If we are to make progress, we need to continue our work together to develop the solutions and take the urgent actions that are needed.”
As we celebrate more than six decades of partnership, the strong connections between the Ove Arup Foundation, Arup and Cambridge stand as a testament to the transformative power of collaboration and innovation in shaping the future of infrastructure.
As da Silva says: “Partnerships work when they are of mutual benefit. We can do things with Cambridge that we couldn’t do on our own. It works because there is strong cultural alignment between us.”
Cambridge Accelerator plan to cut global aviation emissions
Global aviation could be on a flight path to net zero if industry and governments reach just four goals by 2030, according to a new report from the University of Cambridge.
An ambitious five-year plan created by the Aviation Impact Accelerator (AIA), a project led by the University of Cambridge and hosted by the University’s Whittle Laboratory and the Cambridge Institute for Sustainability Leadership (CISL), sets out four Sustainable Aviation Goals over the next five years that could help the sector navigate to net zero emissions across the world by 2050.
“Aviation stands at a pivotal moment, much like the automotive industry in the late 2000s,” said Professor Rob Miller, Director of the Whittle Laboratory. “Back then, discussions centred around biofuels as the replacement for petrol and diesel – until Tesla revolutionised the future with electric vehicles. Our five-year plan is designed to accelerate this decision point in aviation, setting it on a path to achieve net zero by 2050.”
If the goals are not implemented immediately and achieved by 2030, the opportunity for transformation could slip away, leaving the world to face the escalating climate impacts of a rapidly growing aviation sector, which is projected to at least double its emissions by 2050.
Aviation is a major contributor to climate change, accounting for 2-3% of global CO2 emissions and 4% once the non-CO2 climate impacts are included.
Despite ambitious pledges from governments and industry, the aviation sector remains significantly off course in its efforts to achieve net zero by 2050. The report, titled Five Years to Chart a New Future for Aviation, outlines four actionable steps that must be initiated immediately and completed within five years if the sector is to get itself on track for that goal.
“Aviation stands at a pivotal moment. Our five-year plan is designed to accelerate this decision point in aviation, setting it on a path to achieve net-zero by 2050.”
Professor Rob Miller, Director of the Whittle Laboratory
A roadmap to net-zero aviation
Each of the four goals is specifically targeted to raise ambition in a particular area of aviation.
The first goal is to remove the clouds (contrails) formed by aviation. Speeding up the deployment of a global contrail avoidance system could reduce aviation’s climate impact by up to 40%. This would involve immediately creating experiments at the scale of whole airspace regions, so lessons can be learned in real environments – for example, by having aircraft change altitude in parts of the atmosphere where contrails are likely to form.
Around one in 30 flights produces a persistent contrail, a region of cloud that can trap heat and increase the climate impact of aviation. The climate impact of contrails from planes is estimated by some researchers to be about the same as the aviation industry’s total CO2 emissions, though there is scientific debate surrounding this estimation.
The second goal is to implement a new wave of policies aimed at unlocking system-wide efficiency gains across the existing aviation sector. This has the potential to halve fuel burn by 2050 by tapping into efficiency gains that individual companies can’t address.
The third goal is to reform Sustainable Aviation Fuel (SAF) policies to account for global biomass limits across all sectors while driving renewable electricity production. This would provide the market with the confidence needed to rapidly scale up SAF production and ensure its sustainability. The goal is to put in place the global policies required to minimise the wider impact of SAFs on climate and nature.
The final goal is to launch several moonshot technology demonstration programmes designed to rapidly assess the viability and scalability of transformative technologies, bringing forward the timeline for their deployment.
An example of this is long-haul hydrogen aircraft. The low weight of hydrogen fuel, even once the weight of the tanks is included, makes hydrogen advantageous for long-haul flight, and the introduction of hydrogen would remove CO2 emissions from flight.
Royal support for genuine change
King Charles on a visit to the Whittle Laboratory
The first post-coronation engagement for His Majesty The King was to convene a group of aviation industry CEOs, alongside senior Government representatives, to help work on the 2030 Goals. His Majesty visited the University of Cambridge in May 2023 and broke ground on a £58-million Whittle Laboratory facility while encouraging the acceleration of sustainable aviation.
His Majesty also spoke at the opening reception for COP28 and “urged us to continue to raise our ambitions in driving change in the aviation sector,” said Miller.
“In this age of disruption, we not only need new models, but we need new mindsets if we are to raise our ambitions and ensure, in the words of The King, that this is ‘a turning point towards genuine transformational action’.”
Professor Rob Miller, Director of the Whittle Laboratory
Global experts come together to solve big problems
The AIA sits in the Whittle Laboratory, one of the world’s biggest turbomachinery labs conducting research into the rapid development of ultra-low emission aircraft and low carbon power generation.
The AIA is a global initiative that brings together more than 100 experts from across the aviation industry to accelerate the sector’s transition to net-zero emissions. Its goal is to develop interactive tools and models that assist stakeholders—governments, industry leaders, and the public—in understanding and exploring pathways to sustainable aviation.
By focusing on technological innovation, policy development, and environmental impact, the AIA aims to speed up progress toward zero-emission flight.
Partners include Boeing, Rolls-Royce, the Royal Air Force, Stratos, Emirates, 4Air, Flexjet, the UK Department for Energy Security & Net Zero, the UK Department for Transport, Breakthrough Energy, the Sustainable Markets Initiative, MIT, the University of Melbourne, and University College London.
Sustainable flight possible
“The Aviation Impact Accelerator modelling has drawn on the best available evidence to show that there are major challenges to be navigated if we’re to achieve net zero flying at scale, but that it is possible,” said Eliot Whittington, Executive Director at Cambridge Institute for Sustainability Leadership. “With focus and a step change in ambition from governments and business we can address the hurdles, unlock sustainable flying and in doing so build new industries and support wider economic change.”
Combining screening for lung and kidney cancers – for both of which smoking is a risk factor – could help identify undiagnosed cases of kidney cancer, say researchers as they release the results from a study showing this approach is feasible and acceptable to participants.
Early detection of cancer allows the best chance of cure using effective treatments such as surgery. The UK has recently approved a screening programme for smokers at greatest risk of lung cancer. The programme makes use of lung computed tomography (CT) scans, which build up a detailed picture of the inside of an individual’s body by taking multiple x-rays.
Certain cancers, however, are relatively rare and standalone screening programmes are unlikely to be cost-effective. One such disease is kidney cancer. Kidney cancer is the ninth commonest cancer in men and 14th in women, and is largely curable if treated at an early stage. But almost nine in ten patients (87%) will have no symptoms at the stage when it is still curable.
As lung and kidney cancers share risk factors, Yorkshire Cancer Research, in partnership with experts at the University of Cambridge, established the Yorkshire Kidney Screening Trial to see whether screening for kidney cancer could take place at the same time as screening for lung cancer. The results are published in European Urology.
“Kidney cancer is curable if we catch it early enough, but it’s a largely silent disease at that stage, making it very difficult to spot.”
Professor Grant Stewart, University of Cambridge, Chief Investigator on the trial
“We know that smokers who are at high risk of lung cancer are also at increased risk of kidney cancer, so it makes sense to see if we can look for both conditions at the same time.”
Abdominal CT scans were offered to 4,019 ‘ever-smokers’ – that is, people who had smoked at some period in their life – aged 55-80 years old who were attending a lung cancer screening trial between May 2021 and October 2022.
Of those offered the additional abdominal scan, more than nine in 10 (93%) accepted. Of these, almost two-thirds (64%) were found to have normal abdominal scans, one in five (20%) required an imaging review but no further action, and 15% required further investigation at a clinical review.
One in twenty (5.3%) participants had a previously-undetected serious finding only seen on the abdominal CT scans, including kidney and other abdominal cancers, abdominal aortic aneurysms (a swelling in the artery that carries blood from the heart to the abdomen, which can be serious because they risk bursting) and kidney stones.
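As rough back-of-envelope arithmetic, the reported percentages translate into approximate participant counts. These are illustrative rounded estimates derived from the figures above; the paper's exact counts may differ slightly.

```python
# Approximate counts from the reported percentages (illustrative only).
offered  = 4019                       # ever-smokers offered the abdominal scan
accepted = round(offered * 0.93)      # ~93% took up the offer
# assuming the 5.3% with a serious finding is measured among those scanned:
serious  = round(accepted * 0.053)
```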
Professor Stewart added: “We were able to make use of an existing targeted screening study to ‘bolt-on’ an additional screening test. Patients were very receptive to be screened for several conditions, and this approach helped us identify serious findings in one in 20 participants that carried a real prospect of seriously threatening life span, or of having a substantial impact on their lives.”
A concern with any screening programme is the identification of incidental, non-serious lesions that do not require treatment but carry the risks associated with diagnosis and treatment, create unnecessary anxiety for these individuals, and potentially divert healthcare resources away from other conditions.
In the Yorkshire Kidney Screening Trial, a quarter of participants (25%) had non-serious findings. However, because the trial was set up to allow a robust clinical review of the radiological findings and clear lines of communication with associated specialities to determine if further tests or clinics were needed, only a third of these (8.5% of participants) had incidental findings that triggered further action in the form of further clinic appointments or investigations.
A sub-study published separately also showed that those with non-serious findings did not have lasting psychological, social or financial harms.
Speaking on behalf of the trial funder, Dr Stuart Griffiths, Director of Research at Yorkshire Cancer Research, said: “People with kidney cancer are often diagnosed at a late stage when treatment options are more limited. Screening people before they experience any symptoms means the kidney cancer can be found at a very early stage – enabling many people to receive life-saving treatment.”
“Adding an abdominal CT to the recently approved lung cancer screening programme provides a vital opportunity to improve early diagnosis and save thousands of lives in Yorkshire and across the UK.”
Dr Stuart Griffiths, Director of Research, Yorkshire Cancer Research
Jenifer Perrin, aged 81, has lived in Otley on the outskirts of Leeds for 50 years. She used to be a sewing machinist at a local factory, where she worked making curtains – a skill she says macular degeneration has sadly put paid to.
In 2019, Mrs Perrin was invited for a CT scan as part of the Yorkshire Lung Screening Trial (YLST) in a mobile unit, which fortunately found no sign of disease. In October 2022, after a follow-up scan as part of YLST, she was offered a kidney screening as part of the Yorkshire Kidney Screening Trial. She had never been ill and had no symptoms, but agreed. This time, however, the scan picked up an abnormality, a small tumour, around 2.5cm in diameter.
“When they said, ‘You’ve got the C’, I just took a deep breath,” she says. “I thought, ‘Well, it is what it is. There’s no point worrying.’”
Following a biopsy to confirm that the tumour was indeed cancerous, Mrs Perrin was referred to Professor Tze Min Wah and offered treatment using high intensity focused ultrasound (HIFU), a minimally-invasive treatment that required her to go under a general anaesthetic, but involved “no needles and no cutting,” she says.
The operation took place on the Thursday before the Coronation of King Charles III and apart from sickness that she puts down to the after-effects of the anaesthetic, she says there were no side-effects.
Following the treatment, she returned to St James’s Hospital every month for a follow-up scan and blood test. In October 2023, she was finally given the all clear.
“I’m really glad I had the chance to take part in the trial,” she says. “I’d never been ill, so without the CT scan, they might not have spotted my cancer early. As it was, they were able to blast it away using ultrasound. I think I was the second person in the world to have this treatment – the first woman in the world to get it. I like telling people that!”
The Yorkshire Kidney Screening Trial was funded by Yorkshire Cancer Research. Additional support was provided by the National Institute for Health and Care Research (NIHR) Cambridge Biomedical Research Centre and Manchester Biomedical Research Centre, and by Kidney Cancer UK.
Barman handing a customer a pint of beer Credit: ELEVATE (Pexels)
Cambridge researchers have shown that reducing the serving size for beer, lager and cider reduces the volume of those drinks consumed in pubs, bars and restaurants, which could have wider public health benefits.
“While we may all enjoy a drink, the less we drink the better our health.”
Professor Dame Theresa Marteau
Alcohol consumption is the fifth largest contributor to premature death and disease worldwide. In 2016 it was estimated to have caused approximately 3 million deaths worldwide.
Professor Dame Theresa Marteau and colleagues at the Behaviour and Health Research Unit have shown previously that serving wine in smaller glasses is associated with a decrease in sales.
To see if this effect was seen with other alcoholic drinks, they approached venues in England and asked them to remove the pint serving size and instead offer two-thirds as the largest option for four weeks, with four-week non-intervention periods before and after as a comparison.
In a study published in PLOS Medicine, the team found that removing the pint reduced the daily mean volume of beer, lager and cider sold by 9.7%, although there was a slight increase in the amount of wine purchased, with one pub accounting for half of the increase in wine sales. They report that although customers did not complain, fewer than 1% of the venues approached agreed to participate, and the intervention involved only 12 establishments.
Professor Marteau said: “Alcohol harms our health, increasing the risk of injury and many diseases including heart disease, bowel, breast and liver cancers. While we may all enjoy a drink, the less we drink the better our health.
“As we’ve shown is the case with wine, removing the largest serving size for beer, lager and cider – in this case, the pint – could encourage people to drink less. This could be beneficial both to the nation’s health and the health of individuals.”
Further assessment is needed, particularly into whether people fully compensated for reduced beer consumption by drinking other alcoholic drinks, but the intervention merits consideration for inclusion in alcohol control policies. Smaller serving sizes could contribute towards reducing alcohol consumption across populations and thereby decrease the risk of seven cancers and other diseases.
Researchers have discovered that seabirds, including penguins and albatrosses, have highly-sensitive regions in their beaks that could be used to help them find food. This is the first time this ability has been identified in seabirds.
Chinstrap Penguin (Pygoscelis antarcticus) feeding its chicks. Credit: Grace Kinney-Broderick
An international team of researchers, led by the University of Cambridge, studied over 350 species of modern birds and found that seabirds have a high density of sensory receptors and nerves at the tip of their beaks, which has been previously identified in specialised tactile foragers such as ducks.
The researchers say this touch-sensitive region could have come from a common ancestor, and further work is needed to determine whether it serves a specific function in modern birds. Further study of their beaks and food-gathering behaviour could help conserve some of these birds, many of which are at threat of extinction. The results are reported in the journal Biology Letters.
In the same way as humans and other primates use their hands, birds use their beaks to interact with the world around them. Some birds have specialised touch-sensitive areas at the tips of their beaks to help them find food, but since this ability has not been widely studied, it’s not known how the phenomenon evolved or how widespread it is.
“Many scientists had assumed most birds had touch-sensitive beaks, but we hadn’t investigated it enough to know whether it’s a common ability, or whether it’s limited to particular families of birds,” said lead author Dr Carla du Toit from Cambridge’s Department of Earth Sciences.
One group that hasn’t been well studied is the large group of seabirds called Austrodyptornithes, which includes albatrosses, petrels, and penguins. Since many of the bird species in this group are critically endangered, understanding how they find their food using their beaks could be a valuable tool to aid in their conservation.
Du Toit and her colleagues from the UK and South Africa conducted a study of 361 modern bird species, based on fossil and skeletal records, as well as birds that had been accidentally killed by fishing lines and nets. The team focused on the beaks of these birds, how they are constructed and connected to their nerves and blood vessels.
The researchers found that albatrosses and penguins have organs with a high density of sensory receptors and high concentrations of nerves in their beaks, a feature otherwise seen in specialised tactile foragers such as ducks. This is the first time that this functionality has been observed in seabirds.
“Seabirds aren’t known to be tactile foragers, so it’s surprising to find that they have this organ,” said du Toit. “It’s really exciting when you get to be the first to see something.”
Atlantic Yellow-nosed Albatross feeding on the surface of the ocean on fishery bycatch. Credit: Carla du Toit
These touch-sensitive beaks might help seabirds find food at night or underwater, as they might enable the birds to detect tiny vibrations from potential prey. Some birds that are already known to have touch-sensitive beaks use them to detect tiny underground vibrations from worms, for example.
However, these sensitive areas could also be a ‘leftover’ trait from a common ancestor that doesn’t have a specific function in modern birds, like the beaks of ostriches and emus. Further studies in live birds will be needed to establish the exact purpose of these touch-sensitive areas, which may also help determine how the ability evolved.
“In humans and other primates, our sensitive hands and fingers allowed us to master a huge range of environments,” said du Toit. “Beaks are analogous to hands in a way, but this is the first time we’ve seen touch-sensitive beaks in seabirds. It’s remarkable that no one has ever really studied this in detail, considering that we all learn about evolution from the beaks of Darwin’s finches in school.”
The researchers say their findings could potentially play a role in conserving some of these birds. Of the 22 known species of albatross, 15 are threatened with extinction and two are listed as critically endangered. One of the big threats to albatrosses is commercial longline fishing, which kills an estimated 100,000 of the birds per year, when they get tangled in the lines and drown. According to du Toit, if scientists can better understand how these birds get their food, it could be used to help protect them.
“Much further work is needed, but if albatrosses and other seabirds are able to detect vibrations from potential prey via their beaks, it could be possible to attach some sort of device to longlines that could repel them, so they are less likely to get caught,” said du Toit. “Of course, the bigger threats to birds like albatrosses are climate change, rising ocean temperatures, plastic pollution and falling fish stocks, but if there’s a way to reduce the risks to seabirds in even a small way, then that’s incredibly valuable. These are such special birds and I’ve been interested in them for as long as I can remember.”
The research was supported in part by the Royal Society, the Newton International Fellowship, and UK Research and Innovation (UKRI).
A young boy sick with measles, lying in bed and covered with the rash caused by the virus. Credit: CHBD / E+ / Getty Images
Researchers say it is vital that children born by caesarean section receive two doses of the measles vaccine for robust protection against the disease.
A study by the University of Cambridge, UK, and Fudan University, China, has found that a single dose of the measles jab is up to 2.6 times more likely to be completely ineffective in children born by C-section, compared to those born naturally.
Failure of the vaccine means that the child’s immune system does not produce antibodies to fight against measles infection, so they remain susceptible to the disease.
A second measles jab was found to induce a robust immunity against measles in C-section children.
Measles is a highly infectious disease, and even low vaccine failure rates can significantly increase the risk of an outbreak.
A potential reason for this effect is linked to the development of the infant’s gut microbiome – the vast collection of microbes that naturally live inside the gut. Other studies have shown that vaginal birth transfers a greater variety of microbes from mother to baby, which can boost the immune system.
“We’ve discovered that the way we’re born – either by C-section or natural birth – has long-term consequences on our immunity to diseases as we grow up,” said Professor Henrik Salje in the University of Cambridge’s Department of Genetics, joint senior author of the report.
He added: “We know that a lot of children don’t end up having their second measles jab, which is dangerous for them as individuals and for the wider population.
“Infants born by C-section are the ones we really want to be following up to make sure they get their second measles jab, because their first jab is much more likely to fail.”
At least 95% of the population needs to be fully vaccinated to keep measles under control, but the UK is well below this, despite the Measles, Mumps and Rubella (MMR) vaccine being available through the NHS Routine Childhood Immunisation Programme.
An increasing number of women around the world are choosing to give birth by caesarean section: in the UK, a third of all births are by C-section, while in Brazil and Turkey over half of all children are born this way.
“With a C-section birth, children aren’t exposed to the mother’s microbiome in the same way as with a vaginal birth. We think this means they take longer to catch up in developing their gut microbiome, and with it, the ability of the immune system to be primed by vaccines against diseases including measles,” said Salje.
To get their results, the researchers used data from previous studies of over 1,500 children in Hunan, China, which included blood samples taken every few weeks from birth to the age of 12. This allowed them to see how levels of measles antibodies in the blood change over the first few years of life, including following vaccination.
They found that 12% of children born via caesarean section had no immune response to their first measles vaccination, as compared to 5% of children born by vaginal delivery. This means that many of the children born by C-section did still mount an immune response following their first vaccination.
Two doses of the measles jab are needed for the body to mount a long-lasting immune response and protect against measles. According to the World Health Organisation, in 2022 only 83% of the world’s children had received one dose of measles vaccine by their first birthday – the lowest since 2008.
Salje said: “Vaccine hesitancy is really problematic, and measles is top of the list of diseases we’re worried about because it’s so infectious.”
Measles is one of the world’s most contagious diseases, spread by coughs and sneezes. It starts with cold-like symptoms and a rash, and can lead to serious complications including blindness, seizures, and death.
Before the measles vaccine was introduced in 1963, there were major measles epidemics every few years causing an estimated 2.6 million deaths each year.
The research was funded by the National Natural Science Foundation of China.
Reference
Wang, W. et al: ‘Dynamics of measles immunity from birth and following vaccination.’ Nature Microbiology, 13 May 2024. DOI: 10.1038/s41564-024-01694-x
A visualisation of one of the design scenarios highlighted in the latest paper Credit: Tomasz Hollanek
Cambridge researchers lay out the need for design safety protocols that prevent the emerging “digital afterlife industry” causing social and psychological harm.
Artificial intelligence that allows users to hold text and voice conversations with lost loved ones runs the risk of causing psychological harm and even digitally ‘haunting’ those left behind without design safety standards, according to University of Cambridge researchers.
‘Deadbots’ or ‘Griefbots’ are AI chatbots that simulate the language patterns and personality traits of the dead using the digital footprints they leave behind. Some companies are already offering these services, providing an entirely new type of “postmortem presence”.
AI ethicists from Cambridge’s Leverhulme Centre for the Future of Intelligence outline three design scenarios for platforms that could emerge as part of the developing “digital afterlife industry”, to show the potential consequences of careless design in an area of AI they describe as “high risk”.
The research, published in the journal Philosophy and Technology, highlights the potential for companies to use deadbots to surreptitiously advertise products to users in the manner of a departed loved one, or distress children by insisting a dead parent is still “with you”.
When the living sign up to be virtually re-created after they die, resulting chatbots could be used by companies to spam surviving family and friends with unsolicited notifications, reminders and updates about the services they provide – akin to being digitally “stalked by the dead”.
Even those who take initial comfort from a ‘deadbot’ may get drained by daily interactions that become an “overwhelming emotional weight”, argue researchers, yet may also be powerless to have an AI simulation suspended if their now-deceased loved one signed a lengthy contract with a digital afterlife service.
“Rapid advancements in generative AI mean that nearly anyone with Internet access and some basic know-how can revive a deceased loved one,” said Dr Katarzyna Nowaczyk-Basińska, study co-author and researcher at Cambridge’s Leverhulme Centre for the Future of Intelligence (LCFI).
“This area of AI is an ethical minefield. It’s important to prioritise the dignity of the deceased, and ensure that this isn’t encroached on by financial motives of digital afterlife services, for example.
“At the same time, a person may leave an AI simulation as a farewell gift for loved ones who are not prepared to process their grief in this manner. The rights of both data donors and those who interact with AI afterlife services should be equally safeguarded.”
Platforms offering to recreate the dead with AI for a small fee already exist, such as ‘Project December’, which started out harnessing GPT models before developing its own systems, and apps including ‘HereAfter’. Similar services have also begun to emerge in China.
One of the potential scenarios in the new paper is “MaNana”: a conversational AI service allowing people to create a deadbot simulating their deceased grandmother without consent of the “data donor” (the dead grandparent).
The hypothetical scenario sees an adult grandchild who is initially impressed and comforted by the technology start to receive advertisements once a “premium trial” finishes: the chatbot might, for example, suggest ordering from food delivery services in the voice and style of the deceased.
The relative feels they have disrespected the memory of their grandmother, and wishes to have the deadbot turned off in a meaningful way – something the service providers haven’t considered.
“People might develop strong emotional bonds with such simulations, which will make them particularly vulnerable to manipulation,” said co-author Dr Tomasz Hollanek, also from Cambridge’s LCFI.
“Methods and even rituals for retiring deadbots in a dignified way should be considered. This may mean a form of digital funeral, for example, or other types of ceremony depending on the social context.”
“We recommend design protocols that prevent deadbots being utilised in disrespectful ways, such as for advertising or having an active presence on social media.”
While Hollanek and Nowaczyk-Basińska say that designers of re-creation services should actively seek consent from data donors before they pass, they argue that a ban on deadbots based on non-consenting donors would be unfeasible.
They suggest that design processes should involve a series of prompts for those looking to “resurrect” their loved ones, such as ‘have you ever spoken with X about how they would like to be remembered?’, so the dignity of the departed is foregrounded in deadbot development.
Another scenario featured in the paper, an imagined company called “Paren’t”, highlights the example of a terminally ill woman leaving a deadbot to assist her eight-year-old son with the grieving process.
While the deadbot initially helps as a therapeutic aid, the AI starts to generate confusing responses as it adapts to the needs of the child, such as depicting an impending in-person encounter.
The researchers recommend age restrictions for deadbots, and also call for “meaningful transparency” to ensure users are consistently aware that they are interacting with an AI. These could be similar to current warnings on content that may cause seizures, for example.
The final scenario explored by the study – a fictional company called “Stay” – shows an older person secretly committing to a deadbot of themselves and paying for a twenty-year subscription, in the hopes it will comfort their adult children and allow their grandchildren to know them.
After death, the service kicks in. One adult child does not engage, and receives a barrage of emails in the voice of their dead parent. Another does, but ends up emotionally exhausted and wracked with guilt over the fate of the deadbot. Yet suspending the deadbot would violate the terms of the contract their parent signed with the service company.
“It is vital that digital afterlife services consider the rights and consent not just of those they recreate, but those who will have to interact with the simulations,” said Hollanek.
“These services run the risk of causing huge distress to people if they are subjected to unwanted digital hauntings from alarmingly accurate AI recreations of those they have lost. The potential psychological effect, particularly at an already difficult time, could be devastating.”
The researchers call for design teams to prioritise opt-out protocols that allow potential users to terminate their relationships with deadbots in ways that provide emotional closure.
Added Nowaczyk-Basińska: “We need to start thinking now about how we mitigate the social and psychological risks of digital immortality, because the technology is already here.”
Baby Opal and mother Jo Credit: Cambridge University Hospitals NHS Foundation Trust
A baby girl born deaf can hear unaided for the first time, after receiving gene therapy when she was 11 months old at Addenbrooke’s Hospital in Cambridge.
Gene therapy has been the future of otology and audiology for many years and I’m so excited that it is now finally here
Manohar Bance
Opal Sandy from Oxfordshire is the first patient treated in a global gene therapy trial, which is showing ‘mind-blowing’ results. She is the first patient in the world, and the youngest child, to receive this type of treatment.
Opal was born completely deaf because of a rare genetic condition, auditory neuropathy, caused by the disruption of nerve impulses travelling from the inner ear to the brain.
Within four weeks of having the gene therapy infusion to her right ear, Opal responded to sound, even with the cochlear implant in her left ear switched off.
Clinicians noticed continuous improvement in Opal’s hearing in the weeks afterwards. At 24 weeks, they confirmed Opal had close to normal hearing levels for soft sounds, such as whispering, in her treated ear.
Now 18 months old, Opal can respond to her parents’ voices and can communicate words such as “Dada” and “bye-bye.”
Opal’s mother, Jo Sandy, said: “When Opal could first hear us clapping unaided it was mind-blowing – we were so happy when the clinical team confirmed at 24 weeks that her hearing was also picking up softer sounds and speech. The phrase ‘near normal’ hearing was used and everyone was so excited such amazing results had been achieved.”
Auditory neuropathy can be due to a variation in a single gene, known as the OTOF gene. The gene produces a protein called otoferlin, which is needed to allow the inner hair cells in the ear to communicate with the hearing nerve. Approximately 20,000 people across the UK, Germany, France, Spain and Italy are deaf due to a mutation in the OTOF gene.
The CHORD trial, which started in May 2023, aims to show whether gene therapy can provide hearing for children born with auditory neuropathy.
Professor Manohar Bance from the Department of Clinical Neurosciences at the University of Cambridge and an ear surgeon at Cambridge University Hospitals NHS Foundation Trust is chief investigator of the trial. He said:
“These results are spectacular and better than I expected. Gene therapy has been the future of otology and audiology for many years and I’m so excited that it is now finally here. This is hopefully the start of a new era for gene therapies for the inner ear and many types of hearing loss.”
Children with a variation in the OTOF gene often pass newborn hearing screening because their hair cells are working, but those cells are not communicating with the hearing nerve. As a result, this hearing loss is not commonly detected until children are 2 or 3 years of age, when a delay in speech is likely to be noticed.
Professor Bance added: “We have a short time frame to intervene because of the rapid pace of brain development at this age. Delays in diagnosis can also cause confusion for families, as the many possible reasons for delayed speech and late intervention can impact a child’s development.”
“More than sixty years after the cochlear implant was first invented – the standard of care treatment for patients with OTOF related hearing loss – this trial shows gene therapy could provide a future alternative. It marks a new era in the treatment for deafness. It also supports the development of other gene therapies that may prove to make a difference in other genetic related hearing conditions, many of which are more common than auditory neuropathy.”
Mutations in the OTOF gene can be identified by standard NHS genetic testing. Opal was identified as being at risk as her older sister has the condition; this was confirmed by genetic test result when she was 3 weeks old.
Opal was given an infusion containing a harmless virus (AAV1), which carries a working copy of the OTOF gene and is delivered via an injection into the cochlea during surgery under general anaesthesia. During the same operation, while Opal was given the gene therapy in her right ear, a cochlear implant was fitted in her left ear.
James Sandy, Opal’s father said: “It was our ultimate goal for Opal to hear all the speech sounds. It’s already making a difference to our day-to-day lives, like at bath-time or swimming, when Opal can’t wear her cochlear implant. We feel so proud to have contributed to such pivotal findings, which will hopefully help other children like Opal and their families in the future.”
Opal’s 24-week results, alongside other scientific data from the CHORD trial, are being presented at the American Society of Gene and Cell Therapy (ASGCT) annual meeting in Baltimore, USA this week.
Dr Richard Brown, Consultant Paediatrician at CUH, who is an Investigator on the CHORD trial, said: “The development of genomic medicine and alternative treatments is vital for patients worldwide, and increasingly offers hope to children with previously incurable disorders. It is likely that in the long run such treatments require less follow up so may prove to be an attractive option, including within the developing world. Follow up appointments have shown effective results so far with no adverse reactions and it is exciting to see the results to date.
“Within the new planned Cambridge Children’s Hospital, we look forward to having a genomic centre of excellence which will support patients from across the region to access the testing they need, and the best treatment, at the right time.”
The CHORD trial has been funded by Regeneron. Patients are being enrolled in the study in the US, UK and Spain.
Patients in the first phase of the study receive a low dose to one ear. The second phase is expected to use a higher dose of gene therapy in one ear only, once the safety of the starting dose has been proven. The third phase will look at gene therapy in both ears, with the dose selected after the safety and effectiveness established in parts 1 and 2. Follow-up appointments will continue for five years for enrolled patients, which will show how patients adapt to understanding speech in the longer term.
In Cambridge, the trial is supported by NIHR Cambridge Clinical Research Facility and NIHR Cambridge Biomedical Research Centre.
Illustration of spinal cord Credit: SEBASTIAN KAULITZKI/SCIENCE PHOTO LIBRARY
A tiny, flexible electronic device that wraps around the spinal cord could represent a new approach to the treatment of spinal injuries, which can cause profound disability and paralysis.
Because of recent advances in both engineering and neurosurgery, the planets have aligned and we’ve made major progress in this important area
George Malliaras
A team of engineers, neuroscientists and surgeons from the University of Cambridge developed the devices and used them to record the nerve signals going back and forth between the brain and the spinal cord. Unlike current approaches, the Cambridge devices can record 360-degree information, giving a complete picture of spinal cord activity.
Tests in live animal and human cadaver models showed the devices could also stimulate limb movement and bypass complete spinal cord injuries where communication between the brain and spinal cord had been completely interrupted.
Most current approaches to treating spinal injuries involve both piercing the spinal cord with electrodes and placing implants in the brain, which are both high-risk surgeries. The Cambridge-developed devices could lead to treatments for spinal injuries without the need for brain surgery, which would be far safer for patients.
While such treatments are still at least several years away, the researchers say the devices could be useful in the near-term for monitoring spinal cord activity during surgery. Better understanding of the spinal cord, which is difficult to study, could lead to improved treatments for a range of conditions, including chronic pain, inflammation and hypertension. The results are reported in the journal Science Advances.
“The spinal cord is like a highway, carrying information in the form of nerve impulses to and from the brain,” said Professor George Malliaras from the Department of Engineering, who co-led the research. “Damage to the spinal cord causes that traffic to be interrupted, resulting in profound disability, including irreversible loss of sensory and motor functions.”
The ability to monitor signals going to and from the spinal cord could dramatically aid in the development of treatments for spinal injuries, and could also be useful in the nearer term for better monitoring of the spinal cord during surgery.
“Most technologies for monitoring or stimulating the spinal cord only interact with motor neurons along the back, or dorsal, part of the spinal cord,” said Dr Damiano Barone from the Department of Clinical Neurosciences, who co-led the research. “These approaches can only reach between 20 and 30 percent of the spine, so you’re getting an incomplete picture.”
By taking their inspiration from microelectronics, the researchers developed a way to gain information from the whole spine, by wrapping very thin, high-resolution implants around the spinal cord’s circumference. This is the first time that safe 360-degree recording of the spinal cord has been possible – earlier approaches for 360-degree monitoring use electrodes that pierce the spine, which can cause spinal injury.
The Cambridge-developed biocompatible devices – just a few millionths of a metre thick – are made using advanced photolithography and thin film deposition techniques, and require minimal power to function.
The devices intercept the signals travelling on the axons, or nerve fibres, of the spinal cord, allowing the signals to be recorded. The thinness of the devices means they can record the signals without causing any damage to the nerves, since they do not penetrate the spinal cord itself.
“It was a difficult process, because we haven’t made spinal implants in this way before, and it wasn’t clear that we could safely and successfully place them around the spine,” said Malliaras. “But because of recent advances in both engineering and neurosurgery, the planets have aligned and we’ve made major progress in this important area.”
The devices were implanted using an adaptation of a routine surgical procedure, so that they could be slid under the spinal cord without damaging it. In tests using rat models, the researchers successfully used the devices to stimulate limb movement. The devices showed very low latency – that is, their reaction time was close to that of human reflexive movement. Further tests in human cadaver models showed that the devices can be successfully placed in humans.
The researchers say their approach could change how spinal injuries are treated in future. Current attempts to treat spinal injuries involve both brain and spinal implants, but the Cambridge researchers say the brain implants may not be necessary.
“If someone has a spinal injury, their brain is fine, but it’s the connection that’s been interrupted,” said Barone. “As a surgeon, you want to go where the problem is, so adding brain surgery on top of spinal surgery just increases the risk to the patient. We can collect all the information we need from the spinal cord in a far less invasive way, so this would be a much safer approach for treating spinal injuries.”
While a treatment for spinal injuries is still years away, in the nearer term, the devices could be useful for researchers and surgeons to learn more about this vital, but understudied, part of human anatomy in a non-invasive way. The Cambridge researchers are currently planning to use the devices to monitor nerve activity in the spinal cord during surgery.
“It’s been almost impossible to study the whole of the spinal cord directly in a human, because it’s so delicate and complex,” said Barone. “Monitoring during surgery will help us to understand the spinal cord better without damaging it, which in turn will help us develop better therapies for conditions like chronic pain, hypertension or inflammation. This approach shows enormous potential for helping patients.”
The research was supported in part by the Royal College of Surgeons, the Academy of Medical Sciences, Health Education England, the National Institute for Health Research, and the Engineering and Physical Sciences Research Council (EPSRC), part of UK Research and Innovation (UKRI).
Syringe and vaccine bottle Credit: Stefan Cristian Cioata on Getty
Researchers have developed a new vaccine technology that has been shown in mice to provide protection against a broad range of coronaviruses with potential for future disease outbreaks – including ones we don’t even know about.
Our focus is to create a vaccine that will protect us against the next coronavirus pandemic, and have it ready before the pandemic has even started.
Rory Hills
This is a new approach to vaccine development called ‘proactive vaccinology’, where scientists build a vaccine before the disease-causing pathogen even emerges.
The new vaccine works by training the body’s immune system to recognise specific regions of eight different coronaviruses, including SARS-CoV-1, SARS-CoV-2, and several that are currently circulating in bats and have potential to jump to humans and cause a pandemic.
Key to its effectiveness is that the specific virus regions the vaccine targets also appear in many related coronaviruses. By training the immune system to attack these regions, it gives protection against other coronaviruses not represented in the vaccine – including ones that haven’t even been identified yet.
For example, the new vaccine does not include the SARS-CoV-1 coronavirus, which caused the 2003 SARS outbreak, yet it still induces an immune response to that virus.
“Our focus is to create a vaccine that will protect us against the next coronavirus pandemic, and have it ready before the pandemic has even started,” said Rory Hills, a graduate researcher in the University of Cambridge’s Department of Pharmacology and first author of the report.
He added: “We’ve created a vaccine that provides protection against a broad range of different coronaviruses – including ones we don’t even know about yet.”
“We don’t have to wait for new coronaviruses to emerge. We know enough about coronaviruses, and different immune responses to them, that we can get going with building protective vaccines against unknown coronaviruses now,” said Professor Mark Howarth in the University of Cambridge’s Department of Pharmacology, senior author of the report.
He added: “Scientists did a great job in quickly producing an extremely effective COVID vaccine during the last pandemic, but the world still had a massive crisis with a huge number of deaths. We need to work out how we can do even better than that in the future, and a powerful component of that is starting to build the vaccines in advance.”
The new ‘Quartet Nanocage’ vaccine is based on a structure called a nanoparticle – a ball of proteins held together by incredibly strong interactions. Chains of different viral antigens are attached to this nanoparticle using a novel ‘protein superglue’. Multiple antigens are included in these chains, which trains the immune system to target specific regions shared across a broad range of coronaviruses.
This study demonstrated that the new vaccine raises a broad immune response, even in mice that were pre-immunised with SARS-CoV-2.
The new vaccine is much simpler in design than other broadly protective vaccines currently in development, which the researchers say should accelerate its route into clinical trials.
The underlying technology they have developed also has potential for use in vaccine development to protect against many other health challenges.
The work involved a collaboration between scientists at the University of Cambridge, the University of Oxford, and Caltech. It improves on previous work, by the Oxford and Caltech groups, to develop a novel all-in-one vaccine against coronavirus threats. The vaccine developed by Oxford and Caltech should enter Phase 1 clinical trials in early 2025, but its complex nature makes it challenging to manufacture which could limit large-scale production.
Conventional vaccines include a single antigen to train the immune system to target a single specific virus. This may not protect against a diverse range of existing coronaviruses, or against pathogens that are newly emerging.
The research was funded by the Biotechnology and Biological Sciences Research Council.
Ali Banwell and Laura Stevens installing the time-lapse camera used in this study on the George VI Ice Shelf in Antarctica. Credit: Ian Willis
Heavy pooling meltwater can fracture ice, potentially leading to ice shelf collapse
When air temperatures in Antarctica rise and glacier ice melts, water can pool on the surface of floating ice shelves, weighing them down and causing the ice to bend. Now, for the first time in the field, researchers have shown that ice shelves don’t just buckle under the weight of meltwater lakes — they fracture.
As the climate warms and melt rates in Antarctica increase, this fracturing could cause vulnerable ice shelves to collapse, allowing inland glacier ice to spill into the ocean and contribute to sea level rise.
Ice shelves are important for the Antarctic Ice Sheet’s overall health as they act to buttress or hold back the glacier ice on land. Scientists have predicted and modelled that surface meltwater loading could cause ice shelves to fracture, but no one had observed the process in the field, until now.
The new study, published in the Journal of Glaciology, may help explain how the Larsen B Ice Shelf abruptly collapsed in 2002. In the months before its catastrophic breakup, thousands of meltwater lakes littered the ice shelf’s surface, which then drained over just a few weeks.
To investigate the impacts of surface meltwater on ice shelf stability, a research team led by the University of Colorado Boulder, and including researchers from the University of Cambridge, travelled to the George VI Ice Shelf on the Antarctic Peninsula in November 2019.
First, the team identified a depression, or ‘doline’, in the ice surface that had been formed by a previous lake drainage event, where they thought meltwater was likely to pool again. Then, they ventured out on snowmobiles, pulling all their science equipment and safety gear behind on sleds.
Around the doline, the team installed high-precision GPS stations to measure small changes in elevation at the ice’s surface, water-pressure sensors to measure lake depth, and a timelapse camera system to capture images of the ice surface and meltwater lakes every 30 minutes.
In 2020, the COVID-19 pandemic brought their fieldwork to a screeching halt. When the team finally made it back to their field site in November 2021, only two GPS stations and one timelapse camera remained; the two other GPS stations and all the water-pressure sensors had been flooded and buried in solid ice. Fortunately, the surviving instruments captured the vertical and horizontal movement of the ice’s surface, as well as images of the meltwater lake that formed and drained during the record-high 2019/2020 melt season.
GPS data indicated that the ice in the centre of the lake basin flexed downward about a foot in response to the increased weight from meltwater. That finding builds upon previous work that produced the first direct field measurements of ice shelf buckling caused by meltwater ponding and drainage.
The team also found that the horizontal distance between the edge and centre of the meltwater lake basin increased by over a foot. This was most likely due to the formation and/or widening of circular fractures around the meltwater lake, which the timelapse imagery captured. Their results provide the first field-based evidence of ice shelf fracturing in response to a surface meltwater lake weighing down the ice.
“This is an exciting discovery,” said lead author Alison Banwell, from the Cooperative Institute for Research in Environmental Sciences (CIRES) at the University of Colorado Boulder. “We believe these types of circular fractures were key in the chain reaction style lake drainage process that helped to break up the Larsen B Ice Shelf.”
“While these measurements were made over a small area, they demonstrate that bending and breaking of floating ice due to surface water may be more widespread than previously thought,” said co-author Dr Rebecca Dell from Cambridge’s Scott Polar Research Institute. “As melting increases in response to predicted warming, ice shelves may become more prone to break up and collapse than they are currently.”
“This has implications for sea level as the buttressing of inland ice is reduced or removed, allowing the glaciers and ice streams to flow more rapidly into the ocean,” said co-author Professor Ian Willis, also from SPRI.
The work supports modelling results that show the immense weight of thousands of meltwater lakes and subsequent draining caused the Larsen B Ice Shelf to bend and break, contributing to its collapse.
“These observations are important because they can be used to improve models to better predict which Antarctic ice shelves are more vulnerable and most susceptible to collapse in the future,” Banwell said.
The research was funded by the U.S. National Science Foundation (NSF) and the Natural Environment Research Council (NERC), part of UK Research and Innovation (UKRI). The team also included researchers from the University of Oxford and the University of Chicago. Rebecca Dell is a Fellow of Trinity Hall, Cambridge.
Survivors of breast cancer are at significantly higher risk of developing second cancers, including endometrial and ovarian cancer for women and prostate cancer for men, according to new research studying data from almost 600,000 patients in England.
It’s important for us to understand to what extent having one type of cancer puts you at risk of a second cancer at a different site. Knowing this can help inform conversations with their care teams to look out for signs of potential new cancers
Isaac Allen
For the first time, the research has shown that this risk is higher in people living in areas of greater socioeconomic deprivation.
Breast cancer is the most commonly diagnosed cancer in the UK. Around 56,000 people in the UK are diagnosed each year, the vast majority (over 99%) of whom are women. Improvements in earlier diagnosis and in treatments mean that five-year survival rates have been increasing over time, reaching 87% in England by 2017.
People who survive breast cancer are at risk of a second primary cancer, but until now the exact risk has been unclear. Previously published research suggested that women who survive breast cancer are at a 24% greater risk, and men at a 27% greater risk, of a non-breast second primary cancer than the wider population. There have also been suggestions that second primary cancer risks differ by age at breast cancer diagnosis.
To provide more accurate estimates, a team led by researchers at the University of Cambridge analysed data from over 580,000 female and over 3,500 male breast cancer survivors diagnosed between 1995 and 2019 using the National Cancer Registration Dataset. The results of their analysis are published today in Lancet Regional Health – Europe.
First author Isaac Allen from the Department of Public Health and Primary Care at the University of Cambridge said: “It’s important for us to understand to what extent having one type of cancer puts you at risk of a second cancer at a different site. The female and male breast cancer survivors whose data we studied were at increased risk of a number of second cancers. Knowing this can help inform conversations with their care teams to look out for signs of potential new cancers.”
The researchers found significantly increased risks of cancer in the contralateral (that is, unaffected) breast, and of endometrial cancer in females and prostate cancer in males. Females who survived breast cancer were at double the risk of contralateral breast cancer compared to the general population, at 87% greater risk of endometrial cancer, 58% greater risk of myeloid leukaemia and 25% greater risk of ovarian cancer.
Age at diagnosis was important, too – females diagnosed with breast cancer under the age of 50 were 86% more likely to develop a second primary cancer than the general population of the same age, whereas women diagnosed after age 50 were at a 17% increased risk. One potential explanation is that a larger number of younger breast cancer survivors may have inherited genetic alterations that increase the risk of multiple cancers. For example, women with inherited changes to the BRCA1 and BRCA2 genes are at increased risk of contralateral breast, ovarian and pancreatic cancer.
Females from the most socioeconomically deprived backgrounds were at 35% greater risk of a second primary cancer compared to females from the least deprived backgrounds. These differences were primarily driven by non-breast cancer risks, particularly for lung, kidney, head and neck, bladder, oesophageal and stomach cancers. This may be because smoking, obesity, and alcohol consumption – established risk factors for these cancers – are more common among more deprived groups.
Allen, a PhD student at Clare Hall, added: “This is further evidence of the health inequalities that people from more deprived backgrounds experience. We need to fully understand why they are at greater risk of second cancers so that we can intervene and reduce this risk.”
Male breast cancer survivors were 55 times more likely than the general male population to develop contralateral breast cancer – though the researchers stress that an individual’s risk was still very low. For example, for every 100 men diagnosed with breast cancer at age 50 or over, about three developed contralateral breast cancer during a 25-year period. Male breast cancer survivors were also 58% more likely than the general male population to develop prostate cancer.
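The contrast between a 55-fold relative risk and a roughly 3-in-100 absolute risk can be made concrete with a little arithmetic. The sketch below takes the two figures reported in the article as given; the implied baseline risk for the general male population is a back-of-envelope inference, not a number from the study.

```python
# Back-of-envelope: a large relative risk can still mean a small absolute risk.
# Figures from the article: male breast cancer survivors were ~55x more likely
# than the general male population to develop contralateral breast cancer, yet
# only ~3 in 100 did so over a 25-year period.
relative_risk = 55
absolute_risk_survivors = 3 / 100  # ~3% over 25 years, per the article

# Implied risk in the general male population (our inference, not a study figure)
baseline_risk = absolute_risk_survivors / relative_risk
print(f"Implied baseline risk: {baseline_risk * 100:.3f}% over the same period")
```

In other words, multiplying a tiny baseline by 55 still leaves a small absolute risk, which is why the researchers stress both numbers together.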
Professor Antonis Antoniou from the Department of Public Health and Primary Care at the University of Cambridge, the study’s senior author, said: “This is the largest study to date to look at the risk in breast cancer survivors of developing a second cancer. We were able to carry this out and calculate more accurate estimates because of the outstanding data sets available to researchers through the NHS.”
The research was funded by Cancer Research UK with support from the National Institute for Health and Care Research Cambridge Biomedical Research Centre.
Cancer Research UK’s senior cancer intelligence manager, Katrina Brown, said: “This study shows us that the risk of second primary cancers is higher in people who have had breast cancer, and this can differ depending on someone’s socioeconomic background. But more research is needed to understand what is driving this difference and how to tackle these health inequalities.”
People who are concerned about their cancer risk should contact their GP for advice. If you or someone close to you have been affected by cancer and you’ve got questions, you can call Cancer Research UK nurses on freephone 0808 800 4040, Monday to Friday.
Researchers have found a way to super-charge the ‘engine’ of sustainable fuel generation – by giving the materials a little twist.
The researchers, led by the University of Cambridge, are developing low-cost light-harvesting semiconductors that power devices for converting water into clean hydrogen fuel, using just the power of the sun. These semiconducting materials, known as copper oxides, are cheap, abundant and non-toxic, but their performance does not come close to silicon, which dominates the semiconductor market.
However, the researchers found that by growing the copper oxide crystals in a specific orientation so that electric charges move through the crystals at a diagonal, the charges move much faster and further, greatly improving performance. Tests of a copper oxide light harvester, or photocathode, based on this fabrication technique showed a 70% improvement over existing state-of-the-art oxide photocathodes, while also showing greatly improved stability.
The researchers say their results, reported in the journal Nature, show how low-cost materials could be fine-tuned to power the transition away from fossil fuels and toward clean, sustainable fuels that can be stored and used with existing energy infrastructure.
Copper (I) oxide, or cuprous oxide, has been touted as a cheap potential replacement for silicon for years, since it is reasonably effective at capturing sunlight and converting it into electric charge. However, much of that charge tends to get lost, limiting the material’s performance.
“Like other oxide semiconductors, cuprous oxide has its intrinsic challenges,” said co-first author Dr Linfeng Pan from Cambridge’s Department of Chemical Engineering and Biotechnology. “One of those challenges is the mismatch between how deep light is absorbed and how far the charges travel within the material, so most of the oxide below the top layer of material is essentially dead space.”
“For most solar cell materials, it’s defects on the surface of the material that cause a reduction in performance, but with these oxide materials, it’s the other way round: the surface is largely fine, but something about the bulk leads to losses,” said Professor Sam Stranks, who led the research. “This means the way the crystals are grown is vital to their performance.”
To develop cuprous oxides to the point where they can be a credible contender to established photovoltaic materials, they need to be optimised so that, when sunlight hits them, they can efficiently generate and move electric charges – pairs of an electron and a positively charged electron ‘hole’.
One potential optimisation approach is single-crystal thin films – very thin slices of material with a highly ordered crystal structure, which are often used in electronics. However, making these films is normally a complex and time-consuming process.
Using thin film deposition techniques, the researchers were able to grow high-quality cuprous oxide films at ambient pressure and room temperature. By precisely controlling growth and flow rates in the chamber, they were able to ‘shift’ the crystals into a particular orientation. Then, using high temporal resolution spectroscopic techniques, they were able to observe how the orientation of the crystals affected how efficiently electric charges moved through the material.
“These crystals are basically cubes, and we found that when the electrons move through the cube at a body diagonal, rather than along the face or edge of the cube, they move an order of magnitude further,” said Pan. “The further the electrons move, the better the performance.”
“Something about that diagonal direction in these materials is magic,” said Stranks. “We need to carry out further work to fully understand why and optimise it further, but it has so far resulted in a huge jump in performance.” Tests of a cuprous oxide photocathode made using this technique showed an increase in performance of more than 70% over existing state-of-the-art electrodeposited oxide photocathodes.
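For readers picturing the geometry, the three directions Pan describes correspond to the standard crystallographic directions of a cubic lattice: [100] along an edge, [110] across a face, and [111] through the body of the cube. The short sketch below is our own illustration of those directions and their relative lengths within one unit cell; it is not taken from the paper.

```python
import math

# Directions in a cubic crystal, in lattice units (illustrative labels only).
edge = (1, 0, 0)           # along a cube edge: the [100] direction
face_diagonal = (1, 1, 0)  # across a cube face: the [110] direction
body_diagonal = (1, 1, 1)  # through the cube's interior: the [111] direction

def length(v):
    """Euclidean length of a lattice vector."""
    return math.sqrt(sum(c * c for c in v))

# Within one unit cell: edge = 1, face diagonal = sqrt(2), body diagonal = sqrt(3).
for name, v in [("edge [100]", edge),
                ("face [110]", face_diagonal),
                ("body [111]", body_diagonal)]:
    print(f"{name}: {length(v):.3f}")
```

The body diagonal is the longest straight path through the unit cell, but the order-of-magnitude improvement in how far charges travel is a property of transport along that direction in cuprous oxide, not of the geometry alone.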
“In addition to the improved performance, we found that the orientation makes the films much more stable, but factors beyond the bulk properties may be at play,” said Pan.
The researchers say that much more research and development is still needed, but this and related families of materials could have a vital role in the energy transition.
“There’s still a long way to go, but we’re on an exciting trajectory,” said Stranks. “There’s a lot of interesting science to come from these materials, and it’s interesting for me to connect the physics of these materials with their growth, how they form, and ultimately how they perform.”
The research was a collaboration with École Polytechnique Fédérale de Lausanne, Nankai University and Uppsala University. The research was supported in part by the European Research Council, the Swiss National Science Foundation, and the Engineering and Physical Sciences Research Council (EPSRC), part of UK Research and Innovation (UKRI). Sam Stranks is Professor of Optoelectronics in the Department of Chemical Engineering and Biotechnology, and a Fellow of Clare College, Cambridge.
For more information on energy-related research in Cambridge, please visit the Energy IRC, which brings together Cambridge’s research knowledge and expertise, in collaboration with global partners, to create solutions for a sustainable and resilient energy landscape for generations to come.