
Suppressing negative thoughts may be good for mental health after all


The commonly-held belief that attempting to suppress negative thoughts is bad for our mental health could be wrong, a new study from scientists at the University of Cambridge suggests.

“What we found runs counter to the accepted narrative” – Michael Anderson

Researchers at the Medical Research Council (MRC) Cognition and Brain Sciences Unit trained 120 volunteers worldwide to suppress thoughts about negative events that worried them, and found that not only did these thoughts become less vivid, but the participants’ mental health also improved.

“We’re all familiar with the Freudian idea that if we suppress our feelings or thoughts, then these thoughts remain in our unconscious, influencing our behaviour and wellbeing perniciously,” said Professor Michael Anderson.

“The whole point of psychotherapy is to dredge up these thoughts so one can deal with them and rob them of their power. In more recent years, we’ve been told that suppressing thoughts is intrinsically ineffective and that it actually causes people to think the thought more – it’s the classic idea of ‘Don’t think about a pink elephant’.”

These ideas have become dogma in the clinical treatment realm, said Anderson, with national guidelines describing thought avoidance as a major maladaptive coping behaviour to be eliminated and overcome in conditions such as depression, anxiety and PTSD.

When COVID-19 appeared in 2020, like many researchers, Professor Anderson wanted to see how his own research could be used to help people through the pandemic. His interest lay in a brain mechanism known as inhibitory control – the ability to override our reflexive responses – and how it might be applied to memory retrieval, and in particular to stopping the retrieval of negative thoughts when confronted with potent reminders of them.

Dr Zulkayda Mamat – at the time a PhD student in Professor Anderson’s lab and at Trinity College, Cambridge – believed that inhibitory control was critical in overcoming trauma, both in her own experience and in that of many others she had encountered. She wanted to investigate whether this was an innate ability or something that was learnt – and hence could be taught.

Dr Mamat said: “Because of the pandemic, we were seeing a need in the community to help people cope with surging anxiety. There was already a mental health crisis, a hidden epidemic of mental health problems, and this was getting worse. So with that backdrop, we decided to see if we could help people cope better.”

Professor Anderson and Dr Mamat recruited 120 people across 16 countries to test whether it might in fact be possible – and beneficial – for people to practice suppressing their fearful thoughts. Their findings are published today in Science Advances.

In the study, each participant was asked to think of a number of scenarios that might plausibly occur in their lives over the next two years – 20 negative ‘fears and worries’ that they were afraid might happen, 20 positive ‘hopes and dreams’, and 36 routine and mundane neutral events. The fears had to be worries of current concern to them that had repeatedly intruded on their thoughts.

Each event had to be specific to them and something they had vividly imagined occurring. For each scenario, they were to provide a cue word (an obvious reminder that could be used to evoke the event during training) and a key detail (a single word expressing a central event detail). For example:

  • Negative – visiting one’s parents at the hospital as a result of COVID-19, with the cue ‘Hospital’ and the detail ‘Breathing’.
  • Neutral – a visit to the opticians, with the cue ‘Optician’ and the detail ‘Cambridge’.
  • Positive – seeing one’s sister get married, with the cue ‘Wedding’ and the detail ‘Dress’.

Participants were asked to rate each event on a number of points: vividness, likelihood of occurrence, distance in the future, level of anxiety about the event (or level of joy for positive events), frequency of thought, degree of current concern, long-term impact, and emotional intensity.

Participants also completed questionnaires to assess their mental health, though no one was excluded, allowing the researchers to look at a broad range of participants, including many with serious depression, anxiety, and pandemic-related post-traumatic stress.

Then, over Zoom, Dr Mamat took each participant through the 20-minute training, which involved 12 ‘No-imagine’ and 12 ‘Imagine’ repetitions for events, each day for three days.

For No-imagine trials, participants were given one of their cue words and asked first to acknowledge the event in their mind. Then, while continuing to stare directly at the reminder cue, they were asked to stop thinking about the event – not by imagining the event itself or using diversionary thoughts to distract themselves, but by trying to block any images or thoughts that the reminder might evoke. For this part of the trial, one group of participants was given their negative events to suppress and the other their neutral ones.

For Imagine trials, participants were given a cue word and asked to imagine the event as vividly as possible, thinking what it would be like and imagining how they would feel at the event. For ethical reasons, no participant was given a negative event to imagine, but only positive or neutral ones.

At the end of the third day and again three months later, participants were once again asked to rate each event on vividness, level of anxiety, emotional intensity, etc., and completed questionnaires to assess changes in depression, anxiety, worry, affect, and wellbeing, key facets of mental health.

Dr Mamat said: “It was very clear that those events that participants practiced suppressing were less vivid, less emotionally anxiety-inducing, than the other events and that overall, participants improved in terms of their mental health. But we saw the biggest effect among those participants who were given practice at suppressing fearful, rather than neutral, thoughts.” 

Following training – both immediately and after three months – participants reported that suppressed events were less vivid and less fearful. They also found themselves thinking about these events less.

Suppressing thoughts even improved mental health amongst participants with likely post-traumatic stress disorder. Among participants with post-traumatic stress who suppressed negative thoughts, scores on negative mental health indices fell on average by 16% (compared with a 5% fall for similar participants suppressing neutral events), whereas scores on positive mental health indices increased by almost 10% (compared with a 1% fall in the second group).

In general, people with worse mental health symptoms at the outset of the study improved more after suppression training, but only if they suppressed their fears. This finding directly contradicts the notion that suppression is a maladaptive coping process.

Suppressing negative thoughts did not lead to a ‘rebound’, where a participant recalled these events more vividly. Only one person out of 120 showed higher detail recall for suppressed items post-training, and just six of the 61 participants who suppressed fears reported increased vividness for No-imagine items post-training – in line with the baseline rate of vividness increases for events that were not suppressed at all.

“What we found runs counter to the accepted narrative,” said Professor Anderson. “Although more work will be needed to confirm the findings, it seems like it is possible and could even be potentially beneficial to actively suppress our fearful thoughts.”

Although participants were not asked to continue practising the technique, many of them chose to do so spontaneously. When Dr Mamat contacted the participants after three months, she found that the benefits – reduced levels of depression and negative emotions – continued for all participants, and were most pronounced among those who continued to use the technique in their daily lives.

“The follow up was my favourite time of my entire PhD, because every day was just joyful,” she said. “I didn’t have a single participant who told me ‘Oh, I feel bad’ or ‘This was useless’. I didn’t prompt them or ask ‘Did you find this helpful?’ They were just automatically telling me how helpful they found it.”

One participant was so impressed by the technique that she taught her daughter and her own mother how to do it. Another reported how she had moved home just prior to COVID-19 and so felt very isolated during the pandemic.

“She said this study had come exactly at the time she needed it because she was having all these negative thoughts, all these worries and anxiety about the future, and this really, really helped her,” said Dr Mamat. “My heart literally just melted, I could feel goosebumps all over me. I said to her ‘If everyone else hated this experiment, I would not care because of how much this benefited you!’.”

The research was funded by the Medical Research Council and the Mind Science Foundation.

Reference
Mamat, Z, and Anderson, MC. Improving Mental Health by Training the Suppression of Unwanted Thoughts. Sci Adv; 20 Sept 2023; DOI: 10.1126/sciadv.adh5292




Cambridge remains the most intensive science and technological cluster in the world

Cambridge remains the most intensive science and technological cluster in the world – according to a new report ranking innovation around the globe.

“I am thrilled to see Cambridge again recognised as one of the greatest innovation hubs on the planet.” – Professor Deborah Prentice, Vice-Chancellor

The 2023 Global Innovation Index (GII) – which evaluates the top-level innovative capacity of countries and economies, and identifies local concentrations of world-leading activity – has named Cambridge the number one science and technological (S&T) cluster by intensity – that is, relative to its size – unchanged from the 2022 Index. The San Jose–San Francisco cluster in the USA was named second, and Oxford third.

S&T clusters are established by analysing patent-filing activity and scientific article publication, and documenting the geographical areas around the world with the highest density of inventors and scientific authors.

According to the Index – which will be published in full on 27 September – the Cambridge cluster filed 6,582 Patent Cooperation Treaty (PCT) patent applications and published 37,136 scientific articles, both per 1 million inhabitants, over the past 5 years.

The University of Cambridge sits at the heart of the ‘Cambridge cluster’, powering world-leading research, driving a thriving ecosystem of hundreds of spinout and start-up companies, and nurturing an environment for business services and investment.

Earlier this year, a new report by leading consultants London Economics showed that the University adds nearly £30 billion to the economy every year and supports more than 86,000 jobs across the UK.

Professor Deborah Prentice, Vice-Chancellor of the University of Cambridge, said: “Cambridge is a truly extraordinary place, where leading scientists work side by side with industry and academic partners, sparking ideas and creating life-changing medicines, technologies and services. I am thrilled to see it again recognised as one of the greatest innovation hubs on the planet.”

Dr Diarmuid O’Brien, Chief Executive, Cambridge Enterprise, said: “It is fantastic to see this continued recognition of Cambridge’s success as the world’s most intensive science and technological cluster. The Cambridge innovation ecosystem is home to a unique and driven community of exceptional science, people, companies and partners, tackling global challenges and changing lives.”




Machine learning models can produce reliable results even with limited training data


Researchers have determined how to build reliable machine learning models that can understand complex equations in real-world situations while using far less training data than is normally expected.

“It’s surprising how little data you need to end up with a reliable model” – Nicolas Boullé

The researchers, from the University of Cambridge and Cornell University, found that for partial differential equations – a class of physics equations that describe how things in the natural world evolve in space and time – machine learning models can produce reliable results even when they are provided with limited data.

Their results, reported in the Proceedings of the National Academy of Sciences, could be useful for constructing more time- and cost-efficient machine learning models for applications such as engineering and climate modelling.

Most machine learning models require large amounts of training data before they can begin returning accurate results. Traditionally, a human will annotate a large volume of data – such as a set of images, for example – to train the model.

“Using humans to train machine learning models is effective, but it’s also time-consuming and expensive,” said first author Dr Nicolas Boullé, from the Isaac Newton Institute for Mathematical Sciences. “We’re interested to know exactly how little data we actually need to train these models and still get reliable results.”

Other researchers have been able to train machine learning models with a small amount of data and get excellent results, but how this was achieved has not been well-explained. For their study, Boullé and his co-authors, Diana Halikias and Alex Townsend from Cornell University, focused on partial differential equations (PDEs).

“PDEs are like the building blocks of physics: they can help explain the physical laws of nature, such as how the steady state is held in a melting block of ice,” said Boullé, who is an INI-Simons Foundation Postdoctoral Fellow. “Since they are relatively simple models, we might be able to use them to make some generalisations about why these AI techniques have been so successful in physics.”

The researchers found that PDEs that model diffusion have a structure that is useful for designing AI models. “Using a simple model, you might be able to enforce some of the physics that you already know into the training data set to get better accuracy and performance,” said Boullé.

The researchers constructed an efficient algorithm for predicting the solutions of PDEs under different conditions by exploiting the short- and long-range interactions at play. This allowed them to build some mathematical guarantees into the model and determine exactly how much training data was required to end up with a robust model.

“It depends on the field, but for physics, we found that you can actually do a lot with a very limited amount of data,” said Boullé. “It’s surprising how little data you need to end up with a reliable model. Thanks to the mathematics of these equations, we can exploit their structure to make the models more efficient.”
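To give a concrete flavour of what data-efficient operator learning can look like, the sketch below is a minimal, hypothetical example and not the authors’ published algorithm: it recovers the solution operator of a discretised 1D Poisson problem from a few dozen random forcing/solution pairs using a two-pass randomised sketch. The grid size, target rank and oversampling values are assumptions chosen purely for illustration.

    # Minimal illustrative sketch (not the paper's method): recover the solution
    # operator of a discretised 1D Poisson problem, -u'' = f with zero boundary
    # values, from a small number of forcing/solution pairs.
    import numpy as np

    n = 200                                     # interior grid points (assumed)
    h = 1.0 / (n + 1)
    A = (np.diag(2.0 * np.ones(n)) - np.diag(np.ones(n - 1), 1)
         - np.diag(np.ones(n - 1), -1)) / h**2  # discrete Laplacian

    def solve(F):
        # Treat the PDE solver as a black box: forcings in, solutions out.
        return np.linalg.solve(A, F)

    G = np.linalg.inv(A)             # exact solution operator, kept only for comparison

    rank, oversample = 20, 5         # assumed values
    rng = np.random.default_rng(0)

    # Pass 1: a handful of random forcings probe the range of the solution operator.
    Q, _ = np.linalg.qr(solve(rng.standard_normal((n, rank + oversample))))

    # Pass 2: re-query the solver on the orthonormal basis; because the operator
    # is symmetric, G is approximately Q (G Q)^T.
    G_hat = Q @ solve(Q).T

    queries = 2 * (rank + oversample)           # 50 solves instead of n = 200
    print(queries, np.linalg.norm(G_hat - G) / np.linalg.norm(G))

Because the singular values of an elliptic solution operator decay quickly, 50 forcing/solution pairs – rather than the 200 a naive entry-by-entry reconstruction would need – should already give a relative error on the order of one per cent. Exploiting this kind of structure in elliptic PDEs is, broadly, the idea the authors’ analysis makes rigorous.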

The researchers say that their techniques will allow data scientists to open the ‘black box’ of many machine learning models and design new ones that can be interpreted by humans, although future research is still needed.

“We need to make sure that models are learning the right things, but machine learning for physics is an exciting field – there are lots of interesting maths and physics questions that AI can help us answer,” said Boullé.

Reference

Nicolas Boullé, Diana Halikias, and Alex Townsend. ‘Elliptic PDE learning is provably data-efficient.’ PNAS (2023). DOI: 10.1073/pnas.2303904120




Cambridge Zero takes centre stage at Climate Week NYC


Cambridge Zero Director Professor Emily Shuckburgh will take centre stage at the world’s biggest climate event of its kind in New York, where she will talk to global leaders of government, business and philanthropy about Cambridge’s efforts to tackle climate change.

“Now is the time for Cambridge and the rest of the world to turn ambition into action in the race to accelerate the pace of a just transition to a net zero world” – Prof Emily Shuckburgh

Professor Shuckburgh (Trinity and Darwin) will be at Climate Week NYC (17-24 September) to discuss how to address the challenges the world faces in limiting global temperature rise to 1.5 degrees Celsius. She will be at the Opening Ceremony on Sunday 17 September alongside world government, business, science and policy leaders and appear with top climate scientists in the opening video. You can view the Opening Ceremony and video live by registering for it here.

Professor Shuckburgh will be appearing on Climate Week NYC’s Hub Live on Tuesday 19 September alongside Helen Clarkson (Corpus Christi 1993), Chief Executive Officer of the Climate Group, which organises Climate Week; Kate Brandt (Selwyn 2007), Chief Sustainability Officer of Google; and Judith Wiese, Chief People & Sustainability Officer of Siemens AG, to discuss the innovation and investment needed to achieve net zero. You can register to view Tuesday’s panel online through this link: New frontiers of Climate Action.

“Now is the time for Cambridge and the rest of the world to turn ambition into action in the race to accelerate the pace of a just transition to a net zero world and New York will be buzzing with the kinds of people who can make that happen,” Professor Shuckburgh said.    

Climate Week NYC takes place in partnership with the United Nations General Assembly and is run in coordination with the United Nations and the City of New York. It is the largest annual climate event of its kind, bringing together some 400 events and activities across the City of New York – in person, hybrid and online.

This year it centres on the UN General Assembly and the UN Secretary-General’s Climate Ambition Summit, as well as hundreds of national government, business and climate group initiatives, making it a unique opportunity for Cambridge to communicate with the world.

On Wednesday evening, just hours after the UN Secretary-General’s Climate Ambition Summit concludes at the nearby headquarters of the United Nations, Professor Shuckburgh will lead a discussion for alumni in New York, hosted by Cambridge in America at the Morgan Library, about the technological and behavioural solutions available to build a sustainable future for the whole planet.

Professor Shuckburgh will be joined at the alumni event by Professor of Planetary Computing Anil Madhavapeddy (Pembroke) and Fiona Macklin (St John’s 2012), Senior Adviser to Groundswell, a joint initiative between Bezos Earth Fund, Global Optimism and the Systems Change Lab. The panel will be chaired by Professor Matthew Connelly, the new Director of the Centre for the Study of Existential Risk at the University of Cambridge. 

Book online here to see Mission Possible: Creating a Better Planetary Future.

“Our alumni network is one of Cambridge’s greatest pillars of support and with their help the University is able to amplify its work, linking one of the world’s top research universities to peer institutions, policymakers and business leaders,” Professor Shuckburgh said.   

Throughout the visit to New York, Cambridge Zero will respond to news and relevant climate announcements with the help of a specially assembled Cambridge Climate Media team of University academics.




Roadside hedges can reduce harmful ultrafine particle pollution around schools

Monitoring particle air pollution either side of the tredge installed at St Ambrose primary school, Manchester


A new study led by Cambridge University confirms that planting hedges between roadsides and school playgrounds can dramatically reduce children’s exposure to traffic-related particle pollution.

“Our findings show that hedges can provide a simple, cheap and effective way to help reduce exposure to local sources of pollution” – Hassan Sheikh

The research, a collaboration with Lancaster University, found that hedges can act as protective barriers against air pollution from major city roads by soaking up significant quantities of harmful particles emitted by traffic.

The researchers applied a new type of pollution analysis, using magnetism to study particles trapped by a hedge separating a major 6-lane road from a primary school in Manchester, UK. They found that the hedge was especially successful in removing ultrafine particle pollution, which can be more damaging to health.

“Our findings show that hedges can provide a simple, cheap and effective way to help reduce exposure to local sources of pollution,” said lead author Hassan Sheikh from Cambridge’s Department of Earth Sciences.

The new study differs from conventional air pollution studies because the researchers specifically measured magnetic particles, which originate from vehicle exhaust and the wearing of brake pads and tyres. That allowed them to distinguish local traffic pollution from other sources of air pollution.

In England alone, epidemiological studies estimate that 26,000 to 38,000 deaths and thousands of NHS hospital admissions are linked to dust-like particles carried in air pollution — much of which is generated by heavy traffic in urban environments.

This particle pollution — or particulate matter — is made up of a variety of chemical compounds, metals and other materials, some of which are toxic. The bigger particles (which are still tiny) measure less than 10 microns in diameter (called PM10) and are easily inhaled. Finer particles of less than 2.5 microns across (PM2.5) can penetrate deeper into the lungs and are small enough to enter the bloodstream.

Children attending schools next to busy roads are especially vulnerable to the effects of air pollution because their airways are still developing and they breathe faster than adults.

Sheikh and the team studied magnetic particles captured by a western red cedar ‘tredge’ (trees managed at head-height) which was previously installed outside St Ambrose Primary School as part of a trial led by Lancaster University.

“Western red cedar does a great job in ‘capturing’ particulate pollution because it has abundant, fine, evergreen leaves into which airborne particles bump and then settle from the roadside air,” said study co-author Professor Barbara Maher from Lancaster University, who led the previous research.

Sheikh and the team measured particles of varying sizes on the leaves of the tredge and used air filters to measure particle abundance at intervals downwind toward the school playground.

They also developed a new experiment that used a tracer gas to understand how ultrafine particles (measuring less than 2.5 microns) moved through and were trapped by the tredge.

Their results revealed that there was a substantial reduction in particle pollution downwind of the tredge. “The tredge acts as a permeable barrier, intercepting and capturing particles effectively on its leaves,” said Sheikh.

In the school playground, 30 metres from the road, they measured a 78% decrease in PM10 relative to roadside air.

They noticed that this removal was even more efficient for ultrafine PM2.5 particles. “What was remarkable was just how efficiently the tredge hoovered up the very finest particles,” said senior author Professor Richard Harrison, also from Cambridge’s Department of Earth Sciences. They measured an 80% reduction in the ultrafine particles just behind the tredge.

They think the ultrafine particles are preferentially filtered out by the tredge because they have a higher likelihood of being captured on the ridged surfaces of the red cedar leaves than coarser particles.

However, they did note a slight uptick in levels of magnetic PM2.5 in the playground, although they were still 63% below roadside air. “The ultrafine particles were very effectively removed, but this shows that some air still goes over or around the tredge,” said Sheikh. Less is currently known about how particulate matter moves and disperses at this higher level, where air mixes around buildings and trees.

“That means the design and placement of tredges near playgrounds and schools should be carefully considered so that their ability to soak up particles can be used to maximum effect,” said Harrison.

Cllr Tracey Rawlins, Executive Member for Environment for Manchester City Council, said: “We were keen to be part of this study as Manchester seeks to embrace innovation in our efforts to become a greener city with cleaner air and tackle climate change.

“The findings underline the contribution which nature-based innovations can make to rising to that challenge. We look forward to delivering more green screens as well as many trees at school sites, complementing our education climate change strategy,” said Rawlins.

Previously, Sheikh and Harrison used their new magnetic analysis to identify high levels of ultrafine particles polluting the London Underground. They now plan on working with colleagues at the MRC Toxicology Unit in Cambridge to find out what happens when cells are exposed to this type of ultrafine particle pollution.

Reference:
Sheikh, H. A., Maher, B. A., Woods, A. W., Tung, P. Y., & Harrison, R. J. (2023). Efficacy of green infrastructure in reducing exposure to local, traffic-related sources of airborne particulate matter (PM). Science of the Total Environment, 166598.




Cambridge researchers announced as programme directors for new UK funding agency


ARIA, the UK’s new R&D funding agency, has announced its line-up of new programme directors – and three of them are current or former researchers from the University of Cambridge.

“It’s a remarkable reflection of the quality of Cambridge research that three of the eight new programme directors stem from our University” – Anne Ferguson-Smith

ARIA is a government-funded agency that aims to unlock scientific and technological breakthroughs that could benefit everyone. Many of society’s most important advances have stemmed from those with the foresight to pursue new capabilities that most believed to be unattainable. ARIA aims to empower scientists and engineers with the resources and freedom to pursue those breakthroughs.

The new programme directors include:

Gemma Bale

Gemma is the Gianna Angelopoulos Assistant Professor in Medical Therapeutics and Head of the Neuro Optics Lab at the University of Cambridge. Her work focuses on developing non-invasive brain monitoring in real-world environments where traditional brain monitoring isn’t usually possible.

Sarah Bohndiek

Sarah is a Professor of Biomedical Physics at the University of Cambridge, jointly appointed in the Department of Physics and the Cancer Research UK Cambridge Institute. Sarah leads an interdisciplinary team that uses optical imaging technology to monitor in situ tumour evolution and support earlier cancer detection.

Angie Burnett

Angie is a plant biologist, focused on investigating the responses of crop plants to environmental stresses, such as drought and extreme temperature. Angie worked as a Postdoctoral Research Associate at Brookhaven National Laboratory and a Consultant at the Food and Agriculture Organization of the United Nations, before becoming a Research Associate at the University of Cambridge.

Writing on the ARIA website, Professor Bohndiek and Dr Bale said: “We’re both passionate about the future health of our planet and the people on it. Working in the health tech space, we have created new tools to allow us to safely see inside humans in new ways using light. We believe that there are emerging optical technologies at the edge of the possible, which will disrupt the current landscape.

“As co-PDs, we’ll look to accelerate these technologies, initially by exploring ideas around non-invasive optical mapping and sensing across a range of applications – from monitoring human health to climate change.”

Professor Anne Ferguson-Smith, Pro-Vice-Chancellor for Research at the University of Cambridge, said: “The launch of ARIA, a brand new funding organisation, is an important moment for UK research and innovation. It’s a remarkable reflection of the quality of Cambridge research that three of the eight new programme directors stem from our University, including two current academics. We wish them luck in this exciting new endeavour.”




Over a third of UK medical students do not receive sexual misconduct training


More than a third of newly qualified doctors are leaving UK medical schools without any education on sexual misconduct specifically relating to the medical profession, according to new research led by the University of Cambridge.

“Our study shows it cannot be assumed that graduates who are working as junior doctors have received training on sexual misconduct before starting their roles” – Sarah Steele

The study, published today in JRSM Open and based on responses by the UK’s 34 medical schools to Freedom of Information requests, shows there is no standardisation of training on sexual misconduct across medical universities.

Almost half of medical schools offered no training, or only generalised harassment training that was not specific to sexual misconduct or that was wholly outside the context of being a doctor.

The researchers point to research findings earlier this year that showed training for NHS staff on sexual harassment intervention is lacking, while other reports highlighted 20,000 incidents of sexual misconduct in the NHS, leading to healthcare workers leaving their professions. In August 2023, the General Medical Council issued new professional standards for doctors which, for the first time, included explicit rules on sexual misconduct.

As future clinicians, medical students have a crucial and strategic need for education that enables them to model good behaviour and to intervene in, identify, assess and report sexual misconduct when they see it happening at work or in wider society, say the researchers.

Lead researcher Dr Sarah Steele of the University of Cambridge and Jesus College, Cambridge, commented: “Our study shows it cannot be assumed that graduates who are working as junior doctors have received training on sexual misconduct before starting their roles. Considering the magnitude of this issue, universities and professional bodies should urgently address this problem.”

The researchers point to serious shortcomings within the health sector in preventing and addressing sexual misconduct, with costs in damages to address sexual misconduct in the NHS exceeding £4 million in the last five years.

In the medical schools where compulsory sexual misconduct training was provided, a wide range of delivery methods were adopted, with workshops and lectures being the main delivery approach.

Dr Steele said: “With such significant variations in the context and format of teaching, it is important to research which methods and content are most effective in improving these future clinicians’ responses to this form of abuse and discrimination.

“The latest GMC professional standards make it imperative medical schools offer this training. Tomorrow’s doctors need to be trained properly if we are to put in place the zero-tolerance approach.”

The study’s curriculum analysis was significantly hampered by several medical schools refusing to provide information on the grounds that it was proprietary knowledge. “The idea that public universities offering medical education in accordance with the General Medical Council requirements are in competition, such that they do not share curricula and do not engage in knowledge exchange, is concerning to say the least,” added Dr Steele.

Reference
Dowling, T and Steele, S. Is sexual misconduct training sufficient in the UK’s medical schools: Results of a cross-sectional survey and opportunities for improvement. JRSM Open; 12 Sept 2023; DOI: 10.1177/20542704231198732

Press release from the Royal Society of Medicine




Healthy lifestyle can help prevent depression – and new research may explain why


A healthy lifestyle that involves moderate alcohol consumption, a healthy diet, regular physical activity, healthy sleep and frequent social connection, while avoiding smoking and too much sedentary behaviour, reduces the risk of depression, new research has found.

“Although our DNA – the genetic hand we’ve been dealt – can increase our risk of depression, we’ve shown that a healthy lifestyle is potentially more important.” – Barbara Sahakian

In research published today in Nature Mental Health, an international team of researchers, including from the University of Cambridge and Fudan University, looked at a combination of lifestyle factors, genetics, brain structure and our immune and metabolic systems to identify the underlying mechanisms that might explain this link.

According to the World Health Organization, around one in 20 adults experiences depression, and the condition poses a significant burden on public health worldwide. The factors that influence the onset of depression are complicated and include a mixture of biological and lifestyle factors.

To better understand the relationship between these factors and depression, the researchers turned to UK Biobank, a biomedical database and research resource containing anonymised genetic, lifestyle and health information about its participants.

By examining data from almost 290,000 people – of whom 13,000 had depression – followed over a nine-year period, the team was able to identify seven healthy lifestyle factors linked with a lower risk of depression. These were:

  • moderate alcohol consumption
  • healthy diet
  • regular physical activity
  • healthy sleep
  • never smoking
  • low-to-moderate sedentary behaviour
  • frequent social connection

Of all of these factors, having a good night’s sleep – between seven and nine hours a night – made the biggest difference, reducing the risk of depression, including single depressive episodes and treatment-resistant depression, by 22%.

Frequent social connection, which in general reduced the risk of depression by 18%, was the most protective against recurrent depressive disorder.

Moderate alcohol consumption decreased the risk of depression by 11%, healthy diet by 6%, regular physical activity by 14%, never smoking by 20%, and low-to-moderate sedentary behaviour by 13%.

Based on the number of healthy lifestyle factors they adhered to, individuals were assigned to one of three groups: unfavourable, intermediate, and favourable lifestyle. Individuals in the intermediate group were around 41% less likely to develop depression compared to those in the unfavourable lifestyle group, while those in the favourable lifestyle group were 57% less likely.
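As a rough illustration of how such a grouping works – the factor names and cutoffs below are assumed for the sake of example and are not the thresholds used in the study – the assignment can be sketched as follows:

    # Hypothetical sketch of the lifestyle grouping; factor names and cutoffs are
    # illustrative, not those used in the Nature Mental Health study.
    HEALTHY_FACTORS = {
        "moderate_alcohol", "healthy_diet", "regular_physical_activity",
        "healthy_sleep", "never_smoking", "low_sedentary_behaviour",
        "frequent_social_connection",
    }

    def lifestyle_group(factors_met):
        # Count how many of the seven healthy factors a person meets,
        # then map that count to a group label.
        score = len(set(factors_met) & HEALTHY_FACTORS)
        if score <= 2:        # assumed cutoff
            return "unfavourable"
        if score <= 4:        # assumed cutoff
            return "intermediate"
        return "favourable"   # 5-7 factors

    print(lifestyle_group(["healthy_sleep", "never_smoking", "healthy_diet",
                           "regular_physical_activity", "frequent_social_connection"]))
    # -> favourable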

The team then examined the DNA of the participants, assigning each a genetic risk score. This score was based on the number of genetic variants an individual carried that have a known link to risk of depression. Those with the lowest genetic risk score were 25% less likely to develop depression when compared to those with the highest score – a much smaller impact than lifestyle.

The team further found that a healthy lifestyle can cut the risk of depression in people at high, medium and low genetic risk alike. This research underlines the importance of living a healthy lifestyle for preventing depression, regardless of a person’s genetic risk.

Professor Barbara Sahakian, from the Department of Psychiatry at the University of Cambridge, said: “Although our DNA – the genetic hand we’ve been dealt – can increase our risk of depression, we’ve shown that a healthy lifestyle is potentially more important.

“Some of these lifestyle factors are things we have a degree of control over, so trying to find ways to improve them – making sure we have a good night’s sleep and getting out to see friends, for example – could make a real difference to people’s lives.”

To understand why a healthy lifestyle might reduce the risk of depression, the team studied a number of other factors.

First off, they examined MRI brain scans from just under 33,000 participants and found a number of regions of the brain where a larger volume – more neurons and connections – was linked to a healthy lifestyle. These included the pallidum, thalamus, amygdala and hippocampus.

Next, the team looked for markers in the blood that indicated problems with the immune system or metabolism (how we process food and produce energy). Among the markers found to be linked to lifestyle were C-reactive protein, a molecule produced in the body in response to stress, and triglycerides, one of the primary forms of fat that the body uses to store energy for later.

These links are supported by a number of previous studies. For example, exposure to stress in life can affect how well we are able to regulate blood sugar, which may lead to a deterioration of immune function and accelerate age-related damage to cells and molecules in the body. Poor physical activity and lack of sleep can damage the body’s ability to respond to stress. Loneliness and lack of social support have been found to increase the risk of infection and increase markers of immune deficiency.

The team found that the pathway from lifestyle to immune and metabolic functions was the most significant. In other words, a poorer lifestyle impacts on our immune system and metabolism, which in turn increases our risk of depression.

Dr Christelle Langley, also from the Department of Psychiatry at the University of Cambridge, said: “We’re used to thinking of a healthy lifestyle as being important to our physical health, but it’s just as important for our mental health. It’s good for our brain health and cognition, but also indirectly by promoting a healthier immune system and better metabolism.”

Professor Jianfeng Feng, from Fudan University and Warwick University, added: “We know that depression can start as early as in adolescence or young adulthood, so educating young people on the importance of a healthy lifestyle and its impact on mental health should begin in schools.”

This study was supported by grants from organisations including the National Natural Science Foundation of China and the Ministry of Science, China*.

Reference
Zhao, Y & Yang, L et al. The brain structure, immunometabolic and genetic mechanisms underlying the association between lifestyle and depression. Nature Mental Health; 11 Sept 2023; DOI: 10.1038/s44220-023-00120-1

*A full list of funders can be found in the paper.




Lack of evidence hampers progress on corporate-led ecosystem restoration

Degraded coral reef 'rubblefield' in Indonesia.


A near total lack of transparency is making it impossible to assess the quality of corporate-led ecosystem restoration projects, a new study finds.

“The world’s largest corporations have the potential to lift ecosystem restoration efforts to an unprecedented scale. But their involvement has to be managed with proper evidence and accountability, to make sure the outcomes are beneficial and fair for everyone.” – Rachael Garrett

An international team of scientists has analysed publicly available sustainability reports released by 100 of the world’s largest companies and found that despite many businesses claiming to actively rebuild damaged ecosystems, very little is known about what is actually being achieved.

Efforts to rebuild degraded environments are vital for achieving global biodiversity targets, and corporate-led projects offer huge potential to restore damaged and lost ecosystems around the globe.  

But the study reveals that over 90 percent of corporate-led restoration projects fail to report a single ecological outcome. Around 80 percent of projects do not reveal how much money is invested in restoration, and a third fail to even state the area of habitat that they aim to restore.

The United Nations has launched a Decade on Ecosystem Restoration, and in recent years businesses around the world have collectively pledged to plant billions of trees, hundreds of thousands of corals and tens of thousands of mangroves. Around two thirds of the top 100 largest global corporations undertake ecosystem restoration.

The study is published in the journal Science.

“Ultimately, if big businesses are going to contribute effectively to the UN Decade on Ecosystem Restoration, there needs to be transparency and consistency in reporting,” said Professor Rachael Garrett, Moran Professor of Conservation and Development at the University of Cambridge Conservation Research Institute, and a co-author of the report.

She added: “This is in the interest of the businesses themselves, who stand to gain from demonstrating to their customers, shareholders, employees and the wider public that they are making meaningful impacts with their declared restoration efforts.

“The world’s largest corporations have the potential to lift ecosystem restoration efforts to an unprecedented scale. But their involvement has to be managed with proper evidence and accountability, to make sure the outcomes are beneficial and fair for everyone.”

Many countries require businesses to conduct Environmental Impact Assessments (EIAs) to quantify and reduce their environmental damage, and other private-sector initiatives also encourage companies to measure and disclose their biodiversity impacts. However, the study finds that current guidelines and legal frameworks around ecosystem restoration are inadequate, and are not yet resulting in appropriate reporting by businesses.

The researchers are calling for more transparency around the reporting of corporate-led ecosystem restoration projects, and for reporting to be more consistently centred around scientific principles that determine ecosystem restoration success.

“Restoring degraded ecosystems is an urgent challenge for this decade, and big businesses have the potential to play a vital role,” said Dr Tim Lamont at Lancaster University, lead author of the study.

He added: “With their size, resources and logistics expertise, they could help deliver the large-scale restoration we need in many places. But at the moment there is very little transparency, which makes it hard for anyone to assess if projects are delivering benefits for ecosystems or people.

“When a business says it has planted thousands of trees to restore habitat and soak up carbon – how do we know if this has been delivered, if the trees will survive, and if it has resulted in a functioning ecosystem that benefits biodiversity and people? In many cases, we’ve found that the evidence provided by large corporations to support their claims is insufficient.”

The researchers say new improved reporting guidelines around ecosystem restoration should:

  • Recommend that companies clearly differentiate restoration activities that merely mitigate the negative environmental impacts of a business’ operations from those that aim to provide wider climate, biodiversity and social justice outcomes.
  • Recommend a principle-based approach, drawing from conservation science, for planning and reporting, so that restoration projects in a range of different contexts can all maintain high standards across core areas.
  • Ensure corporations engage with and empower local stakeholders to co-design restoration projects from the outset.

Reference

Lamont, T, et al: ‘Hold big business to task on ecosystem restoration: corporate reporting must embrace holistic principles from restoration science.’ Sept 2023, Science. DOI: 10.1126/science.adh2610

Adapted from a press release by Lancaster University.




Switching ‘spin’ on and off (and up and down) in quantum materials at room temperature

Artist's impression of aligned spins in an organic semiconductor


Researchers have found a way to control the interaction of light and quantum ‘spin’ in organic semiconductors that works even at room temperature.

“These new materials hold great promise for completely new applications, since we’ve been able to remove the need for ultra-cold temperatures” – Sebastian Gorgon

Spin is the term for the intrinsic angular momentum of electrons, which is referred to as up or down. Using the up/down spin states of electrons instead of the 0 and 1 in conventional computer logic could transform the way in which computers process information. And sensors based on quantum principles could vastly improve our abilities to measure and study the world around us.

An international team of researchers, led by the University of Cambridge, has found a way to use particles of light as a ‘switch’ that can connect and control the spin of electrons, making them behave like tiny magnets that could be used for quantum applications.

The researchers designed modular molecular units connected by tiny ‘bridges’. Shining a light on these bridges allowed electrons on opposite ends of the structure to connect to each other by aligning their spin states. Even after the bridge was removed, the electrons stayed connected through their aligned spins.

This level of control over quantum properties can normally only be achieved at ultra-low temperatures. However, the Cambridge-led team has been able to control the quantum behaviour of these materials at room temperature, which opens up a new world of potential quantum applications by reliably coupling spins to photons. The results are reported in the journal Nature.

Almost all types of quantum technology – based on the strange behaviour of particles at the subatomic level – involve spin. As they move, electrons usually form stable pairs, with one electron spin up and one spin down. However, it is possible to make molecules with unpaired electrons, called radicals. Most radicals are very reactive, but with careful design of the molecule, they can be made chemically stable.

“These unpaired spins change the rules for what happens when a photon is absorbed and electrons are moved up to a higher energy level,” said first author Sebastian Gorgon, from Cambridge’s Cavendish Laboratory. “We’ve been working with systems where there is one net spin, which makes them good for light emission and making LEDs.”

Gorgon is a member of Professor Sir Richard Friend’s research group, where they have been studying radicals in organic semiconductors for light generation, and identified a stable and bright family of materials a few years ago. These materials can beat the best conventional OLEDs for red light generation.

“Using tricks developed by different fields was important,” said Dr Emrys Evans from Swansea University, who co-led the research. “The team has significant expertise from a number of areas in physics and chemistry, such as the spin properties of electrons and how to make organic semiconductors work in LEDs. This was critical for knowing how to prepare and study these molecules in the solid state, enabling our demonstration of quantum effects at room temperature.”

Organic semiconductors are the current state-of-the-art for lighting and commercial displays, and they could be a more sustainable alternative to silicon for solar cells. However, they have not yet been widely studied for quantum applications, such as quantum computing or quantum sensing.

“We’ve now taken the next big step and linked the optical and magnetic properties of radicals in an organic semiconductor,” said Gorgon. “These new materials hold great promise for completely new applications, since we’ve been able to remove the need for ultra-cold temperatures.”

“Knowing what electron spins are doing, let alone controlling them, is not straightforward, especially at room temperature,” said Friend, who co-led the research. “But if we can control the spins, we can build some interesting and useful quantum objects.”

The researchers designed a new family of materials by first determining how they wanted the electron spins to behave. Using this bottom-up approach, they were able to control the properties of the end material by using a building block method and changing the ‘bridges’ between different modules of the molecule. These bridges were made of anthracene, a type of hydrocarbon.

For their ‘mix-and-match’ molecules, the researchers attached a bright light-emitting radical to an anthracene molecule. After a photon of light is absorbed by the radical, the excitation spreads out onto the neighbouring anthracene, causing three electrons to start spinning in the same way. When a further radical group is attached to the other side of the anthracene molecule, its electron is also coupled, bringing four electrons to spin in the same direction.

“In this example, we can switch on the interaction between two electrons on opposite ends of the molecule by aligning electron spins on the bridge absorbing a photon of light,” said Gorgon. “After relaxing back, the distant electrons remember they were together even after the bridge is gone.

“In these materials we’ve designed, absorbing a photon is like turning a switch on. The fact that we can start to control these quantum objects by reliably coupling spins at room temperature could open up far more flexibility in the world of quantum technologies. There’s a huge potential here to go in lots of new directions.”

“People have spent years trying to get spins to reliably talk to each other, but by starting instead with what we want the spins to do, and then having the chemists design a molecule around that, we’ve been able to get the spins to align,” said Friend. “It’s like we’ve hit the Goldilocks zone where we can tune the spin coupling between the building blocks of extended molecules.”

The advance was made possible through a large international collaboration – the materials were made in China, experiments were done in Cambridge, Oxford and Germany, and theory work was done in Belgium and Spain.

The research was supported in part by the European Research Council, the European Union, the Engineering and Physical Sciences Research Council (EPSRC), part of UK Research and Innovation (UKRI), and the Royal Society. Richard Friend is a Fellow of St John’s College, Cambridge.

Reference:
Sebastian Gorgon et al. ‘Reversible spin-optical interface in luminescent organic radicals.’ Nature (2023). DOI: 10.1038/s41586-023-06222-1




Reduced grey matter in frontal lobes linked to teenage smoking and nicotine addiction – study


Findings may demonstrate a brain and behavioural basis for how nicotine addiction is initiated and then takes hold in early life, say scientists. 

“Smoking is perhaps the most common addictive behaviour in the world, and a leading cause of adult mortality” – Trevor Robbins

Levels of grey matter in two parts of the brain may be linked to a desire to start smoking during adolescence and the strengthening of nicotine addiction, a new study has shown.

A team of scientists, led by the universities of Cambridge and Warwick in the UK and Fudan University in China, analysed brain imaging and behavioural data of over 800 young people at the ages of 14, 19 and 23.

They found that, on average, teenagers who started smoking by 14 years of age had markedly less grey matter in a section of the left frontal lobe linked to decision-making and rule-breaking. 

Grey matter is the brain tissue that processes information, and contains all of the organ’s neurons. While brain development continues into adulthood, grey matter growth peaks before adolescence.  

Low grey matter volume in the left side of the ventromedial prefrontal cortex may be an “inheritable biomarker” for nicotine addiction, say researchers – with implications for prevention and treatment. 

In addition, the scientists found that the opposite, right part of the same brain region also had less grey matter in smokers.

Importantly, loss of grey matter in the right prefrontal cortex appears to speed up only after someone has started smoking. This region is linked to the seeking of sensations.

The team argue that less grey matter in the left forebrain could lower cognitive function and lead to “disinhibition”: impulsive, rule-breaking behaviour arising from a limited ability to consider consequences. This may increase the chances of smoking at a young age.

Once a nicotine habit takes hold, grey matter in the right frontal lobe shrinks, which may weaken control over smoking by affecting “hedonic motivation”: the way pleasure is sought and managed. Excessive loss of grey matter in the right brain was also linked to binge drinking and marijuana use. 

Taken together, the findings point to a damaged “neurobehavioural mechanism” that can lead to nicotine use starting early and becoming locked into long-term addiction, say researchers. The study used data from the IMAGEN project and is published in the journal Nature Communications.

“Smoking is perhaps the most common addictive behaviour in the world, and a leading cause of adult mortality,” said Prof Trevor Robbins, co-senior author from Cambridge’s Department of Psychology.

“The initiation of a smoking habit is most likely to occur during adolescence. Any way of detecting an increased chance of this, so we can target interventions, could help save millions of lives.”

Annual deaths from cigarettes are expected to reach eight million worldwide by the end of the decade. In the US alone, one in five adult deaths each year is currently attributed to smoking.

“In our study, reduced grey matter in the left prefrontal cortex is associated with increased rule-breaking behaviour as well as early smoking experiences. It could be that this rule-breaking leads to the violation of anti-smoking norms,” said Robbins.

Co-author Prof Barbara Sahakian from Cambridge’s Department of Psychiatry said: “The ventromedial prefrontal cortex is a key region for dopamine, the brain’s pleasure chemical. As well as a role in rewarding experiences, dopamine has long been believed to affect self-control. 

“Less grey matter across this brain region may limit cognitive function, leading to lower self-control and a propensity for risky behaviour, such as smoking.”

The study used data gathered by the IMAGEN project from sites in four European countries: UK, Germany, France and Ireland. The researchers compared brain imaging data for those who had smoked by age 14 with those who had not, and repeated this for the same participants at ages 19 and 23.

Those with smoking experience by 14 years of age had significantly less grey matter in the left prefrontal cortex, on average. Additionally, those who started smoking by age 19 also had less grey matter in their left prefrontal cortex at 14, indicating a potential causal influence.
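
To make the kind of comparison described here concrete, below is a minimal, purely illustrative sketch in Python of a group comparison of grey-matter volume between early smokers and non-smokers. The group sizes, effect size and noise levels are invented for illustration; the IMAGEN analysis itself used full imaging pipelines, covariates and statistical corrections.

```python
# Purely illustrative sketch: comparing grey-matter volume in a region of
# interest between early smokers and non-smokers with a two-sample test.
# All numbers here are invented, not taken from the study.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical standardised grey-matter volumes (z-scores) for the left
# ventromedial prefrontal cortex at age 14.
non_smokers = rng.normal(loc=0.0, scale=1.0, size=600)
smokers_by_14 = rng.normal(loc=-0.25, scale=1.0, size=200)  # assumed lower mean

t_stat, p_value = stats.ttest_ind(smokers_by_14, non_smokers, equal_var=False)
print(f"Welch t = {t_stat:.2f}, p = {p_value:.4f}")
```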

The scientists also looked at the right ventromedial prefrontal cortex. Grey matter loss occurs in everyone as they age. However, those who smoked from age 14 as well as those smoking from age 19 both ended up with excessive grey matter loss in the right frontal lobe.

For the right prefrontal cortex, 19-year-old smokers who did not start during adolescence had similar grey matter levels at age 14 to those who never smoked at all. This suggests a rapid reduction in the right ventromedial prefrontal cortex only begins with the onset of smoking.

Data at age 23 showed that grey matter volume in the right prefrontal cortex shrank at a faster pace in those who continued to smoke, suggesting an influence of smoking itself on prefrontal function.

Researchers also analysed data from two questionnaires completed by participants to investigate the personality traits of novelty seeking and sensation seeking.

“Both questionnaires examine the pursuit of thrilling experiences, but they measure distinct behaviours,” said Robbins. “The sensation seeking scale focuses on pleasurable experiences, while the novelty seeking questionnaire includes items on impulsiveness and rule-breaking.”

Less grey matter in the left prefrontal cortex was associated with novelty seeking, particularly disorderly and rule-breaking behaviour, while reduced grey matter volume in the right prefrontal cortex was linked to sensation seeking only.

Lead author Prof Tianye Jia from Fudan University added: “Less grey matter in the left frontal lobes is linked to behaviours that increase the likelihood of smoking in adolescence.

“Smokers then experience excessive loss of grey matter in the right frontal lobes, which is linked to behaviours that reinforce substance use. This may provide a causal account of how smoking is initiated in young people, and how it turns into dependence.”

Reference:
Jia, Tianye; Xiang, Shintong et al. Association between vmPFC gray matter volume and smoking initiation in adolescents. Nature Communications; 15 Aug 2023; DOI: 10.1038/s41467-023-40079-2




Death tolls from climate disasters will ‘balloon’ without investment in Africa’s weather stations

Drone shot in front of a spinning weather station, Free State, South Africa

source: www.cam.ac.uk

Investment in ‘hydromet systems’ using technologies from AI to SMS would provide a nine-to-one ROI in saved lives and assets across African nations.

Well-funded hydromet systems must become a priority to help at-risk populations mitigate and adapt to weather-related hazards as the effects of climate change take hold – Catherine Richards

The climate crisis is increasing the frequency and intensity of floods, droughts and heatwaves, with Africa expected to be among the global regions hit hardest.

Yet the systems and technologies across the continent that monitor and forecast weather events and changes to water levels are “missing, outmoded or malfunctioning” – leaving African populations even more exposed to climate change.

This is according to a team of risk experts and climatologists from the UK and Africa led by the University of Cambridge, who warn that without major and rapid upgrades to “hydromet infrastructure”, the damage and death toll caused by climate-related disasters across Africa will “balloon”.

Writing in the journal Nature, the authors point to the latest research showing that, over the last two decades, the average number of deaths per flood in Africa has been four times higher than in Europe and North America.

When investigating this disparity, the team looked at World Meteorological Organization (WMO) data and found that the entire continent of Africa has just 6% of the radar stations of the US and Europe combined, despite having a comparable population size and a third more land.*

Radar stations detect weather fluctuations and rainfall as well as long-term climate trends, and are vital for the forewarning of impending floods and other meteorological events. The African continent has just 37 such stations.

Moreover, WMO data shows that more than 50% of the radar stations that do currently operate across Africa are unable to produce accurate enough data to predict weather patterns for the coming days or even hours. 

The research team call on the international community to boost funding for systems that mitigate risks to life from climate disasters. Currently, just US $0.47 of every $100 spent on global development aid goes towards disaster risk reduction of any kind.

“The vast gaps in Africa’s disaster reduction systems are in danger of rendering other aid investments redundant,” said Dr Asaf Tzachor, co-lead author and research affiliate at Cambridge’s Centre for the Study of Existential Risk (CSER).

“For example, there is little point investing in smallholder farms if floods are simply going to wash away seeds, agrochemicals, and machinery.”

“We need to offer all Africans a chance to reduce their exposure to climate risks by fixing this glaring hydro-meteorological blind spot, before ever more lives are lost to the effects of global heating.”

To illustrate their point, the team compare two recent category 4 storms: Tropical Cyclone Idai hit southeast Africa in 2019, and Hurricane Ida swept the eastern US in 2021. Both had wind speeds of over 200km/hour.

US populations received evacuation alerts before Ida hit land, but the limited ‘hydromet’ capabilities meant Idai caught African nations by surprise. The US death toll was under a hundred, while over a thousand Africans lost their lives.

“Multilayered hydromet systems, including weather monitoring, forecasting and early warning, are taken for granted by the Global North, and have been for decades,” said co-lead author Dr Catherine Richards, also from CSER at the University of Cambridge.  

“Meanwhile, the most foundational layer on which the others depend is often missing, outmoded or malfunctioning across Africa – more so than any other global region.”

“Well-funded hydromet systems must become a priority to help at-risk populations mitigate and adapt to weather-related hazards as the effects of climate change take hold,” Richards said.

The team outline a series of recommendations for plugging Africa’s weather-warning gap.

Firstly, identify the most at-risk areas. “Types of climate hazard vary wildly across the continent – from the cyclones in Madagascar to the protracted droughts of east Africa,” said Tzachor.

“The need for more weather stations across Africa is undeniable, but this must go hand-in-hand with improved satellite monitoring and major training initiatives to increase the number of skilled African meteorologists.”

The latest computational techniques must be adopted, say the authors, including automated AI approaches that combine weather data with social media activity to predict disaster dynamics.

Early warning systems need to be expanded, and provide clear directions to evacuate in local dialects. “Over 80% of Africans have access to a mobile network, so text messages could be a powerful way to deliver targeted warnings,” said Richards.

Finally, major investment will be vital – and pay dividends. “The World Bank has estimated a $1.5 billion price tag for continent-wide hydromet systems, but it would save African countries from $13 billion in asset losses and $22 billion in livelihood losses annually,” said Tzachor. “A nearly nine-to-one return on investment is surely a no-brainer.”   


* In Europe and the US, there are 636 radar stations for a total population of 1.1 billion and a landmass of 20 million km². In Africa, there are just 37 for a comparable population of 1.2 billion and landmass of 30 million km².
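
For readers who want to see where the headline comparison comes from, the figures in this footnote imply the following (simple arithmetic on the quoted numbers only, not additional data):

```latex
% Arithmetic check using only the figures quoted in the footnote above
\frac{37}{636} \approx 5.8\% \approx 6\%, \qquad
\frac{636}{20 \times 10^{6}\ \mathrm{km}^{2}} \approx 32 \text{ stations per million km}^{2}, \qquad
\frac{37}{30 \times 10^{6}\ \mathrm{km}^{2}} \approx 1.2 \text{ stations per million km}^{2}.
```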




How sure is sure? Incorporating human error into machine learning

Futuristic image of a doctor looking at brain scans

source: www.cam.ac.uk

Researchers are developing a way to incorporate one of the most human of characteristics – uncertainty – into machine learning systems.

Uncertainty is central in how humans reason about the world but many AI models fail to take this into account – Katherine Collins

Human error and uncertainty are concepts that many artificial intelligence systems fail to grasp, particularly in systems where a human provides feedback to a machine learning model. Many of these systems are programmed to assume that humans are always certain and correct, but real-world decision-making includes occasional mistakes and uncertainty.

Researchers from the University of Cambridge, along with The Alan Turing Institute, Princeton, and Google DeepMind, have been attempting to bridge the gap between human behaviour and machine learning, so that uncertainty can be more fully accounted for in AI applications where humans and machines are working together. This could help reduce risk and improve trust and reliability of these applications, especially where safety is critical, such as medical diagnosis.

The team adapted a well-known image classification dataset so that humans could provide feedback and indicate their level of uncertainty when labelling a particular image. The researchers found that training with these uncertain labels can improve the systems’ handling of uncertain feedback, although incorporating human feedback also caused the overall performance of the hybrid systems to drop. Their results will be reported at the AAAI/ACM Conference on Artificial Intelligence, Ethics and Society (AIES 2023) in Montréal.

‘Human-in-the-loop’ machine learning systems – a type of AI system that enables human feedback – are often framed as a promising way to reduce risks in settings where automated models cannot be relied upon to make decisions alone. But what if the humans are unsure?

“Uncertainty is central in how humans reason about the world but many AI models fail to take this into account,” said first author Katherine Collins from Cambridge’s Department of Engineering. “A lot of developers are working to address model uncertainty, but less work has been done on addressing uncertainty from the person’s point of view.”

We are constantly making decisions based on the balance of probabilities, often without really thinking about it. Most of the time – for example, if we wave at someone who looks just like a friend but turns out to be a total stranger – there’s no harm if we get things wrong. However, in certain applications, uncertainty comes with real safety risks.

“Many human-AI systems assume that humans are always certain of their decisions, which isn’t how humans work – we all make mistakes,” said Collins. “We wanted to look at what happens when people express uncertainty, which is especially important in safety-critical settings, like a clinician working with a medical AI system.”

“We need better tools to recalibrate these models, so that the people working with them are empowered to say when they’re uncertain,” said co-author Matthew Barker, who recently completed his MEng degree at Gonville & Caius College, Cambridge. “Although machines can be trained with complete confidence, humans often can’t provide this, and machine learning models struggle with that uncertainty.”

For their study, the researchers used three benchmark machine learning datasets: one for digit classification, one for classifying chest X-rays, and one for classifying images of birds. For the first two datasets, the researchers simulated uncertainty, but for the bird dataset, they had human participants indicate how certain they were about the images they were looking at: whether a bird was red or orange, for example. These annotated ‘soft labels’ provided by the human participants allowed the researchers to determine how the final output was changed. However, they found that performance degraded rapidly when machines were replaced with humans.
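
As a rough illustration of what ‘soft labels’ mean in practice, the sketch below trains a toy classifier against annotator-supplied probability distributions rather than hard class labels. It is a minimal example written for this article, not the authors’ code: the data, model and label distributions are all invented.

```python
# Minimal sketch of training with human "soft labels": instead of a one-hot
# target, each example carries a probability distribution reflecting the
# annotator's stated uncertainty. Not the authors' pipeline.
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

n_classes, n_features = 3, 8          # e.g. three visually similar bird classes
model = nn.Linear(n_features, n_classes)
opt = torch.optim.Adam(model.parameters(), lr=1e-2)

# Toy data: features plus annotator-provided soft labels.
# A confident annotator gives ~[1, 0, 0]; an uncertain one might give [0.6, 0.4, 0].
x = torch.randn(32, n_features)
soft_labels = torch.softmax(torch.randn(32, n_classes) * 2, dim=1)

def soft_cross_entropy(logits, target_probs):
    # Cross-entropy against a full distribution rather than a hard class index.
    return -(target_probs * F.log_softmax(logits, dim=1)).sum(dim=1).mean()

for step in range(200):
    opt.zero_grad()
    loss = soft_cross_entropy(model(x), soft_labels)
    loss.backward()
    opt.step()
```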

“We know from decades of behavioural research that humans are almost never 100% certain, but it’s a challenge to incorporate this into machine learning,” said Barker. “We’re trying to bridge the two fields so that machine learning can start to deal with human uncertainty where humans are part of the system.”

The researchers say their results have identified several open challenges when incorporating humans into machine learning models. They are releasing their datasets so that further research can be carried out and uncertainty might be built into machine learning systems.  

“As some of our colleagues so brilliantly put it, uncertainty is a form of transparency, and that’s hugely important,” said Collins. “We need to figure out when we can trust a model and when to trust a human and why. In certain applications, we’re looking at probability over possibilities. Especially with the rise of chatbots, for example, we need models that better incorporate the language of possibility, which may lead to a more natural, safe experience.”

“In some ways, this work raised more questions than it answered,” said Barker. “But even though humans may be miscalibrated in their uncertainty, we can improve the trustworthiness and reliability of these human-in-the-loop systems by accounting for human behaviour.”

The research was supported in part by the Cambridge Trust, the Marshall Commission, the Leverhulme Trust, the Gates Cambridge Trust and the Engineering and Physical Sciences Research Council (EPSRC), part of UK Research and Innovation (UKRI).

Reference:
Katherine M. Collins et al. ‘Human Uncertainty in Concept-Based AI Systems.’ Paper presented at the Sixth AAAI/ACM Conference on Artificial Intelligence, Ethics and Society (AIES 2023), August 8-10, 2023. Montréal, QC, Canada.




Dark energy could be measured by studying the galaxy next door

Artist's impression of the predicted collision between the Milky Way and Andromeda

source: www.cam.ac.uk

Researchers have found a new way to measure dark energy – the mysterious force that makes up more than two-thirds of the universe and is responsible for its accelerating expansion – in our own cosmic backyard.

Andromeda is the only galaxy that isn’t running away from us, so by studying its mass and movement, we may be able to make some determinations about dark energy – David Benisty

The researchers, from the University of Cambridge, found that it may be possible to detect and measure dark energy by studying Andromeda, our galactic next-door neighbour that is on a slow-motion collision course with the Milky Way.

Since dark energy was first identified in the late 1990s, scientists have used very distant galaxies to study it, but have yet to detect it directly. However, the Cambridge researchers found that by studying how Andromeda and the Milky Way are moving toward each other given their collective mass, they could place an upper limit on the value of the cosmological constant, which is the simplest model of dark energy. The upper limit they found is five times higher than the value of the cosmological constant that can be detected from the early universe.

Although the technique is still early in its development, the researchers say that it could be possible to detect dark energy by studying our own cosmic neighbourhood. The results are reported in The Astrophysical Journal Letters.

Everything we can see in our world and in the skies – from tiny insects to massive galaxies – makes up just five percent of the observable universe. The rest is dark: scientists believe that about 27% of the universe is made of dark matter, which holds objects together, while 68% is dark energy, which pushes objects apart.

“Dark energy is a general name for a family of models you could add to Einstein’s theory of gravity,” said first author Dr David Benisty from the Department of Applied Mathematics and Theoretical Physics. “The simplest version of this is known as the cosmological constant: a constant energy density that pushes galaxies away from each other.”

The cosmological constant was temporarily added by Einstein to his theory of general relativity. From the 1930s to the 1990s, the cosmological constant was set at zero, until it was discovered that an unknown force – dark energy – was causing the expansion of the universe to accelerate. There are at least two big problems with dark energy, however: we don’t know exactly what it is, and we haven’t directly detected it.

Since it was first identified, astronomers have developed a variety of methods to detect dark energy, most of which involve studying objects from the early universe and measuring how quickly they are moving away from us. Unpacking the effects of dark energy from billions of years ago is not easy: since it is a weak force between galaxies, dark energy is easily overcome by the much stronger forces inside galaxies.

However, there is one region of the universe that is surprisingly sensitive to dark energy, and it’s in our own cosmic backyard. The Andromeda galaxy is the closest to our own Milky Way, and the two galaxies are on a collision course. As they draw closer, the two galaxies will start to orbit each other – very slowly. A single orbit will take 20 billion years. However, due to the massive gravitational forces, well before a single orbit is complete, about five billion years from now, the two galaxies will start merging and falling into each other.  

“Andromeda is the only galaxy that isn’t running away from us, so by studying its mass and movement, we may be able to make some determinations about the cosmological constant and dark energy,” said Benisty, who is also a Research Associate at Queens’ College.

Using a series of simulations based on the best available estimates of the mass of both galaxies, Benisty and his co-authors – Professor Anne Davis from DAMTP and Professor Wyn Evans from the Institute of Astronomy – found that dark energy is affecting how Andromeda and the Milky Way are orbiting each other.

“Dark energy affects every pair of galaxies: gravity wants to pull galaxies together, while dark energy pushes them apart,” said Benisty. “In our model, if we change the value of the cosmological constant, we can see how that changes the orbit of the two galaxies. Based on their mass, we can place an upper bound on the cosmological constant, which is about five times higher than we can measure from the rest of the universe.”
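
A minimal sketch of the underlying idea, not taken from the paper: in the Newtonian limit, a cosmological constant Λ adds a small repulsive term to the gravitational attraction between the two galaxies, so their separation r (with their combined masses written here as M_MW and M_M31) evolves roughly as

```latex
% Sketch only (not the paper's derivation): Newtonian two-body dynamics for the
% Milky Way--Andromeda separation r, with the cosmological-constant term added.
\ddot{r} = -\frac{G\left(M_{\mathrm{MW}} + M_{\mathrm{M31}}\right)}{r^{2}} + \frac{\Lambda c^{2}}{3}\, r
```

A larger Λ gives a stronger outward push, so the pair’s measured masses, separation and approach speed limit how large Λ can plausibly be – which is the sense in which the orbit places an upper bound on the cosmological constant.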

The researchers say that while the technique could prove immensely valuable, it is not yet a direct detection of dark energy. Data from the James Webb Space Telescope (JWST) will provide far more accurate measurements of Andromeda’s mass and motion, which could help reduce the upper bounds of the cosmological constant.

In addition, by studying other pairs of galaxies, it could be possible to further refine the technique and determine how dark energy affects our universe. “Dark energy is one of the biggest puzzles in cosmology,” said Benisty. “It could be that its effects vary over distance and time, but we hope this technique could help unravel the mystery.”

Reference:
David Benisty, Anne-Christine Davis, and N. Wyn Evans. ‘Constraining Dark Energy from the Local Group Dynamics.’ The Astrophysical Journal Letters (2023). DOI: 10.3847/2041-8213/ace90b




Stealth swimmers: the fish that hide behind others to hunt

Stealth swimmers: the fish that hide behind others to hunt


source: www.cam.ac.uk

An experiment on coral reefs provides the first evidence that predators use other animals for motion camouflage to approach their prey without detection.

The shadowing behaviour of the trumpetfish appears a useful strategy to improve its hunting success – James Herbert-Read

A new study provides the first experimental evidence that the trumpetfish, Aulostomus maculatus, can conceal itself by swimming closely behind another fish while hunting – and reduce the likelihood of being detected by its prey.

In this ‘shadowing’ behaviour, the long, thin trumpetfish uses a non-threatening species of fish, such as parrotfish, as camouflage to get closer to its dinner.

This is the only known example of one non-human animal using another as a form of concealment.

The research involved hours of diving in the Caribbean Sea, pulling hand-painted model fish along a wire.

“When a trumpetfish swims closely alongside another species of fish, it’s either hidden from its prey entirely, or seen but not recognised as a predator because the shape is different,” said Dr Sam Matchette, a researcher in the University of Cambridge’s Department of Zoology and first author of the study.

Damselfish, Stegastes partitus, form colonies on the seafloor and are a common meal for trumpetfish. Working amongst the coral reefs off the Dutch Caribbean island of Curaçao, researchers set up an underwater system to pull 3D-printed models of trumpetfish on nylon lines past colonies of damselfish, and filmed their responses.

When the trumpetfish model moved past alone, damselfish swam up to inspect – and rapidly fled back to shelter in response to the predatory threat.

When a model of a herbivorous parrotfish, Sparisoma viride, moved past alone, the damselfish inspected and responded far less.

When a trumpetfish model was attached to the side of a parrotfish model – to replicate the shadowing behaviour of the real trumpetfish – the damselfish responded just as they had to the parrotfish model alone: they had not detected the threat.

Matchette said: “I was surprised that the damselfish had such a profoundly different response to the different fish; it was great to watch this happening in real time.”

The study, involving collaborators at the University of Bristol, is published today in the journal Current Biology.

“Doing manipulative experiments in the wild like this allows us to test the ecological relevance of these behaviours,” said Professor Andy Radford from the University of Bristol’s School of Biological Sciences, a co-author of the study.

Matchette, along with his co-author and dive buddy Christian Drerup, spent hours underwater, barely moving, to conduct their experiment.

Their earlier questioning of divers working at dive shops in the Caribbean revealed that trumpetfish are commonly seen swimming alongside parrotfish and other reef fish – but the reason for this remarkable behaviour had not been tested.

In addition, divers were much more likely to have seen the shadowing behaviour on degraded, less structurally complex reefs.

Coral reefs around the world are being degraded due to the warming climate, pollution and overfishing. The researchers say the strategy of hiding behind other moving fish may help animals adapt to the impacts of environmental change.

“The shadowing behaviour of the trumpetfish appears to be a useful strategy to improve its hunting success. We might see this behaviour becoming more common in the future as fewer structures on the reef are available for them to hide behind,” said Dr James Herbert-Read from the University of Cambridge’s Department of Zoology, senior author of the study.

Human duck hunters historically hid behind cardboard cut-outs of domestic animals – called ‘stalking horses’ – to approach ducks without being detected. But this strategy has received little attention in non-human animals and has never been experimentally tested before. 

The research was funded by The Whitten Programme in Tropical and Aquatic Biology, The Association for the Study of Animal Behaviour, and The Fisheries Society of the British Isles.

Reference

Matchette, S R et al.: ‘Predatory trumpetfish conceal themselves from their prey by swimming alongside other fish.’ Current Biology, August 2023. DOI: 10.1016/j.cub.2023.05.075




Robots cause company profits to fall – at least at first

Robots on a manufacturing line

source: www.cam.ac.uk

Researchers have found that robots can have a ‘U-shaped’ effect on profits: causing profit margins to fall at first, before eventually rising again.

It’s important that companies develop new processes at the same time as they’re incorporating robots, otherwise they will reach this same pinch point – Chander Velu

The researchers, from the University of Cambridge, studied industry data from the UK and 24 other European countries between 1995 and 2017, and found that at low levels of adoption, robots have a negative effect on profit margins. But at higher levels of adoption, robots can help increase profits.

According to the researchers, this U-shaped phenomenon is due to the relationship between reducing costs, developing new processes and innovating new products. While many companies first adopt robotic technologies to decrease costs, this ‘process innovation’ can be easily copied by competitors, so at low levels of robot adoption, companies are focused on their competitors rather than on developing new products. However, as levels of adoption increase and robots are fully integrated into a company’s processes, the technologies can be used to increase revenue by innovating new products.

In other words, firms using robots are likely to focus initially on streamlining their processes before shifting their emphasis to product innovation, which gives them greater market power via the ability to differentiate from their competitors. The results are reported in the journal IEEE Transactions on Engineering Management.

Robots have been widely used in industry since the 1980s, especially in sectors where they can carry out physically demanding, repetitive tasks, such as automotive assembly. In the decades since, the rate of robot adoption has increased dramatically and consistently worldwide, and the development of precise, electrically controlled robots makes them particularly useful for high-value manufacturing applications requiring greater precision, such as electronics.

While robots have been shown to reliably raise labour productivity at an industry or country level, what has been less studied is how robots affect profit margins at a similar macro scale.

“If you look at how the introduction of computers affected productivity, you actually see a slowdown in productivity growth in the 1970s and early 1980s, before productivity starts to rise again, which it did until the financial crisis of 2008,” said co-author Professor Chander Velu from Cambridge’s Institute for Manufacturing. “It’s interesting that a tool meant to increase productivity had the opposite effect, at least at first. We wanted to know whether there is a similar pattern with robotics.”

“We wanted to know whether companies were using robots to improve processes within the firm, rather than improve the whole business model,” said co-author Dr Philip Chen. “Profit margin can be a useful way to analyse this.”

The researchers examined industry-level data for 25 EU countries (including the UK, which was a member at the time) between 1995 and 2017. While the data did not drill down to the level of individual companies, the researchers were able to look at whole sectors, primarily in manufacturing where robots are commonly used.

The researchers then obtained robotics data from the International Federation of Robotics (IFR) database. By comparing the two sets of data, they were able to analyse the effect of robotics on profit margins at a country level.
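
As a purely illustrative sketch of how a U-shaped relationship can be detected in data of this kind, one can regress profit margin on robot adoption and its square and check the sign of the quadratic term. The variable names, units and numbers below are invented for illustration and are not the authors’ econometric specification.

```python
# Illustrative sketch only (not the authors' model): a U-shaped relationship
# appears as a positive coefficient on the squared adoption term when profit
# margin is regressed on robot adoption and its square.
import numpy as np

rng = np.random.default_rng(0)

robot_adoption = rng.uniform(0, 10, size=500)           # hypothetical robots per 1,000 workers
margin = 0.05 * (robot_adoption - 5) ** 2 + 2.0         # built-in U shape, minimum at 5
profit_margin = margin + rng.normal(0, 0.5, size=500)   # add noise

# Ordinary least squares: margin = b0 + b1 * adoption + b2 * adoption^2
X = np.column_stack([np.ones_like(robot_adoption), robot_adoption, robot_adoption**2])
b0, b1, b2 = np.linalg.lstsq(X, profit_margin, rcond=None)[0]

print(f"quadratic coefficient b2 = {b2:.3f} (positive suggests a U shape)")
print(f"estimated turning point at adoption level {-b1 / (2 * b2):.2f}")
```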

“Intuitively, we thought that more robotic technologies would lead to higher profit margins, but the fact that we see this U-shaped curve instead was surprising,” said Chen.

“Initially, firms are adopting robots to create a competitive advantage by lowering costs,” said Velu. “But process innovation is cheap to copy, and competitors will also adopt robots if it helps them make their products more cheaply. This then starts to squeeze margins and reduce profits.”

The researchers then carried out a series of interviews with an American medical equipment manufacturer to study their experiences with robot adoption.

“We found that it’s not easy to adopt robotics into a business – it costs a lot of money to streamline and automate processes,” said Chen.

“When you start bringing more and more robots into your process, eventually you reach a point where your whole process needs to be redesigned from the bottom up,” said Velu. “It’s important that companies develop new processes at the same time as they’re incorporating robots, otherwise they will reach this same pinch point.”

The researchers say that if companies want to reach the profitable side of the U-shaped curve more quickly, it’s important that the business model is adapted concurrently with robot adoption. Only after robots are fully integrated into the business model can companies fully use the power of robotics to develop new products, driving profits.

A related piece of work being led by the Institute for Manufacturing is a community programme to help small- and medium-sized enterprises (SMEs) adopt digital technologies, including robotics, in a low-cost, low-risk way. “Incremental and step changes in this area enable SMEs to get the benefits of cost reduction as well as margin improvements from new products,” said co-author Professor Duncan McFarlane.

The research was supported by the Engineering and Physical Sciences Research Council (EPSRC) and the Economic and Social Research Council (ESRC), which are both part of UK Research and Innovation (UKRI). Chander Velu is a Fellow of Selwyn College, Cambridge. Duncan McFarlane is a Fellow of St John’s College, Cambridge. 

Reference:
Yifeng P. Chen, Chander Velu, Duncan McFarlane. ‘The Effect of Robot Adoption on Profit Margins.’ IEEE Transactions on Engineering Management (2023). DOI: 10.1109/TEM.2023.3260734




Genetic variant linked to lower levels of HIV virus in people of African ancestry

Know your HIV status sign in Simonga village, Zambia.

source: www.cam.ac.uk

An international team of researchers has found a genetic variant that may explain why some people of African ancestry have naturally lower viral loads of HIV, reducing their risk of transmitting the virus and slowing progress of their own illness.

Every time we discover something new about HIV control, we learn something new about the virus and something new about the cell – Harriet Groom

Reported today in Nature, this is the first new genetic variant related to HIV infection discovered in over 25 years of research. It could, in the future, help direct the development of new treatment approaches for those living with HIV.

HIV remains a major threat to global health. According to UNAIDS, there were 38.4 million people living with HIV globally in 2021. A combination of pre-exposure drugs and medicines that dramatically reduce viral loads has had a major impact on transmission, yet 1.5 million people were newly infected in 2021. And while treatments have improved dramatically since the virus was first identified, 650,000 people still died from AIDS-related illnesses in that year.

Viral load is the amount of a virus in a patient’s system. Higher levels are known to correlate with faster disease progression and increased risk of transmission. But viral load varies widely among infected individuals, influenced by a number of factors including an individual’s genetic makeup.

Most of what we know about the relationship between our DNA and HIV comes from studies among European populations. But given that HIV disproportionately affects people on the African continent – more than 25 million people who are HIV-positive live on the continent – it’s important to better understand the role of genetics in HIV infection in African populations.

To investigate this question, researchers analysed the DNA of almost 4,000 people of African ancestry living with HIV-1, the most common type of the virus. They identified a variant within a region on chromosome 1 containing the gene CHD1L which was associated with reduced viral load in carriers of the variant. Between 4% and 13% of people of African origin are thought to carry this particular variant.
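
For intuition, the sketch below shows the simplest form of a single-variant association test: regressing (log) viral load on the number of copies of the variant allele each person carries. The allele frequency, effect size and noise are invented, and the actual study used a full genome-wide analysis with appropriate covariates and corrections for multiple testing.

```python
# Toy sketch (not the study's pipeline) of a single-variant association test:
# regress log10 viral load on the number of copies of the variant allele (0, 1, 2).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

n = 4000
dosage = rng.binomial(2, 0.08, size=n)                            # hypothetical allele frequency ~8%
log_viral_load = 4.5 - 0.3 * dosage + rng.normal(0, 0.8, size=n)  # carriers assumed to have lower load

result = stats.linregress(dosage, log_viral_load)
print(f"effect per allele copy: {result.slope:.2f} log10 copies/mL, p = {result.pvalue:.1e}")
```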

Paul McLaren from the Public Health Agency of Canada’s National Microbiology Laboratory, joint first author, said: “African populations are still drastically underrepresented in human DNA studies, despite experiencing the highest burden of HIV infection. By studying a large sample of people of African ancestry, we’ve been able to identify a new genetic variant that only exists in this population and which is linked to lower HIV viral loads.”

CHD1L is known to play a role in repairing damaged DNA, though it is not clear why the variant should be important in reducing viral load. However, as HIV attacks immune cells, researchers at the University of Cambridge’s Department of Medicine, led by Dr Harriet Groom and Professor Andrew Lever, used stem cells to generate cells that HIV can infect, in which CHD1L had either been switched off or had its activity turned down.

HIV turned out to replicate better in a type of immune cell known as a macrophage when CHD1L was switched off. In another cell type, the T cell, there was no effect – perhaps surprising, since most HIV replication occurs in T cells.

Dr Groom said: “This gene seems to be important to controlling viral load in people of African ancestry. Although we don’t yet know how it’s doing this, every time we discover something new about HIV control, we learn something new about the virus and something new about the cell. The link between HIV replication in macrophages and viral load is particularly interesting and unexpected.”

Co-author Professor Manjinder Sandhu from the Faculty of Medicine at Imperial College London said: “With more than a million new HIV infections a year, it’s clear that we still have a long way to go in the fight against HIV – we are yet to have a vaccine to prevent infection, have yet to find a cure and still see drug resistance emerging in some individuals. The next step is to fully understand exactly how this genetic variant controls HIV replication.”

The research in Cambridge was largely funded by the Medical Research Council. A full list of funders can be found in the research paper.

Dr Groom is a Research Fellow at Sidney Sussex College and Professor Andrew Lever is a Professorial Fellow at Peterhouse, Cambridge.

Reference
McLaren, PJ; Porreca, I; Iaconis, G; Mok, HP; Mukhopadhyay, Sl; Karakoc, E et al. Africa-specific human genetic variation near CHD1L associates with HIV-1 load. Nature; 2 Aug 2023; DOI: 10.1038/s41586-023-06370-4




Cambridge researchers awarded ERC funding to support commercial potential of their work

Left: Cecilia Mascolo, Right: Ismail Sami

source: www.cam.ac.uk

University of Cambridge researchers have been awarded Proof of Concept grants from the European Research Council (ERC), to help them explore the commercial or societal potential of their research. The funding is part of the EU’s research and innovation programme, Horizon Europe.

Professor Cecilia Mascolo from the Department of Computer Science and Technology will use the funding to further her work on developing mobile devices – like commercially available earbuds – that can accurately pick up wearers’ body sounds and monitor them for health purposes.

The ERC Proof of Concept grants – worth €150,000 – help researchers bridge the gap between the discoveries stemming from their frontier research and the practical application of the findings, including the early phases of their commercialisation.

Researchers use this type of funding to verify the practical viability of scientific concepts, explore business opportunities or prepare patent applications.

Mascolo’s existing ERC-funded Project EAR was the first to demonstrate that the existing microphones in earbuds can be used to pick up wearers’ levels of activity and heart rate, and to track these accurately even when the wearer is exercising vigorously.

She now wants to build on this work by enhancing the robustness of these in-ear microphones and further improve their performance in monitoring human activity and physiology in ‘real life’ conditions, including by developing new algorithms to help the devices analyse the data they are collecting.

“There are currently no solutions on the market that use audio devices to detect body function signals like this and they could play an extremely valuable role in health monitoring,” said Mascolo. “Because the devices’ hardware, computing needs and energy consumption are inexpensive, they could put body function monitoring into the hands of the world’s population accurately and affordably.”

Professor Manish Chhowalla from the Department of Materials Science and Metallurgy was awarded a Proof of Concept Grant to demonstrate large-scale and high-performance lithium-sulphur batteries.

“Our breakthrough in lithium-sulphur batteries demonstrates a future beyond lithium-ion batteries; moving away from the reliance on critical raw materials and enabling the electrification of fundamentally new applications such as aviation,” said Dr Ismail Sami, Research Fellow in Chhowalla’s group. “This Proof of Concept will help us take the essential commercial and technical steps in bringing our innovation to market.”




Scientists discover secret of virgin birth, and switch on the ability in female flies

Fruit fly, Drosophila mercatorum

source: www.cam.ac.uk

Scientists have pinpointed a genetic cause for virgin birth for the first time, and once switched on, the ability is passed down through generations of females.

It was very exciting to see a virgin fly produce an embryo able to develop to adulthood – Alexis Sperling

For the first time, scientists have managed to induce virgin birth in an animal that usually reproduces sexually: the fruit fly Drosophila melanogaster.

Once induced in this fruit fly, this ability is passed on through the generations: the offspring can reproduce either sexually if there are males around, or by virgin birth if there aren’t.

For most animals, reproduction is sexual – it involves a female’s egg being fertilised by a male’s sperm. Virgin birth, or ‘parthenogenesis’, is the process by which an egg develops into an embryo without fertilisation by sperm – a male is not needed. 

The offspring of a virgin birth are not exact clones of their mother but are genetically very similar, and are always female.

“We’re the first to show that you can engineer virgin births to happen in an animal – it was very exciting to see a virgin fly produce an embryo able to develop to adulthood, and then repeat the process,” said Dr Alexis Sperling, a researcher at the University of Cambridge and first author of the paper.

She added: “In our genetically manipulated flies, the females waited to find a male for half their lives – about 40 days – but then gave up and proceeded to have a virgin birth.”

In the experiments, only 1-2% of the second generation of female flies with the ability for virgin birth produced offspring, and this occurred only when there were no male flies around. When males were available, the females mated and reproduced in the normal way.

Switching to a virgin birth can be a survival strategy: a one-off generation of virgin births can help to keep the species going.

The study is published in the journal Current Biology.

To achieve their results, researchers first sequenced the genomes of two strains of another species of fruit fly, called Drosophila mercatorum. One strain needs males to reproduce, the other reproduces only through virgin birth. They identified the genes that were switched on, or switched off, when the flies were reproducing without fathers.

With the candidate genes for virgin birth ability identified in Drosophila mercatorum, the researchers altered what they thought were the corresponding genes in the model fruit fly, Drosophila melanogaster. It worked: Drosophila melanogaster suddenly acquired the ability for virgin birth.

The research involved over 220,000 virgin fruit flies and took six years to complete.

Key to the discovery was the fact that this work was done in Drosophila melanogaster – the researchers say it would have been incredibly difficult in any other animal. This fly has been the ‘model organism’ for research in genetics for over 100 years and its genes are very well understood.

Sperling, who carried out this work in the Department of Genetics, has recently moved to Cambridge Crop Science Centre to work on crop pests and hopes to eventually investigate why virgin birth in insects may be becoming more common, particularly in pest species.

“If there’s continued selection pressure for virgin births in insect pests, which there seems to be, it will eventually lead to them reproducing only in this way. It could become a real problem for agriculture because females produce only females, so their ability to spread doubles,” said Sperling.

The females of some egg-laying animals – including birds, lizards and snakes – can switch naturally to give birth without males. But virgin birth in animals that normally sexually reproduce is rare, often only observed in zoo animals, and usually happens when the female has been isolated for a long time and has little hope of finding a mate.

The research was funded by the Leverhulme Trust.

Reference

Sperling, A L et al.: ‘A genetic basis for facultative parthenogenesis in Drosophila.’ Current Biology, July 2023. DOI: 10.1016/j.cub.2023.07.006




Lights could be the future of the internet and data transmission

Abstract colourful background

source: www.cam.ac.uk

Fast data transmission could be delivered in homes and offices through light emitting diode (LED) bulbs, complementing existing communication technologies and networks.

New internet technologies are being rapidly refined, and LED-based communication links are expected to be used in services and scenarios including Light-fidelity (Li-Fi), underwater communications, moderate- to high-speed photonic connections and various ‘Internet of Things’ (IoT) devices.

A study, led by the Universities of Surrey and Cambridge and published in the journal Nature Photonics, has investigated how to realise high-speed photonic sources using materials known as metal-halide perovskites. These semiconductors are being studied for use in LEDs because of their excellent optoelectronic properties and low-cost processing methods.

“IoT devices have the potential to add significant value to industry and the global economy,” said corresponding author Dr Wei Zhang from the University of Surrey. “In this market, costs and compatibility are often prioritised over data transmission speed and scientists are looking for alternative ways to reduce energy consumption per bit and improve compactness while simultaneously working on improving the speed of data connection.

“In our study, we have shown how metal-halide perovskites could provide a cost-efficient and powerful solution to make LEDs which have enormous potential to increase their bandwidths into the gigahertz levels.

“Our investigations will accelerate the development of high-speed perovskite photodetectors and continuous wave-pumped perovskite lasers, opening up new avenues for advancements in optoelectronic technologies.”

“We provided the first study to clarify the mechanisms behind achieving high-speed perovskite LEDs, representing a significant step toward the realisation of perovskite light sources for next-generation data communications,” said co-first author Hao Wang, a PhD candidate in Cambridge’s Department of Engineering. “The ability to achieve solution-processed perovskite emitters on silicon substrates also paves the way for their integration with micro-electronics platforms, presenting new opportunities for seamless integration and advancement in the field of data communications.”

The project involved researchers from Oxford, Bath, Warwick, UCL, EMPA and UESTC.

Reference:
Aobo Ren, Hao Wang et al. ‘High-bandwidth perovskite photonic sources on silicon.’ Nature Photonics (2023). DOI: 10.1038/s41566-023-01242-9

Adapted from a University of Surrey press release.




Largest ever DNA and health research programme for children and young people launches

Children playing tug of war

source: www.cam.ac.uk

Cambridge researchers are helping launch a nationwide study for children and young people to unlock the power of our DNA.

Today we are at the beginning of the most tremendous opportunity which will transform our understanding of genetics for children’s health – Anna Moore

The National Institute for Health and Care Research (NIHR) BioResource’s D-CYPHR – the DNA, Children and Young People’s Health Resource – will play a key role in pioneering new treatments and creating better care for children and the adults they will become – everything from improving understanding of mental health to combatting heart disease. The programme is led by NIHR BioResource in partnership with the NHS, Anna Freud and the University of Cambridge.     

Many serious health conditions start in the first two decades of life, with over 1.7 million children in England alone suffering from long-term health conditions. However, most health research is carried out with adults, meaning we are not only missing important opportunities to understand how disease starts and develops and what causes it, but are also limited in developing new treatments.

Dr Anna Moore, University of Cambridge, clinical lead for D-CYPHR, said: “Today we are at the beginning of the most tremendous opportunity which will transform our understanding of genetics for children’s health: a moment where families can help much needed health research from home. This will boost all the amazing research happening across the UK.”

The ground-breaking new programme is open to any child and young person aged 0 to 15 in the UK, to create the biggest health initiative of its kind in this country and a world first – a new national childhood DNA health resource for research from birth through adolescence.  

Supporting the programme’s launch, Dr Xand van Tulleken, BBC presenter, urged parents and their children to get involved: “We urgently need research projects that support children’s health and we need children to volunteer to help! Children are amazing – I’m constantly astounded by their bravery in stepping up for causes they believe in. Today, we’re offering the chance to be a hero for healthcare – just by spitting in a tube.

“D-CYPHR will help future children, and it will help all of us in our adult lives. The ambition and scope of the D-CYPHR project is awe inspiring and it relies on these incredible volunteers to make amazing discoveries from which all of us, children and adults, will benefit.” 

Research has shown the power that understanding genetics can have on outcomes for a range of conditions and illness, from improvements to how diabetes is treated in children, to the national roll-out of whole genome sequencing for babies and children in intensive care. Understanding genetics is vital in helping to understand and treat illness.  

D-CYPHR aims to support this, while also mapping development in children and young people. The environment and experiences while growing up work together with genetics to affect development, and the likelihood of getting diseases. The more we can understand about this, the more treatments can be developed, and tailored to individuals. It even opens up the opportunity for earlier identification of problems, and in the future, we hope to be able to avoid some illnesses, like heart disease and type II diabetes. 

Dr Anna Moore continued: “We’ve carefully designed and piloted the programme alongside children, schools and families over two years. This has been very important as this project will also be a way to address inequality in health research – health research needs to benefit everyone, and so we need children and young people from all backgrounds to get involved.

“Each sample joins a resource with thousands of others showing how environment and genetics affect health. The potential of D-CYPHR is therefore massive. We’re excited to unlock these secrets together with the young heroes that are agreeing to help us.” 

Joining D-CYPHR is simple. Each young person, with parental consent, donates a saliva sample and answers a health and lifestyle questionnaire. The information and sample are depersonalised and join the resource. By studying thousands of DNA samples together with health information, scientists can begin to see the big picture of how our genes and our environment influence our health.

In a world first, the new D-CYPHR programme will support research into any health conditions that begin in or have their origins in childhood – not just a specific condition or age group. This will include mental health conditions, diabetes, heart conditions, rare diseases, immune conditions and many more. It will also help us understand childhood development and what sets the foundations for a healthy life.  

Professor Lucy Chappell, Chief Executive of the NIHR commented: “We’ve seen that genetics can help us unlock our understanding of diseases. Now we want to build on that knowledge by ensuring that our children and young people can access the power of genetics to transform diagnosis and treatment through this research. This exciting new project will help us develop an understanding of their genetics in a way that is more detailed and focused than ever before.”

Secretary of State for Health and Social Care, Steve Barclay said: “This pioneering genetic research programme is an exciting opportunity to advance our understanding of the causes of diseases and how they develop from childhood through to adulthood.

“By focusing on the DNA of children and young people we’ll be able to track how genetics affect a child’s development and build a picture of what might impact on their future health. As a result we’ll be able to develop more effective, bespoke treatments and even explore potential preventative measures for a wide range of conditions, including mental health issues and heart disease.”

Find out how you can take part on the D-CYPHR website

Adapted from an NIHR press release


Suzie, mother of a D-CYPHR participant from Salisbury

“I saw my daughter, Sophie’s, journey from a very unwell newborn to a vibrant and active seven-year-old, enjoying life to the fullest. She’s now thriving and loves reading, baking, riding, and drama, which brings immense joy to her and our family. But in her early days, she faced significant health challenges. The cause of these episodes was unknown, leaving us puzzled and seeking medical answers. Some of these questions remain unanswered, and as a doctor, I know this can happen.

“If, by taking part in D-CYPHR and expanding medical knowledge, someone else in a similar position gets a better understanding of a cause and so the right treatment, then it’s completely worth it. D-CYPHR is an opportunity for us to support research that might give answers to other parents in our situation, as well as create better treatments for millions of people.”




Cambridge academics elected 2023 British Academy Fellows

The British Academy building


Two academics from the University of Cambridge have been elected Fellows of the British Academy as part of a new group of leading international humanities and social sciences researchers.

Professor Jaideep Prabhu and Professor Sujit Sivasundaram join the latest cohort of Fellows, which highlights the depth and breadth of the SHAPE (Social Sciences, Humanities and the Arts for People and the Economy) disciplines and reflects the importance of interdisciplinary research.

Founded in 1902, the British Academy is the UK’s national academy for the humanities and social sciences. Each year, it elects to its fellowship up to 52 outstanding UK-based scholars who have achieved distinction in any branch of the humanities and social sciences.

Professor Jaideep Prabhu is a Professor of Marketing, the Jawaharlal Nehru Professor of Indian Business & Enterprise at the Judge Business School and Director of the Centre for India & Global Business. He is also a Fellow and Director of Studies at Clare College.

He said: “I am delighted to be joining such a distinguished group of academics, not only in my area of Management and Business Studies, but also across the Humanities and Social Sciences. I look forward to working with the other Fellows of the Academy on issues of importance to the UK and the world where the Humanities and Social Sciences have a crucial role to play. I owe a debt of gratitude to all those I have worked with over the years, and in particular my colleagues at Cambridge. This wouldn’t have been possible without them.”

Professor Sujit Sivasundaram is a Professor of World History and Fellow of Gonville & Caius College. He is currently working on the long history of Colombo as an exemplar of the global South city and also on an environmental history of the Indian Ocean. You can read a recent interview with Professor Sivasundaram in This Cambridge Life.

He said: “World history is about reaching for unexpected places to bring light to the human present and future. Research in this field is necessarily and fittingly collaborative and builds on the insights of librarians, curators, students and intellectuals, among others, in various places in the world. I thank all my friends, spread so far and wide, for pointing me to the right path in my research. This honour belongs to all of them.”

Welcoming the new Fellows, Professor Julia Black, President of the British Academy, said:

“It is with great pleasure that we welcome yet another outstanding cohort to the Academy’s Fellowship. The scope of research and expertise on display across our newly elected UK, Corresponding and Honorary Fellows shows the breadth and depth of knowledge and insight held by the British Academy. It is our role to harness this to understand and help shape a better world.”




Give more people with learning disabilities the chance to work, Cambridge historian argues

A barista making a coffee


Employment rates for people with learning disabilities in the UK are a fifth to a tenth of what they were a hundred years ago. And the experiences of workers from the 1910s–50s offer inspiration as well as lessons about safeguarding.

We need to have more bold ambition and stop being content with really marginal forms of inclusion – Lucy Delap

A new study by historian Professor Lucy Delap (Murray Edwards College) argues that loud voices in the 20th-century eugenics movement have hidden a much bigger picture of inclusion in British workplaces that puts today’s low rates to shame.

Professor Delap found that in some parts of Britain, up to 70% of people variously labelled ‘defective’, ‘slow’ and ‘odd’ at the time had paid jobs when demand for labour was high, including during and after the First World War. This proportion fell during recessions, but even then, 30% remained in work. By contrast, in the UK today less than 5% of adults with intellectual disabilities are employed.

“A recession now couldn’t make levels of employment of people with learning disabilities much worse, they are on the floor already,” Professor Delap says. Her study, published in the journal Social History of Medicine, follows a decade of painstakingly piecing together evidence of people with learning disabilities in the British workforce in the first half of the 20th century.

Delap found no trace of these workers in employers’ records or in state archives, which focused on segregation and detaining people. But she struck gold in The National Archives in Kew with a survey of ‘employment exchanges’ undertaken in 1955 to investigate how people then termed ‘subnormal’ or ‘mentally handicapped’ were being employed. She found further evidence in the inspection records of Trade Boards, now held at Warwick University’s Modern Records Centre. In 1909, a complex system of rates and inspection emerged as part of an effort to set minimum wages. This led to the development of ‘exemption permits’ for a range of employees not considered to be worth ‘full’ payment.

Delap says: “Once I found these workers, they appeared everywhere and not just in stereotypical trades like shoe repair and basket-weaving. They were working in domestic service, all kinds of manufacturing, shops, coal mining, agriculture, and local authority jobs.”

Delap’s research goes against most previous writing about people with intellectual disabilities, which has focused on eugenics and the idea that pre-industrial community inclusion gave way to segregation and asylums in the nineteenth century. “We’ve been too ready to accept that narrative and haven’t gone looking for people in the archive,” Delap says. “Many weren’t swept up into institutions, they lived relatively independent lives, precarious lives, but often with the support of families, friends and co-workers.”

‘Wage age’ versus IQ

Previous studies have focused on the rise of IQ testing in this period, but the employment records that Delap studied showed something very different: a more positive sense of ability, couched in terms of the wages someone was worth. This involved imagining a person’s ‘wage age’, meaning that an adult worker could begin with a starting wage age of 14 and advance in wage age through their working life. Not everyone did advance, though.

Delap says: “The idea of ‘wage age’ was harsh in many ways but it was far less stigmatising than IQ, which emphasised divisions between ‘normal’ and ‘defective’ and suggested people couldn’t advance beyond a certain point. By contrast, ideas of fairness, productivity and ‘the going rate’ were deployed to evaluate workers. When labour was in demand, workers had leverage to negotiate their wage age up. IQ didn’t give people that power.”

Appeal to employers

Under the exemption system, employers saw the business case for employing – usually at a significantly lower rate of pay – loyal workers who could be trusted to carry out routine tasks.

Tailoring Trade Board application for permit of exemption relating to a 19-year-old ‘unintelligent’ woman employed to do various errands in Peterborough (1915). Courtesy of Modern Records Centre, Warwick University.

Delap says: “If anything, governments gave signals that these people shouldn’t be employed, that they were better off under the care and control of the mental deficiency boards. But employers understood that they could be good workers.”

In 1918, an ‘odd job’ worker employed for 20 years at a London tin works was described as suffering from ‘mental deficiency’ and as not knowing the time of year or who Britain was fighting. Nevertheless, in the inspector’s opinion, he was ‘little if at all inferior to an ordinary worker of full capacity’ on the hand press and ‘His speed at cutting out on an unguarded fly machine was noticeable.’ His employer agreed to a raise from 18 to 24 shillings a week, just below what a carter could earn.

Employer calculations, Delap emphasises, fluctuated with the state of the labour market. When workers were in short supply, those with learning disabilities became more attractive. When demand for labour fell these workers might be the first to lose their jobs.

Were employers just exploiting vulnerable workers?

Delap found clear evidence of some workers being exploited, being stuck on the same very low wage and the same monotonous task for years.

“We shouldn’t feel nostalgic, this wasn’t a ‘golden age’ of disability-friendly employment,” Delap says. And yet, the archive reveals a strong reciprocal sense of real work being done and wages being paid in exchange. “Many of these people would have considered themselves valued workers and not charity cases. Some were able to negotiate better conditions and many resisted being told to do boring, repetitive work.”

Delap repeatedly encountered families policing the treatment of their relative. In 1922, the owner of a laundry in Lincolnshire considered sacking a 25-year-old ‘mentally deficient’ woman who starched collars because ‘trade is so bad’ but kept her on ‘at request of her parents’. “Workers who had families looking out for them were more able to ask for wage rises, refuse to do certain jobs and limit exploitation,” Delap says. “I found lots of evidence of love and you don’t often see that in archives of intellectual disability.”

Parents or siblings sometimes worked on the same premises which, Delap argues, strengthened the bonds of moral obligation that existed between employers and families. In 1918, for instance, a 16-year-old who attached the bottoms of tin cans in Glamorgan was hired ‘for the sake of her sisters who are employed by the firm and are satisfactory workers’.

Lessons for today

Delap sees concerning similarities between the 1920s and the 2020s in terms of how British institutions manage, care for and educate people with learning disabilities.

Historically, Delap argues, institutions were just stop-gaps, places where people could be kept without onward pathways. People were often not trained at all, or trained to do work that didn’t really exist, like basket-weaving. “This remains a problem today,” Delap says. “We have a fast-changing labour market and our special schools and other institutions aren’t equipping people well enough for viable paid opportunities.”

Delap argues that evidence of people with learning disabilities successfully working in many different roles and environments in the past undermines today’s focus on a very narrow range of job types and sectors. She highlights the fact that many workers with learning disabilities used to be involved in the service sector, including public-facing roles, and not just working in factories. “They were doing roles which brought them into contact with the general public, and being a service sector economy today, we have lots of those jobs.”

Delap also believes that structural factors continue to prevent people from accessing jobs. “Credentialism has made it very difficult for people who don’t have qualifications to get jobs which they might actually be very good at,” she says. “We need to think much harder about how we make the system work for people with a range of abilities. I also think the rise of IT is a factor: we haven’t been training people with learning disabilities well enough in computer skills, so it has become an obstacle.”

Delap believes that Britain’s ageing population and struggle to fill unskilled jobs means there is a growing economic as well as a moral case for employing more people with learning disabilities.

She points out that many people with intellectual disabilities used to work in agriculture, a sector now facing chronic labour shortages. Delap acknowledges that exploitation remains a problem in agriculture, so safeguarding would be paramount, as it would be in every sector.

“I think employers are recognising that they need active inclusion strategies to fill vacancies and that they need to cultivate loyalty,” Delap says. “Work remains a place where we find meaning in our lives and where we make social connections and that’s why so many people with disabilities really want to work and why it deprives them of so much when they are excluded. We need to have more bold ambition and stop being content with really marginal forms of inclusion.”

Reference

L Delap, ‘Slow Workers: Labelling and Labouring in Britain, c. 1909–1955’, Social History of Medicine (2023). DOI: 10.1093/shm/hkad043




Webb sees carbon-rich dust grains in the first billion years of cosmic time

A deep galaxy field featuring thousands of galaxies of various shapes and sizes, with a cutout highlighting JADES-GS-z6, a research target for this result, which appears as a blurry smudge of blue, red and green.


For the first time, the James Webb Space Telescope has observed the chemical signature of carbon-rich dust grains in the early universe.

Similar observational signatures have been seen in the much more recent universe and attributed to complex, carbon-based molecules known as polycyclic aromatic hydrocarbons (PAHs). However, PAHs are not thought likely to have developed within the first billion years of cosmic time.

The international team, including researchers from the University of Cambridge, say that Webb may have observed a different species of carbon-based molecule: possibly minuscule graphite- or diamond-like grains produced by the earliest stars or supernovas. Their results, which suggest that infant galaxies in the early universe developed much faster than anticipated, are reported in the journal Nature.

The seemingly empty spaces in our universe are in reality often not empty at all, but are filled by clouds of gas and cosmic dust. This dust consists of grains of various sizes and compositions that are formed and ejected into space in a variety of ways, including by supernova events.

This material is crucial to the evolution of the universe, as dust clouds ultimately form the birthplaces for new stars and planets. However, the dust absorbs stellar light at certain wavelengths, making some regions of space challenging to observe.

An upside is that certain molecules will consistently absorb or otherwise interact with specific wavelengths of light. This means that astronomers can get information about the cosmic dust’s composition by observing the wavelengths of light that it blocks.

The Cambridge-led team of astronomers used this technique, combined with Webb’s extraordinary sensitivity, to detect the presence of carbon-rich dust grains only a billion years after the birth of the universe.

“Carbon-rich dust grains can be particularly efficient at absorbing ultraviolet light with a wavelength around 217.5 nanometres, which for the first time we have directly observed in the spectra of very early galaxies,” said lead author Dr Joris Witstok from Cambridge’s Kavli Institute for Cosmology.

This 217.5-nanometre feature has previously been observed in the much more recent and local Universe, including within our own Milky Way galaxy, and has been attributed to two different types of carbon-based molecules: polycyclic aromatic hydrocarbons (PAHs) or nano-sized graphitic grains.

According to most models, it should take several hundred million years before PAHs form, so it would be surprising if the team had observed the chemical signature of molecules that shouldn’t have formed yet. However, according to the researchers, this result is the earliest and most distant direct signature of such carbon-rich dust grains.

The answer may lie in the details of what was observed. The feature observed by the team peaked at 226.3 nanometres, not the 217.5-nanometre wavelength associated with PAHs and tiny graphitic grains. A discrepancy of less than ten nanometres could be accounted for by measurement error. Equally, it could also indicate a difference in the composition of the early universe cosmic dust mixture that the team detected.
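
As a rough illustration of how such a peak wavelength can be measured, the sketch below fits a Drude-like absorption bump on top of a simple continuum and reads off the best-fit peak; it is an illustrative toy example rather than the team’s actual analysis, and the synthetic spectrum, continuum model and parameter values are all assumptions.

```python
# Minimal sketch (not the team's pipeline): fit a Drude-like absorption bump to a
# rest-frame UV spectrum and read off the peak wavelength, to illustrate how a
# shift from ~217.5 nm to ~226.3 nm could be measured. The data are synthetic and
# the linear continuum is an assumed, simplified model.
import numpy as np
from scipy.optimize import curve_fit

def drude_bump_model(wav_nm, cont_a, cont_b, amplitude, peak_nm, width_nm):
    """Linear continuum minus a Drude-profile absorption bump."""
    continuum = cont_a + cont_b * wav_nm
    bump = amplitude * (wav_nm * width_nm) ** 2 / (
        (wav_nm ** 2 - peak_nm ** 2) ** 2 + (wav_nm * width_nm) ** 2
    )
    return continuum - bump

def fit_uv_bump(wav_nm, flux, flux_err):
    """Return the best-fit bump peak wavelength (nm) and its 1-sigma uncertainty."""
    p0 = [np.median(flux), 0.0, 0.3 * np.median(flux), 217.5, 35.0]
    popt, pcov = curve_fit(drude_bump_model, wav_nm, flux, p0=p0,
                           sigma=flux_err, absolute_sigma=True)
    return popt[3], np.sqrt(pcov[3, 3])

# Synthetic example: a bump injected at 226.3 nm is recovered by the fit.
rng = np.random.default_rng(1)
wav = np.linspace(180.0, 280.0, 200)
flux = drude_bump_model(wav, 1.0, 0.0, 0.3, 226.3, 35.0) + rng.normal(0.0, 0.01, wav.size)
peak, err = fit_uv_bump(wav, flux, np.full(wav.size, 0.01))
print(f"bump peak = {peak:.1f} ± {err:.1f} nm")
```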

“This slight shift in wavelength of where the absorption is strongest suggests we may be seeing a different mix of grains, for example, graphite- or diamond-like grains,” said Witstok, who is also a Postdoctoral Research Associate at Sidney Sussex College. “This could also potentially be produced on short timescales by Wolf-Rayet stars or by material ejected from a supernova.”

Models have previously suggested that nano-diamonds could form in the material ejected from supernovas. Huge, hot Wolf-Rayet stars, which live fast and die young, would also leave enough time for generations of stars to have been born, lived and died, distributing carbon-rich grains into the surrounding cosmic dust in under a billion years.

However, it is still a challenge to fully explain these results with the existing understanding of the early formation of cosmic dust. These results will go on to inform the development of improved models and future observations.

With the advent of Webb, astronomers are now able to make detailed observations of the light from individual dwarf galaxies, seen in the first billion years of cosmic time. Webb finally permits the study of the origin of cosmic dust and its role in the crucial first stages of galaxy evolution.

“This discovery was made possible by the unparalleled sensitivity improvement in near-infrared spectroscopy provided by Webb, and specifically its Near-Infrared Spectrograph (NIRSpec),” said co-author Professor Roberto Maiolino, who is based in the Cavendish Laboratory and the Kavli Institute for Cosmology. “The increase in sensitivity provided by Webb is equivalent, in the visible, to instantaneously upgrading Galileo’s 37-millimetre telescope to the 8-metre Very Large Telescope, one of the most powerful modern optical telescopes.”
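
As a rough check of the scale of that comparison, the arithmetic below assumes light-gathering power scales with collecting area (the square of the aperture diameter); it is an illustration of the quoted analogy, not a statement about spectroscopic sensitivity itself.

```python
# Back-of-the-envelope arithmetic for the aperture comparison in the quote above,
# assuming light-gathering power scales with collecting area (diameter squared).
galileo_diameter_m = 0.037   # Galileo's 37-millimetre telescope
vlt_diameter_m = 8.0         # the 8-metre Very Large Telescope
ratio = (vlt_diameter_m / galileo_diameter_m) ** 2
print(f"collecting-area ratio ≈ {ratio:,.0f}x")   # about 47,000x
```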

The team is planning further research into the data and this result. “We are planning to work with theorists who model dust production and growth in galaxies,” said co-author Irene Shivaei of the University of Arizona/Centro de Astrobiología (CAB). “This will shed light on the origin of dust and heavy elements in the early universe.”

These observations were made as part of the JWST Advanced Deep Extragalactic Survey, or JADES. This programme has facilitated the discovery of hundreds of galaxies that existed when the universe was less than 600 million years old, including some of the farthest galaxies known to date.

“I’ve studied galaxies in the first billion years of cosmic time my entire career and never did we expect to find such a clear signature of cosmic dust in such distant galaxies,” said co-author Dr Renske Smit from Liverpool John Moores University. “The ultradeep data from JWST is showing us that grains made up of diamond-like dust can form in the most primordial of systems. This is completely overthrowing models of dust formation and opening up a whole new way of studying the chemical enrichment of the very first galaxies.”

Webb is an international partnership between NASA, ESA and the Canadian Space Agency (CSA). This research was supported in part by the Science and Technology Facilities Council (STFC), part of UK Research and Innovation (UKRI).

Reference:
Joris Witstok et al. ‘Carbonaceous dust grains seen in the first billion years of cosmic time.’ Nature (2023). DOI: 10.1038/s41586-023-06413-w

Adapted from an ESA press release.




Cambridge researchers help develop smart, 3D printed concrete wall for National Highways project

Cool Concrete – the smart, 3D-printed concrete wall used for a National Highways project.

Video: https://www.youtube-nocookie.com/embed/p0RfM4I4Mxk


Cambridge researchers, working in partnership with industry, have helped develop the first 3D-printed piece of concrete infrastructure to be used on a National Highways project.

Making the wall digital means it can speak for itself, and we can use our sensors to understand these 3D-printed structures better and accelerate their acceptance in industry – Abir Al-Tabbaa

The 3D-printed structure – a type of retaining wall known as a headwall – has been installed on the A30 in Cornwall, where it is providing real-time information thanks to Cambridge-designed sensors embedded in its structure. The sensors provide up-to-date measurements including temperature, strain and pressure. This ‘digital twin’ of the wall could help spot and correct faults before they occur.

Headwall structures are normally made in limited shapes from precast concrete, requiring formwork and extensive steel reinforcement. But by using 3D printing, the team – including specialists from Costain, Jacobs and Versarien – could design and construct a curved hollow wall with no formwork and no steel reinforcement. The wall gets its strength not from steel but from its geometry.

The wall – which took one hour to print – is roughly two metres high and three and a half metres across. It was printed in Gloucestershire at the headquarters of the advanced engineering company Versarien, using a robot arm-based concrete printer. Making the wall using 3D printing delivers significant savings in cost, materials and carbon emissions.

Over the past six years, Professor Abir Al-Tabbaa’s team in the Department of Engineering has been developing new sensor technologies and exploring the effectiveness of existing commercial sensors to get better-quality information out of infrastructure. Her team has also developed various ‘smart’ self-healing concretes. For this project, they supplied sensors to measure temperature during the printing process.

Temperature variations at different layers of the 3D-printed wall were continuously monitored to detect any potential hotspots, thermal gradients, or anomalies. The temperature data will be correlated with the corresponding thermal imaging profile to understand the thermal behaviour of the 3D-printed wall.
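
As an illustration of what such layer-by-layer monitoring might look like in code, the sketch below flags potential hotspots and steep thermal gradients between adjacent print layers; the layer layout, readings and alert thresholds are invented for the example and are not drawn from the project.

```python
# Illustrative sketch only (not the project's monitoring code): flag potential
# hotspots and steep layer-to-layer thermal gradients from embedded temperature
# sensors. Layer indices, readings and thresholds are invented for the example.
from statistics import mean

# readings_by_layer: print-layer index -> temperature readings (°C) from that layer
readings_by_layer = {
    0: [31.2, 31.5, 30.9],
    1: [38.4, 39.1, 38.8],
    2: [52.7, 53.3, 52.9],   # fast-setting cement can generate substantial heat
    3: [47.0, 46.5, 47.2],
}

HOTSPOT_LIMIT_C = 50.0    # assumed alert threshold for a single layer
GRADIENT_LIMIT_C = 10.0   # assumed maximum jump between adjacent layers

layer_means = {layer: mean(temps) for layer, temps in readings_by_layer.items()}

for layer, avg in sorted(layer_means.items()):
    if avg > HOTSPOT_LIMIT_C:
        print(f"layer {layer}: possible hotspot, mean {avg:.1f} °C")

layers = sorted(layer_means)
for lower, upper in zip(layers, layers[1:]):
    gradient = layer_means[upper] - layer_means[lower]
    if abs(gradient) > GRADIENT_LIMIT_C:
        print(f"layers {lower}->{upper}: steep thermal gradient of {gradient:+.1f} °C")
```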

“Since you need an extremely fast-setting cement for 3D printing, it also generates an enormous amount of heat,” said Al-Tabbaa. “We embedded our sensors in the wall to measure temperature during construction, and now we’re getting data from them while the wall is on site.”  

In addition to temperature, the sensors measure relative humidity, pressure, strain, electrical resistivity, and electrochemical potential. The measurements provide valuable insights into the reliability, robustness, accuracy, and longevity of the sensors.

A LiDAR system was also used to scan the wall as it was being printed, creating a 3D point cloud and generating a digital twin of the wall.
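
A minimal sketch of that point-cloud step, assuming the open-source Open3D library (not necessarily what the project used) and placeholder file names: it loads a scan, thins it, and reconstructs a surface mesh as a simple geometric stand-in for the digital twin.

```python
# Minimal sketch, assuming the open-source Open3D library; the file names are
# placeholders. This reconstructs a surface mesh from a LiDAR point cloud as a
# simple geometric stand-in for a digital twin of the printed wall.
import open3d as o3d

pcd = o3d.io.read_point_cloud("wall_scan.ply")     # placeholder input scan
pcd = pcd.voxel_down_sample(voxel_size=0.01)       # thin the cloud to ~1 cm spacing
pcd.estimate_normals(
    search_param=o3d.geometry.KDTreeSearchParamHybrid(radius=0.05, max_nn=30)
)
mesh, _densities = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(pcd, depth=9)
o3d.io.write_triangle_mesh("wall_digital_twin.ply", mesh)   # placeholder output
```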

“Making the wall digital means it can speak for itself,” said Al-Tabbaa. “And we can use our sensors to understand these 3D printed structures better and accelerate their acceptance in industry.”

The Cambridge team developed a type of sensor, known as a PZT (Piezoceramic Lead-Zirconate-Titanate) sensor, which measures electromechanical impedance response and monitors changes in these measurements over time to detect any possible damage. These smart sensors can show how 3D-printed mortar hardens over time, while simultaneously monitoring the host structure’s health.
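
One common way to track changes in an electromechanical impedance signature over time is a root-mean-square-deviation (RMSD) index against a healthy baseline; the sketch below shows that general idea and is not the team’s implementation – the frequencies, signatures and alert threshold are invented.

```python
# Illustrative sketch (not the team's implementation): compare a PZT sensor's
# current electromechanical impedance signature against a healthy baseline using
# a root-mean-square-deviation (RMSD) index, a common metric in impedance-based
# monitoring. The frequencies, signatures and threshold below are invented.
import numpy as np

def rmsd_index(baseline_re_z: np.ndarray, current_re_z: np.ndarray) -> float:
    """RMSD (%) between baseline and current real-part impedance signatures."""
    num = np.sum((current_re_z - baseline_re_z) ** 2)
    den = np.sum(baseline_re_z ** 2)
    return 100.0 * float(np.sqrt(num / den))

# Synthetic multifrequency signatures for one embedded sensor.
freqs_khz = np.linspace(30.0, 400.0, 500)
baseline = 50.0 + 5.0 * np.sin(freqs_khz / 7.0)
current = baseline + 0.8 * np.sin(freqs_khz / 3.0)   # small shift in resonances

ALERT_RMSD_PERCENT = 5.0   # assumed alert threshold
score = rmsd_index(baseline, current)
status = "investigate" if score > ALERT_RMSD_PERCENT else "no significant change"
print(f"RMSD = {score:.2f}% -> {status}")
```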

Eight PZT sensors were embedded within the wall layers at different positions during the 3D printing process to capture the presence of loading and strain, both during the construction process and service life after field installation.

The team, which included experts in smart materials, automation and robotics and data science, also developed a bespoke wireless data acquisition system. This enabled the collection of the multifrequency electromechanical response data of the embedded sensors remotely from Cambridge.

“This project will serve as a living laboratory, generating valuable data over its lifespan,” said Al-Tabbaa. “The sensor data and ‘digital twin’ will help infrastructure professionals better understand how 3D printing can be used and tailored to print larger and more complex cement-based materials for the strategic road network.”

Members of the team included Dr Sripriya Rengaraju, Dr Christos Vlachakis, Dr Yen-Fang Su, Dr Damian Palin, Dr Hussam Taha, Dr Richard Anvo and Dr Lilia Potseluyko from Cambridge; as well as Costain’s Head of Materials Bhavika Ramrakhyani, a part-time PhD student in the Department of Engineering, and Ben Harries, Architectural Innovation Lead at Versarien, who is also starting a part-time PhD in the Department of Engineering in October.

The Cambridge team’s work is part of the Resilient Materials for Life Programme and the Digital Roads of the Future Initiative. The research is supported in part by the Engineering and Physical Sciences Research Council (EPSRC), part of UK Research and Innovation (UKRI), and the European Union.


