All posts by Adam Brinded

Experiment Evaluates The Effect Of Human Decisions On Climate Reconstructions


Subfossil trees preserved in Iceland Credit: Hrafn Óskarsson

 

The first double-blind experiment analysing the role of human decision-making in climate reconstructions has found that it can lead to substantially different results.

 

Scientists aren’t robots, and we don’t want them to be, but it’s important to learn where the decisions are made and how they affect the outcome

Ulf Büntgen

The experiment, designed and run by researchers from the University of Cambridge, had multiple research groups from around the world use the same raw tree-ring data to reconstruct temperature changes over the past 2,000 years.

While each of the reconstructions clearly showed that recent warming due to anthropogenic climate change is unprecedented in the past two thousand years, there were notable differences in variance, amplitude and sensitivity, which can be attributed to decisions made by the researchers who built the individual reconstructions.

Professor Ulf Büntgen from the University of Cambridge, who led the research, said that the results are “important for transparency and truth – we believe in our data, and we’re being open about the decisions that any climate scientist has to make when building a reconstruction or model.”

To improve the reliability of climate reconstructions, the researchers suggest that teams make multiple reconstructions at once so that they can be seen as an ensemble. The results are reported in the journal Nature Communications.

Information from tree rings is the main way that researchers reconstruct past climate conditions at annual resolutions: as distinctive as a fingerprint, the rings formed in trees outside the tropics are annually precise growth layers. Each ring can tell us something about what conditions were like in a particular growing season, and by combining data from many trees of different ages, scientists are able to reconstruct past climate conditions going back hundreds and even thousands of years.

Reconstructions of past climate conditions are useful as they can place current climate conditions or future projections in the context of past natural variability. The challenge with a climate reconstruction is that – absent a time machine – there is no way to confirm it is correct.

“While the information contained in tree rings remains constant, humans are the variables: they may use different techniques or choose a different subset of data to build their reconstruction,” said Büntgen, who is based at Cambridge’s Department of Geography, and is also affiliated with the CzechGlobe Centre in Brno, Czech Republic. “With any reconstruction, there’s a question of uncertainty ranges: how certain you are about a certain result. A lot of work has gone into trying to quantify uncertainties in a statistical way, but what hasn’t been studied is the role of decision-making.

“It’s not the case that there is one single truth – every decision we make is subjective to a greater or lesser extent. Scientists aren’t robots, and we don’t want them to be, but it’s important to learn where the decisions are made and how they affect the outcome.”

Büntgen and his colleagues devised an experiment to test how decision-making affects climate reconstructions. They sent raw tree-ring data to 15 research groups around the world and asked them to use it to develop the best possible large-scale reconstruction of summer temperatures in the Northern Hemisphere over the past 2,000 years.

“Everything else was up to them – it may sound trivial, but this sort of experiment had never been done before,” said Büntgen.

Each of the groups came up with a different reconstruction, based on the decisions they made along the way: the data they chose or the techniques they used. For example, one group may have used instrumental target data from June, July and August, while another may have used only the mean of July and August.

The main differences between the reconstructions were ones of amplitude: exactly how warm the medieval warm period was, or how much cooler a particular summer was after a large volcanic eruption.

Büntgen stresses that each of the reconstructions showed the same overall trends: there were periods of warming in the 3rd century, as well as between the 10th and 12th centuries; they all showed abrupt summer cooling following clusters of large volcanic eruptions in the 6th, 15th and 19th centuries; and they all showed that the recent warming of the 20th and 21st centuries is unprecedented in the past 2,000 years.

“You would think that if you start with the same data, you will end up with the same result, but climate reconstruction doesn’t work like that,” said Büntgen. “All the reconstructions point in the same direction, and none of the results oppose one another, but there are differences, which must be attributed to decision-making.”

So, how will we know whether to trust a particular climate reconstruction in future? In a time where experts are routinely challenged, or dismissed entirely, how can we be sure of what is true? One answer may be to note each point where a decision is made, consider the various options, and produce multiple reconstructions. This would of course mean more work for climate scientists, but it could be a valuable check to acknowledge how decisions affect outcomes.

Another way to make climate reconstructions more robust is for groups to collaborate and view all their reconstructions together, as an ensemble. “In almost any scientific field, you can point to a single study or result that tells you what you want to hear,” he said. “But when you look at the body of scientific evidence, with all its nuances and uncertainties, you get a clearer overall picture.”
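The ensemble idea can be sketched in a few lines of Python. This is purely illustrative – the reconstruction values below are invented, not taken from the study – but it shows how a set of reconstructions built from the same raw data yields both a central estimate and a measure of the spread caused by analyst choices alone:

```python
import statistics

# Hypothetical example: three reconstructions of the same summer,
# each built from the same raw tree-ring data but with different
# analyst choices (target months, detrending method, site selection).
# Values are invented temperature anomalies in degrees C.
reconstructions = {
    "group_A": 0.42,
    "group_B": 0.31,
    "group_C": 0.55,
}

# Treating the set as an ensemble: the mean gives a central estimate,
# and the spread shows how much decision-making alone moves the result.
values = list(reconstructions.values())
ensemble_mean = statistics.mean(values)
ensemble_spread = max(values) - min(values)

print(f"ensemble mean anomaly: {ensemble_mean:.2f} C")
print(f"spread due to analyst choices: {ensemble_spread:.2f} C")
```

The spread, reported alongside the mean, is what makes the role of decision-making visible rather than hidden inside a single curve.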

Reference:
Ulf Büntgen et al. ‘The influence of decision-making in tree ring-based climate reconstructions.’ Nature Communications (2021). DOI: 10.1038/s41467-021-23627-6

source: https://www.cam.ac.uk/research/news/experiment-evaluates-the-effect-of-human-decisions-on-climate-reconstructions


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. Images, including our videos, are Copyright ©University of Cambridge and licensors/contributors as identified.  All rights reserved. We make our image and video content available in a number of ways – as here, on our main website under its Terms and conditions, and on a range of channels including social media that permit your use and sharing of our content under their respective Terms.

Ultra-High-Density Hard Drives Made With Graphene Store Ten Times More Data


 

Hard disk drive Credit: bohed

 

Graphene can be used for ultra-high density hard disk drives (HDD), with up to a tenfold jump compared to current technologies, researchers at the Cambridge Graphene Centre have shown.

 

Considering that in 2020, around 1 billion terabytes of fresh HDD storage was produced, these results indicate a route for mass application of graphene in cutting-edge technologies

Andrea Ferrari

The study, published in Nature Communications, was carried out in collaboration with teams at the University of Exeter and research institutes in India, Switzerland, Singapore and the US.

HDDs first appeared in the 1950s, but their use as storage devices in personal computers only took off from the mid-1980s. They have become ever smaller in size, and denser in terms of the number of stored bytes. While solid state drives are popular for mobile devices, HDDs continue to be used to store files in desktop computers, largely due to their favourable cost to produce and purchase.

HDDs contain two major components: platters and a head. Data are written on the platters using a magnetic head, which moves rapidly above them as they spin. The space between head and platter is continually decreasing to enable higher densities.

Currently, carbon-based overcoats (COCs) – layers used to protect platters from mechanical damage and corrosion – occupy a significant part of this spacing. The data density of HDDs has quadrupled since 1990, while COC thickness has been reduced from 12.5nm to around 3nm, corresponding to a density of one terabyte per square inch. Now, graphene has enabled researchers to multiply this by ten.

The Cambridge researchers replaced commercial COCs with one to four layers of graphene, and tested friction, wear, corrosion, thermal stability and lubricant compatibility. Beyond its unbeatable thinness, graphene fulfils all the ideal properties of an HDD overcoat in terms of corrosion protection, low friction, wear resistance, hardness, lubricant compatibility and surface smoothness.
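To see why thinness matters for head–media spacing, a back-of-envelope sketch helps. The ~0.34nm monolayer thickness used here is the commonly quoted graphite interlayer value, not a measurement from this study, so treat the numbers as indicative only:

```python
# Back-of-envelope comparison of overcoat thicknesses: a ~3 nm
# carbon-based overcoat (COC, from the text) versus one to four
# graphene layers at the commonly quoted ~0.34 nm per monolayer.
coc_thickness_nm = 3.0
graphene_layer_nm = 0.34

for layers in range(1, 5):
    saved = coc_thickness_nm - layers * graphene_layer_nm
    print(f"{layers} layer(s): head-media spacing reduced by {saved:.2f} nm")
```

Even the thickest four-layer coating recovers well over a nanometre of spacing – a large margin at a scale where the head flies only a few nanometres above the platter.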

Graphene enables a twofold reduction in friction and provides better corrosion and wear protection than state-of-the-art solutions. In fact, a single graphene layer reduces corrosion by 2.5 times.

Cambridge scientists transferred graphene onto hard disks made of iron-platinum as the magnetic recording layer, and tested Heat-Assisted Magnetic Recording (HAMR) – a new technology that enables an increase in storage density by heating the recording layer to high temperatures. Current COCs do not perform at these high temperatures, but graphene does. Thus, graphene, coupled with HAMR, can outperform current HDDs, providing an unprecedented data density, higher than 10 terabytes per square inch.

“Demonstrating that graphene can serve as protective coating for conventional hard disk drives and that it is able to withstand HAMR conditions is a very important result. This will further push the development of novel high areal density hard disk drives,” said Dr Anna Ott from the Cambridge Graphene Centre, one of the co-authors of this study.

A jump in HDDs’ data density by a factor of ten and a significant reduction in wear rate are critical to achieving more sustainable and durable magnetic data recording. Graphene-based technological developments are progressing along the right track towards a more sustainable world.

Professor Andrea C. Ferrari, Director of the Cambridge Graphene Centre, added: “This work showcases the excellent mechanical, corrosion and wear resistance properties of graphene for ultra-high storage density magnetic media. Considering that in 2020, around 1 billion terabytes of fresh HDD storage was produced, these results indicate a route for mass application of graphene in cutting-edge technologies.”

Reference
Dwivedi et al. Graphene Overcoats for Ultra-High Storage Density Magnetic Media. Nature Communications 12, 2854 (2021), DOI: 10.1038/s41467-021-22687-y.

Adapted from a release from the Cambridge Graphene Centre.

source: https://www.cam.ac.uk/research/news/ultra-high-density-hard-drives-made-with-graphene-store-ten-times-more-data



Atom Swapping Could Lead To Ultra-Bright, Flexible Next Generation LEDs


 

Artist’s impression of glowing halide perovskite nanocrystals Credit: Ella Maru Studio

 

An international group of researchers has developed a new technique that could be used to make more efficient low-cost light-emitting materials that are flexible and can be printed using ink-jet techniques.

 

The researchers, led by the University of Cambridge and the Technical University of Munich, found that by swapping one out of every one thousand atoms of one material for another, they were able to triple the luminescence of a new class of light-emitting materials known as halide perovskites.

This ‘atom swapping’, or doping, causes the charge carriers to get stuck in a specific part of the material’s crystal structure, where they recombine and emit light. The results, reported in the Journal of the American Chemical Society, could be useful for low-cost printable and flexible LED lighting, displays for smartphones or cheap lasers.

Many everyday applications now use light-emitting diodes (LEDs), such as domestic and commercial lighting, TV screens, smartphones and laptops. The main advantage of LEDs is that they consume far less energy than older technologies.

Ultimately, even our worldwide communication via the internet is driven by optical signals from very bright light sources, which carry information through optical fibres at the speed of light across the globe.

The team studied a new class of semiconductors called halide perovskites in the form of nanocrystals which measure only about a ten-thousandth of the thickness of a human hair. These ‘quantum dots’ are highly luminescent materials: the first high-brilliance QLED TVs incorporating quantum dots recently came onto the market.

The Cambridge researchers, working with Daniel Congreve’s group at Harvard, who are experts in the fabrication of quantum dots, have now greatly improved the light emission from these nanocrystals. They substituted one out of every one thousand atoms with another – swapping lead for manganese ions – and found the luminescence of the quantum dots tripled.

A detailed investigation using laser spectroscopy revealed the origin of this observation. “We found that the charges collect together in the regions of the crystals that we doped,” said Sascha Feldmann from Cambridge’s Cavendish Laboratory, the study’s first author. “Once localised, those energetic charges can meet each other and recombine to emit light in a very efficient manner.”

“We hope this fascinating discovery – that even the smallest changes to the chemical composition can greatly enhance the material properties – will pave the way to cheap and ultrabright LED displays and lasers in the near future,” said senior author Felix Deschler, who is jointly affiliated with the Cavendish and the Walter Schottky Institute at the Technical University of Munich.

In the future, the researchers hope to identify even more efficient dopants which will help make these advanced light technologies accessible to every part of the world.

 

Reference:
Sascha Feldmann et al. ‘Charge carrier localization in doped perovskite nanocrystals enhances radiative recombination.’, Journal of the American Chemical Society (2021). DOI: 10.1021/jacs.1c01567

source: https://www.cam.ac.uk/research/news/atom-swapping-could-lead-to-ultra-bright-flexible-next-generation-leds



Urban Crime Fell By Over A Third Around The World During COVID-19 Shutdowns


 

Study finds crimes such as theft and robbery almost halved on average in major cities.  

Story: Fred Lewsey

An empty Regent Street in central London during the first national lockdown in May 2020.

A new analysis of crime rates in 27 cities across 23 countries in Europe, the Americas, Asia and the Middle East has found that stay-at-home policies during the pandemic led to an overall drop in police-recorded crime of 37% across all the sites in the study.

A team of criminologists led by the University of Cambridge and University of Utrecht examined trends in daily crime counts before and after COVID-19 restrictions were implemented in major metropolitan areas such as Barcelona, Chicago, São Paulo, Tel Aviv, Brisbane and London.

While both stringency of lockdowns and the resulting crime reductions varied considerably from city to city, the researchers found that most types of crime – with the key exception of homicide – fell significantly in the study sites.

Across all 27 cities, daily assaults fell by an average of 35%, and robberies (theft using violence or intimidation, such as muggings) almost halved: falling an average of 46%. Other types of theft, from pick-pocketing to shop-lifting, fell an average of 47%.

“City living has been dramatically curtailed by COVID-19, and crime is a big part of city life,” said Prof Manuel Eisner, Director of the Violence Research Centre at the University of Cambridge and senior author of the study published today in the journal Nature Human Behaviour.

“No drinkers spilling into the streets after nights out at bars and pubs. No days spent in shops and cafés or at the racetrack or football match. Some cities even introduced curfews. It choked the opportunism that fuels so much urban crime.”

“We found the largest reductions in crimes where motivated offenders and suitable victims converge in a public space. There would be far fewer potential targets in the usual crime hotspots such as streets with lots of nightclubs,” said Eisner.

“City living has been dramatically curtailed by COVID-19, and crime is a big part of city life”

Manuel Eisner

Outside a theatre in central Chicago, one of the cities in the study, during a lockdown in 2020.

Falls in crime resulting from COVID-19 stay-at-home orders tended to be sharp but short-lived, with a maximum drop occurring around two to five weeks after implementation, followed by a gradual return to previous levels.

Overall, the team found that stricter lockdowns led to greater declines in crime – although even cities with voluntary “recommendations” instead of restrictions, such as Malmo and Stockholm in Sweden, saw drops in daily rates of theft.

Scatterplot depicting bivariate correlation between overall average decline in crime by city and stringency of stay-at-home restrictions.

Theft of vehicles fell an average of 39% over the study sites. Researchers found that tougher restrictions on the use of buses and trains during lockdowns were linked to greater falls in vehicle theft – suggesting that negotiating cities via public transport is often a prerequisite for stealing a set of wheels.

Burglary also fell an average of 28% across all cities. However, lockdowns affected burglary numbers in markedly different ways from city to city. While Lima in Peru saw rates plunge by 84%, San Francisco actually saw a 38% increase in break-ins as a result of COVID restrictions.

Data from many cities didn’t distinguish between commercial and residential burglary. Where it did, burglaries of private premises – rather than shops or warehouses – were more likely to decline, with more people stuck indoors around the clock.

“The measures taken by governments across the world… provided a series of natural experiments”

Manuel Eisner

A woman hurries home during Barcelona’s first lockdown in the spring of 2020.

Reduction was lowest for crimes of homicide: down just 14% on average across all cities in the study. Dr Amy Nivette from the University of Utrecht, the study’s first author, said: “In many societies, a significant proportion of murders are committed in the home. The restrictions on urban mobility may have little effect on domestic murders.

“In addition, organised crime – such as drug trafficking gangs – is responsible for a varying percentage of murders. The behaviour of these gangs is likely to be less sensitive to the changes enforced by a lockdown,” said Nivette.

However, three cities where gang crime drives violence, all in South America, did see major falls in daily homicide as a result of COVID-19 policies. In Rio de Janeiro in Brazil, homicide dropped by 24%. In Cali, Colombia, the drop was 29%, and in Lima, Peru, it plummeted by 76%.

Rates of reported assaults also saw striking falls in Rio de Janeiro (56% drop) and Lima (75% drop). “It may be that criminal groups used the crisis to strengthen their power by imposing curfews and restricting movement in territories they control, resulting in a respite to the violence that plagues these cities,” said Eisner.

Researchers found Barcelona to be something of an “outlier”, with massive falls in assault (84% drop) and robbery (80% drop). Police-recorded thefts in the Spanish city declined from an average of 385 per day to just 38 per day under lockdown.
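The percentage falls quoted throughout are simple relative declines in mean daily crime counts. Using the Barcelona theft figures above as input, the arithmetic can be sketched as follows (a trivial illustration of the headline numbers, not the study's actual statistical model of daily counts):

```python
# Relative decline in police-recorded daily thefts in Barcelona,
# from the pre-lockdown and lockdown averages quoted in the text.
pre_lockdown_daily_thefts = 385
lockdown_daily_thefts = 38

decline = (pre_lockdown_daily_thefts - lockdown_daily_thefts) / pre_lockdown_daily_thefts
print(f"decline in daily thefts: {decline:.0%}")  # prints "decline in daily thefts: 90%"
```

The same calculation, applied city by city and crime type by crime type, underlies every percentage reported in the study.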

London saw less pronounced but still significant falls in some crime, with daily robberies dropping by 60%, theft by 44% and burglaries by 29%. The two US cities in the study, Chicago and San Francisco, had their best results in the category of assault, falling by 34% and 36% respectively.

The research team found no overall relationship between measures such as school closures or economic support and crime rates during lockdowns.

Added Eisner: “The measures taken by governments across the world to control COVID-19 provided a series of natural experiments, with major changes in routines, daily encounters and use of public space over entire populations.

“The pandemic has been devastating, but there are also opportunities to better understand social processes, including those involved in causing city-wide crime levels.”

source: https://www.cam.ac.uk/stories/COVIDcrime

One in Twenty Workers Are In ‘Useless’ Jobs – Far Fewer Than Previously Thought


 

Man working at a laptop Credit: Bermix Studio

 

The so-called ‘bullshit jobs theory’ – which argues that a large and rapidly increasing number of workers are undertaking jobs that they themselves recognise as being useless and of no social value – contains several major flaws, argue researchers from the universities of Cambridge and Birmingham.

 

Although the data doesn’t always support David Graeber’s claims, his insightful and imaginative work played an important role in raising awareness of the harms of useless jobs

Brendan Burchell

Even so, writing in Work, Employment and Society, the academics applaud its proponent, American anthropologist David Graeber, who died in September 2020, for highlighting the link between a sense of purpose in one’s job and psychological wellbeing.

Graeber initially put forward the concept of ‘bullshit jobs’ – jobs that even those who do them view as worthless – in his 2013 essay On the Phenomenon of Bullshit Jobs. He further expanded the theory in his 2018 book Bullshit Jobs: A Theory, looking at possible reasons for the existence of such jobs.

Jobs that Graeber described as bullshit (BS) jobs range from doormen and receptionists to lobbyists and public relations specialists through to those in the legal profession, particularly corporate lawyers and legal consultants.

Dr Magdalena Soffia from the University of Cambridge and the What Works Centre for Wellbeing, one of the authors of the article, said: “There’s something appealing about the bullshit jobs theory. The fact that many people have worked in such jobs at some point may explain why Graeber’s work resonates with so many people who can relate to the accounts he gives. But his theory is not based on any reliable empirical data, even though he puts forward several propositions, all of which are testable.”

To test Graeber’s propositions, the researchers turned to the 2005–2015 European Working Conditions Surveys (EWCS), examining respondents who answered ‘rarely’ or ‘never’ to the statement ‘I have the feeling of doing useful work’. The surveys – taken in 2005, 2010 and 2015 – gather measures of the usefulness of the job, workers’ wellbeing and objective data on the quality of work. The number of respondents grew from over 21,000 in 2005 to almost 30,000 in 2015.

According to Graeber, somewhere between 20% and 50% of the workforce – possibly as many as 60% – are employed in BS jobs. Yet the EWCS found that just 4.8% of EU workers said they did not feel they were doing useful work. The figure was slightly higher in the UK and Ireland, but still only 5.6% of workers.
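The headline 4.8% figure is a straightforward proportion of survey responses. A minimal sketch, using an invented mini-sample with the EWCS-style answer categories (the distribution below is illustrative, not real survey data):

```python
from collections import Counter

# Invented mini-sample of answers to the survey statement
# 'I have the feeling of doing useful work'. The item is real;
# these particular 100 responses are made up for illustration.
answers = (["always"] * 60 + ["most of the time"] * 25 +
           ["sometimes"] * 10 + ["rarely"] * 3 + ["never"] * 2)

# Workers counted as holding a 'BS job' are those answering
# 'rarely' or 'never' to the usefulness statement.
counts = Counter(answers)
useless_share = (counts["rarely"] + counts["never"]) / len(answers)
print(f"share reporting not-useful work: {useless_share:.1%}")  # prints "5.0%"
```

In the real EWCS data this proportion came out at 4.8% across the EU – an order of magnitude below Graeber's 20–50% claim.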

Graeber also claimed that the number of BS jobs has been ‘increasing rapidly in recent years’, despite presenting no empirical evidence. Again the researchers found no evidence to support this conjecture – in fact, the percentage of people in BS jobs fell from 7.8% in 2005 to just 4.8% in 2015 – exactly the opposite of Graeber’s prediction.

His next hypothesis was that BS jobs are concentrated in particular professions, such as finance, law, administration and marketing, and largely absent in others, such as those linked to public services and manual labour. “Many service workers hate their jobs; but even those who do are aware that what they do does make some sort of meaningful difference in the world . . . [Whereas] we can only assume that any office worker who one might suspect secretly believes themselves to have a bullshit job does, indeed, believe this,” he wrote.

When the researchers ranked the occupations by the proportion of people who rated their job as rarely or never useful, they found no evidence for the existence of occupations in which the majority of workers feel their work is not useful.

The authors found that workers in some occupations, such as teachers and nurses, generally see themselves as doing useful jobs, while sales workers are above average in the proportion rating their job as not useful (7.7%). Even so, most of the results contradict Graeber’s assertion. For example, legal professionals and administration professionals are all low on this ranking, and jobs that Graeber rates as being examples of essential non-BS jobs, such as refuse collectors (9.7%) and cleaners and helpers (8.1%), are high on this scale.

Not everything that Graeber suggested was wrong, however. He argued, for example, that BS jobs are a form of ‘spiritual violence’ that leads to anxiety, depression and misery among workers. The team found a strong association between the perception of one’s job as useless and an individual’s psychological wellbeing – albeit a correlation rather than necessarily a causal link. In the UK in 2015, workers who felt their job was not useful scored significantly lower on the World Health Organisation Well-Being Index than those who felt they were doing useful work (a mean average of 49.3 compared with 64.5). There was a similar gap across other EU nations.

Dr Alex Wood from the University of Birmingham said: “When we looked at readily-available data from a large cohort of people across Europe, it quickly became apparent to us that very few of the key propositions in Graeber’s theory can be sustained – and this is the case in every country we looked at, to varying degrees. But one of his most important propositions – that BS jobs are a form of ‘spiritual violence’ – does seem to be supported by the data.”

Given that, in absolute terms, a substantial number of people do not view their jobs as useful, what leads to this feeling? The team found that individuals who felt respected and encouraged by management were less likely to report their work as useless. Conversely, when employees experienced management that was disrespectful, inefficient or poor at giving feedback, they were less likely to perceive their work as useful.

Similarly, being able to use one’s own ideas at work – an important element of feeling that a job lets you make the most of your skills – was correlated with a perception of usefulness. There was a clear relationship between the extent to which people felt they had enough time to do their job well and their rating of the usefulness of their job, suggesting that one source of feeling a job to be useless is the pace at which one works, which affects the ability to realise one’s potential and capabilities. Other factors correlated with feeling that a job was worthwhile included support from managers and colleagues and the ability to influence important decisions and the direction of an organisation.

Professor Brendan Burchell from the University of Cambridge said: “Although the data doesn’t always support David Graeber’s claims, his insightful and imaginative work played an important role in raising awareness of the harms of useless jobs. He may have been way off the mark with regards how common BS jobs are, but he was right to link people’s attitudes towards their jobs to their psychological wellbeing, and this is something that employers – and society as a whole – should take seriously.

“Most importantly, employees need to be respected and valued if they in turn are to value – and benefit psychologically as well as financially from – their jobs.”

Reference
Soffia, M, Wood, AJ and Burchell, B. Alienation Is Not ‘Bullshit’: An Empirical Critique of Graeber’s Theory of BS Jobs. WES; 3 June 2021; DOI: 10.1177/09500170211015067

source: https://www.cam.ac.uk/research/news/one-in-twenty-workers-are-in-useless-jobs-far-fewer-than-previously-thought



Extra Classroom Time May Do Little To Help Pupils Recover Lost Learning After COVID-19


School Credit: Jeswin Thomas via Unsplash

 

Adding extra classroom time to the school day may only result in marginal gains for pupils who have lost learning during the COVID pandemic, a study says.

 

Simply keeping all students in school for longer, in order to do more maths or more English, probably won’t improve results much

Vaughan Connolly

The University of Cambridge analysis used five years of Government data, collected from more than 2,800 schools in England, to estimate the likely impact of additional classroom instruction on academic progress, as measured at GCSE.

It found that even substantial increases in classroom teaching time would likely only lead to small improvements. For example, extending Year 11 pupils’ classroom time by one hour per class in English or maths was associated with increases of 0.12 and 0.18 respectively in a school’s ‘value-added’ score – a standard progress measure. These increases appear small, considering that most of the schools in the study had scores ranging between 994 and 1006.

The research also investigated the likely impact for disadvantaged pupils, whose education has been hardest hit by school closures. In keeping with the overall results, it again found that more of the same teaching was likely to do relatively little to improve academic outcomes.

The study was undertaken by Vaughan Connolly, a doctoral researcher at the Faculty of Education, University of Cambridge. His paper reporting the findings, published in the London Review of Education, suggests that long-term plans to recoup lost learning may be better off focusing on maximising the value of the existing school day, rather than extending it.

“Simply keeping all students in school for longer, in order to do more maths or more English, probably won’t improve results much; nor is it likely to narrow the attainment gap for those who have missed out the most,” Connolly said.

“This evidence suggests that re-evaluating how time is used in schools – for example, by trimming subject time and replacing it with sessions focusing on ‘learning to learn’ skills – could make a bigger difference. Quality is going to matter much more than quantity in the long run.”

One possible reason why additional instruction time may be relatively ineffective is diminishing returns – namely, that more contact hours simply increase the burden on both teachers and pupils, preventing them from being at their best.

Extending the school day has been widely discussed as one possible component of a forthcoming Government recovery plan for education. While international evidence suggests that additional teaching time leads to only small returns, until now there had been no large-scale study of this issue in the English school system.

The Cambridge study used timetable data gathered from 2,815 schools through the School Workforce Census over five years. It tracked the relationship between changes to the amount of instruction time that pupils received in English, maths, science and humanities subjects, and their academic progress.

‘Progress’ was identified using schools’ value-added scores. The Government gathers these when pupils sit GCSEs at age 16, by comparing their actual results with predictions made after their primary school SATs at age 11.

While the impact of additional classroom tuition on progress varied between subjects and groups, the effects were generally small. For example: one additional hour of instruction for a Year 11 class in English, science, maths, or the humanities, led to an increase in value-added scores of 0.12, 0.09, 0.18 and 0.43 respectively. ‘At a practical level, this seems small, particularly when considering the cost of such time,’ the study notes.
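As a rough, illustrative scale check (not part of the study’s methodology), the reported per-hour gains can be set against the roughly 12-point spread of school value-added scores noted above:

```python
# Illustrative only: compare the reported value-added gains per extra hour
# of weekly instruction with the typical spread of school scores (994-1006).
gains = {"English": 0.12, "science": 0.09, "maths": 0.18, "humanities": 0.43}
spread = 1006 - 994  # 12-point spread covering most schools in the study

for subject, gain in gains.items():
    print(f"{subject}: +{gain} is {gain / spread:.1%} of the typical spread")
```

Even the largest gain (humanities) amounts to only a few percent of the typical between-school spread, which is the sense in which the study describes the effects as small.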

To examine the potential impact of extra classroom time on less-advantaged students, the study also assessed how far it closed the gap between the value-added scores of students on free school meals, and those of students with middle-ranking prior attainment. The results were again found to be modest. For example, an extra 59 minutes per week in English reduced the attainment gap between these groups by about 6.5%; and an extra 57 minutes per week of maths by about 8%.

The findings compare with those of the Education Endowment Foundation’s influential Teaching and Learning toolkit, which summarises international evidence on different teaching interventions and translates their effect sizes into months of progress. It suggests that increased instruction time is likely to lead to two months of progress over an academic year. This compares poorly with the results of other interventions listed in the same document.

In this context, the Cambridge study suggests that methods which focus on increasing the quality of learning in the classroom, rather than the amount of time spent there, may prove more fruitful. It echoes recommendations recently made by the Education Policy Institute which called for ambitious levels of investment in a wider-ranging programme of catch-up measures. The new study suggests that time could be reallocated during the school day, either to support the continuing professional development of staff, or to provide pupils with additional skills.

It also points to research conducted in 2016 in which Key Stage 3 pupils’ test scores improved dramatically after a portion of their regular curriculum was replaced with training in metacognition – the ability to understand how to learn and reason through problems. Other studies, such as a project examining learning recovery after the 2011 earthquake in Christchurch, New Zealand, have similarly suggested that supporting schools to better match their curriculum to student needs may have greater effect than extra classroom time.

“Rather than extending the school day to offer more instruction, a successful recovery agenda may well be one that tailors support and makes room for a wider range of learning within it, in line with the recent suggestions made by the EPI,” Connolly said. “In that sense, less instructional time could actually be more. Certainly, these results suggest that giving children more of the same is unlikely to help if we want to recover what has been lost during the pandemic.”

source: https://www.cam.ac.uk/research/news/extra-classroom-time-may-do-little-to-help-pupils-recover-lost-learning-after-covid-19


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. Images, including our videos, are Copyright ©University of Cambridge and licensors/contributors as identified.  All rights reserved. We make our image and video content available in a number of ways – as here, on our main website under its Terms and conditions, and on a range of channels including social media that permit your use and sharing of our content under their respective Terms.

Many Of Us Could Carry Up To 17kg Of Fat Due To A Change In A Single Gene

Weighing scales and tape measure Credit: mojzagrebinfo

 

New research has found that one in every 340 people might carry a mutation in a single gene that makes them more likely to have a greater weight from early childhood and, by 18 years of age, they could be up to 17 kg (about 37 lbs) heavier, with the excess weight likely to be mostly fat.

 

Our findings show that weight gain in childhood due to a single gene disorder is not uncommon. This should encourage a more compassionate and rational approach to overweight children and their families

Stephen O’Rahilly

The study, led by scientists at the MRC Metabolic Diseases Unit (part of the Wellcome-MRC Institute of Metabolic Science at the University of Cambridge) and the MRC Integrative Epidemiology Unit at the University of Bristol, is published today in Nature Medicine.

It has been known for a long time that obesity tends to run in families, but it was not until about 20 years ago that scientists started to discover that changes in specific genes can have very large effects on our weight even from early childhood.

One of these genes, the Melanocortin 4 Receptor (MC4R), makes a protein that is produced in the brain where it sends signals to our appetite centres, telling them how much fat we have stored. When the MC4R gene does not work properly, our brains think we have lower fat stores than we do, signalling that we are starving and need to eat.

The research team found that around one in every 340 people may carry a disruptive mutation at MC4R. People who carry these mutations were more likely to have a greater weight from early childhood and, by 18 years of age, they were on average 17 kg (37 lbs or 2.5 stone) heavier, with the majority of this excess weight likely to be fat.

These results were found by studying the MC4R gene in a random sample of around 6,000 participants born in Bristol in 1990-91, who were recruited to Children of the 90s, a health study based at the University of Bristol. This is a unique UK study that recruited approximately 80 percent of the births occurring in a specific region of the South West and which has continued to follow participants. As the Children of the 90s study managed to recruit such a high percentage of mothers during pregnancy, it is one of the most representative and comprehensive studies of its kind.

The authors examined the MC4R gene in all 6,000 people and, whenever a mutation was found, went on to study its functional effects in the laboratory. This meticulous approach has provided the best estimates so far of the frequency and impact of MC4R mutations on people’s weight and body fat. Based on the frequency of mutations in this study, it is possible that around 200,000 people in the UK could carry a substantial amount of additional fat because of mutations in MC4R.

Professor Sir Stephen O’Rahilly, from the University of Cambridge and one of the authors of the study, said: “Parents of obese children are often blamed for poor parenting and not all children obtain appropriate professional help. Our findings show that weight gain in childhood due to a single gene disorder is not uncommon. This should encourage a more compassionate and rational approach to overweight children and their families – including genetic analysis in all seriously obese children.”

Dr Kaitlin Wade, from the University of Bristol’s MRC Integrative Epidemiology Unit and an author on the paper, added: “Work like this is really made possible as a result of the amazing properties presented by a study like Children of the 90s. Having biological samples for sequencing and rich life course data within a representative population sample is critical to allow new understanding and deep characterisation of important biological genetic effects like these.”

Professor Nic Timpson, Children of the 90s’ Principal Investigator, and also one of the study’s authors, explained: “This work helps to recalibrate our understanding of the frequency and functional impact of rare MC4R mutations and will help to shape the future management of this important health factor – we extend our thanks to the participants of the Children of the 90s.”

Though the MC4R gene is a striking example, this is only one gene of many that affect our weight and there are likely to be further examples that emerge as genetic sequencing becomes more common.

In the longer term, knowledge of the brain pathways controlled by MC4R should help in the design of drugs that bypass the signalling blockade and help restore people to a healthy weight.

Reference
Wade KH et al. Loss-of-function mutations in the melanocortin 4 receptor in a UK birth cohort. Nature Medicine; 27 May 2021

Adapted from a press release by the Wellcome-MRC Institute of Metabolic Science at the University of Cambridge and the University of Bristol

source: https://www.cam.ac.uk/research/news/many-of-us-could-carry-up-to-17kg-of-fat-due-to-a-change-in-a-single-gene



How Mass Testing Helped Limit the Spread of COVID-19 at The University of Cambridge

Innovative approach shows how higher education institutions can re-open safely

A combination of testing programmes for staff and students, infection control measures and genomic surveillance helped reduce the number of cases of COVID-19 at the University of Cambridge and keep the wider community safe.

At the start of autumn term in October 2020, the University of Cambridge introduced a free weekly asymptomatic screening programme for all students living in College accommodation. This was complemented by a testing programme for staff and students with symptoms of possible COVID-19. Both programmes used PCR testing – still considered the gold standard.

Professor Stephen Toope, Vice-Chancellor of the University of Cambridge, said: “In just eight weeks, a dedicated team at the University set up, from scratch, an innovative programme which, at its peak, was testing around 10,000 students a week. This was an incredible achievement.”

Together with staff at our University and Colleges, those colleagues have all worked tirelessly to help us keep our students, staff and the wider Cambridge community safe. I’m truly grateful to them and to every student who has taken part in the programme over the past academic year.

Professor Stephen Toope, Vice-Chancellor

Cambridge researchers have released back-to-back studies examining the role of the screening programme, together with other infection control measures, in managing SARS-CoV-2 infections at the University. The results have been made available before peer-review, to allow rapid sharing of information that may be helpful for other universities, as well as mass testing programmes in different settings.

Patrick Maxwell, Regius Professor of Physic at the University of Cambridge, said: "University students are at particular risk of transmitting coronavirus because they live in shared accommodation, and often socialise in large networks. We know that it is possible to spread the disease without showing symptoms. Our team has shown that pooled asymptomatic screening is feasible and effective and, combined with infection control measures, makes it possible to keep the number of infections very low."

PhD student Nordin Ćatić swabbing for SARS-CoV-2 infection. (Credit: Dominic Hall)

ASYMPTOMATIC SCREENING FOR COVID-19 HELPS CONTROL TRANSMISSION

During autumn term 2020, 12,781 students living in College accommodation participated in the University’s Asymptomatic COVID-19 Screening Programme. Students were initially screened every fortnight, but by week seven all participating students were offered screening every week. More than four out of five (82%) of all eligible students ultimately agreed to take part.

Participating students swabbed their noses and throats, then pooled their swabs in the same sample tube as other students from the same household. Pooled samples were analysed at the Cambridge COVID-19 Testing Centre, a collaboration between the University, AstraZeneca and Charles River Laboratories. Pooling swabs made the programme much more efficient, without losing test sensitivity. Over the study period, 50,376 swabs were screened, using only 16,945 tests.
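A quick back-of-the-envelope calculation with the figures above shows the scale of the efficiency gain from pooling (illustrative only; the operational details of the pooling protocol are not described here):

```python
# Reported figures: swabs screened versus PCR tests run on pooled samples.
swabs_screened = 50_376
tests_used = 16_945

avg_pool_factor = swabs_screened / tests_used  # swabs covered per test
tests_saved = swabs_screened - tests_used      # versus testing each swab alone

print(f"About {avg_pool_factor:.1f} swabs were covered by each test")
print(f"{tests_saved} fewer tests than individual testing would have needed")
```

On average each PCR test covered roughly three swabs, saving tens of thousands of tests over the study period without, per the study, losing test sensitivity.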

There was no evidence that relying on voluntary participation, or using nose and throat swabs, was a barrier to students taking part. In fact, in a separate survey of more than 750 participants, over 95% of respondents were satisfied with the screening programme and thought that it had made an overall positive contribution to the Cambridge community.

Among 671 students diagnosed with SARS-CoV-2 infection over the study period (5% of participants), 299 (45%) were either identified or pre-emptively asked to self-isolate because of the screening programme. Using a model based on transmission at the University, the researchers estimated that weekly screening reduced the ‘R number’ – the average number of people an individual would infect in a susceptible population – by about a third.
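To build intuition for what a one-third reduction in R means, a toy growth calculation helps (this is a hypothetical sketch, not the authors’ transmission model; the baseline R of 1.5 is an assumed value chosen purely for illustration):

```python
# Toy sketch (not the study's model): effect of cutting R by about a third.
def cases_in_generation(r, generations, seed=10):
    """Expected new cases in the nth generation under a constant R."""
    cases = seed
    for _ in range(generations):
        cases *= r
    return cases

r_baseline = 1.5                   # assumed for illustration only
r_screened = r_baseline * (2 / 3)  # R reduced by about a third, per the estimate

print(cases_in_generation(r_baseline, 5))  # exponential growth
print(cases_in_generation(r_screened, 5))  # R = 1.0: cases no longer grow
```

Under these assumptions, cutting R from 1.5 to 1.0 is the difference between an outbreak that grows several-fold within a few generations and one that merely sustains itself.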

Dr Nicholas Matheson, from the Cambridge Institute of Therapeutic Immunology and Infectious Disease (CITIID), designed the screening programme. He said: “By combining regular asymptomatic screening with readily accessible symptomatic testing, we were able to detect almost all cases of SARS-CoV-2 infection amongst students living in our Colleges.

“We hope our screening programme will serve as a useful example for other universities and colleges looking at ways to keep their students, staff and communities safe. We’ve shown that it’s possible to do this at scale by pooling samples, making it logistically and economically viable, without compromising test performance.

“Most importantly, we’ve provided real-world evidence that regular, voluntary asymptomatic screening can be effective in helping control COVID-19 transmission. With the current uncertainty around new variants of concern, and most young adults in the UK – let alone the world – not yet vaccinated, that’s an important lesson about mass testing in general, not just in universities.”

Poster for screening programme - "I test to protect you. You test to protect me."

Poster for University of Cambridge asymptomatic screening programme

GENOMIC SEQUENCING CONFIRMS EFFECTIVENESS OF INFECTION CONTROL MEASURES

The University of Cambridge plays a leading role in the COVID-19 Genomics UK (COG-UK) consortium, which sequences the genetic code of viruses isolated from infected individuals to help understand the spread of infection. By analysing these sequences, it is possible to plot a ‘family tree’ (known as phylogenetic tree) of viral isolates and, coupled with epidemiological information, determine whether cases are related.

During autumn term, COG-UK analysed 446 genomes from the University’s testing programmes. Because these programmes use PCR testing, all positive samples were readily available for sequencing.

The team found that the great majority of transmission chains were short, suggesting that infection control measures implemented by the University, including asymptomatic screening, support for self-isolation and in-house contact tracing, were successful in controlling transmission. In fact, 70% of all University cases belonged to a single genetic cluster, likely spread through attendance at a nightclub.

Transmission mainly occurred within student accommodation and/or between students on the same courses. There was minimal evidence of transmission in University departments, or between students and staff. Critically, there was little evidence of transmission between the University and the local community.

Dr Ben Warne, also from CITIID, said: “There are several takeaways from our findings. First, we need to be cautious about access to certain types of high-risk venues during a pandemic, particularly in the context of a young, susceptible student population.

“Second, spread of the virus through the University has been effectively contained using a combination of prompt case identification, including asymptomatic screening, and simple infection control measures, such as supporting affected students and their contacts to isolate.”

Asymptomatic screening programme kit (Credit: Nordin Ćatić)

Dr Dinesh Aggarwal, from the Department of Medicine, added: “The lack of evidence of overspill from the University to the community was reassuring and suggests that, with appropriate precautions, it can be possible to keep universities and colleges open safely during the pandemic.

Through COG-UK, the UK has been at the forefront of using genomic epidemiology to inform outbreak investigations. With the widespread availability of sequencing, we believe it is a critical part of public health surveillance required to understand SARS-CoV-2 transmission and biology, and will be a critical part of future pandemic preparedness.

Dr Dinesh Aggarwal

Scientists Track Veil of Toxic Metals Carried in Kīlauea’s Gas Plumes, Revealing Hidden Dangers of Volcanic Pollution

Golden Hour at Kīlauea
Golden Hour at Kīlauea Credit: Emily Mason/USGS

 

A team of volcanologists who observed the colossal 2018 eruption of Kīlauea, Hawai’i, have tracked how potentially toxic metals carried in its gas plumes were transported away from the volcano to be deposited on the landscape.

 

This work is a key step to understanding the significant, yet underestimated, chemical risks of volcanoes

Emily Mason

The research, published in two companion papers in Communications Earth & Environment, is the most extensive survey of metal release from any volcano to date – helping scientists understand the spread of metal-rich volcanic fumes and the exposure of communities to volcanic air pollution around Hawai’i.

The 2018 eruption of Kīlauea was the largest in centuries, flooding the eastern edge of the island with roughly a cubic kilometre of lava. Over a thousand people lost their homes and many more suffered from noxious volcanic gases.

Understanding how volcanic metals are released into the environment is critical from a public health perspective. “We don’t know much about these metal emissions at all, so this work is a key step to understanding the significant, yet underestimated, chemical risks of volcanoes,” said Emily Mason, a PhD student at Cambridge Earth Sciences and lead author of one of the papers.

When volcanoes erupt they exhale a cocktail of gases – mostly steam, carbon dioxide and sulphur dioxide – laced with evaporated heavy metals, including lead and arsenic. To the communities living alongside volcanoes, these gases are often a considerable source of air pollution and the volatile metals they carry may have long-lasting impacts on both health and environment.

Volcanologists have been measuring volatile metal emissions from volcanoes for decades, but how these elements are dispersed in the atmosphere following an eruption, to later rain down on the landscape and be taken up in the environment through soils and water bodies, has remained poorly understood.

The team, including researchers from the University of Cambridge, report higher concentrations of airborne heavy metals within a 40 km radius of Kīlauea, meaning that communities living closer to the volcano were disproportionately exposed to metal pollution during the 2018 eruption.

They believe that the strong trade winds at the time of the eruption, combined with the topography of the local area, caused higher rainfall – and therefore metal deposition – closer to the vent. This could mean that an eruption in winter, when wind patterns are reversed, might result in a different distribution of metal deposition.

Their results could help delineate environmental monitoring strategies during and following eruptions – including the targeted testing of community water supplies in at-risk areas – as well as helping planners decide where to build safely around volcanoes.

Emily Mason was one of an all-female team of scientists from the Universities of Cambridge and Leeds that headed out to take gas measurements when Kīlauea erupted. Mason, together with then first-year PhD students Penny Wieser and Rachel Whitty, and early career scientists Evgenia Ilyinskaya and Emma Liu, arrived when the eruption was in full flow and some of their study area was already cut off by lava. “We had to fly in to one location via helicopter. I remember descending through a dense haze of volcanic gas…the acidic air actually stung our skin,” said Mason.

“We tend to think of the more immediate volcanic hazards like ash fall, pyroclastic flows and lava,” said Dr Evgenia Ilyinskaya from the University of Leeds, who led the research on downwind metal dispersal. “But metal emissions, just like air pollution, are an insidious and often underestimated volcanic hazard – potentially impacting health over long periods.”

During the first few weeks of the eruption, the main air quality concern was volcanic smog, or ‘vog’, which contains mostly sulfur dioxide with traces of heavy metals and volcanic ash. But when molten lava reached the ocean and reacted with seawater, it triggered new health warnings, as billowing white clouds of lava haze or ‘laze’ were released, carrying hydrochloric acid and toxic metals.

Working with collaborators from the USGS, the team took measurements of gases inside the laze and dry vog plumes from both the ground and the air, using specially-fitted drones. They even developed a back frame for their air filters, so they could move equipment quickly through areas where the air was thick with sulphur dioxide.

Mason and co-authors discovered that the two types of gas plume had very different chemistry. “What really surprised us was the large amounts of copper in the laze plume…the impact of lava-seawater interactions on the biosphere may be significantly underestimated. It’s interesting to note that this type of plume was probably a common feature of the massive outpourings of lava throughout geological history – some of which have been linked to mass extinctions.”

Their long-term goal is to produce pollution hazard maps for volcanoes, showing at-risk areas for metal pollution – a method already used to communicate areas that might be at risk of other volcanic hazards, like lava flows. “Our research is just one part of the puzzle – the idea would be to understand all of these hazards in tandem.”

They aim to apply this method worldwide, but Mason cautions that local atmospheric conditions significantly influence metal dispersal and deposition. Now they want to know how the transport of volcanic metals might differ in cooler, drier environments like the Antarctic – or even in different areas of Hawai’i where rainfall is lower.

 

Ilyinskaya, Evgenia, et al. “Rapid metal pollutant deposition from the volcanic plume of Kīlauea, Hawai’i.” Communications Earth & Environment 2.1 (2021): 1-15.

Mason, Emily, et al. “Volatile metal emissions from volcanic degassing and lava–seawater interactions at Kīlauea Volcano, Hawai’i.” Communications Earth & Environment 2.1 (2021): 1-16.

source: https://www.cam.ac.uk/research/news/scientists-track-veil-of-toxic-metals-carried-in-kilaueas-gas-plumes-revealing-hidden-dangers-of



Epic Dictionary Re-defines Ancient Greek Including The Words Which Made the Victorians Blush

Professor James Diggle in Cambridge's Museum of Classical Archaeology
Professor James Diggle Credit: Sir Cam

 

For 23 years a team from Cambridge’s Faculty of Classics has scoured Ancient Greek literature for meanings to complete the Cambridge Greek Lexicon, a monumental piece of scholarship and the most innovative dictionary of its kind in almost 200 years.

 

It took over my life

James Diggle

Recently published by Cambridge University Press, the Lexicon provides fresh definitions and translations gleaned by re-reading most of Ancient Greek literature from its foundations in Homer, right through to the early second century AD.

Introducing up-to-date English, the new dictionary clarifies meanings that had become obscured by antiquated verbiage in Liddell and Scott’s Intermediate Greek-English Lexicon, which was first published in 1889.

Editor-in-Chief, Professor James Diggle of Queens’ College said: “We don’t call βλαύτη ‘a kind of slipper worn by fops’ as in the Intermediate Lexicon. In the Cambridge Lexicon, this becomes ‘a kind of simple footwear, slipper’.”

The team has also rescued words from Victorian attempts at modesty. “We spare no blushes,” said Diggle. “We do not translate the verb χέζω as ‘ease oneself, do one’s need’. We translate it as ‘to shit’. Nor do we explain ‘βινέω as ‘inire, coire, of illicit intercourse’, but simply translate it by the f-word.”

The two volumes are set to become instantly indispensable for Classics students as well as an important reference work for scholars.

The team used online databases – the Perseus Digital Library and later the Thesaurus Linguae Graecae – to make this huge corpus more easily accessible and searchable.

The researchers pored over every word, working steadily through the 24 letters of the Greek alphabet to build up a clear, modern and accessible guide to the meanings of Ancient Greek words and their development through different contexts and authors. The Lexicon features around 37,000 Greek words drawn from the writings of around 90 different authors and set out across more than 1,500 pages.

The project, which began in 1997, was the brainchild of the renowned Classical philologist and lexicographer John Chadwick (1920–98). The initial plan was to revise the Intermediate Greek-English Lexicon. An abridged version of a lexicon published in 1843, it has never been revised, yet until now it has remained the lexicon most commonly used by students in English schools and universities. It was hoped that the project might be completed by a single editor within five years.

Diggle was then chair of the project’s advisory committee. He said: “Soon after work began it became clear that it was not possible to revise the Intermediate Lexicon; it was too antiquated in concept, design and content. It was better to start afresh by compiling a new lexicon.

“We didn’t realise at the time the magnitude of the task, and it was only because of advances in technology that we were able to take it on. We then had to appoint additional editorial staff and raise a huge amount of financial support. It took us over 20 years because we decided that if we were going to do it we must do it thoroughly.

“At the outset of the project I undertook to read everything which the editors wrote. I soon realised that if we were ever to finish I had better start to write entries myself, and for the last 15 years or more I was fully occupied with it and did little else – it took over my life.”

The Cambridge Greek Lexicon takes a fundamentally different approach to its Victorian predecessor. While entries in the Liddell and Scott lexicon usually start with a word’s earliest appearance in the literature, the Cambridge team realised this might not give its original, or root, meaning. Instead, they begin their entries with that root meaning and then in numbered sections trace the word’s development in different contexts.

Opening summaries help ease the reader into the longer entries, setting out the order of what is to follow, while different fonts signpost the way, helping the reader to distinguish between definitions, translations, and other material, such as grammatical constructions.

The team tackled countless other interesting and challenging words, including πόλις, which will be familiar to many in its English form ‘polis’. Diggle said: “Our article shows the variety of senses which the word can have: in its earliest usage ‘citadel, acropolis’, then (more generally) ‘city, town’, also ‘territory, land’, and (more specifically, in the classical period) ‘city as a political entity, city-state’, also (with reference to the occupants of a city) ‘community, citizen body’.”

“Verbs can be the most difficult items to deal with, especially if they are common verbs with many different but interrelated uses. ἔχω (ékhō) is one of the commonest Greek verbs, whose basic senses are ‘have’ and ‘hold’. Our entry for this verb runs to 55 sections. If a verb has as many applications as this, you need to provide the reader with signposts, to show how you have organised the material, to show that you have organised the numbered sections in groups, and to show that these groups follow logically one from the other.”

Professor Robin Osborne, Chair of the Faculty of Classics, said: “The Faculty takes enormous pride in this dictionary and in the way Cambridge University Press have aided us and produced it. It’s a beautiful piece of book making.”

“We invested in the Lexicon to make a contribution to the teaching of Greek over the next century. This puts into the hands of students a resource that will enable them access to Ancient Greek more securely and easily.

“It is hugely important that we continue to engage with the literature of Ancient Greece, not as texts frozen in a past world, but which engage with the world in which we live. There’s been continual engagement with them since antiquity, so we are also engaging with that history, which is the history of European thought.”

The project’s attention to detail also extended to the Press and the typesetters, who took immense care to ensure that consulting the Lexicon would be an easy and pleasurable experience, right down to a specially created text design. Diggle and his fellow editors input their entries for the Lexicon in XML, using a complex system of more than 100 digital tags to ensure each element was automatically rendered in the correct format.

This also allowed for a constant feedback loop between the editors, the Press and the typesetters, with proofs reviewed and corrected, and the style and content honed as work progressed.

Michael Sharp, the Lexicon’s Publisher, said: “The Cambridge Greek Lexicon is one of the most important Classics books we have ever published. It represents a milestone in the history of Classics, the University and the Press. I am elated, relieved and immensely proud of the part the Press has played in supporting this project.”

Professor Diggle said: “The moment of greatest relief and joy was when I was able to sign off the very, very final proofs and say to the Press ‘It’s finished. You can print it’. You can’t imagine what it was like, to realise that we had finally got there; I literally wept with joy.”

source: https://www.cam.ac.uk/news/epic-dictionary-re-defines-ancient-greek-including-the-words-which-made-the-victorians-blush


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. Images, including our videos, are Copyright ©University of Cambridge and licensors/contributors as identified.  All rights reserved. We make our image and video content available in a number of ways – as here, on our main website under its Terms and conditions, and on a range of channels including social media that permit your use and sharing of our content under their respective Terms.

Victory for Cambridge’s Men’s and Women’s Crews in the Boat Race 2021


Credit: Row360

 

Cambridge won the men’s and women’s races during two hard-fought battles on the Great Ouse at Ely

 

Cambridge scored a double victory in the Boat Race 2021, with both the men’s and women’s crews coming out on top following two thrilling races.

The event was moved from the usual course, along the Thames in London, because of a combination of the COVID-19 pandemic and repair work on Hammersmith Bridge.

Professor Stephen Toope, Vice-Chancellor of Cambridge University, paid tribute to both crews.

“Huge congratulations to both crews in the Boat Race. They did the Light Blues proud in two hard-fought duels against battling and determined Oxford opponents.

“Every member of every crew played their part in an afternoon of extraordinary sporting excitement.”



Original Article: https://www.cam.ac.uk/news/victory-for-cambridges-mens-and-womens-crews-in-the-boat-race-2021

Widespread Use of Control Measures Such as Facemasks is Vital to Suppress the Pandemic as Lockdown Lifts, Say Scientists


Man putting on a facemask Credit: Kobby Mendez on Unsplash

 

A new mathematical model suggests that the easing of lockdown must be accompanied by wider and more effective use of control measures such as facemasks, even with vaccination, in order to suppress COVID-19 more quickly and reduce the likelihood of another lockdown.

 

More effective use of control measures like facemasks and handwashing would help us to stop the pandemic faster.

Yevhen Suprunenko

The model, developed by scientists at the Universities of Cambridge and Liverpool, is published today in the Journal of the Royal Society Interface. It uses mathematical equations to provide general insights about how COVID-19 will spread under different potential control scenarios.

Control measures involving facemasks, handwashing and short-scale (1-2 metre) social distancing can all limit the number of virus particles being spread between people. These are termed ‘non-spatial’ measures to distinguish them from a second category of ‘spatial’ control measures that include lockdown and travel restrictions, which reduce how far virus particles can spread. The new model compares the efficacy of different combinations of measures in controlling the spread of COVID-19, and shows how non-spatial control needs to be ramped up as lockdown is lifted.
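The distinction between the two categories can be illustrated with a toy compartmental model. This is only a sketch: the study itself uses a spatially explicit individual-based model, and the parameter values and function names below are illustrative assumptions, not the authors’ code. Non-spatial measures (masks, handwashing) scale down transmission per contact, while spatial measures (lockdown) scale down the contact rate itself; in this simplified SIR form the two act on the same product.

```python
# Toy SIR sketch contrasting two kinds of control (illustrative only;
# the published study uses a spatially explicit individual-based model).
# 'non_spatial' scales transmission per contact (masks, handwashing);
# 'spatial' scales the contact rate itself (lockdown, travel limits).

def simulate(beta=0.3, gamma=0.1, non_spatial=1.0, spatial=1.0,
             days=365, i0=1e-4):
    s, i, r = 1.0 - i0, i0, 0.0
    peak = i
    for _ in range(days):  # simple daily Euler steps
        new_inf = beta * non_spatial * spatial * s * i
        new_rec = gamma * i
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
        peak = max(peak, i)
    return peak, r  # peak prevalence, final epidemic size

# No control versus halving per-contact transmission with masks:
peak0, size0 = simulate()
peak1, size1 = simulate(non_spatial=0.5)
assert peak1 < peak0 and size1 < size0
```

In this sketch, ramping up `non_spatial` control as `spatial` control is relaxed keeps the effective transmission rate down, which is the qualitative point the model makes about lifting lockdown.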

“More effective use of control measures like facemasks and handwashing would help us to stop the pandemic faster, or to get better results in halting transmission through the vaccination programme. This also means we could avoid another potential lockdown,” said Dr Yevhen Suprunenko, a Research Associate in the University of Cambridge’s Department of Plant Sciences and first author of the paper. The authors stress that their predictions rely on such non-spatial control measures being implemented effectively.

The model also considered the socio-economic impact of both types of measure, and how this changes during the pandemic. The socio-economic consequences of spatial measures such as lockdown have increased over time, while the cost of non-spatial control measures has decreased – for example, facemasks have become more widely available and people have become used to wearing them.

“Measures such as lockdowns that limit how far potentially infected people move can have a stronger impact on controlling the spread of disease, but methods that reduce the risk of transmission whenever people mix provide an inexpensive way to supplement them,” said Dr Stephen Cornell at the University of Liverpool, co-author of the paper.

The model arose from a broader research programme to identify control strategies for plant diseases threatening staple crops. By using a mathematical approach rather than a conventional computer simulation model, the authors were able to identify – for a wide range of scenarios – general insights on how to deal with newly emerging infectious diseases of plants and animals.

“Our new model will help us study how different infectious diseases can spread and become endemic. This will enable us to find better control strategies, and stop future epidemics faster and more efficiently,” said Professor Chris Gilligan in the University of Cambridge’s Department of Plant Sciences, co-author of the paper.

Part of this research was funded by the Bill and Melinda Gates Foundation.

Reference

Suprunenko, Y.F. et al: ‘Analytical approximation for invasion and endemic thresholds, and the optimal control of epidemics in spatially explicit individual-based models.’ J. R. Soc. Interface, March 2021. DOI: 10.1098/rsif.2020.0966

Image credit: https://unsplash.com/photos/VGYcVZguFzs?utm_source=unsplash&utm_medium=referral&utm_content=creditShareLink



Lakes on Greenland Ice Sheet Can Drain Huge Amounts of Water, Even In Winter


Lake on the surface of the Greenland Ice Sheet Credit: Ian Willis

 

Using satellite data to ‘see in the dark’, researchers have shown for the first time that lakes on the Greenland Ice Sheet drain during winter, a finding with implications for the speed at which the world’s second-largest ice sheet flows to the ocean.

 

We don’t yet know how widespread this winter lake drainage phenomenon is, but it could have important implications for the Greenland Ice Sheet, as well as elsewhere in the Arctic and Antarctic

Ian Willis

The researchers, from the University of Cambridge, used radar data from a European Space Agency satellite to show that even when the heat from the Sun is absent, these lakes can discharge large amounts of water to the base of the ice sheet. These ‘drainage events’ are thought to play a significant role in accelerating the movement of the ice by lubricating it from below.

Previous studies of draining lakes have all been carried out during the summer months, through a combination of direct field observations and optical satellite data, which requires daylight.

The approach developed by the Cambridge researchers uses the radar ‘backscatter’ – the reflection of waves back to the satellite from where they were emitted – to detect changes in the lakes during the winter months, when Greenland is in near-total darkness.

The results, reported in the journal The Cryosphere, imply that the ‘plumbing’ system beneath the Greenland Ice Sheet doesn’t just slowly leak water from the previous summer, but even in the depths of the Arctic winter, it can be ‘recharged’, as large amounts of surface lake water cascade to the base of the ice sheet.

Many previous studies have shown that the Greenland Ice Sheet is losing mass, and the rate of loss is accelerating, due to melting and runoff.

“One of the unknowns in terms of predicting the future of the ice sheet is how fast the glaciers move – whether they will speed up and if so, by how much,” said co-author Dr Ian Willis from Cambridge’s Scott Polar Research Institute (SPRI). “The key control on how fast the glaciers move is the amount of meltwater getting to the bottom of the ice sheet, which is where our work comes in.”

Lakes form on the surface of the Greenland ice sheet each summer as the weather warms. They exist for weeks or months but can drain in a matter of hours due to hydrofracturing, transferring millions of cubic metres of water and heat to the base of the ice sheet. The affected areas include sensitive regions of the ice sheet interior where the impact on ice flow is potentially large.

“It’s always been thought that these lakes drained only in the summer, simply because it’s warmer and the sun causes the ice to melt,” said co-author Corinne Benedek, also from SPRI. “In the winter, it’s dark and the surfaces freeze. We thought that the filling of the lakes is what caused their eventual drainage, but it turns out that isn’t always the case.”

Benedek, who is currently a PhD candidate at SPRI, first became interested in what happens to surface lakes in the winter while she was a Master’s student studying satellite thermal data.

“The thermal data showed me that liquid water can survive in the lakes throughout the winter,” she said. “Previous studies using airborne radar had also identified lakes buried a few metres beneath the surface of the ice sheet in the summer. Both of these things got me thinking about ways to observe lakes all year long. The optical satellite imagery we normally use to observe the lakes isn’t available in winter, or even when it’s cloudy.”

Benedek and Willis developed a method using data from the Sentinel-1 satellite, which uses a type of radar called synthetic aperture radar (SAR). SAR functions at a wavelength that makes it possible to see through clouds and in the dark. Ice and water produce different backscatter signatures in SAR imagery, and so the researchers developed an algorithm that tracks when sudden changes in SAR backscatter occur.

Over three winters, they identified six lakes that appeared to drain over the winter months. These lakes were buried lakes or surface lakes that were frozen over. The algorithm was able to identify where the backscatter characteristics of the lake changed markedly between one image and the next one recorded 12 days later.
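The core of such change detection can be sketched very simply. The function below is an illustrative assumption, not the authors’ actual algorithm: it takes a time series of mean backscatter (in decibels) over a lake area from successive Sentinel-1 acquisitions, nominally 12 days apart, and flags acquisitions where the value jumps markedly from the previous image, a candidate drainage event. The 3 dB threshold is a made-up example value.

```python
import numpy as np

# Illustrative sketch of SAR backscatter change detection; the threshold
# and data below are assumptions, not values from the published study.

def detect_drainage(series, threshold_db=3.0):
    """Return indices of acquisitions whose mean backscatter differs
    markedly from the preceding image (candidate drainage events)."""
    series = np.asarray(series, dtype=float)
    diffs = np.diff(series)  # change between consecutive acquisitions
    return np.flatnonzero(np.abs(diffs) > threshold_db) + 1

# A hypothetical winter series with a sudden jump at the 4th acquisition:
winter = [-18.0, -17.6, -18.2, -11.5, -11.8, -12.0]
print(detect_drainage(winter))  # -> [3]
```

A real pipeline would work on co-registered image stacks and screen out noise, but the principle is the same: a stable backscatter signature that changes abruptly between two 12-day acquisitions marks a lake worth checking against optical and elevation data.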

The SAR data was backed up with additional optical data from the previous autumn and subsequent spring, which confirmed that lake areas shrank considerably for the six drained lakes. For three of the lakes, the optical data, as well as data from other satellites, was used to show that the snow- and ice-covered lakes collapsed, dropping by several metres, again confirming the water had drained.

“The first lake I found was surprising,” said Benedek. “It took me a while to be sure that what I thought I was seeing was really what I was seeing. We used surface elevation data from before and after the events to confirm what we were thinking. We know now that drainage of lakes during the winter is something that can happen, but we don’t yet know how often it happens.”

“Glaciers slow down in the winter, but they’re still moving,” said Willis. “It must be this movement that causes fractures to develop in certain places allowing some lakes to drain. We don’t yet know how widespread this winter lake drainage phenomenon is, but it could have important implications for the Greenland Ice Sheet, as well as elsewhere in the Arctic and Antarctic.”

 

Reference:
Corinne L. Benedek and Ian C. Willis. ‘Winter drainage of surface lakes on the Greenland Ice Sheet from Sentinel-1 SAR imagery.’ The Cryosphere (2021). DOI: 10.5194/tc-15-1-2021



Original Article: https://www.cam.ac.uk/research/news/lakes-on-greenland-ice-sheet-can-drain-huge-amounts-of-water-even-in-winter

Opinion: Why Scientists Need To Work More Closely With Faith Communities On Climate Change


Unitarian Universalist and larger faith contingent taking part in the 21 September 2014 Peoples Climate March Credit: © Peter Bowden

 

To make sufficient progress in the fight against climate change, scientists need to start taking religious groups more seriously as allies, writes Cambridge political scientist, Dr Tobias Müller, in Nature.

 

I am used to sceptical looks when I talk to scientists about my work with religious communities

Tobias Müller

I am used to sceptical looks when I talk to scientists about my work with religious communities. They have reason to see science as under threat from zealots: examples abound, from the treatment of Galileo Galilei to vaccine aversion. But faith communities can feel the same way about scientists. Even if they disagree on important topics, it’s both possible and essential to collaborate on urgent issues, such as the fact that large parts of Earth are becoming uninhabitable. In my view, this Easter, Passover or Ramadan is the perfect time to start.

I’m a political scientist who studies how religious groups respond to problems, from environmental crises to domestic violence to racism. Since 2013, I have worked with other researchers, some religious and some not, to explore climate science with communities of faith.

I’ve seen the power of this approach: some 1,200 institutions have committed to divest from fossil-fuel companies, totalling US$14.5 trillion. One-third are faith-based organizations. Many, such as Operation Noah, are led by scientists. Similarly, the group Extinction Rebellion Muslims has built a transnational network with scientists and activists in Kenya, Gambia, the United Kingdom and beyond; they host “Green Ramadan” seminars. Their efforts stalled plans for a luxury tourist resort that would have destroyed parts of the Nairobi National Park in Kenya. A co-campaigner, Maasai leader Nkamunu Patita, has been appointed to a government task force that will map wildlife-migration routes and be consulted in future development plans.

 

This is the opening of an opinion piece published in Nature on 30 March 2021. This is open access and can be read in full here.

Dr Tobias Müller is a Junior Research Fellow at the Woolf Institute and an Affiliated Lecturer at Cambridge’s Department of Politics and International Studies (POLIS).



Original Article: https://www.cam.ac.uk/research/discussion/opinion-why-scientists-need-to-work-more-closely-with-faith-communities-on-climate-change

Gene Therapy Technique Shows Potential for Repairing Damage Caused by Glaucoma and Dementia


Screening for glaucoma Credit: IAPB/VISION 2020

Scientists at the University of Cambridge have shown in animal studies that gene therapy may help repair some of the damage caused in chronic neurodegenerative conditions such as glaucoma and dementia. Their approach demonstrates the potential effectiveness of gene therapy in polygenic conditions – that is, complex conditions with no single genetic cause.

 

[Our] approach also leads to a much more sustained therapeutic effect, which is very important for a treatment aimed at a chronic degenerative disease

Tasneem Khatib

Gene therapy – where a missing or defective gene is replaced by a healthy version – is becoming increasingly common for a number of neurological conditions including Leber’s Congenital Amaurosis, Spinal Muscular Atrophy and Leber’s Hereditary Optic Neuropathy. However, each of these conditions is rare, and monogenic – that is, caused by a single defective gene. The application of gene therapy to complex polygenic conditions, which make up the majority of neurodegenerative diseases, has been limited to date.

A common feature of neurodegenerative diseases is disruption of axonal transport, a cellular process responsible for movement of key molecules and cellular ‘building blocks’ including mitochondria, lipids and proteins to and from the body of a nerve cell. Axons are long fibres that transmit electrical signals, allowing nerve cells to communicate with other nerve cells and muscles. Scientists have suggested that stimulating axonal transport by enhancing intrinsic neuronal processes in the diseased central nervous system might be a way to repair damaged nerve cells.

Two candidate molecules for improving axonal function in injured nerve cells are brain-derived neurotrophic factor (BDNF) and its receptor tropomyosin receptor kinase B (TrkB).

In research published today in Science Advances, scientists at the University of Cambridge show that delivering both of these molecules simultaneously to nerve cells using a single virus has a strong effect in stimulating axonal growth compared to delivering either molecule on its own. They tested their idea in two models of neurodegenerative disease known to be associated with reduced axonal transport, namely glaucoma and tauopathy (a degenerative disease associated with dementia).

Dr Tasneem Khatib from the John van Geest Centre for Brain Repair at the University of Cambridge, the study’s first author, said: “The axons of nerve cells function a bit like a railway system, where the cargo is essential components required for the cells to survive and function. In neurodegenerative diseases, this railway system can get damaged or blocked. We reckoned that replacing two molecules that we know work effectively together would help to repair this transport network more effectively than delivering either one alone, and that is exactly what we found.

“This combined approach also leads to a much more sustained therapeutic effect, which is very important for a treatment aimed at a chronic degenerative disease.

“Rather than using the standard gene therapy approach of replacing or repairing damaged genes, we used the technique to supplement these molecules in the brain.”

Glaucoma is damage to the optic nerve often, but not always, associated with abnormally high pressure in the eye. In an experimental glaucoma model, the researchers used a tracer dye to show that axonal transport between the eye and brain was impaired in glaucoma. Similarly, a reduction in electrical activity in the retina in response to light suggested that vision was also impaired.

Dr Khatib and colleagues used ‘viral vectors’ – gene therapy delivery systems – to deliver TrkB and BDNF to the retina of rats. They found that this restored axonal transport between the retina and the brain, as observed by movement of the dye. The retinas also showed an improved electrical response to light, a key prerequisite for visual restoration.

Next, the team used transgenic mice bred to model tauopathy, the build-up of ‘tangles’ of tau protein in the brain. Tauopathy is seen in a number of neurodegenerative diseases including Alzheimer’s disease and frontotemporal dementia. Once again, injection of the dye showed that axonal transport was impaired between the eye and the brain – and that this was restored using the viral vectors.

Intriguingly, the team also found preliminary evidence of possible improvement in the mice’s short-term memory. Prior to treatment, the researchers tested the mice on an object recognition task. The mouse was placed at the start of a Y-shaped maze and left to explore two identical objects at the end of the two arms. After a short while, the mouse was once again placed in the maze, but this time one arm contained a new object, while the other contained a copy of the repeated object. The researchers measured the amount of the time the mouse spent exploring each object to see whether it had remembered the object from the previous task.

This task was repeated after the viral vector had been injected into the mouse’s brain and the results were suggestive of a small improvement in short-term memory. While the results of this particular study did not quite achieve statistical significance – a measure of how robust the findings are – the researchers say they are promising and a larger study is now planned to confirm the effect.

Professor Keith Martin from the Centre for Eye Research Australia and the University of Melbourne, who led the study while at Cambridge, added: “While this is currently early stage research, we believe it shows promise for helping to treat neurodegenerative diseases that have so far proved intractable. Gene therapy has already proved effective for some rare monogenic conditions, and we hope it will be similarly useful for these more complex diseases which are much more common.”

The research was supported by Fight for Sight, Addenbrooke’s Charitable Trust, the Cambridge Eye Trust, the Jukes Glaucoma Research Fund, Quethera Ltd, Alzheimer’s Research UK, Gates Cambridge Trust, Wellcome and the Medical Research Council.

Reference
Khatib, TZ et al. Receptor-ligand supplementation via a self-cleaving 2A peptide-based gene therapy promotes CNS axon transport with functional recovery. Science Advances; 31 Mar 2021; DOI: 10.1126/sciadv.abd2590

Original Article: https://www.cam.ac.uk/research/news/gene-therapy-technique-shows-potential-for-repairing-damage-caused-by-glaucoma-and-dementia



Functioning ‘Mechanical Gears’ Seen In Nature For The First Time


 

 

Previously believed to be only man-made, a natural example of a functioning gear mechanism has been discovered in a common insect – showing that evolution developed interlocking cogs long before we did.

 

In Issus, the skeleton is used to solve a complex problem that the brain and nervous system can’t

Malcolm Burrows

The juvenile Issus – a plant-hopping insect found in gardens across Europe – has hind-leg joints with curved cog-like strips of opposing ‘teeth’ that intermesh, rotating like mechanical gears to synchronise the animal’s legs when it launches into a jump.

The finding demonstrates that gear mechanisms previously thought to be solely man-made have an evolutionary precedent. Scientists say this is the “first observation of mechanical gearing in a biological structure”.

Through a combination of anatomical analysis and high-speed video capture of normal Issus movements, scientists from the University of Cambridge have been able to reveal these functioning natural gears for the first time. The findings are reported in the latest issue of the journal Science.

The gears in the Issus hind-leg bear a remarkable resemblance to those found on every bicycle and inside every car gearbox. Each gear tooth has a rounded corner at the point where it connects to the gear strip, a feature identical to man-made gears such as bike gears – essentially a shock-absorbing mechanism that stops teeth from shearing off.

The gear teeth on the opposing hind-legs lock together like those in a car gearbox, ensuring almost complete synchronicity in leg movement – the legs always move within 30 microseconds of each other, with one microsecond equal to a millionth of a second.

Cog wheels connecting the hind legs of the plant hopper, Issus
Credit: Burrows/Sutton

This is critical for the powerful jumps that are this insect’s primary mode of transport, as even minuscule discrepancies in synchronisation between the velocities of its legs at the point of propulsion would result in “yaw rotation” – causing the Issus to spin hopelessly out of control.

“This precise synchronisation would be impossible to achieve through a nervous system, as neural impulses would take far too long for the extraordinarily tight coordination required,” said lead author Professor Malcolm Burrows, from Cambridge’s Department of Zoology.

“By developing mechanical gears, the Issus can just send nerve signals to its muscles to produce roughly the same amount of force – then if one leg starts to propel the jump the gears will interlock, creating absolute synchrony.

“In Issus, the skeleton is used to solve a complex problem that the brain and nervous system can’t,” said Burrows. “This emphasises the importance of considering the properties of the skeleton in how movement is produced.”

“We usually think of gears as something that we see in human designed machinery, but we’ve found that that is only because we didn’t look hard enough,” added co-author Gregory Sutton, now at the University of Bristol.

“These gears are not designed; they are evolved – representing high speed and precision machinery evolved for synchronisation in the animal world.”

Interestingly, the mechanical gears are only found in the insect’s juvenile – or ‘nymph’ – stages, and are lost in the final transition to adulthood. These transitions, called ‘molts’, are when animals cast off rigid skin at key points in their development in order to grow.

It’s not yet known why the Issus loses its hind-leg gears on reaching adulthood. The scientists point out that a problem with any gear system is that if one tooth on the gear breaks, the effectiveness of the whole mechanism is damaged. While gear-teeth breakage in nymphs could be repaired in the next molt, any damage in adulthood remains permanent.

It may also be down to the larger size of adults and consequently their ‘trochantera’ – the insect equivalent of the femur or thigh bones. The bigger adult trochantera might allow them to create enough friction to power the enormous leaps from leaf to leaf without the need for intermeshing gear teeth to drive it, say the scientists.

Each gear strip in the juvenile Issus was around 400 micrometres long and had between 10 and 12 teeth, with both sides of the gear in each leg containing the same number – giving a gearing ratio of 1:1.

Unlike man-made gears, each gear tooth is asymmetrical and curved towards the point where the cogs interlock. This is because man-made gears need a symmetric shape to work in both rotational directions, whereas the Issus gears power motion in only one direction, to launch the animal forward.

While there are examples of apparently ornamental cogs in the animal kingdom – such as on the shell of the cog wheel turtle or the back of the wheel bug – gears with a functional role either remain elusive or have been rendered defunct by evolution.

The Issus is the first example of a natural cog mechanism with an observable function, say the scientists.

Inset image: an Issus nymph

For more information, please contact fred.lewsey@admin.cam.ac.uk


This work is licensed under a Creative Commons Licence. If you use this content on your site please link back to this page.

STAY TUNED: New Connected Cambridge Eletter Format Coming!


As we approach the 550th e-letter bringing you the latest news and jobs in and around Cambridge, we feel the format and style are due an update.

The new format will look great on mobile or desktop – whichever device you prefer to use.

via GIPHY

If you have any suggestions for us, please fill out the comment section below and we’ll take them into account during this exciting time!

 

Mindfulness Can Improve Mental Health and Wellbeing – But Unlikely To Work For Everyone


Mindfulness meditation
Credit: World Economic Forum

source: cam.ac.uk

 

Mindfulness courses can reduce anxiety, depression and stress and increase mental wellbeing within most but not all non-clinical settings, say a team of researchers at the University of Cambridge. They also found that mindfulness may be no better than other practices aimed at improving mental health and wellbeing.

 

Mindfulness training in the community needs to be implemented with care. Community mindfulness courses should be just one option among others

Julieta Galante

Mindfulness is typically defined as ‘the awareness that emerges through paying attention on purpose, in the present moment, and nonjudgmentally to the unfolding of experience moment by moment’. It has become increasingly popular in recent years as a way of increasing wellbeing and reducing stress levels.

In the UK, the National Health Service offers therapies based on mindfulness to help treat mental health issues such as depression and suicidal thoughts. However, the majority of people who practice mindfulness learn their skills in community settings such as universities, workplaces, or private courses. Mindfulness-based programmes are frequently promoted as the go-to universal tool to reduce stress and increase wellbeing, accessible to anyone, anywhere.

Many randomised controlled trials (RCTs) have been conducted around the world to assess whether in-person mindfulness training can improve mental health and wellbeing, but the results are often varied. In a report published today in PLOS Medicine, a team of researchers from the Department of Psychiatry at the University of Cambridge led a systematic review and meta-analysis to examine the published data from the RCTs. This approach allows them to bring together existing – and often contradictory or under-powered – studies to provide more robust conclusions.

The team identified 136 RCTs on mindfulness training for mental health promotion in community settings. These trials included 11,605 participants aged 18 to 73 years from 29 countries, more than three-quarters (77%) of whom were women.

The researchers found that in most community settings, compared with doing nothing, mindfulness reduces anxiety, depression and stress, and increases wellbeing. However, the data suggested that in more than one in 20 trial settings, mindfulness-based programmes may not improve anxiety and depression.

Dr Julieta Galante from the Department of Psychiatry at the University of Cambridge, the report’s first author, said: “For the average person and setting, practising mindfulness appears to be better than doing nothing for improving our mental health, particularly when it comes to depression, anxiety and psychological distress – but we shouldn’t assume that it works for everyone, everywhere.

“Mindfulness training in the community needs to be implemented with care. Community mindfulness courses should be just one option among others, and the range of effects should be researched as courses are implemented in new settings. The courses that work best may be those aimed at people who are most stressed or in stressful situations, for example health workers, as they appear to see the biggest benefit.”

The researchers caution that RCTs in this field tended to be of poor quality, so the combined results may not represent the true effects. For example, many participants stopped attending mindfulness courses and were not asked why, so they are not represented in the results. When the researchers repeated the analyses including only the higher quality studies, mindfulness only showed effects on stress, not on wellbeing, depression or anxiety.

When compared against other ‘feel good’ practices such as exercise, mindfulness fared neither better nor worse. Professor Peter Jones, also from Cambridge’s Department of Psychiatry, and senior author, said: “While mindfulness is often better than taking no action, we found that there may be other effective ways of improving our mental health and wellbeing, such as exercise. In many cases, these may prove to be more suitable alternatives if they are more effective, culturally more acceptable or are more feasible or cost effective to implement. The good news is that there are now more options.”

The researchers say that the variability in the success of different mindfulness-based programmes identified among the RCTs may be down to a number of reasons, including how, where and by whom they are implemented as well as at whom they are targeted. The techniques and frameworks taught in mindfulness have rich and diverse backgrounds, from early Buddhist psychology and meditation through to cognitive neuroscience and participatory medicine – the interplay between all of these different factors can be expected to influence how effective a programme is.

The number of online mindfulness courses has increased rapidly, accelerated further by the COVID-19 pandemic. Although this review did not look at online courses, studies suggest that these may be as effective as their offline counterparts, despite most lacking interaction with teachers and peers.

Dr Galante added: “If the effects of online mindfulness courses vary as widely according to the setting as their offline counterparts, then the lack of human support they offer could cause potential problems. We need more research before we can be confident about their effectiveness and safety.”

The research was funded by the National Institute for Health Research (NIHR) Applied Research Collaboration East of England and NIHR Cambridge Biomedical Research Centre, with additional support from the Cambridgeshire & Peterborough NHS Foundation Trust, Medical Research Council, Wellcome and the Spanish Ministry of Education, Culture and Sport.

Reference
Galante, J et al. Mindfulness-based programmes for mental health promotion in adults in non-clinical settings: A systematic review and meta-analysis of randomised controlled trials. PLOS Medicine; 11 Jan 2021; DOI: 10.1371/journal.pmed.1003481


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. Images, including our videos, are Copyright ©University of Cambridge and licensors/contributors as identified.  All rights reserved. We make our image and video content available in a number of ways – as here, on our main website under its Terms and conditions, and on a range of channels including social media that permit your use and sharing of our content under their respective Terms.

Asymptomatic Screening and Genome Sequencing Help Cambridge Understand Spread Of SARS-CoV-2 Among Its Students

Cambridge University shield
Credit: Sir Cam

Since the start of the academic year in October 2020, the University of Cambridge has been offering regular SARS-CoV-2 tests to all students living in its Colleges, even if they show no symptoms. Initial results suggest that the screening programme, together with the University’s public health measures and responsible student behaviour, has helped limit the spread of the virus.

 

Asymptomatic screening can help identify cases of infection early, including where students are unaware of infection, and inform infection control measures. This has never been more urgent, with the emergence of the new variant

Patrick Maxwell

Now, the team running the programme has joined up with researchers at the COVID-19 Genomics UK Consortium (COG-UK) to track how infections spread among the student population. They have shown how a small number of transmission events early on were likely responsible for most of the infections at the University and found little evidence of substantial transmission of SARS-CoV-2 between students and the local Cambridge community in the first five weeks of term.

Around 12,000 students living in College accommodation (80% of eligible students) signed up to the asymptomatic screening programme, which uses a pooled sample approach to reduce the number of tests to fewer than 2,000 per week. In the first weeks of term, 1-2 students from each ‘household’ were tested each week; this has now increased to all participating students being tested each week. In addition, the University offers tests to students and staff who show symptoms of potential COVID-19.
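The saving from pooling can be illustrated with the classic two-stage (Dorfman) scheme: test each pool once, then retest individuals only in pools that come back positive. The pool size and prevalence below are hypothetical, not the programme's actual protocol.

```python
def expected_tests_per_person(pool_size, prevalence):
    """Expected tests per person under two-stage (Dorfman) pooling:
    1/pool_size for the pool test itself, plus one individual retest
    whenever the pool contains at least one positive sample."""
    p_pool_positive = 1 - (1 - prevalence) ** pool_size
    return 1 / pool_size + p_pool_positive

# With low prevalence (~1%) and pools of 10, each person costs
# far less than one test on average.
rate = expected_tests_per_person(10, 0.01)
weekly_tests = round(12000 * rate)
```

At higher prevalence the saving shrinks, because more pools test positive and trigger individual retests; pooled screening is most efficient precisely when infection is rare.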

The University is also playing a leading role in COG-UK, which is sequencing the genetic code of virus samples isolated from infected individuals to help better understand the spread of infection. As a virus spreads, its genetic code acquires mutations. By comparing the genetic code of samples, it is possible to plot a genetic ‘family tree’, known as a phylogenetic tree, and, coupled with epidemiological information, to say whether two cases are related – identical or almost-identical samples are likely to be closely related, while genomes with a larger number of genetic differences are less likely to be related.
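The comparison step can be sketched as counting differences between aligned sequences. The fragments below are made up for illustration; real pipelines align whole ~30,000-base genomes and use dedicated tree-building software rather than raw distance counts.

```python
def genetic_distance(seq_a, seq_b):
    """Count the positions at which two aligned sequences differ
    (the Hamming distance)."""
    assert len(seq_a) == len(seq_b), "sequences must be aligned"
    return sum(a != b for a, b in zip(seq_a, seq_b))

# Toy aligned fragments: sample_b differs from sample_a at one site,
# sample_c at four, so a and b are more likely to be closely related.
sample_a = "ATGGTACCTTAG"
sample_b = "ATGGTACCTTAA"
sample_c = "ATGCTACGTTGA"
```

Clustering cases by small pairwise distances is what lets the consortium group infections into lineages and infer whether outbreaks stem from one introduction or many.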

As part of this work, COG-UK is analysing virus samples from students identified as positive through the University of Cambridge’s testing programmes and comparing them to samples taken from people in the wider Cambridge community. COG-UK and the University have released their interim report, covering the first five weeks of term.

The analysis showed that in week two, 90% of infections were confined to three lineages (related viral genomes). This lack of diversity suggests that a small number of transmission events at the start of term led to the majority of infections in the University.

Outbreaks that have largely been restricted to single Colleges appear to have been contained, suggesting that measures to prevent spread of the virus were successful. In one of the largest clusters (which included 32 cases by week three), half of the students were asymptomatic, highlighting the importance of screening programmes in helping identify infected individuals.

The largest cluster of cases (139 cases by week five, including 135 students, 1 staff member and 3 individuals from the local community) was the source of ongoing transmission within the University. It included students from a number of Colleges, courses and years of study. However, it is not clear whether these cases can be traced back to a single event that led to dispersion amongst Colleges and courses.

Dr Dinesh Aggarwal from the Department of Medicine at the University of Cambridge and a member of COG-UK said: “It appears that a few instances of the virus being introduced to the University account for the majority of cases of established transmission. This suggests to us that in most cases, when a virus was introduced, students behaving responsibly and complying with infection control measures helped stop the virus in its tracks.

“We hope it will be particularly reassuring that so far we have not found evidence of substantial transmission between our students and the local community.”

Dr Ben Warne, a Clinical Research Fellow and one of the leads on the University’s asymptomatic screening programme, added: “It’s clear we need to better understand how the virus spreads between students on different courses and at different Colleges. Once established, these widely-distributed outbreaks are more challenging to control, potentially resulting in continued spread. Genomics should help us piece together this puzzle and help us target prevention strategies.”

The team say their findings appear to suggest that a regular screening programme to detect asymptomatic infection and robust containment measures can be effective at limiting transmission both within the University and to the wider community. This will be particularly important with the emergence of a new, more transmissible variant and substantially higher levels of transmission within the community.

Patrick Maxwell, Regius Professor of Physic at the University of Cambridge, said: “Getting our screening programme up and running in time for the start of term was no small order, but we believe it has paid off. Asymptomatic screening can help identify cases of infection early, including where students are unaware of infection, and inform infection control measures. This has never been more urgent, with the emergence of the new variant.”

The University recently announced that while it will remain open, almost all teaching and learning for undergraduate and postgraduate taught students will move online for the entirety of the Lent term. Undergraduate and postgraduate taught students have been asked to remain where they are currently staying, other than for certain exceptions.



Study Identifies Genetic Changes Likely To Have Enabled SARS-CoV-2 to Jump From Bats to Humans

Horseshoe bats
Credit: orientalizing on Flickr

A new study, involving the University of Cambridge and led by the Pirbright Institute, has identified key genetic changes in SARS-CoV-2 – the virus that causes COVID-19 – that may be responsible for the jump from bats to humans, and established which animals have cellular receptors that allow the virus to enter their cells most effectively.

 

It is essential to understand which animals can be infected by SARS-CoV-2 and how mutations in the viral spike protein change its ability to infect different species

Stephen Graham

The genetic adaptions identified were similar to those made by SARS-CoV – which caused the 2002-2003 SARS epidemic – when it adapted from bats to infect humans. This suggests that there may be a common mechanism by which this family of viruses mutates in order to jump from animals to humans. This understanding can be used in future research to identify viruses circulating in animals that could adapt to infect humans (known as zoonoses) and which potentially pose a pandemic threat.

“This study used a non-infectious, safe platform to probe how spike protein changes affect virus entry into the cells of different wild, livestock and companion animals, something we will need to continue monitoring closely as additional SARS-CoV-2 variants arise in the coming months,” said Dr Stephen Graham in the University of Cambridge’s Department of Pathology, who was involved in the study.

In the 2002-2003 SARS epidemic, scientists were able to identify closely related isolates in both bats and civets – in which the virus is thought to have adapted to infect humans. However, in the current COVID-19 outbreak scientists do not yet know the identity of the intermediate host or have similar samples to analyse. But they do have the sequence of a related bat coronavirus, RaTG13, which shares 96% sequence similarity with the SARS-CoV-2 genome. The new study compared the spike proteins of both viruses and identified several important differences.

SARS-CoV-2 and other coronaviruses use their spike proteins to gain entry to cells by binding to their surface receptors, for example ACE2. Like a lock and key, the spike protein must be the right shape to fit the cell’s receptors, but each animal’s receptors have a slightly different shape, which means the spike protein binds to some better than others.

To examine whether these differences between SARS-CoV-2 and RaTG13 were involved in the adaptation of SARS-CoV-2 to humans, scientists swapped these regions and examined how well these resulting spike proteins bound human ACE2 receptors – using a method that does not involve using live virus.

The results, published in the journal PLOS Biology, showed SARS-CoV-2 spikes containing RaTG13 regions were unable to bind to human ACE2 receptors effectively, while the RaTG13 spikes containing SARS-CoV-2 regions could bind more efficiently to human receptors – although not to the same level as the unedited SARS-CoV-2 spike protein. This potentially indicates that similar changes in the SARS-CoV-2 spike protein occurred historically, which may have played a key role in allowing the virus to jump the species barrier.

Researchers also investigated whether the SARS-CoV-2 spike protein could bind to the ACE2 receptors from 22 different animals to ascertain which of these, if any, may be susceptible to infection. They demonstrated that bat and bird receptors made the weakest interactions with SARS-CoV-2. The lack of binding to bat receptors adds weight to the evidence that SARS-CoV-2 likely adapted its spike protein when it jumped from bats into people, possibly via an intermediate host.

Dog, cat, and cattle ACE2 receptors were identified as the strongest interactors with the SARS-CoV-2 spike protein. Efficient entry into cells could mean that infection may be more easily established in these animals, although receptor binding is only the first step in viral transmission between different animal species.

“As we saw with the outbreaks in Danish mink farms last year, it’s essential to understand which animals can be infected by SARS-CoV-2 and how mutations in the viral spike protein change its ability to infect different species,” said Graham.

An animal’s susceptibility to infection and its subsequent ability to infect others is reliant on a range of factors – including whether SARS-CoV-2 is able to replicate once inside cells, and the animal’s ability to fight off the virus. Further studies are needed to understand whether livestock and companion animals could be receptive to COVID-19 infection from humans and act as reservoirs for this disease.

This research was funded by the Medical Research Council, the Biotechnology and Biological Sciences Research Council and Innovate UK – all part of UK Research and Innovation; the Royal Society and Wellcome.

Reference
Conceicao, C et al. The SARS-CoV-2 Spike protein has a broad tropism for mammalian ACE2 proteins. PLOS Biology; Dec 2020; DOI: 10.1371/journal.pbio.3001016

Adapted from a press release by the Pirbright Institute

‘Virtual Biopsies’ Could Replace Tissue Biopsies in Future Thanks to Technique Developed by Cambridge Scientists

Image showing individual and combined scans
Credit: Evis Sala

A new advanced computing technique that uses routine medical scans to enable doctors to take fewer, more accurate tumour biopsies has been developed by cancer researchers at the University of Cambridge. This is an important step towards precision tissue sampling for cancer patients to help select the best treatment. In future, the technique could even replace clinical biopsies with ‘virtual biopsies’, sparing patients invasive procedures.

 

This study provides an important milestone towards precision tissue sampling. We are truly pushing the boundaries in translating cutting edge research to routine clinical care

Evis Sala

The research published in European Radiology shows that combining computed tomography (CT) scans with ultrasound images creates a visual guide for doctors to ensure they sample the full complexity of a tumour with fewer targeted biopsies.

Capturing the patchwork of different types of cancer cell within a tumour – known as tumour heterogeneity – is critical for selecting the best treatment because genetically-different cells may respond differently to treatment.

Most cancer patients undergo one or several biopsies to confirm diagnosis and plan their treatment. But because this is an invasive clinical procedure, there is an urgent need to reduce the number of biopsies taken and to make sure biopsies accurately sample the genetically-different cells in the tumour, particularly for ovarian cancer patients.

High-grade serous ovarian (HGSO) cancer, the most common type of ovarian cancer, is referred to as a ‘silent killer’ because early symptoms can be difficult to pick up. By the time the cancer is diagnosed, it is often at an advanced stage, and survival rates have not changed much over the last 20 years.

But late diagnosis isn’t the only problem. HGSO tumours tend to have a high level of tumour heterogeneity and patients with more genetically-different patches of cancer cells tend to have a poorer response to treatment.

Professor Evis Sala from the Department of Radiology, co-lead of the CRUK Cambridge Centre Advanced Cancer Imaging Programme, leads a multi-disciplinary team of radiologists, physicists, oncologists and computational scientists using innovative computing techniques to reveal tumour heterogeneity from standard medical images. This new study, led by Professor Sala, involved a small group of patients with advanced ovarian cancer who were due to have ultrasound-guided biopsies prior to starting chemotherapy.

For the study, the patients first had a standard-of-care CT scan. A CT scanner uses x-rays and computing to create a 3D image of the tumour from multiple image ‘slices’ through the body.

The researchers then used a process called radiomics – using high-powered computing methods to analyse and extract additional information from the data-rich images created by the CT scanner – to identify and map distinct areas and features of the tumour. The tumour map was then superimposed on the ultrasound image of the tumour and the combined image used to guide the biopsy procedure.
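The core idea of radiomics can be sketched with a few first-order features computed over a region of interest. The pixel values below are invented, and real pipelines extract hundreds of texture and shape features with specialised tooling, but even histogram entropy gives a crude heterogeneity score: a uniform region scores low, a mixed region high.

```python
import math
from collections import Counter

def first_order_features(roi_pixels, bins=8, lo=0, hi=256):
    """Compute simple first-order radiomic features from the intensity
    values inside a region of interest (ROI)."""
    n = len(roi_pixels)
    mean = sum(roi_pixels) / n
    variance = sum((x - mean) ** 2 for x in roi_pixels) / n
    # Shannon entropy of the binned intensity histogram.
    width = (hi - lo) / bins
    counts = Counter(min(int((x - lo) / width), bins - 1) for x in roi_pixels)
    entropy = -sum((c / n) * math.log2(c / n) for c in counts.values())
    return {"mean": mean, "variance": variance, "entropy": entropy}

uniform_roi = [100] * 64              # homogeneous tumour region
mixed_roi = [40, 90, 140, 190] * 16   # heterogeneous tumour region
uniform_features = first_order_features(uniform_roi)
mixed_features = first_order_features(mixed_roi)
```

Mapping such features back onto the image volume is what produces the distinct tumour ‘habitats’ that can then guide where to place a biopsy needle.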

By taking targeted biopsies using this method, the research team reported that the diversity of cancer cells within the tumour was successfully captured.

Co-first author Dr Lucian Beer, from the Department of Radiology and CRUK Cambridge Centre Ovarian Cancer Programme, said of the results: “Our study is a step forward to non-invasively unravel tumour heterogeneity by using standard-of-care CT-based radiomic tumour habitats for ultrasound-guided targeted biopsies.”

Co-first author Paula Martin-Gonzalez, from the Cancer Research UK Cambridge Institute and CRUK Cambridge Centre Ovarian Cancer Programme, added: “We will now be applying this method in a larger clinical study.”

Professor Sala said: “This study provides an important milestone towards precision tissue sampling. We are truly pushing the boundaries in translating cutting edge research to routine clinical care.”

Fiona Barve (56) is a science teacher who lives near Cambridge. She was diagnosed with ovarian cancer in 2017 after visiting her doctor with abdominal pain. She was diagnosed with stage 4 ovarian cancer and immediately underwent surgery and a course of chemotherapy. Since March 2019 she has been cancer free and is now back to teaching three days a week.

“I was diagnosed at a late stage and I was fortunate my surgery, which I received within four weeks of being diagnosed, and chemotherapy worked for me. I feel lucky to be around,” said Barve.

“When you are first undergoing the diagnosis of cancer, you feel as if you are on a conveyor belt, every part of the journey being extremely stressful. This new enhanced technique will reduce the need for several procedures and allow patients more time to adjust to their circumstances. It will enable more accurate diagnosis with less invasion of the body and mind. This can only be seen as positive progress.”

This feasibility study, involving researchers from the Department of Radiology, CRUK Cambridge Institute, Addenbrooke’s Hospital, Cambridge University Hospitals NHS Foundation Trust, and collaborators at Canon, was facilitated through the CRUK Cambridge Centre Integrated Cancer Medicine programme.

The goal of Integrated Cancer Medicine is to revolutionise cancer treatment using complex data integration. Combining and integrating patient data from multiple sources – blood tests, biopsies, medical imaging, and genetic tests – can inform and predict the best treatment decisions for each individual patient.

The study was funded by Cancer Research UK and The Mark Foundation for Cancer Research.

Reference
Lucian Beer, Paula Martin-Gonzalez et al. Ultrasound-guided targeted biopsies of distinct CT based radiomic tumour habitats: proof of concept. European Radiology; 14 Dec 2020; DOI: 10.1007/s00330-020-07560-8

