
Astronomers Show How Planets Form In Binary Systems Without Getting Crushed


Artist’s impression of the planet around Alpha Centauri B
source: www.cam.ac.uk

 

Astronomers have developed the most realistic model to date of planet formation in binary star systems.

 

Planet formation in binary systems is more complicated, because the companion star acts like a giant eggbeater, dynamically exciting the protoplanetary disc

Roman Rafikov

The researchers, from the University of Cambridge and the Max Planck Institute for Extra-terrestrial Physics, have shown how exoplanets in binary star systems – such as the ‘Tatooine’ planets spotted by NASA’s Kepler Space Telescope – came into being without being destroyed in their chaotic birth environment.

They studied a type of binary system where the smaller companion star orbits the larger parent star approximately once every 100 years – our nearest neighbour, Alpha Centauri, is an example of such a system.

“A system like this would be the equivalent of a second Sun where Uranus is, which would have made our own solar system look very different,” said co-author Dr Roman Rafikov from Cambridge’s Department of Applied Mathematics and Theoretical Physics.

Rafikov and his co-author Dr Kedron Silsbee from the Max Planck Institute for Extra-terrestrial Physics found that for planets to form in these systems, the planetesimals – planetary building blocks which orbit around a young star – need to start off at least 10 kilometres in diameter, and the disc of dust and ice and gas surrounding the star within which the planets form needs to be relatively circular.

The research, which is published in Astronomy and Astrophysics, brings the study of planet formation in binaries to a new level of realism and explains how such planets, a number of which have been detected, could have formed.

Planet formation is believed to begin in a protoplanetary disc – made primarily of hydrogen, helium, and tiny particles of ices and dust – orbiting a young star. According to the current leading theory on how planets form, known as core accretion, the dust particles stick to each other, eventually forming larger and larger solid bodies. If the process stops early, the result can be a rocky Earth-like planet. If the planet grows bigger than Earth, then its gravity is sufficient to trap a large quantity of gas from the disc, leading to the formation of a gas giant like Jupiter.

“This theory makes sense for planetary systems formed around a single star, but planet formation in binary systems is more complicated, because the companion star acts like a giant eggbeater, dynamically exciting the protoplanetary disc,” said Rafikov.

“In a system with a single star the particles in the disc are moving at low velocities, so they easily stick together when they collide, allowing them to grow,” said Silsbee. “But because of the gravitational ‘eggbeater’ effect of the companion star in a binary system, the solid particles there collide with each other at much higher velocity. So, when they collide, they destroy each other.”

Many exoplanets have been spotted in binary systems, so the question is how they got there. Some astronomers have even suggested that perhaps these planets were floating in interstellar space and got sucked in by the gravity of a binary, for instance.

Rafikov and Silsbee carried out a series of simulations to help solve this mystery. They developed a detailed mathematical model of planetary growth in a binary that uses realistic physical inputs and accounts for processes that are often overlooked, such as the gravitational effect of the gas disc on the motion of planetesimals within it.

“The disc is known to directly affect planetesimals through gas drag, acting like a kind of wind,” said Silsbee. “A few years ago, we realised that in addition to the gas drag, the gravity of the disc itself dramatically alters the dynamics of the planetesimals, in some cases allowing planets to form even despite the gravitational perturbations due to the stellar companion.”

“The model we’ve built pulls together this work, as well as other previous work, to test the planet formation theories,” said Rafikov.

Their model found that planets can form in binary systems such as Alpha Centauri, provided that the planetesimals start out at least 10 kilometres across in size, and that the protoplanetary disc itself is close to circular, without major irregularities. When these conditions are met, the planetesimals in certain parts of the disc end up moving slowly enough relative to each other that they stick together instead of destroying each other.
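
To get a rough feel for why the 10-kilometre threshold and quiet disc conditions matter, consider a back-of-the-envelope comparison (a toy estimate with assumed round-number values, not part of the published model): a planetesimal can keep growing only while typical collision speeds stay comparable to its escape velocity, and a companion-stirred disc can easily exceed that.

```python
# Toy estimate (not the authors' model): compare a planetesimal's escape
# velocity with collision speeds stirred up by a companion star.
# All input values are illustrative assumptions.
import math

G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
RHO = 2000.0        # assumed planetesimal density, kg/m^3 (rock/ice mix)

def escape_velocity(diameter_km: float) -> float:
    """Escape velocity (m/s) of a uniform sphere of the assumed density."""
    r = diameter_km * 1e3 / 2.0
    mass = (4.0 / 3.0) * math.pi * r**3 * RHO
    return math.sqrt(2.0 * G * mass / r)

def stirred_speed(eccentricity: float, orbit_au: float) -> float:
    """Rough relative speed (m/s): eccentricity times the Keplerian speed."""
    v_kepler = 29.8e3 / math.sqrt(orbit_au)   # ~29.8 km/s at 1 au, Sun-like star
    return eccentricity * v_kepler

for d in (1.0, 10.0, 100.0):   # planetesimal diameters, km
    print(f"{d:5.0f} km body: v_esc ~ {escape_velocity(d):6.2f} m/s")

# An assumed companion-forced eccentricity of 0.01 at 2 au:
print(f"stirred collision speed ~ {stirred_speed(0.01, 2.0):6.1f} m/s")
```

On these assumed numbers the stirred collision speed (~200 m/s) dwarfs the ~5 m/s escape velocity of a 10 km body, which is why the quiet, near-circular disc conditions identified by the model are so important for growth.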

These findings lend support to a particular mechanism of planetesimal formation, called the streaming instability, being an integral part of the planet formation process. This instability is a collective effect, involving many solid particles in the presence of gas, that is capable of concentrating pebble-to-boulder sized dust grains to produce a few large planetesimals, which would survive most collisions.

The results of this work provide important insights for theories of planet formation around both binary and single stars, as well as for the hydrodynamic simulations of protoplanetary discs in binaries. In future, the model could also be used to explain the origin of the ‘Tatooine’ planets – exoplanets orbiting both components of a binary – about a dozen of which have been identified by NASA’s Kepler Space Telescope.

 

Reference:
Kedron Silsbee and Roman R. Rafikov. ‘Planet Formation in Stellar Binaries: Global Simulations of Planetesimal Growth.’ Astronomy and Astrophysics (2021). DOI: 10.1051/0004-6361/20214113


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. Images, including our videos, are Copyright ©University of Cambridge and licensors/contributors as identified.  All rights reserved. We make our image and video content available in a number of ways – as here, on our main website under its Terms and conditions, and on a range of channels including social media that permit your use and sharing of our content under their respective Terms.

Earth’s Interior is Swallowing Up More Carbon Than Thought

Alaska’s Pavlof Volcano: NASA’s View from Space
source: www.cam.ac.uk

 

Scientists from Cambridge University and NTU Singapore have found that slow-motion collisions of tectonic plates drag more carbon into Earth’s interior than previously thought.

 

We currently have a relatively good understanding of the surface reservoirs of carbon and the fluxes between them, but know much less about Earth’s interior carbon stores, which cycle carbon over millions of years

Stefan Farsang

They found that the carbon drawn into Earth’s interior at subduction zones – where tectonic plates collide and dive into Earth’s interior – tends to stay locked away at depth, rather than resurfacing in the form of volcanic emissions.

Their findings, published in Nature Communications, suggest that only about a third of the carbon drawn down beneath volcanic chains returns to the surface, in contrast to previous theories that what goes down mostly comes back up.

One of the solutions to tackle climate change is to find ways to reduce the amount of CO2 in Earth’s atmosphere. By studying how carbon behaves in the deep Earth, which houses the majority of our planet’s carbon, scientists can better understand the entire lifecycle of carbon on Earth, and how it flows between the atmosphere, oceans and life at the surface.

The best-understood parts of the carbon cycle are at or near Earth’s surface, but deep carbon stores play a key role in maintaining the habitability of our planet by regulating atmospheric CO2 levels. “We currently have a relatively good understanding of the surface reservoirs of carbon and the fluxes between them, but know much less about Earth’s interior carbon stores, which cycle carbon over millions of years,” said lead author Stefan Farsang, who conducted the research while a PhD student at Cambridge’s Department of Earth Sciences.

There are a number of ways for carbon to be released back to the atmosphere (as CO2) but there is only one path in which it can return to the Earth’s interior: via plate subduction. Here, surface carbon, for instance in the form of seashells and micro-organisms which have locked atmospheric CO2 into their shells, is channelled into Earth’s interior. Scientists had thought that much of this carbon was then returned to the atmosphere as CO2 via emissions from volcanoes. But the new study reveals that chemical reactions taking place in rocks swallowed up at subduction zones trap carbon and send it deeper into Earth’s interior – stopping some of it coming back to Earth’s surface.

The team conducted a series of experiments at the European Synchrotron Radiation Facility (ESRF). “The ESRF have world-leading facilities and the expertise that we needed to get our results,” said co-author Simon Redfern, Dean of the College of Science at NTU Singapore. “The facility can measure very low concentrations of these metals at the high pressure and temperature conditions of interest to us.” To replicate the high pressures and temperatures of subduction zones, they used a heated ‘diamond anvil cell’, in which extreme pressures are achieved by pressing two tiny diamond anvils against the sample.
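
A quick worked example helps make the diamond anvil figures concrete (the values below are assumed for illustration, not taken from the study): since pressure is force divided by area, even a modest force applied to a culet tens of micrometres across yields pressures of several gigapascals, comparable to those at subduction-zone depths.

```python
# Back-of-the-envelope illustration with assumed values (not the study's
# actual setup): pressure = force / area, so a small force on a tiny
# diamond culet yields enormous pressures.
import math

culet_diameter = 100e-6                       # assumed 100-micrometre culet
area = math.pi * (culet_diameter / 2) ** 2    # contact area, m^2

target_pressure = 5e9                         # 5 GPa, roughly subduction-zone range
force_needed = target_pressure * area         # newtons

print(f"culet area      : {area:.2e} m^2")
print(f"force for 5 GPa : {force_needed:.1f} N (~{force_needed / 9.81:.1f} kg-force)")
```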

The work supports growing evidence that carbonate rocks, which have the same chemical makeup as chalk, become less calcium-rich and more magnesium-rich when channelled deeper into the mantle. This chemical transformation makes carbonate less soluble – meaning it doesn’t get drawn into the fluids that supply volcanoes. Instead, the majority of the carbonate sinks deeper into the mantle where it may eventually become diamond.

“There is still a lot of research to be done in this field,” said Farsang. “In the future, we aim to refine our estimates by studying carbonate solubility in a wider temperature, pressure range and in several fluid compositions.”

The findings are also important for understanding the role of carbonate formation in our climate system more generally. “Our results show that these minerals are very stable and can certainly lock up CO2 from the atmosphere into solid mineral forms that could result in negative emissions,” said Redfern. The team have been looking into the use of similar methods for carbon capture, which moves atmospheric CO2 into storage in rocks and the oceans.

“These results will also help us understand better ways to lock carbon into the solid Earth, out of the atmosphere. If we can accelerate this process faster than nature handles it, it could prove a route to help solve the climate crisis,” said Redfern.

 

Reference:
Farsang, S., Louvel, M., Zhao, C. et al. Deep carbon cycle constrained by carbonate solubility. Nature Communications (2021). DOI: 10.1038/s41467-021-24533-7

Adapted from a news release by the ESRF



Scientists Can Detect Brain Tumours Using a Simple Urine or Blood Plasma Test

source: www.cam.ac.uk

 

Researchers from the Cancer Research UK Cambridge Institute have developed two tests that can detect the presence of glioma, a type of brain tumour, in patient urine or blood plasma.

 

The team say that a test for detecting glioma using urine is the first of its kind in the world.

Although the research, published in EMBO Molecular Medicine, is in its early stages and only a small number of patients were analysed, the team say their results are promising.

The researchers suggest that in the future, these tests could be used by GPs to monitor patients at high risk of brain tumours, which may be more convenient than the standard method of an MRI scan every three months.

When people have a brain tumour removed, the likelihood of it returning can be high, so they are monitored with an MRI scan every three months, which may be followed by a biopsy.

Blood tests for detecting different cancer types are a major focus of research for teams across the world, and there are some in use in the clinic. These tests are mainly based on finding mutated DNA, shed by tumour cells when they die, known as cell-free DNA (cfDNA).

However, detecting brain tumour cfDNA in the blood has historically been difficult because of the blood-brain barrier, which separates blood from the cerebrospinal fluid (CSF) that surrounds the brain and spinal cord, preventing the passage of cells and other particles, such as cfDNA.

Researchers have previously looked at detecting cfDNA in CSF, but the spinal taps needed to obtain it can be dangerous for people with brain tumours so are not appropriate for patient monitoring.

Scientists have known that cfDNA with similar mutations to the original tumour can be found in blood and other bodily fluids such as urine in very low levels, but the challenge has been developing a test sensitive enough to detect these specific mutations.

The researchers, led by Dr Florent Mouliere, who is based at the Rosenfeld Lab of the Cancer Research UK Cambridge Institute and at the Amsterdam UMC, and Dr Richard Mair, who is based at the Cancer Research UK Cambridge Institute and the University of Cambridge, developed two approaches in parallel to overcome the challenge of detecting brain tumour cfDNA.

The first approach works for patients who have previously had glioma removed and biopsied. The team designed a tumour-guided sequencing test that was able to look for the mutations found in the tumour tissue within the cfDNA in the patient’s urine, CSF, and blood plasma.

A total of eight patients who had suspected brain tumours based on MRIs were included in this part of the study. Tumour tissue samples were taken at their initial biopsies, alongside CSF, blood and urine samples.

By knowing where in the DNA strand to look, the researchers found that it was possible to find mutations even in the tiny amounts of cfDNA found in the blood plasma and urine.

The test was able to detect cfDNA in 7 out of 8 CSF samples, 10 out of 12 blood plasma samples, and 10 out of 16 urine samples.

For the second approach the researchers looked for other patterns in the cfDNA that could also indicate the presence of a tumour, without having to identify the mutations.

They analysed 35 samples from glioma patients, 27 people with non-malignant brain disorders, and 26 healthy people. They used whole genome sequencing, where all the cfDNA of the tumour is analysed, not just the mutations.

They found that the cfDNA fragments in blood plasma and urine samples from patients with brain tumours were different sizes from those in samples from patients without tumours. They then fed this data into a machine learning algorithm, which was able to successfully differentiate between the urine samples of people with and without glioma.
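
The summary above does not specify which algorithm was used, so the following is only a schematic sketch of the general approach (feature choices, numbers and synthetic data are all invented for illustration): summarise each sample's cfDNA fragment-length distribution as a histogram and train an off-the-shelf classifier on labelled samples.

```python
# Schematic sketch only: classify samples by cfDNA fragment-size profile.
# The study's actual features, algorithm and data are not reproduced here.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
BINS = np.arange(50, 451, 20)        # fragment-length bins, base pairs

def histogram_features(fragment_lengths: np.ndarray) -> np.ndarray:
    """Normalised fragment-length histogram for one sample."""
    counts, _ = np.histogram(fragment_lengths, bins=BINS)
    return counts / counts.sum()

def fake_sample(tumour: bool, n: int = 2000) -> np.ndarray:
    """Synthetic stand-in: tumour samples skew toward shorter fragments."""
    mean = 145 if tumour else 167    # assumed shift, illustration only
    return rng.normal(mean, 30, size=n)

# 35 glioma samples vs 53 controls, mirroring the cohort sizes in the text.
X = np.array([histogram_features(fake_sample(t))
              for t in [True] * 35 + [False] * 53])
y = np.array([1] * 35 + [0] * 53)    # 1 = glioma, 0 = control

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```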

The researchers say that while the machine learning test is cheaper and easier, and does not require a tissue biopsy from the tumour, it is less sensitive and less specific than the first tumour-guided sequencing approach.

MRIs are not invasive or expensive, but they do require a trip to the hospital, and the three-month gap between checks can be a regular source of anxiety for patients.

The researchers suggest that their tests could be used between MRI scans, and could ultimately be able to detect a returning brain tumour earlier.

The next stage of this research will see the team comparing both tests against MRI scans in a trial with brain tumour patients who are in remission, to see whether the tests can detect returning tumours at the same time as, or earlier than, the MRI. If the tests prove they can detect brain tumours earlier than an MRI, the researchers will look at how to adapt the tests so they could be offered in the clinic, which could be within the next ten years.

“We believe the tests we’ve developed could in the future be able to detect a returning glioma earlier and improve patient outcomes,” said Mair. “Talking to my patients, I know the three-month scan becomes a focal point for worry. If we could offer a regular blood or urine test, not only will you be picking up recurrence earlier, you can also be doing something positive for the patient’s mental health.”

Michelle Mitchell, Chief Executive of Cancer Research UK said, “While this is early research, it’s opened up the possibility that within the next decade we could be able to detect the presence of a brain tumour with a simple urine or blood test. Liquid biopsies are a huge area of research interest right now because of the opportunities they create for improved patient care and early diagnosis. It’s great to see Cancer Research UK researchers making strides in this important field.”

Sue Humphreys, a brain tumour patient from Walsall, said: “If these tests are found to be as accurate as the standard MRI for monitoring brain tumours, it could be life changing.

“If patients can be given a regular and simple test by their GP, it may help not only detect a returning brain tumour in its earliest stages, it can also provide the quick reassurance that nothing is going on, which is the main problem we all suffer from, the dreaded Scanxiety.

“The problem with three-monthly scans is that these procedures can get disrupted by other things going on, such as what we have seen with the Covid pandemic. As a patient, this causes worry as there is a risk that things may be missed, or delayed, and early intervention is the key to any successful treatment.”

 

Reference:
Florent Mouliere et al. ‘Fragmentation patterns and personalized sequencing of cell-free DNA in urine and plasma of glioma patients.’ EMBO Molecular Medicine (2021). DOI: 10.15252/emmm.202012881

Adapted from a Cancer Research UK press release.



Blushing Plants Reveal When Fungi Are Growing In Their Roots

Cells of roots colonised by fungi turn red
source: www.cam.ac.uk

 

Scientists have created plants whose cells and tissues ‘blush’ with beetroot pigments when they are colonised by fungi that help them take up nutrients from the soil.

 

We can now follow how the relationship between the fungi and plant root develops, in real-time, from the moment they come into contact.

Sebastian Schornack

This is the first time this vital, 400-million-year-old process has been visualised in real time in full root systems of living plants. Understanding the dynamics of plant colonisation by fungi could help to make food production more sustainable in the future.

Almost all crop plants form associations with a particular type of fungi – called arbuscular mycorrhiza fungi – in the soil, which greatly expand their root surface area. This mutually beneficial interaction boosts the plant’s ability to take up nutrients that are vital for growth.

The more nutrients plants obtain naturally, the less artificial fertilisers are needed. Understanding this natural process, as the first step towards potentially enhancing it, is an ongoing research challenge. Progress is likely to pay huge dividends for agricultural productivity.

In a study published in the journal PLOS Biology, researchers used the bright red pigments of beetroot – called betalains – to visually track soil fungi as they colonised plant roots in a living plant.

“We can now follow how the relationship between the fungi and plant root develops, in real-time, from the moment they come into contact. We previously had no idea about what happened because there was no way to visualise it in a living plant without the use of elaborate microscopy,” said Dr Sebastian Schornack, a researcher at the University of Cambridge’s Sainsbury Laboratory and joint senior author of the paper.

To achieve their results, the researchers engineered two model plant species – a legume and a tobacco plant – so that they would produce the highly visible betalain pigments when arbuscular mycorrhiza fungi were present in their roots. This involved combining the control regions of two genes activated by mycorrhizal fungi with genes that synthesise red-coloured betalain pigments.

The plants were then grown in a transparent structure so that the root system was visible, and images of the roots could be taken with a flatbed scanner without disturbing the plants.

Using their technique, the researchers could select red pigmented parts of the root system to observe the fungus more closely as it entered individual plant cells and formed elaborate tree-like structures – called arbuscules – which grow inside the plant’s roots. Arbuscules take up nutrients from the soil that would otherwise be beyond the reach of the plant.
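
The paper's image-analysis pipeline is not described here, but the core idea lends itself to a simple illustration (the thresholds and file name below are assumptions, not values from the study): on a flatbed scan, ‘blushing’ root regions can be flagged by looking for pixels where red clearly dominates green and blue.

```python
# Illustrative sketch only: estimate the red ("blushing") fraction of a
# root scan. Thresholds are assumptions, not values from the paper.
import numpy as np
from PIL import Image

def betalain_fraction(path: str, red_margin: int = 40, min_red: int = 80) -> float:
    """Fraction of pixels where the red channel clearly dominates."""
    rgb = np.asarray(Image.open(path).convert("RGB"), dtype=np.int16)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    reddish = (r > min_red) & (r - g > red_margin) & (r - b > red_margin)
    return float(reddish.mean())

# Hypothetical usage on one scan of the transparent growth box:
# print(f"~{betalain_fraction('root_scan.png'):.1%} of pixels look colonised")
```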

Other methods exist to visualise this process, but these involve digging up and killing the plant and the use of chemicals or expensive microscopy. This work makes it possible for the first time to watch by eye and with simple imaging how symbiotic fungi start colonising living plant roots, and inhabit parts of the plant root system over time.

“This is an exciting new tool to visualise this, and other, important plant processes. Beetroot pigments are a distinctive colour, so they’re very easy to see. They also have the advantage of being natural plant pigments, so they are well tolerated by plants,” said Dr Sam Brockington, a researcher in the University of Cambridge’s Department of Plant Sciences, and joint senior author of the paper.

Mycorrhiza fungi are attracting growing interest in agriculture. This new technique provides the ability to ‘track and trace’ the presence of symbiotic fungi in soils from different sources and locations. The researchers say this will enable the selection of fungi that colonise plants fastest and provide the biggest benefits in agricultural scenarios.

Understanding and exploiting the dynamics of plant root system colonisation by fungi has potential to enhance future crop production in an environmentally sustainable way. If plants can take up more nutrients naturally, this will reduce the need for artificial fertilisers – saving money and reducing associated water pollution.

This research was funded by the Biotechnology and Biological Sciences Research Council, Gatsby Charitable Foundation, Royal Society, and Natural Environment Research Council.

Reference
Timoneda, A., Yunusov, T. et al. ‘MycoRed: Betalain pigments enable in vivo real-time visualisation of arbuscular mycorrhizal colonisation.’ PLOS Biology (July 2021). DOI: 10.1371/journal.pbio.3001326



Smartphone Screens Effective Sensors For Soil or Water Contamination

 

The touchscreen technology used in billions of smartphones and tablets could also be used as a powerful sensor, without the need for any modifications.

 

Instead of interpreting a signal from your finger, what if we could get a touchscreen to read electrolytes, since these ions also interact with the electric fields?

Ronan Daly

Researchers from the University of Cambridge have demonstrated how a typical touchscreen could be used to identify common ionic contaminants in soil or drinking water by dropping liquid samples on the screen, the first time this has been achieved. The sensitivity of the touchscreen sensor is comparable to typical lab-based equipment, which would make it useful in low-resource settings.

The researchers say their proof of concept could one day be expanded for a wide range of sensing applications, including for biosensing or medical diagnostics, right from the phone in your pocket. The results are reported in the journal Sensors and Actuators B.

Touchscreen technology is ubiquitous in our everyday lives: the screen on a typical smartphone is covered in a grid of electrodes, and when a finger disrupts the local electric field of these electrodes, the phone interprets the signal.

Other teams have used the computational power of a smartphone for sensing applications, but these have relied on the camera or peripheral devices, or have required significant changes to be made to the screen.

“We wanted to know if we could interact with the technology in a different way, without having to fundamentally change the screen,” said Dr Ronan Daly from Cambridge’s Institute of Manufacturing, who co-led the research. “Instead of interpreting a signal from your finger, what if we could get a touchscreen to read electrolytes, since these ions also interact with the electric fields?”

The researchers started with computer simulations, and then validated their simulations using a stripped-down, standalone touchscreen similar to those used in phones and tablets, provided by two UK manufacturers.

The researchers pipetted different liquids onto the screen to measure a change in capacitance and recorded the measurements from each droplet using the standard touchscreen testing software. Ions in the fluids all interact with the screen’s electric fields differently depending on the concentration of ions and their charge.

“Our simulations showed where the electric field interacts with the fluid droplet. In our experiments, we then found a linear trend for a range of electrolytes measured on the touchscreen,” said first author Sebastian Horstmann, a PhD candidate at IfM. “The sensor saturates at an anion concentration of around 500 micromolar, which can be correlated to the conductivity measured alongside. This detection window is ideal to sense ionic contamination in drinking water.”
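
As a hedged sketch of what such a calibration might look like in practice (the numbers below are invented; real readings would come from the touchscreen test software), one can fit a straight line through measurements taken below the reported ~500 micromolar saturation point and invert it to estimate an unknown concentration:

```python
# Sketch of a linear calibration in the pre-saturation regime.
# Illustrative numbers only, not data from the study.
import numpy as np

# Assumed calibration points: anion concentration (micromolar) versus the
# change in measured capacitance (arbitrary controller units).
conc = np.array([0.0, 50.0, 100.0, 200.0, 300.0, 400.0, 500.0])
delta_cap = np.array([0.0, 1.1, 2.0, 4.1, 5.9, 8.2, 9.8])

slope, intercept = np.polyfit(conc, delta_cap, 1)   # least-squares line

def estimate_concentration(reading: float) -> float:
    """Invert the fit; only meaningful below the ~500 uM saturation point."""
    return (reading - intercept) / slope

print(f"fit: delta_cap ~ {slope:.4f} * conc + {intercept:.3f}")
print(f"a reading of 5.0 units -> ~{estimate_concentration(5.0):.0f} uM")
```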

One early application for the technology could be to detect arsenic contamination in drinking water. Arsenic is a common contaminant found in groundwater in many parts of the world, but most municipal water systems screen for it and filter it out before it reaches a household tap. However, in parts of the world without water treatment plants, arsenic contamination is a serious problem.

“In theory, you could add a drop of water to your phone before you drink it, in order to check that it’s safe,” said Daly.

At the moment, the sensitivity of phone and tablet screens is tuned for fingers, but the researchers say the sensitivity could be changed in a certain part of the screen by modifying the electrode design in order to be optimised for sensing.

“The phone’s software would need to communicate with that part of the screen to deliver the optimum electric field and be more sensitive for the target ion, but this is achievable,” said Professor Lisa Hall from Cambridge’s Department of Chemical Engineering and Biotechnology, who co-led the research. “We’re keen to do much more on this – it’s just the first step.”

While it’s now possible to detect ions using a touchscreen, the researchers hope to further develop the technology so that it can detect a wide range of molecules. This could open up a huge range of potential health applications.

“For example, if we could get the sensitivity to a point where the touchscreen could detect heavy metals, it could be used to test for things like lead in drinking water. We also hope in the future to deliver sensors for home health monitoring,” said Daly.

“This is a starting point for broader exploration of the use of touchscreen sensing in mobile technologies and the creation of tools that are accessible to everyone, allowing rapid measurements and communication of data,” said Hall.

 

Reference:
Sebastian Horstmann, Cassi J Henderson, Elizabeth A H Hall and Ronan Daly. ‘Capacitive touchscreen sensing – a measure of electrolyte conductivity.’ Sensors and Actuators B (2021). DOI: 10.1016/j.snb.2021.130318



Biological ‘Fingerprints’ of Long COVID in Blood Could Lead To Diagnostic Test, Say Cambridge Scientists

Tired looking woman
source: www.cam.ac.uk

Markers in our blood – ‘fingerprints’ of infection – could help identify individuals who have been infected by SARS-CoV-2, the coronavirus that causes COVID-19, several months after infection even if the individual had only mild symptoms or showed no symptoms at all, say Cambridge researchers.

 

Because we currently have no reliable way of diagnosing long COVID, the uncertainty can cause added stress to people who are experiencing potential symptoms. If we can say to them ‘yes, you have a biomarker and so you have long COVID’, we believe this will help allay some of their fears and anxieties

Nyarie Sithole

The team has received funding from the National Institute for Health Research to develop a test that could complement existing antibody tests. They also aim to use similar biological signatures to develop a test to diagnose and monitor long COVID.

While most people recover from COVID-19 in a matter of days or weeks, around one in ten people go on to develop symptoms that can last for several months. This can be the case irrespective of the severity of their COVID-19 – even individuals who were asymptomatic can experience so-called ‘long COVID’.

Diagnosing long COVID can be a challenge, however. A patient with asymptomatic or mild disease may not have taken a PCR test at the time of infection – the gold standard for diagnosing COVID-19 – and so has never had a confirmed diagnosis. Even antibody tests – which look for antibodies produced in response to infection – are estimated to miss around 30% of cases, particularly among those who have had only mild disease and/or are beyond six months post-initial illness.

A team at the University of Cambridge and Cambridge University Hospital NHS Foundation Trust has received £370,000 from the National Institute for Health Research to develop a COVID-19 diagnostic test that would complement existing antibody tests and a test that could objectively diagnose and monitor long COVID.

The research builds on a pilot project supported by the Addenbrooke’s Charitable Trust. The team has been recruiting patients from the Long COVID Clinic established in May 2020 at Addenbrooke’s Hospital, part of Cambridge University Hospital NHS Foundation Trust.

During the pilot, the team recruited 85 patients to the Cambridge NIHR COVID BioResource, which collects blood samples from patients when they are first diagnosed and then at follow-up intervals over several months. They now hope to expand their cohort to 500 patients, recruited from Cambridgeshire and Peterborough.

In their initial findings, the team identified a biomarker – a biological fingerprint – in the blood of patients who had previously had COVID-19. This biomarker is a molecule known as a cytokine produced by T cells in response to infection. As with antibodies, this biomarker persists in the blood for a long time after infection. The team plans to publish their results shortly.

Dr Mark Wills from the Department of Medicine at the University of Cambridge, who co-leads the team, said: “We need a reliable and objective way of saying whether someone has had COVID-19. Antibodies are one sign we look for, but not everyone makes a very strong response and this can wane over time and become undetectable.

“We’ve identified a cytokine that is also produced in response to infection by T cells and is likely to be detectable for several months – and potentially years – following infection. We believe this will help us develop a much more reliable diagnostic for those individuals who did not get a diagnosis at the time of infection.”

By following patients for up to 18 months post-infection, the team hopes to address several questions, including whether immunity wanes over time. This will be an important part of helping understand whether people who have been vaccinated will need to receive boosters to keep them protected.

As part of their pilot study, the team also identified a particular biomarker found in patients with long COVID. Their work suggests these patients produce a second type of cytokine, which persists in patients with long COVID compared to those that recover quickly and might be one of the drivers behind the many symptoms that patients experience. This might therefore prove to be useful for diagnosing long COVID.

Dr Nyarie Sithole, also from the Department of Medicine at the University of Cambridge, who co-leads the team and helps to manage long COVID patients, said:  “Because we currently have no reliable way of diagnosing long COVID, the uncertainty can cause added stress to people who are experiencing potential symptoms. If we can say to them ‘yes, you have a biomarker and so you have long COVID’, we believe this will help allay some of their fears and anxieties.

“There is anecdotal evidence that patients see an improvement in symptoms of long COVID once they have been vaccinated – something that we have seen in a small number of patients in our clinic. Our study will allow us to see how this biomarker changes over a longer period of time in response to vaccination.”

At the moment, the team is using the tests for research purposes, but by increasing the size of their study cohort and carrying out further work, they hope to adapt and optimise the tests so they can be scaled up, sped up and used by clinical diagnostic labs.

As well as developing a reliable test, the researchers hope their work will help provide an in-depth understanding of how the immune system responds to coronavirus infection – and why it triggers long COVID in some people.

Dr Sithole added: “One of the theories of what’s driving long COVID is that it’s a hyperactive immune response – in other words, the immune system switches on at the initial infection and for some reason never switches off or never goes back to the baseline. As we’ll be following our patients for many months post-infection, we hope to better understand whether this is indeed the case.”

In addition, having a reliable biomarker could help in the development of new treatments against COVID. Clinical trials require an objective measure of whether a drug is effective. Changes in – or the disappearance of – long-COVID-related cytokine biomarkers with corresponding symptom improvement in response to drug treatment would suggest that a treatment intervention is working.



University of Cambridge Launches Roadmap To Support Future Growth of Life Sciences Cluster

Cambridge Biomedical Campus
source: www.cam.ac.uk

 

Connect: Health Tech, the University of Cambridge Enterprise Zone, has today launched a roadmap, ‘Creating a University Enterprise Zone for Cambridge across the life and physical sciences’, that examines the challenges faced in futureproofing and sustaining the growth of the life sciences cluster to maintain Cambridge as a global centre of excellence for health tech.

 

Cambridge has a deep and rich history of discovery and collaboration, and its interdisciplinary environment is the perfect testbed for new models of innovation in the life sciences

Andy Neely

The roadmap sets out a clear plan to create a bridge between two of Cambridge’s historical strengths — biomedical research and cutting-edge technology — and bring these specialisms together to develop new treatments and health tech with real world applications. The solutions in the roadmap are scalable beyond Cambridge and also applicable to other disciplines and sectors.

Professor Andy Neely, Pro-Vice-Chancellor for Enterprise and Business Relations at the University of Cambridge, said: “Cambridge has a deep and rich history of discovery and collaboration, and its interdisciplinary environment is the perfect testbed for new models of innovation in the life sciences. Our roadmap sets out a plan to do just that and will ensure that Cambridge remains a global leader in health technology into the next generation.

“This will require us to pioneer new ways of working and creating connections between different institutions across disciplines, be they academic or private enterprise. Such a model has been proven to work at a small scale – our proposal in the roadmap is to scale this up and apply it across the cluster and beyond.”

The University sits at the heart of the so-called ‘Cambridge cluster’, in which more than 5,300 knowledge-intensive firms employ more than 67,000 people and generate £18 billion in turnover. Cambridge has the highest number of patent applications per 100,000 residents in the UK.

The mission of the University is to contribute to society through the pursuit of education, learning and research at the highest international levels of excellence. This includes cultivating and delivering excellent research and world-leading innovation and training of the next generation of highly skilled researchers and entrepreneurs, thereby underpinning the UK’s economic growth and competitiveness.

Professor Tony Kouzarides, Director of the Milner Therapeutics Institute at the University of Cambridge, said: “The pandemic has clearly shown the importance of rapid innovation in healthcare. We are determined to harness the power of innovation, creativity and collaboration in Cambridge, and apply this towards solving some of the biggest medical challenges facing the country, and the world.”

The Connect: Health Tech roadmap is a result of consultation with major stakeholders and a series of road-mapping workshops with the Cambridge community. It aims to shape the future success of the Cambridge cluster in health tech through a supportive and dynamic ecosystem that aligns with the needs of the community.

The roadmap includes ambitious steps to build strong foundations for the Cambridge cluster for the next 20 years, and will support the region’s post-pandemic economic recovery by bringing cutting-edge research, businesses and innovators together to be better prepared and connected for the future. Connect: Health Tech will also increase access to the Cambridge ecosystem, extending its reach and helping to level up growth and investment across the East of England and the Oxford-Cambridge Arc.

One of the major recommendations in the report is to create and foster connectivity at the interface between medicine and technology and across sectors. This recommendation has been piloted by expanding the Cambridge cluster from a physical community to a digital one.

The COVID-19 pandemic has required the creation of an innovative model of access and navigation to Cambridge. The digital platform simplifies navigation of the Cambridge research community and enables new companies based all over the world to access expertise and knowledge across the University, with the aim of increasing inward investment in the life sciences. It also pilots an approach to navigation and connectivity that can be scaled up across the Arc and the UK. This new way of working will speed up the development of new healthcare innovations and technologies that the NHS will use in years to come.

Connect: Health Tech is a Cambridge University initiative funded by Research England. Connect: Health Tech UEZ has been created to build a highly effective interdisciplinary bridge between two Cambridge research hubs and beyond: the West science and technology hub, anchored at the Maxwell Centre, and the South biomedical hub, anchored at the Milner Therapeutics Institute. The bridge will bring together and integrate a community from across the University, research institutes, the NHS, industry, investors, and local and national Government, with a focus on medtech, digital health and therapeutics, to create opportunities that will transform ideas at the interface between medicine and technology into reality.

Read Creating a University Enterprise Zone for Cambridge across the life and physical sciences



Top UK Organisations Release Annual Statistics For Use Of Animals In Research


source: www.cam.ac.uk

 

The ten organisations in Great Britain that carry out the highest number of animal procedures – those used in medical, veterinary and scientific research – have today released their annual statistics.

 

We always aim to use as few animals as possible, refining our research and actively looking for ways of replacing their use.

Martin Vinnell

This is to coincide with the Home Office’s publication of Great Britain’s statistics for animals used in research in 2020.

These ten organisations carried out 1,343,893 procedures, 47% or nearly half of the 2,883,310 procedures carried out in Great Britain in 2020. More than 99% of these 1,343,893 procedures were carried out in rodents or fish.

The statistics are freely available on the organisations’ websites as part of their ongoing commitment to transparency and openness around the use of animals in research.

The ten organisations are listed below alongside the total number of procedures that they carried out in 2020. This is the sixth consecutive year organisations have come together to publicise their collective statistics and examples of their research.

Organisation Number of Procedures
The Francis Crick Institute 183,811
University of Cambridge 177,219
Medical Research Council 173,637
University of Oxford 169,511
University of Edinburgh 151,669
UCL 142,988
University of Glasgow 102,526
University of Manchester 93,448
King’s College London 85,414
Imperial College London 63,670
TOTAL 1,343,893
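
The headline figures can be checked directly from the table (a quick arithmetic verification, not new analysis):

```python
# Quick arithmetic check of the table above.
procedures = {
    "The Francis Crick Institute": 183_811,
    "University of Cambridge": 177_219,
    "Medical Research Council": 173_637,
    "University of Oxford": 169_511,
    "University of Edinburgh": 151_669,
    "UCL": 142_988,
    "University of Glasgow": 102_526,
    "University of Manchester": 93_448,
    "King's College London": 85_414,
    "Imperial College London": 63_670,
}
total = sum(procedures.values())
gb_total = 2_883_310                 # all GB procedures in 2020

print(total)                         # 1343893, matching the stated TOTAL
print(f"{total / gb_total:.1%}")     # ~46.6%, i.e. "47% or nearly half"
```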

A further breakdown of Cambridge’s numbers, including the number of procedures by species and detail of the levels of severity, can be found on our animal research pages.

Animal research has been essential for developing lifesaving vaccines and treatments for Covid-19. Ferrets and macaque monkeys were used to test the safety and efficacy of Covid-19 vaccines, including the successful Oxford / AstraZeneca vaccine. Hamsters are being used to develop Covid-19 treatment strategies as they display a more severe form of the disease than ferrets and monkeys. Guinea pigs have also been used in regulatory research to batch test vaccine potency.

Despite all this research to develop vaccines and treatments for Covid-19, the majority of UK research facilities carried out significantly less research than usual due to the various national lockdowns. Therefore, the 2020 figures cannot be reasonably compared with previous statistics.

All organisations are committed to the ‘3Rs’ of replacement, reduction and refinement. This means avoiding or replacing the use of animals where possible; minimising the number of animals used per experiment and optimising the experience of the animals to improve animal welfare. However, as institutions expand and conduct more research, the total number of animals used can rise even if fewer animals are used per study.

All organisations listed are signatories to the Concordat on Openness on Animal Research in the UK, a commitment to be more open about the use of animals in scientific, medical and veterinary research in the UK. More than 120 organisations have signed the Concordat including UK universities, medical research charities, research funders, learned societies and commercial research organisations.

Wendy Jarrett, Chief Executive of Understanding Animal Research, which developed the Concordat on Openness, said:

“Animal research has been essential to the development and safety testing of lifesaving COVID-19 vaccines and treatments. Macaque monkeys and ferrets have been used to develop vaccines, including the Oxford / AstraZeneca vaccine, hamsters are being used to develop treatments, and guinea pigs are used to quality-check each batch of vaccines.

“Animal testing provided scientists with initial data that the vaccines were effective and safe enough to move into human clinical trials. During these trials, thousands more humans than animals were used to test how effective and safe the vaccines were in people. The pandemic has led to increased public interest in the way vaccines and medicines are developed and UAR has worked with research institutions and funding bodies throughout the UK to develop resources that explain to the public how animals have been used in this critical research.”

University of Cambridge Establishment Licence Holder Dr Martin Vinnell said:

“Animal research currently plays an essential role in our understanding of health and disease and in the development of modern medicines and surgical techniques. Without the use of animals, we would not have many of the modern medicines, antibiotics, vaccines and surgical techniques we take for granted in both human and veterinary medicine.

“We always aim to use as few animals as possible, refining our research and actively looking for ways of replacing their use, for example in the development of ‘mini-organs’ grown from human cells, which can be used to model disease.”

Adapted from a press release by Understanding Animal Research.

Find out more

A team in the University of Cambridge’s Department of Engineering is developing implantable devices to bypass nerve damage and restore movement to paralysed limbs.

“Our aim is to make muscles wireless by intercepting electrical signals from the brain before they enter the damaged nerve and sending them directly to the target muscles via radio waves,” says Sam Hilton, a Research Assistant in the team.

The procedure has been tested and refined in computer simulations, and on cells grown in the lab. But before it can be tested in humans there is another important step: testing its safety in living rats. Avoiding animal testing entirely would place an untenable risk on the first human recipients of this new device. All the experiments are carefully designed to ensure that just enough animals are used to produce convincing data, without resulting in unnecessary excess.

By working out how complex microelectronics can interface with living tissue in a very precise and controlled way, this work has potential to improve or restore movement in patients suffering severe nerve damage – improving their quality of life and easing the burden on our healthcare services.



Women Economists Underrepresented ‘At Every Level’ In UK Academia – Report

 

New research shows the gender gap in the teaching and study of economics is still dramatic and actually getting worse. Economists argue that this is not just a problem for the discipline, but for society as a whole.

 

Unless economists are diverse, we cannot hope to build a complete understanding of the economy, and, with it, formulate the right kinds of policies

Victoria Bateman

Women are underrepresented “at almost every level” within the discipline of economics at UK universities, according to a new report co-authored by a Cambridge economist.

Dr Victoria Bateman says that her research for the Royal Economics Society (RES) found signs of “stagnation and retreat” in the closing of gender gaps across the study of economics – with female intake (relative to male) actually falling at both undergraduate and master’s levels over the last two decades.

Published today, the report ‘Gender Imbalance in UK Economics’ marks 25 years since the establishment of the RES Women’s Committee, which was set up to monitor and advance the representation of women in UK economics.

“The economy affects everyone, and economists need to represent us all,” said Bateman, an Economics Fellow at Gonville and Caius College. “If they don’t, that’s a major barrier to building a solid understanding of the economy.”

“Across all students, from undergraduate to PhD, there are twice as many men studying economics as there are women in UK universities. While in many respects the discipline of economics has come a long way in the 21st century, the gender gap is clearly still real, persistent and in some ways getting worse.”

Bateman and colleagues argue that attracting, retaining and promoting female economists is a “particular problem” within UK academia when compared to areas of government and third sector organisations such as think tanks.

Only a quarter (26%) of economists working in UK academia are female, and only 15% of economics professors are women, compared to 38% of the economists at the UK Treasury and 44% of researchers at economic think tanks.

Among UK students entering the discipline, the gender gap has actually widened since 2002, when 31% of economics undergraduates and 37% of master’s students were women. By 2018, this had fallen to 27% and 31% respectively. Bateman says these statistics show that the closure of the gender gap in economics “isn’t simply a matter of time”.

“Only a third of economics lecturers in the UK are women, and just 15% of economics professors,” said report co-author Dr Erin Hengel, who received her PhD in economics from Cambridge before going on to lecture at the University of Liverpool.

“While these figures are better than they were 25 years ago, the improving trend has levelled off. It appears that progress is starting to slow far before we reach any kind of gender parity.”

When the report’s authors factored in ethnicity, the percentage of female students was higher. In 2018, a third (33%) of Black economics undergraduates and 31% of Asian ethnicity undergraduates were women, compared to a quarter (25%) of White students.

However, women from ethnic minority backgrounds are not staying in academic economics. The report also found that at PhD level, the proportion of women is ten percentage points lower among minority candidates than white candidates.

Perhaps startlingly, the report found that between 2012 and 2018 there was not a single Black woman employed as a professor of economics anywhere in the UK.

Bateman says she hopes the new report will serve as a “call to arms” for the discipline of economics. “We are calling on universities to ask themselves why so few UK women are attracted to studying and researching the economy and why, even when they are, they do not stay,” she said.

Bateman’s 2019 book The Sex Factor showed how the status and freedom of women are central to prosperity, and that ‘gender blindness’ in economics has left the discipline wide of the mark on everything from poverty and inequality to understanding cycles of boom and bust.

“Unless economists are diverse, we cannot hope to build a complete understanding of the economy, and, with it, formulate the right kinds of policies,” Bateman added.

 



Trinity Challenge Announces Inaugural Winners


Collage of Trinity Challenge finalists

 

The Trinity Challenge has announced the winners of its inaugural competition, and is investing a £5.7 million (US$8 million) charitable pledged prize fund in one grand prize winner, two second-prize winners, and five third-prize winners.

 

While others talked, we took action. The solutions we have discovered in the course of the Challenge will be a link between systems and countries

Dame Sally Davies

The eight winners have been selected by an international panel of expert judges, out of a total of 340 applications from 61 countries. The competition has seen unprecedented collaborations between the private, public, charitable and academic sectors, and will drive a step-change in using data and analytics for pandemic preparedness.

The University of Cambridge joined a coalition of some of the world’s leading businesses and academic and tech institutions to launch The Trinity Challenge in September 2020. The global challenge, convened by Dame Sally Davies, Master of Trinity College, provides a £10m prize fund for breakthrough solutions to make sure one billion more people are better protected against health emergencies.

Participatory One Health Disease Detection (PODD), which empowers farmers to identify and report zoonotic diseases that could potentially pass from animals to humans, has been named the grand prize winner at the inaugural awards ceremony. The organisation is being awarded £1.3 million (US$1.8 million) in pledged funding.

Led by Susumpat Patipow, General Director at OpenDream, PODD has developed a platform for livestock owners to report suspected animal illness, and in return receive veterinary care to improve animal health. If it appears a disease outbreak is likely, local health officials will quarantine the sick animals, saving the remaining livestock and possibly preventing the next COVID-19-type outbreak.

Having already achieved significant success in Thailand, with a network of 20,000 farmers helping to detect and control disease outbreaks, PODD is looking to expand its operations to Cambodia, India, Indonesia, Laos, Uganda and Vietnam over the next three years.

BloodCounts! – an international consortium of scientists led by Professor Carola-Bibiane Schönlieb from Cambridge’s Department of Applied Mathematics and Theoretical Physics (DAMTP), which has developed an innovative infectious disease outbreak detection system – was one of two second-prize winners, each awarded £1 million in pledged funding.

Developed by Dr Michael Roberts and Dr Nicholas Gleadall, the BloodCounts! solution uses data from routine blood tests and powerful AI-based techniques to provide a ‘tsunami-like’ early warning system for new disease outbreaks.

“Since the beginning of the pandemic I have been developing AI-based methods to aid in medical decision making for COVID-19 patients, starting with analysis of Chest X-ray data,” said Roberts, who is affiliated with DAMTP and the Cambridge Mathematics of Information in Healthcare (CMIH) Hub. “Echoing the observations made by the clinical teams, we saw profound and unique differences in the medical measurements of infected individuals, particularly in their full blood count data. It is these changes that we can train models to detect at scale.”

Unlike many current test methods, their approach doesn’t require any prior knowledge of a specific pathogen to work. Instead, it uses full blood count data to exploit the pathogen-detecting abilities of the human immune system, by observing changes in the blood measurements associated with infection.

As the full blood count is the world’s most common medical laboratory test, with over 3.6 billion being performed worldwide each year, the BloodCounts! team can rapidly apply their methods to scan for abnormal changes in the blood cells of large populations – alerting public health agencies to potential outbreaks of pathogen infection.
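To make the idea concrete, the sketch below shows the general shape of such population-scale anomaly scanning: fit a detector on baseline blood count data, then flag how unusual a new batch of samples looks. The data, feature count and model choice are all invented for illustration – this is not the BloodCounts! team’s actual pipeline.

```python
# Illustrative sketch only: anomaly scanning over full blood count (FBC) data.
# The data and the choice of detector are hypothetical, not the BloodCounts! method.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(1)
baseline = rng.normal(size=(5000, 10))        # 10 FBC measurements per sample, pre-outbreak
today = rng.normal(0.3, 1.0, size=(500, 10))  # a subtly shifted population, as in an outbreak

detector = IsolationForest(random_state=0).fit(baseline)
flagged = (detector.predict(today) == -1).mean()  # fraction of samples flagged as anomalous
print(f"{flagged:.1%} of today's samples look abnormal")
```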

This solution is a demonstration of how the application of AI-based methods can lead to healthcare benefits. It also highlights the importance of strong collaboration between leading organisations, as the development of these algorithms was only possible due to the EpiCov data-sharing initiative pioneered by Cambridge University Hospitals.

“Millions of full blood count tests are performed every day worldwide, and this meant that we could apply our AI methods at population scale,” said Gleadall, from the University of Cambridge and NHS Blood and Transplant. “Usually the rich measurement data are discarded after summary results have been reported, but by working with Cambridge University Hospitals, Barts Health London, and University College London NHS Hospitals we have rescued the rich data from 2.8 million full blood count tests throughout the pandemic.”

The Sentinel Forecasting System is the other second-prize winner, and will explore the emergence of new infectious diseases in West Africa, beginning with Lassa fever. The system will combine data from ecology, social science, genomics and epidemiology to provide real-time disease-risk forecasts for haemorrhagic fevers such as Lassa and Ebola.

Lassa is a virus usually passed to humans through exposure to food or household items contaminated by infected rats. It is endemic in West African countries including Benin, Ghana, Guinea, Liberia, Mali, Sierra Leone, Togo and Nigeria.

Around 80% of people who become infected with Lassa virus have no symptoms, and the overall case-fatality rate is 1%. However, one in five infections results in severe disease affecting the liver, spleen and kidneys.

The UCL team will partner with the African Centre of Excellence for Genomics of Infectious Diseases in Nigeria, Nigeria Centre for Disease Control, Zoological Society of London, London School of Hygiene and Tropical Medicine, Microsoft, and Cambridge’s Laboratory of Viral Zoonotics (LVZ) to produce the system.

“This Trinity Challenge project brings new multidisciplinary technologies together to anticipate how climate, human and animal populations, and agriculture affect the likelihood of spillovers of infections from animals to humans,” said Professor Jonathan Heeney, who leads LVZ at Cambridge’s Department of Veterinary Medicine.

Additionally, five third-prize winners are each being awarded £480,000 (US$660,000) in pledged funding.

Dame Sally Davies said: “It was crystal clear at the beginning of this pandemic that the world had a lack of data, a lack of access to data, and a lack of interoperability of data, presenting a challenge. While others talked, we took action. The solutions we have discovered in the course of the Challenge will be a link between systems and countries.”

In addition to financial support, The Trinity Challenge will provide connections to the right organisations to maximise the impact of these solutions. Since its inception nine months ago, TTC has united early applicants with partners from the private, academic and social sectors, giving them access to digital platforms, data and technical advice to scale up the use of data and analytics to protect the world from future health emergencies. The Trinity Challenge has helped form over 200 connections between applicants and its members.



Marmoset Study Identifies Brain Region Linking Actions To Their Outcomes

Marmoset

 

Researchers have discovered a specific brain region underlying ‘goal-directed behaviour’ – that is, when we consciously do something with a particular goal in mind, for example going to the shops to buy food.

 

This is a first step towards identifying suitable molecular targets for future drug treatments, or other forms of therapy, for devastating mental health disorders such as OCD and addiction.

Trevor Robbins

The study, published today in the journal Neuron, found that marmoset monkeys could no longer make an association between their behaviour and a particular outcome when a region of their brain called the anterior cingulate cortex was temporarily switched off.

This finding is important because the compulsive behaviours in OCD and addiction are thought to result from impairments in the ‘goal-directed system’ in the brain. In these conditions worrying, obsessions or compulsive behaviour such as drug seeking may reflect an alternative, habit-based system at work in the brain in which behaviours are not correctly linked with their outcomes.

It also sheds more light on how healthy people behave in a goal-directed way, which is needed to respond to changing environments and goals.

“We have identified the very specific region of the brain involved in goal-directed behaviour. When we temporarily turned this off, behaviour became more habitual – like when we go onto autopilot,” said Lisa Duan in the University of Cambridge’s Department of Psychology, first author of the report.

Marmosets were used because their brains share important similarities with human brains, and it is possible to manipulate specific regions of their brains to understand causal effects.

In the experiment, marmosets were first taught a goal-directed behaviour: by tapping a coloured cross when it appeared on a touchscreen, they were rewarded with their favourite juice to drink. But this connection between action and reward was randomly uncoupled so that they sometimes received the juice without having to respond to the image. They quickly detected this change and stopped responding to the image, because they saw they could get juice without doing anything.
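This design degrades the action-outcome contingency, which is conventionally quantified as the difference between the probability of reward given a response and given no response. A minimal sketch of that measure (our illustration, not the study’s analysis):

```python
# Delta-P contingency: positive values mean that acting genuinely earns the reward.
def contingency(p_reward_given_tap: float, p_reward_given_no_tap: float) -> float:
    return p_reward_given_tap - p_reward_given_no_tap

print(contingency(1.0, 0.0))  # training phase: juice only follows a tap -> 1.0
print(contingency(0.5, 0.5))  # uncoupled phase: free juice as often as earned -> 0.0
# A goal-directed animal stops tapping when the contingency falls to zero;
# a habitual one keeps responding regardless.
```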

Using drugs, the researchers temporarily switched off the anterior cingulate cortex including its connections with another brain region called the caudate nucleus. Repeating the experiment, they found when the connection between tapping the cross and receiving juice was randomly uncoupled, the marmosets did not change their behaviour but kept tapping the cross when it appeared.

Such habitual responding to the coloured cross was not observed when several other neighbouring regions of the brain’s prefrontal cortex – known to be important for other aspects of decision-making – were switched off. This shows the specificity of the anterior cingulate region for goal-directed behaviour.

A similar effect has been observed in computer-based tests on patients with Obsessive Compulsive Disorder (OCD) or addiction – when the relationship between an action and an outcome is uncoupled the patients continue to respond as though the connection is still there.

Previous evidence from patients suffering brain damage, and from brain imaging in healthy volunteers, shows that part of the brain called the prefrontal cortex is involved in goal-directed behaviour. However, the prefrontal cortex is a complex structure with many regions, and it has not previously been possible to identify the specific part responsible for goal-directed behaviour from human studies alone.

“We think this is the first study to have established the specific brain circuitry that controls goal-directed behaviour in primates, whose brains are very similar to human brains,” said Professor Angela Roberts in the University of Cambridge’s Department of Physiology, Development and Neuroscience, joint senior author of the report.

“This is a first step towards identifying suitable molecular targets for future drug treatments, or other forms of therapy, for devastating mental health disorders such as OCD and addiction,” added Professor Trevor Robbins in the University of Cambridge’s Department of Psychology, joint senior author of the report.

This research was conducted in the University of Cambridge’s Behavioural and Clinical Neuroscience Institute, and was funded by Wellcome.

Reference

Duan, L.Y. et al. ‘Controlling one’s world: identification of sub-regions of primate PFC underlying goal-directed behaviour.’ Neuron, June 2021. DOI: 10.1016/j.neuron.2021.06.003



Astronomers Pinpoint When Cosmic Dawn Occurred

The formation and evolution of the first stars and galaxies in a virtual universe similar to our own

 

Cosmic dawn, when stars formed for the first time, occurred 250 million to 350 million years after the beginning of the universe, according to a new study led by researchers from the University of Cambridge and University College London (UCL).

 

Witnessing the moment when the universe was first bathed in starlight is a major quest in astronomy

Nicolas Laporte

The study, published in the Monthly Notices of the Royal Astronomical Society, suggests that the NASA James Webb Space Telescope (JWST), scheduled to launch in November 2021, will be sensitive enough to observe the birth of galaxies directly.

The UK-led research team examined six of the most distant galaxies currently known, whose light has taken most of the universe’s lifetime to reach us. They found that the distances of these galaxies corresponded to a ‘look-back’ time of more than 13 billion years, to when the universe was only 550 million years old.

Analysing images from the Hubble and Spitzer Space Telescopes, the researchers calculated the age of these galaxies as ranging from 200 to 300 million years, allowing an estimate of when their stars first formed.

“Theorists speculate that the universe was a dark place for the first few hundred million years, before the first stars and galaxies formed,” said lead author Dr Nicolas Laporte from Cambridge’s Institute of Astronomy. “Witnessing the moment when the universe was first bathed in starlight is a major quest in astronomy.

“Our observations indicate that cosmic dawn occurred between 250 and 350 million years after the beginning of the universe, and, at the time of their formation, galaxies such as the ones we studied would have been sufficiently luminous to be seen with the James Webb Space Telescope.”

The researchers analysed starlight from the galaxies as recorded by the Hubble and Spitzer Space Telescopes, examining a marker in their energy distribution indicative of the presence of atomic hydrogen in their stellar atmospheres. This provides an estimate of the age of the stars they contain.

This hydrogen signature increases in strength as the stellar population ages, but diminishes when the galaxy is older than a billion years. The age-dependence arises because the more massive stars that contribute to this signal burn their nuclear fuel more rapidly and therefore die first.

“This age indicator is used to date stars in our own neighbourhood in the Milky Way but it can also be used to date extremely remote galaxies, seen at a very early period of the universe,” said co-author Dr Romain Meyer from UCL and the Max Planck Institute for Astronomy. “Using this indicator we can infer that, even at these early times, our galaxies are between 200 and 300 million years old.”

In analysing the Hubble and Spitzer data, the researchers needed to estimate the ‘redshift’ of each galaxy, which indicates its cosmological distance and hence the look-back time at which it is being observed. To achieve this, they undertook spectroscopic measurements using the full armoury of ground-based telescopes – the Atacama Large Millimetre Array (ALMA) in Chile, the European Very Large Telescope, the twin Keck telescopes in Hawai’i, and the Gemini South telescope.

These measurements enabled the team to confirm that looking at these galaxies corresponded to looking back to a time when the universe was 550 million years old.
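As a rough illustration of the redshift-to-time conversion, the snippet below uses astropy’s built-in Planck 2018 cosmology (not the paper’s exact parameters) and a hypothetical redshift of the order measured for these galaxies:

```python
# Illustrative only: converting a redshift into cosmic ages with a standard
# flat Lambda-CDM cosmology; z = 9 is a stand-in value, not a paper result.
from astropy.cosmology import Planck18

z = 9.0
print(f"Universe age at z={z}: {Planck18.age(z).to('Myr'):.0f}")     # ~545 Myr
print(f"Look-back time: {Planck18.lookback_time(z).to('Gyr'):.2f}")  # ~13.3 Gyr
```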

“Over the last decade, astronomers have pushed back the frontiers of what we can observe to a time when the universe was only 4% of its present age,” said co-author Professor Richard Ellis from UCL. “However, due to the limited transparency of Earth’s atmosphere and capabilities of the Hubble and Spitzer Space Telescopes, we have reached our limit.

“We now eagerly await the launch of the James Webb Space Telescope, which we believe has the capability to directly witness cosmic dawn. The quest to see this important moment in the universe’s history has been a holy grail in astronomy for decades. Since we are made of material processed in stars, this is in one sense the search for our own origins.”

The new study involved astronomers from the University of California-Santa Cruz, the University of California, and the University of Texas.

The researchers received support from the Kavli Foundation, the European Research Council, the National Aeronautics and Space Administration (NASA), and the National Science Foundation (NSF) in the United States.

The NASA-led James Webb Space Telescope, the successor to the Hubble observatory, is scheduled to be launched into space in November. It will be the premier observatory over the next decade, serving thousands of astronomers worldwide.

Reference:
N Laporte et al. ‘Probing Cosmic Dawn: Ages and Star Formation Histories of Candidate z ≥ 9 Galaxies.’ Monthly Notices of the Royal Astronomical Society (2021).

Adapted from a UCL press release. 



Rock Crystals From The Deep Give Microscopic Clues To Earthquake Ground Movements

Chunks of exotic green rocks from the mantle erupted from the San Carlos Volcanic Field, Arizona

 

Microscopic imperfections in rock crystals deep beneath Earth’s surface play a deciding role in how the ground slowly moves and resets in the aftermath of major earthquakes, according to new research involving the University of Cambridge.

 

The stresses resulting from these defects – which are small enough to disrupt the atomic building blocks of a crystal – can transform how hot rocks beneath Earth’s crust move and in turn transfer stress back to Earth’s surface, starting the countdown to the next earthquake.

The new study, published in Nature Communications, is the first to map out the crystal defects and surrounding force fields in detail. “They’re so tiny that we’ve only been able to observe them with the latest microscopy techniques,” said lead author Dr David Wallis from Cambridge’s Department of Earth Sciences. “But it’s clear that they can significantly influence how deep rocks move, and even govern when and where the next earthquake will happen.”

By understanding how these crystal defects influence rocks in the Earth’s upper mantle, scientists can better interpret measurements of ground motions following earthquakes, which give vital information on where stress is building up – and in turn where future earthquakes may occur.

Earthquakes happen when pieces of Earth’s crust suddenly slip past each other along fault lines, releasing stored-up energy which propagates through the Earth and causes it to shake. This movement is generally a response to the build-up of tectonic forces in the Earth’s crust, causing the surface to buckle and eventually rupture in the form of an earthquake.

Their work reveals that the way Earth’s surface settles after an earthquake, and stores stress prior to a repeat event, can ultimately be traced to tiny defects in rock crystals from the deep.

“If you can understand how fast these deep rocks can flow, and how long it will take to transfer stress between different areas across a fault zone, then we might be able to get better predictions of when and where the next earthquake will strike,” said Wallis.

The team subjected olivine crystals – the most common component of the upper mantle – to a range of pressures and temperatures in order to replicate conditions of up to 100 km beneath Earth’s surface, where the rocks are so hot (roughly 1250°C) they move like syrup.

Wallis likens their experiments to a blacksmith working with hot metal – at the highest temperatures, their samples were glowing white-hot and pliable.

They observed the distorted crystal structures using a high-resolution form of electron microscopy, called electron backscatter diffraction, which Wallis has pioneered on geological materials.

Their results shed light on how hot rocks in the upper mantle can mysteriously morph from flowing almost like syrup immediately after an earthquake to becoming thick and sluggish as time passes.

This change in thickness – or viscosity – transfers stress back to the cold and brittle rocks in the crust above, where it builds up until the next earthquake strikes.

The reason for this switch in behaviour had remained an open question. “We’ve known that microscale processes are a key factor controlling earthquakes for a while, but it’s been difficult to observe these tiny features in enough detail,” said Wallis. “Thanks to a state-of-the-art microscopy technique, we’ve been able to look into the crystal framework of hot, deep rocks and track down how important these minuscule defects really are.”

Wallis and co-authors show that irregularities in the crystals become increasingly tangled over time, jostling for space due to their competing force fields – and it is this process that causes the rocks to become more viscous.

Until now it had been thought that this increase in viscosity was because of the competing push and pull of crystals against each other, rather than being caused by microscopic defects and their stress fields inside the crystals themselves.

The team hope to apply their work to improving seismic hazard maps, which are often used in tectonically active areas like southern California to estimate where the next earthquake will occur. Current models, which are usually based on where earthquakes have struck in the past, and where stress must therefore be building up, only take into account the more immediate changes across a fault zone and do not consider gradual stress changes in rocks flowing deep within the Earth.

Working with colleagues at Utrecht University, Wallis also plans to apply their new lab constraints to models of ground movements following the devastating 2004 earthquake in Indonesia and the 2011 Japan quake – both of which triggered tsunamis and led to the loss of tens of thousands of lives.

 

Reference:
David Wallis et al. ‘Dislocation interactions in olivine control postseismic creep of the upper mantle.’ Nature Communications (2021). DOI: 10.1038/s41467-021-23633-8



Low-Cost Imaging Technique Shows How Smartphone Batteries Could Charge In Minutes

Illustration of batteries charging

 

Researchers have developed a simple lab-based technique that allows them to look inside lithium-ion batteries and follow lithium ions moving in real time as the batteries charge and discharge, something which has not been possible until now.

 

This technique could be an important piece of the puzzle in the development of next-generation batteries

Christoph Schnedermann

Using the low-cost technique, the researchers identified the speed-limiting processes which, if addressed, could enable the batteries in most smartphones and laptops to charge in as little as five minutes.

The researchers, from the University of Cambridge, say their technique will not only help improve existing battery materials, but could accelerate the development of next-generation batteries, one of the biggest technological hurdles to be overcome in the transition to a fossil fuel-free world. The results are reported in the journal Nature.

While lithium-ion batteries have undeniable advantages, such as relatively high energy densities and long lifetimes in comparison with other batteries and means of energy storage, they can also overheat or even explode, and are relatively expensive to produce. Additionally, their energy density is nowhere near that of petrol. So far, this makes them unsuitable for widespread use in two major clean technologies: electric cars and grid-scale storage for solar power.

“A better battery is one that can store a lot more energy or one that can charge much faster – ideally both,” said co-author Dr Christoph Schnedermann, from Cambridge’s Cavendish Laboratory. “But to make better batteries out of new materials, and to improve the batteries we’re already using, we need to understand what’s going on inside them.”

To improve lithium-ion batteries and help them charge faster, researchers need to follow and understand the processes occurring in functioning materials under realistic conditions in real time. Currently, this requires sophisticated synchrotron X-ray or electron microscopy techniques, which are time-consuming and expensive.

“To really study what’s happening inside a battery, you essentially have to get the microscope to do two things at once: it needs to observe batteries charging and discharging over a period of several hours, but at the same time it needs to capture very fast processes happening inside the battery,” said first author Alice Merryweather, a PhD student at Cambridge’s Cavendish Laboratory.

The Cambridge team developed an optical microscopy technique called interferometric scattering microscopy to observe these processes at work. Using this technique, they were able to observe individual particles of lithium cobalt oxide (often referred to as LCO) charging and discharging by measuring the amount of scattered light.

They were able to see the LCO going through a series of phase transitions in the charge-discharge cycle. The phase boundaries within the LCO particles move and change as lithium ions go in and out. The researchers found that the mechanism of the moving boundary is different depending on whether the battery is charging or discharging.
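The principle can be sketched in a few lines: a phase transition changes how strongly a particle scatters light, so frame-to-frame intensity differences pick out when and where a phase boundary moves. The synthetic ‘video’ below is purely illustrative and is not the authors’ analysis pipeline.

```python
# Hypothetical toy example: locate when a particle's scattering intensity changes.
import numpy as np

rng = np.random.default_rng(0)
frames = rng.normal(1.0, 0.01, size=(100, 64, 64))  # stand-in for a video of one particle
frames[50:, 20:40, 20:40] += 0.2                    # a region changes phase halfway through

diff = np.abs(np.diff(frames, axis=0))              # per-pixel change between frames
activity = diff.reshape(len(diff), -1).mean(axis=1) # mean change per time step
print(f"Largest scattering change at frame {activity.argmax() + 1}")  # ~frame 50
```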

“We found that there are different speed limits for lithium-ion batteries, depending on whether it’s charging or discharging,” said Dr Akshay Rao from the Cavendish Laboratory, who led the research. “When charging, the speed depends on how fast the lithium ions can pass through the particles of active material. When discharging, the speed depends on how fast the ions are inserted at the edges. If we can control these two mechanisms, it would enable lithium-ion batteries to charge much faster.”

“Given that lithium-ion batteries have been in use for decades, you’d think we know everything there is to know about them, but that’s not the case,” said Schnedermann. “This technique lets us see just how fast it might be able to go through a charge-discharge cycle. What we’re really looking forward to is using the technique to study next-generation battery materials – we can use what we learned about LCO to develop new materials.”

“The technique is a quite general way of looking at ion dynamics in solid-state materials, so you can use it on almost any type of battery material,” said Professor Clare Grey, from Cambridge’s Yusuf Hamied Department of Chemistry, who co-led the research.

The high-throughput nature of the methodology allows many particles to be sampled across the entire electrode and, moving forward, will enable further exploration of what happens when batteries fail and how to prevent it.

“This lab-based technique we’ve developed offers a huge change in technology speed so that we can keep up with the fast-moving inner workings of a battery,” said Schnedermann. “The fact that we can actually see these phase boundaries changing in real time was really surprising. This technique could be an important piece of the puzzle in the development of next-generation batteries.”

 

Reference:
Alice J. Merryweather et al. ‘Operando optical tracking of single-particle ion dynamics in batteries.’ Nature (2021). DOI: 10.1038/s41586-021-03584-2



Cambridge Researcher Named One of Top 50 Women in Engineering

Sohini Kar-Narayan
source: www.cam.ac.uk

 

Dr Sohini Kar-Narayan from Cambridge’s Department of Materials Science and Metallurgy has been named one of the top 50 Women in Engineering 2021 by the Women’s Engineering Society.

 

Now in its sixth year, the 2021 WE50 celebrates the wealth of female talent within engineering and related disciplines. The annual celebration is aligned with International Women in Engineering Day (INWED) which takes place on 23 June.

Some of Kar-Narayan’s happiest childhood memories involved taking apart cassette players and VCRs, and that curiosity is what drew her to her current role. Her research involves developing new polymeric materials for harvesting energy to power health monitoring devices and integrating materials into versatile sensors. She has also been working on developing self-powered devices for patients.

“I am absolutely thrilled by this award, and to be recognised as an ‘Engineering Hero’ will go down well with my kids,” said Kar-Narayan, who is a Fellow of Clare Hall. “My late father was diabetic and suffered from heart disease, and this played a role in my desire to use science and engineering to improve patient care by developing self-powered devices that can offer personalised healthcare and remote health monitoring, and new technologies to study and manage the progression of disease at a cellular level. I am so grateful to WES for this award, and of course, to all the people who have supported me over the years, including my brilliant research group without whom this would not have been possible.”

One of the aims of Kar-Narayan’s research is the development of early-stage prototypes and eventual commercialisation of energy harvesting and self-powered sensing technologies. An example is the spin-out company ArtioSense Ltd that she has recently co-founded, which seeks to deliver low-cost conformable sensors that can aid orthopaedic surgery through real-time force monitoring in joints.

Even in the current climate, the number and standard of nominations were high, emphasising the exceptional achievements made by women in this field. The WE50 awards were judged by a panel of industry experts.

“It was wonderful to read about the achievements of these extraordinary women and the impact that they are making on society with their talent, hard work and dedication,” said Head Judge Professor Catherine Noakes OBE CEng FIMechE FIHEEM. “The COVID-19 pandemic has highlighted how truly important science, technology and engineering are to the health of our planet. The 2021 WE50 personify the inventive and inclusive thinking needed to build a sustainable future. If there was ever a time that we needed these heroes in engineering, it is now.”

INWED celebrates the achievements of women in engineering and related roles and highlights the opportunities available to engineers of the future. The WE50 was created to raise awareness of the skills shortage facing the industry, highlighting the huge discrepancy between the number of men vs. women currently in engineering professions. The theme of WE50 changes each year to recognise women working in different fields and from varying routes into engineering. This year’s theme is ‘Engineering Heroes.’



Professor Clare Grey Awarded €1 Million Körber Prize

Professor Clare Grey
source: www.cam.ac.uk

 

The Körber European Science Prize 2021, worth one million euros, is to be awarded to University of Cambridge chemist Professor Clare Grey, one of the UK’s leading battery researchers.

 

Grey pioneered the optimisation of batteries with the help of NMR spectroscopy – similar to MRI technology – a method that allows non-invasive insights into the inner workings of batteries.

Her NMR studies have helped to significantly increase the performance of lithium-ion batteries, which power mobile phones, laptops and electric cars. She has been instrumental in the development of next-generation batteries and cost-effective, durable storage systems for renewable energy. She sees her fundamental research as an important contribution to achieving net-zero emissions by 2050.

“There have been significant advances in lithium-ion batteries since they were commercialised in the 1990s,” said Grey. “Their energy density has tripled and prices have fallen by 90 percent.”

Grey’s research has made key contributions to these developments. She is a pioneer in the study of solids with the help of NMR (nuclear magnetic resonance) spectroscopy, which she has developed and applied to allow researchers to observe the electrochemical processes at work during charging and discharging of batteries.

Clare Grey, 56, studied chemistry at the University of Oxford. At the age of 22, she published her first scientific article in the journal Nature. After completing her doctoral studies in 1991, she went to Radboud University in Nijmegen, the Netherlands, and has also worked as a visiting scientist at the US chemical company Dupont.

In 1994, she joined the State University of New York at Stony Brook as an assistant professor, and she became a full professor in 2001. In 2009, she became Geoffrey Moorhouse Gibson Professor at the University of Cambridge’s Yusuf Hamied Department of Chemistry. She is a Fellow of Pembroke College, and has been a Fellow of the Royal Society since 2011.

When Grey was still a student, most chemists and physicists used X-rays to determine the internal structure of solids. Grey was one of the first in her field to use solid-state NMR instead: during her time in the USA, she met researchers from the Duracell company who inspired her to use the technology to study materials in batteries.

“Previously, the usual investigations with X-rays only provided an average picture,” Grey said. “With the help of NMR, I was able to detect the local structural details in these often-disordered materials.”

Initially, she examined individual materials by opening the batteries at a certain stage of their charging and discharging cycle. The aim was to find out which chemical processes cause the batteries to age and how their lifespan and capacity could be increased. Later, she improved the NMR technology so that she could use it to examine batteries during operation without destroying them, which helped speed up the studies enormously.

Now, in addition to her work improving lithium-ion batteries, Grey is developing a range of different next-generation batteries, including lithium-air batteries (which use oxidation of lithium and reduction of oxygen to induce a current), sodium, magnesium and redox flow batteries.

Her NMR studies allow her to follow the processes at work inside these batteries in real time and help determine the processes that cause batteries to degrade. She is working on further optimising the NMR method to design even more powerful, faster-charging and more environmentally friendly batteries.

In 2019, Grey co-founded a company, Nyobolt, to develop ultra-fast-charging batteries. Another company supplies the NMR measurement technology she designed to laboratories around the world.

To achieve climate goals and transition away from fossil fuels, Grey believes it is vital that “basic research into new battery technologies is already in full swing today – tomorrow will be too late.”

The Körber European Science Prize 2021 will be presented to Professor Clare Grey on 10 September in the Great Festival Hall of Hamburg City Hall. Since 1985, the Körber Foundation has honoured a breakthrough in the physical or life sciences in Europe with the Körber Prize. It is awarded for excellent and innovative research approaches with high application potential. To date, six Körber Prize winners have been awarded the Nobel Prize.

 



THE RAPID CHALLENGE

THE RAPID CHALLENGE

Hardware Start-ups

& Entrepreneurs

Join The RAPID Challenge journey to commercialise your innovation. Applications now open

Brought to you by

TBAT Innovation, OnePLM, Crowdcube, Prodrive Performance Unlimited, RPD International (lead partner), HSBC, Bridgehead and The IP Asset Partnership

Media Partner

DEVELOP3D
https://www.therapidchallenge.com/

THE PROCESS

A Challenge designed for hardware start-ups & entrepreneurs

The following three-stage process has been created with the ethos that all who pass the criteria receive value by taking part. It is specifically designed to address the challenges met by those commercialising hardware projects.

1

Workshop

Projects matching the entry criteria will be invited to attend a range of workshops. These sessions will include valuable advice and insight from our challenge partners to help you build and develop the business plan for your project. Following the workshop, applicants will be asked to submit a pitch deck including their business plan in order to reach the interview stage.

2

Interview

The business plans will be reviewed by a judging panel consisting of industry experts representing The RAPID Challenge partners. Five successful submissions will be invited to the interview stage to present their project and business plans in more detail and answer questions from the judging panel. The best three projects will proceed to the final event.

3

Final

The final event will provide the best three projects with the opportunity to present their project to a wider audience of RAPID Challenge partners, affiliates, invited guests and investors, where the winners of the prize package will be announced.

THE RAPID CHALLENGE

£75K Prize Package

A prize package specifically curated to benefit hardware start-ups. It consists of:

  • £5K cash prize

  • An engineering service package from Prodrive

  • An R&D and grant application service package from TBAT Innovation

  • Manufacturing and design service package from RPD International

  • Go-to-market service package and entry to the Discover 21 Programme from Bridgehead Agency

  • Solid Edge CAD license for all applicants and an advanced CAD training course for the winner from OnePLM

  • Discounted listing fees on Crowdcube

  • Patent advice services from The IP Asset Partnership

  • Invitation to HSBC corporate events

KEY DATES

Applications open

Applications for The RAPID Challenge 2021 are open. Read more about the application criteria on the apply page.

 

Honeywell Takes a Majority Stake in British Quantum Computing Company: Cambridge Quantum Computing (CQC)

Honeywell Takes a Majority Stake in British Quantum Computing Company: Cambridge Quantum Computing (CQC)

Source: https://quantumzeitgeist.com/

June 8, 2021 

Honeywell, the developer of ion-trap quantum technology, has announced that it will take a majority stake in the Cambridge firm. Honeywell is a publicly listed technology company with products ranging from control systems to quantum computers. The move will help cement Honeywell’s push into quantum computing, utilising the expertise CQC has built up in quantum machine learning, quantum programming languages (such as t|ket>) and natural language processing (NLP).

Upon completion of the deal, Honeywell will own the majority stake in the newly created business, and the quantum operating system will be a major part of it. The new business aims to offer a full suite of quantum software, including the most advanced quantum operating system, and will continue to support developments across a number of sectors such as cyber security, drug discovery and delivery, materials science, finance and general optimisation. The NLP expertise of CQC will likely play an important future role, something not lost on Honeywell in its motivations for the combined partnership.

The deal is expected to close in the third quarter of 2021, subject to regulatory approvals and customary closing conditions. Honeywell Chairman and Chief Executive Officer Darius Adamczyk will serve as chairman of the new company. The new company will be led by Ilyas Khan, the CEO and founder of CQC. Tony Uttley, currently the president of HQS, will serve as the new company’s president. Honeywell will invest between $270 million and $300 million in the new joint venture business.

“At CQC, we are committed to using the most advanced tools and devices to develop the world’s leading quantum applications and products. Our cutting-edge software complements Honeywell’s innovative quantum technology, and this investment and partnership is of real significance in the overall development of quantum computers and their real-world impact on corporations and governments globally,” said Ilyas Khan, CEO of CQC.

Honeywell will also have a long-term agreement to help the new company manufacture the critical ion traps needed to power its quantum hardware, and Honeywell’s businesses will continue to serve as a proving ground for the new company’s quantum offerings.


A race for acquisitions and consolidation in Quantum?

In the QZ office we do not yet know what the new company will be called, and we are excited to find out. Beyond the excellent news for both Honeywell and CQC, the deal comes as IonQ, which also makes ion-trap computers, plans to go public via a SPAC – something we have written about extensively in the past. The news could create a flurry of activity as companies look to create more vertically aligned businesses.

The new venture will comprise experts in both software and hardware, and with multiple offerings it will likely be a threat to many: its vertical alignment will strengthen its ability to attract clients to the nascent quantum sector, backed by the technical and applied experience of delivering software and hardware solutions.

Riverlane is also a Cambridge-based company with its own quantum operating system. This could mean that it, too, is a target for a possible joint venture – perhaps teaming up with a hardware specialist to create a more vertically aligned combination.

Targeting Cellular Response To SARS-CoV-2 Holds Promise As New Way To Fight Infection

Scanning electron microscope image of SARS-CoV-2 (orange) emerging from the surface of cells (green) cultured in the lab.
source: www.cam.ac.uk

 

A new treatment approach focused on fixing cell damage, rather than fighting the virus directly, is effective against SARS-CoV-2 in lab models. If found safe for human use, this anti-viral treatment could make COVID-19 symptoms milder and speed up recovery times.

 

When a person is infected with SARS-CoV-2, the virus that causes COVID-19, it invades their cells and uses them to replicate – which puts the cells under stress. Current approaches to dealing with infection target the virus itself with antiviral drugs. But Cambridge scientists have switched focus to target the body’s cellular response to the virus instead.

In a new study, published today in the journal PLOS Pathogens, they found that all three branches of a three-pronged signalling pathway called the ‘unfolded protein response’ (UPR) are activated in lab-grown cells infected with SARS-CoV-2. Inhibiting the UPR to restore normal cell function using drugs was also found to significantly reduce virus replication.

“The virus that causes COVID-19 activates a response in our cells – called the UPR – that enables it to replicate,” said Dr Nerea Irigoyen in the University of Cambridge’s Department of Pathology, senior author of the report.

She added: “Using drugs we were able to reverse the activation of this specific cellular pathway, and remarkably this reduced virus production inside the cells almost completely, which means the infection could not spread to other cells. This has exciting potential as an anti-viral strategy against SARS-CoV-2.”

Treatment with a drug that targets one prong of the UPR pathway had some effect in reducing virus replication. But treatment with two drugs together – called Ceapin-A7 and KIRA8 – to simultaneously target two prongs of the pathway reduced virus production in the cells by 99.5%. This is the first study to show that the combination of two drugs has a much greater effect on virus replication in cells than a single drug.

The approach would not stop a person getting infected with the coronavirus, but the scientists say symptoms would be much milder, and recovery time would be quicker.

Anti-viral drugs currently in use to treat COVID-19, such as remdesivir, target replication of the virus itself. But if the virus develops resistance to these drugs they will no longer work. In contrast, the new treatment targets the response of the infected cells; this will not change even if new variants emerge, because the virus needs this cellular response in order to replicate.

The next step is to test the treatment in mouse models. The scientists also want to see whether it works against other viruses, and illnesses such as pulmonary fibrosis and neurological disorders that also activate the UPR response in cells.

“We hope this discovery will enable the development of a broad-spectrum anti-viral drug, effective in treating infections with other viruses as well as SARS-CoV-2. We’ve already found it has an effect on Zika virus too. It has the potential to have a huge impact,” said Irigoyen.

SARS-CoV-2 is the novel coronavirus responsible for the COVID-19 pandemic. Since the end of 2019 there have been over 150 million cases of the disease worldwide, and over 3 million people have died.

This research was funded by an Isaac Newton Trust/Wellcome Trust ISSF/University of Cambridge Joint Research Grant.

Reference
Echavarria-Consuegra, L. et al: ‘Manipulation of the unfolded protein response: a pharmacological strategy against coronavirus infection.’ PLOS Pathogens, May 2021. DOI: 10.1371/journal.ppat.1009644

 



Study Identifies Trigger For ‘Head-To-Tail’ Axis Development in Human Embryo

Human embryo in the lab 9 days after fertilisation.
source: www.cam.ac.uk

 

Scientists have identified key molecular events in the developing human embryo between days 7 and 14 – one of the most mysterious, yet critical, stages of our development.

 

We have revealed the patterns of gene expression in the developing embryo just after it implants in the womb

Magdalena Zernicka-Goetz

The second week of gestation represents a critical stage of embryo development, or embryogenesis. Failure of development during this time is one of the major causes of early pregnancy loss. Understanding more about it will help scientists to understand how it can go wrong, and take steps towards being able to fix problems.

The pre-implantation period, before the developing embryo implants into the mother’s womb, has been studied extensively in human embryos in the lab. On the seventh day the embryo must implant into the womb to survive and develop. Very little is known about the development of the human embryo once it implants, because it becomes inaccessible for study.

Pioneering work by Professor Magdalena Zernicka-Goetz and her team developed a technique, reported in 2016, to culture human embryos outside the body of the mother beyond implantation. This enabled human embryos to be studied up to day 14 of development for the first time.

In a new study, the team collaborated with colleagues at the Wellcome Sanger Institute to reveal what happens at the molecular level during this early stage of embryogenesis. Their findings provide the first evidence that a group of cells outside the embryo, known as the hypoblast, send a message to the embryo that initiates the development of the head-to-tail body axis.

When the body axis begins to form, the symmetrical structure of the embryo starts to change. One end becomes committed to developing into the head end, and the other the ‘tail’.

The new results, published today in the journal Nature Communications, reveal that the molecular signals involved in the formation of the body axis show similarities to those in animals, despite significant differences in the positioning and organisation of the cells.

“We have revealed the patterns of gene expression in the developing embryo just after it implants in the womb, which reflect the multiple conversations going on between different cell types as the embryo develops through these early stages,” said Professor Magdalena Zernicka-Goetz in the University of Cambridge’s Department of Physiology, Development and Neuroscience, and senior author of the report.

She added: “We were looking for the gene conversation that will allow the head to start developing in the embryo, and found that it was initiated by cells in the hypoblast – a disc of cells outside the embryo. They send the message to adjoining embryo cells, which respond by saying ‘OK, now we’ll set ourselves aside to develop into the head end.’”

The study identified the gene conversations in the developing embryo by sequencing the code in the thousands of messenger RNA molecules made by individual cells. They captured the evolving molecular profile of the developing embryo after implantation in the womb, revealing the progressive loss of pluripotency (the ability of the embryonic cells to give rise to any cell type of the future organism) as the fates of different cells are determined.
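For readers unfamiliar with single-cell workflows, the sketch below shows the standard shape of such an analysis in scanpy: normalise counts, reduce dimensions, then cluster cells into putative types. The data here are random, and the paper’s exact pipeline is not reproduced.

```python
# Illustrative single-cell RNA-seq workflow on random data, not the study's analysis.
import numpy as np
import anndata as ad
import scanpy as sc

counts = np.random.poisson(1.0, size=(500, 2000)).astype(np.float32)  # 500 cells x 2000 genes
adata = ad.AnnData(counts)

sc.pp.normalize_total(adata, target_sum=1e4)  # library-size normalisation
sc.pp.log1p(adata)                            # variance-stabilising transform
sc.pp.pca(adata, n_comps=30)                  # compress expression profiles
sc.pp.neighbors(adata)                        # build a cell-cell similarity graph
sc.tl.leiden(adata)                           # cluster cells (requires the leidenalg package)
print(adata.obs["leiden"].value_counts())     # putative cell-type groupings
```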

“By creating an atlas of the cells involved in human development and how they communicate with other cells, we can start to understand more about the cellular processes and mechanisms behind very early embryo growth, which has been much harder to study compared to other mammals. This freely available information can now be used by researchers around the world to help inform future studies,” said Dr Roser Vento-Tormo, one of the senior authors and Group Leader at the Wellcome Sanger Institute.

“Our goal has always been to enable insights to very early human embryo development in a dish, to understand how our lives start. By combining our new technology with advanced sequencing methods we have delved deeper into the key changes that take place at this incredible stage of human development, when so many pregnancies unfortunately fail,” said Zernicka-Goetz.

This research was funded by Wellcome. It was carried out with the oversight of the UK Human Fertilisation and Embryology Authority, and with permission from a local research ethics committee.

Reference: Mole, M.A. et al: ‘A single cell characterisation of human embryogenesis identifies pluripotency transitions and putative anterior hypoblast centre.’ Nature Communications, June 2021. DOI: 10.1038/s41467-021-23758-w 

 



Teenagers at Greatest Risk of Self-Harming Could Be Identified Almost a Decade Earlier

 

A man sitting in front of a screen
source: www.cam.ac.uk

 

Researchers have identified two subgroups of adolescents who self-harm and have shown that it is possible to predict those individuals at greatest risk almost a decade before they begin self-harming.

 

The current approach to supporting mental health in young people is to wait until problems escalate. Instead, we need a much better evidence base so we can identify who is at most risk of mental health difficulties in the future, and why

Duncan Astle

The team, based at the MRC Cognition and Brain Sciences Unit, University of Cambridge, found that while sleep problems and low self-esteem were common risk factors, there were two distinct profiles of young people who self-harm – one with emotional and behavioural difficulties, and a second group without those difficulties but with different risk factors.

Between one in five and one in seven adolescents in England self-harms, for example by deliberately cutting themselves. While self-harm is a significant risk factor for subsequent suicide attempts, many do not intend suicide but face other harmful outcomes, including repeatedly self-harming, poor mental health, and risky behaviours like substance abuse. Despite its prevalence and lifelong consequences, there has been little progress in the accurate prediction of self-harm.

The Cambridge team identified adolescents who reported self-harm at age 14, from a nationally representative UK birth cohort of approximately 11,000 individuals. They then used a machine learning analysis to identify whether there were distinct profiles of young people who self-harm, with different emotional and behavioural characteristics. They used this information to identify risk factors from early and middle childhood. The results are published in the Journal of the American Academy of Child and Adolescent Psychiatry.
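As a hedged sketch of how such subgroup discovery typically works – cluster individuals on standardised measures, then compare earlier risk factors between the clusters – consider the following. Feature names and data are invented; this is not the study’s model.

```python
# Illustrative subgroup discovery via clustering; synthetic data and features.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

rng = np.random.default_rng(42)
X = rng.normal(size=(1000, 4))  # hypothetical columns: mood, self-esteem, sleep, risk-taking

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(
    StandardScaler().fit_transform(X))  # cluster on a common scale

for k in (0, 1):
    members = X[labels == k]
    print(f"profile {k}: n={len(members)}, mean features={members.mean(axis=0).round(2)}")
```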

Because the data tracked the participants over time, the researchers were able to distinguish factors that appear alongside reported self-harm behaviour, such as low self-esteem, from those that precede it, such as bullying.

The team identified two distinct subgroups among young people who self-harm, with significant risk factors present as early as age five, nearly a decade before they reported self-harming. While both groups were likely to experience sleep difficulties and low self-esteem reported at age 14, other risk factors differed between the two groups.

The first group showed a long history of poor mental health, as well as bullying before they self-harmed. Their caregivers were more likely to have mental health issues of their own.

For the second group, however, their self-harming behaviour was harder to predict early in childhood. One of the key signs was a greater willingness to take part in risk-taking behaviour, which is linked to impulsivity. Other research suggests these tendencies may predispose individuals to spend less time considering alternative coping methods and the consequences of self-harm. Factors related to their relationships with their peers were also important for this subgroup, including feeling less secure with friends and family at age 14, and a greater concern about the feelings of others as a risk factor at age 11.

Stepheni Uh, a Gates Cambridge Scholar and first author of the study, said: “Self-harm is a significant problem among adolescents, so it’s vital that we understand the nuanced nature of self-harm, especially in terms of the different profiles of young people who self-harm and their potentially different risk factors.

“We found two distinct subgroups of young people who self-harm. The first was much as expected – young people who experience symptoms of depression and low self-esteem, face problems with their families and friends, and are bullied. The second, much larger group was much more surprising as they don’t show the usual traits that are associated with those who self-harm.”

The researchers say that their findings suggest that it may be possible to predict which individuals are most at risk of self-harm up to a decade ahead of time, providing a window to intervene.

Dr Duncan Astle said: “The current approach to supporting mental health in young people is to wait until problems escalate. Instead, we need a much better evidence base so we can identify who is at most risk of mental health difficulties in the future, and why. This offers us the opportunity to be proactive, and minimise difficulties before they start.

“Our results suggest that boosting younger children’s self-esteem, making sure that schools implement anti-bullying measures, and providing advice on sleep training, could all help reduce self-harm levels years later.

“Our research gives us potential ways of helping this newly-identified second subgroup. Given that they experience difficulties with their peers and are more willing to engage in risky behaviours, then providing access to self-help and problem-solving or conflict regulation programmes may be effective.”

Professor Tamsin Ford from the Department of Psychiatry added: “We might also help at-risk adolescents by targeting interventions at mental health leaders and school-based mental health teams. Teachers are often the first people to hear about self-harm but some lack confidence in how to respond. Providing them with training could make a big difference.”

The research was supported by the Gates Cambridge Trust, Templeton World Charity Foundation, and the UK Medical Research Council.

Reference
Uh, S et al. Two pathways to self-harm in adolescence. Journal of the American Academy of Child and Adolescent Psychiatry; 14 June 2021; DOI: 10.1016/j.jaac.2021.03.010



Developer Of Aluminum-Ion Battery Claims It Charges 60 Times Faster Than Lithium-Ion, Offering EV Range Breakthrough

Michael Taylor

source: www.forbes.com

Range anxiety, recycling and fast-charging fears could all be consigned to electric-vehicle history with a nanotech-driven Australian battery invention.

The graphene aluminum-ion battery cells from the Brisbane-based Graphene Manufacturing Group (GMG) are claimed to charge up to 60 times faster than the best lithium-ion cells and hold three times the energy of the best aluminum-based cells.

They are also safer, with no upper ampere limit to cause spontaneous overheating, more sustainable and easier to recycle, thanks to their stable base materials. Testing shows the coin-cell validation batteries also last three times longer than lithium-ion versions.

GMG plans to bring graphene aluminum-ion coin cells to market late this year or early next year, with automotive pouch cells planned to roll out in early 2024.

Based on breakthrough technology from the University of Queensland’s (UQ) Australian Institute for Bioengineering and Nanotechnology, the battery cells use nanotechnology to insert aluminum atoms inside tiny perforations in graphene planes.

Peer-reviewed testing published in the specialist journal Advanced Functional Materials concluded the cells had “outstanding high-rate performance (149 mAh g−1 at 5 A g−1), surpassing all previously reported AIB cathode materials”.


GMG Managing Director Craig Nicol insisted that while his company’s cells were not the only graphene aluminum-ion cells under development, they were easily the strongest, most reliable and fastest charging.

“It charges so fast it’s basically a super capacitor,” Nicol claimed. “It charges a coin cell in less than 10 seconds.”

The new battery cells are claimed to deliver far more power density than current lithium-ion batteries, without the cooling, heating or rare-earth problems they face.

“So far there are no temperature problems. Twenty percent of a lithium-ion battery pack (in a vehicle) is to do with cooling them. There is a very high chance that we won’t need that cooling or heating at all,” Nicol claimed.

“It does not overheat and it nicely operates below zero so far in testing.

“They don’t need circuits for cooling or heating, which currently accounts for about 80kg in a 100kWh pack.”

The new cell technology, Nicol insisted, could also be industrialized to fit inside current lithium-ion housings, like the Volkswagen Group’s MEB architecture, heading off problems with car-industry architectures that tend to be used for up to 20 years.

“Ours will be the same shape and voltage as the current lithium-ion cells, or we can move to whatever shape is necessary,” Nicol confirmed.

“It’s a direct replacement that charges so fast it’s basically a super capacitor.

“Some lithium-ion cells can’t do more than 1.5-2 amps or you can blow up the battery, but our technology has no theoretical limit.”

Aluminum-ion battery cells are a hotbed of development, particularly for automotive use.

Recent projects alone have included a collaboration between China’s Dalian University of Technology and the University of Nebraska, plus others from Cornell University, Clemson University, the University of Maryland, Stanford University, Zhejiang University’s Department of Polymer Science and the European Alion industrial consortium.

The differences are highly technical, but the GMG cells use graphene made with the company’s proprietary plasma process, rather than graphene derived from traditionally sourced graphite, and the result is three times the energy density of the next-best aluminum-ion cell, from Stanford University.

Stanford’s natural-graphite aluminum-ion technology delivers 68.7 Wh/kg and 41.2 W/kg, while its graphite-foam variant lifts the power density to 3,000 W/kg.

The GMG-UQ battery moves that forward to between 150 and 160 Wh/kg, and to 7,000 W/kg.
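
Lining the reported figures up makes the comparison easy to check. The short calculation below uses only the numbers quoted in this article, taking the midpoint of GMG’s stated range:

```python
# Ratios implied by the energy/power densities quoted in this article.
stanford_wh_per_kg = 68.7        # Stanford natural-graphite Al-ion cell
stanford_foam_w_per_kg = 3000    # Stanford graphite-foam variant

gmg_wh_per_kg = (150 + 160) / 2  # midpoint of GMG-UQ's 150-160 Wh/kg
gmg_w_per_kg = 7000

print(f"energy density: {gmg_wh_per_kg / stanford_wh_per_kg:.1f}x")    # ~2.3x
print(f"power density:  {gmg_w_per_kg / stanford_foam_w_per_kg:.1f}x") # ~2.3x
```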

“They (UQ) found a way to make holes in graphene and a way to store aluminum atoms closer together in the holes.

“If we drill holes the atoms stick inside the graphene and it becomes a whole lot more dense, like a bowling ball on a mattress.”

The peer-reviewed paper in Advanced Functional Materials found that surface-perforated, three-layer graphene (SPG3-400), with “a significant amount of in-plane mesopores (≈2.3 nm) and an extremely low O/C ratio of 2.54%”, demonstrated excellent electrochemical performance.

“This SPG3-400 material exhibits an extraordinary reversible capacity (197 mAh g−1 at 2 A g−1) and outstanding high-rate performance,” it concluded.

Aluminum-ion technology has intrinsic advantages and disadvantages over the preeminent lithium-ion battery technology being used in almost every EV today.

When a cell recharges, aluminum ions return to the negative electrode and can exchange three electrons per ion, against lithium’s limit of just one.
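
That per-ion electron count can be turned into theoretical capacities with Faraday’s constant. The back-of-envelope calculation below uses standard physical constants rather than anything from the article: per gram of metal, lithium’s lightness still wins, but per unit volume aluminum’s three electrons give it roughly four times the charge.

```python
# Theoretical capacity of the bare metal, Q = n*F / (3.6*M) in mAh/g,
# where n = electrons per ion, F = Faraday's constant (C/mol),
# M = molar mass (g/mol). Standard constants, not article figures.
F = 96485.0  # C/mol

def capacity_mah_per_g(n_electrons: int, molar_mass: float) -> float:
    return n_electrons * F / (3.6 * molar_mass)

al = capacity_mah_per_g(3, 26.98)  # aluminum: three electrons per ion
li = capacity_mah_per_g(1, 6.94)   # lithium: one electron per ion

# Volumetric capacity (mAh/cm^3) = gravimetric capacity * density (g/cm^3).
print(f"Al: {al:.0f} mAh/g, {al * 2.70:.0f} mAh/cm^3")   # ~2980 mAh/g, ~8050 mAh/cm^3
print(f"Li: {li:.0f} mAh/g, {li * 0.534:.0f} mAh/cm^3")  # ~3860 mAh/g, ~2060 mAh/cm^3
```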

There is also a massive geopolitical, cost, environmental and recycling advantage from using aluminum-ion cells, because they use hardly any exotic materials.

“It’s basically aluminum foil, aluminum chloride (the precursor to aluminum and it can be recycled), ionic liquid and urea,” Nicol said.

“Ninety percent of world lithium production and purchasing is still through China and 10 percent is through Chile.

“We have all the aluminum we need right here in Australia, and they can be safely made in the first world.”

Listed on the TSX Venture exchange in Canada, GMG hooked itself into UQ’s graphene aluminum-ion battery technology by supplying the university with graphene.

“Our lead product scientist Dr Ashok Nanjundan was involved in the University of Queensland project in its nanotechnology research centre in its early days,” Nicol said, admitting GMG almost “lucked into” the technology by supplying research projects with its graphene at no cost.

GMG has not locked down a supply deal with a major manufacturer or manufacturing facility.

“We are not tied in to big brands yet, but this could go into an Apple iPhone and charge it in seconds,” Nicol confirmed.

“We will bring the coin cell to market first. It recharges in less than a minute, and it has three times the energy of lithium,” the Barcaldine-born Nicol said.

“It’s a lot less adverse effect on health, too. A kid can be killed by lithium if it’s ingested, but not with aluminum.”

Another benefit would be cost. Lithium has risen from US$1460 a metric tonne in 2005 to US$13,000 a tonne this week, while aluminum’s price has edged up from US$1730 to US$2078 over the same period.

Another advantage is that the GMG graphene aluminum-ion cells do not use copper, which costs around US$8470 a tonne.
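
Put side by side, the quoted raw-material prices make the cost argument concrete:

```python
# Price multiples implied by the figures quoted above (US$ per metric tonne).
lithium_2005, lithium_now = 1460, 13000
aluminum_2005, aluminum_now = 1730, 2078

print(f"lithium:  {lithium_now / lithium_2005:.1f}x its 2005 price")    # ~8.9x
print(f"aluminum: {aluminum_now / aluminum_2005:.2f}x its 2005 price")  # ~1.20x
```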

While it is open to manufacturing agreements, GMG’s preferred plan is to “run” with the technology as far as it can, building 10 GW to 50 GW plants first, even if Australia may not be the logical first choice for the manufacturing facility.

It’s not the only Brisbane-based company pushing battery solutions onto the world, either.

PPK Group has a joint venture with Deakin University to develop lithium-sulphur batteries, and the Vecco Group has confirmed a deal with Shanghai Electric for a Brisbane manufacturing plant making vanadium batteries for commercial energy storage.

Physical Activity May Help To Close the Wealth Gap in School Attainment By Improving Self-Control

Children running
source: www.cam.ac.uk

 

Guaranteeing every child the opportunity to participate in certain types of physical activity could support their academic attainment and help to close the achievement gap between wealthy and less-advantaged pupils, new research indicates.

 

In the context of COVID in particular, there may be a real temptation to encourage schools to maximise classroom time to stop children falling behind. This study is saying ‘think again’, because playtime and PE lessons benefit the mind in ways that children really need in order to do their best.

Michelle Ellefson

The study, which analysed data from more than 4,000 children in England, suggests that those who do more physical activity are likely to have stronger ‘self-regulation’ – the ability to keep themselves in check – and in particular may find it easier to control their emotions at an earlier age. Physical activities which promote self-control in this way, such as swimming or ball sports, also have positive, knock-on effects for academic attainment.

This pattern of association, through which physical activity indirectly influences progress at school by supporting self-regulation, was found to be particularly pronounced among disadvantaged children. The authors of the study, which is published in the journal PLoS ONE, suggest that this may in part be because less-advantaged children often have fewer opportunities to participate in organised recreation and sports, and therefore experience stronger benefits when they do so.

The study was the first ever long-term analysis of the connections between physical activity, self-regulation and academic achievement. Researchers used data captured at three stages during childhood and adolescence: ages seven, 11 and 14.

Fotini Vasilopoulos, who led the study while a research student at the Faculty of Education, University of Cambridge, said: “Research examining the links between physical activity and attainment has produced mixed findings, but there is a positive, indirect relationship because of the impact on mental processes like self-control. This may be particularly important for children from families who find it harder to access sports clubs or other forms of physical activity outside school.”

Dr Michelle Ellefson, Reader in Cognitive Science at the Faculty of Education and a co-author, said: “In the context of COVID in particular, there may be a real temptation to encourage schools to maximise classroom time to stop children falling behind. This study is saying ‘think again’, because playtime and PE lessons benefit the mind in ways that children really need in order to do their best.”

The research used a subset of data covering pupils’ physical activity from the Millennium Cohort Study, which is following the lives of around 19,000 young people born between 2000 and 2002 in the UK.

Researchers also used measures of the children’s behavioural regulation (their ability to manage their behaviour to achieve certain goals) and emotional regulation (control over thoughts and feelings). Standardised test scores and teacher reports were used to measure academic attainment, while a survey of standard risk factors, taken when the children were very young, was used to establish socio-economic status.

Vasilopoulos and Ellefson then conducted a statistical analysis in two broad stages. First, they examined the direct relationship between physical activity and self-regulation. Next, they examined how far this had an indirect, knock-on effect on achievement. In both cases, they produced a set of correlations which indicated how strong the relationship was, and whether it was positive or negative.
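
To make the two-stage logic concrete, here is a minimal mediation-style sketch on simulated data, fitted with ordinary least squares. The variable names, effect sizes and model are invented for illustration and do not reproduce the paper’s actual statistics.

```python
# Two-stage sketch: does activity relate to attainment indirectly via
# self-regulation? Simulated data; not the study's actual models.
import numpy as np

rng = np.random.default_rng(1)
n = 4000

activity = rng.normal(size=n)                   # physical activity score
self_reg = 0.3 * activity + rng.normal(size=n)  # stage 1: activity -> self-regulation

# stage 2: attainment depends on self-regulation plus a direct activity term
attainment = 0.5 * self_reg + 0.1 * activity + rng.normal(size=n)

def ols_coefs(X, y):
    """Least-squares coefficients, with an intercept column prepended."""
    X = np.column_stack([np.ones(len(y)), X])
    return np.linalg.lstsq(X, y, rcond=None)[0]

a = ols_coefs(activity, self_reg)[1]                                 # a-path
b = ols_coefs(np.column_stack([activity, self_reg]), attainment)[2]  # b-path

print(f"indirect (mediated) effect a*b = {a * b:.3f}")  # ~0.15 by construction
```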

Overall, children who engaged in more physical activity had better emotional regulation – meaning fewer mood swings or emotional outbursts – although there was no corresponding impact on their behavioural regulation.

When the researchers factored in low socio-economic status, however, this positive association was lost. This suggests the direct connection between physical activity and children’s ability to self-regulate is actually being shaped by advantage and wealth. For example, it may reflect the fact that children from disadvantaged settings are known to struggle with emotional regulation. Equally, less-advantaged children often have fewer opportunities to join sports clubs, to participate in activities like swimming and dance lessons, or to access safe, open spaces for games and exercise.

The indirect pathway through which physical activity, by influencing self-regulation, has knock-on effects on young people’s attainment was found to vary between age groups. At age seven, the researchers found a positive relationship with academic progress through emotional control; by age 11, it was physical activity’s impact on behavioural regulation that principally led to any resulting academic benefits.

In both cases, these effects were measurably stronger when low socio-economic status was taken into account. This may be because physical activity has added value for children who might otherwise experience it less. Research by the Social Mobility Commission has, for example, suggested that 34% of disadvantaged children participate in sport less than once a week, compared with 13% of their better-off counterparts.

“The attainment gap is a really complex problem, but we know that some of it is linked to less-advantaged children having poor self-regulation skills early in childhood,” Vasilopoulos said. “Physical activities that help them to do things like focus on a task or maintain attention could be part of the way to bridge that gap.”

In general, the findings indicate that activities which influence emotional control – such as games that involve co-operation, or encourage children to take responsibility for their actions – could be particularly important during early childhood, while those which shape behavioural control may be more important later on. The authors also suggest that schools could build links with sports clubs to create targeted programmes for children experiencing early disadvantage.

“Even giving children less-structured opportunities to run around outside could be of real developmental importance,” Ellefson added. “We really need to ensure that physical activity does not become an area schools feel they can legitimately sacrifice to drive up academic attainment. It has a crucial part to play.”

Reference:
Fotini Vasilopoulos, Michelle R. Ellefson. ‘Investigation of the associations between physical activity, self-regulation and educational outcomes in childhood.’ PLoS ONE (2021). DOI: 10.1371/journal.pone.0250984



Cambridge Researchers Awarded the Millennium Technology Prize

Cambridge researchers awarded the Millennium Technology Prize

 

British duo Professor Shankar Balasubramanian and Professor David Klenerman have been awarded the Millennium Technology Prize for their development of revolutionary DNA sequencing techniques.

 

University of Cambridge chemists Shankar Balasubramanian and David Klenerman have been jointly awarded the 2020 Millennium Technology Prize, one of the world’s most prestigious science and technology prizes, by Technology Academy Finland (TAF).

The global prize, awarded at two-year intervals since 2004 to highlight the impact of science and innovation on society, is worth €1 million. Of the nine previous winners of the Millennium Technology Prize, three have subsequently gone on to win a Nobel Prize. This is the first time that the prize has been awarded to more than one recipient for the same innovation, celebrating the significance of collaboration. The announcement of the 2020 award was delayed due to the COVID-19 pandemic.

Professors Balasubramanian and Klenerman co-invented Solexa-Illumina Next Generation DNA Sequencing (NGS), technology that has enhanced our basic understanding of life, converting biosciences into ‘big science’ by enabling fast, accurate, low-cost and large-scale genome sequencing – the process of determining the complete DNA sequence of an organism’s make-up. They co-founded the company Solexa to make the technology available to the world.

The technology has had – and continues to have – a transformative impact in the fields of genomics, medicine and biology. One measure of the scale of change is that it has allowed a million-fold improvement in speed and cost when compared to the first sequencing of the human genome. In 2000, sequencing of one human genome took over 10 years and cost more than a billion dollars: today, the human genome can be sequenced in a single day at a cost of $1,000. More than a million human genomes are sequenced at scale each year, thanks to the technology co-invented by Professors Balasubramanian and Klenerman, meaning we can understand diseases much better and much more quickly.
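
The million-fold cost improvement follows directly from the figures quoted:

```python
# Fold-change implied by the cost figures quoted above.
cost_2000, cost_today = 1_000_000_000, 1_000   # dollars per human genome
print(f"cost per genome: {cost_2000 / cost_today:,.0f}x cheaper")  # 1,000,000x
```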

Professor Sir Shankar Balasubramanian FRS from the Yusuf Hamied Department of Chemistry, Cancer Research UK Cambridge Institute and a Fellow of Trinity College, said: “I am absolutely delighted at being awarded the Millennium Technology Prize jointly with David Klenerman, but it’s not just for us, I’m happy on behalf of everyone who has contributed to this work.”

Professor Sir David Klenerman FMedSci FRS from the Yusuf Hamied Department of Chemistry, and a Fellow of Christ’s College, said: “It’s the first time that we’ve been internationally recognised for developing this technology. The idea came from Cambridge and was developed in Cambridge. It’s now used all over the world, so I’m delighted largely for the team of people who worked on this project and contributed to its success.”

Next-generation sequencing involves fragmenting sample DNA into many small pieces that are immobilized on the surface of a chip and locally amplified. Each fragment is then decoded on the chip, base-by-base, using fluorescently coloured nucleotides added by an enzyme. By detecting the colour-coded nucleotides incorporated at each position on the chip with a fluorescence detector – and repeating this cycle hundreds of times – it is possible to determine the DNA sequence of each fragment.

The collected data is then analysed using computer software to assemble the full DNA sequence from the sequence of all these fragments. The NGS method’s ability to sequence billions of fragments in a parallel fashion makes the technique fast, accurate and cost-efficient. The invention of NGS was a revolutionary approach to the understanding of the genetic code in all living organisms.
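
The cycle-by-cycle, chip-wide decoding described above can be shown with a toy example. The colour-to-base mapping and the per-cycle readouts below are invented for illustration; real basecalling involves imaging and error modelling far beyond this sketch.

```python
# Toy sequencing-by-synthesis decode: each cycle reports one colour-coded
# base per fragment, for every fragment on the chip in parallel.
# The colour scheme and readouts are invented for illustration.
COLOUR_TO_BASE = {"red": "A", "green": "C", "blue": "G", "yellow": "T"}

# Per-cycle colour readouts for three fragments (a real run has billions
# of fragments and hundreds of cycles).
chip_cycles = [
    ["red", "green", "blue"],    # cycle 1: first base of every fragment
    ["green", "green", "red"],   # cycle 2
    ["yellow", "blue", "red"],   # cycle 3
]

fragments = ["" for _ in chip_cycles[0]]
for cycle in chip_cycles:
    for i, colour in enumerate(cycle):  # all fragments decoded in parallel
        fragments[i] += COLOUR_TO_BASE[colour]

print(fragments)  # ['ACT', 'CCG', 'GAA'], ready for assembly into a full sequence
```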

Next-generation sequencing provides an effective way to study and identify new coronavirus strains and other pathogens. With the emergence of the COVID-19 pandemic, the technology is now being used to track and explore mutations in the coronavirus. This work has helped the creation of multiple vaccines now being administered worldwide and is critical to the creation of new vaccines against new dangerous viral strains. The results will also be used to prevent future pandemics.

The technology is also allowing scientists and researchers to identify the underlying factors that contribute to individuals’ immune responses to COVID-19. This information is essential to unravelling why some people respond much worse to the virus than others.

NGS technology has revolutionised global biological and biomedical research and has enabled the development of a broad range of related technologies, applications and innovations. Due to its efficiency, NGS is widely adopted in healthcare and diagnostics for areas such as cancer, rare diseases, infectious disease, and sequencing-based non-invasive prenatal testing.

It is increasingly used to define the genetic risk genes for patients with a rare disease and to define new drug targets for disease in defined patient groups. NGS has also contributed to the creation of new and powerful biological therapies like antibodies and gene therapies.

In the field of cancer, NGS is becoming the standard analytical method for defining personalised therapeutic treatment. The technology has dramatically improved our understanding of the genetic basis of many cancers and is often used in clinical tests for early detection and diagnostics, from both tumours and patients’ blood samples.

In addition to medical applications, NGS has also had a major impact on all of biology as it allows the clear identification of thousands of organisms in almost any kind of sample, which is important for agriculture, ecology and biodiversity research.

Academy Professor Päivi Törmä, Chair of the Millennium Technology Prize Selection Committee, said: “The future potential of NGS is enormous and the exploitation of the technology is still in its infancy. The technology will be a crucial element in promoting sustainable development through personalisation of medicine, understanding and fighting killer diseases, and hence improving the quality of life. Professor Balasubramanian and Professor Klenerman are worthy winners of the prize.”

Professor Marja Makarow, Chair of Technology Academy Finland said: “Collaboration is an essential part of ensuring positive change for the future. Next Generation Sequencing is the perfect example of what can be achieved through teamwork and individuals from different scientific backgrounds coming together to solve a problem.

“The technology pioneered by Professor Balasubramanian and Professor Klenerman has also played a key role in helping discover the coronavirus’s sequence, which in turn enabled the creation of the vaccines – itself a triumph for cross-border collaboration – and helped identify new variants of COVID-19.”

Tomorrow (19 May 2021) Professors Balasubramanian and Klenerman will deliver the Millennium Technology Prize Lecture, talking about their innovation, at 14:30 at the Millennium Innovation Forum. The lecture will be available to watch online.



Climate Exp0 – International Climate Conference Being Hosted By UK Universities Ahead of COP26

Dr Emily Shuckburgh at Climate Exp0 2021

source: www.cam.ac.uk

 

Weeklong conference brings together leading scientists, government ministers and experts from around the world to set the agenda ahead of the United Nations Climate Change conference.

 

As we look to emerge from the pandemic and build a more resilient, sustainable future, we must harness the ideas and innovations that will support a cleaner and greener future

Emily Shuckburgh

This week, more than 500 researchers from over 80 UK and Italian Universities will be joining colleagues from 40 countries to contribute to Climate Exp0. Online, free, and open to all, it’s an opportunity to connect policy, academic and student audiences across the globe, and harness the power of virtual collaboration to help advance a resilient, zero-carbon world. It will feature a range of speakers – from policymakers and academics, to practitioners and students.

Organised by the COP26 Universities Network, the conference aims to raise ambition for tangible outcomes from the UN COP26 Climate Change Conference, jointly hosted by the UK and Italy in Glasgow this November.

At a critical point in the COP26 pre-meetings and negotiations – six months prior to the conference itself – Climate Exp0 will showcase the latest thinking and most relevant UK and international research around five key themes: Adaptation and Resilience; Finance; Green Recovery; Mitigation Solutions and Nature-based Solutions.

Dr Emily Shuckburgh, Chair of the COP26 Universities Network and Director of Cambridge Zero, said: “This is a vital moment for the world. As we look to emerge from the pandemic and build a more resilient, sustainable future, we must harness the ideas and innovations that will support a cleaner and greener future. Climate Exp0 is an exhibition of hope and inspiration to encourage the ambitious global climate action that is required.”

Highlights of this week’s conference include:

Monday
Opening of conference
The Rt Hon Alok Sharma, President of COP26, and Minister Roberto Cingolani, Minister for Ecological Transition, Italy

Climate risk: opening session setting out the threat we face (09:30 – 10:30)
Dr Emily Shuckburgh, University of Cambridge; Albert Klein Tank, Met Office; and Tim Benton, Royal Institute of International Affairs, Chatham House

Tuesday
Nature-based solutions and the opportunities they offer (09:00 – 09:30)

Zac Goldsmith, Minister of State for Pacific and the Environment, and Emma Howard Boyd, Chair of the Environment Agency, UK

Wednesday
Policies for delivering COP26 mitigation objectives (16:30 – 17:15)

Manuel Pulgar Vidal, Global Head of Climate and Energy at WWF, former Minister of the Environment of Peru and COP20 President; Jim Watson, Professor of Energy Policy, University College London (UCL) Institute for Sustainable Resources; and Jacob Werksman, Principal Advisor, DG CLIMA, European Commission, Brussels, Belgium

Thursday
Adaptation and resilience challenges in the COP26 meeting (09:30 – 10:30)

Minister Maria Cristina Messa, Italian Ministry of University and Research, and The Rt Hon Anne-Marie Trevelyan MP, Minister of State for Business, Energy and Clean Growth, UK

Friday
Ask a Climate Change expert: How can we save our planet? (17:15 – 18:30)

Tamsin Edwards, Reader in Climate Change, King’s College London; Lord Deben, Chairman of the Committee on Climate Change; and Brighton Kaoma, Global Director of the UN Sustainable Development Solutions Network Youth Initiative.

The conference is a partnership with the Rete delle Università per lo Sviluppo Sostenibile (Italian University Network for Sustainable Development), sponsored by UK Research and Innovation (UKRI), and supported by the UN Climate Change Conference UK 2021 in Partnership with Italy and Cambridge University Press.

 

 

