All posts by Admin

Study In Mice Suggests Drug To Turn Fat ‘Brown’ Could Help Fight Obesity

source: www.cam.ac.uk

Our bodies contain two types of fat: white fat and brown fat. While white fat stores calories, brown fat burns energy and could help us lose weight. Now, scientists at the University of Cambridge have found a way of making the white fat ‘browner’ and increasing the efficiency of brown fat.

There have been a lot of studies that have found molecules that promote brown fat development, but simply increasing the amount of brown fat will not work to treat disease

Toni Vidal-Puig

While their study was carried out in mice, they hope that this finding will translate into humans and provide a potential new drug to help fight obesity.

Obesity is a condition in which individuals accumulate more and more fat until their fat stops functioning. This can lead to diseases such as diabetes. However, not all fat tissue is bad: the fat that accumulates in obesity is known as ‘white fat’, but a second form of fat known as ‘brown fat’ could be used to treat obesity.

Both brown and white fat are made up of fat cells known as adipocytes, but in brown fat, these cells are rich in mitochondria – the ‘batteries’ that power our bodies – which give the tissue its brown colour. Brown fat also contains more blood vessels to allow the body to provide it with oxygen and nutrients.

While white fat stores energy, brown fat burns it in a process known as ‘thermogenesis’. When fully activated, just 100g of brown fat can burn 3,400 calories a day – significantly higher than most people’s daily food intake and more than enough to fight obesity.

We all have some brown fat – or brown adipose tissue, as it is also known – in our bodies, but it is found most abundantly in newborns and in hibernating animals (where the heat produced by brown fat enables them to survive even in freezing temperatures). As we age, the amount of brown fat in our bodies decreases.

Just having more brown fat alone is not enough – the tissue also needs to be activated. Currently, the only ways to activate brown fat are to put people in the cold to mimic hibernation, which is both impractical and unpleasant, or to treat them with drugs known as adrenergic agonists, but these can cause heart attacks. It is also necessary to increase the number of blood vessels in the tissue to carry nutrients to the fat cells and the number of nerve cells to allow the brain to ‘switch on’ the tissue.

In 2012, a team led by Professor Toni Vidal-Puig from the Wellcome Trust-MRC Institute of Metabolic Science, University of Cambridge, identified a molecule known as BMP8b that regulates the activation of brown fat in both the brain and the body’s tissues. They showed that deleting the gene in mice that produces this protein stopped brown fat from functioning.

Now, in a study published today in the journal Nature Communications, an international team of researchers led by Professor Vidal-Puig has shown that increasing the amount of BMP8b that mice produce increases the function of their brown fat. This implies that BMP8b, which is found in the blood, could potentially be used as a drug both to increase the amount of brown fat in humans and to make it more active. Further research will be necessary to demonstrate if this is the case.

To carry out their research, the team used mice that had been bred to produce higher levels of the protein in adipose tissue. As anticipated, they found that increasing BMP8b levels changed some of the white fat into brown fat, a process known as ‘beiging’, and thus increased the amount of energy burnt by the tissue.

They showed that higher levels of BMP8b make the tissue more sensitive to adrenergic signals from nerves – the same pathway targeted by adrenergic agonist drugs. This may allow lower doses of these drugs to be used to activate brown fat in people, hence reducing their risk of heart attack.

Unexpectedly, but importantly, the team also found that the molecule increased the amount of blood vessels and nerves in brown fat.

“There have been a lot of studies that have found molecules that promote brown fat development, but simply increasing the amount of brown fat will not work to treat disease – it has to be able to get enough nutrients and be turned on,” says Professor Vidal-Puig, lead author of the study.

Co-author Dr Sam Virtue, also from the Institute of Metabolic Science, adds: “It’s like taking a one litre engine out of a car and sticking in a two litre engine in its place. In theory the car can go quicker, but if you only have a tiny fuel pipe to the engine and don’t connect the accelerator pedal it won’t do much good. BMP8b increases the engine size, and fits a new fuel line and connects up the accelerator!”

The research was funded by the British Heart Foundation, Medical Research Council, European Research Council, WHRI-Academy and Wellcome.

Reference
Pellegrinelli, V et al. Adipocyte-secreted BMP8b mediates adrenergic-induced remodeling of the neurovascular network in adipose tissue. Nature Communications; 26 Nov 2018; DOI: 10.1038/s41467-018-07453-x


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. Images, including our videos, are Copyright ©University of Cambridge and licensors/contributors as identified.  All rights reserved. We make our image and video content available in a number of ways – as here, on our main website under its Terms and conditions, and on a range of channels including social media that permit your use and sharing of our content under their respective Terms.

Brexit and Trump Voters More Likely to Believe in Conspiracy Theories, Survey Study Shows

source: www.cam.ac.uk

Latest research reveals the extent to which conspiracy theories have become “mainstream rather than marginal beliefs” across much of Europe and the US.

These findings provide important clues to understanding the popularity of populist and nationalist parties

Hugo Leal

The largest cross-national study ever conducted on conspiracy theories suggests that around a third of people in countries such as the UK and France think their governments are “hiding the truth” about immigration, and that voting for Brexit and Trump is associated with a wide range of conspiratorial beliefs – from science denial to takeover plots by Muslim migrants.

The research, conducted as part of the University of Cambridge’s Conspiracy & Democracy project, and based on survey work from the YouGov-Cambridge centre, covers nine countries – US, Britain*, Poland, Italy, France, Germany, Portugal, Sweden, Hungary – and will be presented at a public launch in Cambridge on Friday 23 November.

According to project researcher Dr Hugo Leal, anti-immigration conspiracy theories have been “gaining ground” since the refugee crisis first came to prominence in 2015. “The conspiratorial perception that governments are deliberately hiding the truth about levels of migration appears to be backed by a considerable portion of the population across much of Europe and the United States,” he said.

In Hungary, where controversial Prime Minister Viktor Orban is regularly accused of stoking anti-migrant sentiment, almost half of respondents (48%) believe their government is hiding the truth about immigration. Germany was the next highest (35%), with France (32%), Britain (30%) and Sweden (29%) also showing high levels of belief in this conspiracy theory, as did a fifth (21%) of respondents in the United States.

Close to half of respondents who voted for Brexit (47%) and Trump (44%) believe their government is hiding the truth about immigration, compared with just 14% of Remain voters and 12% of Clinton voters.

The researchers also set out to measure the extent of belief in a conspiracy theory known as ‘the great replacement’: the idea that Muslim immigration is part of a bigger plan to make Muslims the majority of a country’s population.

“Originally formulated in French far-right circles, the widespread belief in a supposedly outlandish nativist conspiracy theory known as the ‘great replacement’ is an important marker and predictor of the Trump and Brexit votes,” said Leal. Some 41% of Trump voters and 31% of Brexit voters subscribed to this theory, compared with 3% of Clinton voters and 6% of Remain voters.

Researchers also looked at a number of other popular conspiracy theories. Both Trump and Brexit voters were more likely to believe that climate change is a hoax, vaccines are harmful, and that a group of people “secretly control events and rule the world together”. “We found the existence of a conspiratorial worldview linking both electorates,” said Leal.

He describes the levels of science denial as an “alarming global trend”. In general, researchers found the idea that climate change is a hoax to be far more captivating for right-wing respondents, while scepticism about vaccines was less determined by “ideological affiliation”.

The view that “the truth about the harmful effects of vaccines is being deliberately hidden from the public” ranged from lows of 10% in Britain to a startling quarter of the population – some 26% – in France.

The conspiracy belief that a secret cabal “control events and rule the world together” varies significantly between European countries such as Portugal (42%) and Sweden (12%). Dr Hugo Drochon, also a researcher on the Leverhulme Trust-funded Conspiracy & Democracy project, suggests this has “public policy implications, because there are structural issues at play here too”.

“More unequal countries with a lower quality of democracy tend to display higher levels of belief in the world cabal, which suggests that conspiracy beliefs can also be addressed at a more ‘macro’ level,” said Drochon.

The research team assessed the levels of “conspiracy scepticism” by looking at those who refuted every conspiratorial view in the study. Sweden had the healthiest levels of overall conspiracy scepticism, with 48% rejecting every conspiracy put to them. The UK also had a relatively strong 40% rejection of all conspiracies. Hungary had the lowest, with just 15% of people not taken in by any conspiracy theories.
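As a rough illustration, the scepticism measure described above counts a respondent as a sceptic only if they reject every conspiracy statement put to them. The sketch below uses invented data, not the survey's:

```python
# Illustrative sketch of the "conspiracy scepticism" measure: a
# respondent is a sceptic only if they endorse none of the conspiracy
# statements. The respondents below are made up for illustration.

respondents = [
    {"id": 1, "endorsed": [False, False, False]},  # rejects all: sceptic
    {"id": 2, "endorsed": [True, False, False]},   # endorses one: not a sceptic
    {"id": 3, "endorsed": [False, False, False]},  # rejects all: sceptic
]

sceptics = [r for r in respondents if not any(r["endorsed"])]
rate = len(sceptics) / len(respondents)
print(f"{rate:.0%} rejected every conspiracy put to them")  # prints "67% ..."
```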

Half of both Remain and Clinton voters were conspiracy sceptics, while 29% of Brexit voters and just 16% of Trump voters rejected all conspiracy theories.

The question of trust, and which professions the public see as trustworthy, was also investigated by researchers. Government and big business came out worst across all countries included in the study. Roughly three-quarters of respondents in Italy, Portugal, Poland, Hungary and Britain said they distrusted government ministers and company CEOs. Distrust of journalists, trade unionists, senior officials of the EU, and religious leaders was also high in all surveyed countries.

Trust in academics, however, was still relatively high, standing at 57% in the US and 64% in Britain. “We hope these findings can provide incentive for academics to reclaim a more active role in the public sphere, particularly when it comes to illuminating the differences between verifiable truths and demonstrable falsehoods,” said Hugo Leal.

Apart from academics, only family and friends escape the general climate of distrust, with trust reaching levels between 80% and 90% in all countries. Leal argues that this might help explain the credibility assigned to “friend mediated” online social networks.

In all surveyed countries apart from Germany, about half the respondents got their news from social media, with Facebook the preferred platform, followed by YouTube. Getting news from social media was less likely to be associated with complete scepticism of conspiracy theories – much less so in countries such as the US and Italy.

Researchers found that consuming news from YouTube in particular was associated with the adoption of particular conspiratorial views, such as anti-vaccine beliefs in the US and climate change denial in Britain.

“A telling takeaway of the study is that conspiracy theories are, nowadays, mainstream rather than marginal beliefs,” said Leal. “These findings provide important clues to understanding the popularity of populist and nationalist parties contesting elections across much of the western world.”

The survey was conducted by YouGov during 13-23 August 2018, with a total sample size of 11,523 adults and results then weighted to be “representative of each market”.

* Northern Ireland was not included in the survey.



Top Ten Universities For Animal Research Announced

source: www.cam.ac.uk

Understanding Animal Research, an organisation promoting greater openness about animal research, has today released a list of the ten universities in the UK that conduct the highest number of animal procedures – those used in medical, veterinary and scientific research. These statistics are freely available on the universities’ websites as part of their ongoing commitment to transparency and openness.

The figures show that the ten institutions collectively conducted over one third of all UK animal research in 2017. All ten universities appear in the QS 2018 World University Ranking Top 200 and seven appear in the Top 50.

The top ten institutions conducted 1.32 million procedures, 35% of the 3.79 million procedures conducted in Great Britain in 2017. Over 99% of these procedures were carried out on rodents or fish, and in line with national data they were almost evenly split between experimental work and the breeding of genetically modified animals.

The ten universities are listed below alongside the total number of procedures that they carried out in 2017. Each institution’s name links to its animal research webpage which includes more detailed statistics. This is the third year in a row universities have come together to publicise their collective numbers and examples of their research.

Institution Number of Procedures
University of Oxford 236,429
University of Edinburgh 225,366
University College London 214,570
University of Cambridge 157,975
King’s College London 121,741
University of Manchester 104,863
University of Sheffield 83,300
Imperial College London 79,492
Cardiff University 46,743
University of Glasgow 46,045
TOTAL  1,316,524

All universities are committed to the ‘3Rs’ of replacement, reduction and refinement. This means avoiding or replacing the use of animals where possible, minimising the number of animals used per experiment, and optimising the experience of the animals to improve animal welfare. However, as universities expand and conduct more research, the total number of animals used can rise even if fewer animals are used per study.

All ten universities are signatories to the Concordat on Openness on Animal Research in the UK, a commitment to be more open about the use of animals in scientific, medical and veterinary research in the UK. Over 120 organisations have signed the concordat including UK universities, charities, research funders and commercial research organisations.

Wendy Jarrett, Chief Executive of Understanding Animal Research, which developed the Concordat on Openness, said: “The Concordat has fostered a culture of openness at research institutions up and down the country. Institutions now provide an unprecedented level of information about how and why they conduct medical, veterinary and scientific research using animals. Almost two-thirds of the university Concordat signatories provide their animal numbers openly on their websites – accounting for almost 90% of all animal research at UK universities.”

Animal research at Cambridge

The University of Cambridge has received two Openness Awards for its films about animal research. The first, Fighting Cancer, was a behind-the-scenes tour of one of its animal facilities, explaining how mice are used to study cancer and featuring images of mice with tumours and undergoing procedures. Its second film, Understanding the OCD Brain, looked at how both animal and human studies are vital to exploring a distressing mental health condition and included footage of its marmoset facility. Researchers from the University regularly speak about their work at its annual Science Festival.

“Animals are used in research at the University of Cambridge in a wide variety of experiments designed to develop new treatments for humans and animals,” says Dr Martin Vinnell, Director of University Biomedical Services at the University of Cambridge. “However, as seen from the additional data collection undertaken for the first time in 2018, just under 17% of the animals used at Cambridge were used in scientific procedures which did not require a Home Office licence – instead these animals were humanely killed and, by applying new and often cutting-edge technologies, their cells and tissues were used in experiments.”

Find out more about animal research at Cambridge

Adapted from a press release by Understanding Animal Research



Spitting Image Archives Donated to Cambridge University

  • 14 November 2018
  • source: https://www.bbc.co.uk/news/uk-england-cambridgeshire-46143333
Image caption: Spitting Image co-creator and caricaturist Roger Law with the donated puppet of Margaret Thatcher

The co-creator of TV satire series Spitting Image is donating his entire archive to Cambridge University.

The collection – including original scripts, puppet moulds, drawings and recordings – will be conserved and held in the library.

Spitting Image parodied political leaders, celebrities and royals over 18 series, and was broadcast by ITV from 1984 to 1996.

Roger Law said the material would be “in the right place, it’s come home”.

Among the archives is a rubber puppet of former Prime Minister Margaret Thatcher, caricatured with a wide-eyed stare and prominent nose.

Image caption: The TV show parodied big names from the 1980s and 1990s, including members of the Royal Family (image copyright: Toby Melville/PA)

No-one in her cabinet or opposition was immune to the show’s satirical scrutiny and exaggerated impressions.

The often-controversial programme also featured prominent sports stars and celebrities, as well as senior members of the Royal Family.

Much of the donated collection has been kept in boxes at Mr Law’s home, or in “three sea containers out in the Cambridgeshire Fens”.

Image caption: An original sketch of footballer and pundit Jimmy Greaves
Image caption: A pencil sketch of French footballer Eric Cantona

Mr Law, who studied at the Cambridge School of Art and began his association with co-creator Peter Fluck in the city, said the university’s library was the best place for the collection.

“I was hoping the banks of the Ouse would break and it would all go into the North Sea but this is better,” he said.

“I also thought the show would die a death because no-one had done it before. I never thought it would be like a heater in the corner of the room, gently warming your knees.

“I knew people would react to it, like Marmite.”

Image caption: An excerpt from the 1984 pilot script for Spitting Image
Image caption: The collection includes original scripts from the show’s 18-series run

The body of donated work comprises every script from the show, including that of a 1985 pilot that was never broadcast.

There are thousands of visual images, as well as individual sketches, magazines and books and more than 400 videos.

Image caption: World leaders and TV personalities were not immune from the satire (image copyright: Ian Ellis/EMPICS)

The University’s library is home to some of the world’s most important public records, including the original work and correspondence of Charles Darwin, and the papers of Sir Isaac Newton.

Librarian Dr Jessica Gardner described the collection as a “national treasure”.

“Spitting Image was anarchic, it was creative, it entered the public imagination like nothing else from that era,” she said.

“It is an extraordinary political and historical record. Great satire holds up a mirror, it questions and challenges.”

Image caption: The collection includes dozens of photographic slides, including this one of Spitting Image characters as boxers
Image caption: This slide shows former Labour leader Neil Kinnock’s puppet with a long nose

Mr Law’s wife, archivist Deirdre Amsden, listed and organised every item in the donated collection over the course of five years.

Mr Law said he was not a fan of puppetry but it was a “means to an end”.

“The great thing about Spitting Image was that there were writers, puppet makers and puppeteers but there were no stars,” he said.

“You could say the things you wanted to say with knobs on with puppets – like Mr Punch.

“Puppets have no agents, they don’t answer back, and you could put them in a cupboard. Great.”

Image caption: Spitting Image puppets of (left to right) Paddy Ashdown, Neil Kinnock and John Major

The anarchic show that pulled a punch – and 15 million viewers

  • Spitting Image was created by caricaturists Peter Fluck, Roger Law and Martin Lambie-Nairn
  • At its height it pulled in an audience of 15 million viewers
  • It was nominated for nine BAFTA Television Awards (winning two) and four Emmys in 1985 and 1986
  • Much of Margaret Thatcher’s Cabinet was parodied, with Douglas Hurd depicted with “Mr Whippy ice cream” hair, and her successor John Major caricatured as a grey, dull puppet with a penchant for peas
  • World leaders were also stereotyped, with Mikhail Gorbachev’s forehead birthmark redrawn as a hammer and sickle
  • The series was axed in 1996 because of declining audiences

Gaia Spots a ‘Ghost’ Galaxy Next Door

source: www.cam.ac.uk

The Gaia satellite has spotted an enormous ‘ghost’ galaxy lurking on the outskirts of the Milky Way.

When we looked closer, it turned out we found something new

Vasily Belokurov

An international team of astronomers, including from the University of Cambridge, discovered the massive object when trawling through data from the European Space Agency’s Gaia satellite. The object, named Antlia 2 (or Ant 2), has avoided detection until now thanks to its extremely low density as well as a perfectly-chosen hiding place, behind the shroud of the Milky Way’s disc. The researchers have published their results online today.

Ant 2 is known as a dwarf galaxy. As structures emerged in the early Universe, dwarfs were the first galaxies to form, and so most of their stars are old, low-mass and metal-poor. But compared to the other known dwarf satellites of our Galaxy, Ant 2 is immense: it is as big as the Large Magellanic Cloud (LMC), and a third the size of the Milky Way itself.

What makes Ant 2 even more unusual is how little light it gives out. Compared to the LMC, another satellite of the Milky Way, Ant 2 is 10,000 times fainter. In other words, it is either far too large for its luminosity or far too dim for its size.

“This is a ghost of a galaxy,” said Gabriel Torrealba, the paper’s lead author. “Objects as diffuse as Ant 2 have simply not been seen before. Our discovery was only possible thanks to the quality of the Gaia data.”

The ESA’s Gaia mission has produced the richest star catalogue to date, including high-precision measurements of nearly 1.7 billion stars and revealing previously unseen details of our home Galaxy. Earlier this year, Gaia’s second data release made new details of stars in the Milky Way available to scientists worldwide.

The researchers behind the current study – from Taiwan, the UK, the US, Australia and Germany – searched the new Gaia data for Milky Way satellites by using RR Lyrae stars. These stars are old and metal-poor, typical of those found in a dwarf galaxy. RR Lyrae change their brightness with a period of half a day and can be located thanks to these well-defined pulses.

“RR Lyrae had been found in every known dwarf satellite, so when we found a group of them sitting above the Galactic disc, we weren’t totally surprised,” said co-author Vasily Belokurov from Cambridge’s Institute of Astronomy. “But when we looked closer at their location on the sky it turned out we found something new, as no previously identified object came up in any of the databases we searched through.”

The team contacted colleagues at the Anglo-Australian Telescope (AAT) in Australia, but when they checked the coordinates for Ant 2, they realised they had a limited window of opportunity to get follow-up data. They were able to measure the spectra of more than 100 red giant stars just before the Earth’s motion around the Sun rendered Ant 2 unobservable for months.

The spectra enabled the team to confirm that the ghostly object they spotted was real: all the stars were moving together. Ant 2 never comes too close to the Milky Way, always staying at least 40 kiloparsecs (about 130,000 light-years) away. The researchers were also able to obtain the galaxy’s mass, which was much lower than expected for an object of its size.

“The simplest explanation of why Ant 2 appears to have so little mass today is that it is being taken apart by the Galactic tides of the Milky Way,” said co-author Sergey Koposov from Carnegie Mellon University. “What remains unexplained, however, is the object’s giant size. Normally, as galaxies lose mass to the Milky Way’s tides, they shrink, not grow.”

If it is impossible to puff the dwarf up by removing matter from it, then Ant 2 had to have been born huge. The team has yet to figure out the exact process that made Ant 2 so extended. While objects of this size and luminosity have not been predicted by current models of galaxy formation, recently it has been speculated that some dwarfs could be inflated by vigorous star formation. Stellar winds and supernova explosions would push away the unused gas, weakening the gravity that binds the galaxy and allowing the dark matter to drift outward as well.

“Even if star formation could re-shape the dark matter distribution in Ant 2 as it was put together, it must have acted with unprecedented efficiency,” said co-author Jason Sanders, also from Cambridge.

Alternatively, Ant 2’s low density could mean that a modification to the dark matter properties is needed. The currently favoured theory predicts dark matter to pack tightly in the centres of galaxies. Given how fluffy the new dwarf appears to be, a dark matter particle which is less keen to cluster may be required.

“Compared to the rest of the 60 or so Milky Way satellites, Ant 2 is an oddball,” said co-author Matthew Walker, also from Carnegie Mellon University. “We are wondering whether this galaxy is just the tip of an iceberg, and the Milky Way is surrounded by a large population of nearly invisible dwarfs similar to this one.”

The gap between Ant 2 and the rest of the Galactic dwarfs is so wide that this may well be an indication that some important physics is missing in the models of dwarf galaxy formation. Solving the Ant 2 puzzle may help researchers understand how the first structures in the early Universe emerged. Finding more objects like Ant 2 will show just how common such ghostly galaxies are, and the team is busy looking for other similar galaxies in the Gaia data.

Reference: 
G. Torrealba et al. ‘The hidden giant: discovery of an enormous Galactic dwarf satellite in Gaia DR2.’ arXiv:1811.04082



Over Half a Million People Take Part In Largest Ever Study of Psychological Sex Differences and Autistic Traits

 

source: www.cam.ac.uk

Scientists at the University of Cambridge have completed the world’s largest ever study of typical sex differences and autistic traits. They tested and confirmed two long-standing psychological theories: the Empathising-Systemising theory of sex differences and the Extreme Male Brain theory of autism.

Big data is important to draw conclusions that are replicable and robust. This is an example of how scientists can work with the media to achieve big data science

David Greenberg

Working with the television production company Channel 4, they tested over half a million people, including over 36,000 autistic people. The results are published today in the Proceedings of the National Academy of Sciences.

The Empathising-Systemising theory predicts that women, on average, will score higher than men on tests of empathy, the ability to recognize what another person is thinking or feeling, and to respond to their state of mind with an appropriate emotion. Similarly, it predicts that men, on average, will score higher on tests of systemising, the drive to analyse or build rule-based systems.

The Extreme Male Brain theory predicts that autistic people, on average, will show a masculinised shift on these two dimensions: namely, that they will score lower than the typical population on tests of empathy and will score the same as if not higher than the typical population on tests of systemising.

Whereas both theories have been confirmed in previous studies of relatively modest samples, the new findings come from a massive sample of 671,606 people, which included 36,648 autistic people. They were replicated in a second sample of 14,354 people. In this new study, the scientists used very brief 10-item measures of empathy, systemising, and autistic traits.

Using these short measures, the team identified that in the typical population, women, on average, scored higher than men on empathy, and men, on average, scored higher than women on systemising and autistic traits. These sex differences were reduced in autistic people. On all these measures, autistic people’s scores, on average, were ‘masculinised’: that is, they had higher scores on systemising and autistic traits and lower scores on empathy, compared to the typical population.

The team also calculated the difference (or ‘d-score’) between each individual’s score on the systemising and empathy tests. A high d-score means a person’s systemising is higher than their empathy, and a low d-score means their empathy is higher than their systemising.
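As a simple illustration, the d-score described above is just the per-person difference between the two test scores. The sketch below uses hypothetical 0–10 scores; the study itself used standardised 10-item measures:

```python
# Illustrative d-score: the difference between an individual's
# systemising score and their empathy score. A positive value means
# systemising exceeds empathy; a negative value means the reverse.
# The example scores are hypothetical.

def d_score(systemising: float, empathy: float) -> float:
    return systemising - empathy

print(d_score(8, 4))  # systemising-shifted profile: prints 4
print(d_score(3, 9))  # empathy-shifted profile: prints -6
```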

They found that in the typical population, men, on average, had a shift towards a high d-score, whereas women, on average, had a shift towards a low d-score. Autistic individuals, on average, had a shift towards an even higher d-score than typical males. Strikingly, d-scores accounted for 19 times more of the variance in autistic traits than other variables, including sex.

Finally, men, on average, had higher autistic trait scores than women. Those working in STEM (Science, Technology, Engineering and Mathematics), on average, had higher systemising and autistic traits scores than those in non-STEM occupations. And conversely, those working in non-STEM occupations, on average, had higher empathy scores than those working in STEM.

In the paper, the authors discuss how it is important to bear in mind that differences observed in this study apply only to group averages, not to individuals. They underline that these data say nothing about an individual based on their gender, autism diagnosis, or occupation. To do that would constitute stereotyping and discrimination, which the authors strongly oppose.

Further, the authors reiterate that the two theories are applicable to only two dimensions of typical sex differences: empathy and systemising. They do not apply to all sex differences, such as aggression, and to extrapolate the theories beyond these two dimensions would be a misinterpretation.

Finally, the authors highlight that although autistic people on average struggle with ‘cognitive’ empathy – recognising other people’s thoughts and feelings – they nevertheless have intact ‘affective’ empathy: they care about others. It is a common misunderstanding that autistic people struggle with all forms of empathy.

Dr Varun Warrier, from the Cambridge team, said: “These sex differences in the typical population are very clear. We know from related studies that individual differences in empathy and systemising are partly genetic, partly influenced by our prenatal hormonal exposure, and partly due to environmental experience. We need to investigate the extent to which these observed sex differences are due to each of these factors, and how these interact.”

Dr David Greenberg, from the Cambridge team, said: “Big data is important to draw conclusions that are replicable and robust. This is an example of how scientists can work with the media to achieve big data science.”

Dr Carrie Allison, from the Cambridge team, said: “We are grateful to both the general public and to the autism community for participating in this research. The next step must be to consider the relevance of these findings for education, and support where needed.”

Professor Simon Baron-Cohen, Director of the Autism Research Centre at Cambridge who proposed these two theories nearly two decades ago, said: “This research provides strong support for both theories. This study also pinpoints some of the qualities autistic people bring to neurodiversity. They are, on average, strong systemisers, meaning they have excellent pattern-recognition skills, excellent attention to detail, and an aptitude in understanding how things work. We must support their talents so they achieve their potential – and society benefits too.”

This study was supported by the Autism Research Trust, the Medical Research Council, Wellcome, and the Templeton World Charity Foundation, Inc. It was conducted in association with the NIHR CLAHRC for Cambridgeshire and Peterborough NHS Foundation Trust, and the NIHR Cambridge Biomedical Research Centre.

Reference
Greenberg, DM et al. Testing the Empathizing-Systemising theory of sex differences and the Extreme Male Brain theory of autism in half a million people. PNAS; 12 Nov 2018; DOI: 10.1073/pnas.1811032115

If you’d like to complete these measures and participate in studies at the Autism Research Centre, please register here.


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. Images, including our videos, are Copyright ©University of Cambridge and licensors/contributors as identified.  All rights reserved. We make our image and video content available in a number of ways – as here, on our main website under its Terms and conditions, and on a range of channels including social media that permit your use and sharing of our content under their respective Terms.

Ancient DNA Analysis Unlocks Secrets of Ice Age Tribes in The Americas

source: www.cam.ac.uk

Scientists have sequenced 15 ancient genomes spanning from Alaska to Patagonia and were able to track the movements of the first humans as they spread across the Americas at “astonishing” speed during the last Ice Age, and also how they interacted with each other in the following millennia.

Our study proves that Spirit Cave and Lagoa Santa were actually genetically closer to contemporary Native Americans than to any other ancient or contemporary group sequenced to date

Eske Willerslev

The results have been published in the journal Science as part of a wide-ranging international study, led by the University of Cambridge, which genetically analysed the DNA of a series of well-known and controversial ancient remains across North and South America.

The research also uncovered a puzzling Australasian genetic signal in the 10,400-year-old Lagoa Santa remains from Brazil, revealing a previously unknown group of early South Americans – but the Australasian link left no genetic trace in North America.

Additionally, a legal battle over a 10,600-year-old skeleton – the ‘Spirit Cave Mummy’ – has ended after advanced DNA sequencing found it was related to a Native American tribe. This allowed the researchers to dismiss the longstanding ‘Paleoamerican’ hypothesis, first proposed in the 19th century, that a distinct group of Paleoamericans existed in North America before Native Americans.

“Spirit Cave and Lagoa Santa were very controversial because they were identified as so-called ‘Paleoamericans’ based on craniometry – it was determined that the shape of their skulls was different to current-day Native Americans,” said Professor Eske Willerslev, who holds positions at the Universities of Cambridge and Copenhagen, and led the study. “Our study proves that Spirit Cave and Lagoa Santa were actually genetically closer to contemporary Native Americans than to any other ancient or contemporary group sequenced to date.”

The scientific and cultural significance of the Spirit Cave remains, which were found in 1940 in a small rocky alcove in the Great Basin Desert, was not properly understood for 50 years. The preserved remains of the man in his forties were initially believed to be between 1,500 and 2,000 years old, but during the 1990s new textile and hair testing dated the skeleton at 10,600 years old.

The Fallon Paiute-Shoshone Tribe, a group of Native Americans based in Nevada near Spirit Cave, claimed cultural affiliation with the skeleton and requested immediate repatriation of the remains.

Their request was refused and the tribe sued the US government, a lawsuit that pitted tribal leaders against anthropologists, who argued the remains provided invaluable insights into North America’s earliest inhabitants and should continue to be displayed in a museum.

The deadlock continued for 20 years until the tribe agreed that Professor Willerslev could, for the first time, carry out genome sequencing on DNA extracted from the Spirit Cave remains.

“I assured the tribe that my group would not do the DNA testing unless they gave permission, and it was agreed that if Spirit Cave was genetically a Native American the mummy would be repatriated to the tribe,” said Professor Willerslev, who is a Fellow of St John’s College.

The team extracted DNA from the inside of the skull, proving that the skeleton was an ancestor of present-day Native Americans. Spirit Cave was returned to the tribe in 2016 and a private reburial ceremony took place earlier this year. The tribe were kept informed throughout the two-year project, and two members visited the lab in Copenhagen to meet the scientists and were present when the DNA samples were taken.

The genome of the Spirit Cave skeleton has wider significance because it not only settled the legal and cultural dispute between the tribe and the Government, it also helped reveal how ancient humans moved and settled across the Americas. The scientists were able to track the movement of populations from Alaska to as far south as Patagonia. These populations often separated from each other and took their chances travelling in small, isolated groups.

Dr David Meltzer, from the Department of Anthropology, Southern Methodist University, Dallas, said: “A striking thing about the analysis of Spirit Cave and Lagoa Santa is their close genetic similarity which implies their ancestral population travelled through the continent at astonishing speed. That’s something we’ve suspected due to the archaeological findings, but it’s fascinating to have it confirmed by the genetics. These findings imply that the first peoples were highly skilled at moving rapidly across an utterly unfamiliar and empty landscape. They had a whole continent to themselves and they were travelling great distances at speed.”

The study also revealed surprising traces of Australasian ancestry in ancient South American Native Americans but no Australasian genetic link was found in North American Native Americans.

Dr Victor Moreno-Mayar, from the Centre for GeoGenetics, University of Copenhagen and first author of the study, said: “We discovered the Australasian signal was absent in Native Americans prior to the Spirit Cave and Lagoa Santa population split which means groups carrying this genetic signal were either already present in South America when Native Americans reached the region, or Australasian groups arrived later. That this signal has not been previously documented in North America implies that an earlier group possessing it had disappeared or a later arriving group passed through North America without leaving any genetic trace.”

Dr Peter de Barros Damgaard, from the Centre for GeoGenetics, University of Copenhagen, explained why scientists remain puzzled but optimistic about the Australasian ancestry signal in South America. He explained: “If we assume that the migratory route that brought this Australasian ancestry to South America went through North America, either the carriers of the genetic signal came in as a structured population and went straight to South America where they later mixed with new incoming groups, or they entered later. At the moment we cannot resolve which of these might be correct, leaving us facing extraordinary evidence of an extraordinary chapter in human history! But we will solve this puzzle.”

The population history during the millennia that followed initial settlement was far more complex than previously thought. The peopling of the Americas had previously been depicted as a simple series of north-to-south population splits, with little to no interaction between groups after their establishment.

The new genomic analysis presented in the study has shown that around 8,000 years ago, Native Americans were on the move again, but this time from Mesoamerica into both North and South America.

Researchers found traces of this movement in the genomes of all present-day indigenous populations in South America for which genomic data is available to date.

Dr Moreno-Mayar added: “The older genomes in our study not only taught us about the first inhabitants in South America but also served as a baseline for identifying a second stream of genetic ancestry, which arrived from Mesoamerica in recent millennia and that is not evident from the archaeological record. These Mesoamerican peoples mixed with the descendants of the earliest South Americans and gave rise to most contemporary groups in the region.”

Reference: 
J. Victor Moreno-Mayar et al. ‘Early human dispersals within the Americas.’ Science (2018). DOI: 10.1126/science.aav2621

Adapted from a St John’s College press release.

Inset image: Skulls and other human remains from P.W. Lund’s Collection from Lagoa Santa, Brazil. Kept in the Natural History Museum of Denmark. Credit: Natural History Museum of Denmark



Selective Amnesia: How Rats and Humans Are Able To Actively Forget Distracting Memories

source: www.cam.ac.uk

Our ability to selectively forget distracting memories is shared with other mammals, suggests new research from the University of Cambridge. The discovery that rats and humans share a common active forgetting ability – and in similar brain regions – suggests that the capacity to forget plays a vital role in adapting mammalian species to their environments, and that its evolution may date back at least to the time of our common ancestor.

Quite simply, the very act of remembering is a major reason why we forget, shaping our memory according to how it is used

Michael Anderson

The human brain is estimated to contain some 86 billion neurons (nerve cells) and as many as 150 trillion synaptic connections, making it a powerful machine for processing and storing memories. We need to retrieve these memories to carry out our daily tasks, whether remembering where we left the car in the supermarket car park or recalling the name of someone we meet in the street. But the sheer scale of the experiences we could store in memory over a lifetime creates the risk of being overwhelmed with information. When we come out of the supermarket and think about where we left the car, for example, we only need to recall where we parked it today, not every single time we came to do our shopping.

Previous work by Professor Michael Anderson at the Medical Research Council Cognition and Brain Sciences Unit, University of Cambridge, showed that humans possess the ability to actively forget distracting memories, and that retrieval plays a crucial role in this process. His group has shown how intentional recall of a past memory is more than simply reawakening it; it actually leads us to forget other competing experiences that interfere with retrieval of the memory we seek.

“Quite simply, the very act of remembering is a major reason why we forget, shaping our memory according to how it is used,” says Professor Anderson.

“People are used to thinking of forgetting as something passive. Our research reveals that people are more engaged than they realise in actively shaping what they remember of their lives. The idea that the very act of remembering can cause forgetting is surprising and could tell us more about people’s capacity for selective amnesia.”

While this process improves the efficiency of memory, it can sometimes lead to problems. If the police interview a witness to a crime, for example, their repeated questioning about selected details might lead the witness to forget information that could later prove important.

Although the ability to actively forget has been seen in humans, it is unclear whether it occurs in other species. Could this ability be unique to our species, or at least to more intelligent mammals such as monkeys and great apes?

In a study published today in the journal Nature Communications, Professor Anderson together with Pedro Bekinschtein and Noelia Weisstaub of Universidad Favaloro in Argentina, has shown that the ability to actively forget is not a peculiarly human characteristic: rats, too, share our capacity for selective forgetting and use a very similar brain mechanism, suggesting this is an ability shared among mammals.

To demonstrate this, the researchers devised an ingeniously simple task based on rats’ innate sense of curiosity: when put into an environment, rats actively explore to learn more about it. When exploring an environment, rats form memories of any new objects they find and investigate.

Building on this simple observation, the researchers allowed rats to explore two previously-unseen objects (A and B) in an open arena – the objects included a ball, a cup, small toys, or a soup can.  Rats first got to explore object A for five minutes, and then were removed from the arena; they were then placed back in the arena 20 minutes later with object B, which they also explored for five minutes.

To see whether rats showed retrieval-induced forgetting, like humans, rats next performed “retrieval practice” on one of the two objects (e.g. A) to see how this affected their later memory for the competitor object (B). During this retrieval practice phase, the researchers repeatedly placed the rat in the arena with the object they wanted the rat to remember (e.g. A), together with another object never seen in the context of the arena. Rats instinctively prefer exploring novel objects, and so on these “retrieval practice” trials, the rats clearly preferred to explore the new objects, implying that they indeed had remembered A and saw it as “old news”.

To find out how repeatedly retrieving A affected rats’ later memory for B, in a final phase conducted 30 minutes later, the researchers placed the rat into the arena with B and an entirely new object.  Strikingly, on this final test, the rats explored both B and the new object equally – by selectively remembering their experience with A over and over, rats had actively trained themselves to forget B.

In contrast, in control conditions in which the researchers skipped the retrieval practice phase and replaced it with an equal amount of relaxing time in the rats’ home cage, or an alternative memory storage task not involving retrieval, rats showed excellent memory for B.

Professor Anderson’s team then identified an area towards the front of the rat’s brain that controls this active forgetting mechanism. When a region at the front of the rat’s brain known as the medial prefrontal cortex was temporarily ‘switched off’ using the drug muscimol, the animal entirely lost its ability to selectively forget competing memories; despite undergoing the same “retrieval practice” task as before, rats now recognised B. In humans, the ability to selectively forget in this manner involves engaging an analogous region in the prefrontal cortex.

“Rats appear to have the same active forgetting ability as humans do – they forget memories selectively when those memories cause distraction,” says Professor Anderson. “And, crucially, they use a similar prefrontal control mechanism as we do. This discovery suggests that this ability to actively forget less useful memories may have evolved far back on the ‘Tree of Life’, perhaps as far back as our common ancestor with rodents some 100 million years ago.”

Professor Anderson says that now that we know that the brain mechanisms for this process are similar in rats and humans, it should be possible to study this adaptive forgetting phenomenon at a cellular – or even molecular – level. A better understanding of the biological foundations of these mechanisms may help researchers develop improved treatments to help people forget traumatic events.

The research was funded by the Medical Research Council, the National Agency of Scientific and Technological Promotion of Argentina and the International Brain Research Organization.

Reference
Bekinschtein, P, et al. A Retrieval-Specific Mechanism of Adaptive Forgetting in the Mammalian Brain. Nature Communications; 7 Nov 2018; DOI: 10.1038/s41467-018-07128-7



New Efficiency Record Set For Perovskite LEDs

source: www.cam.ac.uk

Researchers have set a new efficiency record for LEDs based on perovskite semiconductors, rivalling that of the best organic LEDs (OLEDs).

Compared to OLEDs, which are widely used in high-end consumer electronics, the perovskite-based LEDs, developed by researchers at the University of Cambridge, can be made at much lower costs, and can be tuned to emit light across the visible and near-infrared spectra with high colour purity.

The researchers have engineered the perovskite layer in the LEDs to show close to 100% internal luminescence efficiency, opening up future applications in display, lighting and communications, as well as next-generation solar cells.

These perovskite materials are of the same type as those found to make highly efficient solar cells that could one day replace commercial silicon solar cells. While perovskite-based LEDs have already been developed, they have not been nearly as efficient as conventional OLEDs at converting electricity into light.

Earlier hybrid perovskite LEDs, first developed by Professor Sir Richard Friend’s group at the University’s Cavendish Laboratory four years ago, were promising, but losses from the perovskite layer, caused by tiny defects in the crystal structure, limited their light-emission efficiency.

Now, Cambridge researchers from the same group and their collaborators have shown that by forming a composite layer of the perovskites together with a polymer, it is possible to achieve much higher light-emission efficiencies, close to the theoretical efficiency limit of thin-film OLEDs. Their results are reported in the journal Nature Photonics.

“This perovskite-polymer structure effectively eliminates non-emissive losses, the first time this has been achieved in a perovskite-based device,” said Dr Dawei Di from Cambridge’s Cavendish Laboratory, one of the corresponding authors of the paper. “By blending the two, we can basically prevent the electrons and positive charges from recombining via the defects in the perovskite structure.”

The perovskite-polymer blend used in the LED devices, known as a bulk heterostructure, is made of two-dimensional and three-dimensional perovskite components and an insulating polymer. When an ultra-fast laser is shone on the structures, pairs of electric charges that carry energy move from the 2D regions to the 3D regions in a trillionth of a second: much faster than earlier layered perovskite structures used in LEDs. Separated charges in the 3D regions then recombine and emit light extremely efficiently.

“Since the energy migration from 2D regions to 3D regions happens so quickly, and the charges in the 3D regions are isolated from the defects by the polymer, these mechanisms prevent the defects from getting involved, thereby preventing energy loss,” said Di.

“The best external quantum efficiencies of these devices are higher than 20% at current densities relevant to display applications, setting a new record for perovskite LEDs, which is a similar efficiency value to the best OLEDs on the market today,” said Baodan Zhao, the paper’s first author.

While perovskite-based LEDs are beginning to rival OLEDs in terms of efficiency, they still need better stability if they are to be adopted in consumer electronics. When perovskite-based LEDs were first developed, they had a lifetime of just a few seconds. The LEDs developed in the current research have a half-life close to 50 hours – a huge improvement in just four years, but still nowhere near the lifetimes required for commercial applications, which will demand an extensive industrial development programme. “Understanding the degradation mechanisms of the LEDs is a key to future improvements,” said Di.

The research was funded by the Engineering and Physical Sciences Research Council (EPSRC) and the European Research Council (ERC).

Reference:
Baodan Zhao et al. ‘High-efficiency perovskite-polymer bulk heterostructure light-emitting diodes.’ Nature Photonics (2018). DOI: 10.1038/s41566-018-0283-4



Observation of Blood Vessel Cells Changing Function Could Lead To Early Detection of Blocked Arteries

source: www.cam.ac.uk

A study in mice has shown that it may be possible to detect the early signs of atherosclerosis, which leads to blocked arteries, by looking at how cells in our blood vessels change their function.

The muscle cells that line the blood vessels have long been known to multi-task. While their main function is pumping blood through the body, they are also involved in ‘patching up’ injuries in the blood vessels. Overzealous switching of these cells from the ‘pumping’ to the ‘repair’ mode can lead to atherosclerosis, resulting in the formation of ‘plaques’ in the blood vessels that block the blood flow.

Using state-of-the-art genomics technologies, an interdisciplinary team of researchers based in Cambridge and London has caught a tiny number of vascular muscle cells in mouse blood vessels in the act of switching and described their molecular properties. The researchers used an innovative methodology known as single-cell RNA-sequencing, which allowed them to track the activity of most genes in the genome in hundreds of individual vascular muscle cells.

Their findings, published today in Nature Communications, could pave the way for detecting the ‘switching’ cells in humans, potentially enabling the diagnosis and treatment of atherosclerosis at a very early stage in the future.

Atherosclerosis can lead to potentially serious cardiovascular diseases such as heart attack and stroke. Although there are currently no treatments that reverse atherosclerosis, lifestyle interventions such as improved diet and increased exercise can reduce the risk of the condition worsening; early detection can minimise this risk.

“We knew that although these cells in healthy tissues look similar to each other, they are actually quite a mixed bag at the molecular level,” explains Dr Helle Jørgensen, a group leader at the University of Cambridge’s Division of Cardiovascular Medicine, who co-directed the study. “However, when we got the results, a very small number of cells in the vessel really stood out. These cells lost the activity of typical muscle cell genes to various degrees, and instead expressed a gene called Sca1 that is best known to mark stem cells, the body’s ‘master cells’.”

The ability to detect the activity (or ‘expression’) of thousands of genes in parallel in these newly-discovered cells has been a game-changer, say the researchers.

“Single-cell RNA-sequencing has allowed us to see that in addition to Sca1, these cells expressed a whole set of other genes with known roles in the switching process,” says Lina Dobnikar, a computational biologist based at Babraham Institute and joint first author on the study. “While these cells did not necessarily show the properties of fully-switched cells, we could see that we caught them in the act of switching, which was not possible previously.”

To confirm that these unusual cells originated from muscle cells, the team used another new technology, known as lineage labelling, which allowed the researchers to trace the history of a gene’s expression in each cell.

“Even when the cells have entirely shut down muscle cell genes, lineage labelling demonstrated that at some point either they or their ancestors were indeed the typical muscle cells,” says Annabel Taylor, a cell biologist in Jørgensen’s lab and joint first author on the study.

Knowing the molecular profile of these unusual cells has made it possible to study their behaviour in disease. Researchers have confirmed that these cells become much more numerous in damaged blood vessels and in atherosclerotic plaques, as would be expected from switching cells.

“We were fortunate in that single-cell RNA-sequencing technologies had been rapidly evolving while we were working on the project,” says Dr Mikhail Spivakov, a genomics biologist and group leader at MRC London Institute of Medical Sciences, who co-directed the study with Jørgensen. Dr Spivakov carried out the work while he was a group leader at the Babraham Institute. “When we started out, looking at hundreds of cells was the limit, but for the analysis of atherosclerotic plaques we really needed thousands. By the time we got to doing this experiment, it was already possible.”

In the future, the findings by the team may pave the way for catching atherosclerosis early and treating it more effectively.

“Theoretically, seeing an increase in the numbers of switching cells in otherwise healthy vessels should raise an alarm,” says Jørgensen. “Likewise, knowing the molecular features of these cells may help selectively target them with specific drugs. However, it is still early days. Our study was done in mice, where we could obtain large numbers of vascular muscle cells and modify their genomes for lineage labelling. Additional research is still required to translate our results into human cells first and then into the clinic.”

The research was funded by the British Heart Foundation and UK Research and Innovation.

Reference
Dobnikar, L, Taylor, AL et al. Disease-relevant transcriptional signatures identified in individual smooth muscle cells from healthy vessels. Nature Communications; 1 Nov 2018; DOI: 10.1038/s41467-018-06891-x



Multi-Million Pound Initiative From Microsoft To Support AI Research At Cambridge

source: www.cam.ac.uk

The University of Cambridge is joining with Microsoft to help tackle the problem of ‘brain drain’ in AI and machine learning research.

By working together with industry on issues such as how best to use AI and machine learning, we can not only help solve complex issues for industry, but continue to support world-leading research and train the next generation of leaders in the field

Andy Neely

As part of the Microsoft Research – Cambridge University Machine Learning Initiative, Microsoft will help increase AI and machine learning research capacity and capability at Cambridge by supporting visiting researchers, postdoctoral researchers, PhD students and interns from the UK, EU and beyond.

The new Initiative builds on more than two decades of collaboration between the University and Microsoft Research Cambridge, and will be based in the University’s Department of Engineering. It will be formally announced today at the Microsoft Future Decoded Conference in London.

AI and machine learning have the potential to revolutionise how we interact with the world, but before these technologies can be widespread and used in industries such as healthcare, education and transportation, there are complex problems that need to be solved.

A shortage of skills in AI and machine learning, particularly at PhD level and above, has led many large tech companies to recruit from academia, leaving a shortage of research and teaching capacity at universities.

“By focusing on a two-way collaborative initiative for long-term growth, not short-term gain, we are taking a different approach to this problem. We are working with universities to build up AI and machine learning talent and research in the UK,” said Chris Bishop, Lab Director, Microsoft Research Cambridge. “Our researchers regularly work together on projects with global impact, and this initiative will help to build on the already strong links between the University of Cambridge and Microsoft.”

“Cambridge has a culture of ideas going back and forth between industry and academia, and this agreement with Microsoft is a prime example,” said Professor Andy Neely, Pro-Vice-Chancellor for Enterprise and Business Relations at Cambridge. “By working together with industry on issues such as how best to use AI and machine learning, we can not only help solve complex issues for industry, but continue to support world-leading research and train the next generation of leaders in the field.”

Earlier this year the Government and the AI sector agreed a Sector Deal to further boost the UK’s global reputation as a leader in developing AI technologies, ensuring the UK remains a go-to destination for AI innovation and investment.

Secretary of State for Digital, Culture, Media and Sport, Jeremy Wright, said: “The UK is a beacon for international talent and at the forefront of emerging technologies because of the ideas developed in our world-leading universities.

“This new collaboration between Microsoft and Cambridge University will help us continue to develop home-grown AI talent and supports the government’s modern Industrial Strategy and £1 billion AI sector deal. It is crucial that we do all we can to capitalise on our global advantage in this technology.”

Business Secretary Greg Clark said: “The UK has an unmatched heritage in AI and its application in emerging sectors and technologies.

“This partnership between one of the world’s leading universities and the technology developer Microsoft is a great example of collaboration between business and academia. The UK’s leading research and innovation base is driving forward our modern Industrial Strategy, supported by the biggest increase in public research and development investment in the UK’s history.”



Studies Raise Questions Over How Epigenetic Information Is Inherited

source: www.cam.ac.uk

Evidence has been building in recent years that our diet, our habits or traumatic experiences can have consequences for the health of our children – and even our grandchildren. The explanation that has gained most currency for how this occurs is so-called ‘epigenetic inheritance’ – patterns of chemical ‘marks’ on or around our DNA that are hypothesised to be passed down the generations. But new research from the University of Cambridge suggests that this mechanism of non-genetic inheritance is likely to be very rare.

There’s been a lot of excitement and hype surrounding the extent to which our epigenetic information is passed on to subsequent generations, but our work suggests that it’s not as pervasive as was previously thought

Tessa Bertozzi

A second study, also from Cambridge, suggests, however, that one way that environmental effects are passed on may in fact be through molecules produced from the DNA known as RNA that are found in a father’s sperm.

The mechanism by which we inherit innate characteristics from our parents is well understood: we inherit half of our genes from our mother and half from our father. However, the mechanism whereby a ‘memory’ of the parent’s environment and behaviour might be passed down through the generations is not understood.

Epigenetic inheritance has proved a compelling and popular explanation. The human genome is made up of DNA – our genetic blueprint. But our genome is complemented by a number of ‘epigenomes’ that vary by cell type and developmental time point.  Epigenetic marks are attached to our DNA and dictate in part whether a gene is on or off, influencing the function of the gene. The best understood epigenetic modification is DNA methylation, which places a methyl group on one of the bases of DNA (the A, C, G or T that make up our genetic code).

One model in which DNA methylation is associated with epigenetic inheritance is a mouse mutant called Agouti Viable Yellow. The coat of this mouse can be completely yellow, completely brown, or a pattern of these two colours – yet, remarkably, despite their different coat colours, the mice are genetically identical.

The explanation of how this occurs lies with epigenetics. Next to one of the key genes for coat colour lies a section of genetic code known as a ‘transposable element’ – a small mobile DNA ‘cassette’ that is actually repeated many times in the mouse genome but here acts to regulate the coat colour gene.

As many of these transposable elements come from external sources – for example, from a virus’s genome – they could be dangerous to the host’s DNA. But organisms have evolved a way of controlling their movement through methylation, which is most often a silencing epigenetic mark.

In the case of the gene for coat colour, if methylation switches off the transposable element completely, the mouse will be brown; if acquisition of methylation fails completely, the mouse will be yellow. But this does not affect the genetic code itself, just the epigenetic landscape of that DNA segment.

And yet, a yellow-coated female is more likely to have yellow-coated offspring and a brown-coated female is more likely to have brown-coated offspring. In other words, the epigenetically regulated behaviour of the transposable element is somehow being inherited from parent to offspring.

A team led by Professor Anne Ferguson-Smith at Cambridge’s Department of Genetics set out to examine this phenomenon in more detail, asking whether similar variably-methylated transposable elements existed elsewhere that could influence a mouse’s traits, and whether the ‘memory’ of these methylation patterns could be passed from one generation to the next. Their results are published in the journal Cell.

The researchers found that while these transposable elements were common throughout the genome – transposable elements comprise around 40% of a mouse’s total genome – the vast majority were completely silenced by methylation and hence had no influence on genes.

Only around one in a hundred of these sequences were variably-methylated. Some of these are able to regulate nearby genes, whereas others may have the ability to regulate genes located further away in the genome in a long-range capacity.

When the team looked at the extent to which the methylation patterns on these regions could be passed down to subsequent generations, only one of the six regions they studied in detail showed evidence of epigenetic inheritance – and even then, the effect size was small. Furthermore, only methylation patterns from the mother, not the father, were passed on.

“One might have assumed that all the variably-methylated elements we identified would show memory of parental epigenetic state, as is observed for coat colour in Agouti Viable Yellow mice,” says Tessa Bertozzi, a PhD candidate and one of the study’s first authors. “There’s been a lot of excitement and hype surrounding the extent to which our epigenetic information is passed on to subsequent generations, but our work suggests that it’s not as pervasive as was previously thought.”

“In fact, what we showed was that methylation marks at these transposable elements are reprogrammed from one generation to the next,” adds Professor Ferguson-Smith. “There’s a mechanism that removes methylation from the vast majority of the genome and puts it back on again, once in the process of generating eggs and sperm and again before the fertilised egg implants into the uterus. How the methylation patterns at the regions we have identified get reconstructed after this genome-wide erasure remains something of a mystery.

“We know there are some genes – imprinted genes, for example – that do not get reprogrammed in this way in the early embryo. But these are exceptions, not the rule.”

Professor Ferguson-Smith says that there is evidence that some environmentally-induced information can somehow be passed down generations. For example, her studies in mice show that the offspring of a mother who is undernourished during pregnancy are at increased risk of type 2 diabetes and obesity – and their offspring will in turn go on to be obese and diabetic. Again, she showed that DNA methylation was not the culprit – so how does this occur?

Every sperm is scarred?

The answer may come from research at the Wellcome/Cancer Research UK Gurdon Institute, also at the University of Cambridge, in collaboration with the lab of Professor Isabelle Mansuy from the University of Zürich and Swiss Federal Institute of Technology. In a study carried out in mice and published in the journal Molecular Psychiatry, they report how the ‘memory’ of early life trauma can be passed down to the next generation via RNA molecules carried by sperm.

Dr Katharina Gapp, from Eric Miska’s lab at the Gurdon Institute, and the Mansuy lab have previously shown that trauma in postnatal life increases the risk of behavioural and metabolic disorders not only in the directly exposed individuals but also in their subsequent offspring.

Now, the team has shown that the trauma can cause alterations in ‘long RNA’ (RNA molecules containing more than 200 nucleotides) in the father’s sperm and that these contribute to the inter-generational effect. This complements earlier research that found alterations in ‘short RNA’ molecules (with fewer than 200 nucleotides) in the sperm. RNA is a molecule that serves a number of functions, including, for some of the long versions called messenger RNA, ‘translating’ DNA code into functional proteins and regulating functions within cells.

Using a set of behavioural tests, the team showed that specific effects on the resulting offspring mediated by long RNA included risk-taking, increased insulin sensitivity and overeating, whereas small RNA conveyed the depressive-like behaviour of despair.

Dr Gapp said: “While other research groups have recently shown that small RNAs contribute to inheritance of the effects of chronic stress or changes in nutrition, our study indicates that long RNA can also contribute to transmitting some of the effects of early life trauma. We have added another piece to the puzzle for potential interventions in transfer of information down the generations.”

References
Kazachenka, A, Bertozzi, TM et al. Identification, Characterization, and Heritability of Murine Metastable Epialleles: Implications for Non-genetic Inheritance. Cell; 25 Oct 2018; DOI: 10.1016/j.cell.2018.09.043

Gapp K et al. Alterations in sperm long RNA contribute to the epigenetic inheritance of the effects of postnatal trauma. Molecular Psychiatry; 30 Oct 2018; DOI: 10.1038/s41380-018-0271-6


Researcher Profile: Tessa Bertozzi

Epigenetics has become something of a buzzword in recent years. It is the study of chemical modifications to DNA that switch genes on and off without changing the underlying DNA sequence. But what particularly excites interest is the extent to which these modifications, which can be altered by our environment – our diet, our behaviour, for example – can be inherited alongside DNA.

“The unknowns far outweigh the knowns in the young field of epigenetics, which is part of what makes it such an exciting time,” explains Tessa Bertozzi, a PhD student in the lab of Professor Anne Ferguson-Smith at Cambridge.

Tessa grew up in Mexico before moving to Seattle, Washington and then to southern California. “I came across Anne’s research in one of my undergraduate courses and found it fascinating. I contacted her soon after that and four years later I’m a final-year PhD student in her lab at Cambridge!”

Professor Ferguson-Smith’s lab has recently identified regions of the mouse genome that show different methylation levels across genetically identical mice. Tessa focuses on the mechanisms underlying the reconstruction of this epigenetic variation across generations.

“I conduct breeding experiments with mice and use specialised sequencing technologies to look at their DNA methylation patterns. While I am often found at the bench or analysing data on my computer, I also spend time developing ideas at meetings, seminars, and conferences, as well as participating in outreach activities.”

Cambridge has been a hub for epigeneticists for a while now, she says. “It is very motivating to be surrounded by like-minded researchers eager to interact and collaborate. In fact, my PhD has relied heavily on a number of collaborations across Cambridge.

“The University attracts academics from all over the world, making it a vibrant international community of people with different backgrounds and experiences. I have met and interacted with incredibly interesting people over the years.”



New Efficiency Record Set For Perovskite LEDs

source: www.cam.ac.uk

Researchers have set a new efficiency record for LEDs based on perovskite semiconductors, rivalling that of the best organic LEDs (OLEDs).

Compared to OLEDs, which are widely used in high-end consumer electronics, the perovskite-based LEDs, developed by researchers at the University of Cambridge, can be made at much lower costs, and can be tuned to emit light across the visible and near-infrared spectra with high colour purity.

The researchers have engineered the perovskite layer in the LEDs to show close to 100% internal luminescence efficiency, opening up future applications in display, lighting and communications, as well as next-generation solar cells.

These perovskite materials are of the same type as those found to make highly efficient solar cells that could one day replace commercial silicon solar cells. While perovskite-based LEDs have already been developed, they have not been nearly as efficient as conventional OLEDs at converting electricity into light.

Earlier hybrid perovskite LEDs, first developed by Professor Sir Richard Friend’s group at the University’s Cavendish Laboratory four years ago, were promising, but losses from the perovskite layer, caused by tiny defects in the crystal structure, limited their light-emission efficiency.

Now, Cambridge researchers from the same group and their collaborators have shown that by forming a composite layer of the perovskites together with a polymer, it is possible to achieve much higher light-emission efficiencies, close to the theoretical efficiency limit of thin-film OLEDs. Their results are reported in the journal Nature Photonics.

“This perovskite-polymer structure effectively eliminates non-emissive losses, the first time this has been achieved in a perovskite-based device,” said Dr Dawei Di from Cambridge’s Cavendish Laboratory, one of the corresponding authors of the paper. “By blending the two, we can basically prevent the electrons and positive charges from recombining via the defects in the perovskite structure.”

The perovskite-polymer blend used in the LED devices, known as a bulk heterostructure, is made of two-dimensional and three-dimensional perovskite components and an insulating polymer. When an ultra-fast laser is shone on the structures, pairs of electric charges that carry energy move from the 2D regions to the 3D regions in a trillionth of a second: much faster than earlier layered perovskite structures used in LEDs. Separated charges in the 3D regions then recombine and emit light extremely efficiently.

“Since the energy migration from 2D regions to 3D regions happens so quickly, and the charges in the 3D regions are isolated from the defects by the polymer, these mechanisms prevent the defects from getting involved, thereby preventing energy loss,” said Di.

“The best external quantum efficiencies of these devices are higher than 20% at current densities relevant to display applications, setting a new record for perovskite LEDs, which is a similar efficiency value to the best OLEDs on the market today,” said Baodan Zhao, the paper’s first author.

While perovskite-based LEDs are beginning to rival OLEDs in terms of efficiency, they still need better stability if they are to be adopted in consumer electronics. When perovskite-based LEDs were first developed, they had a lifetime of just a few seconds. The LEDs developed in the current research have a half-life close to 50 hours: a huge improvement in just four years, but still far short of the lifetimes required for commercial applications, which will demand an extensive industrial development programme. “Understanding the degradation mechanisms of the LEDs is key to future improvements,” said Di.

The research was funded by the Engineering and Physical Sciences Research Council (EPSRC) and the European Research Council (ERC).

Reference:
Baodan Zhao et al. ‘High-efficiency perovskite-polymer bulk heterostructure light-emitting diodes.’ Nature Photonics (2018). DOI: 10.1038/s41566-018-0283-4



Cambridge Partners in New €1 billion European Quantum Flagship


source: www.cam.ac.uk

The University of Cambridge is a partner in the €1 billion Quantum Flagship, an EU-funded initiative to develop quantum technologies across Europe.

The Flagships are the largest and most transformative investments in research of the European Union, and will cement the EU leadership in future and emerging technologies

Andrea Ferrari

The Quantum Flagship, which is being officially launched today in Vienna, is one of the most ambitious long-term research and innovation initiatives of the European Commission. It is funded under the Horizon 2020 programme, and will have a budget of €1 billion over the next ten years.

The Quantum Flagship is the third large-scale research and innovation initiative of this kind funded by the European Commission, after the Graphene Flagship – of which the University of Cambridge is a founding partner – and the Human Brain Project. The Quantum Flagship work in Cambridge is being coordinated by Professor Mete Atature of the Cavendish Laboratory and Professor Andrea Ferrari, Director of the Cambridge Graphene Centre.

Quantum technologies take advantage of the ability of particles to exist in more than one quantum state at a time. A quantum computer could enable us to make calculations that are well out of reach of even the most powerful supercomputers, while quantum secure communication could power ‘unhackable’ networks made safe by the laws of physics.

The long-term research goal is the so-called quantum web, where quantum computers, simulators and sensors are interconnected via quantum networks, distributing information and quantum resources such as coherence and entanglement.

The potential performance increase resulting from quantum technologies may yield unprecedented computing power, guarantee data privacy and communication security, and provide ultra-high precision synchronisation and measurements for a range of applications available to everyone, locally and in the cloud.

The new Quantum Flagship will bring together academic and industrial partners, with over 500 researchers working on solving these problems, and help turn the results into technological opportunities that can be taken up by industry.

In close partnership with universities and companies in the UK, Italy, Spain and Sweden, Cambridge will develop layered quantum materials and devices for scalable integrated photonic circuits, with applications in quantum communication and networks.

Cambridge is investigating and refining layered semiconductors just a few atoms thick, based on materials known as transition metal dichalcogenides (TMDs). Certain TMDs contain quantum light sources that can emit single photons of light, which could be used in quantum computing and sensing applications.

These quantum light emitters occur randomly in layered materials, as is the case for most other material platforms. Over the past three years, the Cambridge researchers have developed a technique to obtain large-scale arrays of these quantum emitters in different TMDs and on a variety of substrates, establishing a route to build quantum networks on compact chips. The Cambridge team has also shown how to electrically control emission from these devices.

Additionally, the researchers have found that TMDs can support complex quasi-particles, called quintons. Quintons could be a source of entangled photons – particles of light which are intrinsically linked, no matter how far apart they are – if they can be trapped in quantum emitters.

These findings are the basis of the work being done in the Quantum Flagship, aimed at the development of scalable on-chip devices for quantum integrated photonic circuits, to enable secure quantum communications and quantum sensing applications.

“Our goal is to bring some of the amazing properties of the layered materials platform into the quantum technologies realm for a number of applications,” said Atature. “Achieving compact integrated quantum photonic circuits is a challenge pursued globally and our patented layered materials technology offers solutions to this challenge. This is a great project that combines quantum physics, optoelectronics and materials science to produce technology for the future.”

“Quantum technology is a key investment area for Europe, and layered materials show great promise for the generation and manipulation of quantum light for future technological advances,” said Ferrari. “The Graphene Flagship led the way for these large European Initiatives, and we are pleased to be part of the new Quantum Flagship. The Flagships are the largest and most transformative investments in research of the European Union, and will cement the EU leadership in future and emerging technologies.”

Andrus Ansip, Commission Vice-President for the Digital Single Market, said: “Europe is determined to lead the development of quantum technologies worldwide. The Quantum Technologies Flagship project is part of our ambition to consolidate and expand Europe’s scientific excellence. If we want to unlock the full potential of quantum technologies, we need to develop a solid industrial base making full use of our research.”

Inset images: Mete Atature and Andrea Ferrari; Artist’s impression of on-chip quantum photonics architecture with single photon sources and nonlinear switches on optical waveguides, credit Matteo Barbone.



3D ‘Organ On a Chip’ Could Accelerate Search For New Disease Treatments


source: www.cam.ac.uk

Researchers have developed a three-dimensional ‘organ on a chip’ which enables real-time continuous monitoring of cells, and could be used to develop new treatments for disease while reducing the number of animals used in research.

Two-dimensional cell models have served the scientific community well, but we now need to move to three-dimensional cell models in order to develop the next generation of therapies

Róisín Owens

The device, which incorporates cells inside a 3D transistor made from a soft sponge-like material inspired by native tissue structure, gives scientists the ability to study cells and tissues in new ways. By enabling cells to grow in three dimensions, the device more accurately mimics the way that cells grow in the body.

The researchers, led by the University of Cambridge, say their device could be modified to generate multiple types of organs – a liver on a chip or a heart on a chip, for example – ultimately leading to a body on a chip which would simulate how various treatments affect the body as whole. Their results are reported in the journal Science Advances.

Traditionally, biological studies were (and still are) done in petri dishes, where specific types of cells are grown on a flat surface. While many of the medical advances made since the 1950s, including the polio vaccine, have originated in petri dishes, these two-dimensional environments do not accurately represent the native three-dimensional environments of human cells, and can, in fact, lead to misleading information and failures of drugs in clinical trials.

“Two-dimensional cell models have served the scientific community well, but we now need to move to three-dimensional cell models in order to develop the next generation of therapies,” said Dr Róisín Owens from Cambridge’s Department of Chemical Engineering and Biotechnology, and the study’s senior author.

“Three-dimensional cell cultures can help us identify new treatments and know which ones to avoid if we can accurately monitor them,” said Dr Charalampos Pitsalidis, a postdoctoral researcher in the Department of Chemical Engineering & Biotechnology, and the study’s first author.

3D cell and tissue cultures are an emerging field of biomedical research, enabling scientists to study the physiology of human organs and tissues in ways that have not been possible before. However, while these 3D cultures can be generated, technology that accurately assesses their functionality in real time has not been well developed.

“The majority of the cells in our body communicate with each other by electrical signals, so in order to monitor cell cultures in the lab, we need to attach electrodes to them,” said Dr Owens. “However, electrodes are pretty clunky and difficult to attach to cell cultures, so we decided to turn the whole thing on its head and put the cells inside the electrode.”

The device which Dr Owens and her colleagues developed is based on a ‘scaffold’ of a conducting polymer sponge, configured into an electrochemical transistor. The cells are grown within the scaffold and the entire device is then placed inside a plastic tube through which the necessary nutrients for the cells can flow. The use of the soft, sponge electrode instead of a traditional rigid metal electrode provides a more natural environment for cells and is key to the success of organ on chip technology in predicting the response of an organ to different stimuli.

Other organ on a chip devices need to be completely taken apart in order to monitor the function of the cells, but since the Cambridge-led design allows for real-time continuous monitoring, it is possible to carry out longer-term experiments on the effects of various diseases and potential treatments.

“With this system, we can monitor the growth of the tissue, and its health in response to external drugs or toxins,” said Pitsalidis. “Apart from toxicology testing, we can also induce a particular disease in the tissue, and study the key mechanisms involved in that disease or discover the right treatments.”

The researchers plan to use their device to develop a ‘gut on a chip’ and attach it to a ‘brain on a chip’ in order to study the relationship between the gut microbiome and brain function as part of the IMBIBE project, funded by the European Research Council.

The researchers have filed a patent for the device in France.

Reference:
C. Pitsalidis et al. ‘Transistor in a tube: a route to three-dimensional bioelectronics.’ Science Advances (2018). DOI: 10.1126/sciadv.aat4253

Researcher profile: Dr Charalampos Pitsalidis

Dr Charalampos Pitsalidis is a postdoctoral researcher in the Department of Chemical Engineering & Biotechnology, where he develops prototypes of miniaturised platforms that can be integrated with advanced cell cultures for drug screening. A physicist with materials science background, he collaborates with biologists and chemists, in the UK and around the world, in order to develop and test drug screening platforms to help reduce the number of animals used in research.

“Animal studies remain the main means of drug screening in the later stages of drug development; however, they are increasingly questioned on grounds of ethics, cost and relevance. The reduction of animals in research is what motivates my work.

“I hope that one day I will have made a small contribution to accelerating the drug discovery pipeline and towards the replacement, reduction and refinement of animal research,” he said. “I believe that in 2018, we have everything in our hands – huge technological advancements – and all we need is to develop better and more predictive tools for assessing various therapies. It is not impossible; it just requires a systematic and highly collaborative approach across multiple disciplines.”

He calls Cambridge a truly inspiring place to work. “The state-of-the-art facilities and world-class infrastructure with cutting-edge equipment allow us to conduct high-quality research,” he said. “On top of that, the highly collaborative environment among the various groups and the various departments support multidisciplinary research endeavours and well-balanced research. The strong university and entrepreneurial ecosystem in both high tech and biological science makes Cambridge an ideal place for innovative research in my field.”




Brexit: the three transition options open to the UK

source: www.cam.ac.uk

Will the UK agree to an extended transition period, keeping it bound by EU rules for longer after exiting the EU? Here, Professor Kenneth Armstrong outlines three “potential models” to extend the transition period, as explored in his new research paper published today.

A perpetual transition would be politically unacceptable… and conflict with EU law. It would, therefore, need an exit mechanism

Kenneth Armstrong

For some time now, both the United Kingdom and European Union have agreed that once the UK ceases to be a Member State on 29 March 2019, it will enter into a ‘stand-still’ period – during which the UK will continue to be bound by its existing EU obligations.

The rationale behind this is to avoid a ‘cliff-edge’ departure that would see tariffs and regulatory controls imposed on cross-border trade between the UK and the EU.

To the extent that there has been disagreement between the two sides, it has been on terminology – the EU refers to this as a ‘transition period’ while the UK insists on calling it an ‘implementation period’ – and on duration – the UK sought a two-year period, whereas the EU was only willing to agree a transition that would end on 31 December 2020 (coinciding with the end of the current budgetary ‘multi-annual framework’). The UK agreed to the EU’s offer of a transition ending in December 2020.

However, the duration of the transition period has come back to the fore of the negotiations for two reasons.

The UK believes that the issue of how to avoid a hard border on the island of Ireland can only properly be resolved in the context of the negotiations on the future economic relationship. The UK had hoped that this might be negotiated in parallel with the withdrawal arrangements.

However, the EU has insisted that it is only the framework for future cooperation that can be discussed in the context of the withdrawal negotiations, meaning that the terms of a future economic relationship can only be agreed once the UK leaves. As long as the UK is in transition, the issue of frontier controls on the island of Ireland does not arise.

But with the transitional period ending at the end of 2020, EU negotiators have insisted on the need for a ‘backstop’ to ensure that, if transition ends without a deal that meets the commitments made in the 2017 Joint Report, a ‘hard border’ in Ireland will be avoided. It is the failure to reach agreement on a backstop which is making negotiators on both sides reconsider a time-limited transition period.

The second reason is that the pace of negotiations, coupled with deep disagreement over the UK Government’s ‘Chequers Plan’, suggests that the transition period as currently conceived will be too short for negotiations on a future relationship to be concluded. Taken together with the backstop issue, minds have turned to whether it would be prudent to extend transition.

In a recent European Policy Centre paper, Tobias Lock and Fabian Zuleeg make a strong case for the extension of transition, suggesting that a one-time one-year option to extend transition would be a workable solution.

In a new Research Paper, I have looked at three potential models for an extended transition:

  • A one-off option to extend transition for a year following the end of the initial transition period (the Lock and Zuleeg model)
  • A rolling or open-ended transition with an exit mechanism
  • An extended transition and implementation facility.

While Lock and Zuleeg make a good case, their proposal still risks a ‘second cliff-edge’ at the end of an extended transitional period if there is no agreement on a future relationship. A one-year optional extension may not give negotiators enough time to reach an agreement, and might not create sufficient confidence to avoid the need to negotiate a backstop.

The most obvious way to avoid a backstop would be to keep the UK in transition unless and until a new economic partnership between the UK and the EU was agreed (provided also that this met the commitments on the Irish border agreed in the 2017 Joint Report).

However, a perpetual transition would be politically unacceptable, difficult to manage in budgetary terms, and conflict with EU law. It would, therefore, need an exit mechanism. This could be modelled on Article 50 itself and allow either the UK or the EU to notify the other of their intention to end the transition period.

A compromise solution draws on the existing draft Agreement, and would allow transition to end once new agreements on customs and trade, and on foreign, security and defence policy, are concluded and become applicable. Unlike an open-ended transition, this facility would need a defined endpoint; a deadline of 31 December 2022 is proposed.

The aim would be to give negotiators the flexibility to agree new partnership arrangements, but with incentives to reach agreements early – avoiding the continued use of the transition and implementation facility. The UK and EU could depart transition well before the facility expired.

Kenneth Armstrong is Professor of European law and holds a Leverhulme Trust Major Research Fellowship for the project The Brexit Effect – Convergence, Divergence and Variation in UK Regulatory Policy.

The full Faculty of Law working paper can be viewed here. 




A healthy lifestyle cuts stroke risk, irrespective of genetic risk

source: www.cam.ac.uk

People at high genetic risk of stroke can still reduce their chance of having a stroke by sticking to a healthy lifestyle, in particular stopping smoking and not being overweight, finds a study in The BMJ today.

This drives home just how important a healthy lifestyle is for all of us, even those without an obvious genetic predisposition

Hugh Markus

Stroke is a complex disease caused by both genetic and environmental factors, including diet and lifestyle. But could adhering to a healthy lifestyle offset the effect of genetics on stroke risk?

An international team led by researchers at the University of Cambridge decided to find out by investigating whether a genetic risk score for stroke is associated with actual (“incident”) stroke in a large population of British adults.

They developed a genetic risk score based on 90 gene variants known to be associated with stroke from 306,473 white men and women in the UK Biobank – a database of biological information from half a million British adults.

Participants were aged between 40 and 73 years and had no history of stroke or heart attack. Adherence to a healthy lifestyle was based on four factors: non-smoker, diet rich in fruit, vegetables and fish, not overweight or obese (body mass index less than 30), and regular physical exercise.
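The four-factor adherence measure can be illustrated with a short sketch. This is not the study's actual code: the function names and the favourable/intermediate/unfavourable cut-offs below are hypothetical, chosen only to show the kind of scoring described.

```python
# Illustrative sketch of a four-factor healthy-lifestyle score, loosely based
# on the criteria described above (non-smoker; diet rich in fruit, vegetables
# and fish; BMI below 30; regular exercise). All names and category cut-offs
# are assumptions for demonstration, not taken from the study.

def lifestyle_score(non_smoker: bool, healthy_diet: bool,
                    bmi: float, regular_exercise: bool) -> int:
    """Count how many of the four healthy-lifestyle criteria are met (0-4)."""
    return sum([non_smoker, healthy_diet, bmi < 30, regular_exercise])

def lifestyle_category(score: int) -> str:
    """Map a 0-4 score to a category (hypothetical cut-offs)."""
    if score >= 3:
        return "favourable"
    if score == 2:
        return "intermediate"
    return "unfavourable"

# Example: a non-smoking, exercising participant with a poor diet and BMI 31
# meets two of the four criteria.
example = lifestyle_score(True, False, 31.0, True)
```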

Hospital and death records were then used to identify stroke events over an average follow-up of seven years.

Across all categories of genetic risk and lifestyle, the risk of stroke was higher in men than women.

Risk of stroke was 35% higher among those at high genetic risk compared with those at low genetic risk, irrespective of lifestyle.

However, an unfavourable lifestyle was associated with a 66% increased risk of stroke compared with a favourable lifestyle, and this increased risk was present within any genetic risk category.

A high genetic risk combined with an unfavourable lifestyle profile was associated with a more than twofold increased risk of stroke compared with a low genetic risk and a favourable lifestyle.
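As a rough plausibility check, if the two reported effects combined multiplicatively – an assumption made here for illustration, not a model claimed by the study – the headline figures are mutually consistent:

```python
# Hypothetical multiplicative combination of the reported relative risks.
# The study reports associations, not a multiplicative model; this is only
# an arithmetic check on the headline numbers quoted above.

rr_genetic = 1.35    # high vs low genetic risk (+35%)
rr_lifestyle = 1.66  # unfavourable vs favourable lifestyle (+66%)

combined = rr_genetic * rr_lifestyle
print(f"combined relative risk: {combined:.2f}")  # more than twofold
```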

These findings highlight the benefit for entire populations of adhering to a healthy lifestyle, independent of genetic risk, say the researchers. Among the lifestyle factors, the most significant associations were seen for smoking and being overweight or obese.

This is an observational study, so no firm conclusions can be drawn about cause and effect, and the researchers acknowledge several limitations, such as the narrow range of lifestyle factors, and that the results may not apply more generally because the study was restricted to people of European descent.

However, the large sample size enabled study of the combination of genetic risk and lifestyle in detail. As such, the researchers conclude that their findings highlight the potential of lifestyle interventions to reduce risk of stroke across entire populations, even in those at high genetic risk of stroke.

Professor Hugh Markus from the Department of Clinical Neurosciences at the University of Cambridge says: “This drives home just how important a healthy lifestyle is for all of us, even those without an obvious genetic predisposition. Some people are at an added disadvantage if ‘bad’ genes put them at a higher risk of stroke, but even so they can still benefit from not smoking and from having a healthy diet.”

The research was funded by the British Heart Foundation and the NIHR Cambridge Biomedical Research Centre.

Adapted from a press release by The BMJ.

Reference
Rutten-Jacobs, LCA, et al. Genetic risk, incident stroke, and the benefits of adhering to a healthy lifestyle: follow-up study of 306,473 UK Biobank participants. BMJ; 25 Oct 2018; DOI: 10.1136/bmj.k4168




Brain training app helps reduce OCD symptoms, study finds

source: www.cam.ac.uk

A ‘brain training’ app developed at the University of Cambridge could help people who suffer from obsessive compulsive disorder (OCD) manage their symptoms, which may typically include excessive handwashing and contamination fears.

This technology will allow people to gain help at any time within the environment where they live or work, rather than having to wait for appointments

Barbara Sahakian

In a study published in the journal Scientific Reports, Baland Jalal and Professor Barbara Sahakian from the Department of Psychiatry show how just one week of training can lead to significant improvements.

One of the most common types of OCD, affecting up to 46% of OCD patients, is characterised by severe contamination fears and excessive washing behaviour. Excessive washing can be harmful as sometimes OCD patients use spirits, surface cleansers or even bleach to clean their hands. The behaviours can have a serious impact on people’s lives, their mental health, their relationships and their ability to hold down jobs.

This repetitive and compulsive behaviour is also associated with ‘cognitive rigidity’ – in other words, an inability to adapt to new situations or new rules. Breaking out of compulsive habits, such as handwashing, requires cognitive flexibility so that the OCD patient can switch to new activities instead.

OCD is treated using a combination of medication such as Prozac and a form of cognitive behavioural therapy (‘talking therapy’) termed ‘exposure and response prevention’. This latter therapy often involves instructing OCD patients to touch contaminated surfaces, such as a toilet, but to refrain from then washing their hands.

These treatments are not particularly effective, however – as many as 40% of patients fail to show a good response to either treatment. This may be in part because often people with OCD have suffered for years prior to receiving a diagnosis and treatment. Another difficulty is that patients may fail to attend exposure and response prevention therapy as they find it too stressful to undertake.

For these reasons, Cambridge researchers developed a new treatment to help people with contamination fears and excessive washing. The intervention, which can be delivered through a smartphone app, involves patients watching videos of themselves washing their hands or touching fake contaminated surfaces.

Ninety-three healthy people who had indicated strong contamination fears as measured by high scores on the ‘Padua Inventory Contamination Fear Subscale’ participated in the study. The researchers used healthy volunteers rather than OCD patients in their study to ensure that the intervention did not potentially worsen symptoms.

The participants were divided into three groups: the first group watched videos on their smartphones of themselves washing their hands; the second group watched similar videos but of themselves touching fake contaminated surfaces; and the third, control group watched themselves making neutral hand movements on their smartphones.

After only one week of viewing their brief 30-second videos four times a day, participants in both of the first two groups – those who had watched the handwashing video and those who had watched the exposure and response prevention video – showed reduced OCD symptoms and greater cognitive flexibility compared with the neutral control group. On average, participants in the first two groups saw their Yale-Brown Obsessive Compulsive Scale (YBOCS) scores improve by around 21%. The YBOCS is the most widely used clinical assessment of the severity of OCD.

Importantly, completion rates for the study were excellent – all participants completed the one-week intervention, with participants viewing their video an average (mean) of 25 out of 28 times.

Mr Jalal said: “Participants told us that the smartphone washing app allowed them to easily engage in their daily activities. For example, one participant said ‘if I am commuting on the bus and touch something contaminated and can’t wash my hands for the next two hours, the app would be a sufficient substitute’.”

Professor Sahakian said: “This technology will allow people to gain help at any time within the environment where they live or work, rather than having to wait for appointments. The use of smartphone videos allows the treatment to be personalised to the individual.

“These results, while very exciting and encouraging, require further research examining the use of these smartphone interventions in people with a diagnosis of OCD.”

The smartphone app is not currently available for public use. Further research is required before the researchers can show conclusively that it is effective at helping patients with OCD.

The research was funded by the Wellcome Trust, NIHR Cambridge Biomedical Research Centre, the Medical Research Council and the Wallitt Foundation.

Reference
Baland Jalal, Annette Bruhl, Claire O’Callaghan, Thomas Piercy, Rudolf N. Cardinal, Vilayanur S. Ramachandran and Barbara J. Sahakian. Novel smartphone interventions improve cognitive flexibility and obsessive-compulsive disorder symptoms in individuals with contamination fears. Scientific Reports; 23 Oct 2018; DOI: 10.1038/s41598-018-33142-2


Researcher profile: Baland Jalal

“Cambridge is the perfect place for the ‘idealistic scholar’ – those who believe they can re-write the science textbooks. The culture—like no other—embraces novel ideas, even if outlandish and far-fetched on the surface,” says Baland Jalal, a neuroscientist at the Behavioural and Clinical Neuroscience Institute and PhD candidate at Trinity College.

“It is no coincidence that the foremost scientists in history have set foot here, including my scientific hero Newton. One cannot help but feel inspired, as if part of a lineage of greatness—‘standing on the shoulders of giants’.”

Jalal considers himself fortunate to have been able to stand on the shoulders of proverbial giants throughout his research career. He received his initial training at the University of California in the laboratory of legendary neuroscientist VS Ramachandran.

“California was an enchanting experience. Rama and I would often go for long strolls on San Diego’s beaches where he would tell mesmerizing stories about the good-old-days when he was a Cambridge student and how he later invented his famous ‘mirror box’ for phantom limb pain. He was like a second father – a mentor who instilled in me a genuine love of science.”

Jalal now works with husband-and-wife team Professors Barbara Sahakian and Trevor Robbins, whom he describes as embodying “the ‘Cambridge spirit’ of innovation”. His work is ultimately about developing new psychiatric treatments. “This often involves taking an unorthodox and somewhat radical approach—thinking ‘outside the box’, so to speak,” he says. Ideas include the above treatment for OCD and a second treatment based on the ‘rubber hand illusion’, which makes a fake hand feel like it is your own.

His other area of interest is in sleep paralysis—being paralyzed from head to toe while seeing ghosts and space aliens when waking up from sleep. He has studied this peculiar phenomenon around the world and recently invented a novel meditation-relaxation therapy for this condition, called MR Therapy.

“I hope my research will lead to new therapies that can help people in distress around the world – especially folks in low-income countries who don’t have adequate access to health care. The feeling I have when someone tells me that my work has helped alleviate their anguish is – simply – indescribable.”




Study unearths Britain’s first speech therapists

Joseph Priestley: theologian, scientist, clergyman and stammerer
source: www.cam.ac.uk

On International Stammering Awareness Day (22 October), a new study reveals that Britain’s first speech therapists emerged at least a century earlier than previously thought.

It is tempting to think that sympathy for stammering is a very recent phenomenon but a significant change in attitudes took hold in the eighteenth century

Elizabeth Foyster

Until now, historians had assumed that John Thelwall became Britain’s first speech therapist in the early nineteenth century.*

But Cambridge historian Elizabeth Foyster has discovered that James Ford was advertising his services in London as early as 1703, and that many other speech therapists emerged over the course of the eighteenth century.

Ford’s advert (pictured), published in the Post Man newspaper on 23 October 1703, states that “he removes Stammering, and other impediments in Speech”, as well as teaching “Foreigners to pronounce English like Natives”.

Ford had previously worked with the deaf and dumb but realised that there was more money to be made by offering other speech improvement services as a branch of education for wealthy children.

“In the eighteenth century, speaking well was crucial to being accepted in polite society and to succeeding in a profession,” said Foyster. “Speech impediments posed a major obstacle and the stress this caused often made a sufferer’s speech even worse. At the same time, wealthy parents were made to feel guilty and they started spending increasingly large sums to try to ‘cure’ their children.”

By 1703, Ford was based in Newington Green, in the suburbs of London, but twice a week he waited near the city’s Royal Exchange and Temple Bar to secure business from merchants, financiers and lawyers desperate to improve their children’s life chances.

By 1714, some of these families were seeking out the help of Jacob Wane, a therapist who drew on a 33-year personal struggle with the condition. And by the 1760s, several practitioners were competing for business in London.

“We have lost sight of these origins of speech therapy because historians have been looking to identify a profession which had agreed qualifications for entry, an organising body, scientific methods and standards, as we have today,” said Foyster. “In the eighteenth century, speech therapy was regarded as an art, not a science. But with its attention to the individual, and to the psychological as well as physiological causes of speech defects, we can see the roots of today’s speech therapy.”

Art and business

Foyster’s study, published in the journal Cultural and Social History, shows that speech specialists emerged in the early eighteenth century as new attention was given to the role of the nerves, the emotions and the psychological origins of speech impediments.

Prior to this, in the seventeenth century, the main cure on offer had involved painful physical intervention, including the cutting of tongues. But as speech defects came to be understood as resulting from nervous disorders, entrepreneurial therapists stepped in to end the monopoly of the surgeons.

“These men, and some women, made no claim to medical knowledge,” Foyster says. “In fact, some were very keen to emphasise that they were nothing like the surgeons who had caused so much unnecessary pain. They described themselves as ‘Artists’ and their gentler methods were much more attractive to wealthy clients.”

These speech ‘artists’ jealously guarded their trade secrets but gave away some clues to their methods in print. Close attention was paid to the position of the lips, tongue and mouth; clients were given breathing and voice exercises to practise; and practitioners emphasised the importance of speaking slowly so that every sound could be articulated.

By the 1750s, London’s speech therapists had become masters of publicity, publishing books, placing advertisements in newspapers and giving lectures in universities and other venues. In 1752, Samuel Angier achieved the remarkable feat of lecturing to Cambridge academics on four occasions about speech impediments and the ‘art of pronunciation’, despite having never attended university himself.

Foyster has identified several successful speech therapy businesses, some of which were passed down from one generation to the next. Most of these were based in London, but practitioners would often follow their clientele to fashionable resort towns such as Bath and Margate.

In 1761, Charles Angier became the third generation to take over his family’s business; and by the 1780s, he claimed to be able to remove all speech impediments within six to eight months if his pupils were ‘attentive’. By then, he was reported to be charging fifty guineas ‘for the Cure’ at a time when many Londoners were earning less than ten guineas a year.

To be successful, these entrepreneurs had to separate themselves from quackery. Some heightened their credibility by securing accreditation from respected physicians, while others printed testimonials from satisfied clients beneath their newspaper advertisements.

Suffering and determination

Foyster’s study also sheds light on the appalling suffering and inspirational determination of stammerers in the eighteenth century, including some well-known figures.

Joseph Priestley (1733-1804), the theologian, scientist and clergyman (pictured), recalled that his worsening stammer made ‘preaching very painful, and took from me all chance of recommending myself to any better place’.

His fellow scientist, Erasmus Darwin, also suffered from a stammer, as did Darwin’s daughter, Violetta, and eldest son, Charles. In 1775, Darwin compiled detailed instructions to help his daughter overcome her stammer, which involved sounding out each letter and practising problematic words for weeks on end.

“It is tempting to think that sympathy for stammering is a very recent phenomenon but a significant change in attitudes took hold in the eighteenth century,” said Foyster. “While stammerers continued to be mocked and cruelly treated, polite society became increasingly compassionate, especially when someone demonstrated a willingness to seek specialist help.”

References:
Elizabeth Foyster, ‘“Fear of Giving Offence Makes Me Give the More Offence”: Politeness, Speech and Its Impediments in British Society, c.1660–1800’, Cultural and Social History (2018). DOI: 10.1080/14780038.2018.1518565
* Denyse Rockey, ‘The Logopaedic Thought of John Thelwall, 1764-1834: First British Speech Therapist’, British Journal of Disorders of Communication (1977). DOI: 10.3109/13682827709011313



History shows abuse of children in custody will remain an ‘inherent risk’ – report

New research conducted for the current independent inquiry suggests that – despite recent policy improvements – cultures of child abuse are liable to emerge while youth custody exists, and keeping children in secure institutions should be limited as far as possible.

History tells us that it is impossible to ‘manage out’ the risk of abuse through improved policies alone

Caroline Lanskey

A new report on the history of safeguarding children detained for criminal offences in the UK has concluded that it is impossible to remove the potential for abuse in secure institutions, and that the use of custody for children should only be a “last resort”.

A team of criminologists and historians from the universities of Cambridge and Edinburgh were asked by HM Prison and Probation Service (HMPPS) to build a “collective memory” of the abuse cases and preventative policies that emerged in the youth wing of the UK’s secure estate between 1960 and 2016.

The research was commissioned to help prepare HMPPS to give evidence to the Independent Inquiry into Child Sexual Abuse. It covers physical and sexual abuse in secure children’s homes and training centres, young offender institutions such as Deerbolt and Feltham, and their predecessors: detention centres and borstals.

Drawing on often limited archival records – as well as inspection reports and previous findings – the research reveals how past safeguards broke down, failing to recognise children in custody as vulnerable.

Researchers found abuse was especially likely at times of overcrowding and budgetary constraint, and occurred despite contemporary beliefs that protective policies were working.

The historical overview goes beyond individual misconduct to show how whole institutions become “detached from their purpose”, with undertrained staff collectively drifting into “morally compromised” cultures where abusive acts appear acceptable even as procedure is followed.

The researchers say this “acculturation” at times extended to inspectorates and monitors overfamiliar with failing systems. They argue that it is vital to ensure effective complaints processes and protect whistle-blowers.

The report has been produced by Cambridge criminologists and Dr Lucy Delap and Professor Louise Jackson from the History and Policy network, and is published online today alongside a policy paper summarising the findings.

“History tells us that it is impossible to ‘manage out’ the risk of abuse through improved policies alone,” said report co-author Dr Caroline Lanskey, from Cambridge’s Institute of Criminology (IoC).

“The steep power imbalance between staff and children means there is a need to focus on staff culture, rather than only on detailed policy, in order to establish greater trust between staff and young people in a secure institution,” she said.

Until the 1990s safeguards against abuse were weak, and ineffective in many institutions, say researchers. Children were often left to “fend for themselves” in detention centres such as Medomsley, where reports of sexual abuse during the 1970s and 1980s have since come to light.

The research reveals major rifts in the mid-1970s between the external Board of Visitors – Medomsley’s main monitoring body – and the centre’s management over disciplinary approaches. Inspections of the time recorded that neither staff nor children “seem to know what the purpose of the centre really is…”

Inspectors were concerned with basic functions such as kitchen cleanliness. That the kitchen manager worked unsupervised, and hand-picked his team of children and young people, was not perceived as risky. This Medomsley manager was subsequently convicted of sexual offences.

“Inspectors and Boards of Visitors checked procedure, but they lacked the concepts and language to recognise that certain situations were potentially abusive. These blind spots persisted until at least the 1990s,” said Ben Jarman, a researcher at Cambridge’s IoC, who carried out the archival research.

The turn of the millennium saw a “new orthodoxy” in protective policies, combined with a spike in custodial sentences for children that wouldn’t decline again until 2010.

Part of this policy shift included the questioning of long-standing practices such as strip-searching and forms of restraint, and whether they amounted to abuse.

“Strip-searching before the 1990s seems to have been so routine and unremarkable that it’s hardly mentioned in the documentary record,” said Jarman. “As late as 1995, inspectors at Deerbolt reported without comment that staff believed more routine strip searches were required.”

However, by 2002 inspectors were expressing serious concerns about untargeted strip-searching. A 2005 inspection of Feltham described strip-searches as “degrading”, and an independent inquiry the following year argued that, in any other circumstances, such practices would “trigger a child protection investigation”.

The use of pain-inducing restraint has also become the subject of fierce debate and some policy change, following the deaths of two children in secure training centres in 2004.

Strip-searching and restraint are still used but much more carefully regulated. New monitoring systems attempt to take account of the ‘voice’ of children, who the report suggests have been recast as ‘users’ of custodial ‘services’.

Yet improved safeguards can inspire false confidence and mask the “corruption of care”, say researchers. The exposure by the BBC of violence and bullying by staff in Medway Secure Training Centre in 2016 came shortly after an inspection declaring safety there to be “good”.

“Investigations at Medway concluded that child protection failed despite apparent compliance with safeguarding policies,” said Jarman. “Inadequately trained and under pressure to achieve contractual targets, some of the staff did not appear to understand that what they were doing was wrong.”

“We wouldn’t argue for fewer safeguards, but without a focus on staff culture, even the best policies can be circumvented when an abusive climate develops,” he added.

“The ever-present potential for abuse means that custody should be used for children only as a last resort, where there is no alternative,” the report concludes.

The full report, Safeguarding children in the secure estate: 1960–2016, is available here.


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. Images, including our videos, are Copyright ©University of Cambridge and licensors/contributors as identified.  All rights reserved. We make our image and video content available in a number of ways – as here, on our main website under its Terms and conditions, and on a range of channels including social media that permit your use and sharing of our content under their respective Terms.

Targeting hard-to-treat cancers

source: www.cam.ac.uk

Cambridge leads a £10 million interdisciplinary collaboration to target the most challenging of cancers.

We are going to pierce through the body’s natural barriers and deliver anti-cancer drugs to the heart of the tumour.

George Malliaras

While the survival rate for most cancers has doubled over the past 40 years, some cancers such as those of the pancreas, brain, lung and oesophagus still have low survival rates.

Such cancers are now the target of an Interdisciplinary Research Collaboration (IRC) led by the University of Cambridge and involving researchers from Imperial College London, University College London and the Universities of Glasgow and Birmingham.

“Some cancers are difficult to remove by surgery and highly invasive, and they are also hard to treat because drugs often cannot reach them at high enough concentration,” explains George Malliaras, Prince Philip Professor of Technology in Cambridge’s Department of Engineering, who leads the IRC. “Pancreatic tumour cells, for instance, are protected by dense stromal tissue, and tumours of the central nervous system by the blood-brain barrier.”

The aim of the project, which is funded for six years by the Engineering and Physical Sciences Research Council, is to develop an array of new delivery technologies that can deliver almost any drug to any tumour in a large enough concentration to kill the cancerous cells.

Chemists, engineers, material scientists and pharmacologists will focus on developing particles, injectable gels and implantable devices to deliver the drugs. Cancer scientists and clinicians from the Cancer Research UK Cambridge Centre and partner sites will devise and carry out clinical trials. Experts in innovative manufacturing technologies will ensure the devices are able to be manufactured and robust enough to withstand surgical manipulation.

One technology the team will examine is the ability of advanced materials to self-assemble and entrap drugs inside metal-organic frameworks. These structures can carry enormous amounts of drugs, and be tuned both to target the tumour and to release the drug at an optimal rate.

“We are going to pierce through the body’s natural barriers,” says Malliaras, “and deliver anti-cancer drugs to the heart of the tumour.”



Cambridge team develops technique to ‘listen’ to a patient’s brain during tumour surgery

source: www.cam.ac.uk

Surgeons could soon eavesdrop on a patient’s brain activity during surgery to remove their brain tumour, helping improve the accuracy of the operation and reduce the risk of impairing brain function.

There’s been huge progress in brain imaging and electrophysiology – our understanding of the electricity within our bodies – so why not use this information to improve brain surgery?

Yaara Erez

Patients with low-grade gliomas – slow-spreading but potentially life-threatening brain tumours – will usually have surgery to remove the tumour. But removing brain tissue is risky because there is no clear boundary between brain and tumour: the tumour infiltrates the brain. Removing tumour tissue can therefore mean removing vital parts of the brain, impairing functions such as speech, movement and executive function (which enables the individual to plan, organise and execute tasks).

To minimise this risk, neurosurgeons open the patient’s skull and then waken them. A local anaesthetic means the patient will feel no pain, and the brain itself contains no pain receptors. The surgeon will probe the patient’s brain, applying mild electric pulses to tissue surrounding the tumour while asking them to perform a set of tasks. For example, the patient may be asked to count from one to five: if an electric pulse applied to a certain place in the brain affects their ability to perform this task, the surgeon will leave this tissue in place.

“As surgeons, we’re always trying to minimise the risk to patients and provide them with the best possible outcomes,” says Thomas Santarius, a neurosurgeon at Addenbrooke’s, Cambridge University Hospitals. “Operating on brain tumours is always a delicate balance between removing as much diseased tissue as possible to give patients better prognosis, while minimising the risk of damage to brain functions that will have a potentially massively detrimental impact on the patient’s life.”

While the current approach is considered the ‘gold standard’, it is not perfect. It takes time to apply the pulses to different parts of the brain, and it may miss areas that are important for certain functions. The current battery of cognitive tests that surgeons use is also limited: it does not test executive function, for example.

Now, a team of scientists and clinicians from the University of Cambridge and Addenbrooke’s Hospital, led by Mr Santarius, Dr Yaara Erez and Mr Michael Hart, together with Pedro Coelho from Neurophys Ltd, has collaborated to develop a new approach that will enable patients to get a more accurate, personalised ‘read-out’ of their brain networks, and will provide surgeons with real-time feedback on the patient’s brain activity in theatre.

“At the moment, neurosurgeons only know about function in the average brain – they have no patient-specific information,” explains Dr Yaara Erez, a neuroscientist from the MRC Cognition and Brain Sciences Unit at the University of Cambridge. “But there’s been huge progress in brain imaging and electrophysiology – our understanding of the electricity within our bodies – so why not use this information to improve brain surgery? We are aiming to bring all this knowledge into the theatre, providing surgeons with integrated data and the best tools to support their work.”

Under this approach, patients would undergo a number of neuroimaging examinations using magnetic resonance imaging (MRI) before surgery aimed at identifying not only the exact location of the tumour but also how different regions of their brains communicate with each other.
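The article does not specify the analysis behind these communication maps, but a common proxy for how brain regions ‘talk’ to each other is the pairwise correlation of their imaging time series. The sketch below is purely illustrative – the region count and signals are hypothetical, not taken from the study:

```python
import numpy as np

def connectivity_matrix(roi_timeseries):
    """Pearson correlation between each pair of region-of-interest (ROI)
    time series -- a common proxy for functional connectivity.

    roi_timeseries: array of shape (n_regions, n_timepoints)
    returns: symmetric (n_regions, n_regions) correlation matrix
    """
    return np.corrcoef(roi_timeseries)

# Hypothetical example: 4 regions, 200 imaging timepoints
rng = np.random.default_rng(0)
signals = rng.standard_normal((4, 200))
signals[1] += 0.8 * signals[0]   # make regions 0 and 1 co-fluctuate

conn = connectivity_matrix(signals)
print(conn.shape)                # (4, 4)
# Expect the coupled pair (0, 1) to correlate more strongly than an
# independent pair such as (0, 2).
```

In practice, such matrices are computed per patient from preoperative MRI, which is what makes the ‘read-out’ personalised rather than based on the average brain.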

As part of this process, a 3D-printed copy of the patient’s brain will be used, showing where the tumour is located. This model is intended to help surgeons plan the surgery, discuss with the patient the potential risks from surgery and involve the patient in decisions over which tissue to remove.

“Doctors need to be able to talk through the options with patients, and we hope that using neuroimaging data and presenting this as a 3D model will help surgeons with the planning of surgery and ensure patients are better informed about the risks and benefits from surgery,” says Dr Erez.

During surgery, once the patient’s skull has been opened, the surgeon will place electrodes on the surface of the brain, to ‘listen’ to their brain activity. A computer algorithm will analyse this information as the patient performs a battery of cognitive tests, giving live feedback to the surgeon. This will enable the surgeon to predict more accurately the likely impact of removing a particular area of brain tissue.
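The team’s actual algorithm is not described in detail; one standard ingredient in this kind of live feedback is comparing an electrode’s spectral power between rest and task. The snippet below is a minimal sketch under that assumption, with synthetic signals standing in for real electrode recordings:

```python
import numpy as np

def band_power(signal, fs, lo, hi):
    """Mean power of `signal` within the [lo, hi] Hz band,
    estimated from the FFT power spectrum."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= lo) & (freqs <= hi)
    return spectrum[mask].mean()

fs = 1000                        # hypothetical 1 kHz sampling rate
t = np.arange(fs) / fs           # one second of data
rest = np.sin(2 * np.pi * 10 * t)               # 10 Hz activity at rest
task = rest + 0.5 * np.sin(2 * np.pi * 80 * t)  # extra high-frequency activity during a task

# A task-related rise in high-frequency (70-90 Hz) power would flag this
# electrode's patch of cortex as involved in the task.
increase = band_power(task, fs, 70, 90) / (band_power(rest, fs, 70, 90) + 1e-12)
print(increase > 10)             # prints True for this synthetic example
```

A real intraoperative system would run this kind of comparison continuously across all electrodes while the patient performs the cognitive tests.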

In particular, executive function is difficult to test using electrical stimulation – in part because it involves networks of regions across the brain. Dr Erez hopes that a combination of improved cognitive tests and a more accurate understanding of an individual patient’s networks will enable surgeons to monitor potential impairment to executive function during surgery.

“This isn’t going to replace brain stimulation during surgery,” says Dr Erez, “but it will guide the surgeon and it will save time and make surgery more efficient, more accurate. It will also enable us to understand how patients’ brains adapt to the presence of a tumour and how well they recover from surgery. It involves equipment that is largely already in use in surgeries, so should be easy and cost effective to implement.”

So far, the team has obtained data from 12 patients, already providing a large amount of data to analyse, with a rich dataset from each patient, collected before, during and after surgery. Although they are currently analysing this information offline, the data will help them find the best measures to provide the required information – what the ideal tasks for patients to perform are – and then to optimise the analysis.

The research has only been possible because of the interaction between researchers and clinicians from a variety of disciplines, says Dr Erez. “At Cambridge, we have different groups of neuroscientists with a range of expertise from psychology and imaging to computer science working with clinicians and surgeons at the hospital.  Whatever we need, we can always find someone in Cambridge who knows how to do it!”

The research is supported by the Medical Research Council, the Royal Society and The Brain Tumour Charity.


Researcher profile: Dr Yaara Erez

Originally from Israel, Dr Yaara Erez is now a neuroscientist at the MRC Cognition and Brain Sciences Unit – a centre that not only has “a long history of great contributions to the theoretical and experimental foundations of cognitive psychology”, she says, but “is also famous for its truly lovely garden!”

Yaara’s background is in Computer Science and Psychology. She spent several years as a software developer before deciding to pursue a PhD in neuroscience, and she is now a Royal Society Dorothy Hodgkin Research Fellow. Her background is proving essential for understanding the inner workings of the brain.

“We process the information around us in an active way – we pay attention to what is relevant to us and filter out what we don’t need. We do that all the time, effortlessly and efficiently, but from a computational perspective it is a very complicated problem. We only have hints about how this is done in the brain.”

Yaara’s interest lies in the brain systems that allow us to behave flexibly, adapt our behaviour to changing circumstances, and select only the information that we need. These systems are involved in a wide range of cognitive function known as ‘executive function’, including problem-solving, keeping focus, switching focus and planning, all of which are essential to normal healthy life. “It’s important to understand these brain mechanisms because it may help us develop treatments for patients with different brain disorders that affect cognitive function, such as stroke, brain tumour, depression, and many more,” she says.

While Yaara’s research is basic science, she is interested in its clinical application and how the knowledge might be used to improve healthcare and treatments for patients. “I believe we can improve existing procedures so patients can have a high quality of life after brain surgery. We can and we should use our knowledge from basic neuroscience to improve treatments for patients.”

Her work uses a variety of techniques that involve different types of brain signals that she collects from healthy volunteers and patients with brain tumours. “This data is very complex, so requires detailed analysis, which I like. The combination of the data from the different techniques, and what we can learn from each of them, makes my work exciting and enables me to get the full picture.”

Yaara recalls the day she first saw a live brain surgery on an awake patient. “As a neuroscientist, I study the brain and know quite a lot about it, but seeing a real brain and how brief pulses of electrical stimulation immediately affect behaviour was a different level of experience and truly eye-opening.”

Cambridge, says Yaara, is the perfect place for her research. “There are people from all over the world, and they all bring their expertise, knowledge, and perspective. My research is multidisciplinary in its nature, and the combination of the different expertise of people in Cambridge makes it work. We also have great facilities here and are very fortunate to have such a great University Hospital as Addenbrooke’s as our local hospital.

“I enjoy meeting and working with people from all around the world, and the international community in Cambridge is amazing.”



Graphene may exceed bandwidth demands of future telecommunications

source: www.cam.ac.uk

Researchers from the Cambridge Graphene Centre, together with industrial and academic collaborators within the European Graphene Flagship project, showed that integrated graphene-based photonic devices offer a solution for the next generation of optical communications.

The researchers have demonstrated how the properties of graphene – a two-dimensional form of carbon – enable ultra-wide-bandwidth communications and low power consumption that could radically change the way data is transmitted across optical communications systems.

This could make graphene-integrated devices the key ingredient in the evolution of 5G, the Internet-of-Things (IoT), and Industry 4.0. The findings are published in Nature Reviews Materials.

As conventional semiconductor technologies approach their physical limitations, researchers need to explore new technologies to realise the most ambitious visions of a future networked global society. Graphene promises a significant step forward in performance for the key components of telecommunications and data communications.

In their new paper, the researchers have presented a vision for the future of graphene-based integrated photonics, and provided strategies for improving power consumption, manufacturability and wafer-scale integration. With this new publication, the Graphene Flagship partners also provide a roadmap for graphene-based photonic devices surpassing the technological requirements for the evolution of the datacom and telecom markets driven by 5G, IoT, and Industry 4.0.

“Graphene integrated in a photonic circuit is a low-cost, scalable technology that can operate fibre links at very high data rates,” said study lead author Marco Romagnoli from CNIT, the National Interuniversity Consortium for Telecommunications in Italy.

Graphene photonics offers advantages both in performance and manufacturing over the state of the art. Graphene can ensure modulation, detection and switching performances meeting all the requirements for the next evolution in photonic device manufacturing.

Co-author Antonio D’Errico, from Ericsson Research, says that “graphene for photonics has the potential to change the perspective of Information and Communications Technology in a disruptive way. Our publication explains why, and how to enable new feature rich optical networks.”

This industrial and academic partnership, comprising researchers in the Cambridge Graphene Centre, CNIT, Ericsson, Nokia, IMEC, AMO, and ICFO produced the vision for the future of graphene photonic integration.

“Collaboration between industry and academia is key for explorative work towards entirely new component technology,” said co-author Wolfgang Templ of Nokia Bell Labs. “Research in this phase bears significant risks, so it is important that academic research and industry research labs join the brightest minds to solve the fundamental problems. Industry can give perspective on the relevant research questions for potential in future systems. Thanks to a mutual exchange of information we can then mature the technology and consider all the requirements for a future industrialization and mass production of graphene-based components.”

“An integrated approach of graphene and silicon-based photonics can meet and surpass the foreseeable requirements of the ever-increasing data rates in future telecom systems,” said Professor Andrea Ferrari, Director of the Cambridge Graphene Centre. “The advent of the Internet of Things, Industry 4.0 and the 5G era represent unique opportunities for graphene to demonstrate its ultimate potential.”

Reference: 
Marco Romagnoli et al. ‘Graphene-based integrated photonics for next-generation datacom and telecom.’ Nature Reviews Materials (2018). DOI: 10.1038/s41578-018-0040-9.



New legal tool aims to increase openness, sharing and innovation in global biotechnology

source: www.cam.ac.uk

A new easy-to-use legal tool that enables exchange of biological material between research institutes and companies launches today.

The OpenMTA provides a new pathway for open exchange of DNA components – the basic building blocks for new engineering approaches in biology

Jim Haseloff

The OpenMTA is a Material Transfer Agreement (MTA) designed to foster a spirit of openness, sharing and innovation in global biotechnology. MTAs provide the legal frameworks within which research organisations lay down terms and conditions for sharing their materials – everything from DNA to plant seeds to patient samples.

Use of the OpenMTA allows redistribution and commercial use of materials, while respecting the rights of creators and promoting safe practice and responsible research. The new standardised framework also eases the administrative burden for technology transfer offices, negating the need to negotiate unique terms for individual transfers of widely-used material.

The OpenMTA launches today with a commentary published in the journal Nature Biotechnology. It provides a new way to openly exchange low-level “nuts and bolts” components for biological research and engineering, complementing existing, more restrictive arrangements for material transfer.

The OpenMTA was developed through a collaboration led by the San Francisco-based BioBricks Foundation and the UK-based OpenPlant Synthetic Biology Research Centre. OpenPlant is a joint initiative between the University of Cambridge, the John Innes Centre and the Earlham Institute, which aims to develop open technologies and responsible innovations for industrial biotechnology and sustainable agriculture.

Professor Jim Haseloff, University of Cambridge, UK, said: “The OpenMTA provides a new pathway for open exchange of DNA components – the basic building blocks for new engineering approaches in biology. It is a necessary step towards building a commons [commonly owned resource] that will underpin and democratise access to future biotechnological advances and sustainable industries.”

The collaboration brought together an international working group comprising researchers, technology transfer professionals, social scientists and legal experts to inform the creation of a legal framework that could improve sharing of biomaterials and increase innovation. The team identified five design goals on which to base the new agreement: access, attribution, reuse, redistribution and non-discrimination.  Additional design goals included issues of safety and, in particular, the sharing of biomaterials in an international context.

Dr Linda Kahl, Senior Counsel of the BioBricks Foundation, said: “We encourage organisations worldwide to sign the OpenMTA Master Agreement and start using it. In five years’ time my ideal is for the OpenMTA to be the default option for the transfer of research materials within and between academic research institutions and companies.

“Instead of automatically placing restrictions on materials, people will ask whether restrictions on use and redistribution are appropriate and instead use this tool to promote sharing and innovation in a way that does not compromise safety.”

Dr Colette Matthewman, Programme Manager for the OpenPlant Synthetic Biology Research Centre, said: “We hope to see the OpenMTA enable an international flow of non-proprietary tools between academic, government, NGO and industry researchers, to be used, reused and expanded upon to develop new tools and innovations.”

The agreement will facilitate the use, modification and redistribution of tools for innovation in academic and commercial research, and promote access for researchers in less privileged institutions and world regions.

Dr Fernán Federici, Millennium Institute for Integrative Biology (iBio), Santiago, Chile, said: “The OpenMTA will be particularly useful in Latin America, allowing researchers to redistribute materials imported from overseas sources, reducing shipping costs and waiting times for future local users. We are implementing it in an international project that requires sharing genetic tools among labs in four different continents. We believe the OpenMTA will support projects based on community-sourced resources and distributed repositories that lead to more fluid collaborations.”

The OpenPlant Synthetic Biology Research Centre is funded by the UK Biotechnology and Biological Sciences Research Council and the Engineering and Physical Sciences Research Council as part of the UK Synthetic Biology for Growth programme.

Adapted from a press release from the John Innes Centre. 

Reference
Kahl, L et al. Opening options for material transfer. Nature Biotechnology; 11 Oct 2018



Austerity cuts ‘twice as deep’ in England as rest of Britain

source: www.cam.ac.uk

Research finds significant inequalities in cuts to council services across the country, with deprived areas in the north of England and London seeing the biggest drops in local authority spending since 2010.

Public finance is politics hidden in accounting columns

Mia Gray

A “fine-grained” analysis of local authority budgets across Britain since 2010 has found that the average reduction in service spending by councils was almost 24% in England compared to just 12% in Wales and 11.5% in Scotland.

While some areas – Glasgow, for example – experienced significant service loss, the new study suggests that devolved powers have allowed Scottish and Welsh governments to mitigate the harshest local cuts experienced in parts of England.

University of Cambridge researchers found that, across Britain, the most severe cuts to local service spending between 2010 and 2017 were generally associated with areas of “multiple deprivation”.

This pattern is clearest in England, where all 46 councils that cut spending by 30% or more are located. These local authorities tend to be more reliant on central government, with lower property values and fewer additional funding sources, as well as less ability to generate revenue through taxes.

The north was hit with the deepest cuts to local spending, closely followed by parts of London. The ten worst affected councils include Salford, South Tyneside and Wigan, as well as the London boroughs of Camden and Hammersmith and Fulham. Westminster council had a drop in service spending of 46% – the most significant in the UK.

The research also shows a large swathe of southern England, primarily around the ‘home counties’, with low levels of reliance on central government and only relatively minor local service cuts. Northern Ireland was excluded from the study due to limited data.

The authors of the new paper, published today in the Cambridge Journal of Regions, Economy and Society, say the findings demonstrate how austerity has been pushed down to a local level, “intensifying territorial injustice” between areas.

They argue that initiatives claimed by government to ameliorate austerity, such as local retention of business taxes, will only fuel unfair competition and inequality between regions – as local authorities turn to “beggar thy neighbour” policies in efforts to boost tax bases and buffer against austerity.

“The idea that austerity has hit all areas equally is nonsense,” said geographer Dr Mia Gray, who conducted the research with her Cambridge colleague Dr Anna Barford.

“Local councils rely to varying degrees on the central government, and we have found a clear relationship between grant dependence and cuts in service spending.

“The average cuts to local services have been twice as deep in England compared to Scotland and Wales. Cities have suffered the most, particularly in the old industrial centres of the north but also much of London,” said Gray.

“Wealthier areas can generate revenues from business tax, while others sell off buildings such as former back offices to plug gaping holes in council budgets.

“The councils in greatest need have the weakest local economies. Many areas with populations that are ageing or struggling to find employment have very little in the way of a public safety net.

“The government needs to decide whether it is content for more local authorities to essentially go bust, in the way we have already seen in Northamptonshire this year,” she said.

Local authorities with the largest spending drops (change in service spending, 2010–2017):

Westminster: -46%
Salford: -45%
South Tyneside: -44%
Slough: -44%
Wigan: -43%
Oldham: -42%
Gateshead: -41%
Camden: -39%
Hammersmith & Fulham: -38%
Kensington & Chelsea: -38%

The latest study used data from the Institute of Fiscal Studies to conduct a spatial analysis of Britain’s local authority funding system.

Gray and Barford mapped the levels of central grant dependence across England’s councils, and the percentage fall of service spend by local authorities across Scotland, Wales and England between financial years 2009/2010 and 2016/2017.
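The headline figures are percentage falls in service spending between two financial years. As a minimal sketch of that calculation – with hypothetical council names and spending values, not the study’s data:

```python
# Illustrative only: spending figures are made up; only the method
# (percentage change between two financial years) matches the study.
budgets = {
    # council: (service spend 2009/10, service spend 2016/17), arbitrary units
    "Council A": (100.0, 54.0),
    "Council B": (80.0, 70.4),
}

def pct_change(before, after):
    """Percentage change in service spending between two financial years."""
    return (after - before) / before * 100

for council, (before, after) in budgets.items():
    print(f"{council}: {pct_change(before, after):+.0f}%")
# Council A: -46%
# Council B: -12%
```

Averaging such per-council changes within each nation is what yields the comparison of roughly -24% in England against -12% in Wales and -11.5% in Scotland.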

Some of the local services hit hardest across the country include highways and transport, culture, adult social care, children and young people’s services, and environmental services.

The part of central government formerly known as the Department of Communities and Local Government experienced a dramatic overall budget cut of 53% between 2010 and 2016.

As budget cuts hit at a local level, “mandatory” council services – those considered vital – were funded at the expense of “discretionary” services. However, the researchers found these boundaries to be blurry.

“Taking care of ‘at risk’ children is a mandatory concern. However, youth centres and outreach services are considered unessential and have been cut to the bone. Yet these are services that help prevent children becoming ‘at risk’ in the first place,” said Gray.

“There is a narrative at national and local levels that the hands of politicians are tied, but many of these funding decisions are highly political. Public finance is politics hidden in accounting columns.”

Gray points out that once local councils “go bust” and Section 114 notices are issued, as with Northamptonshire Council, administrators are sent in who then take financial decisions that supersede any democratic process.

The research has also contributed to the development of a new play from the Menagerie Theatre Company, in which audience members help guide characters through situations taken from the lives of those in austerity-hit Britain. The play opens tonight in Oxford, and will be performed in community venues across the country during October and November.

Gray added: “Ever since vast sums of public money were used to bail out the banks a decade ago, the British people have been told that there is no other choice but austerity imposed at a fierce and relentless rate.”

“We are now seeing austerity policies turn into a downward spiral of disinvestment in certain people and places. Local councils in some communities are shrunk to the most basic of services. This could affect the life chances of entire generations born in the wrong part of the country.”

