
Database Protecting UK Expats From Brexit ‘Misinformation’ To Be Built By Cambridge Researchers


source: www.cam.ac.uk

Channels of timely and reliable information targeting UK-born people living on the continent urgently need to be developed, say researchers – before life-changing decisions are made rashly in a milieu of rumour and speculation.

UK citizens abroad need to be empowered to make sound, informed decisions during Brexit negotiations on whether to remain in their adopted homelands or return to the UK

Brendan Burchell

University of Cambridge researchers have set out to compile a database of communication routes that will allow UK expats residing in EU nations to receive reliable, up-to-the-minute advice throughout the negotiation process once Article 50 is triggered.

The work is part of an effort to mitigate rash Brexit-induced decisions fuelled by an information vacuum that could see thousands of over-65s in particular arriving back in the UK without necessarily having property or pensions on return.

Such a sudden reverse migration could increase pressures on already overstretched health and social care services in the UK, at a time when significant numbers of key workers in these sectors may themselves be returning to EU homelands as a result of Brexit-related insecurities.

Researchers say that fears over future rights held by UK citizens who have settled on the continent – about everything from possible legal status and rights to work, as well as access to welfare, healthcare and pensions – could be exacerbated by misinformation resulting from rumour, speculation and tabloid bombast.

They say there is an urgent need to create a ‘one stop shop’ for trustworthy information channels that cover the various types of UK migrants currently within the remaining EU: from students and young families in the cities to retirees on Mediterranean coastlines.

The research, funded by the UK’s Economic and Social Research Council (ESRC), will take place over the next six weeks. Researchers say the final product will be shared widely with trusted parties such as government agencies, legal charities and citizens advice bureaux, but will not be released fully into the public domain for fear of exploitation by commercial and lobby organisations.

“UK citizens abroad need to be empowered to make sound, informed decisions during Brexit negotiations on whether to remain in their adopted homelands or return to the UK,” says lead researcher Dr Brendan Burchell from Cambridge’s Department of Sociology.

“However, at the moment there is a missing link: there is no database of the conduits through which high quality information can be communicated that targets specific countries or sub-groups of UK migrants. This is what we aim to build over the coming weeks.”

The team of researchers will be scouring the internet and interrogating local charities and expat organisations to compile the most comprehensive list of information channels used by UK citizens in each of the other EU27 countries. These will include legal, health, financial and property advice services, English language local newspapers, Facebook pages, blogs, chat rooms and so on.

Last year, the BBC’s ‘Reality Check’ website reported that there are around 1.2 million UK-born people living in EU nations. Over 300,000 of those live in Spain, of whom one-third receive a UK state pension.

Burchell says that during the heated debates around Brexit, talk of migratory influxes into the UK has been almost entirely limited to EU nationals. Little consideration has been given to returning UK nationals from EU countries such as France and Spain – many of whom are increasingly elderly baby-boomer retirees who may not have lived in the UK for a decade or more.

“Without access to well-grounded information that updates throughout the Brexit process, the current void will be increasingly filled with dangerous speculation and even so-called ‘fake news’ from partisan groups or those that would seek to prey upon the anxiety of UK over-65s to make quick money through lowball property sales or investment scams,” says Dr Burchell.

Professor Maura Sheehan, an economist from Edinburgh Napier University’s Business School who is also working on the project, believes that if panic is sparked it could lead to a domino effect in certain expatriate communities.

“Housing markets in areas along the Mediterranean coast could collapse as retirees try to sell up, but with no new UK expats looking to buy. Life savings could get swept away in the confusion,” she says.

“Meanwhile there is no slack in UK social infrastructure for ageing expats returning en masse with expectations of support. The NHS has yet to emerge from its current crisis, there is a desperate shortage of housing, and social care is badly underfunded.

“The idea that we could see socially isolated baby-boomer expats back in the UK with health conditions, financial woes and even ending in destitution as a result of bad decisions based on misinformation should not simply be written off as so-called ‘remoaner’ hysteria.”

Anyone who would like to suggest material for the database or find out more about the project can contact the team on brexit_expat_info@magd.cam.ac.uk.

inset image by Ville Miettinen (cc: Att-SA)


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Mapping The Family Tree Of Stars


Image showing a family tree of stars in our Galaxy, including the Sun
source: www.cam.ac.uk

Astronomers are borrowing principles applied in biology and archaeology to build a family tree of the stars in the galaxy. By studying chemical signatures found in the stars, they are piecing together these evolutionary trees looking at how the stars formed and how they are connected to each other. The signatures act as a proxy for DNA sequences. It’s akin to chemical tagging of stars and forms the basis of a discipline astronomers refer to as Galactic archaeology.

The branches of the tree serve to inform us about the stars’ shared history

Dr Paula Jofré

It was Charles Darwin who, in 1859, published his revolutionary theory that all life forms are descended from one common ancestor. This theory has informed evolutionary biology ever since, but it was a chance encounter between an astronomer and a biologist over dinner at King’s College in Cambridge that got the astronomer thinking about how it could be applied to stars in the Milky Way.

Writing in Monthly Notices of the Royal Astronomical Society, Dr Paula Jofré, of the University of Cambridge’s Institute of Astronomy, describes how she set about creating a phylogenetic “tree of life” that connects a number of stars in the galaxy.

“The use of algorithms to identify families of stars is a science that is constantly under development. Phylogenetic trees add an extra dimension to our endeavours, which is why this approach is so special. The branches of the tree serve to inform us about the stars’ shared history,” she says.

The team picked twenty-two stars, including the Sun, to study. Their chemical elements were carefully measured from ground-based high-resolution spectra taken with large telescopes in the north of Chile. Once the families were identified using this chemical DNA, their evolution was studied with the help of ages and kinematic properties obtained from the space mission Hipparcos, the precursor of Gaia, the spacecraft launched by the European Space Agency that is almost halfway through a five-year project to map the sky.

Stars are born when gas clouds in the galaxy collapse, a process sometimes triggered by violent events such as nearby supernova explosions. Two stars with the same chemical composition are likely to have been born in the same molecular cloud. Some stars have lifetimes longer than the current age of the Universe and so serve as fossil records of the composition of the gas at the time they were formed. The oldest star in the sample analysed by the team is estimated to be almost ten billion years old – roughly twice the age of the Sun. The youngest is 700 million years old.

In evolution, organisms are linked together by a pattern of descent with modification as they evolve. Stars are very different from living organisms, but they still have a history of shared descent as they are formed from gas clouds, and carry that history in their chemical structure. By applying the same phylogenetic methods that biologists use to trace descent in plants and animals it is possible to explore the ‘evolution’ of stars in the Galaxy.
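The tree-building itself can be illustrated with a minimal sketch. The study uses phylogenetic methods borrowed from biology; the toy example below instead uses simple hierarchical clustering on made-up chemical abundance ratios, purely to show the general idea of grouping stars by chemical similarity. The star names, abundance values and clustering choices are illustrative assumptions, not the authors’ data or method.

import numpy as np
from scipy.cluster.hierarchy import dendrogram, linkage
from scipy.spatial.distance import pdist

# Hypothetical abundance ratios (dex) for a handful of solar-neighbourhood stars;
# columns might be [Fe/H], [Mg/Fe], [Si/Fe], [Ba/Fe].
abundances = np.array([
    [ 0.00,  0.02,  0.01,  0.03],   # Sun-like star
    [ 0.05, -0.01,  0.00,  0.05],
    [-0.15,  0.10,  0.08, -0.02],
    [-0.60,  0.30,  0.25, -0.10],   # older, alpha-enhanced star
    [-0.55,  0.28,  0.27, -0.08],
])
names = ["Sun", "Star A", "Star B", "Star C", "Star D"]

# Stars born in the same molecular cloud should sit close together in abundance
# space, so pairwise chemical distance acts as the 'genetic' distance.
chemical_distance = pdist(abundances, metric="euclidean")

# Agglomerative clustering builds a tree whose branching order can be read as a
# crude family tree of shared chemical history.
tree = linkage(chemical_distance, method="average")
dendrogram(tree, labels=names, no_plot=True)  # set no_plot=False with matplotlib to draw it
print(tree)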

“The differences between stars and animals are immense, but they share the property of changing over time, and so both can be analysed by building trees of their history,” says Professor Robert Foley, of the Leverhulme Centre for Human Evolutionary Studies at Cambridge.

With an increasing number of datasets being made available from both Gaia and more advanced telescopes on the ground, and on-going and future large spectroscopic surveys, astronomers are moving closer to being able to assemble one tree that would connect all the stars in the Milky Way.

Paula Jofré et al. ‘Cosmic phylogeny: reconstructing the chemical history of the solar neighbourhood with an evolutionary tree’ is published by Monthly Notices of the Royal Astronomical Society. DOI 10.1093/mnras/stx075

 


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Virtual Reality Journey Through A Tumour: Cambridge Scientists Receive £40 Million Funding Boost


source: www.cam.ac.uk

Cambridge scientists have received two of the biggest funding grants ever awarded by Cancer Research UK, with the charity set to invest £40 million over the next five years in two ground-breaking research projects in the city.

This is an enormous challenge. I liken it to the idea of putting a man on Mars – there’s so much technology that you have to develop to do it. All sorts of things are happening in tumours that we can’t study using the technology we have

Greg Hannon

The funding will come from the first Cancer Research UK Grand Challenge awards – set up to help scientists solve some of the hardest unanswered questions in cancer research, and to revolutionise the prevention, diagnosis and treatment of cancer.

Teams based at the Cancer Research UK Cambridge Institute and the Wellcome Trust Sanger Institute – both part of the Cancer Research UK Cambridge Centre – have been awarded two of the four grants.

Professor Greg Hannon will lead a team at the Cancer Research UK Cambridge Institute, part of the University of Cambridge. Working with researchers in America, Switzerland, Canada and Ireland, they aim to build 3D versions of breast tumours, which can be studied using virtual reality, allowing scientists and doctors to study every cell and aspect of a tumour in unprecedented detail.

This new way of studying breast cancer could change how the disease is diagnosed, treated and managed. The virtual reality experience will include a ‘superman mode’ which will allow users to fly inside the tumour, point at every cell and find out exactly what kind of cell it is and what it’s doing.

Professor Hannon said: “This is an enormous challenge. I liken it to the idea of putting a man on Mars – there’s so much technology that you have to develop to do it. All sorts of things are happening in tumours that we can’t study using the technology we have. But with our project, we hope to change that.

“We want to create an interactive, faithful, 3D map of tumours that can be studied in virtual reality that scientists can ‘walk into’ and look at it in great detail. By doing this, we could learn more about tumours and begin to answer questions that have eluded cancer scientists for many years.”

At the Wellcome Trust Sanger Institute, Professor Sir Mike Stratton will lead a team aiming to build a deeper understanding of what causes cancer.

It’s already known that things in our environment, and behaviours like smoking and drinking alcohol, cause cancer by damaging the DNA in our cells. This damage occurs in distinctive patterns known as mutational fingerprints that are unique to their cause. For example, cancers caused by UV exposure have a different mutational fingerprint to cancers caused by tobacco.

There are at least 50 cancer-associated mutational fingerprints, but researchers only know what causes around half of them. Professor Stratton’s team hopes to fill in the gaps and determine the as-yet-unknown causes of cancer.

They’ll do this by studying 5,000 pancreatic, kidney, oesophageal and bowel cancer samples, which come from five continents. This will generate as much cancer DNA sequence data as the whole world has produced so far. This work could help prevent more cancers and reduce the global burden of the disease.
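The article does not spell out how such fingerprints are extracted from sequence data, but a widely used approach in this field is non-negative matrix factorisation (NMF) of a catalogue of mutation counts. The sketch below is a generic illustration of that approach under stated assumptions: the randomly generated catalogue, the choice of five signatures and all parameter values are placeholders, not the team’s actual pipeline.

import numpy as np
from sklearn.decomposition import NMF

# Hypothetical mutation catalogue: rows are tumour samples, columns are the 96
# trinucleotide mutation classes (e.g. A[C>A]A, A[C>A]C, ...). Real catalogues
# come from counting mutations in sequenced tumours.
rng = np.random.default_rng(0)
catalogue = rng.poisson(lam=rng.uniform(1.0, 20.0, size=(50, 96)))

# NMF decomposes the catalogue into 'signatures' (characteristic patterns of
# mutation classes) and per-sample 'exposures' (how much each signature
# contributed to each tumour).
model = NMF(n_components=5, init="nndsvda", max_iter=500, random_state=0)
exposures = model.fit_transform(catalogue)    # shape: (50 samples, 5 signatures)
signatures = model.components_                # shape: (5 signatures, 96 classes)

# Normalise each signature so its 96 class weights sum to one, giving a
# probability profile that can be compared with known fingerprints.
signatures = signatures / signatures.sum(axis=1, keepdims=True)
print(exposures.shape, signatures.shape)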

Professor Stratton said: “The main aim of our Grand Challenge is to understand the causes of cancer. Every cancer retains an archaeological trace, a record in its DNA, of what caused it. It’s that record that we want to explore to find out what caused the cancer.

“We’re going to sequence the DNA of thousands of cancer samples that have been collected from many different countries around the world, and study them to see what archaeological trace they contain. By doing this, we hope to figure out what caused those cancers.

“The thing that’s really exciting me is the challenge of making it all happen. And I’m looking forward to seeing the answers this work brings.”

The Cambridge projects were selected by an international panel of experts from a shortlist of nine exceptional, multi-disciplinary collaborations from universities, institutes and industry across the globe.

Sir Harpal Kumar, Cancer Research UK’s chief executive, said: “Cancer Research UK set up the Grand Challenge awards to bring a renewed focus and energy to the fight against cancer. We want to shine a light on the toughest questions that stand in the way of progress. We’re incredibly excited to be able to support these exceptional teams as they help us achieve our ambition.

“Cancer is a global problem, and these projects are part of the global solution. Together, we will redefine cancer – turning it from a disease that so many people die from, to one that many people can live with. We will reduce the number of people worldwide affected by cancer and achieve our goal of beating cancer sooner.”

Adapted from a press release by Cancer Research UK


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Viral Charity Campaigns Have A Psychological ‘Recipe’ And All-Too-Brief Lifespan


source: www.cam.ac.uk

New work focusing on the ALS Ice Bucket Challenge reveals the very brief shelf life of such viral campaigns, and suggests the nature of ‘virality’ and social tipping points themselves may be a stumbling block to deeper engagement with the social issues that campaigns aim to promote.

Increasing meaningful engagement through viral altruism might actually require deliberately hindering the hyper-viral nature at some point with a stabilising force

Sander van der Linden

A University of Cambridge researcher has identified a recipe for the new breed of wildly successful online charity campaigns such as the ALS Ice Bucket Challenge – a phenomenon he has labelled “viral altruism” – and what might make them stick in people’s minds.

However, he says the optimistic use of global digital networks to propel positive social change is balanced by the shallow, short-lived nature of engagement with anything viral.

Writing in the journal Nature Human Behaviour, social psychologist Dr Sander van der Linden has outlined the key psychological levers he says underpin the new wave of viral altruism that is increasingly taking over our Facebook feeds.

These include the power of social norms, particularly the appeal of joining a social consensus and the desire to conform to prosocial behaviour (such as appearing charitable), having a clear moral incentive to act, and the appetite for a ‘warm glow’: the positive emotional benefit derived from feeling compassionate.

One of the most important ingredients – and the hardest to achieve – is ‘translational impact’: the conversion of online token support, or ‘clicktivism’, into sustained real world contributions, whether financial donations or a long-term commitment to an issue.

This, he says, involves a shift in motivation from the ‘extrinsic’ – incentives conditional on outside social pressures – to the ‘intrinsic’: an incentive that has been internalised to become a “new personal normal” for an individual.

Part of van der Linden’s initial research has been to pull together data such as Google and Wikipedia searches as well as donations to indicate the longevity and engagement levels of the ALS Ice Bucket Challenge campaign.
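The article does not detail how these longevity measures were computed. One simple way to quantify a campaign’s ‘shelf life’ from such data is to compare post-peak interest against a pre-campaign baseline; the sketch below does this for an invented daily search-interest series. All dates, values and the reversion threshold are assumptions for illustration only.

import pandas as pd

# Hypothetical daily search-interest index (0-100, Google Trends style) around a
# viral spike; the numbers are made up for illustration.
dates = pd.date_range("2014-07-01", periods=90, freq="D")
interest = pd.Series(
    [2] * 30
    + [15, 40, 80, 100, 95, 70, 50, 35, 25, 18, 12, 9, 7, 6, 5, 4, 4, 3, 3, 3]
    + [2] * 40,
    index=dates,
)

baseline = interest.iloc[:30].mean()   # pre-viral level
peak_day = interest.idxmax()

# First day after the peak on which interest has fallen back to (near) baseline.
post_peak = interest.loc[peak_day:]
reverted = post_peak[post_peak <= baseline * 1.5].index.min()

print(f"Peak on {peak_day.date()}; back near baseline by {reverted.date()} "
      f"({(reverted - peak_day).days} days after the peak)")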

The Challenge reached unprecedented ‘virality’ during August 2014. The formula of videoing ice-cold water being poured over your head and posting it to social media while publicly nominating others to do the same in support of a motor neurone disease charity reached approximately 440 million people worldwide, with over 28 million joining in.

‘Brightly but briefly’

Yet van der Linden found that the Challenge burned brightly but briefly, with online interest and donations reverting to pre-viral levels in mere weeks. The engagement was also superficial: estimates suggest that 1 in 4 participants did not mention the ALS charity in their videos and only 1 in 5 mentioned a donation.

And, while the 2014 campaign caused a significant spike in donations – some $115m – when the ALS charity attempted to reboot the Ice Bucket Challenge the following year it raised less than 1% of the previous summer’s total.

Other examples of viral altruism considered to be successful also appear to have an equally brief “half-life”. The Facebook organ donor initiative elicited more than 60% of its total online registrations in the first two days before numbers rapidly dropped off. Save Darfur was one of the largest campaigns on Facebook; after joining, most members never donated money or recruited anyone else.

Van der Linden believes converting the brief social pressures of viral altruism into self-sustaining personal motivations is the key to leveraging new digital networks for long-term engagement with the big issues of our time, such as climate change.

However, he argues that it may be the very viral nature of ‘viral altruism’ that acts as a barrier to this.

“Society now has the ability to connect and mobilise over a billion Facebook users to action on specific social issues in a fast and low-cost manner, but it is becoming clear this entails viral phenomena which by their very nature are ephemeral and superficial,” says van der Linden, from Cambridge’s Department of Psychology.

Hyper-viral paradox

“Just as a flame that burns twice as bright burns half as long, so a rapid social consensus spike reaches an equally rapid saturation point.

“Once the social tipping point of a campaign has passed, momentum can decay quickly and the purpose can get diluted. Once the ALS campaign had reached peak virality, many people were just pouring cold water over their heads without necessarily referencing the charity.

“Paradoxically, increasing meaningful engagement through viral altruism might actually require deliberately hindering the hyper-viral nature at some point with a stabilising force. Perhaps introducing aspects to a campaign that increasingly require more commitment – slowing growth and encouraging deeper engagement. If we want people to internalise a new normal, we need to give them a window big enough to do that.

“Deeper engagement seems especially vital. Something as simple as a single phrase connecting a campaign to its cause can make a difference. For example, those who mentioned the ALS charity in their Ice Bucket Challenge video were five times more likely to donate money than those who did not.”

SMART recipe

Van der Linden has set out his recipe for viral altruism using the acronym SMART: Social influences; Moral imperatives; Affective Reactions; Translational impact.

The ALS campaign managed to exploit a two-pronged approach to ‘social influences’. People were influenced by the example of those in their network, and wanted to join the burgeoning consensus. The nature of the campaign also meant that many were publicly challenged to participate by their social network, and risked the ‘social sanction’ of being seen to lack compassion if they then didn’t.

Helping people with a debilitating disease was seen as a ‘moral imperative’. Van der Linden says that having ‘identifiable victims’ such as scientist Prof Stephen Hawking allowed people to relate to the cause.

Campaigns that allow for the creation of a shared identity between the individual and the cause over time appear to be more successful in achieving translational impact.

Sander van der Linden

‘Affective Reactions’ is the response to strong emotional content. “Empathy is an emotional contagion,” says van der Linden. “We are evolutionarily hard-wired to ‘catch’ other people’s feelings. Responding with an altruistic act gives us a ‘warm glow’ of positivity. Similarly, people often respond to social injustice, such as genocide, with strong moral outrage.”

However, where almost all campaigns stumble is ‘Translational impact’, he says. “Extrinsic incentives, such as competitions or network pressure, can actually undermine people’s intrinsic motivation to do good by eroding moral sentiment. Motivation to participate can get sourced from a desire to ‘win’ a challenge or appear virtuous rather than caring about the cause itself.”

Climate change is an example of a major global issue that currently scores pretty much zero for the SMART recipe, says van der Linden.

“Climate change often fails to elicit strong emotional engagement, there is little to no societal pressure to act on climate change in our daily lives, most people do not view it as a fundamental moral issue, and the long-term nature of the problem requires more than a one-off donation.”

He suggests that using the SMART recipe could be a way to reverse engineer more effective climate change campaigns that harness viral altruism, but the problem of translational impact remains.

One of the more impactful campaigns van der Linden highlights is Movember: the month-long growing of a moustache to raise awareness of men’s health. Starting with just 30 people in 2003, the campaign didn’t experience viral hypergrowth, but developed over years to reach about 5 million members by 2014 – by which time the charity reported that 75% of participants were more aware of health issues facing men.

“Campaigns that allow for the creation of a shared identity between the individual and the cause over time appear to be more successful in achieving translational impact.”


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

A Bridge of Stars Connects Two Dwarf Galaxies


source: www.cam.ac.uk

The Magellanic Clouds, the two largest satellite galaxies of the Milky Way, appear to be connected by a bridge stretching across 43,000 light years, according to an international team of astronomers led by researchers from the University of Cambridge. The discovery is reported in the journal Monthly Notices of the Royal Astronomical Society (MNRAS) and is based on the Galactic stellar census being conducted by Gaia, the European Space Agency’s space observatory.

We believe that at least in part this ‘bridge’ is composed of stars stripped from the Small Cloud by the Large

Vasily Belokurov

For the past 15 years, scientists have been eagerly anticipating the data from Gaia. The first portion of information from the satellite was released three months ago and is freely accessible to everyone. This dataset of unprecedented quality is a catalogue of the positions and brightness of a billion stars in our Milky Way galaxy and its environs.

What Gaia has sent to Earth is unique. The satellite’s angular resolution is similar to that of the Hubble Space Telescope, but thanks to its much larger field of view it can cover the entire sky rather than a small portion of it. In fact, Gaia uses the largest number of pixels of any space-borne instrument to take digital images of the sky. Better still, the Observatory has not just one telescope but two, sharing a single one-metre-wide focal plane.

Unlike typical telescopes, Gaia does not just point and stare: it constantly spins around its axis, sweeping the entire sky in less than a month. Therefore, it not only measures the instantaneous properties of the stars, but also tracks their changes over time. This provides a perfect opportunity for finding a variety of objects, for example stars that pulsate or explode – even if this is not what the satellite was primarily designed for.

The Cambridge team concentrated on the area around the Magellanic Clouds and used the Gaia data to pick out pulsating stars of a particular type: the so-called RR Lyrae, very old and chemically un-evolved. As these stars have been around since the earliest days of the Clouds’ existence, they offer an insight into the pair’s history. Studying the Large and Small Magellanic Clouds (LMC and SMC respectively) has always been difficult as they sprawl out over a large area. But with Gaia’s all-sky view, this has become a much easier task.

Around the Milky Way, the clouds are the brightest and largest examples of dwarf satellite galaxies. Known to humanity since the dawn of history (and to Europeans since their first voyages to the Southern hemisphere), the Magellanic Clouds have remained an enigma to date. Even though the clouds have been a constant fixture of the heavens, astronomers have only recently had the chance to study them in any detail.

The Magellanic Clouds can be seen just above the horizon and below the arc of the Milky Way – D Erkal

Whether the clouds fit the conventional theory of galaxy formation or not depends critically on their mass and the time of their first approach to the Milky Way. The researchers at Cambridge’s Institute of Astronomy found clues that could help answer both of these questions.

Firstly, the RR Lyrae stars detected by Gaia were used to trace the extent of the Large Magellanic Cloud. The LMC was found to possess a fuzzy low-luminosity ‘halo’ stretching as far as 20 degrees from its centre. The LMC would only be able to hold on to the stars at such large distances if it was substantially bigger than previously thought, totalling perhaps as much as a tenth of the mass of the entire Milky Way.

An accurate timing of the clouds’ arrival to the galaxy is impossible without knowledge of their orbits. Unfortunately, satellite orbits are difficult to measure: at large distances, the object’s motion in the sky is so minute that it is simply unobservable over a human lifespan. In the absence of an orbit, Dr Vasily Belokurov and colleagues found the next best thing: a stellar stream.

Streams of stars form when a satellite – a dwarf galaxy or a star cluster – starts to feel the tidal force of the body around which it orbits. The tides stretch the satellite in two directions: towards and away from the host. As a result, on the periphery of the satellite, two openings form: small regions where the gravitational pull of the satellite is balanced by the pull of the host. Satellite stars that enter these regions find it easy to leave the satellite altogether and start orbiting the host. Slowly, star after star abandons the satellite, leaving a luminous trace on the sky, and thus revealing the satellite’s orbit.
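The size of these escape regions is set by what astronomers call the tidal (or Jacobi) radius. The article does not quote it, but a standard textbook approximation for a satellite of mass m orbiting at distance D from a host whose enclosed mass is M is:

\[
r_{J} \approx D \left( \frac{m}{3M} \right)^{1/3}
\]

Stars that drift beyond roughly this distance from the satellite’s centre are no longer securely bound to it and can be peeled away by the host, feeding the stream.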

“Stellar streams around the Clouds were predicted but never observed,” explains Dr Belokurov. “Having marked the locations of the Gaia RR Lyrae on the sky, we were surprised to see a narrow bridge-like structure connecting the two clouds. We believe that at least in part this ‘bridge’ is composed of stars stripped from the Small Cloud by the Large. The rest may actually be the LMC stars pulled from it by the Milky Way.”

The researchers believe the RR Lyrae bridge will help to clarify the history of the interaction between the clouds and our galaxy.

“We have compared the shape and the exact position of the Gaia stellar bridge to the computer simulations of the Magellanic Clouds as they approach the Milky Way,” explains Dr Denis Erkal, a co-author of the study. “Many of the stars in the bridge appear to have been removed from the SMC in the most recent interaction, some 200 million years ago, when the dwarf galaxies passed relatively close by each other. We believe that as a result of that fly-by, not only the stars but also hydrogen gas was removed from the SMC. By measuring the offset between the RR Lyrae and hydrogen bridges, we can put constraints on the density of the gaseous Galactic corona.”

Composed of ionised gas at very low density, the hot Galactic corona is notoriously difficult to study. Nevertheless, it has been the subject of intense scrutiny because scientists believe it may contain most of the missing baryonic – or ordinary – matter. Astronomers are trying to estimate where this missing matter (the atoms and ions that make up stars, planets, dust and gas) is. It’s thought that most, or even all, of these missing baryons are in the corona. By measuring the coronal density at large distances they hope to solve this conundrum.

During the previous encounter between the Small and Large Magellanic Cloud, both stars and gas were ripped out of the Small Cloud, forming a tidal stream. Initially, the gas and stars were moving at the same speed. However, as the Clouds approached our Galaxy, the Milky Way’s corona exerted a drag force on both of them. The stars, being relatively small and dense, punched through the corona with no change in their speed. However, the more tenuous neutral hydrogen gas slowed down substantially in the corona. By comparing the current location of the stars and the gas, taking into account the density of the gas and how long the Clouds have spent in the corona, the team estimated the density of the corona. Dr. Erkal concludes, “Our estimate showed that the corona could make up a significant fraction of the missing baryons, in agreement with previous independent techniques. With the missing baryon problem seemingly alleviated, the current model of galaxy formation is holding up well to the increased scrutiny possible with Gaia.”
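A rough order-of-magnitude version of this argument (a back-of-the-envelope sketch, not the paper’s actual modelling, which relies on detailed simulations) treats the slowing of the gas as ram-pressure drag. For stripped gas of surface density Σ_gas moving at speed v through a corona of density ρ_corona for a time t, the drag deceleration and the resulting offset Δx between gas and stars are roughly:

\[
a_{\mathrm{drag}} \approx \frac{\rho_{\mathrm{corona}}\, v^{2}}{\Sigma_{\mathrm{gas}}},
\qquad
\Delta x \approx \tfrac{1}{2}\, a_{\mathrm{drag}}\, t^{2}
\;\;\Rightarrow\;\;
\rho_{\mathrm{corona}} \approx \frac{2\,\Delta x\, \Sigma_{\mathrm{gas}}}{v^{2}\, t^{2}}
\]

so the measured offset Δx, together with estimates of Σ_gas, v and t, yields an estimate of the coronal density.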

Reference
Vasily Belokurov et al. “Clouds, Streams and Bridges. Redrawing the blueprint of the Magellanic System with Gaia DR1”. Monthly Notices of the Royal Astronomical Society; 8th Feb. 2017; DOI:10.1093/mnras/stw3357


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Concerns Over Wasting Doctor’s Time May Affect Decision To See GP


source: www.cam.ac.uk

Worries over wasting their doctor’s time, particularly at a time when NHS resources are stretched, may influence when and whether patients choose to see their GP, according to a study carried out by the University of Cambridge.

Recognising this worry about timewasting among patients is important because it could influence whether a patient chooses to see the doctor or not. If a patient decided to hold off seeing the doctor for fear of wasting resources, this could have serious implications for their health

Nadia Llanwarne

In the study, published today in the journal Social Science and Medicine, researchers from the Cambridge Centre for Health Services Research report how the theme of ‘wasting doctors’ time’ arose so often during interviews conducted with patients about their experiences of primary care that they chose to study this topic in its own right.

“‘Am I wasting the doctor’s time?’ is a question that many patients ask themselves when deciding whether or not to visit the doctor,” explains Dr Nadia Llanwarne, who led the study. “We already knew that this worry existed among some patients, but this is the first study entirely dedicated to the subject that reports the existence of this worry among a variety of patients, young and old, healthy and sick, visiting their GP for a wide range of complaints.”

As part of the study, Dr Llanwarne and colleagues filmed patients’ consultations with their GPs and then interviewed 52 patients across GP surgeries in London, the east of England and south west England about their experience. It was in these interviews that the issue of timewasting arose.

The researchers identified three threads common to the issue of timewasting present across patients’ narratives in general practice: the experience of a conveyor belt approach to care, the intimation that ‘other patients’ waste time, and uncertainty among patients over what is worthy of their doctor’s time.

The authors consider the reasons why people appear concerned about timewasting. Patients spoke of the pressured context in which their consultations take place: the demand on services, the NHS’s limited resources, the lack of time, and busy doctors. Understanding the time pressures that doctors face, patients described how these challenges influenced their decision to see their GP.

In an overstretched NHS, time becomes all the more precious, and this has meant that public campaigns often refer to appropriate and inappropriate users. For decades, doctors have expressed frustration that too many patients visit unnecessarily. As a result of these judgments cast upon them, patients say they feel pressure to consult only when necessary, and speak openly of ‘timewasters’.

“Patients are keen to avoid this label, but neither the patients, nor the doctors, are able to clearly define what precise problems might attract such a label,” says Dr Llanwarne. “This is because some patients will present with what seems on the surface a minor problem, but once through the door of the doctor’s consulting room, they may open up about more serious complaints. With some symptoms it may be very difficult for the patient to know whether or not they are serious enough to need review by the doctor.

“Recognising this worry about timewasting among patients is important because it could influence whether a patient chooses to see the doctor or not. If a patient decided to hold off seeing the doctor for fear of wasting resources, this could have serious implications for their health.”

Dr Llanwarne adds: “It’s important for patients to not delay contacting their doctor simply because of worry about wasting doctors’ time. And it’s important for doctors to be attentive to the fact that many patients will be worried about this. Doctors can then ensure they allay patients’ concerns when they do seek help.”

The study was funded by the National Institute for Health Research.

Reference
Llanwarne, N et al. Wasting the doctor’s time? A video-elicitation interview study with patients in primary care. Social Science & Medicine; e-pub 18 January 2017; DOI: 10.1016/j.socscimed.2017.01.025


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Baltic Hunter-Gatherers Adopted Farming Without Influence of Mass Migration, Ancient DNA Suggests


source: www.cam.ac.uk

Ancient DNA analyses show that – unlike elsewhere in Europe – farmers from the Near East did not overtake hunter-gatherer populations in the Baltic. The findings also suggest that the Balto-Slavic branch of the Indo-European language family originated in the Steppe grasslands of the East.

The Baltic hunter-gatherer genome remains remarkably untouched until the great migrations of the Bronze Age sweep in from the East

Andrea Manica

New research indicates that Baltic hunter-gatherers were not swamped by migrations of early agriculturalists from the Middle East, as was the case for the rest of central and western Europe. Instead, these people probably acquired knowledge of farming and ceramics by sharing cultures and ideas rather than genes with outside communities.

Scientists extracted ancient DNA from a number of archaeological remains discovered in Latvia and Ukraine, which were between 5,000 and 8,000 years old. These samples spanned the Neolithic period, which was the dawn of agriculture in Europe, when people moved from a mobile hunter-gatherer lifestyle to a settled way of life based on food production.

We know through previous research that large numbers of early farmers from the Levant (the Near East) – driven by the success of their technological innovations such as crops and pottery – had expanded to the peripheral parts of Europe by the end of the Neolithic and largely replaced hunter-gatherer populations.

However, the new study, published today in the journal Current Biology, shows that the Levantine farmers did not contribute to hunter-gatherers in the Baltic as they did in Central and Western Europe.

The research team, which includes scientists from the University of Cambridge and Trinity College Dublin, say their findings instead suggest that the Baltic hunter-gatherers learned these skills through communication and cultural exchange with outsiders.

The findings feed into debates around the ‘Neolithic package’ – the cluster of technologies such as domesticated livestock, cultivated cereals and ceramics, which revolutionised human existence across Europe during the late Stone Age.

Advances in ancient DNA work have revealed that this ‘package’ was spread through Central and Western Europe by migration and interbreeding: the Levant and later Anatolian farmers mixing with and essentially replacing the hunter-gatherers.

But the new work suggests migration was not a ‘universal driver’ across Europe for this way of life. In the Baltic region, archaeology shows that the technologies of the ‘package’ did develop – albeit less rapidly – even though the analyses show that the genetics of these populations remained the same as those of the hunter-gatherers throughout the Neolithic.

Andrea Manica, one of the study’s senior authors from the University of Cambridge, said: “Almost all ancient DNA research up to now has suggested that technologies such as agriculture spread through people migrating and settling in new areas.”

“However, in the Baltic, we find a very different picture, as there are no genetic traces of the farmers from the Levant and Anatolia who transmitted agriculture across the rest of Europe.”

“The findings suggest that indigenous hunter-gatherers adopted Neolithic ways of life through trade and contact, rather than being settled by external communities. Migrations are not the only model for technology acquisition in European prehistory.”

The researchers analysed eight ancient genomes – six from Latvia and two from Ukraine – that spanned a timeframe of three and a half thousand years (between 8,300 and 4,800 years ago). This enabled them to start plotting the genetic history of Baltic inhabitants during the Neolithic.

DNA was extracted from the petrous area of skulls that had been recovered by archaeologists from some of the region’s richest Stone Age cemeteries. The petrous, at the base of the skull, is one of the densest bones in the body, and a prime location for DNA that has suffered the least contamination over millennia.

While the sequenced genomes showed no trace of the Levant farmer influence, one of the Latvian samples did reveal genetic influence from a different external source – one that the scientists say could be a migration from the Pontic Steppe in the east. The timing (5,000–7,000 years ago) fits with previous estimates for the emergence of the earliest Slavic languages.

Researcher Eppie Jones, from Trinity College Dublin and the University of Cambridge, was the lead author of the study. She said: “There are two major theories on the spread of Indo-European languages, the most widely spoken language family in the world. One is that they came from Anatolia with the agriculturalists; another that they developed in the Steppes and spread at the start of the Bronze Age.

“That we see no farmer-related genetic input, yet we do find this Steppe-related component, suggests that at least the Balto-Slavic branch of the Indo-European language family originated in the Steppe grasslands of the East, which would bring later migrations of Bronze Age horse riders.”

The researchers point out that the time scales seen in Baltic archaeology are also very distinct to the rest of Europe, with a much more drawn-out and piecemeal uptake of Neolithic technologies, rather than the complete ‘package’ that arrives with migrations to take most of Europe by storm.

Andrea Manica added: “Our evidence of genetic continuity in the Baltic, coupled with the archaeological record showing a prolonged adoption of Neolithic technologies, would suggest the existence of trade networks with farming communities largely independent of interbreeding.

“It seems the hunter-gatherers of the Baltic likely acquired bits of the Neolithic package slowly over time through a ‘cultural diffusion’ of communication and trade, as there is no sign of the migratory wave that brought farming to the rest of Europe during this time.

“The Baltic hunter-gatherer genome remains remarkably untouched until the great migrations of the Bronze Age sweep in from the East.”


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Ancient DNA Reveals Genetic ‘Continuity’ Between Stone Age and Modern Populations In East Asia


source: www.cam.ac.uk

In contrast to Western Europeans, new research finds contemporary East Asians are genetically much closer to the ancient hunter-gatherers that lived in the same region eight thousand years previously.

The Ulchi and the ancient hunter-gatherers appeared to be almost the same population from a genetic point of view, even though there are thousands of years between them

Andrea Manica

Researchers working on ancient DNA extracted from human remains interred almost 8,000 years ago in a cave in the Russian Far East have found that the genetic makeup of certain modern East Asian populations closely resemble that of their hunter-gatherer ancestors.

The study, published today in the journal Science Advances, is the first to obtain nuclear genome data from ancient mainland East Asia and compare the results to modern populations.

The findings indicate that there was no major migratory interruption, or “population turnover”, for well over seven millennia. Consequently, some contemporary ethnic groups share a remarkable genetic similarity to Stone Age hunters that once roamed the same region.

The high “genetic continuity” in East Asia is in stark contrast to most of Western Europe, where sustained migrations of early farmers from the Levant overwhelmed hunter-gatherer populations. This was followed by a wave of horse riders from Central Asia during the Bronze Age. These events were likely driven by the success of emerging technologies such as agriculture and metallurgy.

The new research shows that, at least for part of East Asia, the story differs – with little genetic disruption in populations since the early Neolithic period.

This continuity explains the exceptional genetic proximity between the Ulchi people of the Amur Basin, near where Russia borders China and North Korea, and the ancient hunter-gatherers laid to rest in a cave close to the Ulchi’s native land – despite the vast expanse of history separating them.

The researchers suggest that the sheer scale of East Asia and dramatic variations in its climate may have prevented the sweeping influence of Neolithic agriculture and the accompanying migrations that replaced hunter-gatherers across much of Europe. They note that the Ulchi retained their hunter-fisher-gatherer lifestyle until recent times.

“Genetically speaking, the populations across northern East Asia have changed very little for around eight millennia,” said senior author Andrea Manica from the University of Cambridge, who conducted the work with an international team, including colleagues from Ulsan National Institute of Science and Technology in Korea, and Trinity College Dublin and University College Dublin in Ireland.

“Once we accounted for some local intermingling, the Ulchi and the ancient hunter-gatherers appeared to be almost the same population from a genetic point of view, even though there are thousands of years between them.”

The new study also provides further support for the ‘dual origin’ theory of modern Japanese populations: that they descend from a combination of hunter-gatherers and agriculturalists that eventually brought wet rice farming from southern China. A similar pattern is also found in neighbouring Koreans, who are genetically very close to Japanese.

However, Manica says that much more DNA data from Neolithic China is required to pinpoint the origin of the agriculturalists involved in this mixture.

The team from Trinity College Dublin were responsible for extracting DNA from the remains, which were found in a cave known as Devil’s Gate. Situated in a mountainous area close to the far eastern coast of Russia that faces northern Japan, the cave was first excavated by a Soviet team in 1973.

Along with hundreds of stone and bone tools, the carbonised wood of a former dwelling, and woven wild grass that is one of the earliest examples of a textile, the cave contained the incomplete bodies of five humans.

If ancient DNA can be found in sufficiently preserved remains, sequencing it involves sifting through the contamination of millennia. The best samples for analysis from Devil’s Gate were obtained from the skulls of two females: one in her early twenties, the other close to fifty. The site itself dates back over 9,000 years, but the two women are estimated to have died around 7,700 years ago.

Researchers were able to glean the most from the middle-aged woman. Her DNA revealed she likely had brown eyes and thick, straight hair. She almost certainly lacked the ability to tolerate lactose, but was unlikely to have suffered from ‘alcohol flush’: the skin reaction to alcohol now common across East Asia.

While the Devil’s Gate samples show high genetic affinity to the Ulchi, fishermen from the same area who speak a Tungusic language, they are also close to other Tungusic-speaking populations in present-day China, such as the Oroqen and Hezhen.

“These are ethnic groups with traditional societies and deep roots across eastern Russia and China, whose culture, language and populations are rapidly dwindling,” added lead author Veronika Siska, also from Cambridge.

“Our work suggests that these groups form a strong genetic lineage descending directly from the early Neolithic hunter-gatherers who inhabited the same region thousands of years previously.”


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Bag-Like Sea Creature Was Humans’ Oldest Known Ancestor


source: www.cam.ac.uk

A tiny sea creature identified from fossils found in China may be the earliest known step on an evolutionary path that eventually led to the emergence of humans

We think that as an early deuterostome this may represent the primitive beginnings of a very diverse range of species, including ourselves

Simon Conway Morris

Researchers have identified traces of what they believe is the earliest known prehistoric ancestor of humans – a microscopic, bag-like sea creature, which lived about 540 million years ago.

Named Saccorhytus, after the sack-like features created by its elliptical body and large mouth, the species is new to science and was identified from microfossils found in China. It is thought to be the most primitive example of a so-called “deuterostome” – a broad biological category that encompasses a number of sub-groups, including the vertebrates.

If the conclusions of the study, published in the journal Nature, are correct, then Saccorhytus was the common ancestor of a huge range of species, and the earliest step yet discovered on the evolutionary path that eventually led to humans, hundreds of millions of years later.

Modern humans are, however, unlikely to perceive much by way of a family resemblance. Saccorhytus was about a millimetre in size, and probably lived between grains of sand on the seabed. Its features were spectacularly preserved in the fossil record – and intriguingly, the researchers were unable to find any evidence that the animal had an anus.

The study was carried out by an international team of academics, including researchers from the University of Cambridge in the UK and Northwest University in Xi’an, China, with support from other colleagues at institutions in China and Germany.

Simon Conway Morris, Professor of Evolutionary Palaeobiology and a Fellow of St John’s College, University of Cambridge, said: “We think that as an early deuterostome this may represent the primitive beginnings of a very diverse range of species, including ourselves. To the naked eye, the fossils we studied look like tiny black grains, but under the microscope the level of detail is jaw-dropping. All deuterostomes had a common ancestor, and we think that is what we are looking at here.”

Degan Shu, from Northwest University, added: “Our team has notched up some important discoveries in the past, including the earliest fish and a remarkable variety of other early deuterostomes. Saccorhytus now gives us remarkable insights into the very first stages of the evolution of a group that led to the fish, and ultimately, to us.”

 

 

Most other early deuterostome groups are from about 510 to 520 million years ago, when they had already begun to diversify into not just the vertebrates, but the sea squirts, echinoderms (animals such as starfish and sea urchins) and hemichordates (a group including things like acorn worms). This level of diversity has made it extremely difficult to work out what an earlier, common ancestor might have looked like.

The Saccorhytus microfossils were found in Shaanxi Province, in central China, and pre-date all other known deuterostomes. By isolating the fossils from the surrounding rock, and then studying them both under an electron microscope and using a CT scan, the team were able to build up a picture of how Saccorhytus might have looked and lived. This revealed features and characteristics consistent with current assumptions about primitive deuterostomes.

Dr Jian Han, of Northwest University, said: “We had to process enormous volumes of limestone – about three tonnes – to get to the fossils, but a steady stream of new finds allowed us to tackle some key questions: was this a very early echinoderm, or something even more primitive? The latter now seems to be the correct answer.”

In the early Cambrian period, the region would have been a shallow sea. Saccorhytus was so small that it probably lived in between individual grains of sediment on the sea bed.

The study suggests that its body was bilaterally symmetrical – a characteristic inherited by many of its descendants, including humans – and was covered with a thin, relatively flexible skin. This in turn suggests that it had some sort of musculature, leading the researchers to conclude that it could have made contractile movements, and got around by wriggling.

Perhaps its most striking feature, however, was its rather primitive means of eating food and then dispensing with the resulting waste. Saccorhytus had a large mouth, relative to the rest of its body, and probably ate by engulfing food particles, or even other creatures.

A crucial observation is the set of small conical structures on its body. These may have allowed the water that it swallowed to escape, and so were perhaps the evolutionary precursor of the gills we now see in fish. But the researchers were unable to find any evidence that the creature had an anus. “If that was the case, then any waste material would simply have been taken back out through the mouth, which from our perspective sounds rather unappealing,” Conway Morris said.

The findings also provide evidence in support of a theory explaining the long-standing mismatch between fossil evidence of prehistoric life, and the record provided by biomolecular data, known as the “molecular clock”.

Technically, it is possible to estimate roughly when species diverged by looking at differences in their genetic information. In principle, the longer two groups have evolved separately, the greater the biomolecular difference between them should be, and there are reasons to think this process is more or less clock-like.
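In its simplest textbook form (not spelled out in the article), the molecular clock relates the genetic distance d between two lineages to the substitution rate μ (changes per site per year) and the time t since they diverged; the factor of two appears because both lineages accumulate changes independently:

\[
d \approx 2\,\mu\,t
\quad\Longrightarrow\quad
t \approx \frac{d}{2\,\mu}
\]

Deviations from a strict clock and uncertainty in μ are part of what makes independent calibration against fossils so valuable.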

Unfortunately, before a point corresponding roughly to the time at which Saccorhytus was wriggling in the mud, there are scarcely any fossils available to match the molecular clock’s predictions. Some researchers have theorised that this is because before a certain point, many of the creatures they are searching for were simply too small to leave much of a fossil record. The microscopic scale of Saccorhytus, combined with the fact that it is probably the most primitive deuterostome yet discovered, appears to back this up.

The findings are published in Nature. Reference: Jian Han, Simon Conway Morris, Qiang Ou, Degan Shu and Hai Huang. Meiofaunal deuterostomes from the basal Cambrian of Shaanxi (China). DOI: 10.1038/nature21072.

Inset image: Photographs of the fossils show the spectacularly detailed levels of preservation which allowed researchers to identify and study the creature. Credit: Jian Han.


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Pets Are A Child’s Best Friend, Not Their Siblings


source: www.cam.ac.uk

Children get more satisfaction from relationships with their pets than with their brothers or sisters, according to new research from the University of Cambridge. Children also appear to get on even better with their animal companions than with siblings.

The fact that pets cannot understand or talk back may even be a benefit as it means they are completely non-judgmental

Matt Cassels

The research adds to increasing evidence that household pets may have a major influence on child development, and could have a positive impact on children’s social skills and emotional well-being.

Pets are almost as common as siblings in western households, although there are relatively few studies on the importance of child-pet relationships.

“Anyone who has loved a childhood pet knows that we turn to them for companionship and disclosure, just like relationships between people,” says Matt Cassels, a Gates Cambridge Scholar at the Department of Psychiatry, who led the study. “We wanted to know how strong these relationships are with pets relative to other close family ties. Ultimately this may enable us to understand how animals contribute to healthy child development.”

This study, published in the Journal of Applied Developmental Psychology, was conducted in collaboration with the WALTHAM Centre for Pet Nutrition, part of Mars Petcare, and co-funded by the Economic and Social Research Council as part of a larger study led by Prof Claire Hughes at the University of Cambridge Centre for Family Research. Researchers surveyed 12-year-old children from 77 families with one or more pets of any type and more than one child at home. Children reported strong relationships with their pets relative to their siblings, with lower levels of conflict and greater satisfaction in owners of dogs than in owners of other kinds of pets.

“Even though pets may not fully understand or respond verbally, the level of disclosure to pets was no less than to siblings,” says Cassels. “The fact that pets cannot understand or talk back may even be a benefit as it means they are completely non-judgmental.

“While previous research has often found that boys report stronger relationships with their pets than girls do, we actually found the opposite. While boys and girls were equally satisfied with their pets, girls reported more disclosure, companionship, and conflict with their pet than did boys, perhaps indicating that girls may interact with their pets in more nuanced ways.”

“Evidence continues to grow showing that pets have positive benefits on human health and community cohesion,” says Dr Nancy Gee, Human-Animal Interaction Research Manager at WALTHAM and a co-author of the study. “The social support that adolescents receive from pets may well support psychological well-being later in life but there is still more to learn about the long term impact of pets on children’s development.”

Reference
Cassels, M et al. One of the family? Measuring early adolescents’ relationships with pets and siblings. Journal of Applied Developmental Psychology; 24 Jan 2017; DOI: 10.1016/j.appdev.2017.01.003

Adapted from a press release by WALTHAM Centre for Pet Nutrition.


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Cambridge Team Receives £5 Million To Help GPs Spot ‘Difficult-To-Diagnose’ Cancers

Cambridge team receives £5 million to help GPs spot ‘difficult-to-diagnose’ cancers

source: www.cam.ac.uk

Researchers in Cambridge are set to receive a £5m Cancer Research UK Catalyst Award to improve the early detection of cancers in GP surgeries. The CanTest team, led by Dr Fiona Walter from the University of Cambridge, will work with researchers at three UK sites and across the globe on a five-year project that will help GPs to detect cancers in a primary care setting, enabling patients to benefit from innovative approaches and new technologies, and reducing the burden of referrals.

We know that GPs sometimes have to wait weeks for results before they can make any decisions for their patients. We’re trying to reduce this time by assessing ways that GPs could carry out the tests by themselves, as long as it’s safe and sensible to do so

Fiona Walter

The research will prioritise ‘difficult-to-diagnose’ cancers, which are also associated with poorer survival outcomes, and will look at both existing and novel technologies.

Dr Walter, from the Department of Public Health and Primary Care, says: “We know that GPs sometimes have to wait weeks for results before they can make any decisions for their patients. We’re trying to reduce this time by assessing ways that GPs could carry out the tests by themselves, as long as it’s safe and sensible to do so. We are open to assessing many different tests, and we’re excited to hear from potential collaborators.”

The Award aims to boost progress aligned to Cancer Research UK’s strategic priorities by building new collaborations within and between institutions, also involving researchers based at the University of Exeter, UCL (University College London), the University of Leeds and a number of international institutions.

“This is a fantastic opportunity to transform cancer diagnosis and we are delighted that Cancer Research UK is investing so substantially in primary care cancer research,” adds Dr Walter. “This award will enable us to nurture a new generation of researchers from a variety of backgrounds to work in primary care cancer diagnostics, creating an educational ‘melting pot’ to rapidly expand the field internationally.”

The Catalyst Award supports capacity building and collaboration in population health with up to £5 million awarded to enable teams to deliver impact over and above what they could do alone.

Sir Harpal Kumar, Cancer Research UK’s chief executive, said: “This collaboration will help us discover new and more effective ways to diagnose cancer by applying different methods to GP surgeries, and finding out what really works for them on the job.

“By investing in future experts in this field, it will allow us to continue searching for the best way to diagnose cancer patients for many years to come. This has potential not only to save GPs’ and patients’ time, but also to reduce the anxiety patients feel when waiting for their results.”

Adapted from a press release by Cancer Research UK.


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Personality Traits Linked To Differences In Brain Structure

Personality traits linked to differences in brain structure

source: www.cam.ac.uk

Our personality may be shaped by how our brain works, but in fact the shape of our brain can itself provide surprising clues about how we behave – and our risk of developing mental health disorders – suggests a study published today.

Linking how brain structure is related to basic personality traits is a crucial step to improving our understanding of the link between the brain morphology and particular mood, cognitive, or behavioural disorders

Luca Passamonti

According to psychologists, the extraordinary variety of human personality can be broken down into the so-called ‘Big Five’ personality traits, namely neuroticism (how moody a person is), extraversion (how enthusiastic a person is), openness (how open-minded a person is), agreeableness (a measure of altruism), and conscientiousness (a measure of self-control).

In a study published today in the journal Social Cognitive and Affective Neuroscience, an international team of researchers from the UK, US, and Italy have analysed a brain imaging dataset from over 500 individuals that has been made publicly available by the Human Connectome Project, a major US initiative funded by the National Institutes of Health. In particular, the researchers looked at differences in the brain cortical anatomy (the structure of the outer layer of the brain) as indexed by three measures – the thickness, area, and amount of folding in the cortex – and how these measures related to the Big Five personality traits.

“Evolution has shaped our brain anatomy in a way that maximizes its area and folding at the expense of reduced thickness of the cortex,” explains Dr Luca Passamonti from the Department of Clinical Neurosciences at the University of Cambridge. “It’s like stretching and folding a rubber sheet – this increases the surface area, but at the same time the sheet itself becomes thinner. We refer to this as the ‘cortical stretching hypothesis’.”

“Cortical stretching is a key evolutionary mechanism that enabled human brains to expand rapidly while still fitting into our skulls, which grew at a slower rate than the brain,” adds Professor Antonio Terracciano from the Department of Geriatrics at the Florida State University. “Interestingly, this same process occurs as we develop and grow in the womb and throughout childhood, adolescence, and into adulthood: the thickness of the cortex tends to decrease while the area and folding increase.”

In addition, as we get older, neuroticism goes down – we become better at handling emotions. At the same time, conscientiousness and agreeableness go up – we become progressively more responsible and less antagonistic.

The researchers found that high levels of neuroticism, which may predispose people to develop neuropsychiatric disorders, were associated with increased thickness as well as reduced area and folding in some regions of the cortex such as the prefrontal-temporal cortices at the front of the brain.

In contrast, openness, which is a personality trait linked with curiosity, creativity and a preference for variety and novelty, was associated with the opposite pattern, reduced thickness and an increase in area and folding in some prefrontal cortices.

“Our work supports the notion that personality is, to some degree, associated with brain maturation, a developmental process that is strongly influenced by genetic factors,” says Dr Roberta Riccelli from Italy.

“Of course, we are continually shaped by our experiences and environment, but the fact that we see clear differences in brain structure which are linked with differences in personality traits suggests that there will almost certainly be an element of genetics involved,” says Professor Nicola Toschi from the University ‘Tor Vergata’ in Rome. “This is also in keeping with the notion that differences in personality traits can be detected early on during development, for example in toddlers or infants.”

The volunteers whose brains were imaged as part of the Human Connectome Project were all healthy individuals aged between 22 and 36 years with no history of neuro-psychiatric or other major medical problems. However, the relationship between differences in brain structure and personality traits in these people suggests that the differences may be even more pronounced in people who are more likely to experience neuro-psychiatric illnesses.

“Linking how brain structure is related to basic personality traits is a crucial step to improving our understanding of the link between the brain morphology and particular mood, cognitive, or behavioural disorders,” adds Dr Passamonti. “We also need to have a better understanding of the relation between brain structure and function in healthy people to figure out what is different in people with neuropsychiatric disorders.”

This is not the first time the researchers have found links between our brain structure and behaviour. A study published by the group last year found that the brains of teenagers with serious antisocial behaviour problems differ significantly in structure to those of their peers.

Reference
Riccelli, R et al. Surface-based morphometry reveals the neuroanatomical basis of the five-factor model. Social Cognitive and Affective Neuroscience; 25 Jan 2017; DOI: 10.1093/scan/nsw175


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Psychological ‘Vaccine’ Could Help Immunise Public Against ‘Fake News’ On Climate Change – Study

Psychological ‘vaccine’ could help immunise public against ‘fake news’ on climate change – study

 

source: www.cam.ac.uk

New research finds that misinformation on climate change can psychologically cancel out the influence of accurate statements. However, if legitimate facts are delivered with an “inoculation” – a warning dose of misinformation – some of the positive influence is preserved.

There will always be people completely resistant to change, but we tend to find there is room for most people to change their minds, even just a little

Sander van der Linden

In medicine, vaccinating against a virus involves exposing a body to a weakened version of the threat, enough to build a tolerance.

Social psychologists believe that a similar logic can be applied to help “inoculate” the public against misinformation, including the damaging influence of ‘fake news’ websites propagating myths about climate change.

A new study compared reactions to a well-known climate change fact with those to a popular misinformation campaign. When presented consecutively, the false material completely cancelled out the accurate statement in people’s minds – opinions ended up back where they started.

Researchers then added a small dose of misinformation to delivery of the climate change fact, by briefly introducing people to distortion tactics used by certain groups. This “inoculation” helped shift and hold opinions closer to the truth, despite the follow-up exposure to ‘fake news’.

The study on US attitudes found the inoculation technique shifted the climate change opinions of Republicans, Independents and Democrats alike.

Published in the journal Global Challenges, the study was conducted by researchers from the University of Cambridge in the UK and Yale and George Mason universities in the US. It is one of the first studies on ‘inoculation theory’ to try to replicate a ‘real world’ scenario of conflicting information on a highly politicised subject.

“Misinformation can be sticky, spreading and replicating like a virus,” says lead author Dr Sander van der Linden, a social psychologist from the University of Cambridge and Director of the Cambridge Social Decision-Making Lab.

“We wanted to see if we could find a ‘vaccine’ by pre-emptively exposing people to a small amount of the type of misinformation they might experience. A warning that helps preserve the facts.

“The idea is to provide a cognitive repertoire that helps build up resistance to misinformation, so the next time people come across it they are less susceptible.”

Fact vs. Falsehood

To find the most compelling climate change falsehood currently influencing public opinion, van der Linden and colleagues tested popular statements from corners of the internet on a nationally representative sample of US citizens, with each one rated for familiarity and persuasiveness.

The winner: the assertion that there is no consensus among scientists, apparently supported by the Oregon Global Warming Petition Project. This website claims to hold a petition signed by “over 31,000 American scientists” stating there is no evidence that human CO2 release will cause climate change.

The study also used the accurate statement that “97% of scientists agree on manmade climate change”. Prior work by van der Linden has shown this fact about scientific consensus is an effective ‘gateway’ for public acceptance of climate change.

In a disguised experiment, researchers tested the opposing statements on over 2,000 participants across the US spectrum of age, education, gender and politics using the online platform Amazon Mechanical Turk.

In order to gauge shifts in opinion, each participant was asked to estimate current levels of scientific agreement on climate change throughout the study.

Those shown only the fact about climate change consensus (in pie chart form) reported a large increase in perceived scientific agreement – an average of 20 percentage points. Those shown only misinformation (a screenshot of the Oregon petition website) dropped their belief in a scientific consensus by 9 percentage points.

Some participants were shown the accurate pie chart followed by the erroneous Oregon petition. The researchers were surprised to find the two neutralised each other (a tiny difference of 0.5 percentage points).

“It’s uncomfortable to think that misinformation is so potent in our society,” says van der Linden. “A lot of people’s attitudes toward climate change aren’t very firm. They are aware there is a debate going on, but aren’t necessarily sure what to believe. Conflicting messages can leave them feeling back at square one.”

Psychological ‘inoculation’

Alongside the consensus fact, two groups in the study were randomly given ‘vaccines’:

  • A general inoculation, consisting of a warning that “some politically-motivated groups use misleading tactics to try and convince the public that there is a lot of disagreement among scientists”.
  • A detailed inoculation that picks apart the Oregon petition specifically: for example, by highlighting that some of the signatories are fraudulent, such as Charles Darwin and members of the Spice Girls, and that fewer than 1% of signatories have backgrounds in climate science.

For those ‘inoculated’ with this extra data, the misinformation that followed did not cancel out the accurate message.

The general inoculation saw an average opinion shift of 6.5 percentage points towards acceptance of the climate science consensus, despite exposure to fake news.

When the detailed inoculation was added to the general, it was almost 13 percentage points – two-thirds of the effect seen when participants were just given the consensus fact.
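
As a back-of-envelope reading of those figures (a sketch of the arithmetic only, not part of the study’s analysis), the relative sizes of the reported shifts work out roughly as follows:

    # Average shifts in perceived scientific agreement, in percentage points,
    # as reported above.
    consensus_fact_only = 20.0    # accurate pie chart, no misinformation
    general_inoculation = 6.5     # general warning, despite follow-up misinformation
    detailed_inoculation = 13.0   # general + detailed inoculation, despite misinformation

    print(general_inoculation / consensus_fact_only)   # 0.325 -> roughly one third
    print(detailed_inoculation / consensus_fact_only)  # 0.65  -> roughly two thirds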

The research team point out that tobacco and fossil fuel companies have used psychological inoculation in the past to sow seeds of doubt, and to undermine scientific consensus in the public consciousness.

They say the latest study demonstrates that such techniques can be partially “reversed” to promote scientific consensus, and work in favour of the public good.

The researchers also analysed the results in terms of political parties. Before inoculation, the fake negated the factual for both Democrats and Independents. For Republicans, the fake actually overrode the facts by 9 percentage points.

However, following inoculation, the positive effects of the accurate information were preserved across all parties to match the average findings (around a third with just general inoculation; two-thirds with detailed).

“We found that inoculation messages were equally effective in shifting the opinions of Republicans, Independents and Democrats in a direction consistent with the conclusions of climate science,” says van der Linden.

“What’s striking is that, on average, we found no backfire effect to inoculation messages among groups predisposed to reject climate science; they didn’t seem to retreat into conspiracy theories.

“There will always be people completely resistant to change, but we tend to find there is room for most people to change their minds, even just a little.”


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Solar Storms Could Cost USA Tens of Billions of Dollars

Solar storms could cost USA tens of billions of dollars

source: www.cam.ac.uk

The daily economic cost to the USA from solar storm-induced electricity blackouts could be in the tens of billions of dollars, with more than half the loss from indirect costs outside the blackout zone, according to a new study led by University of Cambridge researchers.

Previous studies have focused on direct economic costs within the blackout zone, failing to take account of indirect domestic and international supply chain loss from extreme space weather.

According to the study, published in the journal Space Weather, the direct economic cost incurred from disruption to electricity represents, on average, just under half of the total potential macroeconomic cost.

The paper was co-authored by researchers from the Cambridge Centre for Risk Studies at University of Cambridge Judge Business School, British Antarctic Survey, the British Geological Survey and the University of Cape Town.

Under the study’s most extreme blackout scenario, affecting two-thirds of the US population, the daily domestic economic loss could total $41.5 billion plus an additional $7 billion loss through the international supply chain.

Electrical engineering experts are divided on the possible severity of blackouts caused by “Coronal Mass Ejections,” or magnetic solar fields ejected during solar flares and other eruptions. Some believe that outages would last only hours or a few days because electrical collapse of the transmission system would protect electricity generating facilities, while others fear blackouts could last weeks or months because those transmission networks could in fact be knocked out and need replacement.

Extreme space weather events occur often, but only sometimes affect Earth. The best-known geomagnetic storm affected Quebec in 1989, sparking the electrical collapse of the Hydro-Quebec power grid and causing a widespread blackout for about nine hours.

There was a very severe solar storm in 1859 known as the “Carrington event” (named after the British astronomer Richard Carrington). A widely cited 2012 study by Pete Riley of Predictive Sciences Inc. said that the probability of another Carrington event occurring within the next decade is around 12 per cent; a 2013 report by insurer Lloyd’s, produced in collaboration with Atmospheric and Environmental Research, said that while the probability of an extreme solar storm is “relatively low at any given time, it is almost inevitable that one will occur eventually.”

“We felt it was important to look at how extreme space weather may affect domestic US production in various economic sectors, including manufacturing, government and finance, as well as the potential economic loss in other nations owing to supply chain linkages,” says study co-author Dr Edward Oughton of the Cambridge Centre for Risk Studies.

“It was surprising that there had been a lack of transparent research into these direct and indirect costs, given the uncertainty surrounding the vulnerability of electrical infrastructure to solar incidents.”

The study looks at three geographical scenarios for blackouts caused by extreme space weather, depending on the latitudes affected by different types of incidents.

If only extreme northern states are affected, with 8 per cent of the US population, the economic loss per day could reach $6.2 billion supplemented by an international supply chain loss of $0.8 billion. A scenario affecting 23 per cent of the population could have a daily cost of $16.5 billion plus $2.2 billion internationally, while a scenario affecting 44 per cent of the population could have a daily cost of $37.7 billion in the US plus $4.8 billion globally.

Manufacturing is the US economic sector most affected by those solar-induced blackouts, followed by government, finance and insurance, and property. Outside of the US, China would be most affected by the indirect cost of such US blackouts, followed by Canada and Mexico as these countries provide a greater proportion of raw materials, and intermediate goods and services, used in production by US firms.

Reference
Oughton, EJ et al. Quantifying the daily economic impact of extreme space weather due to failure in electricity transmission infrastructure. Space Weather; 18 Jan 2017; DOI: 10.1002/2016SW001491

Adapted from a press release by the Cambridge Judge Business School.


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Graphene’s Sleeping Superconductivity Awakens

Graphene’s sleeping superconductivity awakens

source: www.cam.ac.uk

Since graphene’s discovery in 2004, scientists have believed that it may have the innate ability to superconduct. Now Cambridge researchers have found a way to activate that previously dormant potential.

It has long been postulated that graphene should undergo a superconducting transition, but can’t. The idea of this experiment was, if we couple graphene to a superconductor, can we switch that intrinsic superconductivity on?

Jason Robinson

Researchers have found a way to trigger the innate, but previously hidden, ability of graphene to act as a superconductor – meaning that it can be made to carry an electrical current with zero resistance.

The finding, reported in Nature Communications, further enhances the potential of graphene, which is already widely seen as a material that could revolutionise industries such as healthcare and electronics. Graphene is a two-dimensional sheet of carbon atoms and combines several remarkable properties; for example, it is very strong, but also light and flexible, and highly conductive.

Since graphene’s discovery in 2004, scientists have speculated that it may also have the capacity to be a superconductor. Until now, superconductivity in graphene has only been achieved by doping it with, or by placing it on, a superconducting material – a process which can compromise some of its other properties.

But in the new study, researchers at the University of Cambridge managed to activate the dormant potential for graphene to superconduct in its own right. This was achieved by coupling it with a material called praseodymium cerium copper oxide (PCCO).

Superconductors are already used in numerous applications. Because they generate large magnetic fields they are an essential component in MRI scanners and levitating trains. They could also be used to make energy-efficient power lines and devices capable of storing energy for millions of years.

Superconducting graphene opens up yet more possibilities. The researchers suggest, for example, that graphene could now be used to create new types of superconducting quantum devices for high-speed computing. Intriguingly, it might also be used to prove the existence of a mysterious form of superconductivity known as “p-wave” superconductivity, which academics have been struggling to verify for more than 20 years.

The research was led by Dr Angelo Di Bernardo and Dr Jason Robinson, Fellows at St John’s College, University of Cambridge, alongside collaborators Professor Andrea Ferrari, from the Cambridge Graphene Centre; Professor Oded Millo, from the Hebrew University of Jerusalem, and Professor Jacob Linder, at the Norwegian University of Science and Technology in Trondheim.

“It has long been postulated that, under the right conditions, graphene should undergo a superconducting transition, but can’t,” Robinson said. “The idea of this experiment was, if we couple graphene to a superconductor, can we switch that intrinsic superconductivity on? The question then becomes how do you know that the superconductivity you are seeing is coming from within the graphene itself, and not the underlying superconductor?”

Similar approaches have been taken in previous studies using metallic-based superconductors, but with limited success. “Placing graphene on a metal can dramatically alter the properties so it is technically no longer behaving as we would expect,” Di Bernardo said. “What you see is not graphene’s intrinsic superconductivity, but simply that of the underlying superconductor being passed on.”

PCCO is an oxide from a wider class of superconducting materials called “cuprates”. It also has well-understood electronic properties, and using a technique called scanning tunnelling microscopy, the researchers were able to distinguish the superconductivity in PCCO from the superconductivity observed in graphene.

Superconductivity is characterised by the way the electrons interact: within a superconductor electrons form pairs, and the spin alignment between the electrons of a pair may be different depending on the type – or “symmetry” – of superconductivity involved. In PCCO, for example, the pairs’ spin state is misaligned (antiparallel), in what is known as a “d-wave state”.

By contrast, when graphene was coupled to superconducting PCCO in the Cambridge-led  experiment, the results suggested that the electron pairs within graphene were in a p-wave state. “What we saw in the graphene was, in other words, a very different type of superconductivity than in PCCO,” Robinson said. “This was a really important step because it meant that we knew the superconductivity was not coming from outside it and that the PCCO was therefore only required to unleash the intrinsic superconductivity of graphene.”

It remains unclear what type of superconductivity the team activated, but their results strongly indicate that it is the elusive “p-wave” form. If so, the study could transform the ongoing debate about whether this mysterious type of superconductivity exists, and – if so – what exactly it is.

In 1994, researchers in Japan fabricated a triplet superconductor that may have a p-wave symmetry using a material called strontium ruthenate (SRO). The p-wave symmetry of SRO has never been fully verified, partly because SRO is a bulky crystal, which makes it challenging to fabricate into the type of devices necessary to test theoretical predictions.

“If p-wave superconductivity is indeed being created in graphene, graphene could be used as a scaffold for the creation and exploration of a whole new spectrum of superconducting devices for fundamental and applied research areas,” Robinson said. “Such experiments would necessarily lead to new science through a better understanding of p-wave superconductivity, and how it behaves in different devices and settings.”

The study also has further implications. For example, it suggests that graphene could be used to make a transistor-like device in a superconducting circuit, and that its superconductivity could be incorporated into molecular electronics. “In principle, given the variety of chemical molecules that can bind to graphene’s surface, this research can result in the development of molecular electronics devices with novel functionalities based on superconducting graphene,” Di Bernardo added.

The study, “p-wave triggered superconductivity in single layer graphene on an electron-doped oxide superconductor”, is published in Nature Communications (DOI: 10.1038/ncomms14024).


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Raspberry Pi Makers Release Own-Brand OS

Raspberry Pi makers release own-brand OS

source: www.bbc.co.uk

Inset image: The first Raspberry Pi was released in 2012 and more than 10 million have now been sold. Credit: AFP.

The makers of the Raspberry Pi computer have created a version of its graphical front end that can run on ordinary desktop computers.

The Pixel desktop has been re-worked so it runs on PCs and Apple Mac machines, said the Foundation.

People who use it on a Raspberry Pi and other machines will now get the same familiar software across both.

The Pi Foundation said the release also aided its plan to produce the “best” desktop computing experience.

Most popular

Raspberry Pi co-creator Eben Upton said the software should help schoolchildren who use the credit-card sized Pi in class or for their own projects but have to continue their work on PCs or Macs.

The Pi edition of Pixel and the version translated for bigger machines use “exactly the same productivity software and programming tools, in exactly the same desktop environment”, he wrote.

“There is no learning curve, and no need to tweak… schoolwork to run on two subtly different operating systems,” he said.

Inset image: UK astronaut Tim Peake took two Raspberry Pis to the International Space Station. Credit: ESA.

In addition, he said, producing such a version of Pixel kept the Raspberry Pi foundation “honest” as it would help the organisation’s coders work out what bits of the user interface needed work.

Mr Upton said that because the core software underlying Pixel was based on a relatively old computer architecture, it should run on “vintage” machines.

He warned that the software was still “experimental”, so it might have bugs or other “minor issues” that mean it does not run well on some machines.

Pixel was first released in September this year and overhauled the main graphical interface owners see and use when working with their Pi. It is based on a version of the open source Linux software known as Debian.

The desktop version lacks two programs – Minecraft and Mathematica – because the Pi organisation has not licensed those applications on any machines other than its own.

In April last year, the Raspberry Pi officially became the most popular British computer ever made. More than 10 million have now been sold.

The computer was first released in 2012 and is widely used as an educational tool for programming.

IMF Lending Conditions Curb Healthcare Investment In West Africa, Study Finds

IMF lending conditions curb healthcare investment in West Africa, study finds

source: www.cam.ac.uk

Research shows budget reduction targets and public sector caps, insisted on by the IMF as loan conditions, result in reduced health spending and medical ‘brain drain’ in developing West African nations.

We show that the IMF has undermined health systems – a legacy of neglect that affects West Africa’s progress towards achieving universal health coverage

Thomas Stubbs

A new study suggests that lending conditions imposed by the International Monetary Fund in West Africa squeeze “fiscal space” in nations such as Sierra Leone – preventing government investment in health systems and, in some cases, contributing to an exodus of medical talent from countries that need it most.

Researchers from the Universities of Cambridge, Oxford and the London School of Hygiene & Tropical Medicine analysed the IMF’s own primary documents to evaluate the relationship between IMF-mandated policy reforms – the conditions of loans – and government health spending in West African countries.

The team collected archival material, including IMF staff reports and government policy memoranda, to identify policy reforms in loan agreements between 1995 and 2014, extracting 8,344 reforms across 16 countries.

They found that for every additional IMF condition that is ‘binding’ – i.e. failure to implement means automatic loan suspension – government health expenditure per capita in the region is reduced by around 0.25%.

A typical IMF programme contains 25 such reforms per year, amounting to a 6.2% reduction in health spending for the average West African country annually.
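
As a back-of-envelope check of how the per-condition estimate scales up to that annual figure (a rough sketch only; the study’s econometric model is more involved), the reported numbers combine approximately as follows:

    # Reported estimates: each binding condition is associated with a ~0.25%
    # reduction in per-capita government health spending, and a typical
    # programme contains ~25 such conditions per year.
    per_condition_reduction = 0.0025
    conditions_per_year = 25

    additive = conditions_per_year * per_condition_reduction               # 6.25% if the effects simply add
    compounded = 1 - (1 - per_condition_reduction) ** conditions_per_year  # ~6.1% if they compound

    print(f"{additive:.1%}, {compounded:.1%}")  # both close to the ~6.2% reduction reported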

The researchers say that this is often the result of a policy focus on budget deficit reduction over healthcare, or the funnelling of finance back into international reserves – all macroeconomic targets set by IMF conditions.

The authors of the new study, published in the journal Social Science and Medicine, say their findings show that the IMF “impedes progress toward the attainment of universal health coverage”, and that – under direct IMF tutelage – West African countries underfunded their health systems.

“The IMF proclaims it strengthens health systems as part of its lending programs,” said lead author Thomas Stubbs, from Cambridge’s Department of Sociology, who conducted the study with Prof Lawrence King. “Yet, inappropriate policy design in IMF programmes has impeded the development of public health systems in the region over the past two decades.”

A growing number of IMF loans to West Africa now include social spending targets to ensure that spending on health, education and other priorities is protected. These targets are not binding, however, and the study found that fewer than half are actually met.

“Stringent IMF-mandated austerity measures explain part of this trend,” said Stubbs. “As countries engage in fiscal belt-tightening to meet the IMF’s macroeconomic targets, few funds are left for maintaining health spending at adequate levels.”

The study also shows that the 16 West African countries experienced a combined total of 211 years with IMF conditions between 1995 and 2014. Some 45% of these included conditions stipulating layoffs or caps on public-sector recruitment and limits to the wage bill.

The researchers uncovered correspondence from national governments to the IMF arguing that imposed conditions are hindering recruitment of healthcare staff, something they found was often borne out by World Health Organisation (WHO) data. For example:

  • In 2004, Cabo Verde told the IMF that meeting their fiscal targets would interrupt recruitment of new doctors. The country later reported to the WHO a 48% reduction in physician numbers between 2004 and 2006.
  • In 2005, a series of IMF conditions aimed to reduce Ghana’s public sector wage bill. The Ghanaian Minister of Finance wrote to the IMF that “at the current level of remuneration, the civil service is losing highly productive employees, particularly in the health sector”. Wage ceilings remained until late 2006, and the number of physicians in Ghana halved.

“IMF-supported reforms have stopped many African countries hiring, retaining or paying healthcare staff properly,” said co-author Alexander Kentikelenis, based at the University of Oxford.

“Macroeconomic targets set by the IMF – for example, on budget deficit reduction – crowd out health concerns, so governments do not adequately invest in health.”

The IMF’s extended presence in West Africa – on average 13 out of 20 years per country – has caused considerable controversy among public health practitioners, say the researchers.

“While critics stress inappropriate or dogmatic policy design that undermines health system development, the IMF has argued its reforms bolster health policy,” said Stubbs.

“We show that the IMF has undermined health systems – a legacy of neglect that affects West Africa’s progress towards achieving universal health coverage, a key objective of the United Nations’ Sustainable Development Goals.”


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

New Report On Macro-Economic Impact of Brexit Questions Treasury Forecasts

New report on macro-economic impact of Brexit questions Treasury forecasts

source: www.cam.ac.uk

Economists at the Centre for Business Research (CBR) have challenged the assumptions of the Treasury in their new forecast for the UK economy and the impact of Brexit in 2017.

There is a public demand for independent, non-partisan research, conducted outside government and the political arena. There is an important role for University-based research; this should feed into the process of deliberation as Brexit unfolds

Simon Deakin

The economists have also been working with lawyers at the CBR to explore the possible impact of Brexit. They warn that the UK is in danger of remaining a low wage, low skill country unless it can create the conditions for a reorientation of its economic model post-Brexit.

In a new podcast for the CBR, based at the Cambridge Judge Business School, Graham Gudgin, one of the authors of the new report: The Macro-Economic Impact of Brexit: Using the CBR Macro-Economic Model of the UK Economy (UKMOD), and Simon Deakin, Director of the CBR and Professor of Law at the University of Cambridge, discuss how the UK economy is likely to perform in 2017 and what would be the best model for leaving the EU.

Gudgin says that, according to the Treasury, we should have been in recession by now, but we are not. “Things are maybe a bit delayed but the whole succession of investment announcements we have had from Nissan, Microsoft and others suggests that companies are taking a much more sanguine view of this than the Treasury and others have suggested,” he says.

Possible options: from the EEA to bespoke trade deals

Deakin explains the four possible options the UK government could pursue for leaving the EU: re-joining (or remaining in) the EEA (European Economic Area); becoming a member of the EU’s customs union; undertaking a series of bespoke trade deals, such as Switzerland has; or, if none of the above apply, defaulting to the rules of the World Trade Organization. The first two of these would mean accepting the free movement of persons, which the current policy of the British government appears to rule out.

Gudgin suggests that, since the policy of migration control is likely to be maintained, the UK will try to negotiate a new trade deal with the EU, perhaps along the lines of the recent EU-Canada agreement. Some argue that this could take a decade or more to achieve, but Gudgin takes the view that since we are starting from an existing free-trade situation, the task is much easier.

No quick agreement is likely, however, and whether the EU and the UK can negotiate transitional arrangements to bridge that gap remains to be seen. Any new trade deal will also have to be conducted within the framework of WTO rules, says Gudgin, which will add to the complexity of the negotiations.

Deakin explains: “Nearly every European country is either in the single market or in the customs union. For example, Norway, via the EEA, is in the single market but not the customs union; Turkey is in the customs union but not the single market.  There are a number of other options. Switzerland is not part of the European Economic Area, but has a number of bilateral trade deals with the EU.

“These are conditional on Switzerland allowing free movement of labour (which a recent Swiss referendum vetoed) and capital. Countries in the single market, including those in the EEA, must conform to EU rules and regulations regarding product standards, labour laws and environmental protection, among other things.

“Customs union membership implies internal free trade and a single external tariff, but countries outside the EU which are in that position, such as Turkey, cannot make their own trade deals with third countries. If we went for that option post-Brexit, we would not be bound by all the rules of the single market but we couldn’t do our own trade deals with third countries.

The WTO also have rules on how migrant workers may be treated in host states which are not that dissimilar to those operating in the EU’s single market

Simon Deakin

“If we were in the EEA we couldn’t avoid rules on free movement of labour or capital. You either accept the four freedoms, the movement of goods, services, people and capital over borders, or you don’t; you can’t cherry pick. The UK could try for a Swiss style option where you try to have free movement but then modify it somewhat, but the Swiss have had to sign up to most aspects of free movement in order to get access to the single market. To get around the rules of free movement of labour and capital it is highly likely the UK would have to be outside the EEA.

“We could still sign up to the customs union; Turkey isn’t subject to the rules on free movement of labour, for example, nor is the EU required to accept free movement of persons from Turkey into the EU. But then we wouldn’t have the freedom to do trade deals with third countries, which the UK has said it wants to have; that is why the International Trade Department was brought back.

“WTO rules do not require member states to accept free movement of labour; they do, however, contain some rules on issues like state aids, to prevent distortions of international trade. The WTO also have rules on how migrant workers may be treated in host states which are not that dissimilar to those operating in the EU’s single market, and are highly contentious for the same reasons. WTO rules on these issues are generally not as strict as EU laws and do not form part of UK domestic law. International law obligations cannot be enforced in the same way as EU laws can be. However, the WTO option is not a blank slate for the UK.”

Trade deal within two years – unlikely

Deakin goes on to say that as things stand there is uncertainty over what Brexit might mean, even if it is possible to identify some of the main features of each of the principal options: “Lawyers can say what the general framework is for each of these four options, EEA, customs union, Swiss option, WTO, but until we know more about how the government will wish to conduct its negotiations with the EU and about the EU’s position going forward it is hard to make predictions. There are many issues we don’t have a clear answer to.

“We can sketch out broadly what happens for each of these main options but I think there is a case for more research to be done. It is most unlikely that there will be a trade deal negotiated within two years of triggering Article 50, and as the process of negotiation and deliberation unfolds new issues will arise. These may crop up at sector level, particular industries may have issues that need to be worked through, and individual companies may raise points about their position and if they receive guarantees from government there will be issues of state aids to consider under both EU and WTO law. At the moment we just don’t have a good set of answers to these questions.”

Deakin says a transitional agreement with the EU would need to be a one-off bespoke arrangement as there is no provision for such an agreement within EU treaties: “We are bound by EU law until we leave, we are bound by international law to maintain the treaties that we have signed up to until we withdraw from them. Until the European Communities Act is repealed we must apply EU law domestically and even after the so called Great Repeal Act, which the government has promised to bring in, is implemented, many of the same provisions will be replicated within UK law.

Transitional agreement?

“There is talk of a so-called transitional agreement and that could involve staying in the EEA, while things are worked out, but there is no obligation on the side of the EU to offer us a transitional deal.  This would have to be a bespoke arrangement as it is not provided for at the moment under the EU treaties. It remains to be seen if that sort of soft landing is possible, let’s see what is put on the table after negotiations between the UK and the EU begin. Whatever happens, we need to understand the institutional impact of Brexit in order to get a better understanding of what its economic effects will be.

“We do need independent research to be carried out on this question because so far most of the research that has been done on this has been by one or other side of the Brexit argument. The government has its own researchers in the civil service and of course this is objective, high quality research; the OBR is doing independent economic forecasting. However, there is a public demand for independent, non-partisan research, conducted outside government and the political arena. Thus there is an important role for University-based research; this should feed into the process of deliberation as Brexit unfolds”.

We have looked very carefully at what the Treasury has said about this and we find its work very flawed and very partisan

Graham Gudgin

Gudgin agrees with Deakin that better research and economic forecasting models are needed. He thinks that in reality the only option for the UK to leave the EU, other than the WTO fall-back, is under the terms of the so-called Canadian model.

“There are probably only two practical options. One is a free trade agreement along the lines of the one Canada has just signed, or else no agreement on trade in which case you fall back on WTO rules. The impact of both of those is pretty uncertain. We have looked very carefully at what the Treasury has said about this and we find its work very flawed and very partisan. It is not objective. I agree with Deakin that we need some more objective economic work on this, the whole debate has been coloured by a lot of hyperbolic discussion.

The Treasury: four quarters of recession?

“The Treasury said there would be four quarters of recession, we have had six months since the Brexit vote, we should have been in recession by now, but we are not. Things are maybe a bit delayed but the whole succession of investment announcements we have had from Nissan, Microsoft and others suggests that companies are taking a much more sanguine view of this than the Treasury and others have suggested.

“We have looked at the Nissan deal in terms of what degree of currency depreciation you would need to offset the 10 per cent tariff that motor manufacturers could face under WTO rules, and the answer to us is that it looks like a 15 per cent depreciation of sterling would offset a 10 per cent tariff. We have already had a 12 per cent depreciation so we are pretty well there. This may have been what the government was relying upon: it is the currency depreciation that bridges that gap.”
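
As a toy illustration of the mechanism Gudgin describes (an illustrative simplification, not the CBR’s UKMOD calculation), the foreign-currency price of a UK-built product after a sterling depreciation and a WTO tariff can be approximated as below. The imported-input share is a purely illustrative assumption, included because imported components do not become cheaper for overseas buyers when sterling falls:

    # Toy sketch: foreign-currency price of a UK-built product after a sterling
    # depreciation and an import tariff, relative to the pre-Brexit baseline.
    # 'imported_share' (illustrative) is the fraction of costs priced in foreign
    # currency, which a weaker pound does not reduce.
    def relative_foreign_price(depreciation, tariff, imported_share=0.4):
        domestic_share = 1 - imported_share
        return (domestic_share * (1 - depreciation) + imported_share) * (1 + tariff)

    print(round(relative_foreign_price(0.15, 0.10), 3))  # ~1.00: a 15% fall roughly offsets a 10% tariff
    print(round(relative_foreign_price(0.12, 0.10), 3))  # ~1.02: a 12% fall gets most of the way there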

Gudgin says that the EU has not been very good at agreeing free trade deals with third countries: “Theresa May has said very clearly that there will be control over migration and she rightly recognises that was the key point in the referendum. The EEA and Swiss bilateral treaties all depend on free movement of labour. It shows just how difficult even the Swiss approach is.

“The Canadian model is a free trade agreement which any country can have with the EU, but historically the EU has not been good at having free trade agreements with others. It doesn’t have a free trade agreement with China or the US, and some people such as the Economists For Brexit see the EU as being a highly protectionist organisation. If the EU has a free trade agreement with Canada, good heavens, they surely can have one with the UK.”

Gudgin explains these predictions in the podcast:

  • “2017 won’t be a great year but growth of GDP will be between 1.0 and 1.5 per cent rather than the 2 per cent it would have been without Brexit. It could even be 2 per cent but we don’t yet really know much about company investment intentions. GDP growth is slowing but will not be too bad.
  • “The sterling depreciation of 10 to 12 per cent will mean inflation will rise to about 3 per cent by the end of 2017. It will be higher than it has been for some years. The big question is will inflation get out of hand and we don’t think it will. Remember most countries have been trying to increase their inflation up to 2 per cent to get their exchange rates down. The UK has done it in one bound.
  • “The UKMOD equations tell us wages will start to rise as prices rise. We are pretty close to full employment, so workers have bargaining power. The Bank of England published its forecast for wages recently and we agree wages will rise to something like 3 per cent by the end of 2017.”

The above text was originally posted as a blog on the Judge Business School website. 


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Cambridge Study Named As People’s Choice For Science Magazine’s ‘Breakthrough of the Year 2016’

Cambridge study named as People’s Choice for Science magazine’s ‘Breakthrough of the Year 2016’

source: www.cam.ac.uk

Cambridge research that will enable scientists to grow and study embryos in the lab for almost two weeks has been named as the People’s Choice for Science magazine’s ‘Breakthrough of the Year 2016’

It’s a natural human instinct to be curious about where we come from, but until now, technical hurdles have meant there’s been a huge gap in our understanding of how embryos develop

Magdalena Zernicka-Goetz

The work, led by Professor Magdalena Zernicka-Goetz from the Department of Physiology, Development and Neuroscience at the University of Cambridge, was the focus of parallel publications earlier this year in the journals Nature Cell Biology and Nature.

Professor Zernicka-Goetz and colleagues developed a new technique that allows embryos to develop in vitro, in the absence of maternal tissue, beyond the implantation stage (when the embryo would normally implant into the womb). This will allow researchers to analyse for the first time key stages of human embryo development up to 13 days after fertilisation. The technique could open up new avenues of research aimed at helping improve the chances of success of IVF.

“It’s a wonderful honour to have been given such public recognition for our work,” says Professor Zernicka-Goetz, whose work was funded by Wellcome. “It’s a natural human instinct to be curious about where we come from, but until now, technical hurdles have meant there’s been a huge gap in our understanding of how embryos develop. We hope that our technique will crack open this ‘black box’ and allow us to learn more about our development.”

Dr Marta Shahbazi, one of the co-first authors of the Nature Cell Biology paper, also from Cambridge, adds: “In the same year where scientists have found evidence of gravitational waves, it’s amazing that the public has chosen our work as the most important scientific breakthrough. While our study will help satisfy our scientific curiosity, it is likely to help us better understand what happens in miscarriage and why the success rates for IVF are so low.”

The work builds on research pioneered by Professor Sir Robert Edwards, for which he was awarded the Nobel Prize in physiology or medicine in 2010. Professor Edwards developed the technique known as in vitro fertilisation (IVF), demonstrating that it was possible to fertilise an egg and culture it in the laboratory for the first six days of development. His work led to the first ever ‘test tube baby’, Louise Brown.

The award has been welcomed by Dr Jim Smith, Director of Science at Wellcome: “I’m really pleased to see Magda’s fantastic work recognised by Science, and we send our warmest congratulations to her and her team. In almost doubling the time we can culture human embryos in the lab, she has created completely new opportunities for developmental biologists to understand how we develop. It’s a great achievement, and Wellcome is proud to have supported her ground-breaking work.”

Science – Breakthrough of the Year 2016


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

‘Glue’ That Makes Plant Cell Walls Strong Could Hold The Key To Wooden Skyscrapers

‘Glue’ that makes plant cell walls strong could hold the key to wooden skyscrapers

source: www.cam.ac.uk

Molecules 10,000 times narrower than the width of a human hair could hold the key to making possible wooden skyscrapers and more energy-efficient paper production, according to research published today in the journal Nature Communications. The study, led by a father and son team at the Universities of Warwick and Cambridge, solves a long-standing mystery of how key sugars in cells bind to form strong, indigestible materials.

We knew the answer must be elegant and simple. And in fact, it was

Paul Dupree

The two most common large molecules – or ‘polymers’ – found on Earth are cellulose and xylan, both of which are found in the cell walls of materials such as wood and straw. They play a key role in determining the strength of materials and how easily they can be digested.

For some time, scientists have known that these two polymers must somehow stick together to allow the formation of strong plant walls, but how this occurs has, until now, remained a mystery: xylan is a long, winding polymer with so-called ‘decorations’ of other sugars and molecules attached, so how could this adhere to the thick, rod-like cellulose molecules?

“We knew the answer must be elegant and simple,” explains Professor Paul Dupree from the Department of Biochemistry at the University of Cambridge, who led the research. “And in fact, it was. What we found was that cellulose induces xylan to untwist itself and straighten out, allowing it to attach itself to the cellulose molecule. It then acts as a kind of ‘glue’ that can protect cellulose or bind the molecules together, making very strong structures.”

The finding was made possible due to an unexpected discovery several years ago in Arabidopsis, a small flowering plant related to cabbage and mustard. Professor Dupree and colleagues showed that the decorations on xylan can only occur on alternate sugar molecules within the polymer – in effect meaning that the decorations only appear on one side of xylan. This led the team of researchers to survey other plants in the Cambridge University Botanic Garden and discover that the phenomenon appears to occur in all plants, meaning it must have evolved in ancient times, and must be important.

To explore this in more detail, they turned to an imaging technique known as solid state nuclear magnetic resonance (ssNMR), which is based on the same physics as hospital MRI scanners, but can reveal structure at the nanoscale. However, while ssNMR can image carbon, it requires a particular heavy isotope of carbon, carbon-13. This meant that the team had to grow their plants in an atmosphere enriched with a special form of carbon dioxide – carbon-13 dioxide.

Professor Ray Dupree – Paul Dupree’s father, and a co-author on the paper – supervised the work at the University of Warwick’s ssNMR laboratory. “By studying these molecules, which are over 10,000 times narrower than the width of a human hair, we could see for the first time how cellulose and xylan slot together and why this makes for such strong cell walls.”

Understanding how cellulose and xylan fit together could have a dramatic effect on industries as diverse as biofuels, paper production and agriculture, according to Paul Dupree.

“One of the biggest barriers to ‘digesting’ plants – whether that’s for use as biofuels or as animal feed, for example – has been breaking down the tough cellular walls,” he says. “Take paper production – enormous amounts of energy are required for this process. A better understanding of the relationship between cellulose and xylan could help us vastly reduce the amount of energy required for such processes.”

But just as this could improve how easily materials can be broken down, the discovery may also help researchers create stronger materials, he says. There are already plans to build houses in the UK more sustainably using wood, and Paul Dupree is involved in the Centre for Natural Material Innovation at the University of Cambridge, which is looking at whether buildings as tall as skyscrapers could be built using modified wood.

The research was funded by the Biotechnology and Biological Sciences Research Council (BBSRC).

Reference
Simmons, TJ et al. Folding of xylan onto cellulose fibrils in plant cell walls revealed by solid-state NMR. Nature Communications; Date; DOI: 10.1038/ncomms13902


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Patients Show Considerable Improvements After Treatment For Newly-Defined Movement Disorder

Patients show considerable improvements after treatment for newly-defined movement disorder

source: www.cam.ac.uk

DNA sequencing has defined a new genetic disorder that affects movement, enabling patients with dystonia — a disabling condition that affects voluntary movement — to be targeted for treatment that brings remarkable improvements, including restoring independent walking.

We can [now] aim for a more ‘precision medicine approach’ to target treatment with deep brain stimulation to those likely to benefit: most importantly, we would anticipate improvement in many of those treated

Lucy Raymond

A team of researchers from UCL Great Ormond Street Institute of Child Health, University of Cambridge and the NIHR Rare Disease Bioresource have identified mutations in a gene, called KMT2B, in 28 patients with dystonia. In most cases, the patients — many of whom were young children who were thought to have a diagnosis of cerebral palsy — were unable to walk.

Remarkably, for some patients, treatment with deep brain stimulation, in which electrical impulses are delivered to a specific brain region involved in movement, either restored or significantly improved independent walking and improved hand and arm movement. In one patient, improvements have been sustained over six years.

Given these observations, the team now suggest that testing for mutations in the gene should form part of standard testing for patients with dystonia, as KMT2B is emerging as one of the commonest genetic causes of childhood-onset dystonia.

The research is published in Nature Genetics.

Dystonia is one of the commonest movement disorders and is thought to affect 70,000 people in the UK alone. It can cause a wide range of disabling symptoms, including painful muscle spasms and abnormal postures, and can affect walking and speech.

Through research testing of patients, the team discovered a region of chromosome 19 that was deleted from the genome of some patients with childhood-onset dystonia. Together with the NIHR Rare Disease Bioresource and international collaborators, the team then identified abnormal changes in one gene, called KMT2B, in the genomes of a further 18 patients, each of whom carried a mutation in their DNA.

“Through DNA sequencing, we have identified a new genetic movement disorder that can be treated with deep brain stimulation. This can dramatically improve the lives of children with the condition and enable them to have a wider range of movement with long-lasting effects,” says Dr Manju Kurian, paediatric neurologist at Great Ormond Street Hospital and Wellcome Trust-funded researcher at UCL Great Ormond Street Institute of Child Health.

“Our results, though in a relatively small group of patients, show the power of genomic research not only to identify new diseases, but also to reveal possible approaches that will allow other patients to benefit.”

The KMT2B protein is thought to alter the activity of other genes. The team believes that the mutations impair the ability of the KMT2B protein to carry out its normal, crucial role in controlling the expression of genes involved in voluntary movement.

A number of patients were previously thought to have cerebral palsy prior to confirmation of their genetic diagnosis. Such uncertainty could be addressed by looking for KMT2B mutations as part of a diagnostic approach.

Although affected patients have been found to have a mutation in their DNA, this severe condition is rarely inherited from either parent but usually occurs for the first time in the affected child.

“Most patients show a progressive disease course with worsening dystonia over time,” continues Dr Kurian. “Many patients did not show any response to the usual medications that we use for dystonia, so we knew we would have to consider other strategies. We know, from our experience with other patients with dystonia, that deep brain stimulation might improve our patients’ symptoms, so we were keen to see what response patients would have to this type of treatment.”

“Remarkably, nearly all patients who had deep brain stimulation showed considerable improvements. One patient was able to walk independently within two weeks; in five patients, the improvement has lasted for more than three years. It is an astounding result.”

Given the dramatic effects seen in their patients with this newly defined genetic condition, the team propose that referral for assessment of deep brain stimulation should be considered for all patients with a mutation in KMT2B. In the future, the team hopes that, by diagnosing additional patients, the full spectrum of this new condition will be more apparent and patients and their families might see real benefit.

“It is only through the amazing generosity and efforts of patients and their families that we can begin to search for better answers and treatments: we admire their contribution,” says Professor Lucy Raymond, Professor of Medical Genetics and Neurodevelopment at the University of Cambridge. “Through participating in our research, they have helped us to identify patients with KMT2B-related dystonia, meaning we can aim for a more ‘precision medicine approach’ to target treatment with deep brain stimulation to those likely to benefit: most importantly, we would anticipate improvement in many of those treated.

“The lesson from our study is simple and clear: because confirming this diagnosis has implications for therapy, we should test all patients with suspected genetic dystonia for mutations in KMT2B.”

Reference
Meyer E, Carss KJ, Rankin J et al. Mutations in the Histone Methyltransferase Gene, KMT2B Cause Early Onset Dystonia. Nature Genetics; 19 Dec 2016


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

System is Failing to Prevent Deaths Following Police Custody and Prison, Study Suggests

System is failing to prevent deaths following police custody and prison, study suggests

source: www.cam.ac.uk

Poor access to health care and confusion over post-detention care may have contributed to more than 400 deaths following police custody and prison detention since 2009, a new report has claimed. Here, in an article first published on The Conversation, report authors Loraine Gelsthorpe and Nicola Padfield of Cambridge’s Faculty of Law, along with their colleague Jake Phillips from Sheffield Hallam University, discuss their findings.

Deaths post-detention should also be subject to similar levels of investigation as those that occur in police custody and prison

Getting released from prison or police custody can be a huge shock to those who have been incarcerated. Our new research gives an indication of just how vulnerable these people can be. We found that over a seven-year period, 400 people died of a suspected suicide within 48 hours of leaving police detention.

The number of people dying in prisons and in police custody has been increasing for several years. There is, rightly, a statutory obligation for every death that occurs within a state institution to be investigated by an independent body. So each death in a prison is investigated by the Prisons and Probation Ombudsman (PPO), while deaths in police custody are investigated by the Independent Police Complaints Commission (IPCC).

But for people who die shortly after release from police or prison custody, their deaths are not subject to statutory investigation and are too often invisible.

A dangerous transition

Our research, published by the Equality and Human Rights Commission, looked into non-natural deaths of people who have been released from police detention or prison custody. We found that the data on these deaths is contingent upon the relevant institutions (prisons, police or probation) finding out about the death in the first place – and this can be difficult.

We examined two sets of data: IPCC data on suspected suicides that occurred within 48 hours of release from police detention and data from the National Offender Management Service on deaths of people under probation supervision, which includes those released from prison. We also conducted interviews with 15 custody sergeants – police officers who are responsible for the welfare of a detainee while in a police station – prison officers and others such as representatives of police and crime commissioners (PCCs) and Public Health England.

The IPCC data suggest that 400 people died of a suspected suicide within 48 hours of release between 2009 and 2016, although the annual number declined between 2014-15 and 2015-16. People who had been detained on suspicion of sex offences accounted for 32% of the 400 suspected suicides.

We also examined a selection of 41 investigations and summaries of investigations into apparent post-release suicides that were provided to us by the IPCC. Half of these people had pre-existing mental health conditions. These referrals also pointed to inadequate risk assessment, record keeping and onward referral to relevant community-based care providers such as mental health or drug treatment providers.

We then looked at deaths that had occurred within 28 days of release from prison. Despite some issues with the accuracy and completeness of the data, we identified 66 people between 2010 and 2015 who had died from non-natural causes within 28 days of leaving prison. The numbers are small and so it is difficult to draw wider conclusions, but we found that 44 of those 66 died from a drug-related death. Of the 66, 35 had served a sentence for an acquisitive offence such as theft, shoplifting or robbery, offences which are commonly associated with drug use.

We also analysed investigations conducted between 2010 and 2015 by the PPO into deaths that occurred in approved premises, also known as bail hostels, within 28 days of release from custody. These investigations seek to understand what, if anything, could have been done to prevent the death. This highlighted problems with supporting drug-using offenders, a lack of confidence among staff and a failure to create a smooth transition from prison into the community.

Staff under strain

These analyses only tell part of the story. Our discussions with custody officers painted a complex picture. They argued that they were getting better at identifying people in custody with mental health conditions but that their ability to deal with them effectively was restricted by factors beyond their control such as a lack of appropriate treatment for people after leaving their care and an inadequate number of beds in mental health hospitals. They told us that the risk assessment tool they use for identifying such people was not fit for purpose because it did not go into enough detail and that they would benefit from additional mental health training. They were also strongly in favour of the responsibility for healthcare commissioning in police stations being handed to the NHS, rather than PCCs, a proposal which was dropped in December 2015.

The story from prison staff was similar, but they also talked about the use of new psychoactive substances and the negative effects these substances are having on mental health and safety in the prison.

Problems also exist when it comes to the provision of community-based care after people are released. These include cuts to community mental health services and drug services, as well as recent changes to the probation service, which have seen 70% of the service outsourced to the private sector. Such reforms have made communication between prisons and probation providers more difficult. These budget cuts and public sector reforms are having a serious impact on the ability of criminal justice agencies to deal with these issues and prevent any future deaths.

There needs to be an improvement in the way data on non-natural deaths is collected. Deaths post-detention should also be subject to similar levels of investigation as those that occur in police custody and prison. It would be naive to suggest that every death of a person leaving state detention can be investigated, but there is scope for more oversight from both the IPCC and the PPO, at least while former detainees are adjusting to life back in the community. At the same time, the government must maintain investment in mental health and drug services to help prevent those who are most vulnerable on release from taking their own lives.

This article was originally published on The Conversation. Read the original article.

Professor Loraine Gelsthorpe is Deputy Director of the Institute of Criminology, University of Cambridge.

Nicola Padfield is Master, Fitzwilliam College, Cambridge, and a Reader in Criminal and Penal Justice, University of Cambridge.

Jake Phillips is a Senior Lecturer in Criminology, Sheffield Hallam University.


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Larger Brain Size Linked To Longer Life In Deer

Larger brain size linked to longer life in deer

source: www.cam.ac.uk

The size of a female animal’s brain may determine whether she lives longer and has more healthy offspring, according to new research led by the University of Cambridge.

We found that some of the cross-species predictions about brain size held for female red deer, and that none of the predictions were supported in male red deer. This indicates that each sex likely experiences its own set of trade-offs with regard to brain size.

Corina Logan

The study, published in the Royal Society Open Science journal, shows that female red deer with larger brains live longer and have more surviving offspring than those with smaller brains. Brain size is heritable and is passed down through the generations. This is the first extensive study of individual differences in brain size in wild mammals and draws on data comparing seven generations of deer.

Across species of mammals, brain size varies widely. This is thought to be a consequence of specific differences in the benefits and costs of a larger brain. Mammals with larger brains may, for example, have greater cognitive abilities that enable them to adapt better to environmental changes or they may have longer lifespans. But there may also be disadvantages: for instance, larger brains require more energy, so individuals that possess them may show reduced fertility.

The researchers, based at the University of Cambridge’s Zoology Department and Edinburgh University’s Institute of Evolutionary Biology, wanted to test if they could find more direct genetic or non-genetic evidence of the costs and benefits of large brain size by comparing the longevity and survival of individuals of the same species with different sized brains. Using the skulls of 1,314 wild red deer whose life histories and breeding success had been monitored in the course of a long-term study on the Isle of Rum, they found that females with larger endocranial volumes lived longer and produced more surviving offspring in the course of their lives.

Lead author Dr Corina Logan, a Gates Cambridge Scholar and Leverhulme Early Career Research Fellow in Cambridge’s Department of Zoology, says: “The reasons for the association between brain size and longevity are not known, but other studies have suggested that larger brains are a consequence of the longer-lived species having longer developmental periods in which the brain can grow. These hypotheses were generated from cross-species correlations; however, testing such hypotheses requires investigations at the within-species level, which is what we did.”

Dr Logan adds: “We found that some of the cross-species predictions about brain size held for female red deer, and that none of the predictions were supported in male red deer. This indicates that each sex likely experiences its own set of trade-offs with regard to brain size.”

The study also showed that females’ relative endocranial volume is smaller than that of males, despite evidence of selection for larger brains in females.

“We think this is likely due to sex differences in the costs and benefits related to larger brains,” adds Dr Logan. “We don’t know what kinds of trade-offs each sex might encounter, but we assume there must be variables that constrain brain size that are sex specific, which is why we see selection in females, but not males.”

Professor Tim Clutton-Brock, who set up the Rum Red Deer study with Fiona Guinness in 1972 and initiated the work on brain size, points out that this kind of study has not been conducted before because it requires long-term records of a large number of individuals across multiple generations, and such data are still rare in wild animals.

Reference
C.J. Logan, R. Stanley, A.M. Thompson, T.H. Clutton-Brock. Endocranial volume is heritable and is associated with longevity and fitness in a wild mammal. Royal Society Open Science; 14 Dec 2016; DOI: 10.1098/rsos.160622


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Cambridge To Play Major Role In €400m EU Food Innovation Project

Cambridge to play major role in €400m EU food innovation project

source: www.cam.ac.uk

The University of Cambridge is one of a number of British universities and companies that have won access to a £340 million EU Innovation programme to change the way we eat, grow and distribute food.

Our joint goal is in making the entire food system more resilient in the context of a changing climate, and improving health and nutrition for people across the world

Howard Griffiths

The project, called EIT Food, has ambitious aims to cut by half the amount of food waste in Europe within a decade, and to reduce ill health caused by diet by 2030. It has received €400 million (£340 million) of EU research funding, matched by €1.2 billion (£1 billion) of funding from industry and other sources over seven years.

The project is funded by the European Institute of Innovation and Technology (EIT), and will have a regional headquarters at the University of Reading to co-ordinate innovation, deliver cutting-edge education programmes and support start-ups in the ‘north west’ sector of Europe, covering the UK, Ireland and Iceland.

The Europe-wide scheme was put together by a partnership of 50 food businesses and research organisations from within Europe’s food sector, which provides jobs for 44 million people. Cambridge is part of one of five regional hubs across Europe. Already confirmed as core partners in the UK-based ‘Co-Location Centre’ (CLC) alongside Cambridge are the academic centres Matís, Queen’s University Belfast and the University of Reading, as well as the businesses ABP Food Group, PepsiCo and The Nielsen Company. Further partners are expected to be announced in the next year.

Professor Howard Griffiths, co-chair of the Global Food Security Strategic Research Initiative at the University of Cambridge, who will lead Cambridge’s involvement in the EIT, said: “Sustainability is a top-level agenda which is engaging both global multinational food producers and academics. Our joint goal is in making the entire food system more resilient in the context of a changing climate, and improving health and nutrition for people across the world.”

EIT Food will set up four programmes to target broad societal challenges, including:

  • personalised healthy food
  • the digitalization of the food system
  • consumer-driven supply chain development, customised products and new technology in farming, processing and retail
  • resource-efficient processes, making food more sustainable by eliminating waste and recycling by-products throughout the food chain.

EIT Food will also organise international entrepreneurship programmes for students, and develop a unique interdisciplinary EIT-labelled Food System MSc for graduates. Thousands of students and food professionals will be trained via workshops, summer schools and online educational programmes such as MOOCs (Massive Open Online Courses) and SPOCs (Specialized Private Online Courses).

Peter van Bladeren, Vice President Nestec, Global head Regulatory and Scientific Affairs for Nestlé and Chair of the Interim Supervisory Board of EIT Food, said: “EIT Food is committed to create the future curriculum for students and food professionals as a driving force for innovation and business creation; it will give the food manufacturing sector, which accounts for 44 million jobs in Europe, a unique competitive edge.”

Adapted from a press release by the University of Reading


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Antarctic Ice Sheet Study Reveals 8,000-Year Record of Climate Change

Antarctic Ice Sheet study reveals 8,000-year record of climate change

source: www.cam.ac.uk

An international team of researchers has found that the Antarctic Ice Sheet plays a major role in regional and global climate variability – a discovery that may also help explain why sea ice in the Southern Hemisphere has been increasing despite the warming of the rest of the Earth.

The Antarctic Ice Sheet has experienced much greater natural variability in the past than previously anticipated.

Michael Weber

Results of the study, co-authored by Michael Weber, a paleoclimatologist and visiting scientist at the University of Cambridge, along with colleagues from the USA, New Zealand and Germany, are published this week in the journal Nature.

Global climate models that look at the last several thousand years have failed to account for the amount of climate variability captured in the paleoclimate record, according to lead author Pepijn Bakker, a climate modeller from the MARUM Center for Marine Environmental Studies at the University of Bremen in Germany.

The researchers first turned their attention to the Scotia Sea. “Most icebergs calving off the Antarctic Ice Sheet travel through this region because of the atmospheric and oceanic circulation,” explained Weber. “The icebergs contain gravel that drops into the sediment on the ocean floor – and analysis and dating of such deposits shows that for the last 8,000 years, there were centuries with more gravel and those with less.”

The research team’s hypothesis is that climate modellers have historically overlooked one crucial element in the overall climate system. They discovered that the centuries-long phases of enhanced and reduced Antarctic ice mass loss documented over the past 8,000 years have had a cascading effect on the entire climate system.

Using sophisticated computer modelling, the researchers traced the variability in iceberg calving (ice that breaks away from glaciers) to small changes in ocean temperatures.

“There is a natural variability in the deeper part of the ocean adjacent to the Antarctic Ice Sheet that causes small but significant changes in temperatures,” said co-author Andreas Schmittner, a climate modeller from Oregon State University. “When the ocean temperatures warm, it causes more direct melting of the ice sheet below the surface, and it increases the number of icebergs that calve off the ice sheet.”

Those two factors combine to provide an influx of fresh water into the Southern Ocean during these warm regimes, according to Peter Clark, a paleoclimatologist from Oregon State University, and co-author on the study.

“The introduction of that cold, fresh water lessens the salinity and cools the surface temperatures, while at the same time stratifying the layers of water,” he said. “The cold, fresh water freezes more easily, creating additional sea ice despite the warmer temperatures hundreds of metres below the surface.”

The discovery may help explain why sea ice is currently expanding in the Southern Ocean despite global warming, the researchers say.

“This response is well-known, but what is less-known is that the input of fresh water also leads to changes far away in the northern hemisphere, because it disrupts part of the global ocean circulation,” explained Nick Golledge from the University of Wellington, New Zealand, an ice-sheet modeller and study co-author. “Meltwater from the Antarctic won’t just raise global sea level, but might also amplify climate changes around the world. Some parts of the North Atlantic may end up with warmer temperatures as a consequence of part of Antarctica melting.”

Golledge used a computer model to simulate how the Antarctic Ice Sheet changed as it came out of the last ice age and into the present, warm period.

“The integration of data and models provides further evidence that the Antarctic Ice Sheet has experienced much greater natural variability in the past than previously anticipated,” added Weber. “We should therefore be concerned that it will possibly act very dynamically in the future, too, specifically when it comes to projecting future sea-level rise.”

Two years ago, Weber led another study, also published in Nature, which found that the Antarctic Ice Sheet collapsed repeatedly and abruptly at the end of the last Ice Age, between 19,000 and 9,000 years ago.


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.