All posts by Admin

New Cases of Dementia in the UK Fall By 20% Over Two Decades

source: www.cam.ac.uk

The UK has seen a 20% fall in the incidence of dementia over the past two decades, according to new research from England led by the University of Cambridge, a drop that equates to an estimated 40,000 fewer cases of dementia than previously predicted. However, the study, published today in Nature Communications, suggests that the dramatic change has been observed mainly in men.

Our evidence shows that the so-called dementia ‘tsunami’ is not an inevitability: we can help turn the tide if we take action now

Carol Brayne

Reports in both the media and from governments have suggested that the world is facing a dementia ‘tsunami’ of ever-increasing numbers, particularly as populations age. However, several recent studies have begun to suggest that the picture is far more complex. Although changing diagnostic methods and criteria are identifying more people as having dementia, societal measures which improve health, such as education and early- and mid-life health promotion (including smoking reduction and attention to diet and exercise), may be driving a reduction in risk in some countries. Prevalence (the proportion of people with dementia) has been reported to have dropped in some European countries, but it is incidence (the proportion of people developing dementia in a given time period) that provides by far the most robust evidence of fundamental change in populations.

As part of the Medical Research Council Cognitive Function and Ageing Study (CFAS), researchers at the University of Cambridge, Newcastle University, Nottingham University and the University of East Anglia interviewed a baseline sample of 7,500 people aged 65 and over in three regions of the UK (Cambridgeshire, Newcastle and Nottingham) between 1991 and 1994, with repeat interviews two years later to estimate incidence. Twenty years later, a new sample of over 7,500 people aged 65 and over from the same localities was interviewed, again with a repeat interview after two years. This is the first study anywhere in the world to directly compare incidence across time in multiple areas using identical methods.
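
For readers unfamiliar with the terms above, the short sketch below shows, in outline, how incidence can be estimated from a baseline interview and a two-year repeat: new cases among people who were dementia-free at baseline, divided by the person-years they were at risk. The numbers are hypothetical; CFAS used more sophisticated statistical modelling than this.

```python
# Illustrative only: crude incidence from a two-wave cohort design (invented counts, not CFAS data).

def incidence_per_1000_person_years(at_risk_at_baseline, new_cases, follow_up_years=2.0):
    """Crude incidence rate, assuming new cases were at risk for half the follow-up on average."""
    person_years = (at_risk_at_baseline - new_cases) * follow_up_years \
                   + new_cases * (follow_up_years / 2)
    return 1000 * new_cases / person_years

# Hypothetical cohort: 7,000 people dementia-free at baseline, 280 new cases at the repeat interview
print(f"{incidence_per_1000_person_years(7000, 280):.1f} new cases per 1,000 person-years")
```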

The researchers found that dementia incidence across the two decades has dropped by 20% and that this fall is driven by a reduction in incidence among men at all ages. These findings suggest that in the UK there are just under 210,000 new cases per year: 74,000 men and 135,000 women – this is compared to an anticipated 250,000 new cases based on previous levels. Incidence rates are higher in more deprived areas.

Even in the presence of an ageing population, this means that the number of people estimated to develop dementia in any year has remained relatively stable, providing evidence that dementia risk in whole populations can change. It is not clear why rates among men have declined faster than those among women, though it may be related to the decline in smoking and improvements in vascular health among men.

Professor Carol Brayne, Director of the Cambridge Institute of Public Health, University of Cambridge, says: “Our findings suggest that brain health is improving significantly in the UK across generations, particularly among men, but that deprivation is still putting people at a disadvantage. The UK in earlier eras has seen major societal investments into improving population health and this appears to be helping protect older people from dementia. It is vital that policies take potential long term benefits into account.”

Professor Fiona Matthews from the Institute of Health and Society, Newcastle University and the MRC Biostatistics Unit, Cambridge adds: “Public health measures aimed at reducing people’s risk of developing dementia are vital and potentially more cost effective in the long run than relying on early detection and treating dementia once it is present. Our findings support a public health approach for long term dementia prevention, although clearly this does not reduce the need for alternative approaches for at-risk groups and for those who develop dementia.”

The researchers argue that while influential reports continue to promote future scenarios of huge increases of people with dementia across the globe, their study shows that global attention and investment in reducing the risk of dementia can help prevent such increases.

“While we’ve seen investment in Europe and many other countries, the lack of progress in access to education, malnutrition in childhood and persistent inequalities within and across other countries means that dementia will continue to have a major impact globally,” says Professor Brayne. “Our evidence shows that the so-called dementia ‘tsunami’ is not an inevitability: we can help turn the tide if we take action now.”

Dr Rob Buckle, director of science programmes at the Medical Research Council, which funded the study, added: “It is promising news that dementia rates, especially amongst men, have dropped by such a significant amount over the last twenty years, and testament to the benefits of an increased awareness of a brain-healthy lifestyle. However, the burden of dementia will continue to have significant societal impact given the growing proportion of elderly people within the UK population, and it is therefore as important as ever that we continue to search for new ways of preventing and treating the disease. This study does, however, reinforce the importance of long-term, quality studies that create a wealth of data, an invaluable resource for researchers.”

Reference
Matthews, FE et al. ‘A two decade comparison of incidence of dementia in individuals aged 65 years and older from three geographical areas of England: results of the Cognitive Function and Ageing Study I and II.’ Nature Communications; 19 April 2016. DOI: 10.1038/ncomms11398


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


Dr Belinda Quinn Appointed As Chief Executive Officer of the Precision Medicine Catapult

Dr Belinda Quinn

source: https://pm.catapult.org.uk

19 April 2016, Cambridge, UK – The Precision Medicine Catapult, the UK’s innovation centre for precision medicine, announces today it has appointed Dr Belinda Quinn as its new Chief Executive Officer.

Belinda has been working as Chief Clinical Officer of the Precision Medicine Catapult over the past six months, formulating strategy and mapping the landscape of precision medicine opportunities across the UK. She has played a pivotal leadership role in mobilising the seven Centres of Excellence integral to the success of the Precision Medicine Catapult.

Belinda trained as a doctor, specialising in neurology, before diversifying into IT and clinical leadership roles. She has held executive and transformational change roles across the public and private sectors, including big pharma, global management consulting, the NHS, and data and regulatory bodies, in the UK, Australia and the Middle East.

Richard Barker, Chairman of the Precision Medicine Catapult said: “Belinda brings deep experience and great energy to the task of building the Precision Medicine Catapult and delivering its strategy. With our ambition to create value for the NHS, private sector companies and most importantly for patients through the development of precision medicine, her ability to bring together all these interests will be invaluable.”

Ruth McKernan, Chief Executive of Innovate UK, the Precision Medicine Catapult’s primary funder, said: “The Precision Medicine Catapult will transform the UK’s capability for innovation in this important sector, helping to drive economic growth and deliver significant improvements in health. Belinda has already built a strong reputation with colleagues across the UK and I am confident that the Catapult will be a powerful force in making the UK a leader in precision medicine.”

Belinda Quinn said: “I am delighted to accept the role as CEO of the Precision Medicine Catapult and will continue to build on the momentum and progress we have made in the last year. I am excited by the opportunity to work with the many world-leading experts in research and clinical practice that we have in this country, coupled with industry partners and Innovate UK to help bring forward innovative, sustainable and more cost-effective solutions that will build the UK’s precision medicine industry, benefit patients and improve the effectiveness and efficiency of the healthcare system in the UK.”

Read the full press release

UK Steel Can Survive If It Transforms Itself, Say Researchers

source: www.cam.ac.uk

A new report from the University of Cambridge claims that British steel could be saved, if the industry is willing to transform itself.

We will never need more capacity for making steel from iron ore than we have today.

Julian Allwood

The report, by Professor Julian Allwood, argues that in order to survive, the UK steel industry needs to refocus itself on steel recycling and on producing products for end users. He argues that Tata Steel’s UK exit can be viewed as an opportunity rather than a catastrophe.

Allwood’s report, A bright future for UK steel: A strategy for innovation and leadership through up-cycling and integration, uses evidence gathered from over six years of applied research by 15 researchers, funded by the UK’s Engineering and Physical Sciences Research Council (EPSRC) and industrial partners spanning the global steel supply chain. It is published online today (15 April).

“Tata Steel is pulling out of the UK, for good reason, and there are few if any willing buyers,” said Allwood, from Cambridge’s Department of Engineering. “Despite the sale of the Scunthorpe plant announced earlier this week, the UK steel industry is in grave jeopardy, and it appears that UK taxpayers must either subsidise a purchase, or accept closure and job losses.

“However, we believe that there is a third option, which would allow a transformation of the UK’s steel industry.”

One option for the UK steel industry is to refocus on recycling steel rather than producing it from scratch. The global market for steel recycling is projected to grow at least three-fold in the next 30 years, but although more than 90% of steel is already recycled, the processes by which recycling happens are out of date. The quality of recycled steel is generally low, due to poor control of its composition.

Because of this, old steel is generally ‘down-cycled’ to the lowest value steel application – reinforcing bar. According to Allwood, the UK’s strengths in materials innovation could be applied to instead ‘up-cycle’ old steel to today’s high-tech compositions.

According to Allwood, today’s global steel industry has more capacity for making steel from iron ore than it will ever need again. On average, products made with steel last 35-40 years, and around 90% of all old steel is collected. It is likely that, despite the current downturn, global demand for steel will continue to grow, but all future growth can be met by recycling our existing stock of steel. “We will never need more capacity for making steel from iron ore than we have today,” said Allwood.

Apart from the issue of recycling, today’s UK steel industry focuses on products such as plates, bars and coils of strip, all of which have low profit margins. “The steel industry fails to capture the value and innovation potential from making final components,” said Allwood. “As a result, more than a quarter of all steel is cut off during fabrication and never enters a product, and most products use at least a third more steel than actually required. The makers of liquid steel could instead connect directly to final customer requirements.”

These two opportunities create the scope for a transformation of the steel industry in the UK, says the report. In response to Tata Steel’s decision, UK taxpayers will have to bear costs. If the existing operations are to be sold, taxpayers must subsidise the purchase without the guarantee of a long term national gain. If the plants are closed, the loss of tax income and payment of benefits will cost taxpayers £300m-£800m per year, depending on knock-on job losses.

Allwood’s strategy requires taxpayers to invest in a transformation, for example through the provision of a long-term loan. This would allow the UK to innovate more than any other large player, with the potential of leadership in a global market that is certain to triple in size.

He singles out the example of the Danish government’s Wind Power Programme, initiated in 1976, which provided a range of subsidies and support for Denmark’s nascent wind industry, allowing it to establish a world-leading position in a growing market. Allwood believes a similar initiative by the UK government could mirror this success and transform the steel industry. “Rapid action now to initiate working groups on the materials technologies, business model innovations, financing and management of the proposed transformation could convert this vision to a plan for action before the decision for plant closure or subsidised sale is finalised,” he said. “This is worth taking a real shot on.”


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


Cambridge to Research Future Computing Tech That Could “Ignite A Technology Field”

Source: www.cam.ac.uk

A Cambridge-led project aiming to develop a new architecture for future computing based on superconducting spintronics – technology designed to increase the energy-efficiency of high-performance computers and data storage – has been announced.

Superconducting spintronics offer extraordinary potential because they combine the properties of two traditionally incompatible fields to enable ultra-low power digital electronics

Jason Robinson

A project which aims to establish the UK as an international leader in the development of “superconducting spintronics” – technology that could significantly increase the energy-efficiency of data centres and high-performance computing – has been announced.

Led by researchers at the University of Cambridge, the “Superspin” project aims to develop prototype devices that will pave the way for a new generation of ultra-low power supercomputers, capable of processing vast amounts of data, but at a fraction of the huge energy consumption of comparable facilities at the moment.

As more economic and cultural activity moves online, the data centres which house the servers needed to handle internet traffic are consuming increasing amounts of energy. For example, an estimated three per cent of the power generated in Europe is already used by data centres, which act as repositories for billions of gigabytes of information.

Superconducting spintronics is a new field of scientific investigation that has only emerged in the last few years. Researchers now believe that it could offer a pathway to solving the energy demands posed by high performance computing.

As the name suggests, it combines superconducting materials – which can carry a current without losing energy as heat – with spintronic devices. These are devices which manipulate a feature of electrons known as their “spin”, and are capable of processing large amounts of information very quickly.

Given the energy-efficiency of superconductors, combining the two sounds like a natural marriage, but until recently it was also thought to be completely impossible. Most spintronic devices have magnetic elements, and this magnetism prevents superconductivity, and hence reduces any energy-efficiency benefits.

Building on the discovery of spin-polarised supercurrents at the University of Cambridge in 2010, recent research there and at other institutions has shown that it is possible to power spintronic devices with a superconductor. The aim of the new £2.7 million project, which is being funded by the Engineering and Physical Sciences Research Council, is to use this as the basis for a new style of computing architecture.

Although work is already underway in several other countries to exploit superconducting spintronics, the Superspin project is unprecedented in terms of its magnitude and scope.

Researchers will explore how the technology could be applied in future computing as a whole, examining fundamental problems such as spin generation and flow, and data storage, while also developing sample devices. According to the project proposal, the work has the potential to establish Britain as a leading centre for this type of research and “ignite a technology field.”

The project will be led by Professor Mark Blamire, Head of the Department of Materials Sciences at the University of Cambridge, and Dr Jason Robinson, University Lecturer in Materials Sciences, Fellow of St John’s College, University of Cambridge, and University Research Fellow of the Royal Society. They will work with partners in the University’s Cavendish Laboratory (Dr Andrew Ferguson) and at Royal Holloway, London (Professor Matthias Eschrig).

Blamire and Robinson’s core vision for the programme is “to generate a paradigm shift in spin electronics, using recent discoveries about how superconductors can be combined with magnetism.” The programme will provide a pathway to dramatic improvements in computing energy efficiency.

Robinson added: “Many research groups have recognised that superconducting spintronics offer extraordinary potential because they combine the properties of two traditionally incompatible fields to enable ultra-low power digital electronics.”

“However, at the moment, research programmes around the world are individually studying fascinating basic phenomena, rather than looking at developing an overall understanding of what could actually be delivered if all of this was joined up. Our project will aim to establish a closer collaboration between the people doing the basic science, while also developing demonstrator devices that can turn superconducting spintronics into a reality.”

The initial stages of the five-year project will be exploratory, examining different ways in which spin can be transported and magnetism controlled in a superconducting state. By 2021, however, the team hope that they will have manufactured sample logic and memory devices – the basic components that would be needed to develop a new generation of low-energy computing technologies.

The project will also report to an advisory board, comprising representatives from several leading technology firms, to ensure an ongoing exchange between the researchers and industry partners capable of taking its results further.

“The programme provides us with an opportunity to take international leadership of this as a technology, as well as in the basic science of studying and improving the interaction between superconductivity and magnetism,” Blamire said. “Once you have grasped the physics behind the operation of a sample device, scaling up from the sort of models that we are aiming to develop is not, in principle, too taxing.”


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


Graduate Earnings: What You Study and Where Matters – But So Does Parents’ Income

source: www.cam.ac.uk

First ‘big data’ research approach to graduate earnings reveals significant variations depending on student background, degree subject and university attended.

The research illustrates strongly that, for most graduates, higher education leads to much better earnings than those earned by non-graduates, although students need to realise that their subject choice is important in determining how much of an earnings advantage they will have

Anna Vignoles

Latest research has shown that graduates from richer family backgrounds earn significantly more after graduation than their poorer counterparts, even after completing the same degrees from the same universities.

The finding is one of many from a new study, published today, which looks at the link between earnings and students’ background, degree subject and university.

The research also found that those studying medicine and economics earn far more than those studying other degree subjects, and that there is considerable variation in graduates’ earnings depending on the university attended.

The study was carried out by the Institute for Fiscal Studies and the universities of Cambridge and Harvard, including Professor Anna Vignoles from Cambridge’s Faculty of Education. It is the first time a ‘big data’ approach has been used to look at how graduate earnings vary by institution of study, degree subject and parental income.

The researchers say that many other factors beyond graduate earnings, such as intrinsic interest, will and should drive student choice. However, they write that the research shows the potential value of providing some useful information that might inform students’ choice of degree – particularly to assist those from more disadvantaged backgrounds who might find it harder to navigate the higher education system.

“It would seem important to ensure there is adequate advice and guidance given that graduates’ future earnings are likely to vary depending on the institution and subject they choose, with implications for social mobility,” write the researchers in the study’s executive summary.

The research used anonymised tax data and student loan records for 260,000 students up to ten years after graduation. The dataset includes cohorts of graduates who started university in the period 1998-2011 and whose earnings (or lack of earnings) are then observed over a number of tax years. The paper focuses on the tax year 2012/13.

The study found that those from richer backgrounds (defined as being approximately from the top 20% of households of those applying to higher education in terms of family income) did better in the labour market than the other 80% of students.

The average gap in earnings between students from higher and lower income backgrounds is £8,000 a year for men and £5,300 a year for women, ten years after graduation.

Even after taking account of subject studied and the characteristics of the institution of study, the average student from a higher income background earned about 10% more than other students.

The gap is bigger at the top of the distribution – the 10% highest earning male graduates from richer backgrounds earned about 20% more than the 10% highest earners from relatively poorer backgrounds. The equivalent premium for the 10% highest earning female graduates from richer backgrounds was 14%.

The study also showed that graduates are much more likely to be in work, and to earn much more, than non-graduates. Ten years on, non-graduates were twice as likely as graduates to have no earnings (30% against 15% for the cohort who enrolled in higher education in 1999).

Partly as a result of this, half of non-graduate women had earnings below £8,000 a year at around age 30, say the researchers. Only a quarter of female graduates were earning less than this, and half were earning more than £21,000 a year.

Among those with significant earnings (which the researchers define as above £8,000 a year), median earnings for male graduates ten years after graduation were £30,000. For non-graduates of the same age median earnings were £21,000. The equivalent figures for women with significant earnings were £27,000 and £18,000.
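
As an illustration of the kind of calculation behind the figures above, the sketch below computes median earnings by graduate status and sex from a toy table of individual records. The records are invented; the study itself used linked, anonymised tax and student loan data rather than anything like this toy table.

```python
# Illustrative only: median earnings by graduate status and sex from invented records.
import pandas as pd

records = pd.DataFrame({
    "graduate": [True, True, True, True, False, False, False, False],
    "sex":      ["M",  "M",  "F",  "F",  "M",   "M",   "F",   "F"],
    "earnings": [32000, 28000, 27500, 26000, 21000, 20000, 17500, 0],
})

# Restrict to "significant earnings" (above 8,000 pounds a year), as in the study,
# then take the median within each graduate-status and sex group.
significant = records[records["earnings"] > 8000]
print(significant.groupby(["graduate", "sex"])["earnings"].median())

# Share of each group with no earnings at all
print(records.assign(no_earnings=records["earnings"] == 0)
             .groupby("graduate")["no_earnings"].mean())
```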

“The research illustrates strongly that, for most graduates, higher education leads to much better earnings than those earned by non-graduates, although students need to realise that their subject choice is important in determining how much of an earnings advantage they will have,” said Professor Vignoles.

The researchers also found substantial differences in earnings according to which university was attended, as well as which subject was studied. They say however that this is in large part driven by differences in entry requirements.

For instance, more than 10% of male graduates from LSE, Oxford and Cambridge were earning in excess of £100,000 a year ten years after graduation, with LSE graduates earning the most. LSE was the only institution with more than 10% of its female graduates earning in excess of £100,000 a year ten years on.

Even without focusing on the very top, the researchers say they found a large number of institutions (36 for men and 10 for women) had 10% of their graduates earning more than £60,000 a year ten years on. At the other end of the spectrum, there were some institutions (23 for men and 9 for women) where the median graduate earnings were less than those of the median non-graduate ten years on.

However, the researchers say that it is important to put this in context. “Given regional differences in average wages, some very locally focused institutions may struggle to produce graduates whose wages outpace England-wide earnings, which include those living in London, where full-time earnings for males are around 50% higher than in some other regions, such as Northern Ireland,” they write.

In terms of earnings according to subject, medical students were easily the highest earners at the median ten years out, followed by those who studied economics. For men, median earnings for medical graduates were about £50,000 after ten years, and for economics graduates £40,000.

Those studying the creative arts had the lowest earnings, and earned no more on average than non-graduates. However, the researchers say that some of these earnings differences are, of course, attributable to differences in student intake – since students with different levels of prior achievement at A-level take different subject options.

“When we account for different student intakes across subjects, only economics and medicine remain outliers with much higher earnings at the median as compared to their peers in other subjects,” write the researchers.

After allowing for differences in the characteristics of those who take different subjects, male medical graduates earn around £13,000 more at the median than similar engineering and technology graduates; the gap for women is approximately £16,000. Both male and female medical graduates earn around £14,000 more at the median than similar law graduates.

“Earnings vary substantially with university, subject, gender and cohort,” said study co-author Neil Shephard of Harvard University. “This impacts on which parts of the HE sector the UK Government funds through the subsidy inherent within income contingent student loans. The next step in the research is to quantify that variation in funding, building on today’s paper.”

Reference:
Institute for Fiscal Studies working paper: ‘How English domiciled graduate earnings vary with gender, institution attended, subject and socio-economic background’, Jack Britton, Lorraine Dearden, Neil Shephard and Anna Vignoles.


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


Predicting Gentrification Through Social Networking Data

source: www.cam.ac.uk

Data from location-based social networks may be able to predict when a neighbourhood will go through the process of gentrification, by identifying areas with high social diversity and high deprivation.

We understand that people who diversify their contacts socially and geographically have high social capital, but what about places?

Desislava Hristova

The first network to look at the interconnected nature of people and places in large cities is not only able to quantify the social diversity of a particular place, but can also be used to predict when a neighbourhood will go through the process of gentrification, which is associated with the displacement of residents of a deprived area by an influx of a more affluent population.

The researchers behind the study, led by the University of Cambridge, will present their results today (13 April) at the 25th International World Wide Web Conference in Montréal.

The Cambridge researchers, working with colleagues from the University of Birmingham, Queen Mary University of London, and University College London, used data from approximately 37,000 users and 42,000 venues in London to build a network of Foursquare places and the parallel Twitter social network of visitors, adding up to more than half a million check-ins over a ten-month period. From this data, they were able to quantify the ‘social diversity’ of various neighbourhoods and venues by distinguishing between places that bring together strangers versus those that tend to bring together friends, as well as places that attract diverse individuals as opposed to those which attract regulars.

When these social diversity metrics were correlated with wellbeing indicators for various London neighbourhoods, the researchers discovered that signs of gentrification, such as rising housing prices and lower crime rates, were the strongest in deprived areas with high social diversity. These areas had an influx of more affluent and diverse visitors, represented by social media users, and pointed to an overall improvement of their rank, according to the UK Index of Multiple Deprivation.

The UK Index of Multiple Deprivation (IMD) is a statistical exercise conducted by the Department for Communities and Local Government, which measures the relative prosperity of neighbourhoods across England. The researchers compared IMD data for 2010, the year their social and place network data was gathered, with the IMD data for 2015, the most recent report.

“We’re looking at the social roles and properties of places,” said Desislava Hristova from the University’s Computer Laboratory, and the study’s lead author. “We found that the most socially cohesive and homogenous areas tend to be either very wealthy or very poor, but neighbourhoods with both high social diversity and high deprivation are the ones which are currently undergoing processes of gentrification.”

This aligns with previous research, which has found that tightly-knit communities are more resistant to change and that resources tend to remain within the community. This suggests that affluent communities remain affluent and poor communities remain poor because they are relatively isolated.

Hristova and her co-authors found that of the 32 London boroughs, the borough of Hackney had the highest social diversity, and in 2010, had the second-highest deprivation. By 2015, it had also seen the most improvement on the IMD index, and is now an area undergoing intense gentrification, with house prices rising far above the London average, fast-decreasing crime rate and a highly diverse population.

In addition to Hackney, Tower Hamlets, Greenwich, Hammersmith and Lambeth are also boroughs with high social diversity and high deprivation in 2010, and are now undergoing the process of gentrification, with all of the positive and negative effects that come along with it.

The ability to predict the gentrification of neighbourhoods could help local governments and policy-makers improve urban development plans and alleviate the negative effects of gentrification while benefitting from economic growth.

In order to measure the social diversity of a given place or neighbourhood, the researchers defined four distinct measures: brokerage, serendipity, entropy and homogeneity. Brokerage is the ability of a place to connect people who are otherwise disconnected; serendipity is the extent to which a place can induce chance encounters between its visitors; entropy is the extent to which a place is diverse with respect to visits; and homogeneity is the extent to which the visitors to a place are homogenous in their characteristics.
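
As a rough illustration (not the authors’ code, and simplified relative to the definitions used in the paper), the sketch below computes two such quantities from hypothetical check-in data: the Shannon entropy of a venue’s visits across distinct visitors, and a crude brokerage proxy based on how many visitor pairs are not already friends.

```python
# Illustrative sketch of place-diversity metrics from hypothetical check-in data.
from collections import Counter
from itertools import combinations
from math import log2

def visit_entropy(checkins):
    """Shannon entropy of a venue's check-ins across distinct visitors (higher = more diverse visits)."""
    counts = Counter(checkins)
    total = sum(counts.values())
    return -sum((c / total) * log2(c / total) for c in counts.values())

def brokerage(checkins, friendships):
    """Fraction of visitor pairs who are not friends: a crude proxy for a place's
    ability to bring together otherwise disconnected people."""
    pairs = list(combinations(set(checkins), 2))
    if not pairs:
        return 0.0
    strangers = sum(1 for a, b in pairs if frozenset((a, b)) not in friendships)
    return strangers / len(pairs)

# Hypothetical venue visited mostly by one-off visitors, with few ties between them
checkins = ["u1", "u2", "u3", "u4", "u5", "u1"]
friendships = {frozenset(("u1", "u2"))}
print(visit_entropy(checkins), brokerage(checkins, friendships))
```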

Within categories of places, the researchers found that some places were more likely places for friends to meet, and some were for more fleeting encounters. For example, in the food category, strangers were more likely to meet at a dumpling restaurant while friends were more likely to meet at a fried chicken restaurant. Similarly, friends were more likely to meet at a B&B, football match or strip club, while strangers were more likely to meet at a motel, art museum or gay bar.

“We understand that people who diversify their contacts socially and geographically have high social capital, but what about places?” said Hristova. “We all have a general notion of the social diversity of places and the people that visit them, but we’ve attempted to formalise this – it could even be used as a specialised local search engine.”

For instance, while there are a number of ways a tourist can find a highly-recommended restaurant in a new city, the social role that a place plays in a city is normally only known by locals through experience. “Whether a place is touristy or quiet, artsy or mainstream could be integrated into mobile system design to help newcomers or tourists feel like locals,” said Hristova.

Reference:
Desislava Hristova et al. ‘Measuring Urban Social Diversity Using Interconnected Geo-Social Networks.’ Paper presented to the International World Wide Web Conference, Montréal, 11-15 April 2016. http://www2016.ca/program-at-a-glance.html


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


Neanderthals May Have Been Infected By Diseases Carried Out of Africa By Humans, Say Researchers

Source: www.cam.ac.uk

Review of the latest genetic evidence suggests infectious diseases are tens of thousands of years older than previously thought, and that they could jump between species of ‘hominin’. Researchers say that humans migrating out of Africa would have been ‘reservoirs of tropical disease’ – disease that may have sped up Neanderthal extinction.

Humans migrating out of Africa would have been a significant reservoir of tropical diseases

Charlotte Houldcroft

A new study suggests that Neanderthals across Europe may well have been infected with diseases carried out of Africa by waves of anatomically modern humans, or Homo sapiens. As both were species of hominin, it would have been easier for pathogens to jump populations, say researchers. This might have contributed to the demise of Neanderthals.

Researchers from the universities of Cambridge and Oxford Brookes have reviewed the latest evidence gleaned from pathogen genomes and DNA from ancient bones, and concluded that some infectious diseases are likely to be many thousands of years older than previously believed.

There is evidence that our ancestors interbred with Neanderthals and exchanged genes associated with disease. There is also evidence that viruses moved into humans from other hominins while still in Africa. So, the researchers argue, it makes sense to assume that humans could, in turn, pass disease to Neanderthals, and that – if we were mating with them – we probably did.

Dr Charlotte Houldcroft, from Cambridge’s Division of Biological Anthropology, says that many of the infections likely to have passed from humans to Neanderthals – such as tapeworm, tuberculosis, stomach ulcers and types of herpes – are chronic diseases that would have weakened the hunter-gatherer Neanderthals, making them less fit and less able to find food, which could have catalysed the extinction of the species.

“Humans migrating out of Africa would have been a significant reservoir of tropical diseases,” says Houldcroft. “For the Neanderthal population of Eurasia, adapted to that geographical infectious disease environment, exposure to new pathogens carried out of Africa may have been catastrophic.”

“However, it is unlikely to have been similar to Columbus bringing disease into America and decimating native populations. It’s more likely that small bands of Neanderthals each had their own infection disasters, weakening the group and tipping the balance against survival,” says Houldcroft.

New techniques developed in the last few years mean researchers can now peer into the distant past of modern disease by unravelling its genetic code, as well as extracting DNA from fossils of some of our earliest ancestors to detect traces of disease.

In a paper published today in the American Journal of Physical Anthropology, Houldcroft, who also studies modern infections at Great Ormond Street Hospital, and Dr Simon Underdown, a researcher in human evolution from Oxford Brookes University, write that genetic data shows many infectious diseases have been “co-evolving with humans and our ancestors for tens of thousands to millions of years”.

The longstanding view of infectious disease is that it exploded with the dawning of agriculture some 8,000 years ago, as increasingly dense and sedentary human populations coexisted with livestock, creating a perfect storm for disease to spread. The researchers say the latest evidence suggests disease had a much longer “burn in period” that pre-dates agriculture.

In fact, they say that many diseases traditionally thought to be ‘zoonoses’, transferred from herd animals into humans, such as tuberculosis, were actually transmitted into the livestock by humans in the first place.

“We are beginning to see evidence that environmental bacteria were the likely ancestors of many pathogens that caused disease during the advent of agriculture, and that they initially passed from humans into their animals,” says Houldcroft.

“Hunter-gatherers lived in small foraging groups. Neanderthals lived in groups of between 15-30 members, for example. So disease would have broken out sporadically, but have been unable to spread very far. Once agriculture came along, these diseases had the perfect conditions to explode, but they were already around.”

There is as yet no hard evidence of infectious disease transmission between humans and Neanderthals; however, considering the overlap in time and geography, and not least the evidence of interbreeding, Houldcroft and Underdown say that it must have occurred.

Neanderthals would have adapted to the diseases of their European environment. There is evidence that humans benefited from receiving genetic components through interbreeding that protected them from some of these: types of bacterial sepsis – blood poisoning occurring from infected wounds – and encephalitis caught from ticks that inhabit Siberian forests.

In turn, the humans, unlike Neanderthals, would have been adapted to African diseases, which they would have brought with them during waves of expansion into Europe and Asia.

The researchers describe Helicobacter pylori, a bacterium that causes stomach ulcers, as a prime candidate for a disease that humans may have passed to Neanderthals. It is estimated to have first infected humans in Africa 88 to 116 thousand years ago, and arrived in Europe after 52,000 years ago. The most recent evidence suggests Neanderthals died out around 40,000 years ago.

Another candidate is herpes simplex 2, the virus which causes genital herpes. There is evidence preserved in the genome of this disease that suggests it was transmitted to humans in Africa 1.6 million years ago from another, currently unknown hominin species that in turn acquired it from chimpanzees.

“The ‘intermediate’ hominin that bridged the virus between chimps and humans shows that diseases could leap between hominin species. The herpesvirus is transmitted sexually and through saliva. As we now know that humans bred with Neanderthals, and we all carry 2-5% of Neanderthal DNA as a result, it makes sense to assume that, along with bodily fluids, humans and Neanderthals transferred diseases,” says Houldcroft.

Recent theories for the cause of Neanderthal extinction range from climate change to an early human alliance with wolves resulting in domination of the food chain. “It is probable that a combination of factors caused the demise of Neanderthals,” says Houldcroft, “and the evidence is building that spread of disease was an important one.”

Inset image: Dr Charlotte Houldcroft


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


Timber Skyscrapers Could Transform London’s Skyline

source: www.cam.ac.uk

London’s first timber skyscraper could be a step closer to reality this week after researchers presented Mayor of London Boris Johnson with conceptual plans for an 80-storey, 300m high wooden building integrated within the Barbican.

If London is going to survive it needs to increasingly densify. One way is taller buildings. We believe people have a greater affinity for taller buildings in natural materials rather than steel and concrete towers.

Michael Ramage

Researchers from Cambridge University’s Department of Architecture are working with PLP Architecture and engineers Smith and Wallwork on the future development of tall timber buildings in central London.

The use of timber as a structural material in tall buildings is an area of emerging interest because of its variety of potential benefits; the most obvious is that it is a renewable resource, unlike prevailing construction methods which use concrete and steel. The research is also investigating other potential benefits, such as reduced costs and improved construction timescales, increased fire resistance, and a significant reduction in the overall weight of buildings.

The conceptual proposals currently being developed would create over 1,000 new residential units in a 1 million sq ft mixed-use tower and mid-rise terraces in central London, integrated within the Barbican.

Dr Michael Ramage, Director of Cambridge’s Centre for Natural Material Innovation, said: “The Barbican was designed in the middle of the last century to bring residential living into the city of London – and it was successful. We’ve put our proposals on the Barbican as a way to imagine what the future of construction could look like in the 21st century.

“If London is going to survive it needs to increasingly densify. One way is taller buildings. We believe people have a greater affinity for taller buildings in natural materials rather than steel and concrete towers. The fundamental premise is that timber and other natural materials are vastly underused and we don’t give them nearly enough credit. Nearly every historic building, from King’s College Chapel to Westminster Hall, has made extensive use of timber.”

Kevin Flanagan, Partner at PLP Architecture, said: “We now live predominantly in cities and so the proposals have been designed to improve our wellbeing in an urban context. Timber buildings have the potential architecturally to create a more pleasing, relaxed, sociable and creative urban experience.

“Our firm is currently designing many of London’s tall buildings, and the use of timber could transform the way we build in this city. We are excited to be working with the University and with Smith and Wallwork on this ground breaking design- and engineering-based research.”

The tall timber buildings research also looks towards creating new design potentials with timber buildings, rather than simply copying the forms of steel and concrete construction. The transition to timber construction may have a wider positive impact on urban environments and built form, and offers opportunities not only to rethink the aesthetics of buildings, but also the structural methodologies informing their design as well.

Just as major innovations in steel, glass and concrete revolutionised buildings in the 19th and 20th centuries, creating Joseph Paxton’s Crystal Palace and the Parisian arcades described by Walter Benjamin, innovations in timber construction could lead to entirely new experiences of the city in the 21st century.

The type of wood these new buildings would use is regarded as a ‘crop’. The amount of crop forest in the world is currently expanding. Canada alone could produce more than 15 billion m³ of crop forest in the next 70 years, enough to house around a billion people.

At present, the world’s tallest timber building is a 14-storey apartment block in Bergen, Norway. The proposals presented to Johnson included concepts for a timber tower nearly 300m high, which would make it the second tallest building in London after The Shard.

Dr Ramage added: “We’ve designed the architecture and engineering and demonstrated it will stand, but this is at a scale no one has attempted to build before. We are developing a new understanding of primary challenges in structure and construction. There is a lot of work ahead, but we are confident of meeting all the challenges before us.”

Perhaps the most obvious concern for potential residents of homes built primarily from timber is fire risk. However, the team involved in the project said the proposed building would eventually meet or exceed every existing fire regulation currently in place for steel and concrete buildings.

Recent research has also shown that timber buildings can have positive effects on the health of their users and occupants. Some recent studies have also suggested that children taught in schools with timber structures may perform better than those taught in concrete buildings.

The design for the Barbican is the first in a series of timber skyscrapers developed by Cambridge University in association with globally renowned architects and structural engineers, with funding from the UK’s Engineering and Physical Sciences Research Council.


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


Pioneering Centre For Physical Sciences-Industry Collaborations Opens At University of Cambridge

source: www.cam.ac.uk

The £26 million Maxwell Centre will focus on “blue skies” research in areas such as efficient energy generation, storage and use, including work on photovoltaics, energy storage, refrigeration, lighting and ICT.

The Maxwell Centre significantly strengthens our drive to deliver new knowledge and applications for industry, underpinning growth and fostering our innovative partnerships between research and business.

Professor Sir Leszek Borysiewicz, Vice-Chancellor of the University of Cambridge

A centrepiece for industrial partnership with the physical sciences and engineering officially opens today. The building will be opened by David Harding, whose generous sponsorship of the Physics of Sustainability programme was central to the funding of the Maxwell Centre project by HEFCE.

In addition, the centre will foster advanced scientific computing, materials science research, nanoscience and biophysics.

Hosted by the Cavendish Laboratory, the Centre provides facilities for the University of Cambridge’s Science and Technology campus as well as collaborators from industry, with offices, laboratory and meeting spaces for more than 230 people.

It will house researchers from the University’s Physics, Chemistry, Chemical Engineering and Biotechnology, Engineering, and Materials Science and Metallurgy departments. It is also home to two EPSRC (Engineering and Physical Sciences Research Council) Centres for Doctoral Training, the SKF University Technology Centre and the Energy@Cambridge Initiative, and connects to several other Cambridge Strategic Research Initiatives and Networks.

The Maxwell Centre is due to become the Cambridge satellite centre for the Sir Henry Royce Institute for Advanced Materials Research, and will also host the partnership between ARM and the University of Cambridge. The latter collaboration will research new technologies to ensure that data-intensive computing can be delivered within the constrained energy budgets governing many computing applications.

Director of the Centre, Professor Sir Richard Friend, said: “The Centre will translate ‘blue skies’ research into products vital for industry.

“The co-location of academics and industry supports a two-way flow of ideas. New research opportunities are often revealed by industrial activity, and their solutions require the transfer of ideas and techniques, often from fields well away from the industry.

“It demonstrates our commitment to collaborating with industry, large and small, through intellectual innovation.”

Professor Sir Leszek Borysiewicz, Vice-Chancellor of the University of Cambridge, said: “The Maxwell Centre significantly strengthens our drive to deliver new knowledge and applications for industry, underpinning growth and fostering our innovative partnerships between research and business.

“This builds on established efforts to embed industrial engagement still further into the University, driving forward real excellence in translational research.”

The Centre will take forward research activity currently supported by the Winton Programme for the Physics of Sustainability at the Cavendish Laboratory, where the focus has been on original, risk-taking science since its inception in March 2011.

David Harding, Founder and CEO of Winton Capital, and an alumnus of the Physics Department, gave £20 million to the Cavendish Laboratory in 2011 to establish the Winton Programme, providing the freedom to explore basic science that could generate the much needed breakthroughs for the resource-strained world.

The Centre is named after the physicist James Clerk Maxwell, who was appointed the first Professor of Experimental Physics at Cambridge in 1871, and whose work established the theory of electromagnetism and helped found statistical mechanics.

It is located between the Physics of Medicine building and the William Gates building on the West Cambridge Physical Science and Technology campus.


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


Map of Rocky Exoplanet Reveals a Lava World

Source: www.cam.ac.uk

The most detailed map of a small, rocky ‘super Earth’ to date reveals a planet almost completely covered by lava, with a molten ‘hot’ side and solid ‘cool’ side.

We still don’t know exactly what this planet is made of – it’s still a riddle. These results are like adding another brick to the wall, but the exact nature of this planet is still not completely understood.

Brice-Olivier Demory

An international team of astronomers, led by the University of Cambridge, has obtained the most detailed ‘fingerprint’ of a rocky planet outside our solar system to date, and found a planet of two halves: one that is almost completely molten, and the other which is almost completely solid.

According to the researchers, conditions on the hot side of the planet are so extreme that they may have caused the atmosphere to evaporate, with the result that conditions on the two sides of the planet vary widely: temperatures on the hot side can reach 2500 degrees Celsius, while temperatures on the cool side are around 1100 degrees. The results are reported in the journal Nature.

Using data from NASA’s Spitzer Space Telescope, the researchers examined a planet known as 55 Cancri e, which orbits a sun-like star located 40 light years away in the Cancer constellation, and have mapped how conditions on the planet change throughout a complete orbit, the first time this has been accomplished for such a small planet.

55 Cancri e is a ‘super Earth’: a rocky exoplanet about twice the size and eight times the mass of Earth, which orbits its parent star so closely that a year lasts just 18 hours. The planet is also tidally locked, meaning that it always shows the same face to its parent star, similar to the Moon, so there is a permanent ‘day’ side and a ‘night’ side. Since it is among the nearest super Earths whose composition can be studied, 55 Cancri e is among the best candidates for detailed observations of surface and atmospheric conditions on rocky exoplanets.
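
A quick back-of-the-envelope check on the figures quoted above: a planet with roughly twice Earth's radius and eight times its mass has about the same bulk density as Earth, which is consistent with a rocky composition. The sketch below is illustrative only and uses rounded values (interpreting "twice the size" as twice the radius).

```python
# Rough density check for 55 Cancri e using the approximate figures in the text.
from math import pi

EARTH_RADIUS_M = 6.371e6
EARTH_MASS_KG = 5.972e24

def bulk_density(mass_kg, radius_m):
    """Mean density = mass / volume of a sphere."""
    return mass_kg / ((4.0 / 3.0) * pi * radius_m ** 3)

rho_earth = bulk_density(EARTH_MASS_KG, EARTH_RADIUS_M)
rho_55cnc_e = bulk_density(8 * EARTH_MASS_KG, 2 * EARTH_RADIUS_M)  # ratio is 8 / 2**3 = 1
print(f"Earth: {rho_earth:.0f} kg/m^3, 55 Cancri e (approx.): {rho_55cnc_e:.0f} kg/m^3")
```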

Uncovering the characteristics of super Earths is difficult, since they are so small compared to the parent star and their contrast relative to the star is extremely small compared to larger, hotter gas giant planets, the so-called ‘hot Jupiters’.

“We haven’t yet found any other planet that is this small and orbits so close to its parent star, and is relatively close to us, so 55 Cancri e offers lots of possibilities,” said Dr Brice-Olivier Demory of the University’s Cavendish Laboratory, the paper’s lead author. “We still don’t know exactly what this planet is made of – it’s still a riddle. These results are like adding another brick to the wall, but the exact nature of this planet is still not completely understood.”

55 Cancri e has been extensively studied since it was discovered in 2011. Based on readings taken at different points in time, it was thought to be a water world, or even made of diamond, but researchers now believe that it is almost completely covered by lava.

“We have entered a new era of atmospheric remote sensing of rocky exoplanets,” said study co-author Dr Nikku Madhusudhan, from the Institute of Astronomy at Cambridge. “It is incredible that we are now able to measure the large scale temperature distribution on the surface of a rocky exoplanet.”

Based on these new infrared measurements, the ‘day’ side of the planet appears to be almost completely molten, while the ‘night’ side is almost completely solid. The heat from the day side is not efficiently circulated to the night side, however. On Earth, the atmosphere aids in the recirculation of heat, keeping the temperature across the whole planet within a relatively narrow range. But on 55 Cancri e, the hot side stays hot, and the cold side stays cold.

According to Demory, one possibility for this variation could be either a complete lack of atmosphere, or one which has been partially destroyed due to the strong irradiation from the nearby host star. “On the day side, the temperature is around 2500 degrees Celsius, while on the night side it’s about 1100 degrees – that’s a huge difference,” he said. “We think that there could still be an atmosphere on the night side, but temperatures on the day side are so extreme that the atmosphere may have evaporated completely, meaning that heat is not being efficiently transferred, or transferred at all from the day side to the night side.”

Another possibility for the huge discrepancy between the day side and the night side may be that the molten lava on the day side moves heat along the surface, but since lava is mostly solid on the night side, heat is not moved around as efficiently.

What is unclear, however, is where exactly the ‘extra’ heat on 55 Cancri e comes from in the first place, since the observations reveal an unknown source of heat that makes the planet hotter than expected from the star’s irradiation alone – but the researchers may have to wait until the next generation of space telescopes is launched to find out.

For Demory, these new readings also show just how difficult it will be to detect a planet that is similar to Earth. The smaller a planet is, the more difficult it is to detect. And once a rocky planet has been found, there is the question of whether it lies in the so-called habitable zone, where life can be supported. “The problem is, people don’t agree on what the habitable zone is,” said Demory. “For example, some studies consider Mars and Venus to be in the habitable zone, but life as we know it is not possible on either of those planets. Understanding the surface and climate properties of these other worlds will eventually allow us to put the Earth’s climate and habitability into context.”

One possibility might be to look at stars which are much cooler and smaller than our sun, such as M dwarfs, which would mean that planets could orbit much closer to their star and still be in the habitable zone. The sizes of such planets relative to their stars would also be larger, which would make them easier to detect from Earth.

But for the time being, Demory and his colleagues plan to keep studying 55 Cancri e, in order to see what other secrets it might hold, including the possibility that it might be surrounded by a torus of gas and dust, which could account for some of the variations in the data. And in 2018, the successor to Hubble and Spitzer, the James Webb Space Telescope, will launch, allowing astronomers to look at planets outside our solar system with entirely new levels of precision.

Reference:
Brice-Olivier Demory et al. ‘A map of the extreme day-night temperature gradient of a super-Earth exoplanet.’ Nature (2016). DOI: 10.1038/nature17169


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

– See more at: http://www.cam.ac.uk/research/news/map-of-rocky-exoplanet-reveals-a-lava-world

From Robot Intelligence To Sex By Numbers: Cambridge Heads For Hay

From robot intelligence to sex by numbers: Cambridge heads for Hay

Source: www.cam.ac.uk

For the eighth year running, the Cambridge Series at the prestigious Hay Festival will showcase a broad range of the University’s research excellence.

Cambridge University nurtures and challenges the world’s greatest minds, and offers the deepest understanding of the most intractable problems and the most thrilling opportunities. And for one week a year they bring that thinking to a field in Wales and share it with everyone. That’s a wonderful gift.

Peter Florence

A record number of Cambridge academics will take part in this year’s Hay Festival, one of the most prestigious literary festivals in the world.

This is the eighth year running that the Series has formed part of the festival. This year it features a range of speakers, from experts on climate change, robotics, maternal health and risk to Classics, European politics, nuclear power, playfulness in education and digital media.

The Series is part of the University of Cambridge’s commitment to public engagement. The Festival runs from 26th May to 5th June and is now open for bookings. Twenty-seven academics from the University of Cambridge and several alumni will be speaking.

This year’s line-up includes Professor Peter Mandler on education and social mobility; Professor Ashley Moffett on immunity in pregnancy; Dame Carol Black, Principal of Newnham College, on addiction, obesity and employment; Professor Susan Gathercole on working memory; Fumiya Iida on robot intelligence; Professor Paul Cartledge on ancient Greek democracy; Professor Eric Wolff on climate change, past, present and future; Professor Jim Huntington on breakthrough research into blood clotting and how the insights are being used to prevent heart attacks and stroke; Topun Austin on the development of the human brain; Kathelijne Koops on what chimpanzees and bonobos can tell us about human culture; Suman-Lata Sahonta on LEDs; and Giles Yeo on genetic predisposition to obesity. Dr Yeo will be presenting a BBC Horizon programme on his research in June. Neuroscientist Hannah Critchlow also returns after being singled out as one of the highlights of Hay 2015.

In addition, there will be a series of discussions: Sharath Srinivasan, Director of the Centre of Governance and Human Rights, will be joined by blogger, technologist and social entrepreneur Marieme Jamme and Rob Burnet, CEO and Founder of Well Told Story, to talk about Africa’s digital revolution. David Whitebread, Jenny Gibson and Sara Baker from the PEDAL Research Centre will ask if the consequences of curtailing play, in schools, at home and in the outdoors, could be catastrophic for healthy child development. Madeline Abbas, Chris Bickerton and Katharina Karcher will debate the future of Europe. And theatre director and academic Zoe Svendsen and journalist and economist Paul Mason will explore the theatricality of capitalism through examining what an economic analysis of Shakespeare’s plays might tell us about character and how the human is represented. They are collaborating on a research and development project at the Young Vic Theatre.

Several of the speakers have new books out – Dame Fiona Reynolds, Master of Emmanuel College, will discuss the fight for beauty; Professor David Spiegelhalter will address the statistics of sexual behaviour and whether we can believe them; Professor Paul Murdin will speak about his book on the landscapes of other worlds as imaged close-up by space probes; Simon Taylor will discuss the strange rebirth of nuclear power in Britain; and Matt Wilkinson will explain how the need to move has driven the evolution of life on Earth.  Jennifer Wallace, author of the novel Digging up Milton, will be joined by Professor Adrian Poole to discuss literary celebrity in the 18th and 19th centuries. Chris Bickerton’s book The European Union: a citizen’s guide is out in June.

Also taking part in the Festival from the University of Cambridge are  Professor Richard Evans, Professor Tim Whitmarsh and Dr Christine Corton.

Peter Florence, director of the Hay Festival, said: “Cambridge University nurtures and challenges the world’s greatest minds, and offers the deepest understanding of the most intractable problems and the most thrilling opportunities. And for one week a year they bring that thinking to a field in Wales and share it with everyone. That’s a wonderful gift.”

Dane Comerford, head of public engagement at the University of Cambridge, said: “The Cambridge series is a fantastic way to share fascinating research from the University with the public. The Hay Festival draws an international cross-section of people, from policy makers to prospective university students. We have found that Hay audiences are highly interested in the diversity of Cambridge speakers, and ask some great questions. We look forward to another wonderful series of speakers, with talks and debates covering so many areas of research and key ideas emerging from Cambridge, relevant to key issues faced globally today.”

To book tickets go to www.hayfestival.org. For the full line-up of the Cambridge Series and times, click here.


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

– See more at: http://www.cam.ac.uk/research/news/from-robot-intelligence-to-sex-by-numbers-cambridge-heads-for-hay

Early-Stage Embryos With Abnormalities May Still Develop Into Healthy Babies

Early-stage embryos with abnormalities may still develop into healthy babies

source: www.cam.ac.uk

Abnormal cells in the early embryo are not necessarily a sign that a baby will be born with a birth defect such as Down’s syndrome, suggests new research carried out in mice at the University of Cambridge. In a study published today in the journal Nature Communications, scientists show that abnormal cells are eliminated and replaced by healthy cells, repairing – and in some cases completely fixing – the embryo.

What does it mean if a quarter of the cells from the placenta carry a genetic abnormality – how likely is it that the child will have cells with this abnormality, too? This is the question we wanted to answer

Magdalena Zernicka-Goetz

Researchers at the Department of Physiology, Development and Neuroscience at Cambridge report a mouse model of aneuploidy, where some cells in the embryo contain an abnormal number of chromosomes. Normally, each cell in the human embryo should contain 23 pairs of chromosomes (22 pairs of autosomes and one pair of sex chromosomes), but some cells can carry extra copies of particular chromosomes, which can lead to developmental disorders. For example, children born with three copies of chromosome 21 will develop Down’s syndrome.

Pregnant mothers – particularly older mothers, whose offspring are at greatest risk of developing such disorders – are offered tests to predict the likelihood of genetic abnormalities. Between the 11th and 14th weeks of pregnancy, mothers may be offered chorionic villus sampling (CVS), a test that involves removing and analysing cells from the placenta. A later test, known as amniocentesis, involves analysing cells shed by the foetus into the surrounding amniotic fluid – this test is more accurate, but is usually carried out during weeks 15-20 of the pregnancy, when the foetus is further developed.

Professor Magdalena Zernicka-Goetz, the study’s senior author, was inspired to carry out the research following her own experience when pregnant with her second child. “I am one of the growing number of women having children over the age of 40 – I was pregnant with my second child when I was 44,” says Professor Zernicka-Goetz.

At the time, a CVS test found that as many as a quarter of the cells in the placenta that joined her and her developing baby were abnormal: could the developing baby also have abnormal cells? When Professor Zernicka-Goetz spoke to geneticists about the potential implications, she found that very little was understood about the fate of embryos containing abnormal cells and about the fate of these abnormal cells within the developing embryos.

Fortunately for Professor Zernicka-Goetz, her son, Simon, was born healthy. “I know how lucky I was and how happy I felt when Simon was born healthy,” she says.

“Many expectant mothers have to make a difficult choice about their pregnancy based on a test whose results we don’t fully understand,” says Professor Zernicka-Goetz. “What does it mean if a quarter of the cells from the placenta carry a genetic abnormality –  how likely is it that the child will have cells with this abnormality, too? This is the question we wanted to answer. Given that the average age at which women have their children is rising, this is a question that will become increasingly important.”

“In fact, abnormal cells with numerical and/or structural anomalies of chromosomes have been observed in as many as 80-90% of human early stage embryos following in vitro fertilization,” says Professor Thierry Voet from the Wellcome Trust Sanger Institute, UK, and the University of Leuven, Belgium, another senior author of this paper, “and CVS tests may expose some degree of these abnormalities.”

In research funded by the Wellcome Trust, Professor Zernicka-Goetz and colleagues developed a mouse model of aneuploidy by mixing 8-cell stage mouse embryos in which the cells were normal with embryos in which the cells were abnormal. Abnormal mouse embryos are relatively unusual, so the team used a molecule known as reversine to induce aneuploidy.

In embryos where the mix of normal and abnormal cells was half and half, the researchers observed that the abnormal cells within the embryo were killed off by ‘apoptosis’, or programmed cell death, even when placental cells retained abnormalities. This allowed the normal cells to take over, resulting in an embryo where all the cells were healthy. When the mix of cells was three abnormal cells to one normal cell, some of the abnormal cells continued to survive, but the proportion of normal cells increased.

“The embryo has an amazing ability to correct itself,” explains Professor Zernicka-Goetz. “We found that even when half of the cells in the early stage embryo are abnormal, the embryo can fully repair itself. If this is the case in humans, too, it will mean that even when early indications suggest a child might have a birth defect because there are some, but importantly not all abnormal cells in its embryonic body, this isn’t necessarily the case.”

The researchers will now try to determine the exact proportion of healthy cells needed to completely repair an embryo and the mechanism by which the abnormal cells are eliminated.

Reference
Bolton, H et al. Mouse model of chromosome mosaicism reveals lineage-specific depletion of aneuploid cells and normal developmental potential. Nature Communications; 26 March 2016; DOI: 10.1038/ncomms11165


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

– See more at: http://www.cam.ac.uk/research/news/early-stage-embryos-with-abnormalities-may-still-develop-into-healthy-babies

Solar Cell Material Can Recycle Light to Boost Efficiency

Solar cell material can recycle light to boost efficiency

source: www.cam.ac.uk

Perovskite materials can recycle light particles – a finding which could lead to a new generation of affordable, high-performance solar cells.

It’s a massive demonstration of the quality of this material and opens the door to maximising the efficiency of solar cells

Felix Deschler

Scientists have discovered that a highly promising group of materials known as hybrid lead halide perovskites can recycle light – a finding that they believe could lead to large gains in the efficiency of solar cells.

Hybrid lead halide perovskites are a particular group of synthetic materials which have been the subject of intensive scientific research, as they appear to promise a revolution in the field of solar energy. As well as being cheap and easy to produce, perovskite solar cells have, in the space of a few years, become almost as energy-efficient as silicon – the material currently used in most household solar panels.

By showing that they can also be optimised to recycle light, the new study suggests that this could just be the beginning. Solar cells work by absorbing photons from the sun to create electrical charges, but the process also works in reverse, because when the electrical charges recombine, they can create a photon. The research shows that perovskite cells have the extra ability to re-absorb these regenerated photons – a process known as “photon recycling”. This creates a concentration effect inside the cell, as if a lens has been used to focus lots of light in a single spot.
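
As a purely illustrative, back-of-the-envelope sketch (not the model used in the study), photon recycling can be pictured as a geometric series: if every recombination event re-emits a photon, and each emitted photon is re-absorbed within the film with probability p, then the expected number of excitation events per photon originally absorbed is 1 + p + p² + … = 1/(1 − p). The short Python snippet below simply evaluates that series for a few invented values of p.

    # Toy illustration of photon recycling (hypothetical numbers, not taken from the study).
    # Assumes, unrealistically, that every recombination is radiative and that each emitted
    # photon is re-absorbed inside the film with probability p_reabsorb.
    def recycling_factor(p_reabsorb: float) -> float:
        """Expected excitation events per absorbed photon: 1 + p + p^2 + ... = 1 / (1 - p)."""
        if not 0.0 <= p_reabsorb < 1.0:
            raise ValueError("re-absorption probability must be in [0, 1)")
        return 1.0 / (1.0 - p_reabsorb)

    for p in (0.0, 0.5, 0.8):
        print(f"p = {p:.1f} -> {recycling_factor(p):.2f}x excitation events per absorbed photon")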

According to the researchers, this ability to recycle photons could be exploited with relative ease to create cells capable of pushing the limits of energy efficiency in solar panels.

The study builds on an established collaboration, focusing on the use of these materials not only in solar cells but also in light-emitting diodes, and was carried out in the group of Richard Friend, Cavendish Professor of Physics and Fellow of St John’s College at the University of Cambridge. The research was undertaken in partnership with the team of Henry Snaith at the University of Oxford and Bruno Ehrler at the FOM Institute, AMOLF, Amsterdam.

Felix Deschler, who is one of the corresponding authors of the study and works with a team studying perovskites at the Cavendish Laboratory, said: “It’s a massive demonstration of the quality of this material and opens the door to maximising the efficiency of solar cells. The fabrication methods that would be required to exploit this phenomenon are not complicated, and that should boost the efficiency of this technology significantly beyond what we have been able to achieve until now.”

Perovskite-based solar cells were first tested in 2012, and were so successful that in 2013, Science Magazine rated them one of the breakthroughs of the year.

Since then, researchers have made rapid progress in improving the efficiency with which these cells convert light into electrical energy. Recent experiments have produced power conversion efficiencies of around 20% – a figure already comparable with silicon cells.

By showing that perovskite-based cells can also recycle photons, the new research suggests that they could reach efficiencies well beyond this.

The study, which is reported in Science, involved shining a laser on to one part of a 500 nanometre-thick sample of lead-iodide perovskite. Perovskites emit light when light falls on them, so the team was able to measure photon activity inside the sample based on the light it emitted.

Close to where the laser light had shone on to the film, the researchers detected a near-infrared light emission. Crucially, however, this emission was also detected further away from the point where the laser hit the sample, together with a second emission composed of lower-energy photons.

“The low-energy component enables charges to be transported over a long distance, but the high-energy component could not exist unless photons were being recycled,” Luis Miguel Pazos Outón, lead author on the study, said. “Recycling is a quality that materials like silicon simply don’t have. This effect concentrates a lot of charges within a very small volume. These are produced by a combination of incoming photons and those being made within the material itself, and that’s what enhances its energy efficiency.”

As part of the study, Pazos Outón also manufactured the first demonstration of a perovskite-based back-contact solar cell. This single cell proved capable of transporting an electrical current more than 50 micrometres away from the contact point with the laser; a distance far greater than the researchers had predicted, and a direct result of multiple photon recycling events taking place within the sample.

The researchers now believe that perovskite solar cells may be able to reach considerably higher efficiencies than they have to date. “The fact that we were able to show photon recycling happening in our own cell, which had not been optimised to produce energy, is extremely promising,” Richard Friend, a corresponding author, said. “If we can harness this it would lead to huge gains in terms of energy efficiency.”


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

 

Quantum Effects At Work In The World’s Smelliest Superconductor

Quantum effects at work in the world’s smelliest superconductor

source: www.cam.ac.uk

Researchers have found that quantum effects are the reason that hydrogen sulphide – which has the distinct smell of rotten eggs – behaves as a superconductor at record-breaking temperatures, a finding which may aid in the search for room temperature superconductors.

That we are able to make quantitative predictions with such a good agreement with the experiments is exciting and means that computation can be confidently used to accelerate the discovery of high temperature superconductors.

Chris Pickard

The quantum behaviour of hydrogen affects the structural properties of hydrogen-rich compounds, which are possible candidates for the elusive room temperature superconductor, according to new research co-authored at the University of Cambridge.

New theoretical results, published online in the journal Nature, suggest that the quantum nature of hydrogen – meaning that it can behave like a particle or a wave – strongly affects the recently discovered hydrogen sulphur superconductor, a compound that when subjected to extremely high pressure, is the highest-temperature superconductor yet identified. This new step towards understanding the underlying physics of high temperature superconductivity may aid in the search for a room temperature superconductor, which could be used for applications such as levitating trains, lossless electrical grids and next-generation supercomputers.

Superconductors are materials that carry electrical current with zero electrical resistance. Low-temperature, or conventional, superconductors were first identified in the early 20th century, but they need to be cooled close to absolute zero (zero degrees on the Kelvin scale, or -273 degrees Celsius) before they start to display superconductivity. For the past century, researchers have been searching for materials that behave as superconductors at higher temperatures, which would make them more suitable for practical applications. The ultimate goal is to identify a material which behaves as a superconductor at room temperature.

Last year, German researchers identified the highest temperature superconductor yet – hydrogen sulphide, the same compound that gives rotten eggs their distinctive odour. When subjected to extreme pressure – about one million times higher than the Earth’s atmospheric pressure – this stinky compound displays superconducting behaviour at temperatures as high as 203 Kelvin (-70 degrees Celsius), which is far higher than any other high temperature superconductor yet discovered.

Since this discovery, researchers have attempted to understand what it is about hydrogen sulphide that makes it capable of superconducting at such high temperatures. Now, new theoretical results suggest that the quantum behaviour of hydrogen may be the reason, as it changes the structure of the chemical bonds between atoms. The results were obtained by an international collaboration of researchers led by the University of the Basque Country and the Donostia International Physics Center, and including researchers from the University of Cambridge.

The behaviour of objects in our daily life is governed by classical, or Newtonian, physics. If an object is moving, we can measure both its position and momentum, to determine where an object is going and how long it will take to get there. The two properties are inherently linked.

However, in the strange world of quantum physics, things are different. According to a rule known as Heisenberg’s uncertainty principle, pairs of linked properties such as position and momentum cannot both be pinned down precisely at the same time: the more precisely one is measured, the more uncertain the other becomes.
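
In its standard position–momentum form (a textbook relation quoted here for context, not a result from the study), the principle can be written as

    \Delta x \, \Delta p \;\geq\; \frac{\hbar}{2}

where Δx and Δp are the uncertainties in a particle’s position and momentum, and ħ is the reduced Planck constant.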

Hydrogen, being the lightest element of the periodic table, is the atom most strongly subjected to quantum behaviour. Its quantum nature affects structural and physical properties of many hydrogen compounds. An example is high-pressure ice, where quantum fluctuations of the proton lead to a change in the way that the molecules are held together, so that the chemical bonds between atoms become symmetrical.

The researchers behind the current study believe that a similar quantum hydrogen-bond symmetrisation occurs in the hydrogen sulphide superconductor.

Theoretical models that treat hydrogen atoms as classical particles predict that at extremely high pressures – even higher than those used by the German researchers for their record-breaking superconductor – the atoms sit exactly halfway between two sulphur atoms, making a fully symmetrical structure. However, at lower pressures, hydrogen atoms move to an off-centre position, forming one shorter and one longer bond.

The researchers have found that when considering the hydrogen atoms as quantum particles behaving like waves, they form symmetrical bonds at much lower pressures – around the same as those used for the German-led experiment, meaning that quantum physics, and symmetrical hydrogen bonds, were behind the record-breaking superconductivity.

“That we are able to make quantitative predictions with such a good agreement with the experiments is exciting and means that computation can be confidently used to accelerate the discovery of high temperature superconductors,” said study co-author Professor Chris Pickard of Cambridge’s Department of Materials Science & Metallurgy.

According to the researchers’ calculations, the quantum symmetrisation of the hydrogen bond has a tremendous impact on the vibrational and superconducting properties of hydrogen sulphide. “In order to theoretically reproduce the observed pressure dependence of the superconducting critical temperature the quantum symmetrisation needs to be taken into account,” said the study’s first author, Ion Errea, from the University of the Basque Country and Donostia International Physics Center.

The discovery of such a high temperature superconductor suggests that room temperature superconductivity might be possible in other hydrogen-rich compounds. The current theoretical study shows that in all these compounds, the quantum motion of hydrogen can strongly affect the structural properties, even modifying the chemical bonding, and the electron-phonon interaction that drives the superconducting transition.

“Theory and computation have played an important role in the hunt for superconducting hydrides under extreme compression,” said Pickard. “The challenges for the future are twofold – increasing the temperature towards room temperature, but, more importantly, dramatically reducing the pressures required.”

Reference:
Ion Errea et al. ‘Quantum hydrogen-bond symmetrization in the superconducting hydrogen sulfide system.’ Nature (2016). DOI: 10.1038/nature17175.


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Spectral Edge Announces Successful £1.5m Funding Round

Spectral Edge announces successful £1.5m funding round

source: www.realwire.com

Cambridge-based image fusion pioneer attracts major backing to commercialise product portfolio

Cambridge, 22nd March 2016 – Spectral Edge (http://www.spectraledge.co.uk/) today announced the successful completion of an oversubscribed £1.5 million second funding round. New lead investors IQ Capital and Parkwalk Advisors, along with angel investors from Cambridge Angels, Wren Capital, Cambridge Capital Group and Martlet, the Marshall of Cambridge Corporate Angel investment fund, join the Rainbow Seed Fund/Midven and Iceni in backing the company.

Spectral Edge Phusion

Spun out of the University of East Anglia (UEA) Colour Lab, Spectral Edge has developed innovative image fusion technology. This combines different types of image, ranging from the visible to invisible (such as infrared and thermal), to enhance detail, aid visual accessibility, and create ever more beautiful pictures.
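
As a rough illustration of the general idea only (this is not Spectral Edge’s proprietary Phusion algorithm, and the arrays, weights and function below are invented for the example), fusing a visible frame with a near-infrared frame can be sketched in Python as blending the infrared signal into the visible image’s luminance:

    # Naive visible/near-infrared image fusion sketch (hypothetical example only,
    # not Spectral Edge's method). Requires NumPy.
    import numpy as np

    def fuse(visible_rgb, nir, weight=0.3):
        """visible_rgb: HxWx3 array in [0, 1]; nir: HxW array in [0, 1]."""
        luminance = visible_rgb.mean(axis=2)                  # crude luminance estimate
        fused_lum = (1 - weight) * luminance + weight * nir   # blend in infrared detail
        scale = np.divide(fused_lum, luminance,
                          out=np.ones_like(luminance), where=luminance > 0)
        return np.clip(visible_rgb * scale[..., None], 0.0, 1.0)

    # Tiny made-up frames: a flat grey visible image and an NIR frame with extra contrast.
    visible = np.full((2, 2, 3), 0.5)
    nir = np.array([[0.9, 0.1], [0.5, 0.5]])
    print(fuse(visible, nir))

Real systems work on registered, calibrated sensor data and preserve colour far more carefully; the point of the sketch is simply that detail invisible to the eye can be folded into an ordinary colour image.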

Spectral Edge’s Phusion technology platform has already been proven in the visual accessibility market, where independent studies have shown that it can transform the TV viewing experience for the estimated 4% of the world’s population that suffers from colour-blindness. It enhances live TV and video, allowing colour-blind viewers to differentiate between colour combinations such as red-green and pink-grey so that otherwise inaccessible content such as sport can be enjoyed.

The new funding will be used to expand Spectral Edge’s team, increase investment in sales and marketing, and underpin development of its product portfolio into IP-licensable products and reference designs. Spectral Edge is mainly targeting computational photography, where blending near-infrared and visible images gives higher quality, more beautiful results with greater depth. Other applications include security, where the combination of visible and thermal imaging enhances details to provide easier identification of people filmed on surveillance cameras, as well as visual accessibility through its Eyeteq brand.

“Spectral Edge is a true pioneer in the field of photography. They are set to disrupt and transform the imaging sector, not just within consumer and professional photography, but also across a broad range of business sectors,” said Max Bautin, Managing Partner at IQ Capital. “Backed by a robust catalogue of IP, Spectral Edge’s technology enables individuals and companies to take pictures and record videos with unparalleled detail by taking advantage of non-visible information like near-infra red and heat. We are proud to add Spectral Edge to our portfolio of companies. We back cutting-edge IP-rich technology which pushes the boundaries but also has a proven track record of experiencing stable growth, and Spectral Edge fits that mould perfectly.”

“We are delighted to support Professor Graham Finlayson and his team at Spectral Edge,” said Alastair Kilgour, CIO of Parkwalk Advisors. “We believe Phusion could prove to be a substantial enhancement to the quality of digital imaging and as such have significant commercial prospects.”

Spectral Edge is led by an experienced team that combines deep technical and business experience. It includes Professor Graham Finlayson, Head of Vision Group and Professor of Computing Science, UEA, Christopher Cytera (managing director) and serial entrepreneur Dr Robert Swann (chairman).

“After having proved the potential of our innovative Phusion technology, this new funding provides Spectral Edge with a springboard for growth,” said Christopher Cytera, Managing Director, Spectral Edge. “The significant investment from IQ Capital, Parkwalk Advisors, Midven, Iceni and Cambridge angel groups demonstrates their faith in our technology, approach and overall strategy. We can now accelerate commercialisation of our intellectual property portfolio and grow by licensing our technology to consumer electronics manufacturers and service providers in our key markets of computational photography, visual accessibility and security.”

-ends-

About Spectral Edge
Formed in February 2011, Spectral Edge is a spin-out company of the Colour Group of the School of Computing Sciences at the University of East Anglia (United Kingdom). It operates from offices in Cambridge.

Spectral Edge Phusion technology enhances images and video by using information outside the normal visible spectrum or applying transformations to that within it. Applications range from computational photography, security and consumer applications such as enhancing TV pictures to improve content accessibility.

Website: http://www.spectraledge.co.uk/
Pictures & Media Pack: http://www.spectraledge.co.uk/about/media-pack
LinkedIn: https://www.linkedin.com/company/spectral-edge-ltd/
Facebook: https://www.facebook.com/SpectralEdge
Twitter: @SpectralEdgeLtd
Email: pr@spectraledge.co.uk

About IQ Capital
IQ Capital is a UK focused venture capital investor, based in Cambridge and London. We invest in B2B software including machine learning & AI, data analytics, cyber security, AdTech, FinTech and e-health, as well as embedded systems and robotics. Recent exits include trade sales to Google, Apple, Becton Dickinson and Huawei.

IQ Capital always invests alongside an experienced, sector-expert entrepreneur who has recently made a significant exit in the same sector and who now has the right skills, mind-set and motivation to support an early stage business. We are currently investing from our 2015 IQ Capital Fund II which is actively looking for new investment opportunities.

About Parkwalk Advisors
Parkwalk is an independent investment firm dedicated to providing clients with access to some of the most exciting deal-flow emanating from British R&D intensive institutions and Universities. Parkwalk invests in, and raises capital for, innovative UK technology companies. The Funds are investment-driven venture capital funds, seeking capital appreciation. Parkwalk portfolio companies all have deeply-embedded IP and commercial potential, and range from early stage seed capital, through development and commercial capital to AIM-listed investments. In the last 12 months Parkwalk has invested over £20m into this investment strategy.

More information can be found at www.parkwalkadvisors.com

About Wren Capital
Wren Capital, whose managing partner is Rajat Malhotra (UK Business Angels Association’s Angel Investor of the Year for 2013), specialises in early stage investing across science, engineering and software. Our involvement is tailored to the needs of each business and we aim to be a supportive, value-adding investor. We generally follow a co-investment model and have links with a number of universities, business schools, angel networks and a trusted network of high quality investors who share our investment philosophy.

For more information please see: www.wrencapital.co.uk

For more information:
Chris Measures (PR for Spectral Edge)
+44 7976 535147
chris@measuresconsulting.com

XAAR Launches New Family Of Printheads

Xaar launches new family of printheads

22nd March 2016 – Xaar, the world leader in industrial inkjet technology, is pleased to announce the launch of the Xaar 1003 family of printheads.

The introduction of the Xaar 1003 is in line with the Company’s 2020 vision, recently outlined at the Full Year results, and reiterates Xaar’s commitment to investing significantly in Research & Development.

The Xaar 1003 sets a new benchmark for industrial inkjet printing, with new upgrades allowing higher productivity, greater versatility and all-round superior performance compared with the previous Xaar 1001 and 1002 models. These upgrades include:

— The XaarGuard™, the ultimate in nozzle plate protection, providing the longest maintenance-free production runs in the industry*.

— A step forward in consistent print quality across the wide print widths used in many single-pass applications, due to Xaar’s new X-ACT™ Micro Electro Mechanical Systems (MEMS) manufacturing process.

Like its predecessors, the new Xaar 1003 family of printheads combines Xaar’s unique TF Technology® with Xaar’s Hybrid Side Shooter® architecture so that ink is recirculated directly past the back of the nozzle during drop ejection at the highest flow rates in the industry. This ensures that the printhead operates reliably even in the harshest industrial environments and also in horizontal and vertical (skyscraper) jetting modes. Ink is in constant circulation, preventing sedimentation and subsequent blocking of the nozzles when jetting.

In response to market demand, the Xaar 1003 will be available in three variants. The Xaar 1003 GS12 (rich colours or higher speeds) for ceramics applications is first to be launched, closely followed by the Xaar 1003 GS6 (for fine detail) and the Xaar 1003 GS40 (for special effects). The other variants for UV applications will also be available later in the first half of this year.

Gillian Ewers, Xaar’s Director of Marketing, said:

“We are delighted to introduce the new and exciting Xaar 1003 printhead family to the market. This launch is further evidence of our commitment to our customers, and to ensuring Xaar remains at the leading edge of single pass industrial inkjet printing.”

FDA 510(k) Clearance Granted For PneumaCare’s Ground-Breaking Thora-3DI™ System For Non-Contact Respiratory Measurement

FDA 510(k) Clearance Granted For PneumaCare’s Ground-Breaking Thora-3DI™ System For Non-Contact Respiratory Measurement


We are pleased to write to you with some very exciting news about PneumaCare Ltd (www.pneumacare.com). Please see the attached Press Release issued by the Company.

Thora-3DI™ is a non-invasive, non-contact device that uses a patented technology known as Structured Light Plethysmography (SLP) to measure breathing through detection of movement of the chest and abdomen. The technology can be used to accurately measure respiratory status in patients with a wide range of respiratory conditions, including asthma, chronic obstructive pulmonary disease (COPD), pneumonia and lung failure, and to assess patients before and after surgery. The SLP technology uses safe white light to project a grid pattern onto the chest, and record accurate 3D images of chest wall movements over time. The measurements are converted into visual and numerical outputs, which can help clinicians to make faster diagnoses and treatment decisions, and continually monitor patients in real time, without direct patient contact or intervention. The Thora-3DI™ is mobile, and can easily be moved between wards, or dismantled for transport and use in the community or in clinics.

We would be delighted to discuss any aspect of our business and products with you in light of this great development for the Company.

Read more about PneumaCare and Thora-3DI™ here:

 

No Evidence That Genetic Tests Change People’s Behaviour

No evidence that genetic tests change people’s behaviour

source: www.cam.ac.uk

Genetic tests that provide an estimate of an individual’s risk of developing diseases such as lung cancer and heart disease do not appear to motivate a change in behaviour to reduce the risk, according to a study led by the University of Cambridge and published in The BMJ today.

Expectations have been high that giving people information about their genetic risk will empower them to change their behaviour, but we have found no evidence that this is the case

Theresa Marteau

Researchers at the Behaviour and Health Research Unit analysed a number of studies that looked at whether testing an individual’s DNA for genetic variants that increased their risk of developing so-called ‘common complex diseases’ influenced their health-related behaviour. Complex diseases are those such as heart disease, most cancers and diabetes, where no single gene causes the disease, but rather it is the interaction of dozens – possibly hundreds – of genes together with an individual’s environment and behaviour that leads to the disease.

Genome sequencing – reading an individual’s entire DNA – has opened up the potential to provide individuals with information on whether or not they carry genes known to increase their risk of disease. Such tests are controversial – knowing that an individual carries these variants does not mean that individual will develop the disease; however, proponents argue that if an individual knows that he or she is at a greater risk of a particular disease, they can make an informed decision about whether or not to change their behaviour.

In the early 2000s, several companies launched direct-to-consumer tests for a range of common complex disorders, and these tests continue to be sold in Canada, the United Kingdom, and other European countries. In 2013 in the United States, the Food and Drug Administration ordered the company 23andMe to stop selling its health-related testing kits because of concerns about their accuracy and usefulness, but in October 2015 the company resumed selling some health-related services.

The Cambridge researchers examined over 10,000 abstracts from relevant studies and identified from these 18 studies that matched their criteria for inclusion in their analysis. By compiling the data, they found that informing individuals of their genetic risk had little or no effect on their health-related behaviour, particularly for smoking cessation and physical activity.

Professor Theresa Marteau, who led the study, says: “Expectations have been high that giving people information about their genetic risk will empower them to change their behaviour – to eat more healthily or to stop smoking, for example – but we have found no evidence that this is the case. But nor does the evidence support concerns that such information might demotivate people and discourage them from changing their behaviour.”

However, the researchers recognise that DNA testing may still play a role in improving people’s health. “DNA testing, alone or in combination with other assessments of disease risk, may help clinicians identify individuals at greatest risk and allow them to target interventions such as screening tests, surgery, and drug treatments,” explains co-author Dr Gareth Hollands.

The team argue that these results are consistent with other evidence that risk communication typically has at best only a small effect on health behaviour.

The study was funded by the Medical Research Council and the National Institute for Health Research.

Reference
Hollands, GJ et al. The impact of communicating genetic risks of disease on risk-reducing health behaviour: systematic review with meta-analysis. BMJ; 15 March 2016; DOI: 10.1136/bmj.i1102


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

– See more at: http://www.cam.ac.uk/research/news/no-evidence-that-genetic-tests-change-peoples-behaviour

Researchers Identify When Parkinson’s Proteins Become Toxic To Brain Cells

Researchers identify when Parkinson’s proteins become toxic to brain cells

Source: www.cam.ac.uk

Observation of the point at which proteins associated with Parkinson’s disease become toxic to brain cells could help identify how and why people develop the disease, and aid in the search for potential treatments.

The damage appears to be done before visible fibrils are even formed.

Dorothea Pinotsi

Researchers have used a non-invasive method of observing how the process leading to Parkinson’s disease takes place at the nanoscale, and identified the point in the process at which proteins in the brain become toxic, eventually leading to the death of brain cells.

The results suggest that the same protein can either cause, or protect against, the toxic effects that lead to the death of brain cells, depending on the specific structural form it takes, and that toxic effects take hold when there is an imbalance of the level of protein in its natural form in a cell. The work could help unravel how and why people develop Parkinson’s, and aid in the search for potential treatments. The study is published in the journal Proceedings of the National Academy of Sciences.

Using super-resolution microscopy, researchers from the University of Cambridge were able to observe the behaviour of different types of alpha-synuclein, a protein closely associated with Parkinson’s disease, in order to find how it affects neurons, and at what point it becomes toxic.

Parkinson’s disease is one of a number of neurodegenerative diseases caused when naturally occurring proteins fold into the wrong shape and stick together with other proteins, eventually forming thin filament-like structures called amyloid fibrils. These amyloid deposits of aggregated alpha-synuclein, also known as Lewy bodies, are the hallmark of Parkinson’s disease.

Parkinson’s disease is the second-most common neurodegenerative disease worldwide (after Alzheimer’s disease). Close to 130,000 people in the UK, and more than seven million worldwide, have the disease. Symptoms include muscle tremors, stiffness and difficulty walking. Dementia is common in later stages of the disease.

“What hasn’t been clear is whether once alpha-synuclein fibrils have formed they are still toxic to the cell,” said Dr Dorothea Pinotsi of Cambridge’s Department of Chemical Engineering and Biotechnology, the paper’s first author.

Pinotsi and her colleagues from Cambridge’s Department of Chemical Engineering & Biotechnology and Department of Chemistry, and led by Dr Gabriele Kaminski Schierle, have used optical ‘super-resolution’ techniques to look into live neurons without damaging the tissue. “Now we can look at how proteins associated with neurodegenerative conditions grow over time, and how these proteins come together and are passed on to neighbouring cells,” said Pinotsi.

The researchers used different forms of alpha-synuclein and observed their behaviour in neurons from rats. They were then able to correlate what they saw with the amount of toxicity that was present.

They found that when they added alpha-synuclein fibrils to the neurons, they interacted with alpha-synuclein protein that was already in the cell, and no toxic effects were present.

“It was believed that amyloid fibrils that attack the healthy protein in the cell would be toxic to the cell,” said Pinotsi. “But when we added a different, soluble form of alpha-synuclein, it didn’t interact with the protein that was already present in the neuron and interestingly this was where we saw toxic effects and cells began to die. So somehow, when the soluble protein was added, it created this toxic effect. The damage appears to be done before visible fibrils are even formed.”

The researchers then observed that by adding the soluble form of alpha-synuclein together with amyloid fibrils, the toxic effect of the former could be overcome. It appeared that the amyloid fibrils acted like magnets for the soluble protein and mopped up the soluble protein pool, shielding against the associated toxic effects.

“These findings change the way we look at the disease, because the damage to the neuron can happen when there is simply extra soluble protein present in the cell – it’s the excess amount of this protein that appears to cause the toxic effects that lead to the death of brain cells,” said Pinotsi. Extra soluble protein can be caused by genetic factors or ageing, although there is some evidence that it could also be caused by trauma to the head.

The research shows how important it is to fully understand the processes at work behind neurodegenerative diseases, so that the right step in the process can be targeted.

“With these optical super-resolution techniques, we can really see details we couldn’t see before, so we may be able to counteract this toxic effect at an early stage,” said Pinotsi.

The research was funded by the Medical Research Council, the Engineering and Physical Sciences Research Council, and the Wellcome Trust.

Reference:
Dorothea Pinotsi et al. ‘Nanoscopic insights into seeding mechanisms and toxicity of α-synuclein species in neurons.’ Proceedings of the National Academy of Sciences (2016). DOI: 10.1073/pnas.1516546113


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

– See more at: http://www.cam.ac.uk/research/news/researchers-identify-when-parkinsons-proteins-become-toxic-to-brain-cells

‘Good’ Cholesterol Doesn’t Always Lower Heart Attack Risk

‘Good’ cholesterol doesn’t always lower heart attack risk

source: www.cam.ac.uk

Some people with high levels of ‘good’ high density lipoprotein cholesterol (HDL-C) are at increased risk of coronary heart disease, contrary to earlier evidence that people with more HDL-C are usually at lower heart disease risk. This finding comes from an international study involving researchers at the University of Cambridge, funded by the British Heart Foundation (BHF).

Large-scale collaborative research like this paves the way for further studies of rare mutations that might be significantly increasing people’s risk of a deadly heart attack

Adam Butterworth

The discovery, published today in Science, could move researchers away from potentially ineffective HDL-raising drugs for treating coronary heart disease, and towards the development of new treatments that help reduce people’s risk of heart attack.

The researchers studied people with a rare genetic mutation in the SCARB1 gene, called the P376L variant, which causes the body to have high levels of ‘good’ HDL-C. High levels of ‘good’ cholesterol are commonly associated with reduced risk for coronary heart disease. Challenging this view, the researchers unexpectedly found that people with the rare mutation, who had increased levels of HDL-C, had an 80 per cent increased relative risk of the disease – a figure almost equivalent to the increased risk caused by smoking.

Coronary heart disease is responsible for nearly 70,000 deaths every year, almost entirely through heart attacks, making it the UK’s single biggest killer. The disease involves the build-up of fatty material, or plaque, in the coronary artery walls. If large quantities accumulate in the vessel walls, blood flow to the heart can become restricted or blocked, increasing risk of a heart attack.

The international team of scientists included BHF-funded researchers Professor Sir Nilesh Samani at the University of Leicester and Professor John Danesh at the University of Cambridge. They initially looked at the DNA of 328 individuals with very high levels of HDL-C in the blood and compared them to 398 people with relatively low HDL-C. As the P376L variant they found was so rare, they then looked at its effects on HDL-C and heart disease in more than half a million additional people.

Dr Adam Butterworth, from the Cardiovascular Epidemiology Unit,  University of Cambridge,  and co-investigator of this study, said: “We found that people carrying a rare genetic mutation causing higher levels of the so-called ‘good’ HDL-cholesterol are, unexpectedly, at greater risk of heart disease. This discovery could lead to new drugs that improve the processing of HDL-C to prevent devastating heart attacks.

“Large-scale collaborative research like this paves the way for further studies of rare mutations that might be significantly increasing people’s risk of a deadly heart attack. These discoveries also give researchers the knowledge we need to develop better treatments.”

Professor Peter Weissberg, Medical Director at the BHF, added: “This is an important study that sheds light on one of the major puzzles relating to cholesterol and heart disease, which is that despite strong evidence showing HDL-C reduces heart disease risk, clinical trials on the effects of HDL-C-raising drugs have been disappointing.

“These new findings suggest that the way in which HDL-C is handled by the body is more important in determining risk of a heart attack than the levels of HDL-C in the blood. Only by understanding the underlying biology that links HDL-C with heart attacks can we develop new treatments to prevent them. These unexpected findings pave the way for further research into the SCARB1 pathway to identify new treatments to reduce heart attacks in the future.”

Additional funding for the study in the USA came from the National Center for Research Resources and the National Center for Advancing Translational Sciences of the National Institutes of Health.

Reference
Zanoni, P et al. Rare Variant in Scavenger Receptor BI raises HDL Cholesterol and Increases Risk of Coronary Heart Disease. Science; 10 Mar 2016; DOI: 10.1126/science.aad3517

Adapted from a press release from the British Heart Foundation


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Lines of Thought: Discoveries That Changed the World

Lines of Thought: Discoveries that Changed the World

source: www.cam.ac.uk

Some of the world’s most valuable books and manuscripts – texts which have altered the very fabric of our understanding – will go on display in Cambridge this week as Cambridge University Library celebrates its 600th birthday with a once-in-a-lifetime exhibition of its greatest treasures.

What started in 1416 as a small collection of manuscripts locked in wooden chests, has now grown into a global institution housing eight million books and manuscripts, billions of words, and millions of images, all communicating thousands of years of human thought.

Anne Jarvis

Lines of Thought: Discoveries that Changed the World, opens free to the public this Friday (March 11) and celebrates 4,000 years of recorded thought through the Library’s unique and irreplaceable collections. More than 70 per cent of the exhibits are displayed to the public for the first time in this exhibition.

Tracing the connections between Darwin and DNA, Newton and Hawking, and 3,000-year-old Chinese oracle bones and Twitter, the exhibition investigates, through six distinct themes, how Cambridge University Library’s millions of books and manuscripts have transformed our understanding of life here on earth and our place among the stars.

The iconic Giles Gilbert Scott building, opened in the 1930s, now holds more than eight million books, journals, maps and magazines – as well as some of the world’s most iconic scientific, literary and cultural treasures.

The new exhibition puts on display Newton’s own annotated copy of Principia Mathematica, Darwin’s papers on evolution, 3,000-year-old Chinese oracle bones, a cuneiform tablet from 2,000BC, and the earliest reliable text for 20 of Shakespeare’s plays.

Other items going on display include:

  • Edmund Halley’s handwritten notebook/sketches of Halley’s Comet (1682)
  • Stephen Hawking’s draft typescript of A Brief History of Time
  • Darwin’s first pencil sketch of Species Theory and his Primate Tree
  • A 2nd century AD fragment of Homer’s Odyssey
  • The Nash Papyrus – a 2,000-year-old copy of the Ten Commandments
  • Codex Bezae – 5th century New Testament, crucial to our understanding of The Bible
  • A hand-coloured copy of Vesalius’ 1543 Epitome – one of the most influential works in western medicine
  • The earliest known record of a human dissection in England (1564)
  • A Babylonian tablet dated 2039 BCE (the oldest object in the library)
  • The Gutenberg Bible – the earliest substantive printed book in Western Europe (1455)
  • The Book of Deer, 10th century gospel book: thought to be the oldest Scottish book and the first example of written Gaelic
  • The first catalogue listing the contents of the Library in 1424, barely a decade after it was first identified in the wills of William Loring and William Hunden

The six Lines of Thought featured in the exhibition are: From clay tablets to Twitter feed (Revolutions in human communication); The evolution of genetics (From Darwin to DNA); Beginning with the word (Communicating faith); On the shoulders of giants (Understanding gravity); Eternal lines (Telling the story of history) and Illustrating anatomy (Understanding the body).

University Librarian Anne Jarvis said: “It’s extraordinary to think that the University Library, which started in 1416 as a small collection of manuscripts locked in wooden chests, has now grown into a global institution housing eight million books and manuscripts, billions of words, and millions of images, all communicating thousands of years of human thought.

“Our spectacular exhibition showcases six key concepts in human history that have been critical in shaping the world and culture we know today, illustrating the myriad lines of thought that take us back into the past, and forward to tomorrow’s research, innovation and literature.”

The University Library, which is older than both the British Library and the Vatican Library, has more than 125 miles of shelving and more than two million books immediately available to readers – making it the largest open-access library in Europe.

The first Line of Thought featured in the exhibition, From clay tablet to Twitter, begins with a tiny 4,000-year-old tablet used as a receipt for wool, evidence of an advanced civilisation using a cuneiform script and Sumerian language, probably written in Girsu (Southern Iraq) and precisely dated to 2039 BCE. The tablet is on public display for the first time in this exhibition.

From there, it charts the many and varied revolutions in communications throughout history, taking in Chinese oracle bones, the Gutenberg Bible, a palm leaf manuscript written in 1015 AD, newspapers, chapbooks and 20th century Penguin paperbacks, before ending with a book containing Shakespeare’s Hamlet written in tweets.

Objects going on display for the first time during Lines of Thought include: the Book of Deer, Vesalius’s 3D manikin of the human body, William Morris’s extensively annotated proofs of his edition of Beowulf, a wonderful caricature of Darwin, and works by Copernicus, Galileo and Jocelyn Bell Burnell, the discoverer of pulsars.

“For six centuries, the collections of Cambridge University Library have challenged and changed the world around us,” added Jarvis.  “Across science, literature and the arts, the millions of books, manuscripts and digital archives we hold have altered the very fabric of our understanding.

“Only in Cambridge, can you find Newton’s greatest works sitting alongside Darwin’s most important papers on evolution, or Sassoon’s wartime poetry books taking their place next to the Gutenberg Bible and the archive of Margaret Drabble.”

To celebrate the Library’s 600th anniversary, the Library has selected one iconic item from each theme within the exhibition to be digitised and made available within a free iPad app, Words that Changed the World. Readers can turn the pages of these masterworks of culture and science, from cover to cover, accompanied by University experts explaining their importance and giving contextual information.

Lines of Thought: Discoveries that Changed the World opens to the public on Friday, March 11, 2016 and runs until Friday, September 30, 2016. Entry is free.

The exhibition is also available to view online, and items from the exhibition have also been digitised and made available on the Cambridge Digital Library.


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


AI Crossword-Solving Application Could Make Machines Better at Understanding Language

AI crossword-solving application could make machines better at understanding language

source: www.cam.ac.uk

A web-based machine language system solves crossword puzzles far better than commercially-available products, and may help machines better understand language.

Despite recent progress in AI, problems involving language understanding are particularly difficult.

Felix Hill

Researchers have designed a web-based platform which uses artificial neural networks to answer standard crossword clues better than existing commercial products specifically designed for the task. The system, which is freely available online, could help machines understand language more effectively.

In tests against commercial crossword-solving software, the system, designed by researchers from the UK, US and Canada, was more accurate at answering clues that were single words (e.g. ‘culpability’ – guilt), a short combination of words (e.g. ‘devil devotee’ – Satanist), or a longer sentence or phrase (e.g. ‘French poet and key figure in the development of Symbolism’ – Baudelaire). The system can also be used as a ‘reverse dictionary’, in which the user describes a concept and the system returns possible words to describe that concept.

The researchers used the definitions contained in six dictionaries, plus Wikipedia, to ‘train’ the system so that it could understand words, phrases and sentences – using the definitions as a bridge between words and sentences. Their results, published in the journal Transactions of the Association for Computational Linguistics, suggest that a similar approach may lead to improved output from language understanding and dialogue systems, and from information retrieval engines more generally. All of the code and data behind the application has been made freely available for future research.

“Over the past few years, there’s been a mini-revolution in machine learning,” said Felix Hill of the University of Cambridge’s Computer Laboratory, one of the paper’s authors. “We’re seeing a lot more usage of deep learning, which is especially useful for language perception and speech recognition.”

Deep learning refers to an approach in which artificial neural networks with little or no prior ‘knowledge’ are trained to recreate human abilities using massive amounts of data. For this particular application, the researchers used dictionaries – training the model on hundreds of thousands of definitions of English words, plus Wikipedia.

“Dictionaries contain just about enough examples to make deep learning viable, but we noticed that the models get better and better the more examples you give them,” said Hill. “Our experiments show that definitions contain a valuable signal for helping models to interpret and represent the meaning of phrases and sentences.”
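To make the definition-based training idea concrete, the sketch below shows a toy ‘reverse dictionary’: a small neural encoder is trained to map each dictionary gloss onto an embedding for the word it defines, and lookup is then a nearest-neighbour search over those embeddings. Everything here – the toy glosses, the bag-of-words encoder, the cosine objective, the hyperparameters and the reverse_dictionary helper – is an illustrative assumption for this article, not the architecture published by Hill and colleagues.

import torch
import torch.nn as nn
import torch.nn.functional as F

# Toy dictionary: each target word is defined by a short gloss (invented examples).
definitions = {
    "guilt": "a feeling of culpability for an offence",
    "satanist": "a devotee of the devil",
    "dictionary": "a reference book listing words and their meanings",
}

vocab = sorted({w for gloss in definitions.values() for w in gloss.split()} | set(definitions))
idx = {w: i for i, w in enumerate(vocab)}
dim = 32

class DefinitionEncoder(nn.Module):
    # Encodes a gloss (as a bag of words) into the embedding space of the defined word.
    def __init__(self, vocab_size, dim):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, dim)
        self.proj = nn.Linear(dim, dim)
    def forward(self, word_ids):
        return self.proj(self.emb(word_ids).mean(dim=0))

encoder = DefinitionEncoder(len(vocab), dim)
target_emb = nn.Embedding(len(vocab), dim)  # embeddings of the defined words themselves
opt = torch.optim.Adam(list(encoder.parameters()) + list(target_emb.parameters()), lr=1e-2)

# Training: pull each encoded gloss towards the embedding of the word it defines.
for _ in range(200):
    for word, gloss in definitions.items():
        ids = torch.tensor([idx[w] for w in gloss.split()])
        loss = 1 - F.cosine_similarity(encoder(ids), target_emb(torch.tensor(idx[word])), dim=0)
        opt.zero_grad()
        loss.backward()
        opt.step()

# Reverse-dictionary lookup: encode a free-text description, return the nearest defined words.
def reverse_dictionary(query, k=2):
    with torch.no_grad():
        q = encoder(torch.tensor([idx[w] for w in query.split() if w in idx]))
        scored = [(F.cosine_similarity(q, target_emb(torch.tensor(idx[w])), dim=0).item(), w)
                  for w in definitions]
        return [w for _, w in sorted(scored, reverse=True)[:k]]

print(reverse_dictionary("devotee of the devil"))  # expected to rank 'satanist' first

Scaled up to hundreds of thousands of real glosses and pre-trained word vectors, this kind of nearest-neighbour lookup is what allows a description such as ‘French poet and key figure in the development of Symbolism’ to surface candidate answers like ‘Baudelaire’.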

Working with Anna Korhonen from Cambridge’s Department of Theoretical and Applied Linguistics, and researchers from the Université de Montréal and New York University, Hill used the model as a way of bridging the gap between machines that understand the meanings of individual words and machines that can understand the meanings of phrases and sentences.

“Despite recent progress in AI, problems involving language understanding are particularly difficult, and our work suggests many possible applications of deep neural networks to language technology,” said Hill. “One of the biggest challenges in training computers to understand language is recreating the many rich and diverse information sources available to humans when they learn to speak and read.”

However, there is still a long way to go. For instance, when Hill’s system receives a query, the machine has no idea about the user’s intention or the wider context of why the question is being asked. Humans, on the other hand, can use their background knowledge and signals like body language to figure out the intent behind the query.

Hill describes recent progress in learning-based AI systems in terms of behaviourism and cognitivism: two movements in psychology that affect how one views learning and education. Behaviourism, as the name implies, looks at behaviour without looking at what the brain and neurons are doing, while cognitivism looks at the mental processes that underlie behaviour. Deep learning systems like the one built by Hill and his colleagues reflect a cognitivist approach, but for a system to have something approaching human intelligence, it would have to have a little of both.

“Our system can’t go too far beyond the dictionary data on which it was trained, but the ways in which it can are interesting, and make it a surprisingly robust question and answer system – and quite good at solving crossword puzzles,” said Hill. While it was not built with the purpose of solving crossword puzzles, the researchers found that it actually performed better than commercially-available products that are specifically engineered for the task.

Existing commercial crossword-answering applications function in a similar way to a Google search, with one system able to reference over 1100 dictionaries. While this approach has advantages if you want to look up a definition verbatim, it works less well when you input a question or query that the model has never seen in training. It also makes it incredibly ‘heavy’ in terms of the amount of memory it requires. “Traditional approaches are like lugging many heavy dictionaries around with you, whereas our neural system is incredibly light,” said Hill.

According to the researchers, the results show the effectiveness of definition-based training for developing models that understand phrases and sentences. They are currently looking at ways of enhancing their system, specifically by combining it with more behaviourist-style models of language learning and linguistic interaction.

Reference:
Hill, Felix, et al. ‘Learning to Understand Phrases by Embedding the Dictionary.’ Transactions of the Association for Computational Linguistics, vol. 4, pp. 17–30, February 2016. ISSN 2307-387X.


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


Overcrowded Internet Domain Space is Stifling Demand, Suggesting a Future ‘not-com’ Boom

Overcrowded Internet domain space is stifling demand, suggesting a future ‘not-com’ boom

source: www.cam.ac.uk

New research suggests that a lack of remaining domain names with easy to remember – and consequently valuable – word combinations is restricting Internet growth, with an untapped demand of as much as 25% of all current domains being held back. The study’s author contends that the findings show ICANN’s release of new top level domains could prove a wise policy.

 

What I find fascinating is that the observed transaction prices of domain names reveal a free market valuation of linguistic characteristics and language itself

Thies Lindenthal

As the digital age dawned, pioneers successfully snapped up broad swathes of the most popular and memorable domain names, such as nouns, places and combinations thereof – claiming valuable ‘virtual real estate’ under the top level domains such as dot-com, dot-co-dot-uk and so on.

Now, the first research to try to define current demand for Internet domain names suggests that the drying up of intuitive and familiar word combinations has seen domain registration drop far below the expected appetite, given the extent to which we now live online, with new entrepreneurs struggling to find “their slice” of virtual space.

In fact, the study estimates that the lack of available high quality domains featuring popular names, locations and things could be stifling as much as a further 25% of the total current registered domains.

With the total standing at around 294 million as of summer 2015, this could mean over 73m potential domains stymied due to an inability to register relevant word combinations likely to drive traffic for personal or professional purposes.

However, as the Internet Corporation for Assigned Names and Numbers (ICANN) has begun to roll out the option to issue brand new top-level domains for almost any word, whether it’s dot-hotel, dot-books or dot-sex – dubbed the ‘not-coms’ – the research suggests there is substantial untapped demand that could fuel additional growth in domain registrations.

Dr Thies Lindenthal from the University of Cambridge, who conducted the study, says that – while the domain name market may be new – the economics is not. The market fits nicely onto classic models of urban economics, he says, and – as with property – a lot rides on location.

“Cyberspace is no different from traditional cities, at least in economic terms. In a basic city model, you have a business district to which all residents commute, and property value is determined by proximity to that hub,” said Lindenthal, from Cambridge’s Department of Land Economy.

“It’s similar in cyberspace. The commute to, and consequent value of, virtual locations depend on linguistic attributes: familiarity, memorability and importantly length. A virtual commute is about the ease with which a domain name is remembered and the time it takes to type.

“The snappier and more recognisable a domain, the more it is going to be worth. What I find fascinating is that the observed transaction prices of domain names reveal a free market valuation of linguistic characteristics and language itself,” he said.

From 2007 onwards, annual additions to the domain stock began to lag, while between 2006 and 2012 re-sale prices of domain names already registered rose 63% – indicating a demand for virtual ‘locations’ outpacing the supply of available attractive names, with competition driving up prices.

Recently, ICANN began the release of 1,400 new top-level domains, the ‘not-coms’, to expand current extensions such as dot-com, dot-org etc, with the aim of expanding the domain supply.

Google was one of the first to use a ‘not-com’ to get around the domain name shortage. Finding all obvious domains taken for its new parent company ‘Alphabet’, the company acquired space on the new dot-xyz domain to create the canny web address: www.abc.xyz.

Serious money is currently being invested in ‘not-coms’. With initial application fees around the $185k mark, Lindenthal says it could be as much as $2m before you have the necessary infrastructure to secure and manage your new top level domain, but, once owned, you are able to set prices for anyone who wants to acquire virtual real estate under that domain.

“By 2013, as much as $350m had already been put down in application fees alone, and further billions must have been invested. Clearly, corporations and entrepreneurs have trust in the new domains being able to serve a previously unmet demand, and from this research it appears some of them may be right,” said Lindenthal.

The set of catchy keywords that appeal to humans is still bound by the way we process language

For the study, published today in the Journal of Real Estate Finance & Economics, Lindenthal set out to get a rough idea of the demand for name registration not served by current top-level domains.

Looking at just dot-coms, he compared existing registrations with census data for popular family names in the US. “You have to assume someone called Miller is as likely to register a domain with their name in it as someone called Smith, for example. So, roughly speaking, if there are twice as many Millers, you would expect to see twice as many domains with that name in it.”

Lindenthal found that the more prevalent the family name, the lower the number of domains featuring that name per head of population. Moreover, a one per cent increase in the prevalence of a surname pushes up the number of domains featuring that name by only 0.74% – suggesting a substantial gap between likely demand and current domain registration.

Lindenthal also explored domain registration featuring city names compared to size of the population, and found a similar gap between expected demand and current domains.

Using statistical modelling analysis, he concludes that – based on the available data – an estimate for domain name demand not met by available word combinations is as much as 25% of all currently registered Internet domains.

A shorter ‘cyber-commute’ was found to be more desirable. Increasing the length of a surname by just one character, from six to seven, reduces the number of registrations by a remarkable 24%.
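As a rough illustration of how elasticities of this kind can be estimated, the sketch below fits a log-log regression of domain counts on surname frequency and name length. The surnames and figures are invented purely for demonstration – they are not the study’s data – and the comments simply echo the numbers reported above rather than claiming to reproduce them.

import numpy as np

# Hypothetical surnames: (frequency per million people, characters, domains registered).
data = [
    ("smith",      8000,  5, 5200),
    ("miller",     4200,  6, 2600),
    ("garcia",     3900,  6, 2450),
    ("robinson",   1700,  8,  780),
    ("fitzgerald",  600, 10,  210),
]

freq = np.array([row[1] for row in data], dtype=float)
length = np.array([row[2] for row in data], dtype=float)
domains = np.array([row[3] for row in data], dtype=float)

# Model: log(domains) = a + b*log(frequency) + c*length
X = np.column_stack([np.ones_like(freq), np.log(freq), length])
(a, b, c), *_ = np.linalg.lstsq(X, np.log(domains), rcond=None)

print(f"frequency elasticity b = {b:.2f}")                              # the study reports roughly 0.74
print(f"effect of one extra character: {100 * (np.exp(c) - 1):.0f}%")   # the study reports about -24%

The study itself works from US census surname counts and actual dot-com registrations; the toy regression above only mirrors the general logic of relating registrations to name prevalence and name length.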

The shorter the better holds true for emerging ‘not-coms’, says Lindenthal. “Shorter names are more valuable and lead to greater registrations. With new top level domains named for cities, for example, it was the concise city names – dot-london; dot-miami; dot-berlin – that went first, and now anyone who wants to buy virtual space under those domains has to buy it from the new owner.

“More cumbersome city names are not seen as a good investment. For example, despite San Francisco being a global centre for Internet technology, dot-sanfrancisco is still up for grabs. Do you want to be the digital mayor of a new San Francisco domain? $2 million and it’s yours!”

However, while the new ‘not-com’ boom will open up huge new areas of the Internet, Lindenthal says that the overarching constraints will kick in again further down the line – which may be precisely what makes the new top level domains a worthy investment.

“The set of catchy keywords that appeal to humans is still bound by the way we process language, even if we had unlimited choice in top level domains,” he said.

“Legend has it that Mark Twain advised to buy land, since ‘they have stopped making it’. Similarly, one can argue that investing into top level domains is a promising business venture, since we have stopped inventing language, at least at a large scale.”


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


520 Million-Year-Old Fossilised Nervous System Is Most Detailed Example Yet Found

520 million-year-old fossilised nervous system is most detailed example yet found

source: www.cam.ac.uk

A 520 million-year-old fossilised nervous system – so well-preserved that individually fossilised nerves are visible – is the most complete and best example yet found, and could help unravel how the nervous system evolved in early animals.

The more of these fossils we find, the more we will be able to understand how the nervous system – and how early animals – evolved.

Javier Ortega-Hernández

Researchers have found one of the oldest and most detailed fossils of the central nervous system yet identified, from a crustacean-like animal that lived more than 500 million years ago. The fossil, from southern China, has been so well preserved that individual nerves are visible, the first time this level of detail has been observed in a fossil of this age.

The findings, published in the Proceedings of the National Academy of Sciences, are helping researchers understand how the nervous system of arthropods – creepy crawlies with jointed legs – evolved. Finding any fossilised soft tissue is rare, but this particular find, by researchers in the UK, China and Germany, represents the most detailed example of a preserved nervous system yet discovered.

The animal, called Chengjiangocaris kunmingensis, lived during the Cambrian ‘explosion’, a period of rapid evolutionary development about half a billion years ago when most major animal groups first appear in the fossil record. C. kunmingensis belongs to a group of animals called fuxianhuiids, and was an early ancestor of modern arthropods – the diverse group that includes insects, spiders and crustaceans.

“This is a unique glimpse into what the ancestral nervous system looked like,” said study co-author Dr Javier Ortega-Hernández, of the University of Cambridge’s Department of Zoology. “It’s the most complete example of a central nervous system from the Cambrian period.”

Over the past five years, researchers have identified partially-fossilised nervous systems in several different species from the period, but these have mostly been fossilised brains. And in most of those specimens, the fossils only preserved details of the profile of the brain, meaning the amount of information available has been limited.

C. kunmingensis looked like a sort of crustacean, with a broad, almost heart-shaped head shield, and a long body with pairs of legs of varying sizes. Through careful preparation of the fossils, which involved chipping away the surrounding rock with a fine needle, the researchers were able to view not only the hard parts of the body, but fossilised soft tissue as well.

The vast majority of fossils are of bone and other hard body parts such as teeth or exoskeletons. Since the nervous system and other soft tissues are essentially made of fatty substances, finding them preserved as fossils is extremely rare. The researchers behind this study first identified a fossilised central nervous system in 2013, but the new material has allowed them to investigate the significance of these findings in much greater depth.


The central nervous system coordinates all neural and motor functions. In vertebrates, it consists of the brain and spinal cord, but in arthropods it consists of a condensed brain and a chain-like series of interconnected masses of nervous tissue called ganglia that resemble a string of beads.

Like modern arthropods, C. kunmingensis had a nerve cord – which is analogous to a spinal cord in vertebrates – running throughout its body, with each one of the bead-like ganglia controlling a single pair of walking legs.

Closer examination of the exceptionally preserved ganglia revealed dozens of spindly fibres, each measuring about five thousandths of a millimetre in length. “These delicate fibres displayed a highly regular distribution pattern, and so we wanted to figure out if they were made of the same material as the ganglia that form the nerve cord,” said Ortega-Hernández. “Using fluorescence microscopy, we confirmed that the fibres were in fact individual nerves, fossilised as carbon films, offering an unprecedented level of detail. These fossils greatly improve our understanding of how the nervous system evolved.”

For Ortega-Hernández and his colleagues, a key question is what this discovery tells us about the evolution of early animals, since the nervous system contains so much information. Further analysis revealed that some aspects of the nervous system in C. kunmingensis appear to be structured similarly to those of modern priapulids (penis worms) and onychophorans (velvet worms), with regularly-spaced nerves coming out from the ventral nerve cord.

In contrast, these dozens of nerves have been lost independently in the tardigrades (water bears) and modern arthropods, suggesting that simplification played an important role in the evolution of the nervous system.

Possibly one of the most striking implications of the study is that the exceptionally preserved nerve cord of C. kunmingensis represents a unique structure that is otherwise unknown in living organisms. The specimen demonstrates the unique contribution of the fossil record towards understanding the early evolution of animals during the Cambrian period. “The more of these fossils we find, the more we will be able to understand how the nervous system – and how early animals – evolved,” said Ortega-Hernández.

The research was supported in part by Emmanuel College, Cambridge.

Reference:
Jie Yang et al. ‘The fuxianhuiid ventral nerve cord and early nervous system evolution in Panarthropoda.’ PNAS (2016). DOI: 10.1073/pnas.1522434113


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


Pollinator Species Vital To Our Food Supply Are Under Threat, Warn Experts

Pollinator species vital to our food supply are under threat, warn experts

source: www.cam.ac.uk

A new report from experts and Governments around the world addresses threats to animal pollinators such as bees, birds and bats that are vital to more than three-quarters of the world’s food crops, and intimately linked to human nutrition, culture and millions of livelihoods. Scientists say simple strategies could harness pollinator power to boost agricultural yield.

People’s livelihoods and culture are intimately linked with pollinators around the world. All the major world religions have sacred passages that mention bees

Lynn Dicks

Delegates from almost 100 national Governments have gathered in Kuala Lumpur to discuss how to address the threats facing animal pollinators: the bees, flies, birds, butterflies, moths, wasps, beetles and bats that transport the pollen essential to the reproduction of much of the world’s crops and plant life.

It is the first time the global community has gathered on this scale to focus on the preservation of the small species that help fertilise more than three quarters of the leading kinds of global food crops and nearly 90% of flowering wild plant species.

A report on pollinator species produced over two years by an international team of 77 scientists, including Cambridge’s Dr Lynn Dicks, has been adopted by the Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services (IPBES) today. IPBES has 124 member Governments.

The report is the first assessment ever issued by IPBES, and the first time that such an assessment has brought together multiple knowledge systems comprehensively, including scientific, and indigenous and local knowledge. It will highlight the threats to animal pollinators, and the major implications of these species’ declines for the world’s food supply and economy.

But the report also details the ways that pollinator power can be used for the benefit of biodiversity, food security and people: by harnessing natural relationships between plants and animals to improve agricultural yields and strengthen local communities.

“It is incredible to see international Governments coming together to discuss the problem of pollinators in this way,” says Lynn Dicks, from Cambridge University’s Department of Zoology.

“Without pollinators, many of us would not be able to enjoy chocolate, coffee and vanilla ice cream, or healthy foods like blueberries and brazil nuts. The value of pollinators goes way beyond this. People’s livelihoods and culture are intimately linked with pollinators around the world. All the major world religions have sacred passages that mention bees.”

The volume of pollinator-dependent food produced has increased by 300% over the past 50 years, including most fruits from apple to avocado, as well as coffee, cocoa, and nuts such as cashews. This shows an increasing dependence of agriculture on pollinators.

Such crops now occupy around 35% of all agricultural land. While these crops rely on animal pollination to varying degrees – along with, for example, wind-blown pollination – the scientists estimate that between 5 and 8% of all global crop production is directly attributable to animal pollinators, with an annual market value that may be as much as 577 billion US dollars.

However, the experts warn that a variety of agricultural practices are contributing to steep declines in key pollinating species across Europe and North America. In Europe, populations are declining for at least 37% of bee and 31% of butterfly species.

A lack of data for Africa, Latin America and Asia means we are currently in the dark about the status of pollinators in many parts of the world, say the scientists. Where national ‘red lists’ are available, they show that up to 50% of global bee species, for example, may be threatened with extinction.

For some crops, including cocoa, wild pollinators contribute more to global crop production than managed honey bees. Wild bee populations are of particular concern, as bees are “dominant” pollinators, say scientists, and visit over 90% of the leading global crop types.

Changes in land use and habitat destruction are key drivers of pollinator decline. Increasing crop monocultures – where the same plant is grown homogeneously across vast swathes of land – mean that the plant diversity required by many pollinators is dwindling.

Increased use of pesticides is a big problem for many species – insecticides such as neonicotinoids have been shown to harm the survival of wild bees, for example – and climate change is shifting the seasonal activities of key pollinators, the full effects of which may not be apparent for several decades.

The decline of practices based on indigenous and local knowledge also threatens pollinators. These practices include traditional farming systems, maintenance of diverse landscapes and gardens, kinship relationships that protect specific pollinators, and cultures and languages that are connected to pollinators.

Everyone should think carefully about whether they need to use insecticides and herbicides in their own gardens

Many livelihoods across the world depend on pollinating animals, say scientists. Pollinator-dependent crops include leading export products in developing countries (such as coffee and cocoa) and developed countries (such as almonds), providing employment and income for millions of people.

If the worst-case scenario – a complete loss of animal pollinators – occurred, not only would between 5 and 8% of the world’s food production be wiped out, but the availability of crops and wild plants that provide essential micronutrients to human diets would also fall, risking a vast increase in the number of people suffering from vitamin A, iron and folate deficiency.

However, the assessment says that by deploying strategies for supporting pollinators, we could not only preserve the volume of food they help us produce, but we could boost populations and in doing so could even improve production in sustainable farming systems, so-called “ecological intensification”.

Many pollinator-friendly strategies are relatively straightforward. Maintaining patches of semi-natural habitats throughout productive agricultural land would provide nesting sites and ‘floral resources’ for many pollinators. This could be as simple as planting strips of wild flowers to break up crop monocultures, or identifying and tending to nest trees in farming settings.

Certain traditional crop rotation practices using seasonal indicators such as flowering to trigger planting also help to maintain diversity – and it is diversity that is at the heart of flourishing pollinator populations.

There are actions that Governments around the world could take, says Dr Dicks, such as raising the standards of pesticide and GMO risk assessment, or supporting training for farmers in how to manage pollination and reduce pesticide use. National-level monitoring of wild pollinators, especially bees, would help to address the lack of long term data on pollinator numbers.

“There are many things individual people can do to help pollinators, and safeguard them for the future,” says Dr Dicks.

“Planting flowers that pollinators use for food, or looking after their habitats in urban and rural areas, will help. Everyone should also think carefully about whether they need to use insecticides and herbicides in their own gardens.”

More information about how to help wild pollinators can be found at the Bees Needs website, which is part of the National Pollinator Strategy for England.

Inset image: Lynn Dicks at the IPBES meeting in Kuala Lumpur. 


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.
