All posts by Admin

Cambridge-Oxford Link To Create ‘Britain’s Silicon Valley’


source: BBC Local Live, Orla Moore

Britain could boast its own Silicon Valley if plans to re-establish the rail link between Cambridge and Oxford are fast-tracked.

The National Infrastructure Commission is urging the government to deliver the western section of the East West Rail line by 2024 and bring forward £100m in funding.

It says speeding up the project will enable the creation of a corridor of innovation and learning, linking the two university cities by rail for the first time since 1967.

Image: East West Rail (credit: Network Rail)

The NIC study into how to maximise the potential of the Cambridge – Milton Keynes – Oxford corridor found that a lack of sufficient and suitable housing presents a risk to the success of the area.

The report called for a “joined-up plan” for homes, jobs and infrastructure.

Rice Farming In India Much Older Than Thought, Used As ‘Summer Crop’ By Indus Civilisation


Rice was long thought to have arrived from China in 2000 BC, but the latest research shows that domesticated rice agriculture existed in India and Pakistan centuries earlier, and suggests systems of seasonal crop variation that would have provided a rich and diverse diet for the Bronze Age residents of the Indus valley.

Our findings appear to show there was already a long-held and sustainable culture of rice production in India as a widespread summer addition to the winter cropping during the Indus civilisation

Jennifer Bates

Latest research on archaeological sites of the ancient Indus Civilisation, which stretched across what is now Pakistan and northwest India during the Bronze Age, has revealed that domesticated rice farming in South Asia began far earlier than previously believed, and may have developed in tandem with – rather than as a result of – rice domestication in China.

The research also confirms that Indus populations were the earliest people to use complex multi-cropping strategies across both seasons, growing foods during summer (rice, millets and beans) and winter (wheat, barley and pulses), which required different watering regimes. The findings suggest a network of regional farmers supplied assorted produce to the markets of the civilisation’s ancient cities.

Evidence for very early rice use has been known from the site of Lahuradewa in the central Ganges basin, but it had long been thought that domesticated rice agriculture did not reach South Asia until towards the end of the Indus era, when wetland rice arrived from China around 2000 BC. The researchers found evidence of domesticated rice in South Asia as much as 430 years earlier.

The new research is published today in the journals Antiquity and Journal of Archaeological Science by researchers from the University of Cambridge’s Division of Archaeology, in collaboration with colleagues at Banaras Hindu University and the University of Oxford.

“We found evidence for an entirely separate domestication process in ancient South Asia, likely based around the wild species Oryza nivara. This led to the local development of a mix of ‘wetland’ and ‘dryland’ Oryza sativa indica rice agriculture before the truly ‘wetland’ Chinese rice, Oryza sativa japonica, arrived around 2000 BC,” says study co-author Dr Jennifer Bates.

“While wetland rice is more productive, and took over to a large extent when introduced from China, our findings appear to show there was already a long-held and sustainable culture of rice production in India as a widespread summer addition to the winter cropping during the Indus civilisation.”

Co-author Dr Cameron Petrie says that the location of the Indus in a part of the world that received both summer and winter rains may have encouraged the development of seasonal crop rotation before other major civilisations of the time, such as Ancient Egypt and China’s Shang Dynasty.

“Most contemporary civilisations initially utilised either winter crops, such as the Mesopotamian reliance on wheat and barley, or the summer crops of rice and millet in China – producing surplus with the aim of stockpiling,” says Petrie.

“However, the area inhabited by the Indus is at a meteorological crossroads, and we found evidence of year-long farming that predates its appearance in the other ancient river valley civilisations.”

The archaeologists sifted for traces of ancient grains in the remains of several Indus villages within a few kilometres of the site called Rakhigarhi: the most recently excavated of the Indus cities, which may have maintained a population of some 40,000.

As well as the winter staples of wheat and barley, and winter pulses like peas and vetches, they found evidence of summer crops: domesticated rice, but also millet and the tropical beans urad and horsegram. Radiocarbon dating provided the first absolute dates for Indus multi-cropping: 2890-2630 BC for millets and winter pulses, 2580-2460 BC for horsegram, and 2430-2140 BC for rice.

Millets are a group of small-grained cereals, now most commonly used in birdseed, which Petrie describes as “often being used as something to eat when there isn’t much else”. Urad beans, however, are a relative of the mung bean, often used in popular types of Indian dhal today.

In contrast with evidence from elsewhere in the region, the village sites around Rakhigarhi reveal that summer crops appear to have been much more popular than the wheats of winter.

The researchers say this may have been down to the environmental variation in this part of the former civilisation: the seasonally flooded Ghaggar-Hakra plains, where different rainfall patterns and vegetation would have lent themselves to crop diversification – potentially creating local food cultures within individual areas.

This variety of crops may have been transported to the cities. Urban hubs may have served as melting pots for produce from regional growers, as well as meats and spices, and evidence for spices has been found elsewhere in the region.

While they don’t yet know what crops were being consumed at Rakhigarhi, Jennifer Bates points out: “It is certainly possible that a sustainable food economy across the Indus zone was achieved through growing a diverse range of crops, with choice being influenced by local conditions.

“It is also possible that there was trade and exchange in staple crops between populations living in different regions, though this is an idea that remains to be tested.”

“Such a diverse system was probably well suited to mitigating risk from shifts in climate,” adds Cameron Petrie. “It may be that some of today’s farming monocultures could learn from the local crop diversity of the Indus people 4,000 years ago.”

The findings are the latest from the Land, Water and Settlement Project, which has been conducting research on the ancient Indus Civilisation in northwest India since 2008.


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Talk With Your Hands: a Cambridge Shorts Film


source: www.cam.ac.uk

The capacity for language is what sets us apart from other animals. Talk with Your Hands, the third of four Cambridge Shorts films, explores the richness of sensory perception in interviews with blind and deaf people together with insights from neuroscientists.

Talk with Your Hands: Communicating across the Sensory Spectrum opens with Hayden Dahmm speaking to camera. He is studying engineering and he’s blind. One of the benefits of being blind, he suggests, is that he is not distracted by physical appearance. The words people use, and how they use them, gives him “a genuine impression of the speaker”.

Louise Stern is a writer and artist. She is deaf and explains that her native tongue is American Sign Language. Speaking with her hands, she says: “The body is eloquent and conveys layers of emotion and meaning.” When she describes how eye contact is, for a deaf person, an especially beautiful thing, she hesitates – and then says “it makes me feel like they see me”.

In just ten minutes, Talk with Your Hands conveys the richness of verbal and non-verbal languages and explores how our senses overlap and merge. Through interviews with blind and deaf people, interwoven with insights from neuroscientists, the film demonstrates how we communicate with sounds and gestures – and how each mode of communication has its own characteristics.

Sign language is not a translation of, or substitute for, verbal language. While spoken language is linear (produced through the channels of our mouths one word at a time), sign language is flowing and simultaneous. Similarly, the spoken word is not just the written word spoken out loud. It’s much more than that, explains Hayden, rather as “poetry is the things that cannot be translated”.

The capacity for language is what sets mankind apart from other animals. Years ago, scientists looking at brain damage identified the parts of the brain responsible for speaking and comprehension, for hearing and seeing. Now we know that this understanding of how the brain works is far too simplistic: language, and the different ways we use it, colonises most of the brain.

Talk with Your Hands is one of four films made by Cambridge researchers for the 2016 Cambridge Shorts series, funded by Wellcome Trust ISSF. The scheme supports early career researchers to make professional quality short films with local artists and filmmakers. Researchers Craig Pearson (Wellcome Trust-MRC Cambridge Stem Cell Institute) and Julio Chenchen Song (Department of Linguistics) collaborated with filmmaker Toby Smith.


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

A BLUEPRINT For Blood Cells: Cambridge Researchers Play Leading Role In Major Release of Epigenetic Studies


source: www.cam.ac.uk

Cambridge researchers have played a leading role in several studies released today looking at how variation in, and potentially heritable changes to, our DNA – known as epigenetic modifications – affect blood and immune cells, and how this can lead to disease.

BLUEPRINT shows the power of collaboration among scientists across Europe in making a difference to our knowledge of how epigenetic changes impact on our health

Willem Ouwehand

The studies are part of BLUEPRINT, a large-scale research project bringing together 42 leading European universities, research institutes and industry entrepreneurs, with close to €30 million of funding from the EU. BLUEPRINT scientists have this week released a collection of 26 publications, part of a package of 41 publications being released by the International Human Epigenome Consortium.

One of the great mysteries in biology is how the many different cell types that make up our bodies are derived from a single stem cell, and how information encoded in different parts of our genome is made available to be used by different cell types. Scientists have learned a lot from studying the human genome, but have only partially unveiled the processes underlying cell determination. The identity of each cell type is largely defined by an instructive layer of molecular annotations on top of the genome – the epigenome – which acts as a blueprint unique to each cell type and developmental stage.

Unlike the genome, the epigenome changes as cells develop and in response to changes in the environment. Defects in the proteins that read, write and erase the epigenetic information are involved in many diseases. The comprehensive analysis of the epigenomes of healthy and abnormal cells will facilitate new ways to diagnose and treat various diseases, and ultimately lead to improved health outcomes.

“This huge release of research papers will help transform our understanding of blood-related and autoimmune diseases,” says Professor Willem H Ouwehand from the Department of Haematology at the University of Cambridge, one of the Principal Investigators of BLUEPRINT. “BLUEPRINT shows the power of collaboration among scientists across Europe in making a difference to our knowledge of how epigenetic changes impact on our health.”

Among the papers led by Cambridge researchers, Professor Nicole Soranzo and Dr Adam Butterworth have co-led a study analysing the effect of genetic variants in our DNA sequence on our blood cells. Using a genome-wide association analysis, the team identified more than 2,700 variants that affect blood cells, including hundreds of rare genetic variants that have far larger effects on the formation of blood cells than the common ones. Interestingly, they found genetic links between the effects of these variants and autoimmune diseases, schizophrenia and coronary heart disease, thereby providing new insights into the causes of these diseases.

A second study led by Professor Soranzo looked at the contribution of genetic and epigenetic factors to different immune cell characteristics in the largest cohort of this kind created with blood donors from the NHS Blood and Transplant centre in Cambridge.

Dr Mattia Frontini and Dr Chris Wallace, together with scientists at the Babraham Institute, have jointly led a third study mapping the regions of the genome that interact with genes in 17 different blood cell types. By creating an atlas of links between genes and the remote regions that regulate them in each cell type, they have been able to uncover thousands of genes affected by DNA modifications, pointing to their roles in diseases such as rheumatoid arthritis and other types of autoimmune disease.

Dr Frontini has also co-led a study with BLUEPRINT colleagues from the University of Vienna that has developed a reference map of how epigenetic changes to DNA can program haematopoietic stem cells – a particular type of ‘master cell’ – to develop into the different types of blood and immune cells.

Professor Jeremy Pearson, Associate Medical Director at the British Heart Foundation, which helped fund the research, said: “Our genes are critical to our health and there’s still a wealth of information hidden in our genetic code. By taking advantage of a large scale international collaboration, involving the combined expertise of dozens of research groups, these unprecedented studies have uncovered potentially crucial knowledge for the development of new life saving treatments for heart disease and many other deadly conditions.

“Collaborations like this, which rely on funding from the public through charities and governments across the globe, are vital for analysing and understanding the secrets of our genetics. Research of this kind is helping us to beat disease and improve millions of lives.”

Departmental Affiliations

  • Professor Nicole Soranzo – Department of Haematology
  • Dr Adam Butterworth – Medical Research Council (MRC)/British Heart Foundation (BHF) Cardiovascular Epidemiology Unit
  • Dr Mattia Frontini – Department of Haematology, and Senior Research Fellow for the BHF Cambridge Centre for Research Excellence
  • Dr Chris Wallace – Department of Medicine and MRC Biostatistics Unit

References

  • Astle, WJ et al. The allelic landscape of human blood cell trait variation. Cell; 17 Nov 2016; DOI: 10.1016/j.cell.2016.10.042
  • Chen, L et al. Genetic drivers of epigenetic and transcriptional variation in human immune cells. Cell; 17 Nov 2016; DOI: 10.1016/j.cell.2016.10.026
  • Javierre et al. Lineage-specific genome architecture links enhancers and non-coding disease variants to target gene promoters. Cell; 17 Nov 2016; DOI: 10.1016/j.cell.2016.09.037
  • Farlik, M et al. DNA methylation dynamics of human hematopoietic stem cell differentiation. Cell Stem Cell; 17 Nov 2016; DOI: 10.1016/j.stem.2016.10.019

Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Inability To Safely Store Fat Increases Risk of Diabetes and Heart Disease


source: www.cam.ac.uk

A large-scale genetic study has provided strong evidence that the development of insulin resistance – a risk factor for type 2 diabetes and heart attacks and one of the key adverse consequences of obesity – results from the failure to safely store excess fat in the body.

We’ve long suspected that problems with fat storage might lead to its accumulation in other organs, where it causes insulin resistance and eventually diabetes, but the evidence for this has mostly come from rare forms of human lipodystrophy

Steve O’Rahilly

Overeating and lack of physical activity worldwide have led to rising levels of obesity and a global epidemic of diseases such as heart disease, stroke and type 2 diabetes. A key process in the development of these diseases is the progressive resistance of the body to the actions of insulin, a hormone that controls the levels of blood sugar. When the body becomes resistant to insulin, levels of blood sugars and lipids rise, increasing the risk of diabetes and heart disease. However, it is not clear in most cases how insulin resistance arises and why some people become resistant, particularly when overweight, while others do not.

An international team led by researchers at the University of Cambridge studied over two million genetic variants in almost 200,000 people to look for links to insulin resistance. In an article published today in Nature Genetics, they report 53 regions of the genome associated with insulin resistance and higher risk of diabetes and heart disease; only 10 of these regions have previously been linked to insulin resistance.

The researchers then carried out a follow-up study with over 12,000 participants in the Fenland and EPIC-Norfolk studies, each of whom underwent a body scan that shows fat deposits in different regions of the body. They found that having a greater number of the 53 genetic variants for insulin resistance was associated with having lower amounts of fat under the skin, particularly in the lower half of the body.

The team also found a link between having a higher number of the 53 genetic risk variants and a severe form of insulin resistance characterized by loss of fat tissue in the arms and legs, known as familial partial lipodystrophy type 1. Patients with lipodystrophy are unable to adequately develop fat tissue when eating too much, and often develop diabetes and heart disease as a result.

In follow-up experiments in mouse cells, the researchers were also able to show that suppression of several of the identified genes (including CCDC92, DNAH10 and L3MBTL3) results in an impaired ability to develop mature fat cells.

“Our study provides compelling evidence that a genetically-determined inability to store fat under the skin in the lower half of the body is linked to a higher risk of conditions such as diabetes and heart disease,” says Dr Luca Lotta from the Medical Research Council (MRC) Epidemiology Unit at the University of Cambridge. “Our results highlight the important biological role of peripheral fat tissue as a deposit of the surplus of energy due to overeating and lack of physical exercise.”

“We’ve long suspected that problems with fat storage might lead to its accumulation in other organs such as the liver, pancreas and muscles, where it causes insulin resistance and eventually diabetes, but the evidence for this has mostly come from rare forms of human lipodystrophy,” adds Professor Sir Stephen O’Rahilly from the MRC Metabolic Diseases Unit and Metabolic Research Laboratories at the University of Cambridge. “Our study suggests that these processes also take place in the general population.”

Overeating and physical inactivity lead to excess energy, which is stored as fat tissue. This new study suggests that, among individuals with similar levels of eating and physical exercise, those who are less able to store the surplus energy as fat in the peripheral body, such as the legs, are at a higher risk of developing insulin resistance, diabetes and cardiovascular disease than those who are able to do so.

“People who carry the genetic risk variants that we’ve identified store less fat in peripheral areas,” says Professor Nick Wareham, also from the MRC Epidemiology Unit. “But this does not mean that they are free from risk of disease, because when their energy intake exceeds expenditure, excess fat is more likely to be stored in unhealthy deposits. The key to avoiding the adverse effects is the maintenance of energy balance by limiting energy intake and maximising expenditure through physical activity.”

These new findings may lead to future improvements in the way we prevent and treat insulin resistance and its complications. The researchers are now collaborating with other academic as well as industry partners with the aim of finding drugs that may reduce the risk of diabetes and heart attack by targeting the identified pathways.

The research was mainly funded by the Medical Research Council, with additional support from the Wellcome Trust.

Reference
Lotta, LA et al. Integrative genomic analysis implicates limited peripheral adipose storage capacity in the pathogenesis of human insulin resistance. Nature Genetics; 14 Nov 2016; DOI: 10.1038/ng.3714


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Multi-Drug Resistant Infection Spreading Globally Among Cystic Fibrosis Patients


source: www.cam.ac.uk

A multi-drug resistant infection that can cause life-threatening illness in people with cystic fibrosis (CF) and can spread from patient to patient has spread globally and is becoming increasingly virulent, according to new research published today in the journal Science.

Our research should provide a degree of hope: now that we know the extent of the problem and are beginning to understand how the infection spreads, we can start to respond

Julian Parkhill

The study, led by the University of Cambridge and the Wellcome Trust Sanger Institute, also suggests that conventional cleaning will not be sufficient to eliminate the pathogen, which can be transmitted through contaminated surfaces or in the air.

Mycobacterium abscessus, a species of multidrug resistant mycobacteria, has recently emerged as a significant global threat to individuals with cystic fibrosis and other lung diseases. It can cause a severe pneumonia leading to accelerated inflammatory damage to the lungs, and may prevent safe lung transplantation. It is also extremely difficult to treat – fewer than one in three cases is treated successfully.

It was previously thought that patients acquired the infection from the environment and that transmission between patients never occurred. The research team had previously studied one specialist CF centre in the UK and identified genetic and epidemiological evidence suggesting person-to-person transmission of M. abscessus, but it was unclear whether this was a one-off incident.

Now, by sequencing the whole genomes of over 1,000 isolates of mycobacteria from 517 individuals attending CF specialist centres in Europe, the US and Australia, researchers have demonstrated that the majority of CF patients have acquired transmissible forms of M. abscessus that have spread globally. Further analysis suggests that the infection may be transmitted within hospitals via contaminated surfaces and through airborne transmission. This presents a potentially serious challenge to infection control practices in hospitals.

Using a combination of cell-based and mouse models, the researchers showed that the recently evolved mycobacteria were more virulent and likely to cause more serious disease in patients.

“This mycobacterium can cause very serious infections that are extremely challenging to treat, requiring combination treatment with multiple antibiotics for 18 months or longer,” says Professor Andres Floto from the Department of Medicine, University of Cambridge, and the Cambridge Centre for Lung Infection at Papworth Hospital NHS Foundation Trust. “The bug initially seems to have entered the patient population from the environment, but we think it has recently evolved to become capable of jumping from patient to patient, getting more virulent as it does so.”

Professor Julian Parkhill from the Wellcome Trust Sanger Institute at Hinxton, Cambridgeshire, adds: “Our research should provide a degree of hope: now that we know the extent of the problem and are beginning to understand how the infection spreads, we can start to respond. Our work has already helped inform infection control policies and provides the means to monitor the effectiveness of these.”

The Adult Cystic Fibrosis Centre at Papworth Hospital, Cambridgeshire, has led the development and implementation of new infection control policies to reduce the risk of transmission, now adopted across the UK and elsewhere. This study has also influenced the design of a new CF unit, due to open within the New Papworth Hospital on the Cambridge Biomedical Campus in 2018, which will incorporate a state-of-the-art air handling system.

One question that the researchers will now aim to answer is how the pathogen manages to spread globally. Their current study has shown that not only can it spread between individuals within specialist centres, but it has also been able to spread from continent to continent. The mechanism for this is unclear, but the researchers speculate that healthy individuals may be unwittingly carrying the mycobacteria between countries.

The sequencing data has also revealed potential new drug targets, and the team is now focused on working with other groups at the University of Cambridge and Colorado State University to develop these further.

Dr Janet Allen, Director of Strategic Innovation at the CF Trust, said: “This paper highlights the risks posed through transmission of multi-drug resistant organisms between people with cystic fibrosis. The team in Cambridge are a world authority in this area. This work demonstrates the global threat of this infection, the risks of cross-infection within and between CF centres, and the need for improved surveillance.  This study exemplifies the enormous impact of CF Trust-funded Strategic Research Centres, which were designed to generate world-class research with the very highest impact. Without the support of the CF community, this landmark study would not have been possible.”

Around one in 2,500 children in the UK is born with cystic fibrosis, a hereditary condition that causes the lungs to become clogged up with thick, sticky mucus. The condition tends to decrease life expectancy among patients.

The research was funded by the Wellcome Trust and the UK Cystic Fibrosis Trust.

Reference
Bryant, JM et al. Emergence and spread of a human transmissible multidrug-resistant nontuberculous mycobacterium. Science; 11 Nov 2016; DOI: 10.1126/science.aaf8156

 


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

World’s ‘Smallest Magnifying Glass’ Makes It Possible To See Individual Chemical Bonds Between Atoms


source: www.cam.ac.uk

Using the strange properties of tiny particles of gold, researchers have concentrated light down smaller than a single atom, letting them look at individual chemical bonds inside molecules, and opening up new ways to study light and matter.

Single gold atoms behave just like tiny metallic ball bearings in our experiments, with conducting electrons roaming around, which is very different from their quantum life.

Jeremy Baumberg

For centuries, scientists believed that light, like all waves, couldn’t be focused down smaller than its wavelength, just under a millionth of a metre. Now, researchers led by the University of Cambridge have created the world’s smallest magnifying glass, which focuses light a billion times more tightly, down to the scale of single atoms.

In collaboration with European colleagues, the team used highly conductive gold nanoparticles to make the world’s tiniest optical cavity, so small that only a single molecule can fit within it. The cavity – called a ‘pico-cavity’ by the researchers – consists of a bump in a gold nanostructure the size of a single atom, and confines light to less than a billionth of a metre. The results, reported in the journal Science, open up new ways to study the interaction of light and matter, including the possibility of making the molecules in the cavity undergo new sorts of chemical reactions, which could enable the development of entirely new types of sensors.

According to the researchers, building nanostructures with single atom control was extremely challenging. “We had to cool our samples to -260°C in order to freeze the scurrying gold atoms,” said Felix Benz, lead author of the study. The researchers shone laser light on the sample to build the pico-cavities, allowing them to watch single atom movement in real time.

“Our models suggested that individual atoms sticking out might act as tiny lightning rods, but focusing light instead of electricity,” said Professor Javier Aizpurua from the Center for Materials Physics in San Sebastian in Spain, who led the theoretical section of this work.

“Even single gold atoms behave just like tiny metallic ball bearings in our experiments, with conducting electrons roaming around, which is very different from their quantum life where electrons are bound to their nucleus,” said Professor Jeremy Baumberg of the NanoPhotonics Centre at Cambridge’s Cavendish Laboratory, who led the research.

The findings have the potential to open a whole new field of light-catalysed chemical reactions, allowing complex molecules to be built from smaller components. Additionally, there is the possibility of new opto-mechanical data storage devices, allowing information to be written and read by light and stored in the form of molecular vibrations.

The research is funded as part of a UK Engineering and Physical Sciences Research Council (EPSRC) investment in the Cambridge NanoPhotonics Centre, as well as the European Research Council (ERC) and the Winton Programme for the Physics of Sustainability, and supported by the Spanish Council for Research (CSIC) and the University of the Basque Country (UPV/EHU).

Reference:
Felix Benz et al. ‘Single-molecule optomechanics in ‘pico-cavities’.’ Science (2016). DOI: 10.1126/science.aah5243

Inset image: The presence of the sharp metal tip on a plasma sphere concentrates the electric field into its vicinity, initiating a spark. Credit: NanoPhotonics Cambridge


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Weight Loss Condition Provides Insight Into Failure of Cancer Immunotherapies


source: www.cam.ac.uk

A weight loss condition that affects patients with cancer has provided clues as to why cancer immunotherapy – a new approach to treating cancer by boosting a patient’s immune system – may fail in a substantial number of patients.

Cancer immunotherapy might completely transform how we treat cancer in the future – if we can make it work for more patients

Tobias Janowitz

Cancer immunotherapies involve activating a patient’s immune cells to recognise and destroy cancer cells. They have shown great promise in some cancers, but so far have only been effective in a minority of patients with cancer. The reasons behind these limitations are not clear.

Now, researchers at the Cancer Research UK Cambridge Institute at the University of Cambridge have found evidence that the mechanism behind a weight loss condition that affects patients with cancer could also be making immunotherapies ineffective. The condition, known as cancer cachexia, causes loss of appetite, weight loss and wasting in most patients with cancer towards the end of their lives. However, cachexia often starts to affect patients with certain cancers, such as pancreatic cancer, much earlier in the course of their disease.

In research published today in the journal Cell Metabolism, the scientists have shown in mice that even at the early stages of cancer development, before cachexia is apparent, a protein released by the cancer changes the way the body, in particular the liver, processes its own nutrient stores.

“The consequences of this alteration are revealed at times of reduced food intake, where this messaging protein renders the liver incapable of generating sources of energy that the rest of the body can use,” explains Thomas Flint, an MB/PhD student from the University of Cambridge School of Clinical Medicine and co-first author of the study. “This inability to generate energy sources triggers a second messaging process in the body – a hormonal response – that suppresses the immune cell reaction to cancers, and causes failure of anti-cancer immunotherapies.”

“Cancer immunotherapy might completely transform how we treat cancer in the future – if we can make it work for more patients,” says Dr Tobias Janowitz, Medical Oncologist and Academic Lecturer at the Department of Oncology at the University of Cambridge and co-first author. “Our work suggests that a combination therapy that either involves correction of the metabolic abnormalities, or that targets the resulting hormonal response, may protect the patient’s immune system and help make effective immunotherapy a reality for more patients.”

The next step for the team is to see how this discovery might be translated for the benefit of patients with cancer.

“If the phenomenon that we’ve described helps us to divide patients into likely responders and non-responders to immunotherapy, then we can use those findings in early stage clinical trials to get better information on the use of new immunotherapies,” says Professor Duncan Jodrell, director of the Early Phase Trials Team at the Cambridge Cancer Centre and co-author of the study.

“We need to do much more work in order to transform these results into safe, effective therapies for patients, however,” adds Professor Douglas Fearon, Emeritus Sheila Joan Smith Professor of Immunology at the University of Cambridge and the senior author, who is now also working at Cold Spring Harbor Laboratory and Weill Cornell Medical College. “Even so, the results raise the distinct possibility of future cancer therapies that are designed to target how the patient’s own body responds to cancer, with simultaneous benefit for reducing weight loss and boosting immunotherapy.”

The research was largely funded by Cancer Research UK, the Lustgarten Foundation, the Wellcome Trust and the Rosetrees Trust.

Nell Barrie, senior science information manager at Cancer Research UK, said: “Understanding the complicated biological processes at the heart of cancer is crucial for tackling the disease – and this study sheds light on why many cancer patients suffer from both loss of weight and appetite, and how their immune systems are affected by this process. Although this research is in its early stages, it has the potential to help make a difference on both fronts – helping treat weight loss and also improving treatments that boost the power of the immune system to destroy cancer cells.”

Reference
Flint, TR et al. Tumor-Induced IL-6 Reprograms Host Metabolism to Suppress Anti-tumor Immunity. Cell Metabolism; 8 Nov 2016; DOI: 10.1016/j.cmet.2016.10.010


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


Uber Confirms Cambridge Launch

source: http://www.businessweekly.co.uk

California ride-hailing business Uber has officially added the fast-growing Cambridge UK science & technology cluster to its map.

A communication from the company confirms: “We are ready to press the GO button and add Cambridge to the Uber map! Thousands of riders in Cambridge have already downloaded the Uber app and are waiting for awesome partner-drivers to pick them up.”

Uber has been inviting potential partners to private information sessions this week to explain how Uber works, how to get started on the platform and what rewards are on offer.

The company is anchoring its Cambridge operation at Vision Park in Histon, having received an operating licence from the city council.

The San Francisco company’s mobile app allows consumers with smartphones to submit a trip request, which a software program automatically sends to the Uber driver nearest to the consumer, alerting the driver to the customer’s location.
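
As a rough illustration of the dispatch step described above – matching a trip request to the nearest available driver – the Python sketch below shows a minimal nearest-neighbour lookup. It is purely illustrative: the driver names, coordinates and straight-line distance metric are assumptions for the example, not a description of Uber’s actual system, which would use road distance or estimated arrival time.

```python
from dataclasses import dataclass
from math import hypot

@dataclass
class Driver:
    name: str
    x: float  # illustrative map coordinates, not real positions
    y: float

def nearest_driver(drivers, rider_x, rider_y):
    """Return the available driver closest to the rider, by straight-line distance."""
    return min(drivers, key=lambda d: hypot(d.x - rider_x, d.y - rider_y))

# Hypothetical example: three available drivers and one trip request
available = [Driver("A", 0.0, 0.0), Driver("B", 2.0, 1.0), Driver("C", 0.5, 0.5)]
print(nearest_driver(available, 0.4, 0.6).name)  # -> C
```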

Uber drivers use their own personal cars and by the height of summer the service was available in more than 66 countries and 507 cities worldwide.

Leading UK trade unions are monitoring Uber’s activities following the landmark employment tribunal ruling that Uber drivers are company employees, not self-employed contractors, and so must receive the minimum wage.

Uber’s arrival in Cambridge is set to provide a fresh spin to the city’s transport revolution following last year’s arrival of Gett, the global taxi app that works exclusively with licensed Hackney carriages and black cabs across the UK.

Following success in London and increased demand for services in outlying areas, Gett pinpointed Cambridge as a key location in the company’s rapid growth and recruitment drive across Britain. It claimed 1,000 new drivers were joining Gett each month.

Uber’s launch following Gett provides a major challenge to existing cab companies and could trigger a price war – to the benefit of a population growing after the arrival of a record number of global tech giants from the US, Europe and Asia.

Brexit: High Court Ruling On Article 50 Explained


source: www.cam.ac.uk

In a landmark constitutional judgment handed down today, the High Court has put a stumbling block in the way of the Prime Minister’s plan to trigger Article 50 by the end of March 2017. Professor Kenneth Armstrong from the Centre for European Legal Studies goes through the ruling.

For MPs and Lords, this is a chance to try and get the Government to reveal more of its Brexit negotiating position

Kenneth Armstrong

“For some, today’s ruling is a victory for parliamentary democracy. For others, unelected judges stand in the way of the UK’s withdrawal from the EU. If the Supreme Court gets the final say, voters may still wonder whether their voice matters at all,” says Kenneth Armstrong, Professor of European Law from Cambridge’s CELS.

For Armstrong, the key aspects of the judgment are:

  • It is impermissible for the Prime Minister to invoke the Royal Prerogative[i] as legal authority for a notification to be sent under Article 50 of the Treaty on European Union, to begin the process of withdrawing the UK from the EU.
  • The effect of withdrawal will be to remove or limit the rights created by EU law and which are given effect in UK law via the European Communities Act 1972.
  • Neither on an interpretation of the European Communities Act, nor as a matter of constitutional principle, can the Executive by Royal Prerogative alone remove or limit rights protected in domestic law.
  • The Referendum Act 2015 – in formal legal terms – only made provision for an advisory referendum. It did not give statutory authority for the triggering of Article 50.

“The High Court was asked by the claimants to limit the power of Theresa May to begin the Article 50 withdrawal process. Requiring the Prime Minister to obtain legislative authorisation from Parliament was contested by the Government on the basis that the electorate had given the Government a clear instruction to leave the EU in the referendum held on 23 June 2016,” says Armstrong.

“However, while the outcome of the referendum has given the Government a political mandate to withdraw from the EU, the legal power to notify must be exercised within legal limits. The High Court has concluded that where an exercise of the Royal Prerogative would remove legal rights, derived from EU law but made available in domestic law by Parliament through the European Communities Act, only Parliament can legislate for such rights to be removed.”

The case was brought by Gina Miller, an investment manager (the ‘lead claimant’) together with Mr Deir Dos Santos, a hairdresser, both UK citizens resident in the UK. Their claim was supported by a crowd-funded claim – the so-called ‘People’s Challenge’ – in the name of Graeme Pigney and other UK citizens resident in different parts of the UK and in other EU states.

An appeal by the Government to the UK Supreme Court is likely, says Armstrong. Following the recent ruling of the High Court in Belfast dismissing claims that the Prime Minister’s power to trigger Article 50 was limited by the terms of the Northern Ireland Act 1998 and the Belfast ‘Good Friday’ Agreement, the solicitor for Raymond McCord – whose son was murdered by paramilitaries and who is one of the claimants – has also indicated that his case will be appealed to the UK Supreme Court.

“Before today’s ruling there had been some suggestion that if the judgment had gone against the Government it may not have continued with an appeal to the Supreme Court,” says Armstrong. “Today’s judgment and the decision to appeal the Belfast High Court judgment have made it more likely that the Supreme Court will have the final say.”

The claimants argued that the Article 50 process, once triggered, was irrevocable. In a recent interview with the BBC, Lord Kerr of Kinlochard – credited with authoring the text of what is now Article 50 – said that the process was revocable. As a question of EU law this could require an authoritative interpretation by the EU’s top court, the European Court of Justice.

“The High Court seems to have accepted the idea – widely contested by others – that the Article 50 process is irrevocable. As a question of the interpretation of EU law, this could give rise to a request for an interpretation of Article 50 from the Court of Justice,” says Armstrong.

“But the Supreme Court may try and avoid asking the European Court for a ruling, not just because it will delay matters, but also because some will object to the idea that a European court will have a say.”

The Prime Minister had already conceded that there would be a parliamentary debate without a vote. And Parliament will have to give its approval at the end of the negotiations before any formal agreement between the EU and the UK can be ratified. Following the High Court’s ruling, Parliament will have a vote on authorising the Prime Minister to trigger Article 50.

Adds Armstrong: “The problem for the Government is that it will want the legislative authorisation from Parliament to be quick and wholly procedural. For MPs and Lords, however, this is a chance to try and get the Government to reveal more of its Brexit negotiating position. It would be a constitutional crisis for Parliament to refuse to authorise notification and to ignore the result of the referendum. This limits how far Parliamentarians can push their demands.”

 

[i] The Royal Prerogative refers to one of the sources of legal authority accepted by the courts through which the Crown and Ministers of the Crown may take decisions. In modern times, these prerogative powers are typically exercised by government ministers but over time, they have been removed or limited by Acts of Parliament which instead provide the legal authority for ministers to act. The power to make and ratify treaties falls within the Royal Prerogative.


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Ectoplasm, Spirit Trumpets and Paintings From Pompeii: 600 Years of Curious Objects


source: www.cam.ac.uk

Why does one of the world’s great research libraries have ‘ectoplasm’, a spirit trumpet and beard hair posted to Charles Darwin among its eight million books, manuscripts and digital collections?

We’ve opened cupboards and found wall paintings from Pompeii.

Jill Whitelock

The answers lie in the second major exhibition of Cambridge University Library’s 600th anniversary – Curious Objects – which puts on display a collection of curiosities that has been centuries in the making.

Opening to the public on November 3, and following on from the hugely successful Lines of Thought, Curious Objects features exhibits covering all corners of the globe and every era of human history, from the Stone Age to the Space Age.

Research for the exhibition has turned up new and rediscovered finds – including the oldest objects in the Library, two black-topped redware pots from Predynastic Egypt, and the oldest written artefact, a Sumerian clay tablet from around 2200 BCE.

As one of only six Legal Deposit libraries in the UK and Ireland, Cambridge University Library has been entitled to a copy of every UK publication since 1710. But it also predates the era of most modern museums and collections, meaning that over the centuries, it has been a depository for all manner of objects, all of which have a part to play in telling the story of one of the world’s greatest libraries.

Among the curious objects going on display are:

  • ‘Ectoplasm’ captured during a séance by Helen Duncan (circa 1950) – the last person to be imprisoned under the Witchcraft Act of 1735
  • Stone Age tools from Northern Nigeria
  • A Predynastic Egyptian drinking vessel
  • Fragments of wall paintings from Pompeii (circa 20 BCE-79 CE)
  • A pocket globe (1775) tracing Captain Cook’s first voyage
  • A spirit trumpet for use at séances
  • A Shakespeare tobacco stopper
  • Beard and scalp hair posted to Charles Darwin – as a counterargument to claims Darwin made in Descent of Man
  • A Soyuz space badge, cigarettes and food packaging from the Cold War-era Soviet Union

“Shabby and beautiful, quirky and controversial, all the objects on display in our new exhibition provoke our curiosity and prompt questions about the nature of libraries – past, present and future,” said Professor Christopher Young, Acting University Librarian.

“Over 600 years, Cambridge University Library has revealed the story of the world around us and the universe beyond – not only through its printed and manuscript treasures, but through this unique and wonderful ‘cabinet of curiosities’ that opens a window onto the nature of collecting.”

As well as the objects listed above, because of its Legal Deposit status, the University Library also has a significant collection of children’s toys, board games and models – often supplied with children’s books and magazines – which continue to arrive at the Library every week.

Cambridge University Library is home to the archive of the Society for Psychical Research, on deposit since 1989 and including the ‘ectoplasm’ and spirit trumpet among a number of artefacts. The Library also holds the collections of the Royal Commonwealth Society, a treasure-trove of information on the Commonwealth and Britain’s former colonial territories, containing some forty objects in addition to more than 300,000 printed items, about 800 archival collections and over 120,000 photographs.

“Our curiosity has been rewarded with some exciting finds,” said Dr Jill Whitelock, Head of Special Collections and Lead Curator. “We’ve opened cupboards and found wall paintings from Pompeii, opened a box of medals and found an ancient clay tablet carefully wrapped in tissue paper. It’s wonderful to think that after 600 years there’s still so much to explore in the Library. We hope visitors to the exhibition will enjoy discovering our curious objects too – where else can you see ‘ectoplasm’ alongside Egyptian artefacts?”

Curious Objects runs from November 3, 2016 to March 21, 2017. Admission is free.


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Brexit: Listen To Experts From Cambridge and Beyond Discuss How, Why and What Next For Brexit Britain


source: www.cam.ac.uk

Listen to some of the talks that were given as part of the University’s ‘Brexit Week’ series, which took place from 18 to 22 October.

The University of Cambridge recently held a week-long series of Brexit talks and discussions, featuring senior experts in law, politics, history, science and economics from Cambridge and beyond.

The aim was to engage both University students and the local community in debates on how Britain moves towards departure from the European Union in the wake of June’s referendum.

You can listen to some of the talks below, or download from iTunesU here.

 

How Did We Get Here?

Tuesday 18th October

Robert Tombs, Professor of Modern European History at Cambridge’s Faculty of History

Robert Tombs is the author of a recent epic history of England, and a renowned expert on nineteenth-century French political history and the relationship between the French and the British. During the EU Referendum campaign, he was a signatory on a letter produced by ‘Historians for Britain’, which supported a Leave vote, and has written about the future of the UK post-Brexit.

Dr Victoria Bateman, Fellow and College Lecturer in Economics at Gonville & Caius College, Cambridge

Victoria Bateman is an economic historian at Cambridge, and a Fellow at the Legatum Institute think tank. Her current research focuses on the European economy from early-modern times to the present. Victoria has called for a sexual revolution in economics due to a lack of women in the discipline, and wrote articles in favour of a Remain vote in the run-up to the EU Referendum. She tweets at @vnbateman.

Dr Chris Bickerton, University Lecturer in Politics at POLIS and Official Fellow at Queens’ College, Cambridge

Chris Bickerton’s research focuses on the dynamics of state transformation and the challenges facing representative democracy in Europe. He has written a recently published book called The European Union: A Citizen’s Guide. During the run-up to the EU Referendum, Chris wrote in favour of a Leave vote, making the left-wing case for Brexit. He tweets at @cjbickerton.

 

Key Issues for the UK and EU Post-Brexit

Wednesday 19th October

Coen Teulings, Professor of Labour Economics and Industrial Relations at Cambridge’s Faculty of Economics

As well as holding the Montague Burton Chair at Cambridge, Coen Teulings is a Professor of Economics at the University of Amsterdam. He has written extensively about wages and income inequality, and spent seven years as the Director of the Central Planning Bureau — the Netherlands’ official economic forecasting agency. He has talked publicly about the risks posed by Brexit to free trade.

Athene Donald, Professor of Experimental Physics at Cambridge’s Cavendish Laboratory and Master of Churchill College

Athene Donald has served on the University’s Council and as its gender equality champion. She was appointed a Dame Commander of the British Empire in 2010, and Master of Churchill College in 2013. Athene wrote and talked extensively on the dangers that a Leave vote posed for UK science during the run-up to the EU Referendum. She is a regular blogger, and tweets at @AtheneDonald.

Charles Clarke, former Home Secretary

Charles Clarke is a Visiting Professor at the Policy Institute at King’s College London. He was MP for Norwich South from 1997 to 2010, and served as Home Secretary between 2004 and 2006 in Tony Blair’s Labour Government. During the run-up to the EU Referendum, Charles co-authored a report warning that intelligence relationships would be damaged by a Leave vote.

 

Process and Politics of the UK Leaving the EU

Thursday 20th October

David Runciman, Professor of Politics and Head of Department at POLIS and Fellow at Trinity Hall, Cambridge

David Runciman’s current research projects include the Leverhulme-funded Conspiracy and Democracy project and Future of Intelligence centre. In 2013, he published the book The Confidence Trap, a history of democratic crises since WWI. David hosts the weekly podcast Talking Politics from his Cambridge office, and has written that the Referendum vote shone a light on the education divide in democracy.

Mark Elliott, Professor of Public Law at the Faculty of Law, and Fellow at St Catharine’s College, Cambridge

Mark Elliott has written a number of books on public law, and is Legal Adviser to the House of Lords Constitution Committee. Mark writes a highly regarded blog called Public Law for Everyone, on which he analyses many of the legal issues surrounding the triggering of Article 50 and Theresa May’s Great Repeal Bill. Mark tweets at @ProfMarkElliott, and the slides from this talk are available at his blog.

 

Global Britain? The Future of British Trade after Brexit

Thursday 20th October

The Rt. Hon. Greg Hands MP, Minister of State in the Department for International Trade, delivered this year’s Alcuin Lecture at Cambridge’s Department of Politics and International Studies (POLIS). Greg was appointed to his current position by Theresa May in July 2016, where he serves as number two to Secretary of State Liam Fox. He tweets at @GregHands.

 

The UK and Brexit: How, Why and Where Now?

Friday 21st October

Matthew Elliott, Head of Vote Leave

Matthew Elliott is the former Chief Executive of the Vote Leave campaign. He is now Editor-at-Large of BrexitCentral, recently launched with the aim of “promoting a positive vision of Britain after Brexit”. He was a founder and former Chief Executive of the political think tank The TaxPayers’ Alliance. Matthew tweets at @matthew_elliott.

Catherine Barnard, Professor of European Union Law and Employment Law at the Faculty of Law, and Fellow of Trinity College, Cambridge

Catherine Barnard is a leading expert on EU internal markets and employment law, publishing extensively in these fields. She is a Senior Fellow of the ESRC’s UK in a Changing Europe initiative, and is jointly leading the EU Migrant Worker research project. Catherine regularly commented in the media during and after the EU Referendum. She has recently written that there could be free movement of workers in any Brexit deal. Catherine tweets at @CSBarnard24.

Jonathan Portes, Principal Research Fellow at the National Institute of Economic and Social Research

In addition to his role at the NIESR, Jonathan Portes is also a Senior Fellow of the UK in a Changing Europe initiative. Previously, he served as Chief Economist at the Cabinet Office. Jonathan’s new book, 50 Capitalism Ideas You Really Need to Know, has just been published. During the run-up to the EU Referendum, he wrote on the misrepresentation of migration by sections of the media. Jonathan tweets at @jdportes.

Anand Menon, Professor of European Politics and Foreign Affairs at King’s College London

Anand Menon is the Director of the UK in a Changing Europe initiative, and has written widely on many aspects of EU politics and policy and on UK-EU relations. As part of the initiative, he recently led on a report suggesting that “Brexit has the potential to test the UK’s constitutional settlement, legal framework, political process and bureaucratic capacities to their limits”. Anand tweets at @anandMenon1.


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Elephant Poaching Costs African Economies US $25 Million Per Year In Lost Tourism Revenue


source: www.cam.ac.uk

New research shows investing in elephant conservation is smart economic policy for many African countries.

We know that within parks, tourism suffers when elephant poaching ramps up. This work provides a first estimate of the scale of that loss

Andrew Balmford

The current elephant poaching crisis costs African countries around US $25 million annually in lost tourism revenue, according to a new study published in the journal Nature Communications. Comparing this lost revenue with the cost of halting declines in elephant populations due to poaching, the study determines that investment in elephant conservation is economically favorable across the majority of African elephants’ range.

The research, undertaken by scientists from World Wildlife Fund (WWF), the University of Vermont, and the University of Cambridge, represents the first continent-wide assessment of the economic losses that the current elephant poaching surge is inflicting on nature-based tourism economies in Africa.

“While there have always been strong moral and ethical reasons for conserving elephants, not everyone shares this viewpoint.  Our research now shows that investing in elephant conservation is actually smart economic policy for many African countries,” said Dr. Robin Naidoo, lead wildlife scientist at WWF and lead author on the study.

Poachers kill between 20,000 and 30,000 African elephants each year for the illegal ivory trade, funded by global organized crime syndicates and fueled largely by demand in China and elsewhere in Asia. In just the past ten years, Africa’s elephants have declined by more than 20 percent.

“We know that within parks, tourism suffers when elephant poaching ramps up. This work provides a first estimate of the scale of that loss, and shows pretty convincingly that stronger conservation efforts usually make sound economic sense even when looking at just this one benefit stream,” said study co-author Professor Andrew Balmford, from the University of Cambridge’s Department of Zoology.

The research shows that tourism revenue lost to the current poaching crisis exceeds the anti-poaching costs necessary to stop the decline of elephants in east, southern, and west Africa. Rates of return on elephant conservation in these regions are positive, signaling strong economic incentive for countries to protect elephant populations.

“The average rate of return on elephant conservation in east, west, and south Africa compares favorably with rates of return on investments in areas like education, food security and electricity,” said Dr. Brendan Fisher, an economist at University of Vermont’s Gund Institute for Ecological Economics. “For example, for every dollar invested in protecting elephants in East Africa, you get about $1.78 back. That’s a great deal.”

However, for countries in central Africa, the study finds that elephant-based tourism cannot currently be expected to contribute substantially to elephant conservation. In these remote, forested areas where tourism levels are lower and elephants are typically more difficult to see, different mechanisms will be necessary to halt elephant declines.

Taken from a WWF press release. 



Fossilised Dinosaur Brain Tissue Identified For The First Time

Fossilised dinosaur brain tissue identified for the first time

source: www.cam.ac.uk

Researchers have identified the first known example of fossilised brain tissue in a dinosaur from Sussex. The tissues resemble those seen in modern crocodiles and birds.

The chances of preserving brain tissue are incredibly small, so the discovery of this specimen is astonishing.

Alex Liu

An unassuming brown pebble, found more than a decade ago by a fossil hunter in Sussex, has been confirmed as the first example of fossilised brain tissue from a dinosaur.

The fossil, most likely from a species closely related to Iguanodon, displays distinct similarities to the brains of modern-day crocodiles and birds. Meninges – the tough tissues surrounding the actual brain – as well as tiny capillaries and portions of adjacent cortical tissues have been preserved as mineralised ‘ghosts’.

The results are reported in a Special Publication of the Geological Society of London, published in tribute to Professor Martin Brasier of the University of Oxford, who died in 2014. Brasier and Dr David Norman from the University of Cambridge co-ordinated the research into this particular fossil during the years prior to Brasier’s untimely death in a road traffic accident.

The fossilised brain, found by fossil hunter Jamie Hiscocks near Bexhill in Sussex in 2004, is most likely from a species similar to Iguanodon: a large herbivorous dinosaur that lived during the Early Cretaceous Period, about 133 million years ago.

Finding fossilised soft tissue, especially brain tissue, is very rare, which makes understanding the evolutionary history of such tissue difficult. “The chances of preserving brain tissue are incredibly small, so the discovery of this specimen is astonishing,” said co-author Dr Alex Liu of Cambridge’s Department of Earth Sciences, who was one of Brasier’s PhD students in Oxford at the time that studies of the fossil began.

According to the researchers, the reason this particular piece of brain tissue has been so well-preserved is that the dinosaur’s brain was essentially ‘pickled’ in a highly acidic and low-oxygen body of water – similar to a bog or swamp – shortly after its death. This allowed the soft tissues to become mineralised before they decayed away completely, so that they could be preserved.

“What we think happened is that this particular dinosaur died in or near a body of water, and its head ended up partially buried in the sediment at the bottom,” said Norman. “Since the water had little oxygen and was very acidic, the soft tissues of the brain were likely preserved and cast before the rest of its body was buried in the sediment.”

Working with colleagues from the University of Western Australia, the researchers used scanning electron microscope (SEM) techniques in order to identify the tough membranes, or meninges, that surrounded the brain itself, as well as strands of collagen and blood vessels. Structures that could represent tissues from the brain cortex (its outer layer of neural tissue), interwoven with delicate capillaries, also appear to be present. The structure of the fossilised brain, and in particular that of the meninges, shows similarities with the brains of modern-day descendants of dinosaurs, namely birds and crocodiles.

In typical reptiles, the brain has the shape of a sausage, surrounded by a dense region of blood vessels and thin-walled vascular chambers (sinuses) that serve as a blood drainage system. The brain itself only takes up about half of the space within the cranial cavity.

In contrast, the tissue in the fossilised brain appears to have been pressed directly against the skull, raising the possibility that some dinosaurs had large brains which filled much more of the cranial cavity. However, the researchers caution against drawing any conclusions about the intelligence of dinosaurs from this particular fossil, and say that it is most likely that during death and burial the head of this dinosaur became overturned, so that as the brain decayed, gravity caused it to collapse and become pressed against the bony roof of the cavity.

“As we can’t see the lobes of the brain itself, we can’t say for sure how big this dinosaur’s brain was,” said Norman. “Of course, it’s entirely possible that dinosaurs had bigger brains than we give them credit for, but we can’t tell from this specimen alone. What’s truly remarkable is that conditions were just right in order to allow preservation of the brain tissue – hopefully this is the first of many such discoveries.”

“I have always believed I had something special. I noticed there was something odd about the preservation, and soft tissue preservation did go through my mind. Martin realised its potential significance right at the beginning, but it wasn’t until years later that its true significance came to be realised,” said paper co-author Jamie Hiscocks, the man who discovered the specimen. “In his initial email to me, Martin asked if I’d ever heard of dinosaur brain cells being preserved in the fossil record. I knew exactly what he was getting at. I was amazed to hear this coming from a world renowned expert like him.”

The research was funded in part by the Natural Environment Research Council (NERC) and Christ’s College, Cambridge.

Reference:
Martin D. Brasier et al. ‘Remarkable preservation of brain tissues in an Early Cretaceous iguanodontian dinosaur.’ Earth System Evolution and Early Life: a Celebration of the Work of Martin Brasier. Geological Society, London, Special Publications, 448 (2016). DOI: 10.1144/SP448.3



Facebook Updates Could Provide a Window To Understanding – and Treating – Mental Health Disorders

Facebook updates could provide a window to understanding – and treating – mental health disorders

source: www.cam.ac.uk

Our Facebook status updates, ‘likes’ and even photos could help researchers better understand mental health disorders with the right ethical safeguards, argue researchers from the University of Cambridge, who suggest that social networks may even be used in future to provide support and interventions, particularly among young people.

Facebook is hugely popular and could provide us with a wealth of data to improve our knowledge of mental health disorders such as depression and schizophrenia

Becky Inkster

Over a billion people worldwide use Facebook daily – one in seven of the global population – and social media use is increasing at three times the rate of other internet use. Evidence suggests that 92% of adolescents use the site daily and disclose considerably more about themselves online than offline.

Writing in today’s edition of Lancet Psychiatry, researchers from the University of Cambridge discuss how social networking sites might be harnessed to provide data to help further our understanding of the onset and early years of mental illness.

“Facebook is hugely popular and could provide us with a wealth of data to improve our knowledge of mental health disorders such as depression and schizophrenia,” says Dr Becky Inkster, the study’s lead-author, from the Department of Psychiatry. “Its reach is particularly broad, too, stretching across the digital divide to traditionally hard-to-reach groups including homeless youth, immigrants, people with mental health problems, and seniors.”

Dr Inkster and her colleagues argue that Facebook might be used to help improve the detection of mental health factors. Dr Michal Kosinski, co-author from Stanford Graduate Business School, adds that Facebook data tends to be more reliable than offline self-reported information, while still reflecting an individual’s offline behaviours. It also enables researchers to measure content that is difficult to assess offline, such as conversation intensity, and to reach sample sizes previously unobtainable.

Status updates, shares and likes can provide a wealth of information about users, they say. A previous study of 200 US college students over the age of 18 years found that one in four posted status updates showing depressive-like symptoms. By analysing the language, emotions and topics used in status updates, the researchers say that it may be possible to look for symptoms or early signs of mental illness. Even photographs might provide new insights; Facebook is the world’s largest photo sharing website, with some 350 million photos uploaded daily, and automated picture analysis of emotional facial expressions might offer unique representations of offline behaviours.
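
To make the idea of analysing the language of status updates more concrete, here is a deliberately simple sketch of keyword-based mood screening in Python. The word list, the helper function and the example updates are hypothetical and invented purely for illustration; the studies cited above use far richer linguistic and emotional analyses than simple word counting.

```python
# A toy sketch of language analysis over status updates: counting occurrences
# of mood-related words. Word list and example updates are invented; this is
# NOT the method used in the published research.
import re
from collections import Counter

NEGATIVE_MOOD_WORDS = {"sad", "alone", "hopeless", "tired", "worthless", "empty"}  # hypothetical

def mood_word_counts(status_updates):
    """Count negative-mood words across a user's status updates."""
    counts = Counter()
    for update in status_updates:
        for word in re.findall(r"[a-z']+", update.lower()):
            if word in NEGATIVE_MOOD_WORDS:
                counts[word] += 1
    return counts

updates = ["Feeling so alone tonight", "Great day out with friends!", "Tired and empty again"]
print(mood_word_counts(updates))   # Counter({'alone': 1, 'tired': 1, 'empty': 1})
```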

Studies have shown that social networks can have both positive and negative effects on users’ emotions. Being ‘unfriended’ can elicit negative emotions, but even an individual’s News Feed, which reports what their friends are up to, can affect their mood: one study found that a reduction in the amount of positive content displayed by friends led to an increase in negative status updates by users, and vice-versa. Other research has shown that some people with mental health disorders report positive experiences of social media, suggesting that Facebook might be harnessed to offer people support. People with schizophrenia and psychosis, for example, have reported that social networking sites helped them socialise and did not worsen their symptoms.

The researchers suggest that therapies based on users’ Facebook pictures and timelines could be trialled as a way of using online social networks to support individuals. This might assist with accessing autobiographical memories, which can be impaired in conditions such as depression, and with improving cognition and mood in older patients, similar to offline therapies for early dementia.

“Facebook relationships may help those with reduced self-esteem and provide companionship for individuals who are socially isolated,” says Dr Becky Inkster. “We know that socially isolated adolescents are more likely to suffer from depression and suicidal thoughts, so these online stepping stones could encourage patients to reform offline social connections.”

These online – potentially leading to offline – social connections can provide support for vulnerable individuals such as homeless youth, a population at increased risk of mental health problems. Research has shown that this support is associated with a reduction in their alcohol intake and a decrease in depression-like symptoms. An advantage of social networking sites, especially Facebook, over virtual patient communities is that people naturally use them in their daily lives, which addresses concerns about the limited duration of participation in virtual communities.

Early detection of digital warning signs could enhance mental health service contact and improve service provision, the researchers say. Facebook already allows users who are worried about a friend’s risk of suicide to report the post, for example. However, the use of social networking sites in the context of mental health and young people raises potential ethical issues. Vulnerable individuals will need to understand fully what participation in psychiatric research and mental health-care practice involves, and their consent will need to be monitored throughout the various stages of their illness.

“People are uneasy at the idea of having their social media monitored and their privacy infringed upon, so this is something that will need to be handled carefully,” says co-author Dr David Stillwell from the Cambridge Judge Business School. “To see this, we only have to look at the recent furore that led to the abrupt suspension of the Samaritans’ Radar Twitter app, which with the best of intentions enabled users to monitor their friends’ Twitter activity for suicidal messages.”

Much of this research is still in its infancy and evidence is often anecdotal or insufficient, argue the team. Several issues need addressing, such as whether using social media might interfere with certain illnesses or symptoms more than others – such as digital surveillance-based paranoid themes – and to ensure confidentiality and data protection rights for vulnerable people. But they are optimistic about its potential uses.

“Although it isn’t clear yet how social networking sites might best be used to improve mental health care, they hold considerable promise for having profound implications that could revolutionise mental healthcare,” says Dr Inkster.

Reference
Becky Inkster, David Stillwell, Michal Kosinski, Peter Jones. A decade into Facebook: where is psychiatry in the digital age? Lancet Psychiatry; 27 Oct 2016; DOI: 10.1016/S2215-0366(16)30041-4



Potential New Treatment For Haemophilia Developed By Cambridge Researchers

Potential new treatment for haemophilia developed by Cambridge researchers

source: www.cam.ac.uk

A new treatment that might one day help all patients with haemophilia, including those that become resistant to existing therapies, has been developed by researchers at the University of Cambridge.

Within three years, we hope to be conducting our first-in-man trials

Trevor Baglin

Around 400,000 individuals around the world are affected by haemophilia, a genetic disorder that causes uncontrolled bleeding. Haemophilia is the result of a deficiency in proteins required for normal blood clotting – factor VIII for haemophilia A and factor IX for haemophilia B. Currently, the standard treatment is administration of the missing clotting factor. However, this requires regular intravenous injections, is not fully effective, and in about a third of patients results in the development of inhibitory antibodies. Nearly three-quarters of haemophilia sufferers have no access to treatment and have a life-expectancy of only 10 years.

In a study published online today in Blood, the Journal of the American Society of Hematology, researchers report on a novel approach that gives the clotting process more time to produce thrombin, the enzyme that forms blood clots.  They suggest this treatment could one day help all patients with haemophilia, including those who develop antibodies against standard therapy. The therapy is based on observations relating to a disorder associated with excessive clotting, known as factor V Leiden.

“We know that patients who have haemophilia and also have mutations that increase clotting, such as factor V Leiden, experience less-severe bleeding,” says study co-author Dr Trevor Baglin, Consultant Haematologist at Addenbrooke’s Hospital, Cambridge University Hospitals.

Dr Baglin and colleagues therefore pursued a strategy of reducing the activity of an anticoagulant enzyme, known as activated protein C (APC). The principal function of APC is to break down the complex that makes thrombin, and the factor V Leiden mutation slows this process. The team, led by Professor Jim Huntington, exploited this insight by developing a specific inhibitor of APC based on a particular type of molecule known as a serpin.

“We hypothesized that if we targeted the protein C pathway we could prolong thrombin production and thereby induce clotting in people with clotting defects, such as haemophilia sufferers,” says Professor Huntington, from the Cambridge Institute for Medical Research at the University of Cambridge. “So, we engineered a serpin that could selectively prevent APC from shutting down thrombin production before the formation of a stable clot.”

To test their theory, the team administered the serpin to mice with haemophilia B and clipped their tails. The researchers found that the amount of blood loss decreased as the dose increased, with the highest dose reducing bleeding to the level found in normal mice. Further studies confirmed that the serpin helped haemophilia mice form stable clots, with higher doses resulting in faster clot formation. The serpin was also able to increase thrombin production and accelerate clot formation when added to blood samples from haemophilia A patients.

“It’s our understanding that because we are targeting a general anti-clotting process, our serpin could effectively treat patients with either haemophilia A or B, including those who develop resistance to more traditional therapy,” adds Professor Huntington. “Additionally, we have focused on engineering the serpin to be long-acting and to be delivered by injection under the skin instead of directly into veins. This will free patients from the inconvenience of having to receive infusions three times a week, as is the case with current treatments.”

The research team hopes that the discovery can be rapidly developed into an approved medicine to provide improved care to haemophilia sufferers around the world.

“Within three years, we hope to be conducting our first-in-man trials of a subcutaneously-administered form of our serpin,” says Dr Baglin. “It is important to remember that the majority of people in the world with haemophilia have no access to therapy. A stable, easily-administered, long-acting, effective drug could bring treatment to a great many more haemophilia sufferers.”

This study forms part of a patent application, filed in the name of Cambridge Enterprise, and the modified serpin is being developed by a start-up company, ApcinteX, with funding from Medicxi.

Adapted from a press release by American Society of Hematology.

Reference
Polderdijk, SGI et al. Design and characterization of an APC-specific serpin for the treatment of haemophilia. Blood; 27 Oct 2016; DOI: 10.1182/blood-2016-05-718635



Top Ten Universities Conduct a Third of All UK Animal Research

Top ten universities conduct a third of all UK animal research

source: www.cam.ac.uk

The ten UK universities that conduct the most world-leading biomedical research have announced their animal research statistics, revealing that they collectively conducted a third of all UK animal research in 2015.

The top ten institutions conduct more than two thirds of all UK university animal research between them, completing a combined total of 1.37 million procedures. Over 99% of these procedures were carried out on rodents or fish, and in line with national data they were roughly evenly split between experiments and the breeding of genetically modified animals.

The ten universities are listed below alongside the total number of procedures that they carried out in 2015; a short tally of these figures follows the list. Each institution’s name links to a breakdown of its individual animal research statistics.

University of Oxford:             226,214
University of Edinburgh:        212,695
UCL:                                     202,554
University of Cambridge:       181,080
King’s College London:         175,296
University of Manchester:      145,457
Imperial College London:       101,179
University of Glasgow:           49,082
University of Birmingham:      47,657
University of Nottingham:       31,689
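
As a quick check on the figures quoted above, the Python sketch below tallies the listed procedure counts (they sum to the combined total of roughly 1.37 million) and works out each institution’s share of the top-ten total. The procedure numbers are exactly those given in the list; everything else is illustrative.

```python
# Quick tally of the procedure counts listed above (numbers as given in the article).
procedures = {
    "University of Oxford": 226_214,
    "University of Edinburgh": 212_695,
    "UCL": 202_554,
    "University of Cambridge": 181_080,
    "King's College London": 175_296,
    "University of Manchester": 145_457,
    "Imperial College London": 101_179,
    "University of Glasgow": 49_082,
    "University of Birmingham": 47_657,
    "University of Nottingham": 31_689,
}

total = sum(procedures.values())
print(f"Combined total: {total:,}")   # 1,372,903, i.e. the ~1.37 million quoted above
for name, n in procedures.items():
    print(f"{name:28s} {n:>8,}  ({n / total:5.1%} of the top-ten total)")
```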

The universities employ more than 90,000 staff between them, and, as you would expect, the larger institutions tend to conduct the most animal research. All universities are committed to the ‘3Rs’ of replacement, reduction and refinement. This means avoiding or replacing the use of animals where possible, minimising the number of animals used per experiment and minimising suffering to improve animal welfare. However, as universities expand and conduct more research, the total number of animals used can rise even if fewer animals are used per study.

“The fact that we perform a significant proportion of the UK’s leading biomedical research is something to be proud of,” says Professor Michael Arthur, UCL President & Provost. “It’s no surprise that the universities who conduct the most world-leading research also use the most animals; despite advances in non-animal techniques, animals offer answers to many research questions that alternative methods cannot yet provide.”

All ten universities are signatories to the Concordat on Openness on Animal Research in the UK, a commitment to be more open about the use of animals in scientific, medical and veterinary research in the UK. 107 organisations have signed the concordat including UK universities, charities, research funders and commercial research organisations.

Animal research has played a key role in the development of virtually every medicine that we take for granted today. However, despite decades of dedicated research, many widespread and debilitating conditions are still untreatable. Medical research is a slow process with no easy answers, but animal research helps to take us incrementally closer to treatments for cancer, dementia, stroke and countless other conditions.

While many animal studies do not lead directly to treatments for diseases, ‘basic science’ research helps scientists to understand different processes in the body and how they can go wrong, underpinning future efforts to diagnose and treat various conditions. Additionally, many studies will show that a line of research is not worth pursuing. Although this can be disappointing, such research is incredibly valuable as scientists need to know which methods do not work and why so that they can develop new ones. Animal studies can also help to answer a wide range of research questions that are not directly related to diseases, such as exploring how genes determine traits or how brain functions develop.

About animal research at the University of Cambridge



Self-Renewable Killer Cells Could Be Key To Making Cancer Immunotherapy Work

Self-renewable killer cells could be key to making cancer immunotherapy work

source: www.cam.ac.uk

A small molecule that can turn short-lived ‘killer T-cells’ into long-lived, renewable cells, which persist in the body and activate when necessary to destroy tumour cells, could help make cell-based immunotherapy a realistic prospect for treating cancer.

Rather than creating killer T-cells that are active from the start, but burn out very quickly, we are creating an army of ‘renewable cells’ that can stay quiet for a long time, but will go into action when necessary and fight tumour cells

Randall Johnson

In order to protect us from invading viruses and bacteria, and from internal threats such as malignant tumour cells, our immune system employs an army of specialist immune cells. Just as a conventional army will be made up of different types of soldiers, each with a particular role, so each of these immune cells has a particular function.

Among these cells are cytotoxic T-cells – ‘killer T-cells’, whose primary function is to patrol our bodies, programmed to identify and destroy infected or cancerous cells. Scientists are now trying to harness these cells as a way to fight cancer, by growing T-cells programmed to recognise cancer cells in the laboratory in large numbers and then reintroducing them into the body to destroy the tumour – an approach known as adoptive T-cell immunotherapy.

However, this approach has been hindered by the fact that killer T-cells are short-lived – most killer T cells are gone within three days of transfer – so the army may have died out before it has managed to rid the body of the tumour.

Now, an international team led by researchers at the University of Cambridge has identified a way of increasing the life-span of these T-cells, a discovery that could help scientists overcome one of the key hurdles preventing progress in immunotherapy.

In a paper published today in the journal Nature, the researchers have identified a new role for a molecule known as 2-hydroxyglutarate, or 2-HG, which is known to trigger abnormal growth in tumour cells. In fact, the team has shown that a slightly different form of the molecule also plays a normal, but critical, role in T-cell function: it can influence T-cells to reside in a ‘memory state’.  This is a state where the cells can renew themselves, persist for a very long period of time, and re-activate to combat infection or cancer.

The researchers found that by increasing the levels of 2-HG in the T-cells, they could generate cells that destroyed tumours much more effectively: rather than expiring shortly after reintroduction, the memory-state T-cells persisted for much longer and kept destroying tumour cells.

“In a sense, this means that rather than creating killer T-cells that are active from the start, but burn out very quickly, we are creating an army of ‘renewable cells’ that can stay quiet for a long time, but will go into action when necessary and fight tumour cells,” says Professor Randall Johnson, Wellcome Trust Principal Research Fellow at the Department of Physiology, Development & Neuroscience, University of Cambridge.

“So, with a fairly trivial treatment of T-cells, we’re able to change a moderate response to tumour growth to a much stronger response, potentially giving people a more permanent immunity to the tumours they are carrying. This could make immunotherapy for cancer much more effective.”

The research was largely funded by the Wellcome Trust.

Reference
Tyrakis, PA et al. The immunometabolite S-2-hydroxyglutarate regulates CD8+ T-lymphocyte fate; Nature; 26 Oct 2016; DOI: 10.1038/nature2016



Cambridge Extends World Leading Role For Medical Imaging With Powerful New Brain and Body Scanners

Cambridge extends world leading role for medical imaging with powerful new brain and body scanners

source: www.cam.ac.uk

The next generation of imaging technology, newly installed at the University of Cambridge, will give researchers an unprecedented view of the human body – in particular of the myriad connections within our brains and of tumours as they grow and respond to treatment – and could pave the way for development of treatments personalised for individual patients.

By bringing together these scanners, the research expertise in Cambridge, and the latest in ‘big data’ informatics, we will be able to do sophisticated analyses that could revolutionise our understanding of the brain – and how mental health disorders and dementias arise – as well as of cancers and how we treat them

Ed Bullmore

The equipment, funded by the Medical Research Council (MRC), Wellcome Trust and Cancer Research UK, sits within the newly-refurbished Wolfson Brain Imaging Centre (WBIC), which today celebrates two decades at the forefront of medical imaging.

At the heart of the refurbishment are three cutting-edge scanners, of which only a very small handful exist at institutions outside Cambridge – and no institution other than the University of Cambridge has all three. These are:

  • a Siemens 7T Terra Magnetic Resonance Imaging (MRI) scanner, which will allow researchers to see detail in the brain as tiny as a grain of sand
  • a GE Healthcare PET/MR scanner that will enable researchers to collect critical data to help understand how cancers grow, spread and respond to treatment, and how dementia progresses
  • a GE Healthcare hyperpolarizer that enables researchers to study real-time metabolism of cancers and other body tissues, including whether a cancer therapy is effective or not

These scanners, together with refurbished PRISMA and Skyra 3T MRI scanners at the WBIC and at the Medical Research Council Cognition and Brain Sciences Unit, will make the Cambridge Biomedical Campus the best-equipped medical imaging centre in Europe.

Professor Ed Bullmore, Co-Chair of Cambridge Neuroscience and Scientific Director of the WBIC, says: “This is an exciting day for us as these new scanners will hopefully provide answers to questions that we have been asking for some time, as well as opening up new areas for us to explore in neuroscience, mental health research and cancer medicine.

“By bringing together these scanners, the research expertise in Cambridge, and the latest in ‘big data’ informatics, we will be able to do sophisticated analyses that could revolutionise our understanding of the brain – and how mental health disorders and dementias arise – as well as of cancers and how we treat them. This will be a powerful research tool and represents a big step in the direction of personalised treatments.”

Dr Rob Buckle, Director of Science Programmes at the MRC, adds: “The MRC is proud to sponsor this exciting suite of new technologies at the University of Cambridge. They will play an important role in advancing our strategy in stratified medicine, ultimately ensuring that the right patient gets the right treatment at the right time.”

 


7T Magnetic Resonance Imaging (MRI) scanner

The Siemens 7T Terra scanner – which refers to the ultrahigh strength of its magnetic field at 7 Tesla – will allow researchers to study at unprecedented levels of detail the workings of the brain and how it encodes information such as individual memories. Current 3T MRI scanners can image structures 2-3mm in size, whereas the new scanner has a resolution of just 0.5mm, the size of a coarse grain of sand.
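
To give a rough sense of what that change means in volumetric terms, the short sketch below compares the figures quoted above (2-3mm for current 3T scanners, taken here as 2.5mm, versus 0.5mm for the 7T Terra). The arithmetic is illustrative only.

```python
# Illustrative comparison of volumetric detail at 3T versus 7T,
# using the resolutions quoted in the article.
res_3t_mm = 2.5   # mid-point of the 2-3mm quoted for current 3T scanners
res_7t_mm = 0.5   # resolution quoted for the new 7T Terra scanner

ratio = (res_3t_mm / res_7t_mm) ** 3
print(f"One 3T voxel covers the volume of ~{ratio:.0f} 7T voxels")  # ~125
```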

“Often, the early stages of diseases of the brain, such as Alzheimer’s and Parkinson’s, occur in very small structures – until now too small for us to see,” explains Professor James Rowe, who will be leading research using the new 7T scanner. “The early seeds of dementia for example, which are often sown in middle age, have until now been hidden to less powerful MRI scanners.”

The scanner will also be able to pick up unique signatures of neurotransmitters in the brain, the chemicals that allow its cells to communicate with each other. Changes in the amount of these neurotransmitters affect how the brain functions and can underpin mental health disorders such as depression and schizophrenia.

“How a patient responds to a particular drug may depend on how much of a particular neurotransmitter is currently present,” says Professor Rowe. “We will be looking at whether this new scanner can help provide this information and so help us tailor treatments to individual patients.”

The scanner will begin operating at the start of December, with research projects lined up to look at dementias caused by changes to the brain almost undetectable by conventional scanners, and to look at how visual and sound information is converted to mental representations in the brain.

PET/MR scanner

The new GE Healthcare PET/MR scanner brings together two existing technologies: positron emission tomography (PET), which enables researchers to visualise cellular activity and metabolism, and magnetic resonance (MR), which is used to image soft tissue for structural and functional details.

Purchased as part of the Dementias Platform UK, a network of imaging centres across the UK, the scanner will enable researchers to simultaneously collect information on physiological and disease-related processes in the body, reducing the need for patients to return for multiple scans. This will be particularly important for dementia patients.

Professor Fiona Gilbert, who will lead research on the PET/MR scanner, explains: “Dementia patients are often frail, which can present challenges when they need separate PET and MR scanners. So, not only will this new scanner provide us with valuable information to help improve understanding and diagnosis of dementia, it will also be much more patient-friendly.”

PET/MR will allow researchers to see early molecular changes in the brain, accurately map them onto structural brain images and follow their progression as disease develops or worsens. This could enable researchers to diagnose dementia before any symptoms have arisen and to understand which treatments may best halt or slow the disease.

As well as being used for dementia research, the scanner will also be applied to cancer research, says Professor Gilbert.

“At the moment, we have to make lots of assumptions about what’s going on in tumour cells. We can take biopsies and look at the different cell types, how aggressive they are, their genetic structure and so on, but we can only guess what’s happening to a tumour at a functional level. Functional information is important for helping us determine how best to treat the cancer – and hence how we can personalise treatment for a particular patient. Using PET/MR, we can get real-time information for that patient’s specific tumour and not have to assume it is behaving in the same way as the last hundred tumours we’ve seen.”

The PET/MR scanner will begin operation at the start of November, when it will initially be used to study oxygen levels and blood flow in the tumours of breast cancer patients and in studies of brain inflammation in patients with Alzheimer’s disease and depression.

Hyperpolarizer

The third new piece of imaging equipment to be installed is a GE Healthcare hyperpolarizer, which is already up and running at the facility.

MRI relies on the interaction of strong magnetic fields with a property of atomic nuclei known as ‘spin’. By looking at how these spins differ in the presence of magnetic field gradients applied across the body, scientists are able to build up three-dimensional images of tissues. The hyperpolarizer boosts the ‘spin’ signal from tracers injected into the tissue, making the MRI measurement much more sensitive and allowing imaging of the biochemistry of the tissue as well as its anatomy.

“Because of underlying genetic changes in a tumour, not all patients respond in the same way to the same treatment,” explains Professor Kevin Brindle, who leads research using the hyperpolarizer. “Using hyperpolarisation and MRI, we can potentially tell whether a drug is working, from changes in the tumour’s biochemistry, within a few hours of starting treatment. If it’s working you continue, if not you change the treatment.”



Engineers Design Ultralow Power Transistors That Could Function For Years Without a Battery

Engineers design ultralow power transistors that could function for years without a battery

 

source: www.cam.ac.uk

A new design for transistors which operate on ‘scavenged’ energy from their environment could form the basis for devices which function for months or years without a battery, and could be used for wearable or implantable electronics.

If we were to draw energy from a typical AA battery based on this design, it would last for a billion years.

Sungsik Lee

A newly-developed form of transistor opens up a range of new electronic applications including wearable or implantable devices by drastically reducing the amount of power used. Devices based on this type of ultralow power transistor, developed by engineers at the University of Cambridge, could function for months or even years without a battery by ‘scavenging’ energy from their environment.

Using a similar principle to a computer in sleep mode, the new transistor harnesses a tiny ‘leakage’ of electrical current, known as a near-off-state current, for its operations. This leak, like water dripping from a faulty tap, is a characteristic of all transistors, but this is the first time that it has been effectively captured and used functionally. The results, reported in the journal Science, open up new avenues for system design for the Internet of Things, in which most of the things we interact with every day are connected to the Internet.

The transistors can be produced at low temperatures and can be printed on almost any material, from glass and plastic to polyester and paper. They are based on a unique geometry which uses a ‘non-desirable’ characteristic, namely the point of contact between the metal and semiconducting components of a transistor, a so-called ‘Schottky barrier.’

“We’re challenging conventional perception of how a transistor should be,” said Professor Arokia Nathan of Cambridge’s Department of Engineering, the paper’s co-author. “We’ve found that these Schottky barriers, which most engineers try to avoid, actually have the ideal characteristics for the type of ultralow power applications we’re looking at, such as wearable or implantable electronics for health monitoring.”

The new design gets around one of the main issues preventing the development of ultralow power transistors, namely the ability to produce them at very small sizes. As transistors get smaller, their two electrodes start to influence the behaviour of one another, and the voltages spread, meaning that below a certain size, transistors fail to function as desired. By changing the design of the transistors, the Cambridge researchers were able to use the Schottky barriers to keep the electrodes independent from one another, so that the transistors can be scaled down to very small geometries.

The design also achieves a very high level of gain, or signal amplification. The transistor’s operating voltage is less than a volt, with power consumption below a billionth of a watt. This ultralow power consumption makes them most suitable for applications where function is more important than speed, which is the essence of the Internet of Things.

“If we were to draw energy from a typical AA battery based on this design, it would last for a billion years,” said Dr Sungsik Lee, the paper’s first author, also from the Department of Engineering. “Using the Schottky barrier allows us to keep the electrodes from interfering with each other in order to amplify the amplitude of the signal even at the state where the transistor is almost switched off.”
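
A back-of-envelope check of that battery figure: the sketch below divides the energy stored in a typical AA cell by an assumed operating power. The battery capacity and the two power levels are assumptions chosen for illustration; the article itself only states that consumption is below a billionth of a watt.

```python
# Rough estimate of how long an AA cell could power such a transistor.
# Battery capacity and power draws are illustrative assumptions.
AA_CAPACITY_WH = 3.9                   # ~2600 mAh at 1.5 V for a typical alkaline AA (assumption)
energy_joules = AA_CAPACITY_WH * 3600  # watt-hours -> joules (~14,000 J)

SECONDS_PER_YEAR = 3.156e7

for power_watts in (1e-9, 1e-12):      # 1 nW and 1 pW operating points (illustrative)
    lifetime_years = energy_joules / power_watts / SECONDS_PER_YEAR
    print(f"{power_watts:.0e} W -> ~{lifetime_years:,.0f} years")

# About 450,000 years at 1 nW and roughly 450 million years at 1 pW:
# a draw in the sub-picowatt range is what the quoted "billion years" implies.
```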

“This will bring about a new design model for ultralow power sensor interfaces and analogue signal processing in wearable and implantable devices, all of which are critical for the Internet of Things,” said Nathan.

“This is an ingenious transistor concept,” said Professor Gehan Amaratunga, Head of the Electronics, Power and Energy Conversion Group at Cambridge’s Engineering Department. “This type of ultra-low power operation is a pre-requisite for many of the new ubiquitous electronics applications, where what matters is function – in essence ‘intelligence’ – without the demand for speed. In such applications the possibility of having totally autonomous electronics now becomes a possibility. The system can rely on harvesting background energy from the environment for very long term operation, which is akin to organisms such as bacteria in biology.”

Reference:
S. Lee and A. Nathan, ‘Subthreshold Schottky-barrier thin film transistors with ultralow power and high intrinsic gain’. Science (2016). DOI: 10.1126/science.aah5035



“The Best Or Worst Thing To Happen To Humanity” – Stephen Hawking Launches Centre For The Future of Intelligence

“The best or worst thing to happen to humanity” – Stephen Hawking launches Centre for the Future of Intelligence

source: www.cam.ac.uk

Artificial intelligence has the power to eradicate poverty and disease or hasten the end of human civilisation as we know it – according to a speech delivered by Professor Stephen Hawking this evening.

Alongside the benefits, AI will also bring dangers, like powerful autonomous weapons, or new ways for the few to oppress the many.

Stephen Hawking

Speaking at the launch of the £10 million Leverhulme Centre for the Future of Intelligence (CFI) in Cambridge, Professor Hawking said the rise of AI would transform every aspect of our lives and was a global event on a par with the industrial revolution.

CFI brings together four of the world’s leading universities (Cambridge, Oxford, Berkeley and Imperial College, London) to explore the implications of AI for human civilisation. Together, an interdisciplinary community of researchers will work closely with policy-makers and industry, investigating topics such as the regulation of autonomous weaponry and the implications of AI for democracy.

“Success in creating AI could be the biggest event in the history of our civilisation,” said Professor Hawking. “But it could also be the last – unless we learn how to avoid the risks. Alongside the benefits, AI will also bring dangers like powerful autonomous weapons or new ways for the few to oppress the many.

“We cannot predict what we might achieve when our own minds are amplified by AI. Perhaps with the tools of this new technological revolution, we will be able to undo some of the damage done to the natural world by the last one – industrialisation.”

The Centre for the Future of Intelligence will initially focus on seven distinct projects in the first three-year phase of its work, reaching out to brilliant researchers and connecting them and their ideas to the challenges of making the best of AI. Among the initial research topics are: ‘Science, value and the future of intelligence’; ‘Policy and responsible innovation’; ‘Autonomous weapons – prospects for regulation’ and ‘Trust and transparency’.

The Academic Director of the Centre, and Bertrand Russell Professor of Philosophy at Cambridge, Huw Price, said: “The creation of machine intelligence is likely to be a once-in-a-planet’s-lifetime event. It is a future we humans face together. Our aim is to build a broad community with the expertise and sense of common purpose to make this future the best it can be.”

Many researchers now take seriously the possibility that intelligence equal to our own will be created in computers within this century. Freed of biological constraints, such as limited memory and slow biochemical processing speeds, machines may eventually become more intelligent than we are – with profound implications for us all.

AI pioneer Professor Maggie Boden (University of Sussex) sits on the Centre’s advisory board and spoke at this evening’s launch. She said: “AI is hugely exciting. Its practical applications can help us to tackle important social problems, as well as easing many tasks in everyday life. And it has advanced the sciences of mind and life in fundamental ways. But it has limitations, which present grave dangers given uncritical use. CFI aims to pre-empt these dangers, by guiding AI development in human-friendly ways.”

“Recent landmarks such as self-driving cars, or a computer winning at the game of Go, are signs of what’s to come,” added Professor Hawking. “The rise of powerful AI will either be the best or the worst thing ever to happen to humanity. We do not yet know which. The research done by this centre is crucial to the future of our civilisation and of our species.”

Transcript of Professor Hawking’s speech at the launch of the Leverhulme Centre for the Future of Intelligence, October 19, 2016

“It is a great pleasure to be here today to open this new Centre.  We spend a great deal of time studying history, which, let’s face it, is mostly the history of stupidity.  So it is a welcome change that people are studying instead the future of intelligence.

Intelligence is central to what it means to be human.  Everything that our civilisation has achieved, is a product of human intelligence, from learning to master fire, to learning to grow food, to understanding the cosmos.

I believe there is no deep difference between what can be achieved by a biological brain and what can be achieved by a computer.  It therefore follows that computers can, in theory, emulate human intelligence — and exceed it.

Artificial intelligence research is now progressing rapidly.  Recent landmarks such as self-driving cars, or a computer winning at the game of Go, are signs of what is to come.  Enormous levels of investment are pouring into this technology.  The achievements we have seen so far will surely pale against what the coming decades will bring.

The potential benefits of creating intelligence are huge.  We cannot predict what we might achieve, when our own minds are amplified by AI.  Perhaps with the tools of this new technological revolution, we will be able to undo some of the damage done to the natural world by the last one — industrialisation.  And surely we will aim to finally eradicate disease and poverty.  Every aspect of our lives will be transformed.  In short, success in creating AI, could be the biggest event in the history of our civilisation.

But it could also be the last, unless we learn how to avoid the risks.  Alongside the benefits, AI will also bring dangers, like powerful autonomous weapons, or new ways for the few to oppress the many.   It will bring great disruption to our economy.  And in the future, AI could develop a will of its own — a will that is in conflict with ours.

In short, the rise of powerful AI will be either the best, or the worst thing, ever to happen to humanity.  We do not yet know which.  That is why in 2014, I and a few others called for more research to be done in this area.  I am very glad that someone was listening to me!

The research done by this centre is crucial to the future of our civilisation and of our species.  I wish you the best of luck!”



Anti-Inflammatory Drugs Could Help Treat Symptoms of Depression, Study Suggests

Anti-inflammatory drugs could help treat symptoms of depression, study suggests

source: www.ox.ac.uk

Anti-inflammatory drugs similar to those used to treat conditions such as rheumatoid arthritis and psoriasis could in future be used to treat some cases of depression, concludes a review led by the University of Cambridge, which further implicates our immune system in mental health disorders.

It’s becoming increasingly clear to us that inflammation plays a role in depression, at least for some individuals, and now our review suggests that it may be possible to treat these individuals using some anti-inflammatory drugs

Golam Khandaker

Researchers from the Department of Psychiatry at Cambridge led a team that analysed data from 20 clinical trials involving the use of anti-cytokine drugs to treat a range of autoimmune inflammatory diseases. By looking at additional beneficial side-effects of the treatments, the researchers were able to show, in a meta-analysis of seven randomised controlled trials, that the drugs had a significant antidepressant effect compared with a placebo. Meta-analyses of the other types of clinical trials showed similar results.
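
For readers unfamiliar with the method, the sketch below shows inverse-variance pooling, the basic machinery behind a meta-analysis of trial results. The seven effect sizes and standard errors are invented purely for illustration and are not the study’s data; the published analysis may also use a random-effects model rather than the simple fixed-effect pooling shown here.

```python
# Minimal sketch of inverse-variance (fixed-effect) meta-analysis.
# The effect sizes and standard errors below are hypothetical.
import numpy as np

effects = np.array([-0.45, -0.30, -0.55, -0.20, -0.40, -0.35, -0.50])  # hypothetical drug-vs-placebo effects
ses     = np.array([ 0.15,  0.20,  0.18,  0.25,  0.22,  0.17,  0.20])  # hypothetical standard errors

weights   = 1.0 / ses**2                              # inverse-variance weights
pooled    = np.sum(weights * effects) / np.sum(weights)
pooled_se = np.sqrt(1.0 / np.sum(weights))

ci_low, ci_high = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
print(f"Pooled effect: {pooled:.2f} (95% CI {ci_low:.2f} to {ci_high:.2f})")
```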

When we are exposed to an infection, for example influenza or a stomach bug, our immune system fights back to control and remove the infection. During this process, immune cells flood the blood stream with proteins known as cytokines. This process is known as systemic inflammation.

Even when we are healthy, our bodies carry trace levels of these proteins – known as ‘inflammatory markers’ – which rise exponentially in response to infection. Previous work from the team found that children with high everyday levels of one of these markers are at greater risk of developing depression and psychosis in adulthood, suggesting a role for the immune system, particularly chronic low-grade systemic inflammation, in mental illness.

Inflammation can also occur as a result of the immune system mistaking healthy cells for infected cells and attacking the body, leading to autoimmune inflammatory diseases such as rheumatoid arthritis, psoriasis and Crohn’s disease. New types of anti-inflammatory drugs called anti-cytokine monoclonal antibodies and cytokine inhibitors have been developed recently, some of which are now routinely used for patients who respond poorly to conventional treatments. Many more are currently undergoing clinical trials to test their efficacy and safety.

The team of researchers carried out a meta-analysis of these clinical trials and found that the drugs led to an improvement in the severity of depressive symptoms independently of improvements in physical illness. In other words, regardless of whether a drug successfully treated rheumatoid arthritis, for example, it would still help improve a patient’s depressive symptoms. Their results are published today in the journal Molecular Psychiatry.

Dr Golam Khandaker, who led the study, says: “It’s becoming increasingly clear to us that inflammation plays a role in depression, at least for some individuals, and now our review suggests that it may be possible to treat these individuals using some anti-inflammatory drugs. These are not your everyday anti-inflammatory drugs such as ibuprofen, however, but a particular new class of drugs.”

“It’s too early to say whether these anti-cytokine drugs can be used in clinical practice for depression, however,” adds Professor Peter Jones, co-author of the study. “We will need clinical trials to test how effective they are in patients who do not have the chronic conditions for which the drugs have been developed, such as rheumatoid arthritis or Crohn’s disease. On top of this, some existing drugs can have potentially serious side effects, which would need to be addressed.”

Dr Khandaker and colleagues believe that anti-inflammatory drugs may offer hope for patients for whom current antidepressants are ineffective. Although the trials reviewed by the team involve physical illnesses that trigger inflammation – and hence potentially contribute to depression – their previous work found a connection between depression and baseline levels of inflammation in healthy people (when someone does not have an acute infection), which can be caused by a number of factors such as genes and psychological stress.

“About a third of patients who are resistant to antidepressants show evidence of inflammation,” adds Dr Khandaker. “So, anti-inflammatory treatments could be relevant for a large number of people who suffer from depression.

“The current approach of a ‘one-size-fits-all’ medicine to treat depression is problematic. All currently available antidepressants target a particular type of neurotransmitter, but a third of patients do not respond to these drugs. We are now entering the era of ‘personalised medicine’ where we can tailor treatments to individual patients. This approach is starting to show success in treating cancers, and it’s possible that in future we would use anti-inflammatory drugs in psychiatry for certain patients with depression.”

The research was mainly funded by the Wellcome Trust, with further support from the National Institute for Health Research (NIHR) Cambridge Biomedical Research Centre.

Reference
Kappelmann, N et al. Antidepressant activity of anti-cytokine treatment: a systematic review and meta-analysis of clinical trials of chronic inflammatory conditions. Molecular Psychiatry; 18 Oct 2016; DOI: 10.1038/mp.2016.167



Researchers Road-Test Powerful Method For Studying Singlet Fission

Researchers road-test powerful method for studying singlet fission

source: www.cam.ac.uk

In a new study, researchers measure the spin properties of electronic states produced in singlet fission – a process which could have a central role in the future development of solar cells.

Future research will focus on making devices and examining how these states can be harnessed for use in solar cells

Leah Weiss

Physicists have successfully employed a powerful technique for studying electrons generated through singlet fission, a process which it is believed will be key to more efficient solar energy production in years to come.

Their approach, reported in the journal Nature Physics, employed lasers, microwave radiation and magnetic fields to analyse the spin of excitons, which are energetically excited particles formed in molecular systems.

These are generated as a result of singlet fission, a process that researchers around the world are trying to understand fully in order to use it to better harness energy from the sun. Using materials exhibiting singlet fission in solar cells could make energy production much more efficient in the future, but the process needs to be fully understood in order to optimize the relevant materials and design appropriate technologies to exploit it.

In most existing solar cells, light particles (or photons) are absorbed by a semiconducting material, such as silicon. Each photon stimulates an electron in the material’s atomic structure, giving a single electron enough energy to move. This can then potentially be extracted as electrical current.

In some materials, however, the absorption of a single photon initially creates one higher-energy, excited particle, called a spin singlet exciton. This singlet can also share its energy with another molecule, forming two lower-energy excitons, rather than just one. These lower-energy particles are called spin “triplet” excitons. Each triplet can move through the molecular structure of the material and be used to produce charge.

The splitting process – from one absorbed photon to two energetic triplet excitons – is singlet fission. For scientists studying how to generate more solar power, it represents a potential bargain – a two-for-one offer on the amount of electrical current generated, relative to the amount of light put in. If materials capable of singlet fission can be integrated into solar cells, it will become possible to generate energy more efficiently from sunlight.
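
As a rough sketch of the energetics involved (a standard textbook description of singlet fission in general, not figures taken from this particular study): absorbing one photon promotes a molecule to its excited singlet state, and fission then shares that energy with a neighbouring molecule to create two triplets, which is only energetically viable when the singlet energy is roughly twice the triplet energy:

\[
S_0 + h\nu \rightarrow S_1, \qquad S_1 + S_0 \rightarrow T_1 + T_1, \qquad E(S_1) \gtrsim 2\,E(T_1).
\]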

But achieving this is far from straightforward. One challenge is that the pairs of triplet excitons only last for a tiny fraction of a second, and must be separated and used before they decay. Their lifespan is connected to their relative “spin”, an intrinsic angular momentum that is a fundamental property of quantum particles. Studying and measuring spin through time, from the initial formation of the pairs to their decay, is essential if they are to be harnessed.

In the new study, researchers from the University of Cambridge and the Freie Universität Berlin (FUB) utilised a method that allows the spin properties of materials to be measured through time. The approach, called electron spin resonance (ESR) spectroscopy, has been used and improved since its discovery over 50 years ago to better understand how spin impacts on many different natural phenomena.

It involves placing the material being studied within a large electromagnet, and then using laser light to excite molecules within the sample, and microwave radiation to measure how the spin changes over time. This is especially useful when studying triplet states formed by singlet fission as these are difficult to study using most other techniques.

Because the excitons’ spin interacts with microwave radiation and magnetic fields, these interactions can be used as an additional way to understand what happens to the triplet pairs after they are formed. In short, the approach allowed the researchers to effectively watch and manipulate the spin state of triplet pairs through time, following formation by singlet fission.
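
For readers who want the formalism behind such a measurement, triplet-pair ESR data are usually interpreted with a pair spin Hamiltonian. The expression below is the generic textbook form – with exchange coupling J between the two triplets, zero-field splitting parameters D and E, and applied magnetic field B – offered as an illustrative sketch rather than the specific model fitted in this study:

\[
\hat{H} = g\mu_B\,\mathbf{B}\cdot(\hat{\mathbf{S}}_A + \hat{\mathbf{S}}_B) + J\,\hat{\mathbf{S}}_A\cdot\hat{\mathbf{S}}_B + \sum_{i=A,B}\Big[D\big(\hat{S}_{z,i}^{2} - \tfrac{2}{3}\big) + E\big(\hat{S}_{x,i}^{2} - \hat{S}_{y,i}^{2}\big)\Big].
\]

In this language, “strongly exchange-coupled” pairs are those in which J dominates the zero-field splitting terms, so the pair behaves as a single coupled spin system rather than as two independent triplets – exactly the distinction that the microwave transitions in an ESR experiment can resolve.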

The study was led by Professor Jan Behrends at the Freie Universität Berlin (FUB), Dr Akshay Rao, a College Research Associate at St John’s College, University of Cambridge, and Professor Neil Greenham in the Department of Physics, University of Cambridge.

Leah Weiss, a Gates-Cambridge Scholar and PhD student in Physics based at Trinity College, Cambridge, was the paper’s first author. “This research has opened up many new questions,” she said. “What makes these excited states either separate and become independent, or stay together as a pair, are questions that we need to answer before we can make use of them.”

The researchers were able to look at the spin states of the triplet excitons in considerable detail. They observed that pairs had formed with both weakly and strongly coupled spin states, reflecting the co-existence of pairs that were spatially close together and pairs that were further apart. Intriguingly, the group found that some pairs which they would have expected to decay very quickly, owing to their close proximity, actually survived for several microseconds.

“Finding those pairs in particular was completely unexpected,” Weiss added. “We think that they could be protected by their overall spin state, making it harder for them to decay. Continued research will focus on making devices and examining how these states can be harnessed for use in solar cells.”

Professor Behrends added: “This interdisciplinary collaboration nicely demonstrates that bringing together expertise from different fields can provide novel and striking insights. Future studies will need to address how to efficiently split the strongly-coupled states that we observed here, to improve the yield from singlet fission cells.”

Beyond trying to improve photovoltaic technologies, the research also has implications for wider efforts to create fast and efficient electronics using spin, so-called “spintronic” devices, which similarly rely on being able to measure and control the spin properties of electrons.

The research was made possible with support from the UK Engineering and Physical Sciences Research Council (EPSRC) and from the Freie Universität Berlin (FUB). Weiss and colleague Sam Bayliss carried out the spectroscopy experiments within the laboratories of Professor Jan Behrends and Professor Robert Bittl at FUB. The work is also part of the Cambridge initiative to connect fundamental physics research with global energy and environmental challenges, backed by the Winton Programme for the Physics of Sustainability.

The study, Strongly exchange-coupled triplet pairs in an organic semiconductor, is published in Nature Physics. DOI: 10.1038/nphys3908.


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Fruit Fly Model of Deadly Brain Diseases Could Lead To Blood Test For vCJD

Fruit fly model of deadly brain diseases could lead to blood test for vCJD

source: www.cam.ac.uk

A new model of fatal brain diseases is being developed in the fruit fly by a team led by Dr Raymond Bujdoso at the University of Cambridge, and could lead to a low cost, fast and efficient blood test to diagnose – and prevent possible transmission of – variant Creutzfeldt-Jakob disease (vCJD).

We have found the fruit flies respond so quickly to infected blood that it means we can develop a faster, more versatile and more sensitive test to detect infectious prions in blood than currently exists

Raymond Bujdoso

Current methods for detecting vCJD-infected human blood samples rely on experimental animals, such as mice, and are time consuming and expensive. This new test could potentially be used on blood samples collected during pre-clinical disease and would be able to give a result in a matter of days or weeks rather than months or years.

In the late eighties, the UK saw an outbreak of bovine spongiform encephalopathy (BSE), a fatal brain condition in cattle, often referred to as ‘mad cow disease.’ BSE is a type of neurodegenerative brain condition known as a prion disease, caused by the build-up of a rogue form of a normal protein found in neurons. This aggregated form of the normal protein is referred to as a prion and is infectious. Following the BSE outbreak, a number of people were diagnosed with vCJD, a fatal human prion disease, believed to have occurred through the consumption of BSE-contaminated beef. vCJD causes changes in mood and behaviour, followed by difficulty in walking, and eventually leads to loss of movement and speech before death.

Other cases of vCJD have occurred in patients who received blood products prepared from donors who themselves later developed the disease; hence, blood-borne transmission of vCJD is a major concern for blood transfusion banks, manufacturers of blood plasma-derived products and public health authorities.

Although the number of people known to have died from vCJD is small – fewer than 180 cases in the UK – recent research has suggested that, within a certain age group of people in the UK, the number of individuals infected with vCJD, but who have not developed clinical signs of the condition, could be as high as one person in 2,000. Whether these individuals will go on to develop the clinical form of the disease during their natural life span remains uncertain.

At the moment, the only reliable way to detect infectious prions in blood is through a test known as a bioassay. This involves injecting suspected infected samples into experimental animals and waiting to see if these recipients develop prion disease. This is usually carried out by injecting potentially prion-infected samples into the brains of mice. These assays are slow and cumbersome, since the incubation time for prion disease may be over a year. This means that very few blood samples are routinely screened for prion infectivity.

Now, in a study published today in the Biochemical Journal, scientists at the University of Cambridge, UK, and the Ecole Nationale Veterinaire de Toulouse, France, report the development of a genetically-modified fruit fly (Drosophila melanogaster) into which a gene has been inserted to make the fly capable of producing the rogue protein that aggregates in the brain of sheep with the prion disease scrapie.

When the researchers fed these transgenic flies plasma from sheep known to have prions in their blood, they found that this caused prion disease in the flies. This response to prion-infected blood was evident within only a few weeks after exposure to the material.

Dr Raymond Bujdoso from the Department of Veterinary Medicine at the University of Cambridge, who led the research, says: “We have found the fruit flies respond so quickly to infected blood that it means we can develop a faster, more versatile and more sensitive test to detect infectious prions in blood than currently exists.

“At the moment, screening blood products for vCJD prion infectivity is just not practical – it is expensive and time consuming, and would require the use of a large number of animals, which is ethically unacceptable. The development of a vCJD blood test that could easily and reliably screen for prion-infectivity would represent an ideal solution for identifying donors and blood donations that might present a risk of causing the disease.”

Fruit flies are relatively easy and economical to work with, and widely accepted to be an ethical alternative to higher organisms such as mice. Dr Bujdoso and colleagues say that their fruit fly model will help contribute to the so-called 3Rs – the replacement, refinement and reduction of the use of animals in research.

Professor David Carling, Chair of the Biochemical Journal, adds: “The paper from Dr Bujdoso and colleagues provides a proof-of-principle study demonstrating that the fruit fly can be used to detect the infectious agent responsible for a type of neurodegenerative disease. Although the work is at a preliminary stage, it offers the exciting possibility of developing a quick and reliable screen for early diagnosis of a devastating disease.”

The research was supported by the Isaac Newton Trust and the National Centre for the Replacement, Refinement and Reduction of Animals in Research (NC3Rs).

Reference
Thackray, AM, Andréoletti, O and Bujdoso, R. Bioassay of prion-infected blood plasma in PrP transgenic Drosophila. Biochemical Journal; 13 Oct 2016; DOI: 10.1042/BCJ20160417


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

New Approach To Treating Type 1 Diabetes Aims To Limit Damage Caused By Our Own Immune System

New approach to treating type 1 diabetes aims to limit damage caused by our own immune system

source: www.cam.ac.uk

Researchers at the University of Cambridge have taken the first step towards developing a new form of treatment for type 1 diabetes which, if successful, could mean an end to the regular insulin injections endured by people affected by the disease, many of whom are children.

Our goal is to develop a treatment that could see the end to the need for these life-long, daily injections by curtailing the early damage caused by the patient’s own immune system

Frank Waldron-Lynch

Type 1 diabetes is one of the most common chronic diseases in children, and the number affected is rising rapidly each year. About 400,000 people in the UK are affected, 29,000 of them children. In type 1 diabetes, the body’s own immune system mistakes the insulin-producing cells of the pancreas for something harmful, attacks them and destroys them. The result is a lack of insulin, which is essential for transporting glucose from the blood into cells. Without insulin, glucose levels in the blood rise, causing both short-term and long-term damage; hence patients have to inject themselves several times a day with insulin to compensate.

In a study published today in the open access journal PLOS Medicine, a team led by researchers from the JDRF/Wellcome Trust Diabetes Inflammation Laboratory at the Cambridge Institute of Medical Research used a drug to regulate the immune system with the aim of preventing a patient’s immune cells attacking their insulin-producing cells in the pancreas.

The drug, aldesleukin (recombinant interleukin-2, or IL-2), is currently used at high doses to treat certain types of kidney tumours and skin cancers. At much lower doses, aldesleukin enhances the ability of immune cells called regulatory T cells (Tregs) to stop the immune system losing control once stimulated, and to prevent it from damaging the body’s own organs (autoimmunity).

Critical to this approach was first determining the effects of single doses of aldesleukin on Tregs in patients with type 1 diabetes. To achieve this, the team employed a state-of-the-art trial design combined with extensive immune monitoring in 40 participants with type 1 diabetes, and found that the doses increased Tregs by between 10% and 20%. These doses are potentially enough to prevent immune cells from attacking the body, but not so high that they would suppress the body’s natural defences, which are essential for protecting us from infection by invading bacteria or viruses.

The researchers also found that the lack of response in some participants in previous trials may be explained by the daily dosing regimen of aldesleukin used. The current trial results suggest that daily dosing makes Tregs less sensitive to the drug, and the study recommends that, for optimal immune outcomes, the drug should not be administered on a daily basis.

“Type 1 diabetes is fatal if left untreated, but the current treatment – multiple daily injections of insulin – is at best inconvenient, at worst painful, particularly for children,” says Dr Frank Waldron-Lynch, who led the trial. “Our goal is to develop a treatment that could see the end to the need for these life-long, daily injections by curtailing the early damage caused by the patient’s own immune system.

“Our work is at an early stage, but it uses a drug that occurs naturally within the body to restore the immune system to health in these patients. Whereas previous approaches have focused on suppressing the immune system, we are looking to fine-tune it. Our next step is to find the optimal, ‘Goldilocks’ treatment regimen – too little and it won’t stop the damage, too much and it could impair our natural defences, but just right and it would enhance the body’s own response.”

The researchers say that any treatment would initially focus on people who are newly-diagnosed with type 1 diabetes, many of whom are still able to produce sufficient insulin to prevent complications from the disease. The treatment could then help prevent further damage and help them to continue to produce a small amount of insulin for a longer period of time.

The research was largely funded by the type 1 diabetes charity JDRF, the Wellcome Trust and the Sir Jules Thorn Charitable Trust, with support from the National Institute for Health Research (NIHR) Cambridge Biomedical Research Centre.

Angela Wipperman, Senior Research Communications Manager at JDRF, said: “Immunotherapy research offers the potential to change the lives of those affected by type 1 diabetes. We eagerly await the next steps from this talented research team.”

Reference
Todd JA, Evangelou M, Cutler AJ, Pekalski ML, Walker NM, Stevens HE, et al. PLOS Medicine; 11 Oct 2016; DOI: 10.1371/journal.pmed.1002139


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.