
Potential New Treatment For Haemophilia Developed By Cambridge Researchers


source: www.cam.ac.uk

A new treatment that might one day help all patients with haemophilia, including those who become resistant to existing therapies, has been developed by researchers at the University of Cambridge.

Within three years, we hope to be conducting our first-in-man trials

Trevor Baglin

Around 400,000 people worldwide are affected by haemophilia, a genetic disorder that causes uncontrolled bleeding. Haemophilia is the result of a deficiency in proteins required for normal blood clotting – factor VIII for haemophilia A and factor IX for haemophilia B. Currently, the standard treatment is administration of the missing clotting factor. However, this requires regular intravenous injections, is not fully effective, and in about a third of patients results in the development of inhibitory antibodies. Nearly three-quarters of haemophilia sufferers have no access to treatment and have a life expectancy of only 10 years.

In a study published online today in Blood, the Journal of the American Society of Hematology, researchers report on a novel approach that gives the clotting process more time to produce thrombin, the enzyme that forms blood clots.  They suggest this treatment could one day help all patients with haemophilia, including those who develop antibodies against standard therapy. The therapy is based on observations relating to a disorder associated with excessive clotting, known as factor V Leiden.

“We know that patients who have haemophilia and also have mutations that increase clotting, such as factor V Leiden, experience less-severe bleeding,” says study co-author Dr Trevor Baglin, Consultant Haematologist at Addenbrooke’s Hospital, Cambridge University Hospitals.

Dr Baglin and colleagues therefore pursued a strategy of reducing the activity of an anticoagulant enzyme known as activated protein C (APC). The principal function of APC is to break down the complex that makes thrombin, and the factor V Leiden mutation slows this process. The team, led by Professor Jim Huntington, exploited this insight by developing a specific inhibitor of APC based on a particular type of molecule known as a serpin.

“We hypothesized that if we targeted the protein C pathway we could prolong thrombin production and thereby induce clotting in people with clotting defects, such as haemophilia sufferers,” says Professor Huntington, from the Cambridge Institute for Medical Research at the University of Cambridge. “So, we engineered a serpin that could selectively prevent APC from shutting down thrombin production before the formation of a stable clot.”

To test their theory, the team administered the serpin to mice with haemophilia B and clipped their tails. The researchers found that the amount of blood loss decreased as the dose increased, with the highest dose reducing bleeding to the level found in normal mice. Further studies confirmed that the serpin helped haemophilia mice form stable clots, with higher doses resulting in faster clot formation. The serpin was also able to increase thrombin production and accelerate clot formation when added to blood samples from haemophilia A patients.

“It’s our understanding that because we are targeting a general anti-clotting process, our serpin could effectively treat patients with either haemophilia A or B, including those who develop resistance to more traditional therapy,” adds Professor Huntington. “Additionally, we have focused on engineering the serpin to be long-acting and to be delivered by injection under the skin instead of directly into veins. This will free patients from the inconvenience of having to receive infusions three times a week, as is the case with current treatments.”

The research team hopes that the discovery can be rapidly developed into an approved medicine to provide improved care to haemophilia sufferers around the world.

“Within three years, we hope to be conducting our first-in-man trials of a subcutaneously-administered form of our serpin,” says Dr Baglin. “It is important to remember that the majority of people in the world with haemophilia have no access to therapy. A stable, easily-administered, long-acting, effective drug could bring treatment to a great many more haemophilia sufferers.”

This study forms part of a patent application, filed in the name of Cambridge Enterprise, and the modified serpin is being developed by a start-up company, ApcinteX, with funding from Medicxi.

Adapted from a press release by the American Society of Hematology.

Reference
Polderdijk, SGI et al. Design and characterization of an APC-specific serpin for the treatment of haemophilia. Blood; 27 Oct 2016; DOI: 10.1182/blood-2016-05-718635


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Top Ten Universities Conduct a Third of All UK Animal Research


source: www.cam.ac.uk

The ten UK universities that conduct the most world-leading biomedical research have announced their animal research statistics, revealing that they collectively conducted a third of all UK animal research in 2015.

The top ten institutions conduct more than two thirds of all UK university animal research between them, completing a combined total of 1.37 million procedures. Over 99% of these procedures were carried out on rodents or fish, and in line with national data they were roughly evenly split between experiments and the breeding of genetically modified animals.

The ten universities are listed below alongside the total number of procedures that each carried out in 2015.

University of Oxford: 226,214
University of Edinburgh: 212,695
UCL: 202,554
University of Cambridge: 181,080
King’s College London: 175,296
University of Manchester: 145,457
Imperial College London: 101,179
University of Glasgow: 49,082
University of Birmingham: 47,657
University of Nottingham: 31,689

The universities employ more than 90,000 staff between them, and, as you would expect, the larger institutions tend to conduct the most animal research. All universities are committed to the ‘3Rs’ of replacement, reduction and refinement. This means avoiding or replacing the use of animals where possible, minimising the number of animals used per experiment, and minimising suffering to improve animal welfare. However, as universities expand and conduct more research, the total number of animals used can rise even if fewer animals are used per study.

“The fact that we perform a significant proportion of the UK’s leading biomedical research is something to be proud of,” says Professor Michael Arthur, UCL President & Provost. “It’s no surprise that the universities that conduct the most world-leading research also use the most animals; despite advances in non-animal techniques, animals offer answers to many research questions that alternative methods cannot yet provide.”

All ten universities are signatories to the Concordat on Openness on Animal Research in the UK, a commitment to be more open about the use of animals in scientific, medical and veterinary research in the UK. 107 organisations have signed the concordat including UK universities, charities, research funders and commercial research organisations.

Animal research has played a key role in the development of virtually every medicine that we take for granted today. However, despite decades of dedicated research, many widespread and debilitating conditions are still untreatable. Medical research is a slow process with no easy answers, but animal research helps to take us incrementally closer to treatments for cancer, dementia, stroke and countless other conditions.

While many animal studies do not lead directly to treatments for diseases, ‘basic science’ research helps scientists to understand different processes in the body and how they can go wrong, underpinning future efforts to diagnose and treat various conditions. Additionally, many studies will show that a line of research is not worth pursuing. Although this can be disappointing, such research is incredibly valuable as scientists need to know which methods do not work and why so that they can develop new ones. Animal studies can also help to answer a wide range of research questions that are not directly related to diseases, such as exploring how genes determine traits or how brain functions develop.

About animal research at the University of Cambridge



Self-Renewable Killer Cells Could Be Key To Making Cancer Immunotherapy Work


source: www.cam.ac.uk

A small molecule that can turn short-lived ‘killer T-cells’ into long-lived, renewable cells that persist in the body, activating when necessary to destroy tumour cells, could help make cell-based immunotherapy a realistic prospect for treating cancer.

Rather than creating killer T-cells that are active from the start, but burn out very quickly, we are creating an army of ‘renewable cells’ that can stay quiet for a long time, but will go into action when necessary and fight tumour cells

Randall Johnson

In order to protect us from invading viruses and bacteria, and from internal threats such as malignant tumour cells, our immune system employs an army of specialist immune cells. Just as a conventional army will be made up of different types of soldiers, each with a particular role, so each of these immune cells has a particular function.

Among these cells are cytotoxic T-cells – ‘killer T-cells’, whose primary function is to patrol our bodies, programmed to identify and destroy infected or cancerous cells. Scientists are now trying to harness these cells as a way to fight cancer, by growing T-cells programmed to recognise cancer cells in the laboratory in large numbers and then reintroducing them into the body to destroy the tumour – an approach known as adoptive T-cell immunotherapy.

However, this approach has been hindered by the fact that killer T-cells are short-lived – most are gone within three days of transfer – so the army may die out before it has managed to rid the body of the tumour.

Now, an international team led by researchers at the University of Cambridge has identified a way of increasing the life-span of these T-cells, a discovery that could help scientists overcome one of the key hurdles preventing progress in immunotherapy.

In a paper published today in the journal Nature, the researchers have identified a new role for a molecule known as 2-hydroxyglutarate, or 2-HG, which is known to trigger abnormal growth in tumour cells. In fact, the team has shown that a slightly different form of the molecule also plays a normal, but critical, role in T-cell function: it can influence T-cells to reside in a ‘memory state’.  This is a state where the cells can renew themselves, persist for a very long period of time, and re-activate to combat infection or cancer.

The researchers found that by increasing the levels of 2-HG in the T-cells, they could generate cells that destroyed tumours much more effectively. Rather than expiring shortly after reintroduction, the memory-state T-cells persisted for much longer, continuing to destroy tumour cells.

“In a sense, this means that rather than creating killer T-cells that are active from the start, but burn out very quickly, we are creating an army of ‘renewable cells’ that can stay quiet for a long time, but will go into action when necessary and fight tumour cells,” says Professor Randall Johnson, Wellcome Trust Principal Research Fellow at the Department of Physiology, Development & Neuroscience, University of Cambridge.

“So, with a fairly trivial treatment of T-cells, we’re able to change a moderate response to tumour growth to a much stronger response, potentially giving people a more permanent immunity to the tumours they are carrying. This could make immunotherapy for cancer much more effective.”

The research was largely funded by the Wellcome Trust.

Reference
Tyrakis, PA et al. The immunometabolite S-2-hydroxyglutarate regulates CD8+ T-lymphocyte fate; Nature; 26 Oct 2016; DOI: 10.1038/nature2016



Cambridge Extends World Leading Role For Medical Imaging With Powerful New Brain and Body Scanners


source: www.cam.ac.uk

The next generation of imaging technology, newly installed at the University of Cambridge, will give researchers an unprecedented view of the human body – in particular of the myriad connections within our brains and of tumours as they grow and respond to treatment – and could pave the way for development of treatments personalised for individual patients.

By bringing together these scanners, the research expertise in Cambridge, and the latest in ‘big data’ informatics, we will be able to do sophisticated analyses that could revolutionise our understanding of the brain – and how mental health disorders and dementias arise – as well as of cancers and how we treat them

Ed Bullmore

The equipment, funded by the Medical Research Council (MRC), Wellcome Trust and Cancer Research UK, sits within the newly-refurbished Wolfson Brain Imaging Centre (WBIC), which today celebrates two decades at the forefront of medical imaging.

At the heart of the refurbishment are three cutting-edge scanners, of which only a very small handful exist at institutions outside Cambridge – and no institution other than the University of Cambridge has all three. These are:

  • a Siemens 7T Terra Magnetic Resonance Imaging (MRI) scanner, which will allow researchers to see detail in the brain as tiny as a grain of sand
  • a GE Healthcare PET/MR scanner that will enable researchers to collect critical data to help understand how cancers grow, spread and respond to treatment, and how dementia progresses
  • a GE Healthcare hyperpolarizer that enables researchers to study real-time metabolism of cancers and other body tissues, including whether a cancer therapy is effective or not

These scanners, together with refurbished PRISMA and Skyra 3T MRI scanners at the WBIC and at the Medical Research Council Cognition and Brain Sciences Unit, will make the Cambridge Biomedical Campus the best-equipped medical imaging centre in Europe.

Professor Ed Bullmore, Co-Chair of Cambridge Neuroscience and Scientific Director of the WBIC, says: “This is an exciting day for us as these new scanners will hopefully provide answers to questions that we have been asking for some time, as well as opening up new areas for us to explore in neuroscience, mental health research and cancer medicine.

“By bringing together these scanners, the research expertise in Cambridge, and the latest in ‘big data’ informatics, we will be able to do sophisticated analyses that could revolutionise our understanding of the brain – and how mental health disorders and dementias arise – as well as of cancers and how we treat them. This will be a powerful research tool and represents a big step in the direction of personalised treatments.”

Dr Rob Buckle, Director of Science Programmes at the MRC, adds: “The MRC is proud to sponsor this exciting suite of new technologies at the University of Cambridge. They will play an important role in advancing our strategy in stratified medicine, ultimately ensuring that the right patient gets the right treatment at the right time.”

 


7T Magnetic Resonance Imaging (MRI) scanner

The Siemens 7T Terra scanner – which refers to the ultrahigh strength of its magnetic field at 7 Tesla – will allow researchers to study at unprecedented levels of detail the workings of the brain and how it encodes information such as individual memories. Current 3T MRI scanners can image structures 2-3mm in size, whereas the new scanner has a resolution of just 0.5mm, the size of a coarse grain of sand.
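To put those numbers in perspective, a back-of-envelope comparison (a rough sketch, assuming isotropic voxels and taking 2.5 mm as a representative midpoint of the 2-3 mm figure for 3T scanners) shows how much smaller the volume resolved by the new scanner is:

```python
# Voxel-volume comparison between a typical 3T scanner and the 7T Terra.
# 2.5 mm is an illustrative midpoint of the 2-3 mm range quoted above.
res_3t_mm = 2.5
res_7t_mm = 0.5

# Resolvable volume scales with the cube of the linear resolution.
volume_ratio = (res_3t_mm / res_7t_mm) ** 3
print(f"One 3T voxel spans ~{volume_ratio:.0f}x the volume of a 7T voxel")  # ~125x
```

In other words, a five-fold improvement in linear resolution translates into roughly a 125-fold smaller resolvable volume, which is why structures previously "too small for us to see" come into view.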

“Often, the early stages of diseases of the brain, such as Alzheimer’s and Parkinson’s, occur in very small structures – until now too small for us to see,” explains Professor James Rowe, who will be leading research using the new 7T scanner. “The early seeds of dementia for example, which are often sown in middle age, have until now been hidden to less powerful MRI scanners.”

The scanner will also be able to pick up unique signatures of neurotransmitters in the brain, the chemicals that allow its cells to communicate with each other. Changes in the amount of these neurotransmitters affect how the brain functions and can underpin mental health disorders such as depression and schizophrenia.

“How a patient responds to a particular drug may depend on how much of a particular neurotransmitter is currently present,” says Professor Rowe. “We will be looking at whether this new scanner can help provide this information and so help us tailor treatments to individual patients.”

The scanner will begin operating at the start of December, with research projects lined up to look at dementias caused by changes to the brain almost undetectable by conventional scanners, and to look at how visual and sound information is converted to mental representations in the brain.

PET/MR scanner

The new GE Healthcare PET/MR scanner brings together two existing technologies: positron emission tomography (PET), which enables researchers to visualise cellular activity and metabolism, and magnetic resonance (MR), which is used to image soft tissue for structural and functional details.

Purchased as part of the Dementias Platform UK, a network of imaging centres across the UK, the scanner will enable researchers to simultaneously collect information on physiological and disease-related processes in the body, reducing the need for patients to return for multiple scans. This will be particularly important for dementia patients.

Professor Fiona Gilbert, who will lead research on the PET/MR scanner, explains: “Dementia patients are often frail, which can present challenges when they need separate PET and MR scanners. So, not only will this new scanner provide us with valuable information to help improve understanding and diagnosis of dementia, it will also be much more patient-friendly.”

PET/MR will allow researchers to see early molecular changes in the brain, accurately map them onto structural brain images and follow their progression as disease develops or worsens. This could enable researchers to diagnose dementia before any symptoms have arisen and to understand which treatments may best halt or slow the disease.

As well as being used for dementia research, the scanner will also be applied to cancer research, says Professor Gilbert.

“At the moment, we have to make lots of assumptions about what’s going on in tumour cells. We can take biopsies and look at the different cell types, how aggressive they are, their genetic structure and so on, but we can only guess what’s happening to a tumour at a functional level. Functional information is important for helping us determine how best to treat the cancer – and hence how we can personalise treatment for a particular patient. Using PET/MR, we can get real-time information for that patient’s specific tumour and not have to assume it is behaving in the same way as the last hundred tumours we’ve seen.”

The PET/MR scanner will begin operation at the start of November, when it will initially be used to study oxygen levels and blood flow in the tumours of breast cancer patients and in studies of brain inflammation in patients with Alzheimer’s disease and depression.

Hyperpolarizer

The third new piece of imaging equipment to be installed is a GE Healthcare hyperpolarizer, which is already up and running at the facility.

MRI relies on the interaction of strong magnetic fields with a property of atomic nuclei known as ‘spin’. By looking at how these spins differ in the presence of magnetic field gradients applied across the body, scientists are able to build up three-dimensional images of tissues. The hyperpolarizer boosts the ‘spin’ signal from tracers injected into the tissue, making the MRI measurement much more sensitive and allowing imaging of the biochemistry of the tissue as well as its anatomy.
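To see why boosting the spin signal matters, a rough back-of-envelope calculation (standard textbook values, not figures from the article) shows how small the thermal polarisation of nuclear spins normally is:

```latex
% Boltzmann (thermal) polarization of a spin-1/2 nucleus in field B_0 at temperature T:
P_{\mathrm{thermal}} \;=\; \tanh\!\left(\frac{\gamma \hbar B_0}{2 k_B T}\right)
                     \;\approx\; \frac{\gamma \hbar B_0}{2 k_B T}

% Illustrative values for carbon-13 (\gamma \approx 6.73 \times 10^{7}\ \mathrm{rad\,s^{-1}\,T^{-1}})
% at B_0 = 3\ \mathrm{T} and body temperature T \approx 310\ \mathrm{K}:
P_{\mathrm{thermal}} \;\approx\;
  \frac{(6.73\times 10^{7})(1.05\times 10^{-34})(3)}{2\,(1.38\times 10^{-23})(310)}
  \;\approx\; 2.5\times 10^{-6}
```

Only a few spins in a million contribute net signal at thermal equilibrium. Hyperpolarisation techniques can raise the polarisation to the order of 10% or more, a signal gain of roughly four orders of magnitude, which is what makes imaging tissue biochemistry in real time feasible.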

“Because of underlying genetic changes in a tumour, not all patients respond in the same way to the same treatment,” explains Professor Kevin Brindle, who leads research using the hyperpolarizer. “Using hyperpolarisation and MRI, we can potentially tell whether a drug is working, from changes in the tumour’s biochemistry, within a few hours of starting treatment. If it’s working you continue, if not you change the treatment.”



Engineers Design Ultralow Power Transistors That Could Function For Years Without a Battery

source: www.cam.ac.uk

A new design for transistors which operate on ‘scavenged’ energy from their environment could form the basis for devices which function for months or years without a battery, and could be used for wearable or implantable electronics.

If we were to draw energy from a typical AA battery based on this design, it would last for a billion years.

Sungsik Lee

A newly-developed form of transistor opens up a range of new electronic applications including wearable or implantable devices by drastically reducing the amount of power used. Devices based on this type of ultralow power transistor, developed by engineers at the University of Cambridge, could function for months or even years without a battery by ‘scavenging’ energy from their environment.

Using a similar principle to a computer in sleep mode, the new transistor harnesses a tiny ‘leakage’ of electrical current, known as a near-off-state current, for its operations. This leak, like water dripping from a faulty tap, is a characteristic of all transistors, but this is the first time that it has been effectively captured and used functionally. The results, reported in the journal Science, open up new avenues for system design for the Internet of Things, in which most of the things we interact with every day are connected to the Internet.

The transistors can be produced at low temperatures and can be printed on almost any material, from glass and plastic to polyester and paper. They are based on a unique geometry which uses a ‘non-desirable’ characteristic, namely the point of contact between the metal and semiconducting components of a transistor, a so-called ‘Schottky barrier.’

“We’re challenging conventional perception of how a transistor should be,” said Professor Arokia Nathan of Cambridge’s Department of Engineering, the paper’s co-author. “We’ve found that these Schottky barriers, which most engineers try to avoid, actually have the ideal characteristics for the type of ultralow power applications we’re looking at, such as wearable or implantable electronics for health monitoring.”

The new design gets around one of the main issues preventing the development of ultralow power transistors, namely the ability to produce them at very small sizes. As transistors get smaller, their two electrodes start to influence the behaviour of one another, and the voltages spread, meaning that below a certain size, transistors fail to function as desired. By changing the design of the transistors, the Cambridge researchers were able to use the Schottky barriers to keep the electrodes independent from one another, so that the transistors can be scaled down to very small geometries.

The design also achieves a very high level of gain, or signal amplification. The transistor’s operating voltage is less than a volt, with power consumption below a billionth of a watt. This ultralow power consumption makes them most suitable for applications where function is more important than speed, which is the essence of the Internet of Things.

“If we were to draw energy from a typical AA battery based on this design, it would last for a billion years,” said Dr Sungsik Lee, the paper’s first author, also from the Department of Engineering. “Using the Schottky barrier allows us to keep the electrodes from interfering with each other in order to amplify the amplitude of the signal even at the state where the transistor is almost switched off.”

“This will bring about a new design model for ultralow power sensor interfaces and analogue signal processing in wearable and implantable devices, all of which are critical for the Internet of Things,” said Nathan.

“This is an ingenious transistor concept,” said Professor Gehan Amaratunga, Head of the Electronics, Power and Energy Conversion Group at Cambridge’s Engineering Department. “This type of ultra-low power operation is a prerequisite for many of the new ubiquitous electronics applications, where what matters is function – in essence ‘intelligence’ – without the demand for speed. In such applications, totally autonomous electronics now become a possibility. The system can rely on harvesting background energy from the environment for very long term operation, which is akin to organisms such as bacteria in biology.”

Reference:
S. Lee and A. Nathan, ‘Subthreshold Schottky-barrier thin film transistors with ultralow power and high intrinsic gain’. Science (2016). DOI: 10.1126/science.aah5035



“The Best Or Worst Thing To Happen To Humanity” – Stephen Hawking Launches Centre For The Future of Intelligence


source: www.cam.ac.uk

Artificial intelligence has the power to eradicate poverty and disease or hasten the end of human civilisation as we know it – according to a speech delivered by Professor Stephen Hawking this evening.

Alongside the benefits, AI will also bring dangers, like powerful autonomous weapons, or new ways for the few to oppress the many.

Stephen Hawking

Speaking at the launch of the £10 million Leverhulme Centre for the Future of Intelligence (CFI) in Cambridge, Professor Hawking said the rise of AI would transform every aspect of our lives and was a global event on a par with the industrial revolution.

CFI brings together four of the world’s leading universities (Cambridge, Oxford, Berkeley and Imperial College London) to explore the implications of AI for human civilisation. Together, an interdisciplinary community of researchers will work closely with policy-makers and industry, investigating topics such as the regulation of autonomous weaponry and the implications of AI for democracy.

“Success in creating AI could be the biggest event in the history of our civilisation,” said Professor Hawking. “But it could also be the last – unless we learn how to avoid the risks. Alongside the benefits, AI will also bring dangers like powerful autonomous weapons or new ways for the few to oppress the many.

“We cannot predict what we might achieve when our own minds are amplified by AI. Perhaps with the tools of this new technological revolution, we will be able to undo some of the damage done to the natural world by the last one – industrialisation.”

The Centre for the Future of Intelligence will initially focus on seven distinct projects in the first three-year phase of its work, reaching out to brilliant researchers and connecting them and their ideas to the challenges of making the best of AI. Among the initial research topics are: ‘Science, value and the future of intelligence’; ‘Policy and responsible innovation’; ‘Autonomous weapons – prospects for regulation’ and ‘Trust and transparency’.

The Academic Director of the Centre, and Bertrand Russell Professor of Philosophy at Cambridge, Huw Price, said: “The creation of machine intelligence is likely to be a once-in-a-planet’s-lifetime event. It is a future we humans face together. Our aim is to build a broad community with the expertise and sense of common purpose to make this future the best it can be.”

Many researchers now take seriously the possibility that intelligence equal to our own will be created in computers within this century. Freed of biological constraints, such as limited memory and slow biochemical processing speeds, machines may eventually become more intelligent than we are – with profound implications for us all.

AI pioneer Professor Maggie Boden (University of Sussex) sits on the Centre’s advisory board and spoke at this evening’s launch. She said: “AI is hugely exciting. Its practical applications can help us to tackle important social problems, as well as easing many tasks in everyday life. And it has advanced the sciences of mind and life in fundamental ways. But it has limitations, which present grave dangers given uncritical use. CFI aims to pre-empt these dangers, by guiding AI development in human-friendly ways.”

“Recent landmarks such as self-driving cars, or a computer winning at the game of Go, are signs of what’s to come,” added Professor Hawking. “The rise of powerful AI will be either the best or the worst thing ever to happen to humanity. We do not yet know which. The research done by this centre is crucial to the future of our civilisation and of our species.”

Transcript of Professor Hawking’s speech at the launch of the Leverhulme Centre for the Future of Intelligence, October 19, 2016

“It is a great pleasure to be here today to open this new Centre.  We spend a great deal of time studying history, which, let’s face it, is mostly the history of stupidity.  So it is a welcome change that people are studying instead the future of intelligence.

Intelligence is central to what it means to be human.  Everything that our civilisation has achieved, is a product of human intelligence, from learning to master fire, to learning to grow food, to understanding the cosmos.

I believe there is no deep difference between what can be achieved by a biological brain and what can be achieved by a computer.  It therefore follows that computers can, in theory, emulate human intelligence — and exceed it.

Artificial intelligence research is now progressing rapidly.  Recent landmarks such as self-driving cars, or a computer winning at the game of Go, are signs of what is to come.  Enormous levels of investment are pouring into this technology.  The achievements we have seen so far will surely pale against what the coming decades will bring.

The potential benefits of creating intelligence are huge.  We cannot predict what we might achieve, when our own minds are amplified by AI.  Perhaps with the tools of this new technological revolution, we will be able to undo some of the damage done to the natural world by the last one — industrialisation.  And surely we will aim to finally eradicate disease and poverty.  Every aspect of our lives will be transformed.  In short, success in creating AI, could be the biggest event in the history of our civilisation.

But it could also be the last, unless we learn how to avoid the risks.  Alongside the benefits, AI will also bring dangers, like powerful autonomous weapons, or new ways for the few to oppress the many.   It will bring great disruption to our economy.  And in the future, AI could develop a will of its own — a will that is in conflict with ours.

In short, the rise of powerful AI will be either the best, or the worst thing, ever to happen to humanity.  We do not yet know which.  That is why in 2014, I and a few others called for more research to be done in this area.  I am very glad that someone was listening to me!

The research done by this centre is crucial to the future of our civilisation and of our species.  I wish you the best of luck!”


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Anti-Inflammatory Drugs Could Help Treat Symptoms of Depression, Study Suggests

Anti-inflammatory drugs could help treat symptoms of depression, study suggests

source: www.cam.ac.uk

Anti-inflammatory drugs similar to those used to treat conditions such as rheumatoid arthritis and psoriasis could in future be used to treat some cases of depression, concludes a review led by the University of Cambridge, which further implicates our immune system in mental health disorders.

It’s becoming increasingly clear to us that inflammation plays a role in depression, at least for some individuals, and now our review suggests that it may be possible to treat these individuals using some anti-inflammatory drugs

Golam Khandaker

Researchers from the Department of Psychiatry at Cambridge led a team that analysed data from 20 clinical trials involving the use of anti-cytokine drugs to treat a range of autoimmune inflammatory diseases. By looking at additional beneficial side-effects of the treatments, the researchers were able to show, in a meta-analysis of seven randomised controlled trials, a significant antidepressant effect of the drugs compared with placebo. Meta-analyses of the other types of clinical trials showed similar results.

When we are exposed to an infection, for example influenza or a stomach bug, our immune system fights back to control and remove the infection. During this process, immune cells flood the blood stream with proteins known as cytokines. This process is known as systemic inflammation.

Even when we are healthy, our bodies carry trace levels of these proteins – known as ‘inflammatory markers’ – which rise exponentially in response to infection. Previous work from the team found that children with high everyday levels of one of these markers are at greater risk of developing depression and psychosis in adulthood, suggesting a role for the immune system, particularly chronic low-grade systemic inflammation, in mental illness.

Inflammation can also occur as a result of the immune system mistaking healthy cells for infected cells and attacking the body, leading to autoimmune inflammatory diseases such as rheumatoid arthritis, psoriasis and Crohn’s disease. New types of anti-inflammatory drugs called anti-cytokine monoclonal antibodies and cytokine inhibitors have been developed recently, some of which are now routinely used for patients who respond poorly to conventional treatments. Many more are currently undergoing clinical trials to test their efficacy and safety.

The team of researchers carried out a meta-analysis of these clinical trials and found that the drugs led to an improvement in the severity of depressive symptoms independently of improvements in physical illness. In other words, regardless of whether a drug successfully treated rheumatoid arthritis, for example, it would still help improve a patient’s depressive symptoms. Their results are published today in the journal Molecular Psychiatry.

Dr Golam Khandaker, who led the study, says: “It’s becoming increasingly clear to us that inflammation plays a role in depression, at least for some individuals, and now our review suggests that it may be possible to treat these individuals using some anti-inflammatory drugs. These are not your everyday anti-inflammatory drugs such as ibuprofen, however, but a particular new class of drugs.”

“It’s too early to say whether these anti-cytokine drugs can be used in clinical practice for depression, however,” adds Professor Peter Jones, co-author of the study. “We will need clinical trials to test how effective they are in patients who do not have the chronic conditions for which the drugs have been developed, such as rheumatoid arthritis or Crohn’s disease. On top of this, some existing drugs can have potentially serious side effects, which would need to be addressed.”

Dr Khandaker and colleagues believe that anti-inflammatory drugs may offer hope for patients for whom current antidepressants are ineffective. Although the trials reviewed by the team involve physical illnesses that trigger inflammation – and hence potentially contribute to depression – their previous work found a connection between depression and baseline levels of inflammation in healthy people (when someone does not have an acute infection), which can be caused by a number of factors such as genes and psychological stress.

“About a third of patients who are resistant to antidepressants show evidence of inflammation,” adds Dr Khandaker. “So, anti-inflammatory treatments could be relevant for a large number of people who suffer from depression.

“The current approach of a ‘one-size-fits-all’ medicine to treat depression is problematic. All currently available antidepressants target a particular type of neurotransmitter, but a third of patients do not respond to these drugs. We are now entering the era of ‘personalised medicine’ where we can tailor treatments to individual patients. This approach is starting to show success in treating cancers, and it’s possible that in future we would use anti-inflammatory drugs in psychiatry for certain patients with depression.”

The research was mainly funded by the Wellcome Trust, with further support from the National Institute for Health Research (NIHR) Cambridge Biomedical Research Centre.

Reference
Kappelmann, N et al. Antidepressant activity of anti-cytokine treatment: a systematic review and meta-analysis of clinical trials of chronic inflammatory conditions. Molecular Psychiatry; 18 Oct 2016; DOI: 10.1038/mp.2016.167


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Researchers Road-Test Powerful Method For Studying Singlet Fission

Researchers road-test powerful method for studying singlet fission

source: www.cam.ac.uk

In a new study, researchers measure the spin properties of electronic states produced in singlet fission – a process which could have a central role in the future development of solar cells.

Future research will focus on making devices and examining how these states can be harnessed for use in solar cells

Leah Weiss

Physicists have successfully employed a powerful technique for studying electrons generated through singlet fission, a process believed to be key to more efficient solar energy production in years to come.

Their approach, reported in the journal Nature Physics, employed lasers, microwave radiation and magnetic fields to analyse the spin of excitons, which are energetically excited particles formed in molecular systems.

These are generated as a result of singlet fission, a process that researchers around the world are trying to understand fully in order to use it to better harness energy from the sun. Using materials exhibiting singlet fission in solar cells could make energy production much more efficient in the future, but the process needs to be fully understood in order to optimize the relevant materials and design appropriate technologies to exploit it.

In most existing solar cells, light particles (or photons) are absorbed by a semiconducting material, such as silicon. Each photon stimulates an electron in the material’s atomic structure, giving a single electron enough energy to move. This can then potentially be extracted as electrical current.

In some materials, however, the absorption of a single photon initially creates one higher-energy, excited particle, called a spin singlet exciton. This singlet can also share its energy with another molecule, forming two lower-energy excitons, rather than just one. These lower-energy particles are called spin “triplet” excitons. Each triplet can move through the molecular structure of the material and be used to produce charge.

The splitting process – from one absorbed photon to two energetic triplet excitons – is singlet fission. For scientists studying how to generate more solar power, it represents a potential bargain – a two-for-one offer on the amount of electrical current generated, relative to the amount of light put in. If materials capable of singlet fission can be integrated into solar cells, it will become possible to generate energy more efficiently from sunlight.

But achieving this is far from straightforward. One challenge is that the pairs of triplet excitons only last for a tiny fraction of a second, and must be separated and used before they decay. Their lifespan is connected to their relative “spin”, an intrinsic angular momentum that is a fundamental property of elementary particles. Studying and measuring spin through time, from the initial formation of the pairs to their decay, is essential if they are to be harnessed.

In the new study, researchers from the University of Cambridge and the Freie Universität Berlin (FUB) utilised a method that allows the spin properties of materials to be measured through time. The approach, called electron spin resonance (ESR) spectroscopy, has been used and improved since its discovery over 50 years ago to better understand how spin impacts on many different natural phenomena.

It involves placing the material being studied within a large electromagnet, and then using laser light to excite molecules within the sample, and microwave radiation to measure how the spin changes over time. This is especially useful when studying triplet states formed by singlet fission as these are difficult to study using most other techniques.

Because the excitons’ spin interacts with microwave radiation and magnetic fields, these interactions can be used as an additional way to understand what happens to the triplet pairs after they are formed. In short, the approach allowed the researchers to effectively watch and manipulate the spin state of triplet pairs through time, following formation by singlet fission.

The study was led by Professor Jan Behrends at the Freie Universität Berlin (FUB), Dr Akshay Rao, a College Research Associate at St John’s College, University of Cambridge, and Professor Neil Greenham in the Department of Physics, University of Cambridge.

Leah Weiss, a Gates-Cambridge Scholar and PhD student in Physics based at Trinity College, Cambridge, was the paper’s first author. “This research has opened up many new questions,” she said. “What makes these excited states either separate and become independent, or stay together as a pair, are questions that we need to answer before we can make use of them.”

The researchers were able to look at the spin states of the triplet excitons in considerable detail. They observed that pairs had formed with both weakly and strongly coupled spin states, reflecting the co-existence of pairs that were spatially close together and further apart. Intriguingly, the group found that some pairs which they would have expected to decay very quickly, owing to their close proximity, actually survived for several microseconds.

“Finding those pairs in particular was completely unexpected,” Weiss added. “We think that they could be protected by their overall spin state, making it harder for them to decay. Continued research will focus on making devices and examining how these states can be harnessed for use in solar cells.”

Professor Behrends added: “This interdisciplinary collaboration nicely demonstrates that bringing together expertise from different fields can provide novel and striking insights. Future studies will need to address how to efficiently split the strongly-coupled states that we observed here, to improve the yield from singlet fission cells.”

Beyond trying to improve photovoltaic technologies, the research also has implications for wider efforts to create fast and efficient electronics using spin, so-called “spintronic” devices, which similarly rely on being able to measure and control the spin properties of electrons.

The research was made possible with support from the UK Engineering and Physical Sciences Research Council (EPSRC) and from the Freie Universität Berlin (FUB). Weiss and colleague Sam Bayliss carried out the spectroscopy experiments within the laboratories of Professor Jan Behrends and Professor Robert Bittl at FUB. The work is also part of the Cambridge initiative to connect fundamental physics research with global energy and environmental challenges, backed by the Winton Programme for the Physics of Sustainability.

The study, Strongly exchange-coupled triplet pairs in an organic semiconductor, is published in Nature Physics. DOI: 10.1038/nphys3908.


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Fruit Fly Model of Deadly Brain Diseases Could Lead To Blood Test For vCJD

Fruit fly model of deadly brain diseases could lead to blood test for vCJD

source: www.cam.ac.uk

A new model of fatal brain diseases is being developed in the fruit fly by a team led by Dr Raymond Bujdoso at the University of Cambridge, and could lead to a low cost, fast and efficient blood test to diagnose – and prevent possible transmission of – variant Creutzfeldt-Jakob disease (vCJD).

We have found the fruit flies respond so quickly to infected blood that it means we can develop a faster, more versatile and more sensitive test to detect infectious prions in blood than currently exists

Raymond Bujdoso

Currently, methods to detect vCJD-infected human blood samples that involve experimental animals, such as mice, are time consuming and expensive. This new test could potentially be used on blood samples collected during pre-clinical disease and would be able to give a result in a matter of days or weeks rather than months or years.

In the late 1980s, the UK saw an outbreak of bovine spongiform encephalopathy (BSE), a fatal brain condition in cattle often referred to as ‘mad cow disease’. BSE is a type of neurodegenerative brain condition known as a prion disease, caused by the build-up of a rogue form of a normal protein found in neurons. This aggregated form of the normal protein is referred to as a prion and is infectious. Following the BSE outbreak, a number of people were diagnosed with vCJD, a fatal human prion disease, believed to have occurred through the consumption of BSE-contaminated beef. vCJD causes changes in mood and behaviour, followed by difficulty in walking, and eventually leads to loss of movement and speech before death.

Other cases of vCJD have occurred in patients who received blood products prepared from donors who themselves later developed the disease; hence, blood-borne transmission of vCJD is a major concern for blood transfusion banks, manufacturers of blood plasma-derived products and public health authorities.

Although the number of people known to have died from vCJD is small – fewer than 180 cases in the UK – recent research has suggested that, within a certain age group of people in the UK, the number of individuals infected with vCJD, but who have not developed clinical signs of the condition, could be as high as one person in 2,000. Whether these individuals will go on to develop the clinical form of the disease during their natural life span remains uncertain.

At the moment, the only reliable way to detect infectious prions in blood is through a test known as a bioassay. This involves injecting suspected infected samples into experimental animals and waiting to see if these recipients develop prion disease. This is usually carried out by injecting potentially prion-infected samples into the brains of mice. These assays are slow and cumbersome, since the incubation time for prion disease may be over a year. This means that very few blood samples are routinely screened for prion infectivity.

Now, in a study published today in the Biochemical Journal, scientists at the University of Cambridge, UK, and the Ecole Nationale Veterinaire de Toulouse, France, report the development of a genetically-modified fruit fly (Drosophila melanogaster) into which a gene has been inserted to make the fly capable of producing the rogue protein that aggregates in the brain of sheep with the prion disease scrapie.

When the researchers fed these transgenic flies plasma from sheep known to have prions in their blood, they found that this caused prion disease in the flies. This response to prion-infected blood was evident within only a few weeks after exposure to the material.

Dr Raymond Bujdoso from the Department of Veterinary Medicine at the University of Cambridge, who led the research, says: “We have found the fruit flies respond so quickly to infected blood that it means we can develop a faster, more versatile and more sensitive test to detect infectious prions in blood than currently exists.

“At the moment, screening blood products for vCJD prion infectivity is just not practical – it is expensive and time consuming, and would require the use of a large number of animals, which is ethically unacceptable. The development of a vCJD blood test that could easily and reliably screen for prion-infectivity would represent an ideal solution for identifying donors and blood donations that might present a risk of causing the disease.”

Fruit flies are relatively easy and economical to work with, and widely accepted to be an ethical alternative to higher organisms such as mice. Dr Bujdoso and colleagues say that their fruit fly model will help contribute to the so-called 3Rs – the replacement, refinement and reduction of the use of animals in research.

Professor David Carling, Chair of the Biochemical Journal, adds: “The paper from Dr Bujdoso and colleagues provides a proof-of-principle study demonstrating that the fruit fly can be used to detect the infectious agent responsible for a type of neurodegenerative disease. Although the work is at a preliminary stage, it offers the exciting possibility of developing a quick and reliable screen for early diagnosis of a devastating disease.”

The research was supported by the Isaac Newton Trust and the National Centre for the Replacement, Refinement and Reduction of Animals in Research (NC3Rs).

Reference
Thackray, AM, Andréoletti, O and Bujdoso, R. Bioassay of prion-infected blood plasma in PrP transgenic Drosophila. Biochemical Journal; 13 Oct 2016; DOI: 10.1042/BCJ20160417


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

New Approach To Treating Type 1 Diabetes Aims To Limit Damage Caused By Our Own Immune System

New approach to treating type 1 diabetes aims to limit damage caused by our own immune system

source: www.cam.ac.uk

Researchers at the University of Cambridge have taken the first step towards developing a new form of treatment for type 1 diabetes which, if successful, could mean an end to the regular insulin injections endured by people affected by the disease, many of whom are children.

Our goal is to develop a treatment that could see the end to the need for these life-long, daily injections by curtailing the early damage caused by the patient’s own immune system

Frank Waldron-Lynch

Type 1 diabetes is one of the most common chronic diseases in children and there is a rapid increase in the number affected each year. About 400,000 people in the UK are affected, 29,000 of them children. In type 1 diabetes, the body’s own immune system mistakes the insulin-producing cells of the pancreas for a threat, attacks them, and destroys them. The result is a lack of insulin, which is essential for transporting glucose from the blood into cells. Without insulin, glucose levels in the blood rise, causing short-term and long-term damage; hence patients have to inject themselves several times a day with insulin to compensate.

In a study published today in the open access journal PLOS Medicine, a team led by researchers from the JDRF/Wellcome Trust Diabetes Inflammation Laboratory at the Cambridge Institute of Medical Research used a drug to regulate the immune system with the aim of preventing a patient’s immune cells attacking their insulin-producing cells in the pancreas.

The drug, aldesleukin, a recombinant form of interleukin-2 (IL-2), is currently used at high doses to treat certain types of kidney tumours and skin cancers. At much lower doses, aldesleukin enhances the ability of immune cells called regulatory T cells (Tregs) to stop the immune system losing control once stimulated and prevent it from damaging the body’s own organs (autoimmunity).

Critical to this approach was to first determine the effects of single doses of aldesleukin on Tregs in patients with type 1 diabetes. To achieve this, the team employed a state-of-the-art trial design combined with extensive immune monitoring in 40 participants with type 1 diabetes, and found that single doses increased Tregs by 10% to 20%. These doses are potentially enough to prevent immune cells from attacking the body, but not so high that they would suppress the body’s natural defences, which are essential for protecting us from infection by invading bacteria or viruses.

The researchers also found that the absence of response of some participants in previous trials may be explained by the daily dosing regimen of aldesleukin used. The current trial results suggest that daily dosing results in Tregs becoming less sensitive to the drug, and the recommendation from the study is that the drug should not be administered on a daily basis for optimal immune outcomes.

“Type 1 diabetes is fatal if left untreated, but the current treatment – multiple daily injections of insulin – is at best inconvenient, at worst painful, particularly for children,” says Dr Frank Waldron-Lynch, who led the trial. “Our goal is to develop a treatment that could see the end to the need for these life-long, daily injections by curtailing the early damage caused by the patient’s own immune system.

“Our work is at an early stage, but it uses a drug that occurs naturally within the body to restore the immune system to health in these patients. Whereas previous approaches have focused on suppressing the immune system, we are looking to fine-tune it. Our next step is to find the optimal, ‘Goldilocks’ treatment regimen – too little and it won’t stop the damage, too much and it could impair our natural defences, but just right and it would enhance the body’s own response.”

The researchers say that any treatment would initially focus on people who are newly-diagnosed with type 1 diabetes, many of whom are still able to produce sufficient insulin to prevent complications from the disease. The treatment could then help prevent further damage and help them to continue to produce a small amount of insulin for a longer period of time.

The research was largely funded by the type 1 diabetes charity JDRF, the Wellcome Trust and the Sir Jules Thorn Charitable Trust, with support from the National Institute for Health Research (NIHR) Cambridge Biomedical Research Centre.

Angela Wipperman, Senior Research Communications Manager at JDRF, said: “Immunotherapy research offers the potential to change the lives of those affected by type 1 diabetes. We eagerly await the next steps from this talented research team.”

Reference
Todd JA, Evangelou M, Cutler AJ, Pekalski ML, Walker NM, Stevens HE, et al. PLOS Medicine; 11 Oct 2016; DOI: 10.1371/journal.pmed.1002139


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Diagnosis Of Cancer As a Medical Emergency Leads To Poorer Prognosis For Many Patients

Diagnosis of cancer as a medical emergency leads to poorer prognosis for many patients

source: www.cam.ac.uk

Too many patients – particularly those from disadvantaged backgrounds – are being diagnosed with cancer as medical emergencies, say researchers. This means that their chances of successful treatment are greatly reduced.

The earlier an individual can get a diagnosis of cancer, the better the prognosis and the options for treatment. When the first time their cancer is identified is when it becomes an emergency, the prognosis is much worse

Yin Zhou

In an article in the journal Nature Reviews Clinical Oncology, a team of researchers jointly led by the University of Cambridge and University College London reviewed current evidence from 26 peer-reviewed studies and 6 online reports from 7 countries. The evidence indicates that emergency diagnosis of cancer is a universal problem, challenging the previous assumption that it is an issue particular to the UK.

Looking at prognosis alone, the researchers reviewed evidence showing that patients diagnosed with colorectal cancer at emergency presentation had a one-year survival rate of 50%, compared with 82% for patients diagnosed electively – in other words, following a GP referral to a specialist. Similarly, for lung cancer the respective survival rates were 12% versus 40%. This was in part because cancer diagnosed at emergency care was more likely to be at an advanced stage. However, this was not the full story: even when patients presented with tumours of the same disease stage, they still had a worse prognosis if they were diagnosed in emergency care, possibly because of problems in the quality of their management out-of-hours, or because they had more aggressive disease on a stage-for-stage basis.

In the UK, about three in 10 emergency presenters are referred to hospital emergency services by their family doctors, but others self-present to accident and emergency departments. Patients at both ends of the age spectrum – the youngest and the oldest – were most likely to have their cancer diagnosed in emergency contexts. Differences between genders were unclear – and vary for different types of cancer.

However, the review found particular inequalities between socioeconomic groups. Although the evidence came only from studies looking at colorectal and lung cancer, it found that people from more deprived backgrounds were at a greater risk of being diagnosed at emergency care. The same was true for people of Asian ethnicity in the UK and African-Americans in the USA.

First author Dr Yin Zhou from the Primary Care Unit at the University of Cambridge, who led the study, says: “The earlier an individual can get a diagnosis of cancer, the better the prognosis and the options for treatment. When the first time their cancer is identified is when it becomes an emergency, the prognosis is much worse.”

Dr Georgios Lyratzopoulos, who instigated and coordinated the collaboration, based at the Department of Epidemiology and Public Health, University College London adds: “A substantial minority of cancer patients who are diagnosed as emergencies do not seem to have had prior contact with the formal healthcare system; we need to find out why they are not seeking medical help sooner. Is it because they are unaware of any symptoms until too late, or is it because they do not think the symptoms are a sign of a more serious problem?”

The researchers found that the evidence points towards developing new screening methods and improving participation in existing screening programmes as one possible way to reduce emergency presentations in the medium to longer term. For example, indirect evidence from one geographical region of the UK suggests that the introduction of the faecal occult blood test reduced the proportion of patients with colorectal cancer diagnosed as emergencies by half between 1999 and 2004.

“What was clear from our review,” adds Dr Zhou, “is how little data is available about emergency diagnoses. What little data there is suggests that we’re only seeing the tip of the iceberg and that this could be a much bigger problem, particularly in low and middle income countries.”

A major part of the global evidence on emergency presentations (in terms of patient numbers) relates to English patients – thanks to the pioneering Routes to Diagnosis project and population-based data collected by the National Cancer Registration and Analysis Services of Public Health England.

Dr Anne Mackie, Director of Screening at Public Health England, said: “Screening has a vital role to play in identifying cancers at an early stage and getting people the right treatment as soon as possible, with better survival chances. Screening is always the person’s own choice and they should speak to their GP if they have any questions before deciding if screening is right for them.”

Reference
Zhou, Y et al. Diagnosis of cancer as an emergency: a critical review of current evidence. Nature Reviews Clinical Oncology; 11 Oct 2016; DOI: 10.1038/nrclinonc.2016.155


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Professor Oliver Hart Wins Economics Nobel Prize

Professor Oliver Hart wins economics Nobel Prize

source: www.cam.ac.uk

Professor Oliver Hart, a former undergraduate at King’s College (1966) and a former Fellow of Churchill College, has been jointly awarded the 2016 Sveriges Riksbank Prize in Economic Sciences, along with Bengt Holmström of MIT, for their work in the field of contracts.

Professor Hart becomes the 96th Cambridge affiliate to be awarded a Nobel Prize.

The Nobel Assembly made their announcement this morning (October 10), stating: “Modern economies are held together by innumerable contracts. The new theoretical tools created by Hart and Holmström are valuable to the understanding of real-life contracts and institutions, as well as potential pitfalls in contract design.

“Society’s many contractual relationships include those between shareholders and top executive management, an insurance company and car owners, or a public authority and its suppliers. As such relationships typically entail conflicts of interest, contracts must be properly designed to ensure that the parties take mutually beneficial decisions.

“This year’s laureates have developed contract theory, a comprehensive framework for analysing many diverse issues in contractual design, like performance-based pay for top executives, deductibles and co-pays in insurance, and the privatisation of public-sector activities.”

Professor Hart is currently the Andrew E. Furer Professor of Economics at Harvard University. From 1975 to 1981, Hart was an Assistant Lecturer and then Lecturer at the Faculty of Economics, and a Fellow of Churchill College. He was born in London in 1948 and gained his PhD from Princeton University in 1974.

In the mid-1980s, Oliver Hart made fundamental contributions to a new branch of contract theory that deals with the important case of incomplete contracts. Because it is impossible for a contract to specify every eventuality, this branch of the theory spells out optimal allocations of control rights: which party to the contract should be entitled to make decisions in which circumstances?

Hart’s findings on incomplete contracts have shed new light on the ownership and control of businesses and have had a vast impact on several fields of economics, as well as political science and law. His research provides us with new theoretical tools for studying questions such as which kinds of companies should merge, the proper mix of debt and equity financing, and when institutions such as schools or prisons ought to be privately or publicly owned.

Professor Hart’s award comes after last week’s Nobel Prize in Physics went to Cambridge alumni David Thouless (Trinity Hall, 1952), Duncan Haldane (Christ’s, 1970) and Michael Kosterlitz (Gonville and Caius, 1962).

More details on previous Cambridge winners can be found here: https://www.cam.ac.uk/research/research-at-cambridge/nobel-prize


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Study Demonstrates How Academia And Business Can Ensure Sustainability Of Resources

Study demonstrates how academia and business can ensure sustainability of resources

source: www.cam.ac.uk

Collaboration between business and academia can identify the most urgent research priorities to ensure the sustainability of food, energy, water and the environment, according to a new study.

As pressures start to mount, placing enormous demands upon natural resources, we are increasingly asked for support by businesses who want practical approaches that they can apply to address their growing challenges.

Gemma Cranston

Companies both depend upon and impact the environment, and are subject to interdependent pressures over food, energy, water and the environment. Yet their perspectives are often overlooked by the research community, which lacks access to their business thinking. Equally, businesses find it challenging to engage with the academic community, and to define researchable questions that would benefit from more detailed analysis.

The study, published in the journal Sustainability Science and organised by the Cambridge Institute for Sustainability Leadership, involved over 250 people, including academics and companies such as Asda, EDF Energy, HSBC and Nestlé, to produce research priorities that are scientifically feasible and whose results can be practically implemented by the business community.

“The process of co-design engages businesses at the outset to help define the challenges, limitations and ambitions of research agendas. These considerations ultimately have important consequences for the impact and practicality of research outputs,” said lead author Dr Jonathan Green, formerly of the University of Cambridge’s Department of Geography. “Greater investment in the complex but productive relations between the private sector and research community will create deeper and more meaningful collaboration and cooperation”.

The project is part of the work of the Nexus Network, an extensive network of researchers and stakeholders coordinated by the Cambridge Institute for Sustainability Leadership (CISL), the University of Sussex, the University of East Anglia, the University of Sheffield and the University of Exeter, and supported by the Economic and Social Research Council (ESRC).

The study was carried out over five months and involved researchers collecting over 700 questions from business practitioners, academics, policy-makers and members of the public. Over 50 per cent of these questions were submitted by businesses from a range of sectors, including retail, utilities, manufacturing and consumer goods. These questions were then reviewed by an expert group of businesses and researchers, who narrowed this list down to 40 questions that reflect key challenges for corporate sustainability.

Dr Bhaskar Vira, one of the project leads from the Department of Geography and the University of Cambridge Conservation Research Institute, said: “We were able to bring together 40 experts with a huge diversity of backgrounds and knowledge. This unique group of senior business practitioners and interdisciplinary researchers, who represented 13 universities, 16 businesses and other important partners including the ESRC, were able to inform the debate by their ability to answer both ‘Is this question answerable through an academic research project?’ and ‘If answered, would this change the way we do business?’”

Several themes emerged from the study, highlighting the issues that require more research and better engagement between the academic and business communities. These included the development of pragmatic yet credible tools that allow businesses to factor the interactions between food, energy and water demands in a changing environment into their decision-making; the role of social considerations and livelihoods in business decisions about sustainable management; identification of the most effective levers for behaviour change; and understanding the incentives or circumstances that allow individuals and businesses to take a leadership stance on these issues.

“As pressures start to mount, placing enormous demands upon natural resources, we are increasingly asked for support by businesses who want practical approaches that they can apply to address their growing challenges,” said Dr Gemma Cranston, project lead from CISL. “Co-designing new research is critical to provide business with robust and rigorous approaches that are academically sound but that are also directly applicable to a business context. We have identified priority areas that can guide new research development and look forward to seeing a greater integration of businesses into collaborative research agendas.”

It will be the role of multi-disciplinary groups of researchers and business practitioners to devise the projects that will deliver the solutions to these pressing issues around food, energy, water and the environment.

Reference
Jonathan Green et al. ‘Research priorities for managing the impacts and dependencies of business upon food, energy, water and the environment.’ Sustainability Science (2016). DOI: 10.1007/s11625-016-0402-4



Cambridge Alumni Win 2016 Nobel Prize In Physics

Cambridge alumni win 2016 Nobel Prize in Physics

source: www.cam.ac.uk

Three alumni of the University of Cambridge were today awarded the 2016 Nobel Prize in Physics for their pioneering work in the field of condensed matter physics.

The trio become the 93rd, 94th and 95th Nobel Affiliates of Cambridge to be awarded a Nobel Prize.

David Thouless (Trinity Hall, 1952), Duncan Haldane (Christ’s, 1970) and Michael Kosterlitz (Gonville and Caius, 1962) discovered unexpected behaviours of solid materials – and devised a mathematical framework to explain their properties. Their discoveries have led to new materials with an array of unique properties.

The Prize was divided, one half awarded to Thouless, the other half jointly to Haldane and Kosterlitz.

“This prize is richly deserved,” said Professor Nigel Cooper of Cambridge’s Cavendish Laboratory. “Through the great breakthroughs they’ve made, Thouless, Haldane and Kosterlitz took a visionary approach to understanding how topology plays a role in novel materials.”

Topology is a mathematical concept that accounts for how certain physical properties are related by smooth deformations: a football can be smoothly deformed into a rugby ball (so these have the same topology), but neither of these can be smoothly deformed into a bicycle tube (which therefore has different topology). The Laureates recognized how novel states of matter could arise due to the differing topologies of how the underlying particles arrange themselves at the microscopic level.

The Nobel Assembly made their announcement this morning (October 4), saying: “This year’s Laureates opened the door on an unknown world where matter can assume strange states. They have used advanced mathematical methods to study unusual phases, or states, of matter, such as superconductors, superfluids or thin magnetic films. Thanks to their pioneering work, the hunt is now on for new and exotic phases of matter. Many people are hopeful of future applications in both materials science and electronics.

“The three Laureates’ use of topological concepts in physics was decisive for their discoveries. Topology is a branch of mathematics that describes properties that only change step-wise. Using topology as a tool, they were able to astound the experts. In the early 1970s, Michael Kosterlitz and David Thouless overturned the then-current theory that superconductivity or superfluidity could not occur in thin layers. They demonstrated that superconductivity could occur at low temperatures and also explained the mechanism, phase transition, that makes superconductivity disappear at higher temperatures.

“In the 1980s, Thouless was able to explain a previous experiment with very thin electrically conducting layers in which conductance was precisely measured as integer steps. He showed that these integers were topological in their nature. At around the same time, Duncan Haldane discovered how topological concepts can be used to understand the properties of chains of small magnets found in some materials.

“We now know of many topological phases, not only in thin layers and threads, but also in ordinary three-dimensional materials. Over the last decade, this area has boosted frontline research in condensed matter physics, not least because of the hope that topological materials could be used in new generations of electronics and superconductors, or in future quantum computers. Current research is revealing the secrets of matter in the exotic worlds discovered by this year’s Nobel Laureates.”

Professor Haldane is the current Eugene Higgins Professor of Physics at Princeton University. Born in London in 1951, he came to Christ’s as an undergraduate in 1970 to read Natural Sciences. His PhD was conferred in 1978.

Professor Kosterlitz is the Harrison E. Farnsworth Professor of Physics at Brown University, where he joined the faculty in 1982. He was born to German Jewish émigrés in 1942; his father was the pioneering biochemist Hans Walter Kosterlitz. Professor Kosterlitz, who came to Cambridge in 1965, is the 14th Nobel Laureate affiliated to Gonville and Caius.

Professor Thouless, born in 1934, is Emeritus Professor of Physics at the University of Washington. An undergraduate at Trinity Hall, he was also previously a Visiting Fellow at Clare Hall, where he was awarded a Doctorate of Science in 1985. He has been a Life Member of the college since 1986.

Professor Thouless was also a Fellow of Churchill College from 1961-65, and in 1961 became its first Director of Studies for Physics. He has also held the position of Visiting Fellow at Churchill. He is Churchill’s 31st Nobel Affiliate and Trinity Hall’s first.

The Master of Caius, Professor Sir Alan Fersht, today warmly congratulated Prof Kosterlitz, who was his exact contemporary at Caius, coming up to Cambridge to read Natural Sciences in 1962. “This is fantastic news,” Sir Alan said. “Mike was obviously an exceptionally clever guy. We went to physics lectures together in our first year, and he continued to specialise in Physics in the second year while I specialised in Chemistry. He was a very good physicist, and moved from the UK to America fairly rapidly.

“He was an absolutely mad climber – he disappeared every weekend to go mountain climbing in the Peak District. He lived on Tree Court, and he built a traverse around the room where he would climb using his fingers and hanging on to the picture rail.”

More details on previous Cambridge winners can be found here: https://www.cam.ac.uk/research/research-at-cambridge/nobel-prize.

The first Nobel Prize in Physics was awarded in 1901.


This is great, groundbreaking materials science. The work is “beautiful and deep” with big applications in future electronics

It all started at Caius… 2016 co-winner Michael Kosterlitz in his matriculation photo at Caius in 1962




Cambridge Enterprise Joins Largest Early Stage Investment In A University Spin-Out

Cambridge Enterprise joins largest early stage investment in a university spin-out

source: www.cam.ac.uk

Cambridge spin-out Carrick Therapeutics raises $95 million in funding, representing the largest-ever early stage investment in a UK university spin-out company.

This investment and in particular the scale of the investment marks a real turning point for investments in Cambridge spin-out companies.

Bradley Hardiman

Carrick Therapeutics Ltd, a company which is developing new treatments for the most aggressive and resistant forms of cancer, launched today having secured $95 million in funding, representing the largest early-stage investment in a UK university spin-out.

The company, which has licensed technology developed at the Gurdon Institute at the University of Cambridge, brings together cancer researchers and drug development experts, backed by leading providers of early stage funding, with the aim of building Europe’s leading oncology company.

Carrick Therapeutics has research and development teams located in Dublin and Oxford. The $95 million funding round was led by ARCH Venture Partners and Woodford Investment Management with participation from Cambridge Enterprise Seed Funds, Cambridge Innovation Capital, Evotec AG, GV (Google Ventures) and Lightstone Ventures.

The first technology licensed by the company was developed at the Gurdon Institute and was licensed by Cambridge Enterprise, the University’s technology transfer arm. Carrick received seed funding in 2015 from ARCH Venture Partners and Cambridge Enterprise Seed Funds.

“This investment and in particular the scale of the investment marks a real turning point for investments in Cambridge spin-out companies,” said Bradley Hardiman, Investment Manager for Cambridge Enterprise Seed Funds. “In the past, particularly in Europe, investors have been guilty of drip-feeding money into companies. This means that companies could be continuously fundraising, distracting them from the critical task of advancing treatments. This ‘war chest’ of funding will enable Carrick to get on with the important work of researching and developing new cancer treatments, making real differences to sufferers of this debilitating disease.”

The company’s vision is to target the molecular pathways that drive the most aggressive and resistant forms of cancer. While other companies are often reliant on a single compound or biological mechanism, Carrick Therapeutics is building a portfolio of treatments that are progressed through understanding the mechanisms that cause cancer and resistance, and are tailored to an individual patient’s tumour.

By linking a network of clinicians and scientists in internationally leading research institutes and hospitals, Carrick will move its portfolio of ground-breaking cancer therapies from laboratory to clinic.

“Our aim is to build Europe’s leading oncology company,” said Carrick Chief Executive Dr Elaine Sullivan, a former Vice President for research and development functions at both Eli Lilly and AstraZeneca. “There is a significant unmet need in cancer treatment, and targeting aggressive and resistant disease is an area where we can make a real difference to patients’ lives.”

Carrick Therapeutics is working on three innovative scientific programmes, and is looking to expand its portfolio through academic and pharmaceutical partnerships.

The company’s worldwide network of collaborating cancer experts includes the world’s leading cancer research charity, Cancer Research UK, and researchers from several of the world’s top universities, including Cambridge, Imperial College London and Oxford.

“The quality of the science and assets, combined with the calibre of the management team makes Carrick Therapeutics a powerful proposition,” said Steven Gillis, Managing Partner of ARCH Venture Partners and a member of the board of Carrick Therapeutics. “As an investor and a scientist I look forward to Carrick Therapeutics being a dominant force in the fight against cancer.”

Adapted from a Cambridge Enterprise press release



Professor Stephen Toope Nominated As Vice-Chancellor Of The University of Cambridge

Professor Stephen Toope nominated as Vice-Chancellor of the University of Cambridge

source: www.cam.ac.uk

Today (26 September), international law scholar and university leader Professor Stephen Toope was nominated as Vice-Chancellor of the University of Cambridge.

I am thrilled to be returning to this great university. I look forward to working with staff and students in the pursuit of academic excellence and tremendous international engagement – the very mark of Cambridge.

Professor Stephen Toope

Subject to the approval of the Regent House, the University’s governing body, Professor Toope will take over from Professor Sir Leszek Borysiewicz on 1 October 2017.

Professor Toope is Director of the University of Toronto’s Munk School of Global Affairs and formerly served as president and vice-chancellor of the University of British Columbia.

He is a scholar specialising in human rights, international dispute resolution, international environmental law, the use of force, and international legal theory, and holds honours degrees in common law (LLB) and civil law (BCL) from McGill University (1983). Professor Toope is also an alumnus of Trinity College Cambridge, where he completed his PhD in 1987.

He graduated from Harvard with a degree (AB) in history and literature in 1979. He has published articles and books on change in international law, and the origins of international obligation in international society.

Professor Toope also represented Western Europe and North America on the UN Working Group on Enforced or Involuntary Disappearances from 2002-2007.

Cambridge has carried out an international search for the position of Vice-Chancellor and the Search Committee was headed up by the Master of Jesus College, Professor Ian White.

Professor White said: “This nomination builds on seven years of Sir Leszek’s visionary leadership. Professor Toope has impeccable academic credentials, a longstanding involvement with higher education, strong leadership experience and an excellent research background.”

Vice-Chancellor Professor Sir Leszek Borysiewicz says, “We are delighted to be welcoming a distinguished leader with such an outstanding record as a scholar and educator to lead Cambridge.”

Professor Toope says, “I am thrilled to be returning to this great university. I look forward to working with staff and students in the pursuit of academic excellence and tremendous international engagement – the very mark of Cambridge.”

Professor Sir Leszek Borysiewicz will continue to lead the University until Professor Toope takes up his post on 1 October 2017.



Solving Cambridge Congestion

Solving Cambridge congestion

source: http://www.cambridgeindependent.co.uk/


What’s the solution to Cambridge’s biggest challenge?

The City Deal is investing in a transport network that can support growth now but one that doesn’t rule out new or larger innovations in the future.

City Deal spokesman

Transport through the historic centre is one of the biggest hurdles facing our rapidly developing city.

A number of experts agree that going underground is the most viable solution for tackling congestion in the long term.

But debate has begun over whether a light rail service or underground buses are realistic options.

The push for an underground service has been spearheaded by Cambridge Connect, a campaign group which this month held a meeting with industry specialists.

Cambridge Connect has calculated that an underground section with a city-wide light rail network, including three line extensions, would cost around £1.3 billion.

The Greater Cambridge City Deal, which aims to use £1bn of investment to drive future prosperity, has been unable to commit to the idea so far due to inadequate funds and funding intervals, and is focusing efforts on extending the bus network. But those in charge of the City Deal told the Cambridge Independent this week that they have not ruled out bolder options in future.

Light Rail Network

Estimated to cost £1.3 billion

Trams have an average occupancy of 35%

The Docklands Light Railway has an average occupancy of 110%

The Isaac Newton Line has three proposed extensions with 36 stops

A spokesman for the City Deal said: “The options currently being considered as part of the City Deal programme are realistic for an area such as this and meet the need to connect not just the different parts of the city, but also the market towns, villages and new and growing settlements all around it. The City Deal is investing in a transport network that can support growth now but one that doesn’t rule out new or larger innovations in the future.”

City Deal is proposing a second busway from Cambourne to Cambridge at the cost of £141 million. The scheme aims to provide the infrastructure needed to support the city’s growth up to 2031.

By 2031 there could be a need for up to 300 buses per hour in the city centre, compared with 125 at peak times today.

Ian Sandison, Chairman, Cambridge BID said: “Congestion is clearly a problem which negatively impacts city users. Businesses recognise that radical action is needed to address congestion and the employer lunchtime briefings, which are available to all organisations from now until the 10th October, are an opportunity for them to help shape the parameters of the City Deal strategy. We strongly urge all local businesses to consider how this important issue impacts them, and add their voice to the conversation.

“Cambridge BID supports any measures which contribute to reducing congestion. However, for the City Deal proposals to work, a flexible public transport system – which is affordable, reliable and widespread – must also be put in place. The economic climate remains challenging and we must avoid any measures which could further discourage visitors to ensure that the city centre does not become a sleepy historic core.

“We hope that the eight point package will result in a less congested city and look forward to our continued engagement with the City Deal board.”

Cambridge Connect Isaac Newton Line concept

Cambridge Connect believes an underground solution is the only way to provide effective inner city public transport.

If there’s a better way to spend money coming to the region through the City Deal, Cambridge Connect says it would be to fund a £100,000 consultation proving that there really is light at the end of the tunnel.

Dr Colin Harris is the man behind Cambridge Connect and its proposal for what is being called the ‘Isaac Newton line’. He has run his environmental consultancy, Environmental Research & Assessment, in Cambridge for almost 20 years. He completed his PhD at the University of Cambridge Scott Polar Research Institute, and now works on projects across the world, including Antarctica, Africa and Asia, using computer-based Geographic Information Systems (GIS) to solve complex environmental problems.

Last year he turned his expertise towards addressing the impending infrastructure problems facing Cambridge.

Dr Harris said: “I did look carefully at overground solutions in the inner city because I was aware that tunnelling was going to be seen as somewhat ‘out-there’ and many would say this is too expensive and just pie in the sky, it’s never going to happen. But the more I looked at it the more I became convinced that actually, to achieve the end goal and to protect the environment and the heritage of Cambridge, you cannot do it without a tunnel.”

Dr Harris used the same GIS tool he uses in his professional work for mapping the Cambridge Light Rail Network and Isaac Newton Line.

By going underground, and by taking people to where they want to be with stops in the right places, you service people’s needs in the right way. It’s going to improve the inner city space, not detract from it.

Dr Colin Harris, Cambridge Connect

He said: “Using the computer we can calculate things like distance and time between stops. Using that we can calculate how much of the city would be accessible to the light rail network. With this configuration, 91 per cent of the city would be within an 18-minute walk or a seven-and-a-half-minute cycle ride of any stop. Which, if you think about that, it changes the way people would move about the city. It does transform things.

“We wanted the backing of UK Tram behind us to show that this is a serious initiative that has credible backing from the industry.”

So does the project have that support?

James Hammett, general manager of UK Tram, met with Cambridge Connect this month. He said: “Yes, it has our support. It looks like a feasible project, so we’re prepared to give our support for lobbying and bringing in our knowledge and technical expertise.”

Mr Hammett also stated his confidence that the Isaac Newton Line would be able to find funding.

Several interested parties met with Cambridge Connect and UK Tram to discuss the viability of a light rail network for the city. One attendee was Alex Reid, who played an integral role in the guided busway’s introduction. He thinks taking it underground should be seriously considered.

The Isaac Newton Line: example journey times at an average speed of 30km/h

Cambridge Central Rail Station to Girton Interchange – 14 min

Cambridge Central Rail Station to Market Square – 3.6 min

Cambridge Central Rail Station to Science Park (Milton Rd) – 12.2 min

Cambridge Central Rail Station to Capital Park (Fulbourn) – 11.9 min

Cambridge Central Rail Station to Addenbrookes – 5.7 min

Cambridge Central Rail Station to Cavendish Lab – 7.2 min

Addenbrookes to Cavendish Lab – 12.9 min

Mr Reid, the former director general of the Royal Institute of British Architects in London, was a Liberal Democrat councillor for Cambridgeshire County Council, and its transport spokesman when the busway from St Ives to Cambridge was being established.

While Mr Reid supports Cambridge Connect’s plans for the Isaac Newton Line, he sees a possibility for an underground tunnel for electric buses.

He said: “I have long felt that a tunnel of some kind under the centre of Cambridge is worth considering.

“One’s got to think ahead. I know the City Deal money has to be spent quite quickly but I think to say we’re spending a bit of it to see what we should do in five years’ time is absolutely legitimate.”

Road infrastructure costs about one-fifth as much as light rail. However, national statistics show that bus use outside London has decreased by 1.8 per cent over the past year, while light rail in England outside London has seen a relatively sharp rise in passenger journeys, with an 11 per cent increase over the same period.

Last week a transport expert said that Cambridge’s bus networks can be revitalised, but reliability has to be the primary concern that’s addressed.

Looking around Cambridge before the seminar at the Guildhall are, left to right, Andy Campbell, Professor David Begg and Councillor Lewis Herbert. Picture: Keith Heppell

Professor David Begg, former chairman of the Government’s commission for integrated transport, met with City Deal and bus service operators to discuss plans to significantly improve bus travel in and around Cambridge.

Prof Begg said: “Doing nothing is not an option.

“Traffic congestion is proven to be bad for the economy, the environment and people’s health. In bumper-to-bumper conditions like those seen in peak times in Cambridge, tailpipe emissions are four times higher than at other times of the day. This reduces life expectancy and makes Cambridge a less attractive place to live.

“Buses offer the most efficient use of road space and can move 20 times more people than cars, which have an average occupancy rate of just 1.2. It’s plain common sense to prioritise the use of buses.”

Prof Begg backed the use of Peak Congestion Control Points that would prevent cars from using key routes at rush hours. The City Deal is proposing to introduce them next year. He said that they had been effective in encouraging more people to cycle after being introduced in Nottingham.

He said: “Local authorities need to be making bold and brave decisions on traffic congestion, and be prepared to stand by them.”

Prof Begg also held a seminar to discuss the findings of his study, The Impact of Congestion on Bus Passengers, as part of the City Deal’s eight-point plan to improve transport in the city.

Prof Begg explained: “Bus services across the UK are in a downward spiral, directly due to congestion. Slower speeds have led to higher operator costs, higher fares, increased journey times and a decline in punctuality and reliability.”

Cllr Lewis Herbert, chair of the City Deal’s executive board, said: “David’s views on the challenges for bus operators caused by congestion make a lot of sense and chime very well with what we are trying to achieve here in Cambridge.

“By freeing road space for much more reliable bus journeys, people will be encouraged to use them, which is a much more effective use of limited road space in a small medieval city like Cambridge than clogging its narrow streets with gridlocked traffic.”

Andy Campbell, managing director of Stagecoach East, the biggest bus operator in the region, said: “We do everything possible to keep services running to time – we know this really matters to our passengers – but our hands are tied when the road is gridlocked. We schedule extra buses during peak periods but they make little difference if the traffic isn’t moving.”

How should Cambridge solve its congestion problem? Let us know your thoughts.

Unprecedented Study Of Aboriginal Australians Points To One Shared Out of Africa Migration For Modern Humans

Unprecedented study of Aboriginal Australians points to one shared Out of Africa migration for modern humans

source: www.cam.ac.uk

The first significant investigation into the genomics of Aboriginal Australians has uncovered several major findings about early human populations. These include evidence of a single “Out of Africa” migration event, and of a previously unidentified, “ghost-like” population spread which provided a basis for the modern Aboriginal cultural landscape.

We found evidence that there was only really one wave of humans who gave rise to all present-day non-Africans, including Australians

Eske Willerslev

The first major genomic study of Aboriginal Australians ever undertaken has confirmed that all present-day non-African populations are descended from the same single wave of migrants, who left Africa around 72,000 years ago.

Researchers sequenced the complete genetic information of 83 Aboriginal Australians, as well as 25 Papuans from New Guinea, to produce a host of significant new findings about the origins of modern human populations. Their work is published alongside several other related papers in the journal Nature.

The study, by an international team of academics, was carried out in close collaboration with elders and leaders from various Aboriginal Australian communities – some of whom are co-authors on the paper – as well as with various other organisations representing the participating groups.

Alongside the prevailing conclusion, that the overwhelming majority of the genomes of non-Africans alive today stem from one ancestral group of migrants who left Africa together, there are several other standout findings. These include:

  • Compelling evidence that Aboriginal Australians are descended directly from the first people to inhabit Australia – which is still the subject of periodic political dispute.
  • Evidence of an uncharacterised – and perhaps unknown – early human species which interbred with anatomically modern humans as they migrated through Asia.
  • Evidence that a mysterious dispersal from the northeastern part of Australia roughly 4,000 years ago contributed to the cultural links between Aboriginal groups today. These internal migrants defined the way in which people spoke and thought, but then disappeared from most of the continent, in a manner which the researchers describe as “ghost-like”.

The study’s senior authors are from the University of Cambridge, the Wellcome Trust Sanger Institute, the Universities of Copenhagen and Bern, and Griffith University in Australia. Within Cambridge, members of the Leverhulme Centre for Evolutionary Studies also contributed to the research, in particular by helping to place the genetic data which the team gathered in the field within the context of wider evidence about early human population and migration patterns.


Professor Eske Willerslev, who holds posts at St John’s College, University of Cambridge, the Sanger Institute and the University of Copenhagen, initiated and led the research. He said: “The study addresses a number of fundamental questions about human evolution – how many times did we leave Africa, when was Australia populated, and what is the diversity of people in and outside Australia?”

“Technologically and politically, it has not really been possible to answer those questions until now. We found evidence that there was only really one wave of humans who gave rise to all present-day non-Africans, including Australians.”

Anatomically modern humans are known to have left Africa approximately 72,000 years ago, eventually spreading across Asia and Europe. Outside Africa, Australia has one of the longest histories of continuous human occupation, dating back about 50,000 years.

Some researchers believe that this deep history indicates that Papuans and Australians stemmed from an earlier migration than the ancestors of Eurasian peoples; others that they split from Eurasian progenitors within Africa itself, and left the continent in a separate wave.

Until the present study, however, the only genetic evidence for Aboriginal Australians, which is needed to investigate these theories, came from one tuft of hair (taken from a long-since deceased individual), and two unidentified cell lines.

The new research dramatically improves that picture. Working closely with community elders, representative organisations and the ethical board of Griffith University, Willerslev and colleagues obtained permission to sequence dozens of Aboriginal Australian genomes, using DNA extracted from saliva.

This was compared with existing genetic information about other populations. The researchers modelled the likely genetic impact of different human dispersals from Africa and towards Australia, looking for patterns that best matched the data they had acquired. Dr Marta Mirazon Lahr and Professor Robert Foley, both from the Leverhulme Centre, assisted in particular by analysing the likely correspondences between this newly-acquired genetic evidence and a wider framework of existing archaeological and anthropological evidence about early human population movements.


Dr Manjinder Sandhu, a senior author from the Sanger Institute and University of Cambridge, said: “Our results suggest that, rather than having left in a separate wave, most of the genomes of Papuans and Aboriginal Australians can be traced back to a single ‘Out of Africa’ event which led to modern worldwide populations. There may have been other migrations, but the evidence so far points to one exit event.”

The Papuan and Australian ancestors did, however, diverge early from the rest, around 58,000 years ago. By comparison, European and Asian ancestral groups only became distinct in the genetic record around 42,000 years ago.

The study then traces the Papuan and Australian groups’ progress. Around 50,000 years ago they reached “Sahul” – a prehistoric supercontinent that originally united New Guinea, Australia and Tasmania, until these regions were separated by rising sea levels approximately 10,000 years ago.

The researchers charted several further “divergences” in which various parts of the population broke off and became genetically isolated from others. Interestingly, Papuans and Aboriginal Australians appear to have diverged about 37,000 years ago – long before they became physically separated by water. The cause is unclear, but one reason may be the early flooding of the Carpentaria basin, which left Australia connected to New Guinea by a strip of land that may have been unfavourable for human habitation.

Once in Australia, the ancestors of today’s Aboriginal communities remained almost completely isolated from the rest of the world’s population until just a few thousand years ago, when they came into contact with some Asian populations, followed by European travellers in the 18th Century.

Indeed, by 31,000 years ago, most Aboriginal communities were genetically isolated from each other. This divergence was most likely caused by environmental barriers; in particular the evolution of an almost impassable central desert as the Australian continent dried out.


Assistant Professor Anna-Sapfo Malaspinas, from the Universities of Copenhagen and Bern, and a lead author, said: “The genetic diversity among Aboriginal Australians is amazing. Because the continent has been populated for such a long time, we find that groups from south-western Australia are genetically more different from north-eastern Australia than, for example, Native Americans are from Siberians.”

Two other major findings also emerged. First, the researchers were able to reappraise traces of DNA which come from an ancient, extinct human species and are found in Aboriginal Australians. These have traditionally been attributed to encounters with Denisovans – a group known from DNA samples found in Siberia.

In fact, the new study suggests that they were from a different, as-yet uncharacterised, species. “We don’t know who these people were, but they were a distant relative of Denisovans, and the Papuan/Australian ancestors probably encountered them close to Sahul,” Willerslev said.

Finally, the research also offers an intriguing new perspective on how Aboriginal culture itself developed, raising the possibility of a mysterious, internal migration 4,000 years ago.

About 90% of Aboriginal communities today speak languages belonging to the “Pama-Nyungan” linguistic family. The study finds that all of these people are descendants of the founding population which diverged from the Papuans 37,000 years ago, then diverged further into genetically isolated communities.

This, however, throws up a long-established paradox. Language experts are adamant that Pama-Nyungan languages are much younger, dating back 4,000 years, and coinciding with the appearance of new stone technologies in the archaeological record.

Scientists have long puzzled over how – if these communities were completely isolated from each other and the rest of the world – they ended up sharing a language family that is so much younger. The traditional answer has been that there was a second migration into Australia 4,000 years ago, by people speaking this language.

But the new research finds no evidence of this. Instead, the team uncovered signs of a tiny gene flow, indicating a small population movement from north-east Australia across the continent, potentially at the time the Pama-Nyungan language and new stone tool technologies appeared.

These intrepid travellers, who must have braved forbidding environmental barriers, were small in number, but had a significant, sweeping impact on the continent’s culture. Mysteriously, however, the genetic evidence for them then disappears. In short, their influential language and culture survived – but they, as a distinctive group, did not.

“It’s a really weird scenario,” Willerslev said. “A few immigrants appear in different villages and communities around Australia. They change the way people speak and think; then they disappear, like ghosts. And people just carry on living in isolation the same way they always have. This may have happened for religious or cultural reasons that we can only speculate about. But in genetic terms, we have never seen anything like it before.”

The paper, A Genomic History of Aboriginal Australia, is published in Nature. doi:10.1038/nature18299.

Inset images: Professor Eske Willerslev talking to Aboriginal elders in the Kalgoorlie area in southwestern Australia in 2012. (Photo credit: Preben Hjort, Mayday Film). / Map showing main findings from the paper. Credit: St John’s College, Cambridge.


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Study Identifies Different Ways To Help Social Businesses Grow

Study identifies different ways to help social businesses grow

source: www.cam.ac.uk

New study identifies four strategies and two key methods for scaling up social businesses in developing countries in order to meet the unmet needs of more than four billion people.

Social businesses have enormous potential to provide important services to billions of people around the world – but they need to scale up in order to meet these needs.

Jaideep Prabhu

Social businesses – those with a socially beneficial objective – can play an important role in developing countries in addressing needs such as healthcare, energy, education and sanitation, but such businesses have faced a difficult time scaling up to significant size and reach.

A new study by researchers from Cambridge University has identified four key strategies and two methods for social businesses to scale up, which could help them reach many more of the four billion people in developing countries who could benefit from the services of such businesses.

Meeting these needs through affordable and sustainable solutions offers businesses a vast opportunity for future growth. As developing markets emerge from low-income to middle-income status, their development offers businesses the potential to make profits while also delivering significant social impact.

The study, published in the Journal of Cleaner Production, identifies market penetration, market development, product development and diversification as the four key growth strategies at different stages of business maturity for social businesses. In parallel, the study found two ways of increasing income generated through these four strategies – increasing revenue per stream and diversifying revenue streams.

Previous studies in this area had focused on scaling up conventional for-profit businesses and NGOs, but there had been little research on scaling up profit-generating social businesses.

Social businesses have had difficulty scaling up in developing countries due to a lack of infrastructure such as roads and electricity, coupled with a lack of clear property rights and well-functioning courts. On a more encouraging note, new technology such as mobile phones and new business structures such as public-private partnerships now make it easier for such businesses to find ways to reach new consumers.

“Social businesses have enormous potential to provide important services to billions of people around the world – but they need to scale up in order to meet these needs,” said study co-author Professor Jaideep Prabhu from Cambridge Judge Business School. “This study is a first step to greater understanding in this area, but we need a lot more work to support the development and growth of such businesses.”

The study focused on three successful social businesses – development organization BRAC, eye care company Aravind, and Amul Dairy.

BRAC, one of the world’s largest NGOs, serving 135 million people in 11 countries in areas ranging from nutrition and sanitation to microfinance, uses “replication and diffusion” as a key to its successful efforts to scale up. To reach people in dispersed regions, for example, BRAC used village societies and village intermediaries to train people, testing solutions in local communities and improving them before expanding to larger communities.

Aravind Eye Care runs a number of hospitals and eye care centres in India, treating more than three million people a year to fight blindness in the country. In order to scale up, Aravind chose to increase the number of hospitals and centres in a given region – and then diversified activities by adding a manufacturing unit, staff training and a research department.

Amul Dairy, an Indian co-operative of three million milk producers, began to scale up by increasing the number of milk producers in the state of Gujarat, followed by the establishment of a processing plant to produce dairy products from excess milk. A partnership with the government then helped Amul to expand across India and into other countries.

All three companies used almost all the different scaling-up methods at one point in their development, suggesting that all strategies are important to achieve scale and that companies need to mix and match different strategies.

Based on the case studies, companies with a clear single purpose (Aravind and Amul) might start with ‘simpler’ strategies such as market penetration, while companies like BRAC with broader goals will need a more diverse approach.

Reference:
Nancy M.P. Bocken, Alison Fil, and Jaideep Prabhu. ‘Scaling up social businesses in developing markets.’ Journal of Cleaner Production (2016). DOI:10.1016/j.jclepro.2016.08.045

Adapted from a Cambridge Judge Business School press release.


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Algorithm For Predicting Protein Pairings Could Help Show How Living Systems Work

Algorithm for predicting protein pairings could help show how living systems work

source: www.cam.ac.uk

An algorithm which models how proteins inside cells interact with each other will enhance the study of biology, and sheds light on how proteins work together to complete tasks such as turning food into energy.

Being able to predict these interactions will help us understand how proteins fit and work together to complete required tasks.

Lucy Colwell

Researchers have developed an algorithm that aids our understanding of how living systems work, by identifying which proteins within cells will interact with each other, based on their genetic sequences alone.

The ability to generate huge amounts of data from genetic sequencing has developed rapidly in the past decade, but the trouble for researchers is in being able to apply that sequence data to better understand living systems. The new research, published in the journal Proceedings of the National Academy of Sciences, is a significant step forward because biological processes, such as how our bodies turn food into energy, are driven by specific protein-protein interactions.

“We were really surprised that our algorithm was powerful enough to make accurate predictions in the absence of experimentally-derived data,” said study co-author Dr Lucy Colwell, from the University of Cambridge’s Department of Chemistry, who led the study with Ned Wingreen of Princeton University. “Being able to predict these interactions will help us understand how proteins fit and work together to complete required tasks – and using an algorithm is much faster and much cheaper than relying on experiments.”

When proteins interact with each other, they stick together to form protein complexes. In her previous research, Colwell found that if the two interacting proteins were known, sequence data could be used to figure out the structure of these complexes. Once the structure of the complexes is known, researchers can then investigate what is happening chemically. However, the question of which proteins interact with each other still required expensive, time-consuming experiments. Each cell often contains multiple versions of the same protein, and it wasn’t possible to predict which version of each protein would interact specifically – instead, experiments involve trying all options to see which ones stick.

In the current paper, the researchers used a mathematical algorithm to sift through the possible interaction partners and identify pairs of proteins that interact with each other. The method correctly predicted 93% of the protein-protein interactions present in a dataset of more than 40,000 protein sequences for which the pairing is known, without first being given any examples of correct pairs.

When two proteins stick together, some amino acids on one chain stick to the amino acids on the other chain. The boundaries between interacting proteins tend to evolve together over time, causing their sequences to mirror each other.

The algorithm uses this effect to build a model of the interaction. It first randomly pairs protein versions within each organism – because interacting pairs tend to be more similar in sequence to one another than non-interacting pairs, the algorithm can quickly identify a small set of largely correct pairings from the random starting point.

Using this small set, the algorithm measures whether the amino acid at a particular location in the first protein influences which amino acid occurs at a particular location in the second protein. These dependencies, learned from the data, are incorporated into a model and used to calculate the interaction strengths for each possible protein pair. Low-scoring pairings are eliminated, and the remaining set used to build an updated model.
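The iterative loop described above – random pairing, model fitting, re-scoring, re-pairing – can be sketched in Python. This is a toy illustration on synthetic binary sequences, not the team's actual model (which learns amino-acid dependencies between sequence positions); the function name and data are invented for illustration.

```python
import random

def iterative_pairing(xs, ys, n_iters=5, seed=0):
    """Start from a random pairing, learn per-position agreement
    weights from the current pairing, then greedily re-pair each x
    with its best-scoring y. Repeat for a few rounds."""
    rng = random.Random(seed)
    n, L = len(xs), len(xs[0])
    pairing = list(range(n))       # pairing[i] = index of y paired with xs[i]
    rng.shuffle(pairing)           # random starting point, no training data
    for _ in range(n_iters):
        # Crude model: how often each position agrees across current pairs
        w = [sum(xs[i][k] == ys[pairing[i]][k] for i in range(n)) / n
             for k in range(L)]
        # Score every remaining candidate pair and greedily re-assign
        taken, new = set(), []
        for i in range(n):
            best = max((j for j in range(n) if j not in taken),
                       key=lambda j: sum(w[k] * (xs[i][k] == ys[j][k])
                                         for k in range(L)))
            new.append(best)
            taken.add(best)
        pairing = new
    return pairing

# Synthetic data: each y is a noisy copy of its true partner x,
# mimicking interacting pairs whose sequences mirror each other
rng = random.Random(1)
n, L, noise = 30, 40, 0.1
xs = [[rng.randint(0, 1) for _ in range(L)] for _ in range(n)]
ys = [[b if rng.random() > noise else 1 - b for b in x] for x in xs]

pairing = iterative_pairing(xs, ys)
accuracy = sum(p == i for i, p in enumerate(pairing)) / n
```

Because correlated partners score far higher than random ones, even the random starting pairing yields a usable first model, which is the same reason the real algorithm needs no training examples.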

The researchers thought that the algorithm would only work accurately if it first ‘learned’ what makes a good protein-protein pair by studying pairs that have been discovered in experiments. This meant that the researchers had to give the algorithm some known protein pairs, or ‘gold standards,’ against which to compare new sequences. The team used two well-studied families of proteins, histidine kinases and response regulators, which interact as part of a signaling system in bacteria.

But known examples are often scarce, and there are tens of millions of undiscovered protein-protein interactions in cells. So the team decided to see if they could reduce the amount of training they gave the algorithm. They gradually lowered the number of known histidine kinase-response regulator pairs that they fed into the algorithm, and were surprised to find that the algorithm continued to work. Finally, they ran the algorithm without giving it any such training pairs, and it still predicted new pairs with 93% accuracy.

“The fact that we didn’t need a set of training data was really surprising,” said Colwell.

The algorithm was developed using proteins from bacteria, and the researchers are now extending the technique to other organisms. “Reactions in living organisms are driven by specific protein interactions,” said Colwell. “This approach allows us to identify and probe these interactions, an essential step towards building a picture of how living systems work.”

The research was supported in part by the National Institutes of Health, the National Science Foundation and the European Union.

Reference:
Anne-Florence Bitbol et al. ‘Inferring interaction partners from protein sequences.’ Proceedings of the National Academy of Sciences (2016). DOI:10.1073/pnas.1606762113


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

‘Gut Feelings’ Help Make More Successful Financial Traders

‘Gut feelings’ help make more successful financial traders

source: www.cam.ac.uk

Financial traders are better at reading their ‘gut feelings’ than the general population – and the better they are at this ability, the more successful they are as traders, according to new research led by the University of Cambridge.

In economics and finance most models analyse conscious reasoning and are based on psychology. We should refocus on the body, or more exactly the interaction between body and brain. Medics find this obvious; economists don’t

John Coates

‘Gut feelings’ – known technically as interoceptive sensations – are sensations that carry information to the brain from many tissues of the body, including the heart and lungs, as well as the gut. They can report anything from body temperature to breathlessness, racing heart, fullness from the gut, bladder and bowel, and they underpin states such as hunger, thirst, pain, and anxiety.

We are often not conscious – or at least barely aware – of this information, but it provides valuable inputs in risky decision making. High-risk choices are accompanied by rapid and subtle physiological changes that feed back to the brain, affecting our decisions, and steering us away from gambles that are likely to lead to loss and towards those that are likely to lead to profit. This can enable people to make important decisions even before they are able to articulate the reasons for their choices.

Traders and investors in the financial markets frequently talk of the importance of gut feelings for selecting profitable trades. To find out the extent to which this belief is correct, researchers from the Universities of Cambridge and Sussex in the UK and Queensland University of Technology in Australia compared the interoceptive abilities of financial traders against those of non-trader control subjects. Their results are published today in the journal Scientific Reports.

The researchers recruited 18 male traders from a hedge fund engaged in high frequency trading, which involves buying and selling futures contracts for only a short period of time – seconds or minutes, a few hours at the most. This form of trading requires an ability to assimilate large amounts of information flowing through news feeds, to rapidly recognize price patterns, and to make large and risky decisions with split-second timing. This niche of the financial markets is particularly unforgiving: while successful traders may earn in excess of £10 million per year, unprofitable ones do not survive for long.

The study took place during a particularly volatile period – the Eurozone crisis – so the performance of each trader reflected his ability to make money during periods of extreme uncertainty. The researchers measured individual differences in each trader’s capacity to detect subtle changes in the physiological state of their bodies by means of two established heartbeat detection tasks. These tasks test how accurately a person, when at rest, can count their heartbeats. Each trader was given a score which, essentially, measured the percentage of right answers, and these scores were compared against data from 48 students at the University of Sussex.

The researchers found that traders performed significantly better at the heart rate detection tasks compared to the controls: the mean score for traders was 78.2, compared to 66.9 for the controls. Even within the group of traders, those who were better at the heart rate detection tasks also performed better at trading, generating greater profits.
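The article summarises each participant's result as essentially a percentage of right answers. A common way to score heartbeat-counting tasks (the Schandry-style accuracy formula; our assumption here, as the article does not spell out the paper's exact formula) uses the relative error between actual and counted beats, and the trial values below are hypothetical:

```python
def heartbeat_score(actual, counted):
    """Schandry-style heartbeat-counting accuracy on a 0-100 scale:
    100 * (1 - mean relative error between actual and counted beats)."""
    errors = [abs(a - c) / a for a, c in zip(actual, counted)]
    return 100 * (1 - sum(errors) / len(errors))

# Hypothetical trials: actual heartbeats vs a participant's count
trader = heartbeat_score([45, 60, 75], [43, 57, 74])   # small errors
control = heartbeat_score([45, 60, 75], [35, 48, 60])  # larger errors
```

Under this scoring, the first (more accurate) participant scores around 96 and the second around 79, a gap of the same order as the 78.2 vs 66.9 group means reported in the study.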

Strikingly, an individual’s interoceptive ability could be used to predict whether they would survive in the financial markets. The researchers plotted heartbeat detection scores against years of experience in the financial markets and found that a trader’s heartbeat counting score predicted the number of years he had survived as a trader.

“Traders in the financial world often speak of the importance of gut feelings for choosing profitable trades – they select from a range of possible trades the one that just ‘feels right’,” says Dr John Coates, a former research fellow in neuroscience and finance at the University of Cambridge, who also used to run a trading desk on Wall Street. “Our findings suggest they’re right – they manage to read real and valuable physiological trading signals, even if they are unaware they are doing so.”

Although the results are consistent with recent studies showing that heartbeat detection skills predict more effective risk taking, the researchers caution that there may be other interpretations. For example, one study has found that heartbeat detection ability increases during stress, so it could be argued that heartbeat detection skills correlated with years of survival merely because experienced traders, taking larger risks, are subjected to greater stresses. The authors of the current study think this unlikely – in trading, as in many other professions, experienced and successful individuals, being more in control, are commonly less stressed than beginners.

The findings also appear to contradict the influential ‘Efficient Markets Hypothesis’ of economic theory, which argues that the market is random, meaning that no trait or skill of an investor or trader – not their IQ, education, nor training – can improve their performance, any more than these traits and skills could improve their performance at flipping coins.

“A large part of a trader’s success and survival seems to be linked to their physiology. Such a finding has profound implications for how we understand financial markets,” adds Dr Mark Gurnell from the Wellcome Trust-Medical Research Council Institute of Metabolic Science at the University of Cambridge.

“In economics and finance most models analyse conscious reasoning and are based on psychology,” Dr Coates continues. “We’re looking instead at risk takers’ physiology – how good are they at sensing signals from their viscera? We should refocus on the body, or more exactly the interaction between body and brain. Medics find this obvious; economists don’t.”

The research was largely funded by the Economic and Social Research Council, the European Research Council and the Dr Mortimer and Theresa Sackler Foundation. Additional support was provided by the National Institute for Health Research Cambridge Biomedical Research Centre.

Reference
Kandasamy, N, Garfinkel, SN, Page, L et al. Interoceptive Ability Predicts Survival on a London Trading Floor. Scientific Reports; 19 Sept 2016; DOI: 10.1038/srep32986


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Neurons Feel The Force – Physical Interactions Control Brain Development

Neurons feel the force – physical interactions control brain development

source: www.cam.ac.uk

Researchers have identified a new mechanism controlling brain development: that neurons not only ‘smell’ chemicals in their environment, but also ‘feel’ their way through the developing brain.

Considering mechanics might lead to new breakthroughs in our understanding of neuronal regeneration.

Kristian Franze

Scientists have found that developing nerve cells are able to ‘feel’ their environment as they grow, helping them form the correct connections within the brain and with other parts of the body. The results, reported in the journal Nature Neuroscience, could open up new avenues of research in brain development, and lead to potential treatments for spinal cord injuries and other types of neuronal damage.

As the brain develops, roughly 100 billion neurons make over 100 trillion connections to send and receive information. For decades, it has been widely accepted that neuronal growth is controlled by small signalling molecules which are ‘sniffed’ out by the growing neurons, telling them which way to go, so that they can find their precise target. The new study, by researchers from the University of Cambridge, shows that neuronal growth is not only controlled by these chemical signals, but also by the physical properties of their environment, which guide the neurons along complex stiffness patterns in the tissue through which they grow.

“The fact that neurons in the developing brain not only respond to chemical signals but also to the mechanical properties of their environment opens many exciting new avenues for research in brain development,” said the study’s lead author Dr Kristian Franze, from Cambridge’s Department of Physiology, Development and Neuroscience. “Considering mechanics might also lead to new breakthroughs in our understanding of neuronal regeneration. For example, following spinal cord injuries, the failure of neurons to regrow through damaged tissue with altered mechanical properties has been a persistent challenge in medicine.”

We navigate our world guided by our senses, which are based on interactions with different facets of our environment — at the seaside you smell and taste the saltiness of the air, feel the grains of sand and the coldness of the water, and hear the crashing of waves on the beach. Within our bodies, individual neurons also sense and react to their environment – they ‘taste’ and ‘smell’ small chemical molecules, and, as this study shows, ‘feel’ the stiffness and structure of their surroundings. They use these senses to guide how and where they grow.

Using a long, wire-like extension called an axon, neurons carry electrical signals throughout the brain and body. During development, axons must grow along precisely defined pathways until they eventually connect with their targets. The enormously complex networks that result control all body functions. Errors in the neuronal ‘wiring’ or catastrophic severing of the connections, as occurs during spinal cord injury, may lead to severe disabilities.

A number of chemical signals controlling axon growth have been identified. Called ‘guidance cues,’ these molecules are produced by cells in the tissue surrounding growing axons and may either attract or repel the axons, directing them along the correct paths. However, chemical guidance cues alone cannot fully explain neuronal growth patterns, suggesting that other factors contribute to guiding neurons.

One of these factors turns out to be mechanics: axons also possess a sense of ‘touch’. In order to move, growing neurons must exert forces on their environment. The environment in turn exerts forces back, and the axons can therefore ‘feel’ the mechanical properties of their surroundings, such as its stiffness. “Consider the difference between walking on squelchy mud versus hard rock – how you walk, your balance and speed, will differ on these two surfaces,” said Franze. “Similarly, axons adjust their growth behaviour depending on the mechanical properties of their environment.” However, until recently it was not known what environments axons encounter as they grow, and Franze and his colleagues decided to find out.

They developed a new technique, based on atomic force microscopy, to measure the stiffness of developing Xenopus frog brains at high resolution – revealing what axons might feel as they grow through the brain. The study found complex patterns of stiffness in the developing brain that seemed to predict axon growth directions. The researchers showed that axons avoided stiffer areas of the brain and grew towards softer regions. Changing the normal brain stiffness caused the axons to get lost and fail to find their targets.
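The finding that axons avoid stiff regions and grow toward soft ones can be caricatured in a few lines of Python. This is a toy "durotaxis" sketch on a one-dimensional stiffness map, not the researchers' model; the function and values are invented for illustration.

```python
def grow_axon(stiffness, start, n_steps):
    """Toy durotaxis sketch: at each step the axon tip moves to the
    softer (lower-stiffness) of its two neighbours on a 1-D map."""
    pos, path = start, [start]
    for _ in range(n_steps):
        left = stiffness[pos - 1] if pos > 0 else float("inf")
        right = stiffness[pos + 1] if pos + 1 < len(stiffness) else float("inf")
        pos = pos - 1 if left < right else pos + 1
        path.append(pos)
    return path

# Stiffness increases to the right, so the tip tracks toward the soft end
path = grow_axon(stiffness=list(range(10)), start=5, n_steps=5)
```

In this caricature, flattening the stiffness map (the analogue of the experiments that altered normal brain stiffness) would leave the tip with no gradient to follow, so its route would no longer be guided.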

In collaboration with Professor Christine Holt’s research group, the team then explored how exactly the axons were feeling their environments. They found that neurons contain ion channels called Piezo1, which sit in the cell membrane: the barrier between cell and environment. These channels open only when a large enough force is applied, similar to shutter valves in air mattresses. When the channels open, small pores form in the neurons’ membranes, allowing calcium ions to enter the cells. Calcium then triggers a number of reactions that change how neurons grow.
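The force-gated behaviour described above can be sketched with a Boltzmann sigmoid, a common phenomenological model for mechanosensitive channels. This is an illustration of the principle only; the parameter values are invented, not measured Piezo1 constants.

```python
import math

def open_probability(force, half_activation=5.0, sensitivity=1.0):
    """Probability that a force-gated channel is open at a given membrane force.

    half_activation: force at which half the channels are open (illustrative units).
    sensitivity: how sharply gating switches around that force.
    """
    return 1.0 / (1.0 + math.exp((half_activation - force) / sensitivity))

# Small forces leave the channel mostly closed; large forces open it.
print(round(open_probability(2.0), 3))  # mostly closed
print(round(open_probability(8.0), 3))  # mostly open
```

The sigmoid captures the key qualitative point in the text: below a threshold force the channel stays shut, and gating switches on steeply once the force is large enough.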

When neuronal membranes were stiffened using a substance extracted from spider venom, which made it harder to open the channels, neurons became ‘numb’ to environmental stiffness. This caused the axons to grow abnormally without reaching their targets. Removing Piezo1 from the cells, similarly abolishing the axons’ capacity to feel differences in stiffness, had the same effect.

“We already understand quite a bit about the detection and integration of chemical signals,” said Franze. “Adding mechanical signals to this picture will lead to a better understanding of the growth and development of the nervous system. These insights will help us answer critical questions in developmental biology as well as in biomedicine and regenerative biology.”

Reference:
David E Koser et al. ‘Mechanosensing is critical for axon growth in the developing brain.’ Nature Neuroscience (2016). DOI: 10.1038/nn.4394


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

A Tight Squeeze For Electrons – Quantum Effects Observed In ‘One-Dimensional’ Wires


source: www.cam.ac.uk

Researchers have observed quantum effects in electrons by squeezing them into one-dimensional ‘quantum wires’ and observing the interactions between them. The results could be used to aid in the development of quantum technologies, including quantum computing.

Scientists have controlled electrons by packing them so tightly that they start to display quantum effects, using an extension of the technology currently used to make computer processors. The technique, reported in the journal Nature Communications, has uncovered properties of quantum matter that could pave the way for new quantum technologies.

The ability to control electrons in this way may lay the groundwork for many technological advances, including quantum computers that can solve problems fundamentally intractable by modern electronics. Before such technologies become practical, however, researchers need a better understanding of quantum, or wave-like, particles and, more importantly, the interactions between them.

Squeezing electrons into a one-dimensional ‘quantum wire’ amplifies their quantum nature to the point that it can be seen, by measuring at what energy and wavelength (or momentum) electrons can be injected into the wire.

“Think of a crowded train carriage, with people standing tightly packed all the way down the centre of the carriage,” said Professor Christopher Ford of the University of Cambridge’s Cavendish Laboratory, one of the paper’s co-authors. “If someone tries to get in a door, they have to push the people closest to them along a bit to make room. In turn, those people push slightly on their neighbours, and so on. A wave of compression passes down the carriage, at some speed related to how people interact with their neighbours, and that speed probably depends on how hard they were shoved by the person getting on the train. By measuring this speed, one could learn about the interactions.”

“The same is true for electrons in a quantum wire – they repel each other and cannot get past, so if one electron enters or leaves, it excites a compressive wave like the people in the train,” said the paper’s first author Dr Maria Moreno, also from the Cavendish Laboratory.

However, electrons have another characteristic, their angular momentum or ‘spin’, which also interacts with their neighbours. Spin can also set off a wave carrying energy along the wire, and this spin wave travels at a different speed to the charge wave. Measuring the wavelength of these waves as the energy is varied is called tunnelling spectroscopy. The separate spin and charge waves were detected experimentally by researchers from Harvard and Cambridge Universities.
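The two wave speeds described above can be sketched with the standard picture of spin-charge separation in a one-dimensional wire, in which the low-energy charge (‘holon’) and spin (‘spinon’) modes each have a linear dispersion with its own velocity. The velocities and wavenumber below are invented for illustration; they are not values from the experiment.

```python
# Illustrative spin-charge separation in a 1D quantum wire:
# charge and spin excitations propagate at different velocities,
# so one injected electron launches two distinct waves.
v_charge = 2.0e5  # holon velocity in m/s (illustrative)
v_spin = 0.8e5    # spinon velocity in m/s (illustrative)

def linear_dispersion(k, velocity):
    """Energy (up to hbar) of a low-energy mode at wavenumber k."""
    return velocity * abs(k)

k = 1.0e7  # wavenumber in 1/m (illustrative)
print("charge mode:", linear_dispersion(k, v_charge))
print("spin mode:", linear_dispersion(k, v_spin))
```

In tunnelling spectroscopy, tracing out energy against momentum reveals these two slopes as separate lines, which is how the charge and spin waves were distinguished experimentally.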

Now, in the paper published in Nature Communications, the Cambridge researchers have gone one stage further, to test the latest predictions of what should happen at high energies, where the original theory breaks down.

A flurry of theoretical activity in the past decade has led to new predictions of other ways of exciting waves among the electrons — it’s as if the person entering the train pushes so hard some people fall over and knock into others much further down the carriage. These new ‘modes’ are weaker than the spin and charge waves and so are harder to detect.

The Cambridge researchers’ collaborators at the University of Birmingham predicted a hierarchy of modes corresponding to the variety of ways in which the interactions can affect the quantum-mechanical particles, with the weaker modes being strongest in very short wires.

To make a set of such short wires, the Cambridge group set about devising a way of making contact with a set of 6000 narrow strips of metal that are used to create the quantum wires from the semiconducting material gallium arsenide (GaAs). This required an extra layer of metal in the shape of bridges between the strips.

By varying the magnetic field and voltage, the researchers mapped out the tunnelling from the wires to an adjacent sheet of electrons. This revealed evidence for one of the extra modes predicted, which appears as an upside-down replica of the spin curve.

These results will now be applied to better understand and control the behaviour of electrons in the building blocks of a quantum computer.

Reference:
Moreno et al. ‘Nonlinear spectra of spinons and holons in short GaAs quantum wires.’ Nature Communications (2016). DOI: 10.1038/ncomms12784


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Gaia Results Revealed – First Data Release From The Most Detailed Map Ever Made Of The Sky


source: www.cam.ac.uk

The first results from the Gaia satellite, which is completing an unprecedented census of more than one billion stars in the Milky Way, are being released today to astronomers and the public.

Gaia’s first major data release is both a wonderful achievement in its own right, and a taster of the truly dramatic advances to come in future years.

Gerry Gilmore

Detailed information about more than a billion stars in the Milky Way has been published in the first data release from the Gaia satellite, which is conducting the first-ever ‘galactic census.’ The release marks the first chance astronomers and the public have had to get their hands on the most detailed map ever made of the sky.

Gaia, which orbits the Sun at a distance of 1.5 million kilometres from Earth, was launched by the European Space Agency in December 2013 with the aim of observing a billion stars and revolutionising our understanding of the Milky Way. During its expected five-year lifespan, Gaia will observe each of a billion stars about 70 times.

The unique mission is reliant on the work of Cambridge researchers who collect the vast quantities of data transmitted by Gaia to a data processing centre at the University, overseen by a team at the Institute of Astronomy.

“Gaia’s first major data release is both a wonderful achievement in its own right, and a taster of the truly dramatic advances to come in future years,” said Professor Gerry Gilmore from the Institute of Astronomy, who is also the UK Principal Investigator for Gaia. “Several UK teams have leading roles in Gaia’s Data Processing and Analysis efforts, which convert the huge raw data streams from the satellite into the beautiful science-ready information now made available for the global scientific and public communities. UK industry made critical contributions to the Gaia spacecraft. The UK public, including school students, as well as scientists, are sharing the excitement of this first ever galactic census.”

In addition to the work taking place at Cambridge, teams from Edinburgh, the Mullard Space Science Laboratory (MSSL) at University College London, Leicester, Bristol and the Science and Technology Facilities Council’s Rutherford Appleton Laboratory are all contributing to the processing of the vast amounts of data from Gaia, in collaboration with industrial and academic partners from across Europe.

The team in Cambridge, led by Dr Floor van Leeuwen, Dr Dafydd Wyn Evans and Dr Francesca De Angeli, processed the flux information – the amount of energy that crosses a unit area per unit time – providing the calibrated magnitudes of around 1.6 billion stars, 1.1 billion of which are now published as part of the first data release.
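The relationship between the flux the team calibrates and the published magnitudes follows the standard logarithmic convention used throughout photometry. The sketch below shows only that generic relation; Gaia’s actual G-band calibration (zero points, passbands, instrument model) is far more involved, and the reference values here are placeholders.

```python
import math

def flux_to_magnitude(flux, ref_flux=1.0, ref_mag=0.0):
    """Convert a measured flux to an apparent magnitude, relative to a
    reference flux that defines magnitude ref_mag (placeholder values)."""
    return ref_mag - 2.5 * math.log10(flux / ref_flux)

# A source 100x fainter than the reference is exactly 5 magnitudes fainter.
print(flux_to_magnitude(0.01))  # → 5.0
```

The factor of 2.5 per decade of flux is why a star 100 times fainter sits 5 magnitudes higher on the scale, with fainter sources having larger magnitudes.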

The Cambridge team also checked the daily photometric data for unexpectedly large outliers, which led to the regular publication of photometric science alerts ready for immediate follow-up observations from the ground.

“The sheer volume of data processed for this first release is beyond imagination: around 120 billion images were analysed, and most of these more than once, as all the processing is iterative,” said van Leeuwen, who is Gaia photometric data processing lead. “Many problems had to be overcome, and a few small ones still remain. Calibrations have not yet reached their full potential. Nevertheless, we are already reaching accuracies that are significantly better than expected before the mission, and which can challenge most ground-based photometric data in accuracy.”

“This first Gaia data release has been an important exercise for the Gaia data reduction teams, getting them to focus on deliverable products and their description,” said Evans. “But it is only the first small step towards much more substantial results.”

While today marks the first major data release from Gaia, in the two years since its launch, the satellite has been producing scientific results in the form of Gaia Alerts.

Dr Simon Hodgkin, lead of the Cambridge Alerts team said, “The Gaia Alerts project takes advantage of the fact that the Gaia satellite scans each part of the sky repeatedly throughout the mission. By comparing successive observations of the same patch of sky, scientists can search for transients – astronomical objects which brighten, fade, change or move. These transients are then announced to the world each day as Gaia Alerts for both professional and amateur astronomers to observe with telescopes from the ground.”

The range of Gaia’s discoveries from Science Alerts is large – supernovae of various types, cataclysmic variable stars, novae, flaring stars, gravitational microlensing events, active galactic nuclei and quasars, and many sources whose nature remains a mystery.

Gaia has discovered many supernovae, the brilliant explosions of stars at the end of their lives. Many of these have been ‘Type Ia’ supernovae, which can be used to measure the accelerating expansion of the Universe. But among these apparently typical supernovae there have been some rarer events. Gaia16ada was spotted by Gaia in the nearby galaxy NGC4559, and appears to be an eruption of a very massive, unstable star. The Hubble Space Telescope observed this galaxy some years ago, allowing astronomers to pinpoint the precise star which erupted.

Another lucky catch for Gaia was the discovery of Gaia16apd – a supernova which is nearly a hundred times brighter than normal. Astronomers still don’t know what the missing ingredient in these ultra-bright supernovae is, and candidates include exotic rapidly spinning neutron stars, or jets from a black hole. Cambridge astronomer Dr Morgan Fraser is trying to understand these events, saying, “We have only found a handful of these exceptionally bright supernovae, compared to thousands of normal supernovae. For Gaia to spot one so nearby is a fantastic result.”

Many of the Gaia Alerts found so far are bright enough to be observable with a small telescope. Amateur astronomers have taken images of supernovae found by Gaia, while schoolchildren have used robotic telescopes including the Faulkes Telescopes in Australia and Hawaii to do real science with transients.

Dr Hodgkin said: “Since the announcement of the first transients discovered with Gaia in September 2014, over one thousand alerts have been released. With Gaia continually relaying new observations to ground, and our team working on finding transients continually improving their software, the discoveries look set to continue well into the future!”

For the UK teams, the future means improving the pre-processing of the data and extending the processing to cover the photometric Blue and Red prism data. Data from the Radial Velocity Spectrometer, processed with major involvement from MSSL, will also be included in future releases. The photometric science alerts will continue to operate throughout the mission, and summaries of the results will be included in future releases. “Despite the considerable amount of data, the first Gaia data release provides just a taste of the accuracy and completeness of the final catalogue,” said De Angeli.

The Cambridge Gaia team has also released a dedicated smartphone app, which will allow anyone worldwide to follow the Gaia Alerts discoveries as they happen. Real spacecraft data will be available to the world as soon as it is processed, with users able to follow the discoveries and see what they are. Information on accessing the app is available at https://gaia.ac.uk.

The Gaia data processing teams in the UK have been, and continue to be, supported by the UK Space Agency and the STFC. STFC helped set up the data applications centre, and its current support covers the UK exploitation of the scientific data yielded by the mission. Industrial partners include Airbus, MSSL and e2v Technologies.


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

“Opening The Skull” Of Patients After Head Injury Reduces Risk Of Death From Brain Swelling


source: www.cam.ac.uk

Craniectomy – a surgical procedure in which part of the skull is removed to relieve brain swelling – significantly reduces the risk of death following traumatic brain injury, an international study led by the University of Cambridge has found.

Traumatic brain injury is an incredibly serious and life-threatening condition. From our study, we estimate that craniectomies can almost halve the risk of death for patients with a severe traumatic brain injury and significant swelling.

Peter Hutchinson

Traumatic brain injury is a serious injury to the brain, often caused by road traffic accidents, assaults or falls. It can lead to dangerous swelling in the brain which, in turn, can lead to brain damage or even death.

A team led by researchers at the Department of Clinical Neurosciences, University of Cambridge, and based at Addenbrooke’s Hospital, recruited more than 400 traumatic brain injury patients from the UK and 19 other countries over a ten-year period. The patients were then randomly assigned to one of two treatment groups – craniectomy or medical management.

In research published this week in the New England Journal of Medicine, the researchers report that six months after the head injury, just over one in four patients (27%) who received a craniectomy had died, compared with just under half (49%) of patients who received medical management. However, the picture was complicated: patients who survived after a craniectomy were more likely to be dependent on others for care (30.4% compared with 16.5%).

Further follow-up showed that patients who survived following a craniectomy continued improving from six to 12 months after injury. As a result, at 12 months, nearly half of craniectomy patients were at least independent at home (45.4%), as compared with one-third of patients in the medical group (32.4%).
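The claim that craniectomy “can almost halve the risk of death” follows directly from the six-month mortality figures quoted above. A minimal sketch of that arithmetic, using only the reported percentages (not the trial’s exact group sizes):

```python
# Six-month mortality figures reported for the two trial arms.
death_risk_surgery = 0.27  # craniectomy group
death_risk_medical = 0.49  # medical-management group

# Absolute risk reduction: percentage-point difference between groups.
arr = death_risk_medical - death_risk_surgery

# Relative risk: surgical risk as a fraction of medical risk.
rr = death_risk_surgery / death_risk_medical

print(f"Absolute risk reduction: {arr:.0%}")  # 22 percentage points
print(f"Relative risk: {rr:.2f}")             # ~0.55, i.e. risk almost halved
```

A relative risk of about 0.55 is what underlies the “almost halve” phrasing, while the 22-percentage-point absolute reduction is the difference a clinician would quote per hundred patients treated.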

Peter Hutchinson, Professor of Neurosurgery at the Department of Clinical Neurosciences at Cambridge, says: “Traumatic brain injury is an incredibly serious and life-threatening condition. From our study, we estimate that craniectomies can almost halve the risk of death for patients with a severe traumatic brain injury and significant swelling. Importantly, this is the first high-quality clinical trial in severe head injury to show a major difference in outcome. However, we need to be really conscious of the quality of life of patients following this operation which ranged from vegetative state through varying states of disability to good recovery.”

Angelos Kolias, Clinical Lecturer at the Department, adds: “Doctors and families will need to be aware of the wide range of possible long-term outcomes when faced with the difficult decision as to whether to subject someone to what is a major operation. Our next step is to look in more detail at factors that predict outcome and at ways to reduce any potential adverse effects following surgery. We are planning to hold a consensus meeting in Cambridge next year to discuss these issues.”

The research was funded by the Medical Research Council (MRC) and managed by the National Institute for Health Research (NIHR) on behalf of the MRC–NIHR partnership, with further support from the NIHR Cambridge Biomedical Research Centre, the Academy of Medical Sciences, the Health Foundation, the Royal College of Surgeons of England and the Evelyn Trust.

Reference:

Hutchinson, PJ et al. ‘Trial of Decompressive Craniectomy for Traumatic Intracranial Hypertension.’ New England Journal of Medicine (2 September 2016).


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.