All posts by Admin

Being Overweight Linked To Poorer Memory

source: www.cam.ac.uk

Overweight young adults may have poorer episodic memory – the ability to recall past events – than their peers, suggests new research from the University of Cambridge, adding to increasing evidence of a link between memory and overeating.

How vividly we remember a recent meal, for example today’s lunch, can make a difference to how hungry we feel

Lucy Cheke

In a preliminary study published in The Quarterly Journal of Experimental Psychology, researchers from the Department of Psychology at Cambridge found an association between high body mass index (BMI) and poorer performance on a test of episodic memory.

Although only a small study, its results support existing findings that excess bodyweight may be associated with changes to the structure and function of the brain and its ability to perform certain cognitive tasks optimally. In particular, obesity has been linked with dysfunction of the hippocampus, an area of the brain involved in memory and learning, and of the frontal lobe, the part of the brain involved in decision making, problem solving and emotions, suggesting that it might also affect memory; however, evidence for memory impairment in obesity is currently limited.

Around 60% of UK adults are overweight or obese: this number is predicted to rise to approximately 70% by 2034. Obesity increases the risk of physical health problems, such as diabetes and heart disease, as well as psychological health problems, such as depression and anxiety.

“Understanding what drives our consumption and how we instinctively regulate our eating behaviour is becoming more and more important given the rise of obesity in society,” says Dr Lucy Cheke. “We know that to some extent hunger and satiety are driven by the balance of hormones in our bodies and brains, but psychological factors also play an important role – we tend to eat more when distracted by television or working, and perhaps to ‘comfort eat’ when we are sad, for example.

“Increasingly, we’re beginning to see that memory – especially episodic memory, the kind where you mentally relive a past event – is also important. How vividly we remember a recent meal, for example today’s lunch, can make a difference to how hungry we feel and how much we are likely to reach out for that tasty chocolate bar later on.”

The researchers tested 50 participants aged 18-35, with BMIs ranging from 18 to 51 – a BMI of 18-25 is considered healthy, 25-30 overweight, and over 30 obese. The participants took part in a memory test known as the ‘Treasure-Hunt Task’, where they were asked to hide items around complex scenes (for example, a desert with palm trees) across two ‘days’. They were then asked to remember which items they had hidden, where they had hidden them, and when they were hidden. Overall, the team found an association between higher BMI and poorer performance on the tasks.
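
To make the BMI bands quoted above concrete, here is a minimal sketch in Python (not part of the study): the thresholds follow the article, while the helper functions and the example weight and height are purely illustrative.

    def bmi(weight_kg, height_m):
        """Body mass index: weight in kilograms divided by height in metres squared."""
        return weight_kg / height_m ** 2

    def bmi_category(value):
        """Classify a BMI value using the bands quoted in the article."""
        if value < 18:
            return "below the study's sampled range"
        if value < 25:
            return "healthy"      # 18-25
        if value < 30:
            return "overweight"   # 25-30
        return "obese"            # over 30

    # Hypothetical example: 95 kg at 1.75 m gives a BMI of about 31 ('obese')
    example = bmi(95, 1.75)
    print(round(example, 1), bmi_category(example))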

The researchers say that the results could suggest that the structural and functional changes in the brain previously found in those with higher BMI may be accompanied by a reduced ability to form and/or retrieve episodic memories. As the effect was shown in young adults, it adds to growing evidence that the cognitive impairments that accompany obesity may be present early in adult life.

This was a small, preliminary study and so the researchers caution that further research will be necessary to establish whether the results of this study can be generalised to overweight individuals in general, and to episodic memory in everyday life rather than in experimental conditions.

“We’re not saying that overweight people are necessarily more forgetful,” cautions Dr Cheke, “but if these results are generalizable to memory in everyday life, then it could be that overweight people are less able to vividly relive details of past events – such as their past meals. Research on the role of memory in eating suggests that this might impair their ability to use memory to help regulate consumption.

“In other words, it is possible that becoming overweight may make it harder to keep track of what and how much you have eaten, potentially making you more likely to overeat.”

Dr Cheke believes that this work is an important step in understanding the role of psychological factors in obesity. “The possibility that there may be episodic memory deficits in overweight individuals is of concern, especially given the growing evidence that episodic memory may have a considerable influence on feeding behaviour and appetite regulation,” she says.

Co-author Dr Jon Simons adds: “By recognising and addressing these psychological factors head-on, not only can we come to understand obesity better, but we may enable the creation of interventions that can make a real difference to health and wellbeing.”

The study was funded by the Medical Research Council and Girton College, University of Cambridge, and the James S McDonnell Foundation.

Reference
Cheke, LG et al. Higher BMI is Associated with Episodic Memory Deficits in Young Adults. The Quarterly Journal of Experimental Psychology; 22 Feb 2016. DOI: 10.1080/17470218.2015.1099163


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


Flowers Tone Down The Iridescence of Their Petals and Avoid Confusing Bees

source: www.cam.ac.uk

Latest research shows that flowers’ iridescent petals, which may look plain to human eyes, are perfectly tailored to a bee’s-eye-view.

There are lots of optical effects in nature that we don’t yet understand… we are finding out that animals and plants have a lot more to say to the world and to each other

Beverley Glover

Iridescent flowers are never as dramatically rainbow-coloured as iridescent beetles, birds or fish, but their petals produce the perfect signal for bees, according to a new study published today in Current Biology.

Bees buzzing around a garden, looking for nectar, need to be able to spot flower petals and recognise which coloured flowers are full of food for them. Professor Beverley Glover from the University of Cambridge’s Department of Plant Sciences and Dr Heather Whitney from the University of Bristol found that iridescence – the shiny, colour-shifting effect seen on soap bubbles – makes flower petals more obvious to bees, but that too much iridescence confuses bees’ ability to distinguish colours.

Whitney, Glover and their colleagues found that flowers use more subtle, or imperfect, iridescence on their petals, which doesn’t interfere with the bees’ ability to distinguish subtly different colours, such as different shades of purple. Perfect iridescence, for example as found on the back of a CD, would make it more difficult for bees to distinguish between subtle colour variations and cause them to make mistakes in their flower choices.

“In 2009 we showed that some flowers can be iridescent and that bees can see that iridescence, but since then we have wondered why floral iridescence is so much less striking than other examples of iridescence in nature,” says Glover. “We have now discovered that floral iridescence is a trade-off that makes flower detection by bumblebees easier, but won’t interfere with their ability to recognise different colours.”

Bees use ‘search images’, based on previously-visited flowers, to remember which coloured flowers are a good source of nectar.

“On each foraging trip a bee will usually retain a single search image of a particular type of flower,” explains Glover, “so if they find a blue flower that is rich in nectar, they will then visit more blue flowers on that trip rather than hopping between different colours. If you watch a bee on a lavender plant, for example, you’ll see it visit lots of lavender flowers and then fly away – it won’t usually move from a lavender flower to a yellow or red flower.”

This colour recognition is vital for both the bees and the plants, which rely on the bees to pollinate them. If petals were perfectly iridescent, then bees could struggle to identify and recognise which colours are worthwhile visiting for nectar – instead, flowers have developed an iridescence signal that allows them to talk to bees in their own visual language.

The researchers created replica flowers that were either perfectly iridescent (using a cast of the back of a CD), imperfectly iridescent (using casts of natural flowers), or non-iridescent. They then tested how long it took for individual bees to find the flowers.

They found that the bees were much quicker to locate the iridescent flowers than the non-iridescent flowers, but it didn’t make a difference whether the flowers were perfectly or imperfectly iridescent. The bees were just as quick to find the replicas modelled on natural petals as they were to find the perfectly iridescent replicas.

When they tested how fast the bees were to find nectar-rich flowers amongst other, similarly-coloured flowers, they found that perfect iridescence impeded the bees’ ability to distinguish between the flowers – the bees were often confused and visited the similarly-coloured flowers that contained no nectar. However, imperfect iridescence, found on natural petals, didn’t interfere with this ability, and the bees were able to successfully locate the correct flowers that were full of nectar.

“Bees are careful shoppers in the floral supermarket, and floral advertising has to tread a fine line between dazzling its customers and being recognisable,” says Lars Chittka from Queen Mary University of London, another co-author of the study.

“To our eyes most iridescent flowers don’t look particularly striking, and we had wondered whether this is simply because flowers aren’t very good at producing iridescence,” says Glover. “But we are not the intended target – bees are, and they see the world differently from humans.”

“There are lots of optical effects in nature that we don’t yet understand. We tend to assume that colour is used for either camouflage or sexual signalling, but we are finding out that animals and plants have a lot more to say to the world and to each other.”

Glover and her colleagues are now working towards developing real flowers that vary in their amount of iridescence so that they can examine how bees interact with them.

“The diffraction grating that the flowers produce is not as perfectly regular as those we can produce on things like CDs, but this ‘advantageous imperfection’ appears to benefit the flower-bee interaction,” says Whitney.

Reference: Whitney, Heather et al. ‘Flower Iridescence Increases Object Detection in the Insect Visual System without Compromising Object Identity.’ Current Biology (2016). DOI: 10.1016/j.cub.2016.01.026

Professor Glover will be giving the talk ‘Can we improve crop pollination by breeding better flowers?’ at the Cambridge Science Festival on Sunday 20 March 2016. More information can be found here: http://www.sciencefestival.cam.ac.uk/events/can-we-improve-crop-pollinat…

Inset images: Iridescent flower (Copyright Howard Rice); Bee on non-iridescent flower (Edwige Moyroud); Bee on non-iridescent flower (Edwige Moyroud).


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


Honeypot Britain? EU Migrants’ Benefits and The UK Referendum

source: www.cam.ac.uk

Ahead of Britain’s EU referendum, research will explore the experiences of EU migrants working in the UK, and attitudes to employment and social security – for which there is little empirical evidence, despite intense political rhetoric. An initial study suggests workers from the EU are significantly under-represented in employment tribunals.

Accusations that the UK has become a ‘honeypot nation’ have become a key issue in the debate about the UK’s membership of the EU

Amy Ludlow

A new Cambridge University research project is gathering “robust empirical evidence” on the experience of EU migrant workers in the UK, exploring everything from hopes and expectations to how they find work and what use EU migrants make of benefits.

The research is timely, as perceptions of EU migrants undercutting British workers or acting as ‘benefits tourists’ are fuelling much of the debate in the lead-up to June’s EU Referendum.

Some MPs are warning that Britain has become a “honeypot nation” with its social security system acting as a primary pull factor, leading to David Cameron’s negotiation of a so-called ‘emergency brake’ on benefits for EU migrants.

However, critics argue that the government have been consistently unable to provide any evidence that this is the case. For example, last week’s response to a Parliamentary question on the amount spent on benefits to EU migrants was simply: “the information is not available”.

The EU Migrant Worker Project will aim to fill some of that knowledge gap. By combining interviews and focus groups with new methodologies for analysing available data, the research team hope to build an evidential base for EU migrants’ experiences of and attitudes toward Britain’s employment and social security systems.

The project, funded by the Economic and Social Research Council, is led by Professor Catherine Barnard and Dr Amy Ludlow from Cambridge’s Faculty of Law, and is launched today (Friday 26th February) with a roundtable discussion involving Labour former Home Secretary Charles Clarke and current Conservative MP Heidi Allen among others.

Professor Barnard said: “We hope to shed new light on the big question of how we adequately regulate migration within a socio-economically diverse EU and a post-financial crisis context. This question is central to Brexit and to the outcome of the UK’s referendum on EU membership.”

Initial work has already been carried out, and a study published last October in the Industrial Law Journal shows that EU migrants are using UK employment tribunals at much lower rates than would be expected relative to population size.

The study, the only one of its kind, is based on analysis of three years of Employment Tribunal decisions alongside field interviews. It suggests that migrant workers from EU-8 nations use employment tribunals over 85% less than would be expected, given the size of the workforce they represent.
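
To see how an ‘over 85% less than would be expected’ figure can arise, here is a rough back-of-the-envelope sketch of the comparison the study describes. All numbers below are invented for illustration; the study’s own counts and methodology are not reproduced here.

    def under_representation(observed_claims, total_claims, workforce_share):
        """Fraction by which a group's observed tribunal claims fall short of the
        count expected from its share of the workforce."""
        expected = total_claims * workforce_share
        return 1 - observed_claims / expected

    # Hypothetical numbers: EU-8 workers make up 3% of the workforce, tribunals
    # decide 10,000 claims in a period, and 45 of those involve EU-8 claimants.
    gap = under_representation(observed_claims=45, total_claims=10000, workforce_share=0.03)
    print("EU-8 workers bring {:.0%} fewer claims than their workforce share predicts".format(gap))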

The researchers identified various factors affecting migrants’ willingness and ability to use tribunals, including: lack of knowledge of their rights, reluctance to engage with the judicial system and, for those in the UK for a short time, a desire to maximise their earnings that is prioritised over complaints about mistreatment.

Under current EU law, EU migrant workers have the right to equal treatment with domestic workers in their terms and conditions of employment.

However, this initial study suggests that when it comes to employment conditions these may be rights that “exist more ‘on paper’ than in practice”, write the researchers.

“While we found good evidence to suggest that EU-8 workers were fairly treated by Employment Tribunal judges, navigating the system and accessing enough advice to understand the basic elements of the rights these workers are due is deeply problematic,” said Dr Ludlow.

“In interviews, we were told that large-scale cuts to local authorities have had a negative impact on resources such as Citizens Advice Bureaux. These are important sources of guidance for workers who cannot afford legal advice, including workers from the EU.”

Professor Barnard said that the introduction of Employment Tribunal fees has meant that many workers are now priced out of claiming their employment rights. “If the Government is concerned about migrant workers’ undercutting employment terms and conditions and labour standards for domestic workers, our research suggests that resource needs to be directed to enabling migrant workers to enforce their rights, and to properly resourcing enforcement organisations such as the Gangmasters Licensing Authority.”

Unlike some other EU Member States, the UK did not impose restrictions on the admission of workers coming from the so-called EU-8 countries (such as Poland and the Czech Republic), apart from the requirement to register under the Workers’ Registration Scheme.

Over a million EU-8 workers, taking advantage of their free movement rights under Article 45 of the Treaty on the Functioning of the European Union (TFEU), have arrived in the UK since 2004. They enjoy rights to equal treatment in any social and tax advantages offered to domestic workers – including the payment of child benefit and ‘in-work benefits’ such as tax credits.

Barnard and Ludlow plan to use the research design from their employment enforcement study and apply it to social security tribunals, to help give some sense of the number of EU migrants who claim benefits and the nature of the cases in which they are involved. They will also interview EU migrants and those that work closely with them, to explore migrants’ hopes, expectations and experiences.

I didn’t come to the UK just to work in any kind of job

Early interviews have highlighted the importance of online grassroots communities, such as Facebook groups, for migrant workers seeking advice. They also suggest that stopping child benefit for EU migrants may result in fewer family units moving to the UK and an increase in younger, unattached men working in the UK, who are likely to integrate less permanently within their host community.

While some interviewees are preparing to leave Britain, citing a better quality of life in their home nation (“I’m not interested in staying in the UK just because it’s possible”), the researchers also found migrant success stories.

One interviewee spoke of her determination to work in nursing: “I didn’t come to the UK just to work in any kind of job. Either I’m working my way towards nursing or, if that’s not possible, I’m going back.” After struggling through bar work and learning medical English on her days off, the woman is now a nurse in a local hospital.

“Many of the EU migrants we’ve talked to so far don’t understand our complex social security system; their only interest is in finding work,” said Dr Ludlow.

As well as one-to-one interviews and focus groups, the researchers will be making a documentary and providing migrant workers with disposable cameras. “It’s another way of trying to capture the migrant experience that offers an alternative insight to words on paper,” said Professor Barnard.

The project is a two-way process, she says, with minute-long podcasts summarizing relevant aspects of the law, which will be available on the EU Migrant Worker Project website later this month.

“What we can offer the migrant community in return is quite detailed knowledge of the law and their rights and how to enforce those rights, both to claim employment rights but also social security benefits.”

Added Dr Ludlow: “Accusations that the UK has become a ‘honeypot nation’ have become a key issue in the debate about the UK’s membership of the EU.

“By gathering empirical evidence about EU migrants’ experiences of navigating the labour market and social security system in the UK, we hope to increase our understanding of EU and domestic law as it works in practice and to inform public opinion in anticipation of the referendum on 23 June and beyond.”

If you are interested in learning more about Professor Barnard and Dr Ludlow’s work please email euworker@hermes.cam.ac.uk, tweet @eumigranworker, or contact them on their Facebook page https://www.facebook.com/eu.migrantworker/. Their project website is: www.eumigrantworker.law.cam.ac.uk.


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


If General Practice Fails, The Whole NHS Fails, Argue Healthcare Experts

source: www.cam.ac.uk

The current focus on financial crises in hospitals diverts attention from the crisis in general practice, argue Professor Martin Roland from the University of Cambridge and GP Sir Sam Everington in an editorial published in The BMJ today.

Hospitals’ £2bn deficit “certainly sounds dramatic”, they argue, “but hospitals don’t go bust – someone usually picks up the bill.” General practice doesn’t have that luxury, and its share of the NHS budget has fallen from 11% in 2006 to under 8.5% now.

Recent research shows that GPs are experiencing unprecedented levels of stress with increasing workload and overwhelming bureaucracy. A GP’s comment at a recent national conference encapsulates the sense of despair: “The pressure of work leaves me in constant fear of making mistakes”.

GPs are finding it harder to recruit trainees and to find partners to replace those increasingly retiring in their 50s.

Politicians and NHS leaders want more care to be moved into primary care, yet the share of funding devoted to general practice is falling as a high proportion of the NHS budget is channelled into hospitals, and in the past 10 years, the number of hospital consultants has increased at twice the rate of GPs.

GPs currently manage the great majority of patients without referral or admission to hospital but if this balance shifted only slightly, hospitals would be overwhelmed.

“It is general practice that makes the NHS one of the world’s most cost effective health services,” they say. The £136 cost per patient per year for unlimited general practice care is less than the cost of a single visit to a hospital outpatient department.

The authors, who are both internationally renowned experts in general practice, present a number of solutions. They say GPs need a “substantial injection of new funding” to provide more staff in primary care.

In addition, new roles are needed to take the strain off clinical staff – for example, physician associates, pharmacists, and advanced practice nurses. They also argue that reviews of practices’ contracts that threaten serious financial destabilisation should be put on hold while a fair funding formula is developed to replace the 25-year-old ‘Carr-Hill’ formula.

NHS England should tackle spiralling indemnity costs by providing Crown Indemnity similar to that for hospital doctors, as GPs increasingly do work previously done by specialists.

Bureaucracy should be slashed, in part by changing the £224m Care Quality Commission inspection regime to one where only the 5-10% of practices found to be struggling are revisited within five years.

In hospitals, the ‘Choose and Book’ referral system needs radical reform – the authors estimate that communicating by phone, email, and online video link could reduce outpatient attendance by as much as 50% in some specialties. And the ‘Payment by Results’ system for funding hospitals must become a population based, capitated budget that incentivises hospitals to support patients and clinicians in the community.

The authors identify two ‘elephants in the room’ that can no longer be ignored. First, cuts to social care make it increasingly difficult for hospitals to discharge patients.

Second, the UK’s funding for healthcare has fallen well behind its European neighbours – now thirteenth out of 15 in healthcare expenditure as a percentage of gross domestic product. In 2000, Tony Blair promised to raise NHS spending to mid-European levels. Today, this would require another £22bn a year.

“Urgent action is needed to restore the NHS,” warn the authors. “But the crisis will not be averted by focusing on hospitals. If general practice fails, the whole NHS fails.”

Professor Martin Roland, Professor of Health Services Research at the University of Cambridge, adds: “GPs need to feel valued rather than continually criticised by politicians and regulators. Many other countries see primary care as the jewel in the crown of the NHS, yet many practices are at breaking point, with an increasing number simply handing in their contracts and closing.”

Sir Sam Everington, Tower Hamlets GP and chair of Tower Hamlets CCG, says: “Patients really value the support of their family doctor, particularly in crises like end of life care. Moving care into the community means supporting patients to die at home surrounded by their loved ones – this is one of many reasons why family medicine is critical to the NHS.

“Family medicine and new developments like social prescribing show the strengths of general practice in supporting vulnerable patients in all aspects of their physical and mental well-being.”

Reference
Martin Roland and Sam Everington. Tackling the crisis in general practice. The BMJ. 18 Feb 2016. DOI: 10.1136/bmj.i942

Adapted from a press release by The BMJ.


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


UK Online Alternative Finance Market Grows To £3.2 Billion In 2015

source: www.cam.ac.uk

The UK online alternative finance sector grew 84% in 2015, facilitating £3.2 billion in investments, loans and donations, according to a new report published today.

These areas of finance are increasingly becoming part of our everyday economic life.

Robert Wardrop

This is a significant increase in volume, but growth of the online alternative finance market is slowing: the 84% rise in 2014/15 was nearly half the 161% growth recorded in 2013/14. Even so, the report said, the alternative finance industry still recorded substantive expansion across almost all models.
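
As a quick arithmetic check of those growth figures (a sketch, not taken from the report): only the 2015 volume and the two growth rates come from the text above, and the earlier-year volumes are back-calculated from them.

    volume_2015 = 3.2e9   # GBP, from the report
    growth_2015 = 0.84    # growth from 2014 to 2015
    growth_2014 = 1.61    # growth from 2013 to 2014

    volume_2014 = volume_2015 / (1 + growth_2015)   # implied 2014 volume, roughly GBP 1.7bn
    volume_2013 = volume_2014 / (1 + growth_2014)   # implied 2013 volume, roughly GBP 0.7bn

    print("Implied 2014 volume: GBP {:.2f}bn".format(volume_2014 / 1e9))
    print("Implied 2013 volume: GBP {:.2f}bn".format(volume_2013 / 1e9))
    print("2014/15 growth as a share of 2013/14 growth: {:.0%}".format(growth_2015 / growth_2014))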

The report also highlights the rapid expansion of donations-based crowdfunding, the perceived risk of fraud and malpractice by the industry, and increasing institutionalisation – as around a quarter of P2P (peer-to-peer) loans are now funded by institutional investors, including traditional banks and government through organisations such as the British Business Bank.

Pushing Boundaries – 2015 UK Alternative Finance is jointly published by the Cambridge Centre for Alternative Finance at the University of Cambridge and UK innovation foundation Nesta, in partnership with KPMG and with the support of CME Group Foundation. It is the latest in an annual series of reports from the University of Cambridge Judge Business School and Nesta, which track the size and development of online alternative finance, such as P2P lending and crowdfunding, in the UK.

Key findings of the report, a survey of 94 crowdfunding and P2P lending platforms, include:

  • Increased share of the market for business finance: in 2015 it is estimated that online alternative finance platforms provided the equivalent of over 3% of all lending to SMEs (small and medium-sized enterprises) in the UK. For small businesses – those with a turnover of less than £1 million a year – P2P platforms provided lending equivalent to 13% of all new bank loans.
  • Institutionalisation is taking off: 2015 saw increased involvement from institutional investors in the online alternative finance market. The report shows that 32% of loans in P2P consumer lending and 26% of P2P business lending were funded by institutional investors.
  • Donation-based crowdfunding is the fastest growing model: although starting from a relatively small base (£2 million), donation-based crowdfunding is the fastest growing model in the 2015 study, up by 500% to £12 million.
  • Real estate is the single most popular sector: in 2014/2015 the most popular sector for online alternative finance investments and loans was real estate, with the combined debt and equity-based funding for this sector reaching £700m in 2015.
  • The equity market is growing fast: the second fastest growing area of the alternative finance market is equity-based crowdfunding, up by 295% – from £84 million raised in 2014, to £332m in 2015. Excluding real estate crowdfunding, in 2014/2015 the equity-based crowdfunding sector contributed to £245 million worth of venture financing in 2015 – equivalent to over 15% of total UK seed and venture equity investment.
  • The industry is generally satisfied with current regulation: when asked what they thought of existing regulation, more than 90% of P2P lending and equity-based crowdfunding platforms stated that they thought the current level was appropriate.
  • The biggest risk to market growth is fraud or malpractice: when asked what they saw as the biggest risk to the future growth of the market, 57% of P2P lending and equity-based crowdfunding platforms cited the potential collapse of one or more well-known industry players due to fraud or malpractice.

“The substantive growth of alternative finance in the UK last year is not surprising, given that these new channels of finance are increasingly moving mainstream,” said Robert Wardrop, Executive Director of Cambridge Centre for Alternative Finance. “One of the key drivers underpinning this development is the growing institutionalisation of the sector. The Cambridge Centre for Alternative Finance is proud to shed light on this fascinating and dynamic industry, to help inform policymakers, regulators and the general public about how these areas of finance are increasingly becoming part of our everyday economic life.”

“2015 has seen another year of remarkable growth for Alternative Finance in the UK,” said Stian Westlake, Nesta’s Executive Director of Policy & Research. “Little more than a collection of plucky startups just six years ago, the sector now does £3.2 billion of business a year. As the sector grows and matures it is sure to face challenges – investors will be keen to see returns, and another financial crisis would certainly test the robustness of P2P lending.”

Warren Mead, Global co-lead of Fintech at KPMG, said: “After years of pushing boundaries, 2016 will be the year where ‘alternative’ financial options finally join the ranks of the mainstream. But while this evolution gives the industry the platform to grow, it also brings its own set of challenges. Being part of the financial establishment doesn’t sit well with its original social purpose. Incumbents are also playing catch up with their own digital investment, and are closing in on the disrupters’ lead. Meanwhile, platform failures within these growing networks are inevitable. So the question is, will the hard won enthusiasm for these platforms start to wane?”

The report was led by Bryan Zhang, a Director of the Cambridge Centre for Alternative Finance, and Peter Baeck, Principal Researcher at Nesta. It has been supported in part by funding from audit, tax and advisory service KPMG and CME Group Foundation, the foundation affiliated with CME Group.

Originally published on the Cambridge Judge Business School website. 


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


Students Share Their Lives At Cambridge

Shadows and mentors walking along King's Parade in Cambridge

source: www.cam.ac.uk

350 Year 12s and mature learners have just experienced life at Cambridge thanks to one of the UK’s largest student-led access initiatives.

This university is for anyone, no matter what their background is, as long as they love their subject and have real academic potential

Helena Blair, CUSU Access Officer

Cambridge University Students’ Union’s (CUSU) annual Shadowing Scheme gives state-school-educated Year 12s and mature prospective applicants from across the UK the opportunity to come to Cambridge, stay in one of its Colleges and ‘shadow’ a current undergraduate for three days.

By giving participants the chance to see what undergraduate life is really like, the Shadowing Scheme shatters the popular myths and misconceptions that might otherwise deter them from applying.

Established in 2000, the Scheme is run over three long weekends in January and February. During this time, each ‘shadow’ accompanies a ‘mentor’, a current undergraduate who is studying a subject which they are interested in.

During their stay, ‘shadows’ get a taste of lectures, supervisions and, for the scientists, laboratory classes. They also have the opportunity to sample some of the University’s student societies and chat to current students from a wide range of backgrounds and courses.

The CUSU Shadowing Scheme targets academically strong Year 12s and mature learners who have little or no family experience of higher education, and who attend schools and colleges which do not have a tradition of progression to leading universities.

Helena Blair, a Cambridge graduate and now CUSU’s Access Officer, says: ‘Our student mentors are eager to spread the word that this university is for anyone, no matter what their background is, as long as they love their subject and have real academic potential. No one from my school had applied before and the myths convinced me that I’d never get in, or fit in, but meeting so many friendly, accepting and diverse Cambridge students changed my outlook.’

This year, Michaela Chan, a Trinity Hall engineering student from Luton, has taken two shadows under her wing – Talent from Welwyn Garden City and Sam from Bradford. Sam hopes to study engineering and got the chance to attend a few second-year lectures in the Faculty. Talent kept her options open, attending a maths lecture and joining Cambridge medics at Addenbrooke’s Hospital.

Meanwhile, Alia Khalid, a Sidney Sussex philosophy student from Harrogate, is mentoring Josh, a Year 12 from Stoke. Josh is trying to work out whether he’s more interested in Philosophy or Psychological and Behavioural Sciences. The pair emerge from a morning lecture on Physicalism, the philosophical position that everything which exists is no more extensive than its physical properties.

Unfazed but hungry, they head towards Sidney Sussex College, where an informal ‘Meet the Students’ pizza event has taken over the College bar. Joining them on the stroll along the iconic King’s Parade are Olivia, a philosophy student, and her shadow, Adriana, a fellow Londoner.

Olivia applied to Cambridge after taking part in one of the Summer Schools run by the University with the Sutton Trust and is now working hard to give Adriana, who hasn’t decided whether to study English, Philosophy or Law, as much information as possible. At Sidney Sussex, she introduces her to Damian, a first-year English student at Christ’s College. “He said some useful things about his course and how to prepare for applying,” reports Adriana. “I’ve got some more reading to do … Is that pizza spicy?”

At the University’s Careers Service, ‘shadows’ from Norwich, Wales, Liverpool and London have gathered to hear from its Director, Gordon Chesterman, about the opportunities which a Cambridge degree can offer. Cambridge graduates are some of the most employable anywhere in the world, but Gordon is also keen to emphasise that a Cambridge degree provides flexibility and choice. Luke, a Year 12 from Swansea, would like to study Natural Sciences but he’s worried that if he makes the wrong choice now, he’ll struggle to get the job he wants later on. Gordon immediately reassures him.

“What do you think these people studied?” he asks. “A fraud investigator, an investment banker, a long-haul pilot, a community outreach officer in Iraq?”

“Maths?” someone suggests.

“No, they actually all studied music.”

Everyone is surprised but, as Gordon explains, studying Music at Cambridge develops the analytical skills, organisation and self-discipline which all of these careers demand.

At the end of the session, conversation turns to life in Cambridge and Lara Grace, a Year 12 from Streatham, admits: “If I got in, I’d have to learn how to ride a bike. I’ve never cycled in London.” Lara Grace wants to apply to study Human, Social, and Political Sciences and then pursue a career in human rights. If she’s only anxious about the cycling, the Shadowing Scheme has done its job – busting myths and inspiring confidence.

Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


Five-Dimensional Black Hole Could ‘Break’ General Relativity

source: www.cam.ac.uk

 Researchers have successfully simulated how a ring-shaped black hole could cause general relativity to break down: assuming the universe contains at least five dimensions, that is.

As long as singularities stay hidden behind an event horizon, they do not cause trouble and general relativity holds

Markus Kunesch

Researchers have shown how a bizarrely shaped black hole could cause Einstein’s general theory of relativity, a foundation of modern physics, to break down. However, such an object could only exist in a universe with five or more dimensions.

The researchers, from the University of Cambridge and Queen Mary University of London, have successfully simulated a black hole shaped like a very thin ring, which gives rise to a series of ‘bulges’ connected by strings that become thinner over time. These strings eventually become so thin that they pinch off into a series of miniature black holes, similar to how a thin stream of water from a tap breaks up into droplets.

Ring-shaped black holes were ‘discovered’ by theoretical physicists in 2002, but this is the first time that their dynamics have been successfully simulated using supercomputers. Should this type of black hole form, it would lead to the appearance of a ‘naked singularity’, which would cause the equations behind general relativity to break down. The results are published in the journal Physical Review Letters.


General relativity underpins our current understanding of gravity: everything from the estimation of the age of the stars in the universe, to the GPS signals we rely on to help us navigate, is based on Einstein’s equations. In part, the theory tells us that matter warps its surrounding spacetime, and what we call gravity is the effect of that warp. In the 100 years since it was published, general relativity has passed every test that has been thrown at it, but one of its limitations is the existence of singularities.

A singularity is a point where gravity is so intense that space, time, and the laws of physics, break down. General relativity predicts that singularities exist at the centre of black holes, and that they are surrounded by an event horizon – the ‘point of no return’, where the gravitational pull becomes so strong that escape is impossible, meaning that they cannot be observed from the outside.

“As long as singularities stay hidden behind an event horizon, they do not cause trouble and general relativity holds – the ‘cosmic censorship conjecture’ says that this is always the case,” said study co-author Markus Kunesch, a PhD student at Cambridge’s Department of Applied Mathematics and Theoretical Physics (DAMTP). “As long as the cosmic censorship conjecture is valid, we can safely predict the future outside of black holes. Because ultimately, what we’re trying to do in physics is to predict the future given knowledge about the state of the universe now.”

But what if a singularity existed outside of an event horizon? If it did, not only would it be visible from the outside, but it would represent an object that has collapsed to an infinite density, a state which causes the laws of physics to break down. Theoretical physicists have hypothesised that such a thing, called a naked singularity, might exist in higher dimensions.

“If naked singularities exist, general relativity breaks down,” said co-author Saran Tunyasuvunakool, also a PhD student from DAMTP. “And if general relativity breaks down, it would throw everything upside down, because it would no longer have any predictive power – it could no longer be considered as a standalone theory to explain the universe.”

We think of the universe as existing in three dimensions, plus the fourth dimension of time, which together are referred to as spacetime. But, in branches of theoretical physics such as string theory, the universe could be made up of as many as 11 dimensions. Additional dimensions could be large and expansive, or they could be curled up, tiny, and hard to detect. Since humans can only directly perceive three dimensions, the existence of extra dimensions can only be inferred through very high energy experiments, such as those conducted at the Large Hadron Collider.

Einstein’s theory itself does not state how many dimensions there are in the universe, so theoretical physicists have been studying general relativity in higher dimensions to see if cosmic censorship still holds. The discovery of ring-shaped black holes in five dimensions led researchers to hypothesise that they could break up and give rise to a naked singularity.

What the Cambridge researchers, along with their co-author Pau Figueras from Queen Mary University of London, have found is that if the ring is thin enough, it can lead to the formation of naked singularities.

Using the COSMOS supercomputer, the researchers were able to perform a full simulation of Einstein’s complete theory in higher dimensions, allowing them to not only confirm that these ‘black rings’ are unstable, but to also identify their eventual fate. Most of the time, a black ring collapses back into a sphere, so that the singularity would stay contained within the event horizon. Only a very thin black ring becomes sufficiently unstable as to form bulges connected by thinner and thinner strings, eventually breaking off and forming a naked singularity. New simulation techniques and computer code were required to handle these extreme shapes.

“The better we get at simulating Einstein’s theory of gravity in higher dimensions, the easier it will be for us to help with advancing new computational techniques – we’re pushing the limits of what you can do on a computer when it comes to Einstein’s theory,” said Tunyasuvunakool. “But if cosmic censorship doesn’t hold in higher dimensions, then maybe we need to look at what’s so special about a four-dimensional universe that means it does hold.”

The cosmic censorship conjecture is widely expected to be true in our four-dimensional universe, but should it be disproved, an alternative way of explaining the universe would then need to be identified. One possibility is quantum gravity, which approximates Einstein’s equations far away from a singularity, but also provides a description of new physics close to the singularity.

The COSMOS supercomputer at the University of Cambridge is part of the Science and Technology Facilities Council (STFC) DiRAC HPC Facility.

Inset image: A video of a very thin black ring starting to break up into droplets. In this process a naked singularity is created and weak cosmic censorship is violated. Credit: Pau Figueras, Markus Kunesch, and Saran Tunyasuvunakool

Reference:
Pau Figueras, Markus Kunesch, and Saran Tunyasuvunakool ‘End Point of Black Ring Instabilities and the Weak Cosmic Censorship Conjecture.’ Physical Review Letters (2016). DOI: 10.1103/PhysRevLett.116.071102


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


Researchers Identify ‘Neurostatin’ That May Reduce The Risk of Alzheimer’s Disease

source: www.cam.ac.uk

An approved anti-cancer drug successfully targets the first step in the toxic chain reaction that leads to Alzheimer’s disease, suggesting that treatments may be found to lower the risk of developing the neurodegenerative condition.

The body has a variety of natural defences to protect itself against neurodegeneration, but as we age, these defences become progressively impaired and can get overwhelmed.

Michele Vendruscolo

Researchers have identified a drug that targets the first step in the toxic chain reaction leading to the death of brain cells, suggesting that treatments could be developed to protect against Alzheimer’s disease, in a similar way to how statins are able to reduce the risk of developing heart disease.

The drug, which is an approved anti-cancer treatment, has been shown to delay the onset of Alzheimer’s disease, both in a test tube and in nematode worms. It has previously been suggested that statin-like drugs – which are safe and can be taken widely by those at risk of developing disease – might be a prospect, but this is the first time that a potential ‘neurostatin’ has been reported.

When the drug was given to nematode worms genetically programmed to develop Alzheimer’s disease, it had no effect once symptoms had already appeared. But when the drug was given to the worms before any symptoms became apparent, no evidence of the condition appeared, raising the possibility that this drug, or other molecules like it, could be used to reduce the risk of developing Alzheimer’s disease. The results are reported in the journal Science Advances.

By analysing the way the drug, called bexarotene, works at the molecular level, the international team of researchers, from the University of Cambridge, Lund University and the University of Groningen, found that it stops the first step in the molecular cascade that leads to the death of brain cells. This step, called primary nucleation, occurs when naturally occurring proteins in the body fold into the wrong shape and stick together with other proteins, eventually forming thin filament-like structures called amyloid fibrils. This process also creates smaller clusters called oligomers, which are highly toxic to nerve cells and are thought to be responsible for brain damage in Alzheimer’s disease.

“The body has a variety of natural defences to protect itself against neurodegeneration, but as we age, these defences become progressively impaired and can get overwhelmed,” said Professor Michele Vendruscolo of Cambridge’s Department of Chemistry, the paper’s senior author. “By understanding how these natural defences work, we might be able to support them by designing drugs that behave in similar ways.”

For the past two decades, researchers have attempted to develop treatments for Alzheimer’s that could stop the aggregation and proliferation of oligomers. However, these attempts have all failed, in part because there was not a precise knowledge of the mechanics of the disease’s development: Vendruscolo and his colleagues have been working to understand exactly that.

Using a test developed by study co-author Professor Tuomas Knowles, also from the Department of Chemistry, and by Professor Sara Linse, from Lund University, the researchers were able to determine what happens during each stage of the disease’s development, and also what might happen if one of those stages was somehow switched off.

“In order to block protein aggregation, we need accurate understanding of exactly what is happening and when,” said Vendruscolo. “The test that we have developed not only measures the rates of the process as a whole, but also the rates of its specific component sub-processes, so that we can reduce the toxicity of the aggregates rather than simply stopping them forming.”

Johnny Habchi, the first author of the paper, and colleagues assembled a library of more than 10,000 small molecules which interact in some way with amyloid-beta, a molecule that plays a vital role in Alzheimer’s disease. Using the test developed by Knowles and Linse, the researchers first analysed molecules that were either drugs already approved for some other purpose, or drugs developed for Alzheimer’s disease or other similar conditions which had failed clinical trials.

The first successful molecule they identified was bexarotene, which is approved by the US Food and Drug Administration for the treatment of lymphoma. “One of the real steps forward was to take a molecule that we thought could be a potential drug and work out exactly what it does. In this case, what it does is suppress primary nucleation, which is the aim for any neurostatin-type molecule,” said Vendruscolo. “If you stop the process before aggregation has started, you can’t get proliferation.”
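
To illustrate why suppressing primary nucleation delays the whole cascade, the toy kinetic sketch below tracks just two processes: primary nucleation, which creates new fibrils, and elongation, which grows existing ones. It is not the authors’ kinetic framework (which, for amyloid-beta, also includes secondary nucleation), and all rate constants and concentrations are arbitrary illustrative values.

    import numpy as np
    from scipy.integrate import solve_ivp

    m_total = 1.0   # total monomer concentration (arbitrary units)
    k_plus = 50.0   # elongation rate constant (arbitrary)
    n_c = 2         # nucleus size (arbitrary)

    def rates(t, y, k_n):
        P, M = y                      # fibril number P and fibril mass M
        m = max(m_total - M, 0.0)     # free monomer remaining
        dP = k_n * m ** n_c           # primary nucleation creates new fibrils
        dM = 2 * k_plus * m * P       # elongation grows existing fibrils
        return [dP, dM]

    def half_time(k_n):
        """Time at which half of the monomer has converted into fibrils."""
        sol = solve_ivp(rates, (0, 200), [0.0, 0.0], args=(k_n,), dense_output=True, max_step=0.1)
        t = np.linspace(0, 200, 4000)
        M = sol.sol(t)[1]
        idx = np.argmax(M >= 0.5 * m_total)
        return t[idx] if M[idx] >= 0.5 * m_total else float("inf")

    # Progressively stronger suppression of primary nucleation pushes the
    # aggregation half-time later and later.
    for k_n in (1e-3, 1e-4, 1e-5):
        print("k_n = {:.0e} -> half-time ~ {:.1f} (arbitrary time units)".format(k_n, half_time(k_n)))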

One of the key advances of the current work is that by understanding the mechanisms of how Alzheimer’s disease develops in the brain, the researchers were able to target bexarotene to the correct point in the process.

“Even if you have an effective molecule, if you target the wrong step in the process, you can actually make things worse by causing toxic protein assemblies to build up elsewhere,” said study co-author Professor Chris Dobson, Master of St John’s College, University of Cambridge. “It’s like traffic control – if you close a road to try to reduce jams, you can actually make the situation worse if you put the block in the wrong place. It is not necessarily the case that all the molecules in earlier drug trials were ineffective, but it may be that in some cases the timing of the delivery was wrong.”

Earlier studies of bexarotene had suggested that the drug could actually reverse Alzheimer’s symptoms by clearing amyloid-beta aggregates in the brain, which received a great deal of attention. However, the earlier results, which were later called into question, were based on a completely different mode of action – the clearance of aggregates – than the one reported in the current study. By exploiting their novel approach, which enables them to carry out highly quantitative analysis of the aggregation process, the researchers have now shown that compounds such as bexarotene could instead be developed as preventive drugs, because their primary action is to inhibit the crucial first step in the aggregation of amyloid-beta.

“We know that the accumulation of amyloid is a hallmark feature of Alzheimer’s and that drugs to halt this build-up could help protect nerve cells from damage and death,” said Dr Rosa Sancho, Head of Research at Alzheimer’s Research UK. “A recent clinical trial of bexarotene in people with Alzheimer’s was not successful, but this new work in worms suggests the drug may need to be given very early in the disease. We will now need to see whether this new preventative approach could halt the earliest biological events in Alzheimer’s and keep damage at bay in further animal and human studies.”

Over the next 35 years, the number of people with Alzheimer’s disease is predicted to go from 40 million to 130 million, with 70% of those in middle or low-income countries. “The only way of realistically stopping this dramatic rise is through preventive measures: treating Alzheimer’s disease only after symptoms have already developed could overwhelm healthcare systems around the world.”

The body has a number of natural defences designed to keep proteins in check. But as we get older, these processes can become impaired and get overwhelmed, and some proteins can slip through the safety net, resulting in Alzheimer’s disease and other protein misfolding conditions. While neurostatins are not a cure for Alzheimer’s disease, the researchers say that they could reduce its risk by acting as a backup for the body’s natural defences against misfolding proteins.

“You wouldn’t give statins to someone who had just had a heart attack, and we doubt that giving a neurostatin to an Alzheimer’s patient who could no longer recognise a family member would be very helpful,” said Dobson. “But if it reduces the risk of the initial step in the process, then it has a serious prospect of being an effective preventive treatment.”

But is there hope for those already affected by the disease? The methods that have led to the present advance have enabled the researchers to identify compounds that, rather than preventing the disease, could slow down its progression even when symptoms have become evident. “The next target of our research is also to be able to treat victims of this dreadful disease,” said Vendruscolo.

Reference:
Johnny Habchi et al. ‘An anti-cancer drug suppresses the primary nucleation reaction that produces the toxic Aβ42 aggregates linked with Alzheimer’s disease.’ Science Advances (2016). DOI: 10.1126/sciadv.1501244


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


Could The Food We Eat Affect Our Genes? Study In Yeast Suggests This May Be The Case

source: www.cam.ac.uk

Almost all of our genes may be influenced by the food we eat, according to new research published in the journal Nature Microbiology. The study, carried out in yeast – which can be used to model some of the body’s fundamental processes – shows that while the activity of our genes influences our metabolism, the opposite is also true and the nutrients available to cells influence our genes.

In many cases the effects were so strong, that changing a cell’s metabolic profile could make some of its genes behave in a completely different manner

Markus Ralser

The behaviour of our cells is determined by a combination of the activity of their genes and the chemical reactions needed to maintain the cells, known as metabolism. Metabolism works in two directions: the breakdown of molecules to provide energy for the body and the production of all compounds needed by the cells.

Knowing the genome – the complete DNA ‘blueprint’ of an organism – can provide a substantial amount of information about how a particular organism will look. However, this does not give the complete picture: genes can be regulated by other genes or regions of DNA, or by ‘epigenetic’ modifiers – small molecules attached to the DNA that act like switches to turn genes on and off.

Previous studies have suggested that another player in gene regulation may exist: the metabolic network – the biochemical reactions that occur within an organism. These reactions mainly depend on the nutrients a cell has available – the sugars, amino acids, fatty acids and vitamins that are derived from the food we eat.

To examine the scale at which this happens, an international team of researchers, led by Dr Markus Ralser at the University of Cambridge and the Francis Crick Institute, London, addressed the role of metabolism in the most basic functionality of a cell. They did so using yeast cells. Yeast is an ideal model organism for large-scale experiments as it is much simpler to manipulate than animal models, yet many of its important genes and fundamental cellular mechanisms are the same as or very similar to those in animals and humans.

The researchers manipulated the levels of important metabolites – the products of metabolic reactions – in the yeast cells and examined how this affected the behaviour of the genes and the molecules they produced. Almost nine out of ten genes and their products were affected by changes in cellular metabolism.

“Cellular metabolism plays a far more dynamic role in the cells than we previously thought,” explains Dr Ralser. “Nearly all of a cell’s genes are influenced by changes to the nutrients they have access to. In fact, in many cases the effects were so strong, that changing a cell’s metabolic profile could make some of its genes behave in a completely different manner.

“The classical view is that genes control how nutrients are broken down into important molecules, but we’ve shown that the opposite is true, too: how the nutrients break down affects how our genes behave.”

The researchers believe that the findings may have wide-ranging implications, including on how we respond to certain drugs. In cancers, for example, tumour cells develop multiple genetic mutations, which change the metabolic network within the cells. This in turn could affect the behaviour of the genes and may explain why some drugs fail to work for some individuals.

“Another important aspect of our findings is a practical one for scientists,” says Dr Ralser. “Biological experiments are often not reproducible between laboratories and we often blame sloppy researchers for that. It appears, however, that small metabolic differences can change the outcomes of the experiments. We need to establish new laboratory procedures that control better for differences in metabolism. This will help us to design better and more reliable experiments.”

Reference
Alam, MT et al. The metabolic background is a global player in Saccharomyces gene expression epistasis. Nature Microbiology; 1 Feb 2016. DOI: 10.1038/nmicrobiol.2015.30



Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


Gravitational Waves Detected 100 Years After Einstein’s Prediction

Gravitational waves detected 100 years after Einstein’s prediction

source: www.cam.ac.uk

New window on the universe is opened with the observation of gravitational waves – ripples in spacetime – caused by the collision of two black holes.

I feel incredibly lucky to be part of the team – this discovery will change the way we do astronomy.

Christopher Moore

An international team of scientists have observed ripples in the fabric of spacetime called gravitational waves, arriving at the earth from a cataclysmic event in the distant universe. This confirms a major prediction of Albert Einstein’s 1915 general theory of relativity and opens an unprecedented new window onto the cosmos.

The gravitational waves were detected on 14 September 2015 at 09:51 UK time by both LIGO (Laser Interferometer Gravitational-wave Observatory) detectors in Louisiana and Washington State in the US. They originated from two black holes, each around 30 times the mass of the Sun and located more than 1.3 billion light years from Earth, coalescing to form a single, even more massive black hole.

The LIGO Observatories are funded by the National Science Foundation (NSF), and were conceived, built, and are operated by Caltech and MIT. The discovery, published in the journal Physical Review Letters, was made by the LIGO Scientific Collaboration (which includes the GEO Collaboration and the Australian Consortium for Interferometric Gravitational Astronomy) and the Virgo Collaboration using data from the two LIGO detectors.

“The discovery of gravitational waves by the LIGO team is an incredible achievement,” said Professor Stephen Hawking, the Dennis Stanton Avery and Sally Tsui Wong-Avery Director of Research at the Department of Applied Mathematics and Theoretical Physics at the University of Cambridge. “It is the first observation of gravitational waves as predicted by Einstein and will allow us new insights into our universe. The gravitational waves were released from the collision of two black holes, the properties of which are consistent with predictions I made in Cambridge in the 1970s, such as the black hole area and uniqueness theorems. We can expect this observation to be the first of many as LIGO sensitivity increases, keeping us all busy with many further surprises.”

Gravitational waves carry unique information about the origins of our Universe and studying them is expected to provide important insights into the evolution of stars, supernovae, gamma-ray bursts, neutron stars and black holes. However, they interact very weakly with particles and require incredibly sensitive equipment to detect. British and German teams, including researchers from the University of Cambridge, working with US, Australian, Italian and French colleagues as part of the LIGO Scientific Collaboration and the Virgo Collaboration, are using a technique called laser interferometry.

Each LIGO site comprises two tubes, each four kilometres long, arranged in an L-shape. A laser is beamed down each tube to very precisely monitor the distance between mirrors at each end. According to Einstein’s theory, the distance between the mirrors will change by a tiny amount when a gravitational wave passes by the detector. A change in the lengths of the arms of close to 10⁻¹⁹ metres (just one-ten-thousandth the diameter of a proton) can be detected.

According to general relativity, a pair of black holes orbiting around each other lose energy through the emission of gravitational waves, causing them to gradually approach each other over billions of years, and then much more quickly in the final minutes. During the final fraction of a second, the two black holes collide into each other at nearly one-half the speed of light and form a single more massive black hole, converting a portion of the combined black holes’ mass to energy, according to Einstein’s formula E=mc². This energy is emitted as a final strong burst of gravitational waves. It is these gravitational waves that LIGO has observed.
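
To put these figures in perspective, here is a back-of-the-envelope calculation in Python. It uses only the arm length and detectable length change quoted above, plus an assumed radiated mass of roughly three solar masses, which is an illustrative assumption rather than a figure taken from this article.

```python
# Back-of-the-envelope numbers for the scales described above. The ~3 solar
# masses radiated is an illustrative assumption, not a figure from this article.
ARM_LENGTH = 4_000.0     # metres: each LIGO arm
DELTA_L = 1e-19          # metres: the quoted detectable change in arm length
C = 2.998e8              # speed of light, m/s
M_SUN = 1.989e30         # one solar mass, kg

# Strain is the fractional change in length a passing wave produces.
strain = DELTA_L / ARM_LENGTH
print(f"detectable strain h ~ {strain:.1e}")        # about 2.5e-23

# Energy released if roughly three solar masses are converted via E = m c^2.
radiated_mass = 3 * M_SUN                           # assumption, for illustration
energy_joules = radiated_mass * C**2
print(f"energy radiated ~ {energy_joules:.1e} J")   # about 5e47 J
```

On these assumptions the source radiates around 5 × 10⁴⁷ joules, yet the effect at Earth is a length change far smaller than a proton, which is why the interferometers must be so extraordinarily sensitive.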

Independent and widely separated observatories are necessary to verify the direction of the event causing the gravitational waves, and also to determine that the signals come from space and are not from some other local phenomenon.

To ensure absolute accuracy, the consortium of nearly 1,000 scientists from 16 countries spent several months carefully checking and re-checking the data before submitting their findings for publication.

Christopher Moore, a PhD student from Cambridge’s Institute of Astronomy, was part of the discovery team who worked on the data analysis.

“Since September, we’ve known that something was detected, but it took months of checking to confirm that it was actually gravitational waves,” he said. “This team has been looking for evidence of gravitational waves for decades – a huge amount of work has gone into it, and I feel incredibly lucky to be part of the team. This discovery will change the way we do astronomy.”

Over coming years, the Advanced LIGO detectors will be ramped up to full power, increasing their sensitivity to gravitational waves, and in particular allowing more distant events to be measured. With the addition of further detectors, initially in Italy and later in other locations around the world, this first detection is surely just the beginning. UK scientists continue to contribute to the design and development of future generations of gravitational wave detectors.

The UK Minister for Universities and Science, Jo Johnson MP, said: “Einstein’s theories from over a century ago are still helping us to understand our universe. Now that we have the technological capability to test his theories with the LIGO detectors his scientific brilliance becomes all the more apparent. The Government is increasing support for international research collaborations, and these scientists from across the UK have played a vital part in this discovery.”

LIGO was originally proposed as a means of detecting these gravitational waves in the 1980s by Kip Thorne, Caltech’s Richard P. Feynman Professor of Theoretical Physics, Emeritus; Ronald Drever, professor of physics, emeritus also from Caltech; and Rainer Weiss, professor of physics, emeritus, from MIT.

“The description of this observation is beautifully described in the Einstein theory of General Relativity formulated 100 years ago and comprises the first test of the theory in strong gravitation. It would have been wonderful to watch Einstein’s face had we been able to tell him,” said Weiss.

“With this discovery, we humans are embarking on a marvelous new quest: the quest to explore the warped side of the universe—objects and phenomena that are made from warped spacetime. Colliding black holes and gravitational waves are our first beautiful examples,” said Thorne.

The discovery was made possible by the enhanced capabilities of Advanced LIGO, a major upgrade that increases the sensitivity of the instruments compared to the first generation LIGO detectors, enabling a large increase in the volume of the universe probed—and the discovery of gravitational waves during its first observation run.

The US National Science Foundation leads in financial support for Advanced LIGO. Funding organisations in Germany (Max Planck Society), the UK (Science and Technology Facilities Council, STFC) and Australia (Australian Research Council) also have made significant commitments to the project.

Several of the key technologies that made Advanced LIGO so much more sensitive have been developed and tested by the German UK GEO collaboration. Significant computer resources have been contributed by the AEI Hannover Atlas Cluster, the LIGO Laboratory, Syracuse University, and the University of Wisconsin-Milwaukee.

Several universities designed, built, and tested key components for Advanced LIGO: The Australian National University, the University of Adelaide, the University of Florida, Stanford University, Columbia University of New York, and Louisiana State University.

Cambridge has a long-standing involvement in the field of gravitational wave science, and specifically with the LIGO experiment. Until recently these efforts were spearheaded by Dr Jonathan Gair, who left last year for a post at the University of Edinburgh and who has made significant contributions to a wide range of gravitational wave and LIGO science; he is one of the authors on the new paper. Several scientists in Cambridge are current members of the collaboration, including PhD students Christopher Moore and Alvin Chua from the Institute of Astronomy; Professor Anthony Lasenby and PhD student Sonke Hee from the Cavendish Laboratory and the Kavli Institute of Cosmology; and Professor Mike Hobson from the Cavendish Laboratory.

Further members of the collaboration until recently based at Cambridge include Dr Philip Graff (author on the detection paper) and Dr Farhan Feroz, who, jointly with Mike Hobson and Anthony Lasenby, developed a machine learning method of analysis currently used within LIGO, as well as Dr Christopher Berry (author) and Dr Priscilla Canizares.

These findings will be discussed at next month’s Cambridge Science Festival during the open afternoon at the Institute of Astronomy.

Reference:
B. P. Abbott et al. (LIGO Scientific Collaboration and Virgo Collaboration) ‘Observation of Gravitational Waves from a Binary Black Hole Merger.’ Physical Review Letters (2016). DOI: 10.1103/PhysRevLett.116.061102. 


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


Stopping Tumour Cells Killing Surrounding Tissue May Provide Clue To Fighting Cancer

Stopping tumour cells killing surrounding tissue may provide clue to fighting cancer

source: www.cam.ac.uk

Tumours kill off surrounding cells to make room to grow, according to new research from the University of Cambridge. Although the study was carried out using fruit flies, its findings suggest that drugs to prevent, rather than encourage, cell death might be effective at fighting cancer – contrary to how many of the current chemotherapy drugs work.

It sounds counterintuitive not to encourage cell death as this means you’re not attacking the tumour itself

Eugenia Piddini

The idea that different populations of cells compete within the body, with winners and losers, was discovered in the 1970s and is thought to be a ‘quality control’ mechanism to rid the tissue of damaged or poorly-performing cells. With the discovery that genes involved in cancer promote this process, scientists have speculated that so-called ‘cell competition’ might explain how tumours grow within our tissues.

Now, researchers at the Wellcome Trust/Cancer Research UK Gurdon Institute, University of Cambridge, have used fruit flies genetically manipulated to develop intestinal tumours to show for the first time that as the tumour grows and its cells proliferate, it kills off surrounding healthy cells, making space in which to grow. The results of the study, funded by Cancer Research UK, are published in the journal Current Biology.

Image: Tumour cells (green) growing in the intestine of a fruit fly. Credit: Golnar Kolahgar

Dr Eugenia Piddini, who led the research, believes the finding may answer one of the longstanding questions about cancer. “We know that as cancer spreads through the body – or ‘metastasises’ – it can cause organ failure,” she says. “Our finding suggests a possible explanation for this: if the tumour kills surrounding cells, there will come a point where there are no longer enough healthy cells for the organ to continue to function.”

The cancer cells encourage a process known as apoptosis, or ‘cell death’, in the surrounding cells, though the mechanism by which this occurs is currently unclear and will be the subject of further research.

By genetically manipulating the surrounding cells to resist apoptosis, the researchers were able to contain the tumour and prevent its spread. This suggests that drugs that carry out the same function – inhibiting cell death – may provide an effective way to prevent the spread of some types of cancer. This runs counter to the current approach to fighting cancer: most drugs used in chemotherapy encourage cell death as a way of destroying the tumour, though this can cause ‘collateral damage’ to healthy cells, which is why chemotherapy patients often become very sick during treatment.

In fact, some drugs that inhibit cell death are already being tested in clinical trials to treat conditions such as liver damage; if proven to be safe, they may provide options for potential anti-cancer drugs. However, further research is needed to confirm that this approach will be suitable for treating cancer.

“It sounds counterintuitive not to encourage cell death as this means you’re not attacking the tumour itself,” says Dr Eugenia Piddini. “But if we think of it like an army fighting a titan, it makes sense that if you protect your soldiers and stop them dying, you stand a better chance of containing – and even killing – your enemy.”

The work, which was carried out by postdoctoral researcher Saskia Suijkerbuijk and colleagues in the Piddini group, used fruit flies because they are much simpler organisms to study than mammals; however, many of the genes being studied are conserved across species – in other words, the genes, or genes with an identical or very similar function, are found in both the fruit fly and mammals.

Dr Alan Worsley, senior science information officer at Cancer Research UK, said: “Tumours often need to elbow healthy cells out of the way in order to grow. This intriguing study in fruit flies suggests that if researchers can turn off the signals that tell healthy cells to die, they could act as a barrier that boxes cancer cells in and stunts their growth. We don’t yet know if the same thing would work in patients, but it highlights an ingenious new approach that could help to keep early stage cancers in check.”

Reference
Suijkerbuijk, SJE et al. Cell competition drives the growth of intestinal adenomas in Drosophila. Current Biology; 22 Feb 2016. dx.doi.org/10.1016/j.cub.2015.12.043


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


The Fitzwilliam Museum is 200 Today

The Fitzwilliam Museum is 200 today

Today, one of the great collections of art in the UK celebrates its bicentenary. Two hundred years to the day after the death of its mysterious founder, Richard, 7th Viscount Fitzwilliam of Merrion, the Fitzwilliam Museum has revealed previously unknown details of his life.

The gift Viscount Fitzwilliam left to the nation was one of the most important of his age

Tim Knox

Research for a new book has shown how his beloved library may have contributed to his death, and how his passion for music led him to the love of his life: a French dancer with whom he had two children, ‘Fitz’ and ‘Billy’.

The Fitzwilliam Museum: a History is written by Lucilla Burn, Assistant Director for Collections at the Fitzwilliam. She said: “Lord Fitzwilliam’s life has been described as ‘deeply obscure’. Many men of his class and period, who sought neither fame nor notoriety, nor wrote copious letters or diaries, do not leave a conspicuous record. But by going through the archives and letters that relate to him, for the first time we can paint a fuller picture of his history, including aspects of his life that have previously been unknown, even to staff here at the Fitzwilliam.”

Lord Fitzwilliam died on the 4th of February 1816, and founded the Fitzwilliam Museum through the bequest to the University of Cambridge of his splendid collection of art, books and manuscripts, along with £100,000 to build the Museum. This generous gift began the story of one of the finest museums in Britain, which now houses over half a million artworks and antiquities.

Other than his close connection to Cambridge and his love of art and books, a motivation for Fitzwilliam’s bequest may have been his lack of legitimate heirs. The new details of his mistress help to explain why he never married.

In 1761 Richard Fitzwilliam entered Trinity Hall, Cambridge, and in 1763 his Latin ode, Ad Pacem, was published in a volume of loyal addresses to George III printed by the University of Cambridge. He made a strong impression on his tutor, the fiercely ambitious Samuel Hallifax, who commissioned Joseph Wright of Derby to paint a fine portrait of Fitzwilliam on his graduation with an MA degree in 1764.

Fitzwilliam’s studies continued after Cambridge; he travelled widely on the continent, perfecting his harpsichord technique in Paris with Jacques Duphly, an eminent composer, teacher and performer. A number of Fitzwilliam’s own harpsichord compositions have survived, indicating he was a gifted musician.
But from 1784 he was also drawn to Paris by his passionate attachment to Marie Anne Bernard, a dancer at the Opéra whose stage name was Zacharie. With Zacharie, Fitzwilliam fathered three children, two of whom survived infancy – little boys known to their parents as ‘Fitz’ and ‘Billy’. How the love affair ended is unknown, but its fate was clouded, if not doomed, by the French Revolution.

We do not know what happened to Zacharie after her last surviving letter, written to Lord Fitzwilliam late in December 1790. Her health was poor, so it is possible that she died in France. However, the elder son, ‘Fitz’, Henry Fitzwilliam Bernard, his wife Frances and their daughter Catherine were alive and living in Richmond with Lord Fitzwilliam at the time of the latter’s death in 1816. It is not known what happened to ‘Billy’.

At the age of seventy, early in August 1815, Lord Fitzwilliam fell from a ladder in his library and broke his knee. This accident may have contributed to his death the following spring. On 18 August that year Fitzwilliam drew up his last will and testament. Over the course of his life he had travelled extensively in Europe; by the time of his death he had amassed around 144 paintings, including masterpieces by Titian, Veronese and Palma Vecchio, 300 carefully ordered albums of Old Master prints, and a magnificent library containing illuminated manuscripts, musical autographs by Europe’s greatest composers and 10,000 fine printed books.

His estates were left to his cousin’s son, George Augustus Herbert, eleventh Earl of Pembroke and eighth Earl of Montgomery. But he also carefully provided for his relatives and dearest friends. The family of Fitzwilliam’s illegitimate son, Henry Fitzwilliam Bernard (‘Fitz’), including his wife and daughter, received annuities for life totalling £2,100 a year.
On his motivation for leaving all his works of art to the University, he wrote: “And I do hereby declare that the bequests so by me made to the said Chancellor Masters and Scholars of the said University are so made to them for the purpose of promoting the Increase of Learning and the other great objects of that Noble Foundation.”

Fitzwilliam Museum Director Tim Knox said: “The gift Viscount Fitzwilliam left to the nation was one of the most important of his age. This was the period when public museums were just beginning to emerge. Being a connoisseur of art, books and music, our Founder saw the importance of public collections for the benefit of all. But we are also lucky that his life circumstances enabled him to do so – had there been a legitimate heir, he might not have been able to give with such liberality. From the records we have discovered he appears to have been as generous as he was learned: he arranged music concerts to raise funds for charity, and helped many people escaping the bloodiest moments of the French Revolution. We are delighted to commemorate our Founder in our bicentenary year.”

Exhibitions and events for the Fitzwilliam Museum’s Bicentenary will be taking place throughout 2016. These include two key exhibitions opening in February, a retrospective of its history, Celebrating the First 200 Years: The Fitzwilliam Museum 1816 – 2016, and a major exhibition of Egyptian antiquities, Death on the Nile: Uncovering the afterlife of ancient Egypt. For more information visit www.fitzmuseum.cam.ac.uk.


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


Syrian Aid: Lack of Evidence For ‘Interventions That Work’, Say Researchers

Syrian aid: lack of evidence for ‘interventions that work’, say researchers

source: www.cam.ac.uk

The lack of an evidence base in the donor-funded response to the Syrian migrant crisis means funds may be allocated to ineffective interventions, say researchers, who call on funders and policymakers in London for this week’s Syrian Donor Conference to insist on evaluation as a condition of aid.

A focus on health and health services is notably absent in the donor conference agenda yet it is a fundamental determinant on the success of education and livelihoods policies

Adam Coutts

In the fifth year of the Syrian refugee crisis, donors and humanitarian agencies remain unsure about which policies and interventions have been most effective, and continue to rely on a largely reactive response, say a group of researchers, aid workers and Syrian medical professionals.

Response approaches to date have often been short-termist, have sometimes duplicated work, and are backed by very little evidence of effectiveness or impact, they say.

As national leaders and UN delegates gather in London today for the Support Syria Donor Conference, members of the Syrian Public Health Network warn that unless aid is provided on condition of evidence-gathering and transparency so funding can be directed to interventions that work, the health, education and livelihoods of refugees will continue to deteriorate.

They caution that Syrians in neighbouring countries such as Lebanon and Jordan – where services are stretched to breaking point – will suffer the most from ineffective interventions unless the governments and NGOs of wealthy nations do more to link the allocation of donor funds to evidence, something that Network members have highlighted in a briefing for the UK’s Department for International Development.

“A focus on health and health services is notably absent in the donor conference agenda yet it is a fundamental determinant on the success of education and livelihoods policies,” said Dr Adam Coutts, Cambridge University researcher and member of the Syria Public Health Network.

“What funding there is for refugee healthcare risks disappearing unless governments insist on an evidence basis for aid allocation, similar to that expected in domestic policy-making.

“It is estimated that there are now over 4.3 million Syrian refugees in neighbouring frontline countries, and over half these people are under the age of 18. This level of displacement is unprecedented and given how short funds are, we need to be sure that programmes work,” said Coutts, from Cambridge’s Department of Politics and International Studies.

“New ideas and approaches need to be adopted in order to reduce the massive burdens on neighbouring frontline states.”

Researchers say that the health response should do more to address the so-called ‘non-communicable diseases’ which ultimately cause more deaths: slow, silent killers such as diabetes, heart disease and, in particular, mental disorders. This means moving towards the development of universal health care systems in the region and building new public health services.

The calls for more evidence come on the back of an article published last week in the Journal of the Royal Society of Medicine, in which members of the Syria Public Health Network (SPHN) address the response to mental disorders among displaced Syrians.

Clinics in some camps in Turkey and Lebanon report almost half of occupants suffering from high levels of psychological distress. However, many Syrians in neighbouring countries live outside the camps – up to 80% in Jordan, for example – which means cases are unreported.

In Lebanon, despite political commitment to mental health, there are just 71 psychiatrists, mostly in Beirut.

“The implementation of short-term mental health interventions which often lack culturally relevant or practically feasible assessment tools risks diverting funds away from longer term, evidence based solutions,” said Coutts.

Moreover, a shortage of Syrian mental health professionals – fewer than 100 before the conflict, a number that has since fallen below 60 – is worsened by some neighbouring countries preventing Syrian doctors of any specialism from practising. Along with Physicians for Human Rights, SPHN members are calling for restrictions to be lifted on practising licenses for displaced Syrian health professionals.

“To date Syrian medical workers in Lebanon and Jordan are a largely untapped workforce who are ready to work and help with the response. However, due to labour laws and the dominance of private health service providers it is very difficult if not impossible for them to work legally,” said SPHN member Dr Aula Abbara.

Emerging evidence from the Syrian crisis, as well as evidence from previous conflicts, is pointing to psychological treatments which show some effectiveness:

Pilot studies with refugees in Turkish camps using ‘telemental’ projects, the delivery of psychiatric care through telecommunications, suggest that such techniques are effective in supporting healthcare professionals on the ground.

The ‘teaching recovery techniques’ method is designed to boost children’s capacity to cope with the psychological aftermath of war. These techniques have been used in communities in the aftermath of major natural disasters and conflicts, and have shown promise.

While SPHN members caution that adequate testing of these interventions is required, they argue that this is precisely the point: more evidence is needed of what works.

Added Coutts: “A more scientific approach is needed so that precious and increasingly scarce financial aid is put to the most effective use possible. At the moment, NGOs and governments are not making sufficient reference to evidence in determining health, education and labour market policies for the largest displacement of people since World War Two.”


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


Modelling How The Brain Makes Complex Decisions

Modelling how the brain makes complex decisions

source: www.cam.ac.uk

Researchers have built the first biologically realistic mathematical model of how the brain plans and learns when faced with a complex decision-making process.

By combining planning and learning into one coherent model, we’ve made what is probably the most comprehensive model of complex decision-making to date

Johannes Friedrich

Researchers have constructed the first comprehensive model of how neurons in the brain behave when faced with a complex decision-making process, and how they adapt and learn from mistakes.

The mathematical model, developed by researchers from the University of Cambridge, is the first biologically realistic account of the process, and is able to predict not only behaviour, but also neural activity. The results, reported in The Journal of Neuroscience, could aid in the understanding of conditions from obsessive compulsive disorder and addiction to Parkinson’s disease.

The model was compared to experimental data for a wide-ranging set of tasks, from simple binary choices to multistep sequential decision making. It accurately captures behavioural choice probabilities and predicts choice reversal in an experiment, a hallmark of complex decision making.

Our decisions may provide immediate gratification, but they can also have far-reaching consequences, which in turn depend on several other actions we have already made or will make in the future. The trouble that most of us have is how to take the potential long-term effects of a particular decision into account, so that we make the best choice.

There are two main types of decisions: habit-based and goal-based. An example of a habit-based decision would be a daily commute, which is generally the same every day. Just as certain websites are cached on a computer so that they load faster the next time they are visited, habits are formed by ‘caching’ certain behaviours so that they become virtually automatic.
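
As a rough illustration of the caching idea (not of the neural model itself), the short Python sketch below uses a hypothetical plan_route function: the first call does the slow, deliberate work, and every repeat call returns the stored answer instantly, which is the computational essence of a habit.

```python
from functools import lru_cache
import time

@lru_cache(maxsize=None)
def plan_route(origin: str, destination: str) -> str:
    """Stand-in for a slow, deliberate, goal-based planning step (hypothetical)."""
    time.sleep(1)  # pretend the route takes real effort to work out
    return f"{origin} -> ring road -> {destination}"

plan_route("home", "office")   # first call is slow: the route is worked out
plan_route("home", "office")   # repeat call is instant: the cached 'habit' is reused
```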

An example of a goal-based decision would be a traffic accident or road closure on that same commute, forcing the adoption of a different route.

“A goal-based decision is much more complicated from a neurobiological point of view, because there are so many more variables – it involves exploring a branching set of possible future situations,” said the paper’s first author Dr Johannes Friedrich of Columbia University, who conducted the work while a postdoctoral researcher in Cambridge’s Department of Engineering. “If you think about a detour on your daily commute, you need to make a separate decision each time you reach an intersection.”

Habit-based decisions have been thoroughly studied by neuroscientists and are fairly well understood in terms of how they work at a neural level. The mechanisms behind goal-based decisions, however, remain elusive.

Now, Friedrich and Dr Máté Lengyel, also from Cambridge’s Department of Engineering, have built a biologically realistic solution to this computational problem. The researchers have shown mathematically how a network of neurons, when connected appropriately, can identify the best decision in a given situation and its future cumulative reward.

“Constructing these sorts of models is difficult because the model has to plan for all possible decisions at any given point in the process, and computations have to be performed in a biologically plausible manner,” said Friedrich. “But it’s an important part of figuring out how the brain works, since the ability to make decisions is such a core competence for both humans and animals.”

The researchers also found that for making a goal-based decision, the synapses which connect the neurons together need to ‘embed’ the knowledge of how situations follow on from each other, depending on the actions that are chosen, and how they result in immediate reward.
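
Those two ingredients – which situation follows from each action, and what immediate reward it brings – are exactly what a textbook dynamic-programming planner needs in order to compute cumulative future reward. The Python sketch below is not the authors’ spiking-neuron network; it is a minimal value-iteration loop over a made-up commuting example, included only to show how the best decision and its future cumulative reward can be read off once transitions and rewards are known.

```python
# Minimal value iteration over a made-up commuting example (all states,
# actions and rewards are hypothetical). transitions[state][action] gives
# (next_state, immediate_reward) -- the two pieces of knowledge described above.
transitions = {
    "junction": {"left": ("detour", -1.0), "right": ("jam", -5.0)},
    "detour":   {"go":   ("office",  2.0)},
    "jam":      {"wait": ("office",  0.5)},
    "office":   {},                     # terminal state
}

gamma = 0.9                             # discount applied to future reward
values = {state: 0.0 for state in transitions}

for _ in range(50):                     # iterate until the values settle
    for state, actions in transitions.items():
        if actions:
            values[state] = max(reward + gamma * values[nxt]
                                for nxt, reward in actions.values())

best = {state: max(actions, key=lambda a: actions[a][1] + gamma * values[actions[a][0]])
        for state, actions in transitions.items() if actions}

print(values)   # expected cumulative reward obtainable from each state
print(best)     # the action that achieves it, e.g. 'left' at the junction
```

In this toy example the planner settles on ‘left’ at the junction because the small immediate penalty of the detour is outweighed by the discounted reward of arriving at the office sooner.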

Crucially, they were also able to show in the same model how synapses can adapt and re-shape themselves depending on what did or didn’t work previously, in the same way that it has been observed in human and animal subjects.

“By combining planning and learning into one coherent model, we’ve made what is probably the most comprehensive model of complex decision-making to date,” said Friedrich. “What I also find exciting is that figuring out how the brain may be doing it has already suggested new algorithms to us that could be used in computers to solve similar tasks,” added Lengyel.

The model could be used to aid in the understanding of a range of conditions. For instance, there is evidence for selective impairment in goal-directed behavioural control in patients with obsessive compulsive disorder, which forces them to rely instead on habits. Deep understanding of the underlying neural processes is important as impaired decision making has also been linked to suicide attempts, addiction and Parkinson’s disease.

Reference:
Johannes Friedrich and Máté Lengyel. ‘Goal-Directed Decision Making with Spiking Neurons.’ The Journal of Neuroscience (2016). DOI: 10.1523/JNEUROSCI.2854-15.2016


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


Graphene Shown To Safely Interact With Neurons In The Brain

Graphene shown to safely interact with neurons in the brain

source: www.cam.ac.uk

Researchers have shown that graphene can be used to make electrodes that can be implanted in the brain, which could potentially be used to restore sensory functions for amputee or paralysed patients, or for individuals with motor disorders such as Parkinson’s disease.

We are just at the tip of the iceberg when it comes to the potential of graphene and related materials in bio-applications and medicine.

Andrea Ferrari

Researchers have successfully demonstrated how it is possible to interface graphene – a two-dimensional form of carbon – with neurons, or nerve cells, while maintaining the integrity of these vital cells. The work may be used to build graphene-based electrodes that can safely be implanted in the brain, offering promise for the restoration of sensory functions for amputee or paralysed patients, or for individuals with motor disorders such as epilepsy or Parkinson’s disease.

The research, published in the journal ACS Nano, was an interdisciplinary collaboration coordinated by the University of Trieste in Italy and the Cambridge Graphene Centre.

Previously, other groups had shown that it is possible to use treated graphene to interact with neurons. However, the signal-to-noise ratio from this interface was very low. By developing methods of working with untreated graphene, the researchers retained the material’s electrical conductivity, making it a significantly better electrode.

“For the first time we interfaced graphene to neurons directly,” said Professor Laura Ballerini of the University of Trieste in Italy. “We then tested the ability of neurons to generate electrical signals known to represent brain activities, and found that the neurons retained their neuronal signalling properties unaltered. This is the first functional study of neuronal synaptic activity using uncoated graphene based materials.”

Our understanding of the brain has increased to such a degree that by interfacing directly between the brain and the outside world we can now harness and control some of its functions. For instance, by measuring the brain’s electrical impulses, sensory functions can be recovered. This can be used to control robotic arms for amputee patients or any number of basic processes for paralysed patients – from speech to movement of objects in the world around them. Alternatively, by interfering with these electrical impulses, motor disorders (such as epilepsy or Parkinson’s) can start to be controlled.

Scientists have made this possible by developing electrodes that can be placed deep within the brain. These electrodes connect directly to neurons and transmit their electrical signals away from the body, allowing their meaning to be decoded.

However, the interface between neurons and electrodes has often been problematic: not only do the electrodes need to be highly sensitive to electrical impulses, but they need to be stable in the body without altering the tissue they measure.

Too often the modern electrodes used for this interface (based on tungsten or silicon) suffer from partial or complete loss of signal over time. This is often caused by the formation of scar tissue from the electrode insertion, which prevents the electrode from moving with the natural movements of the brain due to its rigid nature.

Graphene has been shown to be a promising material to solve these problems, because of its excellent conductivity, flexibility, biocompatibility and stability within the body.

Based on experiments conducted in rat brain cell cultures, the researchers found that untreated graphene electrodes interfaced well with neurons. By studying the neurons with electron microscopy and immunofluorescence the researchers found that they remained healthy, transmitting normal electric impulses and, importantly, none of the adverse reactions which lead to the damaging scar tissue were seen.

According to the researchers, this is the first step towards using pristine graphene-based materials as an electrode for a neuro-interface. In future, the researchers will investigate how different forms of graphene, from multiple layers to monolayers, are able to affect neurons, and whether tuning the material properties of graphene might alter the synapses and neuronal excitability in new and unique ways. “Hopefully this will pave the way for better deep brain implants to both harness and control the brain, with higher sensitivity and fewer unwanted side effects,” said Ballerini.

“We are currently involved in frontline research in graphene technology towards biomedical applications,” said Professor Maurizio Prato from the University of Trieste. “In this scenario, the development and translation in neurology of graphene-based high-performance biodevices requires the exploration of the interactions between graphene nano- and micro-sheets with the sophisticated signalling machinery of nerve cells. Our work is only a first step in that direction.”

“These initial results show how we are just at the tip of the iceberg when it comes to the potential of graphene and related materials in bio-applications and medicine,” said Professor Andrea Ferrari, Director of the Cambridge Graphene Centre. “The expertise developed at the Cambridge Graphene Centre allows us to produce large quantities of pristine material in solution, and this study proves the compatibility of our process with neuro-interfaces.”

The research was funded by the Graphene Flagship, a European initiative which promotes a collaborative approach to research with an aim of helping to translate graphene out of the academic laboratory, through local industry and into society.

Reference:
Fabbro, A et al. ‘Graphene-Based Interfaces do not Alter Target Nerve Cells.’ ACS Nano (2016). DOI: 10.1021/acsnano.5b05647


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


Changes to NHS Policy Unlikely to Reduce Emergency Hospital Admissions

Changes to NHS policy unlikely to reduce emergency hospital admissions

source: www.cam.ac.uk

Recent changes to UK healthcare policy intended to reduce the number of emergency hospital admissions are unlikely to be effective, according to a study published in the British Medical Journal.

Too often government policy is based on wishful thinking rather than on hard evidence on what is actually likely to work, and new interventions often aren’t given enough time to bed in to know whether they’re really working

Martin Roland

Alternative approaches are therefore needed to tackle the continuing rise of costly emergency admissions, conclude researchers from the Health Research Board Centre for Primary Care Research at the Royal College of Surgeons in Ireland (RCSI) in collaboration with the University of Cambridge.

Recently introduced changes to GPs’ pay mean that they are now incentivised to identify people in their practice thought to be at high risk of future emergency admission and to offer them extra support in the form of ‘case-management’, including personalised care plans. However, the researchers show that emergency admission is a difficult outcome to predict reliably. Electronic tools have been developed to identify people at high risk, but these tools will, at best, only identify a minority of people who will actually be admitted to hospital. In addition, the researchers found that there is currently little evidence that implementing case management for people identified as high-risk actually reduces the risk of future emergency admission.
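
A simple, entirely hypothetical worked example shows why such tools identify only a minority of future admissions: if a tool flags its top 5% ‘highest risk’ patients and even 30% of those flagged go on to be admitted (both figures invented for illustration, not taken from the study), most admissions still come from people the tool never flagged.

```python
# Hypothetical numbers, chosen only to illustrate the arithmetic;
# none of these figures come from the BMJ study.
population = 10_000
admission_rate = 0.05    # 5% of patients have an emergency admission
flag_fraction = 0.05     # the tool flags its top 5% 'highest risk' patients
flagged_ppv = 0.30       # 30% of flagged patients really are admitted

admissions = population * admission_rate    # 500 admissions in total
flagged = population * flag_fraction        # 500 patients flagged
identified = flagged * flagged_ppv          # 150 admissions anticipated in advance

print(f"admissions identified in advance: {identified:.0f} of {admissions:.0f} "
      f"({identified / admissions:.0%})")
# -> 150 of 500 (30%): even a seemingly good tool misses most admissions.
```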

The authors suggest alternative options that may have more impact on the use of hospital beds for patients following an emergency admission, based on the research evidence in this area.

One recommendation is to focus on reducing the length of time that patients are in hospital – though this depends on resources being available in the community to support patients when they are discharged. Second, a significant proportion of all emergency admissions are re-admissions to hospital following discharge and research evidence supports interventions to reduce some of these admissions, especially when several members of the healthcare team (e.g. doctor, nurse, social worker, case manager) are involved in helping patients manage the transition from hospital to home.

A third option is to focus on certain medical conditions, such as pneumonia, known to cause avoidable emergency admissions and more likely to respond to interventions in primary care. Finally, the authors suggest that policy efforts should be concentrated in more deprived areas where people are more likely to suffer with multiple chronic medical conditions and are more likely to be admitted to hospital.

Lead author and Health Research Board Research Fellow Dr Emma Wallace from the RCSI said: “Reducing emergency admissions is a popular target when trying to curtail spiralling healthcare costs. However, only a proportion of all emergency admissions are actually avoidable and it’s important that policy efforts to reduce emergency admissions are directed where they are most likely to succeed.

“Our analysis indicates that current UK healthcare policy targeting people identified as high risk in primary care for case management is unlikely to be effective and alternative options need to be considered.”

Professor Martin Roland, senior author and Professor of Health Services Research at the University of Cambridge, added: “Too often government policy is based on wishful thinking rather than on hard evidence on what is actually likely to work, and new interventions often aren’t given enough time to bed in to know whether they’re really working.

“Reducing the number of people who are readmitted to hospital, and reducing the length of time that people stay in hospital are both likely to have a bigger effect on hospital bed use than trying to predict admission in the population. Both of these need close working between primary and secondary care and between health and social care.”

Reference
Wallace, E et al. Reducing emergency admissions through community-based interventions: are uncertainties in the evidence reflected in health policy? BMJ; 28 Jan 2016


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


Making Operating Systems Safer and Faster With ‘Unikernels’

Making operating systems safer and faster with ‘unikernels’

source: www.cam.ac.uk

Technology to improve the security, speed and scale of data processing in the age of the Internet of Things is being developed by a Cambridge spin-out company.

This acquisition shows the power of open source development to have impact and to be commercially successful.

Andy Hopper

Specialised computer software components to improve the security, speed and scale of data processing in cloud computing are being developed by a University of Cambridge spin-out company. The company, Unikernel Systems, which was formed by staff and postdoctoral researchers at the University Computer Laboratory, has recently been acquired by San Francisco-based software company Docker Inc.

Unikernels are small, potentially transient computer modules specialised to undertake a single task at the point in time when it is needed. Because of their reduced size, they are far more secure than traditional operating systems, and can be started up and shut down quickly and cheaply, providing flexibility and further security.

They are likely to become increasingly used in applications where security and efficiency are vital, such as systems storing personal data and applications for the so-called Internet of Things (IoT) – internet-connected appliances and consumer products.

“Unikernels provide the means to run the same application code on radically different environments from the public cloud to IoT devices,” said Dr Richard Mortier of the Computer Laboratory, one of the company’s advisors. “This allows decisions about where to run things to be revisited in the light of experience – providing greater flexibility and resilience. It also means software on those IoT devices is going to be a lot more reliable.”

Recent years have seen a huge increase in the amount of data that is collected, stored and processed, a trend that will only continue as increasing numbers of devices are connected to the internet. Most commercial data storage and processing now takes place within huge datacentres run by specialist providers, rather than on individual machines and company servers; the individual elements of this system are obscured to end users within the ‘cloud’. One of the technologies that has been instrumental in making this happen is virtual machines.

Normally, a virtual machine (VM) runs just like a real computer, with its own virtual operating system – just as your desktop computer might run Windows. However, a single real machine can run many VMs concurrently. VMs are general purpose, able to handle a wide range of jobs from different types of user, and capable of being moved across real machines within datacentres in response to overall user demand. The University’s Computer Laboratory started research on virtualisation in 1999, and the Xen virtual machine monitor that resulted now provides the basis for much of the present-day cloud.

Although VMs have driven the development of the cloud (and greatly reduced energy consumption), their inherent flexibility can come at a cost if their virtual operating systems are the generic Linux or Windows systems. These operating systems are large and complex, they have significant memory footprints, and they take time to start up each time they are required. Security is also an issue, because of their relatively large ‘attack surface’.

Given that many VMs are actually used to undertake a single function (e.g. acting as a company database), recent research has shifted to minimising complexity and improving security by taking advantage of the narrow functionality. And this is where unikernels come in.

Researchers at the Computer Laboratory started restructuring VMs into flexible modular components in 2009, as part of the RCUK-funded MirageOS project. These specialised modules – or unikernels – are in effect the opposite of generic VMs. Each one is designed to undertake a single task; they are small, simple and quick, using just enough code to enable the relevant application or process to run (about 4% of a traditional operating system according to one estimate).

The small size of unikernels also lends considerable security advantages, as they present a much smaller ‘surface’ to malicious attack, and also enable companies to separate out different data processing tasks in order to limit the effects of any security breach that does occur. Given that resource use within the cloud is metered and charged, they also provide considerable cost savings to end users.

By the end of last year, the unikernel technology arising from MirageOS was sufficiently advanced that the team, led by Dr. Anil Madhavapeddy, decided to found a start-up company. The company, Unikernel Systems, was recently acquired by San Francisco-based Docker Inc. to accelerate the development and broad adoption of the technology, now envisaged as a critical element in the future of the Internet of Things.

“This brings together one of the most significant developments in operating systems technology of recent years, with one of the most dynamic startups that has already revolutionised the way we use cloud computing. This link-up will truly allow us all to ‘rethink cloud infrastructure’,” said Balraj Singh, co-founder and CEO of Unikernel Systems.

“This acquisition shows that the Computer Laboratory continues to produce innovations that find their way into mainstream developments. It also shows the power of open source development to have impact and to be commercially successful”, said Professor Andy Hopper, Head of the University of Cambridge Computer Laboratory.


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


How Many Ways Can You Arrange 128 Tennis Balls? Researchers Solve an Apparently Impossible Problem

How many ways can you arrange 128 tennis balls? Researchers solve an apparently impossible problem

source: www.cam.ac.uk

A bewildering physics problem has apparently been solved by researchers, in a study which provides a mathematical basis for understanding issues ranging from predicting the formation of deserts, to making artificial intelligence more efficient.

The brute force way of doing this would be to keep changing the system and recording the configurations. Unfortunately, it would take many lifetimes before you could record it all. Also, you couldn’t store them, because there isn’t enough matter in the universe.

Stefano Martiniani

In research carried out at the University of Cambridge, a team developed a computer program that can answer this mind-bending puzzle: Imagine that you have 128 soft spheres, a bit like tennis balls. You can pack them together in any number of ways. How many different arrangements are possible?

The answer, it turns out, is something like 10²⁵⁰ (1 followed by 250 zeros). The number, also referred to as ten unquadragintilliard, is so huge that it vastly exceeds the total number of particles in the universe.

Far more important than the solution, however, is the fact that the researchers were able to answer the question at all. The method that they came up with can help scientists to calculate something called configurational entropy – a term used to describe how structurally disordered the particles in a physical system are.

Being able to calculate configurational entropy would, in theory, eventually enable us to answer a host of seemingly impossible problems – such as predicting the movement of avalanches, or anticipating how the shifting sand dunes in a desert will reshape themselves over time.

These questions belong to a field called granular physics, which deals with the behaviour of materials such as snow, soil or sand. Different versions of the same problem, however, exist in numerous other fields, such as string theory, cosmology, machine learning, and various branches of mathematics. The research shows how questions across all of those disciplines might one day be addressed.

Stefano Martiniani, a Gates Scholar at St John’s College, University of Cambridge, who carried out the study with colleagues in the Department of Chemistry, explained: “The problem is completely general. Granular materials themselves are the second most processed kind of material in the world after water and even the shape of the surface of the Earth is defined by how they behave.”

“Obviously being able to predict how avalanches move or deserts may change is a long, long way off, but one day we would like to be able to solve such problems. This research performs the sort of calculation we would need in order to be able to do that.”

At the heart of these problems is the idea of entropy – a term which describes how disordered the particles in a system are. In physics, a “system” refers to any collection of particles that we want to study, so for example it could mean all the water in a lake, or all the water molecules in a single ice cube.

When a system changes, for example because of a shift in temperature, the arrangement of these particles also changes. For example, if an ice cube is heated until it becomes a pool of water, its molecules become more disordered. Therefore, the ice cube, which has a tighter structure, is said to have lower entropy than the more disordered pool of water.

At a molecular level, where everything is constantly vibrating, it is often possible to observe and measure this quite clearly. In fact, many molecular processes involve a spontaneous increase in entropy until they reach a steady equilibrium.

In granular physics, however, which tends to involve materials large enough to be seen with the naked eye, change does not happen in the same way. A sand dune in the desert will not spontaneously change the arrangement of its particles (the grains of sand). It needs an external factor, like the wind, for this to happen.

This means that while we can predict what will happen in many molecular processes, we cannot easily make equivalent predictions about how systems will behave in granular physics. Doing so would require us to be able to measure changes in the structural disorder of all of the particles in a system – its configurational entropy.

To do that, however, scientists need to know how many different ways a system can be structured in the first place. The calculations involved in this are so complicated that they have been dismissed as hopeless for any system involving more than about 20 particles. Yet the Cambridge study defied this by carrying out exactly this type of calculation for a system, modelled on a computer, in which the particles were 128 soft spheres, like tennis balls.

“The brute force way of doing this would be to keep changing the system and recording the configurations,” Martiniani said. “Unfortunately, it would take many lifetimes before you could record it all. Also, you couldn’t store the configurations, because there isn’t enough matter in the universe with which to do it.”

Instead, the researchers created a solution which involved taking a small sample of all possible configurations and working out the probability of them occurring, or the number of arrangements that would lead to those particular configurations appearing.

Based on these samples, it was possible to extrapolate not only how many ways the entire system could be arranged, but also how ordered one state was compared with the next – in other words, its overall configurational entropy.
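
The same ‘count by sampling’ idea can be tried on a toy problem whose answer is known: instead of enumerating every configuration, draw random candidates, measure how often the property of interest occurs, and scale up. The Python sketch below counts 40-bit strings with at most 16 ones – roughly 1.5 × 10¹¹ of them out of about 1.1 × 10¹² possibilities – and is only an analogy for the general approach, not the jammed-packing calculation in the paper.

```python
import random

# Toy version of counting by sampling: estimate how many 40-bit strings
# contain at most 16 ones, without enumerating all 2**40 (about 1.1e12)
# possibilities. This is only an analogy for the general approach, not the
# jammed-packing calculation reported in the paper.
random.seed(0)
n_bits, threshold, n_samples = 40, 16, 200_000

hits = sum(
    1
    for _ in range(n_samples)
    if bin(random.getrandbits(n_bits)).count("1") <= threshold
)

fraction = hits / n_samples          # probability that a random string qualifies
estimate = fraction * 2**n_bits      # scale up to the whole configuration space
print(f"estimated count ~ {estimate:.2e}")
# The exact answer, the sum of C(40, k) for k = 0..16, is roughly 1.5e11,
# so the sampled estimate lands close without visiting most configurations.
```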

Martiniani added that the team’s problem-solving technique could be used to address all sorts of problems in physics and maths. He himself is, for example, currently carrying out research into machine learning, where one of the problems is knowing how many different ways a system can be wired to process information efficiently.

“Because our indirect approach relies on the observation of a small sample of all possible configurations, the answers it finds are only ever approximate, but the estimate is a very good one,” he said. “By answering the problem we are opening up uncharted territory. This methodology could be used anywhere that people are trying to work out how many possible solutions to a problem you can find.”

The paper, Turning intractable counting into sampling: computing the configurational entropy of three-dimensional jammed packings, is published in the journal Physical Review E.

Stefano Martiniani is a St John’s Benefactor Scholar and Gates Scholar at the University of Cambridge.


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

 

Cambridge Joins Consortium to Launch £40 Million Apollo Therapeutics Fund

Cambridge joins consortium to launch £40 million Apollo Therapeutics Fund

source: www.cam.ac.uk

Three global pharmaceutical companies and the technology transfer offices of three world-leading universities – Imperial College London, University College London and the University of Cambridge – have joined forces with a combined £40 million to create the Apollo Therapeutics Fund.

This new joint venture will support the translation of ground-breaking academic science from within these universities into innovative new medicines for a broad range of diseases.

Each of the three industry partner companies – AstraZeneca UK Limited, Glaxo Group Limited and Johnson & Johnson Innovation-JJDC, Inc. – will contribute £10 million over 6 years to the venture. The technology transfer offices of the three university partners – Imperial Innovations Group plc, Cambridge Enterprise Ltd and UCL Business plc – will each contribute a further £3.3 million.

The aim of Apollo is to advance academic pre-clinical research from these universities to a stage at which it can either be taken forward by one of the industry partners following an internal bidding process or be out-licensed. The three industry partners will also provide R&D expertise and additional resources to assist with the commercial evaluation and development of projects.

Drug development is extremely complex, costly and lengthy; currently only around 10 percent of therapies entering clinical trials reach patients as medicines. By combining funding for promising early-stage therapeutics from leading UK universities with a breadth of industry expertise, Apollo aims to share the risk and accelerate the development of important new treatments, while also reducing the cost.

Dr Ian Tomlinson, former Senior Vice President, Worldwide Business Development and Biopharmaceuticals R&D, for GSK and founder & Chief Scientific Officer of Domantis Limited, has been appointed Chairman of the Apollo Therapeutics Investment Committee (AIC). Comprising representatives from the six partners, the AIC will make all investment decisions.

The AIC will be advised by an independent Drug Discovery Team of ex-industry scientists who will be employed by Apollo to work with the universities and their technology transfer offices to identify and shape projects to bring forward for development. All therapy areas and modalities, including small molecules, peptides, proteins, antibodies, cell and gene therapies will be considered.

Apollo will be based at Stevenage Bioscience Catalyst. Once funded, projects will be progressed by the Drug Discovery Team alongside the university investigators, with other external resources and also in-kind resources from the industry partners as appropriate. For successful projects, the originating university and technology transfer office will receive a percentage of future commercial revenues or out-licensing fees and the remainder will be divided amongst all the Apollo partners.

Dr Tomlinson said: “This is the first time that three global pharmaceutical companies and the technology transfer offices of three of the world’s top ten universities have come together to form a joint enterprise of this nature, making the Apollo Therapeutics Fund a truly innovative venture.

“Apollo provides an additional source of early stage funding that will allow more therapeutics projects within the three universities to realise their full potential. The active participation of the industry partners will also mean that projects will be shaped at a very early stage to optimise their suitability for further development.

“The Apollo Therapeutics Fund should benefit the UK economy by increasing the potential for academic research to be translated into new medicines for patients the world over.”

Iain Thomas, Head of Life Sciences at Cambridge Enterprise, added: “Efficiently bringing together drug discovery expertise, potential customers, funding and project management, along with rapid decision making and execution through the Apollo Therapeutics Fund is a unique and extraordinarily exciting and valuable proposition for any academic or company that wants to see early stage ground breaking therapeutic technology progress to the clinic for patient benefit and economic return.”

Adapted from a press release by Apollo Therapeutics.


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


How To Get Teams To Share Information

How to get teams to share information

source: www.cam.ac.uk

Are you happy to share information with your colleagues? And do they share their valuable information with you? A number of companies have realised that key information is withheld within organisational silos more often than we might like to admit. Now a new study suggests how and when companies should restore meaningful communication across the organisation.

The study reveals that teams across the land are not playing nicely after all. In fact, there are many occasions when we choose not to share information with colleagues if we think it can harm our own prospects of success. And when that information determines, say, the level of funding passed down from a CEO, it can have a significant – and counter-productive – effect on the company as a whole.

“Most organisations must make decisions about where best to allocate resources,” said Nektarios (Aris) Oraiopoulos of Cambridge Judge Business School, whose study, published in the journal Management Science, examined how these issues play out in the pharmaceutical industry. “Pharmaceutical companies as a whole need to regularly reassess their research and development portfolios and decide which projects have the greatest potential; for example they might choose to improve an existing drug or develop a new one. Such decisions are often made by executives who rely on information provided by the project managers. But individual project managers do not necessarily give accurate information to the boss if they think it will cost them the resources that fund their projects.”

Oraiopoulos’s study, undertaken with Vincent Mak of Cambridge Judge Business School and Professor Jochen Schlapp of Mannheim University, revealed that managers’ willingness to share information depended on whether there was an appropriate fit between the type of project (e.g. a genuinely new project versus a ‘me-too’ project) and the incentive scheme in place.

“In small companies such as start-ups, there’s often such a strong culture of collective ambition and responsibility – and enhanced risk – that it’s hard to attribute success or failure individually,” said Oraiopoulos. “Therefore the most effective incentive rewards everyone on the basis of the collective success. But as the company grows, people inevitably assume singular responsibilities, the outcomes are less risky and, in the interests of the company, managers start following individual agendas – and management starts rewarding individual performance.”

Which is where the problems start. “If two project managers are offered a group incentive for success, individuals are more willing to be upfront about any failings. But when the two project managers compete for resources and rewards, as it often happens in a bigger organisation, project managers are less likely to step aside.”

There are many reasons for this, said Oraiopoulos, not necessarily based in deception. “Pharmaceutical research includes many ‘true believers’ – researchers who have absolute faith in a new product, especially if it could cure an important disease. But that faith skews their judgment. They believe their breakthrough is just around the corner, even if all the existing evidence suggests otherwise.”

This is a difficult moral argument for any CEO to reject – a difficulty compounded by the lack of impartial information in such a knowledge-specific industry. “One project manager’s specialty might be cardiovascular, another’s oncology,” said Oraiopoulos. “No one knows the science and potential of their product better than they do. They can present an accurate case on why their project deserves resources – or, consciously or subconsciously, mask its failings because no-one has the expertise to challenge them. So how does the CEO tell the difference?”

The answer is trust and giving teams a compelling motivation to be honest. But a collective incentive has drawbacks. “If you’re leading one of five departments who are rewarded only for collective excellence,” said Oraiopoulos, “where’s your motivation? You might as well let the others carry you.”

And even financial incentive doesn’t necessarily work. “Many researchers’ greatest reward is completing their project,” said Oraiopoulos. “That means being consistently confident their boss supports their work.”

So what’s the solution? “Organisations are tackling it in different ways,” he said. “Some are creating smaller, individual units, for example, centres of excellence or turning departments into small start-ups, with defined budgets. Others are promoting more collaboration between departments.”

Swiss global healthcare company Roche did both. When it bought drug developer Genentech in 2009 it kept the two companies’ research and development sections separate, empowering its “late-stage development group” to pick the strongest project – and motivating the losing group by announcing it would develop its plans later. But while that worked with Alzheimer’s treatments, a more linked approach was required for fast-paced developments in cancer research. “The need to understand the biology and right therapeutic approach requires the best minds,” said Roche’s head of oncology Jason Coloma. “We needed to leverage the knowledge in these divisions and break down some of these firewalls.”

The company formed a cancer immunotherapy committee which, says Roche, “brings the leadership and senior scientific minds together to consider different areas of interest and unmet needs that can be fulfilled by looking at different combinations.”

Roche’s approach confirms Oraiopoulos’s findings that new products require a team strategy, while ‘me-too’ projects benefit from more individual approaches. But how to break down a colossal R&D function into start-up-style divisions?

GlaxoSmithKline replaced its research and development ‘pyramid’ with 12 centres of excellence. “We learned these centres must be built around two things,” its then CEO Jean-Pierre Garnier said later. “A specific mission – the most effective therapies for Alzheimer’s – and the stage of the R&D process required to perform that mission, for example choosing a target for attacking the disease. Anything not critical to the core R&D process must occur outside the centre. All other functions – toxicology, drug metabolism, formulation – had to become service units, delivering at the lowest possible cost.”

Simultaneously, GSK overhauled its incentives. “Pharmaceutical R&D typically pursues two objectives – to be first in class and to offer the best-in-class compound for attacking a disease. For too long the industry has tried to be a ballet dancer and a footballer at the same time.”

But he warned fragmenting a company needs commitment. “To operate in this fashion, companies must strengthen opportunities, negotiate deals and nurture external scientific ‘bets’ (work with outside experts). This means a cultural shift. It’s an enormous but necessary task.”

Oraiopoulos’s research suggests there are so many variables – different products, motivations, branches of medicine, organisational goals – each company must then find its own solution. Pfizer’s recent buy-out of Botox maker Allergan is expected to maintain separate divisions for innovative and established treatments, so how the company allocates its resources remains to be seen.

“You must strike a balance,” said Oraiopoulos, “between rewarding individual and group performance. It’s a spectrum and each company must find their place on it, for patients and for the advancement of treatments. Many companies are encountering this challenge. We’re only scratching the surface.”

Reference:
Schlapp, Oraiopoulos, and Mak: ‘Resource Allocation Under Imperfect Evaluation.’ Management Science (2015). DOI: 10.1287/mnsc.2014.2083

Originally published on the Cambridge Judge Business School website


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


Artificial Intelligence and Rise of the Machines: Cambridge Science Festival 2016

Artificial intelligence and rise of the machines: Cambridge Science Festival 2016

source: www.cam.ac.uk

The annual two-week Festival, which runs from 7 – 20 March and stages more than 300 events, examines the growing interaction between humans and technology.

Full programme now online | Bookings open Monday 8 February

Will artificial intelligence be superior to or as creative as the human brain? Are we letting machines take over and give rise to mass unemployment or worse? Should we be worried about quantum computing and the impact it will have on the way we work, communicate and live in the future? Or should we harness rather than hate the digital deluge?

As we teeter on the brink of total machine dependency and epoch-making technological developments that cross all areas of our lives, Cambridge Science Festival asks these and many other critical questions.

The annual Festival presents an impressive line-up of who’s who from the science world and beyond, including Professors Sir John Gurdon (Nobel Laureate), Sir David Spiegelhalter, Richard Gilbertson, Raymond Laflamme, Didier Queloz, Meg Urry, and Tony Purnell, Head of Technical Development for British Cycling. Other speakers include Dr Ewan Birney, Director of the European Bioinformatics Institute; Angus Thirlwell, CEO and Co-founder of Hotel Chocolat; Dr Hermann Hauser, eminent technology entrepreneur; comedian Robin Ince; Charles Simonyi, high-tech pioneer and space traveller; and writer Simon Guerrier (Dr Who).

At the core of this year’s Festival is a series of events that explores the increasing symbiosis between humans and technology, and the questions this raises for humanity in the coming century. On the first day, a panel of outstanding speakers debate the implications of artificial intelligence. The panel consists of experts from the fields of information technology, robotics and neuroscience, including Dr Hermann Hauser, Dr Mateja Jamnik, Professor Trevor Robbins and Professor Alan Winfield. This event will be moderated by Tom Feilden, Science Correspondent for the Today Programme on BBC Radio 4.

Organiser of the event, Professor Barbara Sahakian, University of Cambridge, said: “Artificial intelligence could be of great benefit to society, producing innovative discoveries and providing humans with more leisure time. However, workers are concerned that, more and more, jobs are being taken over by artificial intelligence. We can see this in the context of the current trend for robots to work in car factories and driverless trains, and also in the future movement towards driverless cars.

“Some people feel this is an inevitable progression into the future due to advances in artificial intelligence, information technology and machine learning. However, others including many neuroscientists are not convinced that computers will ever be able to demonstrate creativity nor fully understand social and emotional interactions.”

The idea that machines are taking over every aspect of our lives – and whether this is a positive or negative feature of modern living – is examined further in the event ‘The rise of the humans: at the intersection of society and technology’. Dave Coplin, author and Chief Envisioning Officer for Microsoft UK, discusses the future of the UK’s IT and digital industries and addresses the convergence of society and technology, focussing on the developments that are creating so many new opportunities.

Coplin, who also has a new book coming out shortly, believes our current relationship with technology is holding us back from using it properly and we should think differently about the potential future uses for technology.

He said: “We should harness, rather than hate, the digital deluge. Individuals and organisations need to rise up and take back control of the potential that technology offers our society. We need to understand and aspire to greater outcomes from our use of technology”.

Building further on these issues in the second week of the Festival, Zoubin Ghahramani, Professor of Information Engineering at the University of Cambridge and the Cambridge Liaison Director of the Alan Turing Institute, explores intelligence and learning in brains and machines. He asks, what is intelligence? What is learning? Can we build computers and robots that learn? How much information does the brain store? How does mathematics help us answer these questions?

Professor Ghahramani highlights some current areas of research at the frontiers of machine learning, including a project to develop an Automatic Statistician, and speculates on some of the future applications of computers that learn.

For many, quantum computing holds the answer to the future of machine learning. An influential pioneer in quantum information theory, and co-founder and current director of the Institute for Quantum Computing at the University of Waterloo, Professor Raymond Laflamme presents the annual Andrew Chamblin Memorial Lecture, ‘Harnessing the quantum world’. During his lecture, Professor Laflamme will share the latest breakthroughs and biggest challenges in the quest to build technologies based on quantum properties that will change the ways we work, communicate and live.

A former PhD student of Professor Stephen Hawking, Professor Laflamme is interested in harnessing the laws of quantum mechanics to develop new technologies that will have extensive societal impact. He believes that the development of quantum computers will allow us to really understand the quantum world and explore it more deeply.

He said: “This exploration will allow us to navigate in the quantum world, to understand chemistry and physics at the fundamental level and bring us new technologies with applications in health, such as the development of drugs, and to develop new materials with a variety of applications.

“In the next half decade, we will produce quantum processors with more than 100 quantum bits (qubits). As we pass the count of about 30 qubits (approximately one gigabyte), classical computers can no longer compete and we fully enter the quantum world. That will be very exciting: from then on we do not have the support of classical computers to tell us if the quantum devices behave as expected, so we will need to find new ways to learn the reliability of these devices. Once we have 30-50 qubits (approximately one million gigabytes), I believe that we will get an equivalent of Moore’s law, but for the increased number of qubits.”
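
As a back-of-the-envelope check on the figures in this quote (a rough sketch only: the quoted numbers correspond to roughly one byte per quantum amplitude, whereas a double-precision complex simulator would need about 16 bytes per amplitude), the classical memory needed to store the full state of n qubits doubles with every qubit added:

    # Rough scaling behind the quote: a register of n qubits has 2**n amplitudes,
    # so classical memory grows exponentially with n.
    BYTES_PER_AMPLITUDE = 1  # assumption matching the quote's arithmetic

    for n_qubits in (30, 40, 50):
        n_bytes = (2 ** n_qubits) * BYTES_PER_AMPLITUDE
        print(f"{n_qubits} qubits ~ {n_bytes / 2**30:,.0f} gigabytes")

With one byte per amplitude, 30 qubits comes to about one gigabyte and 50 qubits to roughly a million gigabytes, which is the scaling the quote describes.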

New technologies also have a major impact on healthcare, which comes under the spotlight during the final weekend of the Festival as it returns for the third year running to the Cambridge Biomedical Campus. During the event ‘How big data analysis is changing how we understand the living world’, Dr Ewan Birney, Fellow of the Royal Society and Director of the EMBL European Bioinformatics Institute, explores the opportunities and challenges of genomics and big data in healthcare, from molecular data to high-resolution imaging.

These kinds of technological revolutions mean biological data is being collected faster than ever. In ‘Battling cancer with data science’, Dr Shamith Samarajiwa, from the Medical Research Council Cancer Unit, explains how analysing biomedical big data can help us understand different cancers and identify new targets for treatments. Meanwhile, in ‘Big data from small sources: computing demands in molecular and cell biology’, Dr Peter Maccallum from the Cancer Research UK Cambridge Institute discusses the challenges of storing and processing the terabytes of data produced every week – more than was previously generated in a decade.

Speaking ahead of this year’s Science Festival, Festival Coordinator Dr Lucinda Spokes said: “Using the theme of big data and knowledge, we are addressing important questions about the kinds of technology that affect, or will affect, not only every aspect of science, from astronomy to zoology, but every area of our lives: health, work, relationships and even what we think we know.

“Through a vast range of debates, talks, demonstrations and performances, some of the most crucial issues of our time and uncertainties about our future as a species will be explored during these packed two weeks.”

The full programme also includes events on neuroscience, healthcare, sports science, psychology, zoology and an adults-only hands-on session amongst many others.

Facebook: www.facebook.com/Cambridgesciencefestival

Twitter: @camscience #csf2016


Cambridge Science Festival

Since its launch in 1994, the Cambridge Science Festival has inspired thousands of young researchers, and visitor numbers continue to rise; last year, the Festival attracted well over 45,000 visitors. The Festival, one of the largest and most respected of its kind, brings science, technology, engineering, maths and medicine to an audience of all ages through demonstrations, talks, performances and debates. It draws together a diverse range of independent organisations in addition to many University departments, centres and museums.

This year’s Festival sponsors and partners are Cambridge University Press, AstraZeneca, MedImmune, Illumina, TTP Group, Science AAAS, BlueBridge Education, Siemens, ARM, Microsoft Research, Redgate, Linguamatics, FameLab, Babraham Institute, Wellcome Genome Campus, Napp, The Institute of Engineering and Technology, St Mary’s School, Anglia Ruskin University, Cambridge Junction, Addenbrooke’s Hospital, Addenbrooke’s Charitable Trust, James Dyson Foundation, Naked Scientists, Hills Road Sixth Form College, UTC Cambridge, British Science Week, Alzheimer’s Research UK, Royal Society of Chemistry, Cambridge Science Centre, Cambridge Live, and BBC Cambridgeshire.


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


Fuel Cell Electrolyte Developed To Offer Cleaner, More Efficient Energy

Fuel cell electrolyte developed to offer cleaner, more efficient energy

source: www.cam.ac.uk

A new thin-film electrolyte material that helps solid oxide fuel cells operate more efficiently and cheaply than those composed of conventional materials, and has potential applications for portable power sources, has been developed at the University of Cambridge.

The ability to precisely engineer and tune highly crystalline materials at the nanoscale is absolutely key for next-generation power generation and storage of many different kinds.

Judith Driscoll

These new materials offer the possibility of either significantly improving the efficiency of current high-temperature fuel cell systems, or achieving the same performance levels at much lower temperatures. Either approach could mean much lower fuel consumption and less wasted energy. The material was co-invented by Professor Judith Driscoll of the Department of Materials Science and Metallurgy and her colleague Dr Shinbuhm Lee, with support from collaborators at Imperial College and at three different labs in the US.

Solid oxide fuel cells consist of a positive electrode (the cathode) and a negative electrode (the anode), with an electrolyte material sandwiched between them. The electrolyte transports oxygen ions from the cathode to the anode, generating an electric current. Compared to conventional batteries, fuel cells have the potential to run indefinitely if supplied with a source of fuel, such as hydrogen or a hydrocarbon, and a source of oxygen.

By using thin-film electrolyte layers, micro solid oxide fuel cells offer a concentrated energy source, with potential applications in portable power sources for consumer electronic or medical devices, or for applications that need uninterruptible power supplies, such as those used by the military or in recreational vehicles.

“With low power requirements and low levels of polluting emissions, these fuel cells offer an environmentally attractive solution for many power source applications,” said Dr Charlanne Ward of Cambridge Enterprise, the University’s commercialisation arm, which is managing the patent that was filed in the US. “This opportunity has the potential to revolutionise the power supply problem of portable electronics, by improving both the energy available from the power source and safety, compared with today’s battery solutions.”

In addition to providing significantly improved conductivity, the new electrolyte material offers:

  • minimal heat loss and short-circuiting, owing to its low electronic conductivity
  • minimal cracking under heat-cycling stress, owing to the small feature size of the structure
  • high density, reducing the risk of fuel leaks
  • simple fabrication using standard epitaxial growth and self-assembly techniques

“The ability to precisely engineer and tune highly crystalline materials at the nanoscale is absolutely key for next-generation power generation and storage of many different kinds,” said Driscoll. “Our new methods and understanding have allowed us to exploit the very special properties of nanomaterials in a practical and stable thin-film configuration, resulting in a much improved oxygen ion conducting material.”

In October, a paper on the enhancement of oxygen ion conductivity in oxides was published in Nature Communications. It is this enhancement that improves efficiency and enables low-temperature operation of fuel cells. As a result of the reported advantages, the novel electrolyte material can also potentially be used in the fabrication of improved electrochemical gas sensors and oxygen separation membranes (to extract oxygen molecules from the air). The inventors have also published two other papers showing the enhanced ionic conduction in different materials systems, one in Nano Letters and one in Advanced Functional Materials.

Cambridge Enterprise is working with Driscoll to take the technology to market, seeking to collaborate with a fuel cell manufacturer with expertise in thin-film techniques to validate the new material.


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

J A Kemp Patent and Trade Mark Attorneys to open office in Cambridge

J A Kemp Patent and Trade Mark Attorneys to open office in Cambridge


J A Kemp has announced that it will open an office in Cambridge in February 2016.

Ranked in tier 1 by The Legal 500 and Chambers and Partners, J A Kemp is one of the largest UK and European patent and trade mark attorney firms.  The firm’s strong international presence is complemented by a UK practice that has nearly doubled in volume in the past four years and has long included significant work for several of the country’s leading university-based technology transfer organisations as well as a diverse range of significant clients in other sectors.  J A Kemp also advises start-ups and growing businesses across all industry sectors on patents, trade marks and IP strategy.

The firm’s patent capabilities embrace all technology areas across the spectrum of biotechnology and life sciences, chemistry and pharmaceuticals, electronics, engineering, IT and software.  The fast-growing business, scientific and academic hub of Cambridge is an ideal base to expand the firm’s operations with a view to serving clients there and in the M11 corridor and East of England.

The firm, whose main office is in central London, already has an office in Oxford which has tripled in size over the last four years.  The office in Cambridge will be at 30 Station Road, in the well-connected and growing business district surrounding Cambridge railway station, and will accommodate six attorneys and two support staff.

 

Commenting on the move, partner Andy Bentham, who will head up the new office, said:

“We have seen a significant increase in patent activity from the UK in the last few years. Much of this stems from the entrepreneurial scientific communities in Oxford and Cambridge.  In Cambridge growth is faster now than at any time in the 25 years since I first came here as an undergraduate. 

 “We are looking forward to joining the Cambridge cluster of hi-tech, biomedical, agri-tech and other innovators.  With 150 people based in our main office in London and over 20 based in Oxford, opening an office in Cambridge too means we can offer services from all three points of the so-called ‘golden triangle’.”


J A Kemp has the specialist expertise required to secure and protect the most complex patent portfolios and the firm has an internationally renowned team of specialists in trade mark matters.  J A Kemp also offers full-service IP dispute resolution and litigation expertise.

 For further information contact: Claire Wright, Head of Marketing – cwright@jakemp.com – 020 3077 8600.

Brain Waves Could Help Predict How We Respond To General Anaesthetics

Brain waves could help predict how we respond to general anaesthetics

source: www.cam.ac.uk

The complex pattern of ‘chatter’ between different areas of an individual’s brain while they are awake could help doctors better track and even predict their response to general anaesthesia – and better identify the amount of anaesthetic necessary – according to new research from the University of Cambridge.

A very good way of predicting how an individual responds to our anaesthetic was the state of their brain network activity at the start of the procedure

Srivas Chennu

Currently, patients due to undergo surgery are given a dose of anaesthetic based on the so-called ‘Marsh model’, which uses factors such as an individual’s body weight to predict the amount of drug needed. As patients ‘go under’, their levels of awareness are monitored in a relatively crude way. If they are still deemed awake, they are simply given more anaesthetic. However, general anaesthetics can carry risks, particularly if an individual has an underlying health condition such as a heart disorder.

As areas of the brain communicate with each other, they give off tell-tale signals that can give an indication of how conscious an individual is. These ‘networks’ of brain activity can be measured using an EEG (electroencephalogram), which measures electric signals as brain cells talk to each other. Cambridge researchers have previously shown that these network signatures can even be seen in some people in a vegetative state and may help doctors identify patients who are aware despite being unable to communicate. These findings build upon advances in the science of networks to tackle the challenge of understanding and measuring human consciousness.

In a study published today in the open access journal PLOS Computational Biology, funded by the Wellcome Trust, the researchers studied how these signals changed in healthy volunteers as they received an infusion of propofol, a commonly used anaesthetic.

Twenty individuals (9 male, 11 female) received a steadily increasing dose of propofol – all up to the same limit – while undergoing a task that involved pressing one button if they heard a ‘ping’ and a different button if they heard a ‘pong’. At the same time, the researchers tracked their brain network activity using an EEG.

By the time the subjects had reached the maximum dose, some individuals were still awake and able to carry out the task, while others were unconscious. As the researchers analysed the EEG readings, they found clear differences between those who were responding to the anaesthetic and those who remained able to carry on with the task. This ‘brain signature’ was evident in the network of communications between brain areas carried by alpha waves (brain cell oscillations in the frequency range of 7.5–12.5 Hz), the normal range of electrical activity of the brain when conscious and relaxed.
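
As a minimal sketch of how alpha-band activity can be isolated from a single EEG channel (this is illustrative only, not the study’s analysis pipeline; the sampling rate and the synthetic signal are assumptions), one might band-pass filter the recording between 7.5 and 12.5 Hz and measure the remaining power:

    import numpy as np
    from scipy.signal import butter, filtfilt

    # Minimal sketch (not the study's pipeline): isolate the alpha band
    # (7.5-12.5 Hz) from one EEG channel and compute its power.
    fs = 250.0                                   # assumed sampling rate in Hz
    t = np.arange(0, 10, 1 / fs)                 # 10 seconds of synthetic data
    eeg = (np.sin(2 * np.pi * 10 * t)            # a 10 Hz "alpha" component
           + 0.5 * np.random.randn(t.size))      # plus noise

    # 4th-order Butterworth band-pass filter for the alpha band
    b, a = butter(4, [7.5, 12.5], btype="bandpass", fs=fs)
    alpha = filtfilt(b, a, eeg)                  # zero-phase filtering

    alpha_power = np.mean(alpha ** 2)            # average alpha-band power
    print(f"alpha-band power: {alpha_power:.3f}")

The study’s network measures go well beyond single-channel power, comparing alpha-band signals between pairs of electrodes, but the band-pass step above is the common starting point.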

In fact, when the researchers looked at the baseline EEG readings before any drug was given, they already saw differences between those who would later succumb to the drug and those who were less responsive to its effects. Dividing the subjects into two groups based on their EEG readings – those with lots of brain network activity at baseline and those with less – the researchers were able to predict who would be more responsive to the drug and who would be less.

The researchers also measured levels of propofol in the blood to see if this could be used as a measure of how conscious an individual was. Although they found little correlation with the alpha wave readings in general, they did find a correlation with a specific form of brain network activity known as delta-alpha coupling. This may be able to provide a useful, non-invasive measure of the level of drug in the blood.

“A very good way of predicting how an individual responds to our anaesthetic was the state of their brain network activity at the start of the procedure,” says Dr Srivas Chennu from the Department of Clinical Neurosciences, University of Cambridge. “The greater the network activity at the start, the more anaesthetic they are likely to need to put them under.”

Dr Tristan Bekinschtein, senior author from the Department of Psychology, adds: “EEG machines are commonplace in hospitals and relatively inexpensive. With some engineering and further testing, we expect they could be adapted to help doctors optimise the amount of drug an individual needs to receive to become unconscious without increasing their risk of complications.”

Srivas Chennu will be speaking at the Cambridge Science Festival on Wednesday 16 March. During the event, ‘Brain, body and mind: new directions in the neuroscience and philosophy of consciousness’, he will be examining what it means to be conscious.

Reference
Chennu, S et al. Brain connectivity dissociates responsiveness from drug exposure during propofol induced transitions of consciousness. PLOS Computational Biology; 14 Jan 2016

Image
Brain networks during the transition to unconsciousness during propofol sedation (drug infusion timeline shown in red). Participants with robust networks at baseline (left panel) remained resistant to the sedative, while others showed characteristically different, weaker networks during unconsciousness (middle). All participants regained similar networks when the sedative wore off (right).


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


John Maynard Keynes: Great Economist, Poor Currency Trader

John Maynard Keynes: great economist, poor currency trader

source: www.cam.ac.uk

John Maynard Keynes struggled as a foreign-exchange trader, finds the first detailed study of the famous economist as currency speculator.

If someone as economically literate and well-connected as Keynes found it difficult to time currencies, then the rest of us should think twice before believing we can do any better.

David Chambers

A detailed new study of the chequered currency-trading record of John Maynard Keynes might make today’s overconfident currency speculators think twice.

While Keynes was one of the most famous economists in history, and his stock-picking record as an asset manager was outstanding, a forensic analysis of his personal currency trades found that his record was pedestrian by comparison.

The findings are forthcoming in the Journal of Economic History, in a study co-authored by Olivier Accominotti from the London School of Economics and Political Science, and David Chambers of the University of Cambridge Judge Business School.

“Unlike his stock investing, Keynes found currency investing a lot tougher, despite the fact that he was at the centre of the world of international finance throughout the time he traded currencies,” said Chambers. To be sure, Keynes made money from speculating in currencies in the 1920s and 1930s and his profits arose from more than pure chance. “Directionally, he called currencies more or less correctly but he really struggled with timing his trades. One main message for investors today is that if someone as economically literate and well-connected as Keynes found it difficult to time currencies, then the rest of us should think twice before believing we can do any better.”

In his currency trading, Keynes relied heavily on his own analysis of fundamental economic factors such as inflation, trade balance, capital flows and political developments.

Such a ‘fundamentals-based’ strategy differs from ‘technical’ strategies, which follow simple mechanical trading rules and seek profits by exploiting market anomalies – typically through the carry trade (betting on high-interest-rate currencies against low-interest-rate currencies) and momentum (betting on currencies which have recently appreciated against those which have depreciated). Both fundamentals-based and technical trading styles are observed among modern-day currency managers.
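
As a toy sketch of how such mechanical rules are typically constructed (this is not the study’s methodology; the currencies echo those Keynes traded, but the interest-rate differentials and past returns are invented purely for illustration), a carry signal ranks currencies by interest-rate differential while a momentum signal ranks them by recent returns:

    import numpy as np

    # Toy carry and momentum rules for currencies quoted against GBP.
    # All numbers below are made up for demonstration.
    currencies = ["USD", "FRF", "DEM", "ITL", "NLG"]
    rate_diff = np.array([0.8, -1.2, 2.1, 3.0, -0.5])     # foreign minus GBP rate, % p.a.
    past_return = np.array([1.5, -0.7, 0.3, 2.2, -1.9])   # last 3 months' return, %

    def long_short_positions(signal, k=1):
        """Go long the k currencies with the highest signal, short the k lowest."""
        pos = np.zeros_like(signal, dtype=float)
        order = np.argsort(signal)
        pos[order[:k]] = -1.0   # short the bottom k
        pos[order[-k:]] = 1.0   # long the top k
        return pos

    carry_positions = long_short_positions(rate_diff)       # carry: buy high-yield, sell low-yield
    momentum_positions = long_short_positions(past_return)  # momentum: buy recent winners, sell losers

    for name, pos in [("carry", carry_positions), ("momentum", momentum_positions)]:
        print(name, dict(zip(currencies, pos)))

The attraction of such rules is precisely that they require no judgement about fundamentals – which is what makes them a natural benchmark for Keynes’s discretionary approach.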

But Keynes produced unremarkable results at the dawn of the modern foreign-exchange market, when dealings were transformed by telegraphic transfer and the emergence of a forward exchange market.

The period during which he traded was marked by considerable foreign-exchange volatility and by large deviations of exchange rates from their fundamental values – deviations which appear obvious to investors today. However, trading on these deviations in real time was hazardous. “Implementing a currency trading strategy based on the analysis of macroeconomic fundamentals was challenging (even) for John Maynard Keynes,” said the research paper.

This was particularly the case in the 1920s. Currency traders can be judged by the return they generate per unit of risk, also known as the Sharpe ratio. While Keynes generated a Sharpe ratio of approximately 0.2 (assuming his trading equity was fixed), the same ratio for an equal-weighted blend of the carry and momentum strategies was substantially higher, at close to 1.0 after transaction costs. Keynes resumed currency trading in 1932, after a five-year break coinciding with the return to the gold standard; although he outperformed the carry strategy in the 1930s (its mean return was negative), he still underperformed a simple momentum strategy.
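
For readers unfamiliar with the measure, a minimal sketch of the Sharpe ratio calculation follows; the return series are hypothetical and the annualisation factor is an assumption, so the numbers bear no relation to the 0.2 and 1.0 figures reported in the study.

    import numpy as np

    # Sharpe ratio: mean return per unit of return volatility,
    # annualised from periodic (here, monthly) returns.
    def sharpe_ratio(returns, periods_per_year=12):
        returns = np.asarray(returns, dtype=float)
        return np.sqrt(periods_per_year) * returns.mean() / returns.std(ddof=1)

    # Hypothetical monthly returns for two trading styles (illustration only).
    fundamentals_returns = np.array([0.02, -0.05, 0.01, 0.04, -0.03, 0.02])
    rule_based_returns = np.array([0.01, 0.02, 0.00, 0.03, 0.01, 0.02])

    print(f"fundamentals-style Sharpe: {sharpe_ratio(fundamentals_returns):.2f}")
    print(f"rule-based Sharpe:         {sharpe_ratio(rule_based_returns):.2f}")

Because the ratio divides by volatility, a strategy with modest but steady returns can score far higher than one with larger but more erratic gains – the pattern the study found when comparing Keynes with the mechanical benchmarks.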

The study also found that Keynes “experienced periods of considerable losses in both the 1920s and 1930s. Indeed, he was close to being technically bankrupt in 1920 and could only stay trading thanks to his ability to borrow funds from his social circle.”

The research is based on a detailed dataset of 354 personal currency trades by Keynes between 1919 and 1939 (mostly in five currencies against the British pound – the US dollar, French franc, Deutsche mark, Italian lira and Dutch guilder).

Details of the trades were contained in ledgers kept in the archives at King’s College, Cambridge, where Keynes managed the college endowment fund for decades. Keynes famously shifted the college portfolio from property to stocks, and his investment writings, based on his very successful investment strategy at King’s, later became a source of inspiration for David Swensen, the architect of the influential “Yale model” for managing university endowments in the US today.

Originally published on the Cambridge Judge Business School website.


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.
