
DNA From Viking Cod Bones Suggests 1,000 Years of European Fish Trade

source: www.cam.ac.uk

New research using DNA from the fish bone remains of Viking-era meals reveals that north Norwegians have been transporting – and possibly trading – Arctic cod into mainland Europe for a millennium.

Our findings suggest that distant requirements for this Arctic protein had already begun to influence the economy and ecology of Europe in the Viking age

James Barrett

Norway is famed for its cod. Catches from the Arctic stock that spawn each year off its northern coast are exported across Europe for staple dishes from British fish and chips to Spanish bacalao stew.

Now, a new study published today in the journal PNAS suggests that some form of this pan-European trade in Norwegian cod may have been taking place for 1,000 years.

Latest research from the universities of Cambridge and Oslo, and the Centre for Baltic and Scandinavian Archaeology in Schleswig, used ancient DNA extracted from the remnants of Viking-age fish suppers.

The study analysed five cod bones dating from between 800 and 1066 AD found in the mud of the former wharves of Haithabu, an early medieval trading port on the Baltic. Haithabu is now a heritage site in modern Germany, but at the time was ruled by the King of the Danes.

The DNA from these cod bones contained genetic signatures seen in the Arctic stock that swim off the coast of Lofoten: the northern archipelago that is still a centre for Norway’s fishing industry.

Researchers say the findings show that supplies of ‘stockfish’ – an ancient dried cod dish popular to this day – were transported over a thousand miles from northern Norway to the Baltic Sea during the Viking era.

Prior to the latest study, there was no archaeological or historical proof of a European stockfish trade before the 12th century.

While future work will look at further fish remains, the small size of the current study prevents researchers from determining whether the cod was transported for trade or simply used as sustenance for the voyage from Norway.

However, they say that the Haithabu bones provide the earliest evidence of fish caught in northern Norway being consumed on mainland Europe – suggesting a European fish trade involving significant distances has been in operation for a millennium.

“Traded fish was one of the first commodities to begin to knit the European continent together economically,” says Dr James Barrett, senior author of the study from the University of Cambridge’s McDonald Institute for Archaeological Research.

“Haithabu was an important trading centre during the early medieval period. A place where north met south, pagan met Christian, and those who used coin met those who used silver by weight.”

“By extracting and sequencing DNA from the leftover fish bones of ancient cargoes at Haithabu, we have been able to trace the source of their food right the way back to the cod populations that inhabit the Barents Sea, but come to spawn off Norway’s Lofoten coast every winter.

“This Arctic stock of cod is still highly prized – caught and exported across Europe today. Our findings suggest that distant requirements for this Arctic protein had already begun to influence the economy and ecology of Europe in the Viking age.”

Stockfish is white fish preserved by the unique climate of north Norway, where winter temperature hovers around freezing. Cod is traditionally hung out on wooden frames to allow the chill air to dry the fish. Some medieval accounts suggest stockfish was still edible as much as ten years after preservation.

The research team argue that the new findings offer some corroboration to the unique 9th century account of the voyages of Ohthere of Hålogaland: a Viking chieftain whose visit to the court of King Alfred in England resulted in some of his exploits being recorded.

“In the accounts inserted by Alfred’s scribes into the translation of an earlier 5th century text, Ohthere describes sailing from Hålogaland to Haithabu,” says Barrett. Hålogaland was the northernmost province of Norway. 

“While no cargo of dried fish is mentioned, this may be because it was simply too mundane a detail,” says Barrett. “The fish-bone DNA evidence is consistent with the Ohthere text, showing that such voyages between northern Norway and mainland Europe were occurring.”

“The Viking world was complex and interconnected. This is a world where a chieftain from north Norway may have shared stockfish with Alfred the Great while a late-antique Latin text was being translated in the background. A world where the town dwellers of a cosmopolitan port in a Baltic fjord may have been provisioned from an Arctic sea hundreds of miles away.”

The sequencing of the ancient cod genomes was done at the University of Oslo, where researchers are studying the genetic makeup of Atlantic cod in an effort to unpick the anthropogenic impacts on these long-exploited fish populations.

“Fishing, particularly of cod, has been of central importance for the settlement of Norway for thousands of years. By combining fishing in winter with farming in summer, whole areas of northern Norway could be settled in a more reliable manner,” says the University of Oslo’s Bastiaan Star, first author of the new study.

Star points to the design of Norway’s new banknotes that prominently feature an image of cod, along with a Viking ship, as an example of the cultural importance still placed on the fish species in this part of Europe.

“We want to know what impact the intensive exploitation history covering millennia has inflicted on Atlantic cod, and we use ancient DNA methods to investigate this,” he says.

The study was funded by the Research Council of Norway and the Leverhulme Trust.


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Celebrity Twitter Accounts Display ‘Bot-Like’ Behaviour

source: www.cam.ac.uk

‘Celebrity’ Twitter accounts – those with more than 10 million followers – display more bot-like behaviour than users with fewer followers, according to new research.

A Twitter user can be a human and still be a spammer, and an account can be operated by a bot and still be benign.

Zafar Gilani

The researchers, from the University of Cambridge, used data from Twitter to determine whether bots can be accurately detected, how bots behave, and how they impact Twitter activity.

They divided accounts into categories based on total number of followers, and found that accounts with more than 10 million followers tend to retweet at similar rates to bots. In accounts with fewer followers however, bots tend to retweet far more than humans. These celebrity-level accounts also tweet at roughly the same pace as bots with similar follower numbers, whereas in smaller accounts, bots tweet far more than humans. Their results will be presented at the IEEE/ACM International Conference on Advances in Social Networks Analysis and Mining (ASONAM) in Sydney, Australia.

Bots, like people, can be malicious or benign. The term ‘bot’ is often associated with spam, offensive content or political infiltration, but many of the most reputable organisations in the world also rely on bots for their social media channels. For example, major news organisations, such as CNN or the BBC, who produce hundreds of pieces of content daily, rely on automation to share the news in the most efficient way. These accounts, while classified as bots, are seen by users as trustworthy sources of information.

“A Twitter user can be a human and still be a spammer, and an account can be operated by a bot and still be benign,” said Zafar Gilani, a PhD student at Cambridge’s Computer Laboratory, who led the research. “We’re interested in seeing how effectively we can detect automated accounts and what effects they have.”

Bots have been on Twitter for the majority of the social network’s existence – it’s been estimated that anywhere between 40 and 60% of all Twitter accounts are bots. Some bots have tens of millions of followers, although the vast majority have fewer than a thousand – human accounts show a similar distribution.

In order to reliably detect bots, the researchers first used the online tool BotOrNot (since renamed Botometer), one of the only bot detection tools available online. However, their initial results showed high levels of inaccuracy: BotOrNot showed low precision in detecting bots that had bot-like characteristics in their account name, profile info, content, tweeting frequency and especially redirection to external sources. Gilani and his colleagues then decided to take a manual approach to bot detection.

Four undergraduate students were recruited to manually inspect accounts and determine whether they were bots. This was done using a tool that automatically presented Twitter profiles, and allowed the students to classify the profile and make notes. Each account was collectively reviewed before a final decision was reached.

In order to determine whether an account was a bot (or not), the students looked at different characteristics of each account. These included the account creation date, average tweet frequency, content posted, account description, whether the user replies to tweets, likes or favourites received and the follower to friend ratio. A total of 3,535 accounts were analysed: 1,525 were classified as bots and 2,010 as humans.

The students showed very high levels of agreement on whether individual accounts were bots. However, they showed significantly lower levels of agreement with the BotOrNot tool.

The bot detection algorithm they subsequently developed achieved roughly 86% accuracy in detecting bots on Twitter. The algorithm uses a type of classifier known as a Random Forest, drawing on 21 different account features, and is trained on the dataset annotated by the students.
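The approach described above can be sketched in a few lines. This is a minimal illustration, not the authors' code: the feature matrix here is synthetic stand-in data, and the toy labelling rule is an assumption purely for demonstration (the study used 21 real account features such as tweet frequency, retweet rate and redirections to external sites, with labels from the human annotators).

```python
# Sketch: a Random Forest classifier trained on annotated accounts,
# in the spirit of the study. Data and labels are synthetic stand-ins.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Rows = accounts, columns = 21 behavioural features (e.g. tweets/day,
# retweet ratio, external-URL ratio). Labels: 1 = bot, 0 = human.
X = rng.random((1000, 21))
y = (X[:, 0] + X[:, 1] > 1.0).astype(int)  # illustrative labelling rule

clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)  # 5-fold cross-validation
print(f"mean cross-validated accuracy: {scores.mean():.2f}")
```

On real annotated data, cross-validated accuracy of this kind of classifier is how a figure such as the reported 86% would be estimated.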

The researchers found that bot accounts differ from humans in several key ways. Overall, bot accounts generate more tweets than human accounts. They also retweet far more often, and redirect users to external websites far more frequently than human users. The only exception to this was in accounts with more than 10 million followers, where bots and humans showed far more similarity in terms of the volume of tweets and retweets.

“We think this is probably because bots aren’t that good at creating original Twitter content, so they rely a lot more on retweets and redirecting followers to external websites,” said Gilani. “While bots are getting more sophisticated all the time, they’re still pretty bad at one-on-one Twitter conversations, for instance – most of the time, a conversation with a bot will be mostly gibberish.”

Despite the sheer volume of Tweets produced by bots, humans still have better quality and more engaging tweets – tweets by human accounts receive on average 19 times more likes and 10 times more retweets than tweets by bot accounts. Bots also spend less time liking other users’ tweets.

“Many people tend to think that bots are nefarious or evil, but that’s not true,” said Gilani. “They can be anything, just like a person. Some of them aren’t exactly legal or moral, but many of them are completely harmless. What I’m doing next is modelling the social cost of these bots – how are they changing the nature and quality of conversations online? What is clear though, is that bots are here to stay.”

Reference: 
Zafar Gilani, Ekaterina Kochmar, Jon Crowcroft. Classification of Twitter Accounts into Automated Agents and Human Users. Paper presented at 9th IEEE/ACM International Conference on Advances in Social Networks Analysis and Mining (ASONAM’17). Sydney, New South Wales, Australia.



Unique Crayfish Service Makes the Headlines

Innovative Cambridge online startup Crayfish International – which offers a unique service to help businesses operate in China – is making waves.

Just over a month after launch, it featured last week on the front page of China Daily, which has the widest print circulation of any English-language newspaper in China, reaching an international audience.

Crayfish matches English-Chinese bilingual freelancers with Western businesses who need help in dealing with their Chinese partners and audiences, providing a source of qualified people to undertake projects and offer information, knowledge and cultural insight.

Business users post their projects on to the site and freelancers bid for the work, with the transaction carried out through the web-based Crayfish platform, Crayfish.io. Users pay a fee after they accept a freelancer’s proposal, with payment – less commission – released on completion of the job.

Most jobs to date have involved document translation but more diverse projects are on the horizon, according to Crayfish Founder, Ting Zhang. “In the post-Brexit era, British businesses have to start looking at exporting to countries outside the EU, and China is one of the most important markets,” she said.

Read the article in China Daily
http://www.chinadaily.com.cn/world/2017-07/26/content_30256718.htm

Sign up is FREE at www.crayfish.io.

Genetic Study Suggests Present-Day Lebanese Descend From Biblical Canaanites

source: www.cam.ac.uk

Researchers analysed DNA extracted from 4,000-year-old human remains to reveal that more than 90% of Lebanese ancestry is from ancient Canaanite populations.

The fact that we can retrieve whole genomes from conditions not considered ideal for DNA preservation also shows how far the field has advanced technically

C. L. Scheib

Scientists have sequenced the entire genomes of 4,000-year-old Canaanite individuals who inhabited the Near East region during the Bronze Age, and compared these to other ancient and present-day populations. The results, published in the American Journal of Human Genetics, suggest that present-day Lebanese are direct descendants of the ancient Canaanites.

The Near East is often described as the cradle of civilisation. The Bronze Age Canaanites, later known as the Phoenicians, introduced many aspects of society that we know today – they created the first alphabet, established colonies throughout the Mediterranean and were mentioned several times in the Bible.

However, historical records of the Canaanites are limited. They were mentioned in ancient Greek and Egyptian texts, and in the Bible, which reports widespread destruction of Canaanite settlements and annihilation of the communities. Experts have long debated who the Canaanites were genetically, what happened to them, who their ancestors were and whether they have any descendants today.

In the first study of its kind, an international team of scientists have uncovered the genetics of the Canaanite people and a firm link with people living in Lebanon today. The team discovered that more than 90 per cent of present-day Lebanese ancestry is likely to be from the Canaanites, with an additional small proportion of ancestry coming from a different Eurasian population.

The team, including researchers from Cambridge University’s Department of Archaeology and Anthropology, and led by the Wellcome Trust Sanger Institute, estimate that new Eurasian people mixed with the Canaanite population about 2,200 to 3,800 years ago at a time when there were many conquests of the region from outside.

The analysis of ancient DNA also revealed that the Canaanites themselves were a mixture of local people who settled in farming villages during the Neolithic period and eastern migrants who arrived in the area around 5,000 years ago.

“Ancient DNA is becoming an indispensable tool for understanding population movements of the past. This study in particular provides previously inaccessible information about a group of people known only by surviving written accounts and interpretations of archaeological findings,” said Dr C. L. Scheib, one of two Cambridge co-authors, along with Dr Toomas Kivisild.

“The fact that we can retrieve whole genomes from conditions not considered ideal for DNA preservation also shows how far the field has advanced technically,” she said.

In the study, researchers sequenced whole genomes of five Canaanite individuals who lived 4,000 years ago in a city known as Sidon in present-day Lebanon. Scientists also sequenced the genomes of 99 present-day Lebanese and analysed the genetic relationship between the ancient Canaanites and modern Lebanese.

Dr Marc Haber, first author from the Sanger Institute, said: “It was a pleasant surprise to be able to extract and analyse DNA from 4,000-year-old human remains found in a hot environment, which is not known for preserving DNA well. We overcame this challenge by taking samples from the petrous bone in the skull, which is a very tough bone with a high density of ancient DNA.”

Dr Claude Doumet-Serhal, co-author and Director of the Sidon excavation site in Lebanon, said: “For the first time we have genetic evidence for substantial continuity in the region, from the Bronze Age Canaanite population through to the present day. These results agree with the continuity seen by archaeologists.

“Collaborations between archaeologists and geneticists greatly enrich both fields of study and can answer questions about ancestry in ways that experts in neither field can answer alone.”

Adapted from a Wellcome Trust press release. 



Target ‘Best Connected Neighbours’ to Stop Spread of Infection in Developing Countries

source: www.cam.ac.uk

A new study takes a network theory approach to targeted treatment in rural Africa, and finds that a simple algorithm for identifying those with the “most connections to sick people” may be more effective than current policies at preventing disease spread, as well as easier to deploy.

Finding the best connected neighbours, the ‘hubs’, and removing them through treatment, looks to be the quickest way to fragment a network that spreads infections

Goylette Chami

Our lives benefit from social networks: the contact and dialogue between family, friends, colleagues and neighbours. However, these networks can also cost lives by transmitting infection or misinformation, particularly in developing nations.

In fact, when there is an outbreak of disease, or of damaging rumour that hinders uptake of vaccination, the network through which it spreads needs to be broken up – and fast.

But who are the people with most connections – the ‘hubs’ in any social network – that should be targeted with inoculating drugs or health education in order to quickly isolate a contagion?

Information about social networks in rural villages in the developing world is costly and time-consuming to collect, and usually unavailable. So current immunisation strategies target people with established community roles: healthcare workers, teachers, and local officials.

Now, Cambridge researchers have for the first time combined networking theories with ‘real world’ data collected from thousands of rural Ugandan households, and shown that a simple algorithm may be significantly more effective at finding the highly connected ‘hubs’ to target for halting disease spread.

The ‘acquaintance algorithm’ employed by researchers is remarkably simple: select village households at random and ask who in their network is most trusted for medical advice.

Researchers were surprised to find that the most influential people in social networks were very often not those with obvious positions in a community. As such, these valuable ‘hubs’ are invisible to drug administration programmes without the algorithmic approach.
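The acquaintance algorithm lends itself to a short sketch. This is an illustrative implementation under stated assumptions, not the study's code: the village network below is a toy example, and the "who do you trust most for medical advice?" response is stood in for by naming one's best-connected contact.

```python
# Sketch of an acquaintance algorithm: sample households at random,
# ask each to name a trusted neighbour, and target the named neighbours.
import random

def acquaintance_targets(network, n_samples, seed=0):
    """network: dict mapping each household to its list of contacts."""
    rng = random.Random(seed)
    targets = set()
    for node in rng.sample(list(network), n_samples):
        if not network[node]:
            continue  # isolated household names no one
        # Stand-in for the survey question: name the contact with the
        # most connections of its own.
        named = max(network[node], key=lambda nb: len(network[nb]))
        targets.add(named)
    return targets

# Toy village: 'h0' is the hidden hub with the most connections.
village = {
    "h0": ["h1", "h2", "h3", "h4"],
    "h1": ["h0"], "h2": ["h0"], "h3": ["h0"], "h4": ["h0"],
}
print(acquaintance_targets(village, 3))
```

The key property is that a randomly chosen household is disproportionately likely to name a hub, so hubs surface quickly even though no one holds a map of the whole network.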

“Everyone is a node in a social network. Most nodes have just a few connections. However, a small number of nodes have the majority of connections. These are the hubs we want to uncover and target in order to intentionally cause failure in social networks spreading pathogens or damaging behaviour,” says lead researcher Dr Goylette Chami, from Cambridge’s Department of Pathology.

“It was striking to find that important village positions may be best left untargeted for interventions seeking to stop the spread of pathogens through a rural social network,” says Chami.

In the study, published today in the journal PNAS, the researchers write that this simple strategy could be particularly effective for isolating households that refuse to take medicine, so that they don’t endanger the rest of a community with infection.

To control disease caused by parasitic worm infections, for example, at least three quarters of any given community need to be treated. “The refusal of treatment by a few people can result in the destabilisation of mass drug administration programmes that aim to treat 1.9 billion people worldwide,” says Chami.

An average of just 32% of households (‘nodes’) selected by the acquaintance algorithm need to be provided health education (and ‘removed’ from a network) to reach the disease control threshold for an entire community. Using traditional role-based targeting, the average needed is much larger: some 54%.

“We discovered that acquaintance algorithms outperformed the conventional field-based approaches of targeting well-established community roles for finding individuals with the most connections to sick people, as well as isolating the spread of misinformation,” says Chami.

“Importantly, this simple strategy doesn’t require any information on who holds which role and how to reach them. No database is needed. As such, it is easy to deploy in rural, low-income settings.”

“In an ideal world, everyone would be treated,” says Chami. “However, with limited resources, time and information, finding the best connected neighbours, the ‘hubs’, and removing them through treatment, looks to be the quickest way to fragment a network that spreads infections, and to render the most people safe.”

Chami and colleagues from Cambridge and the Ugandan Ministry of Health collected data on social and health advice networks from over 16,000 people in 17 villages across rural Uganda. They also collected data on networks of disease using reports of diarrhoea as a proxy for infection spread – particularly relevant to recent large-scale cholera outbreaks.

To do this, Chami built a survey app from open source code and loaded it on to 76 Google Nexus tablets. The team then trained a number of individuals from the local villages to help them go door to door.

Adds Chami: “This kind of ‘network theory’ approach to public health in the developing world, and the use of acquaintance algorithms, if tested in randomised controlled trials, may increase compliance to treatments and inform strategies for the distribution of vaccines.”



Cambridge-Led Collaborations Aim to Tackle Global Food Security and Public Health Challenges

source: www.cam.ac.uk

Two major research collaborations led by the University of Cambridge have been awarded almost £15 million in funding, the Minister of State for Universities and Science, Jo Johnson MP, announced today during a visit to Cambridge’s Sainsbury Laboratory.

The two collaborations are focused on food security in India and public health in Bangladesh and will see researchers from the UK and developing countries working together as equal partners.

The awards are part of the Global Challenges Research Fund, which aims to build upon research knowledge in the UK, and strengthen capacity overseas, to help address challenges informed by expressed need in developing countries.

Jo Johnson, Minister for Universities and Science, said: “From healthcare to green energy, the successful projects receiving funding today highlight the strength of the UK’s research base and our leadership in helping developing countries tackle some of the greatest global issues of our time.

“At a time when the pace of scientific discovery and innovation is quickening, we are placing science and research at the heart of our Industrial Strategy to build on our strengths and maintain our status as science powerhouse.”

Andrew Thompson, GCRF Champion at Research Councils UK, said: “The 37 projects announced today build research capacity both here in the UK and in developing countries to address systemic development challenges, from African agriculture to sustainable cities, clean oceans, and green energy, to improved healthcare, food security, and gender equality.”

TIGR2ESS (Transforming India’s Green Revolution by Research and Empowerment for Sustainable food Supplies)

Lead: Professor Howard Griffiths (Department of Plant Sciences)

Talk of a second Green Revolution has been around for a while. The first – in India and other developing countries, in the 1960s – brought a massive increase in crop production that sustained the country’s mushrooming population. But now there are new pressures – not just the need to produce even more food, but to reduce the damage done by excessive use of pesticides, fertiliser and water in the face of climate change.

TIGR2ESS, a collaboration between UK and Indian scientists, seeks to frame the big question – how to bring about a second Green Revolution – in all its breadth and depth. India is developing fast – agriculture needs to take account of urbanisation, for example, which has drawn so many away from the land. Smallholder farmers – particularly women – need smart technologies to sustain crop yields, and improve health and nutrition.

The TIGR2ESS programme will assess these options, as well as supporting basic research programmes, and providing advice to local communities. There will be many opportunities for academic exchanges, mentoring and career development for scientists from both countries. Links with the relevant government ministries in India, plus industrial connections built into the programme, will hopefully turn the best recommendations into reality.

“We are extremely pleased that the TIGR2ESS programme will help to deliver our vision for partnerships with institutions in India to improve crop science and food security,” says Professor Howard Griffiths, Co-Chair of the University of Cambridge’s Strategic Initiative in Global Food Security.

“Agriculture is feminizing. We need to ensure that state resources and services, and knowledge resources, are equally accessible to women farmers,” adds Dr V Selvam, MS Swaminathan Research Foundation, India, one of the collaborators.

CAPABLE (Cambridge Programme to Assist Bangladesh in Lifestyle and Environmental risk reduction)

Lead: Professor John Danesh (Department of Public Health and Primary Care)

Gathering a big group of people and studying their health in the long term can uncover game-changing facts. The British Doctors’ Study, for example, which began in 1951, revealed that smoking causes lung cancer. Imagine if the same could be done in a country facing a perfect storm of chronic health problems.

Bangladesh is admired worldwide for its success in cutting child mortality and fertility rate, yet it faces an onslaught of chronic diseases that arise from an interplay of factors ranging from arsenic-contaminated drinking water to iron-deficient foods and from air pollution to the rise of the western lifestyle.

CAPABLE has the ambitious goal of recruiting 100,000 people from landscapes ranging from the green paddy fields of rural Bangladesh to the slums of the densest city in the world – Dhaka. From their data, engineers, sociologists, health researchers and a host of other disciplines will try to understand how the risk factors interact – and build a model that can be used to test interventions before they are implemented.

“We aim to help develop simple, scalable and effective solutions to control major environmental and lifestyle risk factors in Bangladesh,” says Scientific Director of the CAPABLE programme Dr Rajiv Chowdhury from the Department of Public Health and Primary Care at the University of Cambridge.



The US is back on the timetable at London Stansted

The US is back on the timetable at London Stansted. Daily direct flights from Stansted to New York and Boston will start from April 2018, courtesy of Danish carrier Primera Air, which is opening a new base at the Essex hub.

The service will link the Cambridge UK-Stansted-London science & technology corridor with Newark Airport for the Big Apple and Boston’s Logan Airport. Primera Air also plans to announce another transatlantic route from Stansted by the end of this summer.

The new routes are terrific news for the Cambridge cluster business community. Scores of local businesses are either US-owned or trading significantly with transatlantic customers; US companies such as Apple, Amazon, Google, Illumina, Gilead and many others are also scaling from Cambridge.

Flights will be operated by new Airbus A321NEO aircraft and include a choice of two cabins, full-service premium and low-fare economy, free Wi-Fi and onboard charging points.

Primera Air will be the first airline for nine years to fly scheduled services to the US from Stansted.

Four airlines have previously tried to establish a Stansted-US service, but only Continental – hit by 9/11 three months in – could claim ill fortune. The last of the pretenders, American Airlines, nipped in, took out two rivals, and promptly pulled out again – though it had cleared a pitch for a future day, or so it assumed.

Primera Air president and chairman Andri Ingolfsson said: “We are very proud to announce our new base at London Stansted and routes to the US. With our brand new Airbus A321NEO aircraft, we are opening routes previously traditionally served only by wide-body aircraft. 

“With unmatched efficiency of these new-generation aircraft, we will be able to offer unprecedented prices to our passengers from London Stansted to the USA. At the same time, we are very proud to be offering a low-fare/high quality product and service concept, that will be perfect both for leisure and business travellers.”

Andrew Cowan, Stansted’s chief executive added: “We’re thrilled Primera Air has chosen London Stansted as its UK base for these exciting and innovative new long-haul services to the USA.

“We know from our customers that there is enormous demand for flights to New York and Boston from London and the East of England, so the arrival of Primera Air is fantastic news for both business and leisure passengers wanting great value, excellent service and the convenience of flying transatlantic from their local airport.”

Flights to both destinations went on sale today (July 20) with prices starting from £149 one-way inclusive of all taxes, fees and charges.
Visit www.primeraair.com for more information.

Cambridge Innovation Summit 2017

Innovation thought leaders from the USA, Russia and across Europe came together at the Cambridge Innovation Summit last week to share experiences, benchmark and shape the global innovation landscape for years to come.

Organised by the Centre for Business Innovation (CfBI) and enabled by generous sponsorship from AstraZeneca, Hewitsons and the Department for International Trade, the invitation-only event attracted over 80 corporate delegates with responsibility for innovation. They spent a day working together at Emmanuel College, Cambridge, followed by dinner in the iconic Old Hall at Queens’ College.

Keynote talks about FastWorks, Design Thinking, Data Innovation in Healthcare and Creative Labs were presented by GE (US), HP (US), BASF (D) and IBM (UK), and chaired by Peter Hiscocks of the Judge Business School. This was complemented by a session on ‘Innovation Made in Cambridge’, with cases from Amazon/Alexa, Owlstone and Cambridge Silicon Radio, chaired by Charles Cotton of Cambridge Phenomenon Ltd. The day also featured demonstrations of additive manufacturing, drone management, solar technology and the ‘Internet of Things’ from innovative early-stage Cambridge companies, as well as cameos of the achievements of the collaborative consortia run by the Centre for Business Innovation.

Carol Safford of Vodafone described the Summit as “an inspiring and thought-provoking opportunity to ride the innovation curve in beautiful surroundings”. Elise Kissling of BASF added: “Amazing day at the Centre for Business Innovation in Cambridge. CfBI’s Innovation Summit offered its consortia members of leading corporates, start-ups and research institutes a compact day on innovation best practice.”

Andrew Priest of law firm Hewitsons LLP, who specialises in commercial and intellectual property matters for technology sector clients, said: “Once again, the Cambridge Innovation Summit was a truly enjoyable and worthwhile event, and Hewitsons was delighted to co-sponsor it for a second year running.

“As a firm, we work with some exceptionally innovative clients, so we know what talent and experience the technology sector has to offer in and around Cambridge. It was great to see leading Cambridge innovators, and those responsible for innovation within the organisations of business giants from around the world, recount their fascinating stories and exchange ideas on how to use innovative potential to achieve sustainable competitive advantage.”

CfBI’s ‘Medical Adherence’, ‘Nano-Carbon Enhanced Materials’ and ‘Corporate Venturing’ consortia also met in Cambridge on the following day, enabling international delegates to gain added benefit from their visit.

In response to the excellent feedback from delegates, plans are already underway to run the Cambridge Innovation Summit 2018 on an even larger scale.

Images from the Cambridge Innovation Summit 2017

Working Session

Neil Hempsall of GE delivering the FastWorks address


Networking Dinner in the Old Hall at Queens’

Emmanuel College Cambridge

Further information from: Peter Hewkin  07951721110  ceo@cfbi.com

Concerns Over Side Effects of Statins Stopping Stroke Survivors Taking Medication

Concerns over side effects of statins stopping stroke survivors taking medication

source: www.cam.ac.uk

Negative media coverage of the side effects associated with taking statins, and patients’ own experiences of taking the drugs, are among the reasons cited by stroke survivors and their carers for stopping taking potentially life-saving drugs, according to research published today.

These findings have highlighted the need for an open, honest dialogue between patients and/or their carers, and healthcare professionals

Anna De Simoni

Individuals who have had a stroke are at risk of a second stroke, which carries a greater risk of disability and death than first-time strokes. In fact, one third of all strokes occur in individuals who have previously had one. To prevent this recurrence, patients are offered secondary preventative medications; however, adherence is a problem, with 30% of stroke patients failing to take their medications as prescribed.

To examine the barriers to taking these medications, researchers at the University of Cambridge and Queen Mary University of London (QMUL) analysed posts to TalkStroke, a UK-based online forum hosted by the Stroke Association, across a seven-year period (2004-2011). The forum was used by stroke survivors and their carers.

The team, led by Dr Anna De Simoni, a lecturer in Primary Care Research at QMUL and visiting researcher at the Department of Public Health and Primary Care, University of Cambridge, has previously used the forum to explore issues such as the impairments that can make it difficult for stroke survivors to maintain a job.

The findings of the study, which looked at posts by 84 participants, including 49 stroke survivors and 33 caregivers, are published today in the journal BMJ Open. The Stroke Association gave the researchers permission to analyse the results, and to prevent identification of individuals, the team did not use verbatim comments.

Among the reasons cited by the forum users, side effects were a major factor in decisions to stop taking medication. Several contributors had experienced negative side effects and as a result had stopped taking the medication, sometimes in consultation with their GP and other times unilaterally. Others reported that they, or the person they were caring for, had stopped taking the medication after reading negative stories in the press about side effects.

Other users expressed concerns over the medication they were offered. There were conflicting views about the efficacy of the medications – some contributors believed they were very important, while others believed that their risk could be managed by lifestyle changes alone.

Contributors also reported mixed views of healthcare professionals – some felt confident in their doctor’s decision, while others questioned their decisions, some even questioning their motivation for prescribing particular drugs.

“These findings have highlighted the need for an open, honest dialogue between patients and/or their carers, and healthcare professionals,” says Dr De Simoni. “Doctors need to listen to these concerns, discuss the benefits and drawbacks of taking the medication, and be willing to support a patient’s informed decision to refuse medications.”

However, perceptions did not present the only barriers to adherence: there were often practical considerations too. Tablets were sometimes too large and difficult to swallow, or a drug regimen was too burdensome. The complexity of drug regimens sometimes meant patients had to develop routines and strategies to keep to them. One survivor described having to pay for the medications by credit card as she was unable to work and had no money or benefits coming in.

“By analysing people’s views as expressed in online forums, where they are more open and less guarded, we’ve seen some valuable insights into why some stroke survivors have difficulty adhering to their medication,” says PhD candidate and first author James Jamison from the Department of Public Health and Primary Care at Cambridge.

“Challenging negative beliefs about medication and adopting practices that make routines for taking medication simpler, particularly for those patients who have suffered disability as a result of stroke, should increase adherence and ultimately improve health outcomes.”

The research was supported by the National Institute for Health Research, the Stroke Association and the British Heart Foundation.

For more information about statins, visit NHS Choices.

Reference
Jamison, J et al. Barriers and facilitators to adherence to secondary stroke prevention medications after stroke: Analysis of survivors’ and caregivers’ views from an online stroke forum. BMJ Open; 19 July 2017; DOI: 10.17863/CAM.10458


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Smallest-Ever Star Discovered By Astronomers

Smallest-ever star discovered by astronomers

source: www.cam.ac.uk

A star about the size of Saturn – the smallest ever measured – has been identified by astronomers.

Our discovery reveals how small stars can be.

Alexander von Boetticher

The smallest star yet measured has been discovered by a team of astronomers led by the University of Cambridge. With a size just a sliver larger than that of Saturn, the gravitational pull at its stellar surface is about 300 times stronger than what humans feel on Earth.
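
The quoted figure can be sanity-checked with Newton’s law of gravitation, g = GM/R². The mass and radius below are assumptions chosen for illustration – a star near the hydrogen-burning limit, roughly Saturn-sized – not the paper’s measured values:

```python
# Back-of-envelope check on the "300 times" figure (illustrative values,
# not the paper's exact measurements): a star at the hydrogen-burning
# limit, ~0.081 solar masses, with a radius just above Saturn's.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30     # solar mass, kg
R_STAR = 5.9e7       # assumed stellar radius, m (Saturn is ~5.82e7 m)

M_star = 0.081 * M_SUN
g_star = G * M_star / R_STAR ** 2        # Newtonian surface gravity, m/s^2
print(round(g_star / 9.81))              # → 315, i.e. roughly 300 g
```

A few percent either way on the assumed mass or radius moves the answer noticeably, since gravity scales with M/R², but any reasonable choice lands in the vicinity of 300 times Earth’s surface gravity.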

The star is likely as small as stars can possibly become, as it has just enough mass to enable the fusion of hydrogen nuclei into helium. If it were any smaller, the pressure at the centre of the star would no longer be sufficient to enable this process to take place. Hydrogen fusion is also what powers the Sun, and scientists are attempting to replicate it as a powerful energy source here on Earth.

These very small and dim stars are also the best possible candidates for detecting Earth-sized planets which can have liquid water on their surfaces, such as TRAPPIST-1, an ultracool dwarf surrounded by seven temperate Earth-sized worlds.

The newly-measured star, called EBLM J0555-57Ab, is located about six hundred light years away. It is part of a binary system, and was identified as it passed in front of its much larger companion, a method which is usually used to detect planets, not stars. Details will be published in the journal Astronomy & Astrophysics.

“Our discovery reveals how small stars can be,” said Alexander von Boetticher, the lead author of the study, and a Master’s student at Cambridge’s Cavendish Laboratory and Institute of Astronomy. “Had this star formed with only a slightly lower mass, the fusion reaction of hydrogen in its core could not be sustained, and the star would instead have transformed into a brown dwarf.”

EBLM J0555-57Ab was identified by WASP, a planet-finding experiment run by the Universities of Keele, Warwick, Leicester and St Andrews. EBLM J0555-57Ab was detected when it passed in front of, or transited, its larger parent star, forming what is called an eclipsing stellar binary system. The parent star became dimmer in a periodic fashion, the signature of an orbiting object. Thanks to this special configuration, researchers can accurately measure the mass and size of any orbiting companions, in this case a small star. The mass of EBLM J0555-57Ab was established via the Doppler ‘wobble’ method, using data from the CORALIE spectrograph.

“This star is smaller, and likely colder, than many of the gas giant exoplanets that have so far been identified,” said von Boetticher. “Although they are a fascinating feature of stellar physics, it is often harder to measure the size of such dim, low-mass stars than of many of the larger planets. Thankfully, we can find these small stars with planet-hunting equipment, when they orbit a larger host star in a binary system. It might sound incredible, but finding a star can at times be harder than finding a planet.”

This newly-measured star has a mass comparable to the current estimate for TRAPPIST-1, but has a radius that is nearly 30% smaller. “The smallest stars provide optimal conditions for the discovery of Earth-like planets, and for the remote exploration of their atmospheres,” said co-author Amaury Triaud, senior researcher at Cambridge’s Institute of Astronomy. “However, before we can study planets, we absolutely need to understand their star; this is fundamental.”

Although they are the most numerous stars in the Universe, stars with sizes and masses less than 20% those of the Sun are poorly understood, since they are difficult to detect due to their small size and low brightness. The EBLM project, which identified the star in this study, aims to plug that gap in knowledge. “Thanks to the EBLM project, we will achieve a far greater understanding of the planets orbiting the most common stars that exist, planets like those orbiting TRAPPIST-1,” said co-author Professor Didier Queloz of Cambridge’s Cavendish Laboratory.

Reference
Alexander von Boetticher et al. ‘A Saturn-size low-mass star at the hydrogen-burning limit.’ Astronomy & Astrophysics (2017). arXiv:1706.08781


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Green Method Developed For Making Artificial Spider Silk

Green method developed for making artificial spider silk

source: www.cam.ac.uk

Researchers have designed a super stretchy, strong and sustainable material that mimics the qualities of spider silk, and is ‘spun’ from a material that is 98% water.

This method of making fibres could be a sustainable alternative to current manufacturing methods.

Darshil Shah

A team of architects and chemists from the University of Cambridge has designed super-stretchy and strong fibres which are almost entirely composed of water, and could be used to make textiles, sensors and other materials. The fibres, which resemble miniature bungee cords as they can absorb large amounts of energy, are sustainable, non-toxic and can be made at room temperature.

This new method not only improves on earlier ways of making synthetic spider silk, since it does not require high-energy procedures or extensive use of harmful solvents, but could also substantially improve the manufacture of synthetic fibres of all kinds, since other types of synthetic fibre also rely on high-energy, toxic methods. The results are reported in the journal Proceedings of the National Academy of Sciences.

Spider silk is one of nature’s strongest materials, and scientists have been attempting to mimic its properties for a range of applications, with varying degrees of success. “We have yet to fully recreate the elegance with which spiders spin silk,” said co-author Dr Darshil Shah from Cambridge’s Department of Architecture.

The fibres designed by the Cambridge team are “spun” from a soupy material called a hydrogel, which is 98% water. The remaining 2% of the hydrogel is made of silica and cellulose, both naturally available materials, held together in a network by barrel-shaped molecular “handcuffs” known as cucurbiturils. The chemical interactions between the different components enable long fibres to be pulled from the gel.

The fibres are pulled from the hydrogel, forming long, extremely thin threads – a few millionths of a metre in diameter. After roughly 30 seconds, the water evaporates, leaving a fibre which is both strong and stretchy.

“Although our fibres are not as strong as the strongest spider silks, they can support stresses in the range of 100 to 150 megapascals, which is similar to other synthetic and natural silks,” said Shah. “However, our fibres are non-toxic and far less energy-intensive to make.”

The fibres are capable of self-assembly at room temperature, and are held together by supramolecular host-guest chemistry, which relies on forces other than covalent bonds (in which atoms share electrons).

“When you look at these fibres, you can see a range of different forces holding them together at different scales,” said Yuchao Wu, a PhD student in Cambridge’s Department of Chemistry, and the paper’s lead author. “It’s like a hierarchy that results in a complex combination of properties.”

The strength of the fibres exceeds that of other synthetic fibres, such as cellulose-based viscose and artificial silks, as well as natural fibres such as human or animal hair.

In addition to their strength, the fibres also show very high damping capacity, meaning that they can absorb large amounts of energy, like a bungee cord. Very few synthetic fibres have this capacity, but high damping is one of the special characteristics of spider silk. The researchers found that the damping capacity in some cases even exceeded that of natural silks.

“We think that this method of making fibres could be a sustainable alternative to current manufacturing methods,” said Shah. The researchers plan to explore the chemistry of the fibres further, including making yarns and braided fibres.

This research is the result of a collaboration between the Melville Laboratory for Polymer Synthesis in the Department of Chemistry, led by Professor Oren Scherman; and the Centre for Natural Material Innovation in the Department of Architecture, led by Dr Michael Ramage. The two groups have a mutual interest in natural and nature-inspired materials, processes and their applications across different scales and disciplines.

The research is supported by the UK Engineering and Physical Sciences Research Council (EPSRC) and the Leverhulme Trust.

Reference
Yuchao Wu et al. ‘Bioinspired supramolecular fibers drawn from a multiphase self-assembled hydrogel.’ Proceedings of the National Academy of Sciences (2017). DOI: 10.1073/pnas.1705380114


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Unique Cambridge Online Start-Up Helps UK Businesses Trade in Chinese

Unique Cambridge online start-up helps UK businesses trade in Chinese


With the ever-changing international trade environment, trading with China is more important than ever.  There is a real need for a single source of help for doing business with China that not only provides useful information, knowledge and insight, but also offers an efficient mechanism to get Chinese projects done affordably and to a good standard.

Crayfish International, a Cambridge-based company, has addressed this need by launching the Crayfish® platform, the first and only online marketplace dedicated to English-Chinese bilingual project work. It enables businesses to instantly access a variety of skills and China expertise on demand – from translation to setting up a Chinese social media account, identifying a distributor or finding the right manufacturer in China. No matter how big or small the project, Crayfish’s simple design and transaction process means it can be done in just four easy steps. Crayfish offers considerable cost savings, enabling businesses to make good use of flexible Chinese-speaking resources that are currently under-utilised.

“After working with many SMEs to access China over the past 17 years, we fully understand all the difficulties they have in getting Chinese projects done when they do not have enough resources. The timing of Crayfish could not be better, especially with the uncertainties post-Brexit,” says Ting Zhang, Founder and CEO of Crayfish International.

Ting also set up China Business Solutions back in 2001, the same year China joined the WTO, with a mission to help UK companies turn China opportunities into commercial success. Since then she has personally helped many companies, including familiar names in the Cambridge tech cluster, but it was her passion to find an ultimately affordable solution for SMEs that led to the creation of Crayfish, which Ting describes as “Chinese-speaking resources at your fingertips”. According to Ting: “Crayfish is perfect for those who would like to trade with China in small steps, for example getting a one-pager translated into Chinese, or getting someone in China to qualify some sales leads on the ground. Companies that already have significant operations in China can find Crayfish helpful for ad hoc project work that they don’t have the in-house resources to do quickly and well.”

Co-founder Dr Elizabeth Hill adds: “Our vision is to bridge the language and cultural barrier between the rest of the world and China, and our goal is for Crayfish to become THE PLACE for getting affordable help in Chinese.”

Since the launch of the platform three weeks ago, a number of translation and business development projects have already been completed and more are underway, including one that involves getting utility model patents filed in China. Freelancers have signed up from the UK, China, New Zealand and Finland, with skills covering translation, social media, business development, sourcing, legal, accounting, events, investment analysis and technical research.

Eighteen months in the making, Crayfish is backed by two Cambridge serial entrepreneurs, Dr Jonathan Milner and Dr David Cleevely.

For more information about Crayfish, please visit www.Crayfish.io

Media enquiries:  media@crayfish.io

Artificial Bile Ducts Grown in Lab and Transplanted Into Mice Could Help Treat Liver Disease in Children

Artificial bile ducts grown in lab and transplanted into mice could help treat liver disease in children

source: www.cam.ac.uk

Cambridge scientists have developed a new method for growing and transplanting artificial bile ducts that could in future be used to help treat liver disease in children, reducing the need for liver transplantation.

Our work has the potential to transform the treatment of bile duct disorders

Ludovic Vallier

In research published in the journal Nature Medicine, the researchers grew 3D cellular structures which, once transplanted into mice, developed into normal, functioning bile ducts.

Bile ducts are long, tube-like structures that carry bile, which is secreted by the liver and is essential for helping us digest food. If the ducts do not work correctly, for example in the childhood disease biliary atresia, this can lead to a damaging build-up of bile in the liver.

The study suggests that it will be feasible to generate and transplant artificial human bile ducts using a combination of cell transplantation and tissue engineering technology. This approach provides hope for the future treatment of diseases of the bile duct; at present, the only option is a liver transplant.

The University of Cambridge research team, led by Professor Ludovic Vallier and Dr Fotios Sampaziotis from the Wellcome-MRC Cambridge Stem Cell Institute and Dr Kourosh Saeb-Parsy from the Department of Surgery, extracted healthy cells (cholangiocytes) from bile ducts and grew these into functioning 3D duct structures known as biliary organoids.  When transplanted into mice, the biliary organoids assembled into intricate tubular structures, resembling bile ducts.

The researchers, in collaboration with Mr Alex Justin and Dr Athina Markaki from the Department of Engineering, then investigated whether the biliary organoids could be grown on a ‘biodegradable collagen scaffold’, which could be shaped into a tube and used to repair damaged bile ducts in the body.  After four weeks, the cells had fully covered the miniature scaffolding resulting in artificial tubes which exhibited key features of a normal, functioning bile duct.  These artificial ducts were then used to replace damaged bile ducts in mice.  The artificial duct transplants were successful, with the animals surviving without further complications.

“Our work has the potential to transform the treatment of bile duct disorders,” explains Professor Vallier. “At the moment, our only option is liver transplantation, so we are limited by the availability of healthy organs for transplantation. In future, we believe it will be possible to generate large quantities of bioengineered tissue that could replace diseased bile ducts and provide a powerful new therapeutic option without this reliance on organ transplants.”

“This demonstrates the power of tissue engineering and regenerative medicine,” adds Dr Sampaziotis. “These artificial bile ducts will not only be useful for transplanting, but could also be used to model other diseases of the bile duct and potentially develop and test new drug treatments.”

Professor Vallier is part of the Department of Surgery at the University of Cambridge and his team are jointly based at the Wellcome Trust-MRC Cambridge Stem Cell Institute and the Wellcome Trust Sanger Institute.

The work was supported by the Medical Research Council, Sparks children’s medical research charity and the European Research Council.

Reference
Sampaziotis, F et al. Reconstruction of the murine extrahepatic biliary tree using primary extrahepatic cholangiocyte organoids. Nature Medicine; 3 July 2017; DOI: 10.1038/nm.4360


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

‘Brain Training’ App Found To Improve Memory In People With Mild Cognitive Impairment

‘Brain training’ app found to improve memory in people with mild cognitive impairment

source: www.cam.ac.uk

A ‘brain training’ game developed by researchers at the University of Cambridge could help improve the memory of patients in the very earliest stages of dementia, suggests a study published today in The International Journal of Neuropsychopharmacology.

There’s increasing evidence that brain training can be beneficial for boosting cognition and brain health, but it needs to be based on sound research

Barbara Sahakian

Amnestic mild cognitive impairment (aMCI) has been described as the transitional stage between ‘healthy ageing’ and dementia. It is characterised by day-to-day memory difficulties and problems of motivation. At present, there are no approved drug treatments for the cognitive impairments of patients affected by the condition.

Cognitive training has shown some benefits, such as speed of attentional processing, for patients with aMCI, but training packages are typically repetitive and boring, affecting patients’ motivation. To overcome this problem, researchers from the Departments of Psychiatry and Clinical Neurosciences and the Behavioural and Clinical Neuroscience Institute at the University of Cambridge developed ‘Game Show’, a memory game app, in collaboration with patients with aMCI, and tested its effects on cognition and motivation.

The researchers randomly assigned 42 patients with aMCI to either the cognitive training group or the control group. Participants in the cognitive training group played the memory game for a total of eight one-hour sessions over a four-week period; participants in the control group continued their clinic visits as usual.

In the game, which participants played on an iPad, the player takes part in a game show to win gold coins. In each round, they are challenged to associate different geometric patterns with different locations. Each correct answer allows the player to earn more coins. Rounds continue until completion or after six incorrect attempts are made. The better the player gets, the higher the number of geometric patterns presented – this helps tailor the difficulty of the game to the individual’s performance to keep them motivated and engaged. A game show host encourages the player to maintain and progress beyond their last played level.
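
The adaptive loop described above can be sketched in a few lines. This is a simplified illustration only – the six-error cut-off comes from the description above, but the step sizes and function names are assumptions, not the published game’s actual design:

```python
# Simplified sketch of the adaptive rule described in the text. The
# six-error round limit matches the description; the one-step difficulty
# changes below are illustrative assumptions.

def run_round(n_pairs, answer_correctly):
    """Play one round: associate n_pairs patterns with locations.

    A round ends when every pair is placed, or after six errors.
    answer_correctly(n_pairs) models a single attempt by the player.
    Returns (round_won, errors_made).
    """
    remembered, errors = 0, 0
    while remembered < n_pairs and errors < 6:
        if answer_correctly(n_pairs):
            remembered += 1
        else:
            errors += 1
    return errors < 6, errors

def next_difficulty(n_pairs, won):
    # Step the number of patterns up on a win, down on a loss (floor of 1),
    # keeping the challenge matched to the player's performance.
    return n_pairs + 1 if won else max(1, n_pairs - 1)
```

A perfect player would climb one level per round, while a struggling player drifts down until rounds become winnable again – the tailoring that keeps patients motivated and engaged.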

Screenshot from Game Show. Credit: Sahakian Lab

The results showed that patients who played the game made around a third fewer errors, needed fewer trials and improved their memory score by around 40%, showing that they had correctly remembered the locations of more items at the first attempt on a test of episodic memory. Episodic memory is important for day-to-day activities and is used, for example, when remembering where we left our keys in the house or where we parked our car in a multi-storey car park. Compared to the control group, the cognitive training group also retained more complex visual information after training.

In addition, participants in the cognitive training group indicated that they enjoyed playing the game and were motivated to continue playing across the eight hours of cognitive training. Their confidence and subjective memory also increased with gameplay. The researchers say that this demonstrates that games can help maximise engagement with cognitive training.

“Good brain health is as important as good physical health. There’s increasing evidence that brain training can be beneficial for boosting cognition and brain health, but it needs to be based on sound research and developed with patients,” says Professor Barbara Sahakian, co-inventor of the game. “It also needs to be enjoyable enough to motivate users to keep to their programmes. Our game allowed us to individualise a patient’s cognitive training programme and make it fun and enjoyable for them to use.”

Dr George Savulich, the lead scientist on the study, adds: “Patients found the game interesting and engaging and felt motivated to keep training throughout the eight hours. We hope to extend these findings in future studies of healthy ageing and mild Alzheimer’s disease.”

The researchers hope to follow up this study with a future large-scale trial and to determine how long the cognitive improvements persist.

The design of ‘Game Show’ was based on published research from the Sahakian Laboratory at the University of Cambridge. The study was funded by Janssen Pharmaceuticals/J&J and Wellcome.

In 2015, Professor Sahakian and colleagues showed that another iPad game developed by her team was effective at improving the memory of patients with schizophrenia, helping them in their daily lives at work and living independently. The Wizard memory game is available through PEAK via the App Store and Google Play.

Reference
George Savulich, Thomas Piercy, Chris Fox, John Suckling, James Rowe, John O’Brien, Barbara Sahakian. Cognitive training using a novel memory game on an iPad in patients with amnestic mild cognitive impairment (aMCI). The International Journal of Neuropsychopharmacology; 3 July 2017; DOI: 10.1093/ijnp/pyx040


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

‘Bulges’ In Volcanoes Could Be Used To Predict Eruptions

‘Bulges’ in volcanoes could be used to predict eruptions

source: www.cam.ac.uk

A team of researchers from the University of Cambridge has developed a new way of measuring the pressure inside volcanoes, and found that it can be a reliable indicator of future eruptions.

This could be a new way of predicting volcanic eruptions.

Clare Donaldson

Using a technique called ‘seismic noise interferometry’ combined with geophysical measurements, the researchers measured the energy moving through a volcano. They found that there is a good correlation between the speed at which the energy travelled and the amount of bulging and shrinking observed in the rock. The technique could be used to predict more accurately when a volcano will erupt. Their results are reported in the journal Science Advances.

Data was collected by the US Geological Survey across Kīlauea in Hawaii, a very active volcano with a lake of bubbling lava just beneath its summit. During a four-year period, the researchers used sensors to measure relative changes in the velocity of seismic waves moving through the volcano over time. They then compared their results with a second set of data which measured tiny changes in the angle of the volcano over the same time period.

As Kīlauea is such an active volcano, it is constantly bulging and shrinking as pressure in the magma chamber beneath the summit increases and decreases. Kīlauea’s current eruption started in 1983, and it spews and sputters lava almost constantly. Earlier this year, a large part of the volcano fell away, opening up a huge ‘waterfall’ of lava into the ocean below. Due to this high volume of activity, Kīlauea is also one of the most-studied volcanoes on Earth.

The Cambridge researchers used seismic noise to detect what was controlling Kīlauea’s movement. Seismic noise is a persistent low-level vibration in the Earth, caused by everything from earthquakes to waves in the ocean, and can often be read on a single sensor as random noise. But by pairing sensors together, the researchers were able to observe energy passing between the two, therefore allowing them to isolate the seismic noise that was coming from the volcano.
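
The sensor-pairing step can be illustrated with a toy cross-correlation. This is a deliberately simplified sketch, not the USGS processing chain – the delay, noise levels and record length are invented for illustration:

```python
import numpy as np

# Toy illustration of pairing two sensors: both records share one noise
# field, with the shared energy reaching sensor B 37 samples after A.
rng = np.random.default_rng(0)
n = 5000
noise_field = rng.standard_normal(n)                         # shared seismic noise
a = noise_field + 0.5 * rng.standard_normal(n)               # sensor A record
b = np.roll(noise_field, 37) + 0.5 * rng.standard_normal(n)  # sensor B record

# Cross-correlating the pair picks out the lag at which shared energy
# travels between the sensors; in practice, tracking how this lag drifts
# over time is what reveals seismic velocity changes inside the volcano.
xc = np.correlate(b, a, mode="full")
lag = int(np.argmax(xc) - (n - 1))
print(lag)   # → 37: the inter-sensor travel time, in samples
```

The shared component stands out of the random noise only because the correlation stacks it coherently over the whole record, which is why the real technique needs long, continuous recordings rather than single snapshots.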

“We were interested in how the energy travelling between the sensors changes, whether it’s getting faster or slower,” said Clare Donaldson, a PhD student in Cambridge’s Department of Earth Sciences, and the paper’s first author. “We want to know whether the seismic velocity changes reflect increasing pressure in the volcano, as volcanoes bulge out before an eruption. This is crucial for eruption forecasting.”

One to two kilometres below Kīlauea’s lava lake, there is a reservoir of magma. As the amount of magma changes in this underground reservoir, the whole summit of the volcano bulges and shrinks. At the same time, the seismic velocity changes. As the magma chamber fills up, it causes an increase in pressure, which leads to cracks closing in the surrounding rock and producing faster seismic waves – and vice versa.
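The mechanism described above can be illustrated with a toy cross-correlation in Python. This is a minimal sketch, not the USGS processing chain: the noise model, sensor distance, sampling rate and wave velocities are all invented for illustration. Faster waves (higher pressure, cracks closed) arrive sooner, so the correlation peak shifts to a smaller lag.

```python
import numpy as np

rng = np.random.default_rng(0)

def arrival_lag(velocity, distance=1000.0, fs=100.0, n=20000):
    """Cross-correlate noise recorded at two sensors and return the
    lag (in samples) of the correlation peak, i.e. the travel time."""
    noise = rng.standard_normal(n)          # shared seismic noise field
    delay = int(round(distance / velocity * fs))
    near = noise                            # recorded at the near sensor
    far = np.roll(noise, delay)             # same noise, delayed, at the far sensor
    corr = np.correlate(far, near, mode="full")
    return int(np.argmax(corr)) - (n - 1)   # offset of the peak from zero lag

# Higher pressure closes cracks -> faster waves -> smaller lag.
lag_low_pressure = arrival_lag(velocity=2000.0)   # 1000 m / 2000 m/s = 50 samples
lag_high_pressure = arrival_lag(velocity=2500.0)  # 1000 m / 2500 m/s = 40 samples
print(lag_low_pressure, lag_high_pressure)        # 50 40
```

In the study itself the quantity of interest is the small relative change in velocity over time, tracked by comparing such correlation functions from day to day rather than by reading off a single lag.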

“This is the first time that we’ve been able to compare seismic noise with deformation over such a long period, and the strong correlation between the two shows that this could be a new way of predicting volcanic eruptions,” said Donaldson.

Volcano seismology has traditionally measured small earthquakes at volcanoes. When magma moves underground, it often sets off tiny earthquakes, as it cracks its way through solid rock. Detecting these earthquakes is therefore very useful for eruption prediction. But sometimes magma can flow silently, through pre-existing pathways, and no earthquakes may occur. This new technique will still detect the changes caused by the magma flow.

Seismic noise occurs continuously, and is sensitive to changes that would otherwise have been missed. The researchers anticipate that the method could now be applied at the hundreds of active volcanoes around the world.

Reference
C. Donaldson et al. ‘Relative seismic velocity variations correlate with deformation at Kīlauea volcano’. Science Advances (2017) DOI: 10.1126/sciadv.1700219 

Inset image: Lava Waterfall, Kilauea Volcano, Hawaii. Credit: Dhilung Kirat


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Study Reveals Mysterious Equality With Which Grains Pack It In

Study reveals mysterious equality with which grains pack it in

source: www.cam.ac.uk

For the first time, researchers have been able to test a theory explaining the physics of how substances like sand and gravel pack together, helping them to understand more about some of the most industrially-processed materials on the planet.

Granular materials are so widely-used that understanding their physics is very important. Clearly, something very special is happening at the moment when grains pack together in this way.

Stefano Martiniani

At the moment they come together, the individual grains in materials like sand and snow appear to have exactly the same probability of combining into any one of their many billions of possible arrangements, researchers have shown.

The finding, by an international team of academics at the University of Cambridge, UK, and Brandeis University in the US, appears to confirm a decades-old mathematical theory which has never been proven, but provides the basis for better understanding granular materials – one of the most industrially significant classes of material on the planet.

A granular material is anything that comprises solid particles that can be seen individually with the naked eye. Examples include sand, gravel, snow, coal, coffee, and rice.

If correct, the theory demonstrated in the new study points to a fact of remarkable – and rather mysterious – mathematical symmetry. It means, for example, that every single possible arrangement of the grains of sand within a sand dune is exactly as probable as any other.

The study was led by Stefano Martiniani, who is based at New York University but undertook the research while completing his PhD at St John’s College, University of Cambridge.

“Granular materials are so widely-used that understanding their physics is very important,” Martiniani said. “This theory gives us a very simple and elegant way to describe their behaviour. Clearly, something very special is happening in their physics at the moment when grains pack together in this way.”

The conjecture that Martiniani tested was first proposed in 1989 by the Cambridge physicist Sir Sam F. Edwards, in an effort to better understand the physical properties of granular materials.

Globally, these are the second-most processed type of material in industry (after water) and staples of sectors such as energy, food and pharmaceuticals. In the natural world, vast granular assemblies, such as sand dunes, interact directly with wind, water and vegetation. Yet the physical laws that determine how they behave in different conditions are still poorly understood. Sand, for example, behaves like a solid when jammed together, but flows like a liquid when loose.

Understanding more about the mechanics of granular materials is of huge practical importance. When they jam during industrial processing, for example, it can cause significant disruption and damage. Equally, the potential for granular materials to “unjam” can be disastrous, such as when soil or snow suddenly loosens, causing a landslide or avalanche.

At the heart of Edwards’ proposal was a simple hypothesis: If one does not explicitly add a bias when preparing a jammed packing of granular materials – for example by pouring sand into a container – then any possible arrangement of the grains within a certain volume will occur with the same probability.

This is the analogue of the assumption that is at the heart of equilibrium statistical mechanics – that all states with the same energy occur with equal probability. As a result the Edwards hypothesis offered a way for researchers to develop a statistical mechanics framework for granular materials, which has been an area of intense activity in the last couple of decades.

But the hypothesis was impossible to test – not least because above a handful of grains, the number of possible arrangements becomes unfathomably huge. Edwards himself died in 2015, with his theory still the subject of heated scientific debate.

Now, Martiniani and colleagues have been able to put his conjecture to a direct test, and to their surprise they found that it broadly holds true. Provided that the grains are at the point where they have just jammed together (or are just about to separate), all possible configurations are indeed equally likely.

Helpfully, this critical point – known as the jamming transition – is also the point of practical significance for many of the granular materials used in industry. Although Martiniani modelled a system comprising soft spheres, a bit like sponge tennis balls, many granular materials are hard grains that cannot be compressed further once in a packed state.

“Apart from being a very beautiful theory, this study gives us the confidence that Edwards’ framework was correct,” Martiniani said. “That means that we can use it as a lens through which to look at a whole range of related problems.”

Aside from informing existing processes that involve granular materials, there is a wider significance to better understanding their mechanics. In physics, a “system” is anything that involves discrete particles operating as part of a wider network. Although bigger in scale, the way in which icebergs function as part of an ice floe, or the way that individual vehicles move within a flow of traffic (and indeed sometimes jam), can be studied using a similar theoretical basis.

Martiniani’s study was undertaken during his PhD, while he was a Gates Scholar, under the supervision of Professor Daan Frenkel from the Department of Chemistry. It built on earlier research in which he developed new methods for calculating the probability of granular systems packing into different configurations, despite the vast numbers involved. In work published last year, for example, he and colleagues used computer modelling to work out how many ways a system containing 128 tennis balls could potentially be arranged. The answer turned out to be ten unquadragintilliard – a number so huge that it vastly exceeds the total number of particles in the universe.

In the new study, the researchers employed a sampling technique which computes the probability of different arrangements of grains without observing the frequency with which those arrangements actually occur. Rather than taking an average from random samples, the method calculates bounds on the probability of specific arrangements, and then derives the overall probability from these.

The team applied this to a computer model of 64 soft spheres – an imaginary system which could therefore be “over-compressed” after reaching the jamming transition point. In an over-compressed state, the different arrangements were found to have different probabilities of occurrence. But as the system decompressed to the point of the jamming transition, at which the grains were effectively just touching, the researchers found that all probabilities became equal – exactly as Edwards predicted.
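The difference between counting how often arrangements occur and computing their probabilities directly can be sketched on a toy one-dimensional energy landscape, with two minima standing in for two jammed packings. Everything here – the landscape, the descent parameters, the domain – is invented for illustration; it is not the algorithm used in the study.

```python
import numpy as np

# Toy landscape f(x) = (x^2 - 1)^2 + 0.2x on [-2, 2]: two minima
# ("packings"), one either side of a basin boundary near x ~ 0.05.
def grad(x):
    return 4 * x**3 - 4 * x + 0.2

def basin(x, step=0.02, iters=800):
    """Gradient descent from x; returns 0 (left minimum) or 1 (right)."""
    x = np.asarray(x, dtype=float)
    for _ in range(iters):
        x = x - step * grad(x)
    return (x > 0).astype(int)

# Frequency method: descend from many random starts, count outcomes.
rng = np.random.default_rng(1)
freq = basin(rng.uniform(-2, 2, 5000)).mean()

# Direct method: locate the basin boundary by bisection, then read the
# right-hand basin's probability off as its share of the domain volume,
# without ever tallying outcomes.
lo, hi = -2.0, 2.0
for _ in range(40):
    mid = 0.5 * (lo + hi)
    if basin(mid).item() == 0:
        lo = mid
    else:
        hi = mid
vol = (2.0 - hi) / 4.0

print(round(vol, 3))  # the right basin's share of configuration space, ~0.487
```

The two estimates agree to within sampling error, but only the direct one scales: the actual study computes basin volumes of jammed packings in a high-dimensional configuration space, where counting by brute-force enumeration is impossible.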

“In 1989, we didn’t really have the means of studying whether Edwards was right or not,” Martiniani added. “Now that we do, we can understand more about how granular materials work; how they flow, why they get stuck, and how we can use and manage them better in a whole range of different situations.”

The study, Numerical test of the Edwards conjecture shows that all packings become equally probable at jamming is published in the journal Nature Physics. DOI: 10.1038/nphys4168.


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

World War II Bombing Associated With Resilience, Not ‘German Angst’

World War II bombing associated with resilience, not ‘German Angst’

source: www.cam.ac.uk

Experiencing traumatic events may be associated with greater mental resilience rather than widespread angst, suggests a study published this week that investigated the effect of World War II bombing on the mental health of residents of German cities.

Maybe this stereotype of ‘German Angst’ isn’t entirely valid

Jason Rentfrow

Germans have been stereotyped as being industrious and punctual, but also as being more likely to be anxious and worried, a phenomenon described as ‘German Angst’. Former German Chancellor Helmut Schmidt, widely regarded as one of Germany’s leading post-war intellectuals, once claimed, “The Germans have a tendency to be afraid. This has been part of their consciousness since the end of the Nazi period and the war”.

This personality type is characterised by high levels of neurotic personality traits (more likely to be in a negative emotional state), as opposed to traits of openness, agreeableness, extraversion, or conscientiousness, which together make up the ‘Big Five’ personality traits. It has been suggested that the heavy bombing of German cities in World War II, and the resulting destruction and trauma experienced by residents, may have been a contributory factor in this proposed higher incidence of neurotic traits.

In a study published this week in European Journal of Personality, an international team of researchers from the UK, Germany, USA, and Australia, analysed the neurotic personality traits and mental health of over 33,500 individuals across 89 regional German cities that experienced wartime bombing, and investigated whether people in cities that experienced higher levels of bombing were more likely to display neurotic traits. The researchers measured neurotic traits using the Big Five Inventory personality test as part of an online questionnaire, and focused on measures of neuroticism, anxiety, and depression.

“If the idea of ‘German Angst’ is true, then we’d expect people from cities that were heavily bombed during the war to be more anxious and less resilient to new stresses such as economic hardship,” says study author Dr Jason Rentfrow from the Department of Psychology, University of Cambridge. “Ours is the first study to investigate this link.”

The researchers found that in fact, residents of heavily bombed cities were less likely to display neurotic traits, suggesting that wartime bombing is not a factor in German Angst. The results indicate that residents of heavily bombed German cities instead recorded higher levels of mental resilience and were better able to cope in times of stress.

“We’ve seen from other studies that when people experience difficulties in life, these can provide them with a broader perspective on things and perhaps make more trivial stresses seem unimportant,” explains Dr Rentfrow. “It’s possible that this is what we are seeing here.”

The researchers also looked at how Germany compared to 107 other countries for neurotic traits, to see whether there really was evidence of ‘German Angst’. They found that Germany ranks 20th, 31st, and 53rd for depression, anxiety, and neuroticism respectively. Additionally, other countries that have experienced significant trauma due to warfare, such as Japan, Afghanistan, and Vietnam, also did not score highly for neurotic traits, further suggesting that such traumatic events are not associated with increased neuroticism.

“Germany didn’t stand out as high in anything resembling angst compared with other countries, which suggests that maybe this stereotype of ‘German Angst’ isn’t entirely valid,” says Dr Rentfrow. “Clearly we need to be careful about national stereotypes.”

The researchers emphasise that their findings show only an association, and that this data does not show whether more severe bombing caused greater mental resilience, or whether other factors were at play.

Although this research may have implications for other war-torn countries, including the current situation in Syrian cities, the study did not investigate potential neuroticism or resilience in these countries, so no wider conclusions can be drawn from this data.

Study participants filled out online questionnaires provided by the global Gosling-Potter Internet Project, including 44 questions to assess their personality and mental state. Of the sample, just under 60% were female and the mean age was 30 years old. Almost all (96%) of the respondents were White/Caucasian while just under one in three (30%) had a bachelor’s degree or higher, and overall the sample was broadly representative of the populations of the cities assessed. Although the researchers tried to control for the movement of people between different cities, there were limitations with the data available from the online survey and so this movement may have affected the results.

The data also could not tell whether increased resilience was associated with a recent event, or whether it was associated with an event from many years or even decades ago. However, there is broader literature to support the notion of traumas increasing resilience in individuals, and more research in this area would shed further light on the relationship and potential mechanisms at play.


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Computer-Designed Antibodies Target Toxins Associated With Alzheimer’s Disease

Computer-designed antibodies target toxins associated with Alzheimer’s disease

source: www.cam.ac.uk

Researchers at the University of Cambridge have designed antibodies that target the protein deposits in the brain associated with Alzheimer’s disease, and stop their production.

If we can find better and cheaper ways of producing antibodies, we would increase the chances of finding treatments for patients

Michele Vendruscolo

The researchers used computer-based methods to develop antibodies – the star players of the body’s natural defence system – to target the deposits of misfolded proteins which are a hallmark of Alzheimer’s disease. Early tests of the antibodies in test tubes and in nematode worms showed an almost complete elimination of these toxic aggregates.

The antibodies were designed by systematically scanning the sequence of amyloid-beta, the main component of the toxic deposits associated with Alzheimer’s disease. By targeting specific regions, or epitopes, of the amyloid-beta sequence, the different antibodies were able to block amyloid-beta’s ability to stick together, or aggregate. Their results are reported in the journal Science Advances.
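The scanning idea can be sketched in a few lines of Python. The seven-residue window below is a hypothetical choice for illustration; the study’s actual method goes further and designs complementary peptides against each scanned epitope.

```python
# Canonical 42-residue amyloid-beta (Abeta42) sequence, one letter per residue.
ABETA42 = "DAEFRHDSGYEVHHQKLVFFAEDVGSNKGAIIGLMVGGVVIA"

def scan_epitopes(seq, window=7):
    """Return (start_position, epitope) for every window along the sequence."""
    return [(i + 1, seq[i:i + window]) for i in range(len(seq) - window + 1)]

epitopes = scan_epitopes(ABETA42)
print(len(epitopes))   # 36 candidate linear epitopes
print(epitopes[0])     # (1, 'DAEFRHD')
print(epitopes[16])    # (17, 'LVFFAED') - spans the aggregation-prone LVFFA core
```

Each window is a candidate epitope; a designed antibody that binds one of them can interfere with the aggregation step that depends on that region.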

Alzheimer’s disease is the most common form of dementia, which affects nearly one million people in the UK and about 50 million worldwide. One of the hallmarks of Alzheimer’s disease is the build-up of protein deposits, known as plaques and tangles, in the brains of affected individuals. These deposits, which accumulate when naturally-occurring proteins in the body fold into the wrong shape and aggregate, are formed primarily of two proteins: amyloid-beta and tau.

The process of protein aggregation also creates smaller clusters called oligomers, which are highly toxic to nerve cells and are thought to be responsible for brain damage in Alzheimer’s disease. Researchers around the world have spent decades attempting to unravel the processes that cause Alzheimer’s disease, and to target the misfolding proteins before they are able to aggregate.

Antibodies are dedicated proteins that help defend the body against harmful pathogens by recognising their specific targets, known as antigens. The power of antibodies can be harnessed to make effective treatments, such as vaccines, but to date no antibody has been developed to treat Alzheimer’s or any other neurodegenerative disease, although several antibody-based treatments for Alzheimer’s disease are currently in clinical trials.

“Developing antibody-based therapies is costly and time-consuming, but if we can find better and cheaper ways of producing antibodies, we would increase the chances of finding treatments for patients – making them by design can create opportunities to achieve this goal,” said Professor Michele Vendruscolo from the Centre for Misfolding Diseases in Cambridge, and the paper’s senior author.

To date, there have been two main ways of producing antibodies. The first, which has been in use for about 50 years, is to inject animals with the relevant antigen. The antigen stimulates the immune system to produce antibodies to attack the alien substance, and those antibodies can then be extracted as a therapeutic. The second method, developed in the 1990s, does not require the use of animals and instead relies on the screening of large laboratory-constructed libraries to isolate the relevant antibodies.

“In the past few years, thanks to increasingly powerful computers and large structural databases, it has become possible to design antibodies in a computer, which substantially lowers the time and cost required,” said study co-author Dr Pietro Sormanni, a postdoctoral researcher in the Centre for Misfolding Diseases. “It also allows us to target specific regions within the antigen, as well as to control for other properties critical for clinical applications, such as antibody stability and solubility.”

One of the advantages of the antibodies used in this study is their very small size. In these smaller antibodies, called single-domain antibodies, the ‘trigger’ for an immune response is stripped off, thereby blocking the inflammatory reactions that have so far prevented the widespread adoption of antibody-based therapies for Alzheimer’s disease.

A major advantage of these designed antibodies is that they can be systematically produced to bind to the different regions of the target protein. In this way researchers can extensively and inexpensively explore a variety of mechanisms of action, and select the most effective one for blocking the production of toxins.

“Since the designed antibodies can selectively target oligomers, which are present in low numbers relative to the total amounts of amyloid-beta, we expect them to be effective even when administered in low doses,” said Dr Francesco Aprile, a Senior Research Fellow of the Alzheimer’s Society in the Centre for Misfolding Diseases and the study’s first author.

Not only are these antibodies designed not to stimulate an immune response, but they are also much smaller than standard antibodies, so they could be delivered more effectively to the brain through the blood-brain barrier. Aprile has recently been awarded the 2017 ‘Outstanding early-career contribution to dementia’ award by the Alzheimer’s Society for his work.

“The innovative approach taken by Dr Aprile and his colleagues tackles the issue of developing drugs for Alzheimer’s disease from a new angle, by using advanced computer techniques to design drugs that specifically block a crucial aspect of the disease process,” said James Pickett, Head of Research at the Alzheimer’s Society. “Over the last 50 years, advances in antibody technology have delivered radical new treatments for a wide range of common diseases including rheumatoid arthritis, multiple sclerosis and some forms of cancer. While the research is still in the early stages, we are excited by the potential of this work and hope it can do the same for Alzheimer’s disease.”

“These results indicate that computational methods are becoming ready to be used alongside existing antibody discovery methods, enabling the exploration of new ways of treating a range of human diseases,” said Vendruscolo.

Reference:
Francesco A. Aprile et al. ‘Selective targeting of primary and secondary nucleation pathways in Aβ42 aggregation using a rational antibody scanning method.’ Science Advances (2017). DOI: 10.1126/sciadv.1700488


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Earliest-Known Children’s Adaptation of Japanese Literary Classic Discovered in British Library

Earliest-known children’s adaptation of Japanese literary classic discovered in British Library

source: www.cam.ac.uk

A chance find in the British Library has led to the identification and reproduction of the earliest-known children’s adaptation of one of Japan’s greatest works of literature.

This is a missing piece of the jigsaw. And it was sitting in the British Library all along.

Laura Moretti

Dr Laura Moretti, from the Faculty of Asian and Middle Eastern Studies at Cambridge, came across an unknown children’s picture-book, dating from 1766, under the title of Ise fūryū: Utagaruta no hajimari (The Fashionable Ise: The Origins of Utagaruta) while on a study trip with her students.

The British Library copy, part of the collection belonging to Sir Ernest Satow, a 19th century British scholar and diplomat, is a picture-book adaptation of Ise Monogatari. Translated into English as The Tales of Ise, it is one of the most important works in Japanese literature. Probably composed in the late 9th century, it follows the protagonist, Ariwara no Narihira, through his many romances, friendships and travels.

The Tales of Ise has since been adapted and reinterpreted continually down the centuries as part of the canon of Japanese literature.

“If we were to hazard a comparison, The Tales of Ise could be seen as the equivalent of the works of Shakespeare in terms of canonical status in Japan but I had never heard of or seen a children’s adaptation before – no-one knew of this book,” said Moretti. “This is a missing piece of the jigsaw. No one ever knew if it had been rewritten for children – but now we know. And it was sitting in the British Library all along.”

Dr Moretti’s new book, Recasting the Past (Brill, 2016), presents a full-colour reproduction of the 18th century edition, alongside a transcription in modern Japanese, an English translation, and textual analysis. The publication of the 1766 adaptation of the Tales of Ise fills a gap in scholars’ understanding of the work’s history. Although much scholarship has taken place on the reception of Tales of Ise and its target audiences in different epochs, no one has previously explored the age of its readership.

The 1766 introduction by the publisher shows that the book was intended to be read by children, and there are various clues to support this view. The main character Narihira first appears as a young boy at school, a portrayal which encourages young people to identify with him. The whole text is also written mainly in the phonetic syllabary, which could be understood by readers with only two years of schooling. The story was also abbreviated to include only 13 of the original 125 episodes, making it easily accessible to a broad readership and useful for introducing those with basic literacy to Japan’s cultural heritage. The book would have educated children in the narrative of The Tales of Ise as well as the aesthetic quality of its poetry.

Moretti, though, counters the notion that only children would have read Utagaruta no hajimari, and argues that the text could also have worked as a substitute for The Tales of Ise for adults with limited linguistic and cultural literacy.

Now, after several years spent securing the necessary permissions to use the two complete extant copies (one held at the National Institute of Japanese Literature and the other at the Gotoh Museum, both in Tokyo; the British Library copy, alas, has only one volume of three) and finishing the transcription, translation and textual analysis, Utagaruta is available again for readers to enjoy – more than 250 years after it was first printed.

While graphic novels and comic books such as manga remain hugely popular in Japan and across the world today, instances of books where images and text are interdependent abound in pre-modern and early-modern Japanese literature. In this specific case, Moretti shows that the primary function of images was to complement the prose by filling in the gaps left by the narrative. Images set the scene for the story and helped to characterize the protagonists by depicting their dress and physical appearance.

Moretti believes that studying this children’s adaptation can contribute to the study of children’s literature in general, revealing aspects that might not be apparent in other cultures.

“Utagaruta no hajimari, for example, is trying to draw children into the world of the adult, rather than shield them from it, by introducing them to sex and appropriate romantic behaviour,” she said.

“A vast number of early-modern Japanese picture-books that adapt canonical literature awaits study. This research is the first step in the foundation of this field of study. If appropriately developed, it has the potential to shed light on new sides of children’s literature, as well as to advance our understanding of how early-modern Japanese graphic prose functioned.”


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Human Reproduction Likely To Be More Efficient Than Previously Thought

Human reproduction likely to be more efficient than previously thought

source: www.cam.ac.uk

How difficult is it to conceive? According to a widely-held view, fewer than one in three embryos make it to term, but a new study from a researcher at the University of Cambridge suggests that human embryos are not as susceptible to dying in the first weeks after fertilisation as often claimed.

It’s impossible to give a precise figure for how many embryos survive in the first week but in normal healthy women, it probably lies somewhere between 60-90%

Gavin Jarvis

Dr Gavin Jarvis from Cambridge’s Department of Physiology, Development and Neuroscience re-examined data going back to the 1940s and concluded that previous claims about natural embryo mortality are too often exaggerated. His report is published in F1000Research.

“Trying to determine whether a human embryo survives during the first days after fertilisation is almost impossible,” says Dr Jarvis. “A woman can only suspect that she is pregnant, at the earliest, two weeks after fertilisation, when she misses a period. Using sensitive laboratory tests, embryos can be detected as they implant into the womb about one week after fertilisation. What happens before then under natural circumstances is anyone’s guess.”

In 1938, two doctors in Boston, Dr Arthur Hertig and Dr John Rock, became the first people to see a human embryo when they examined wombs removed from women during surgery. They estimated that a half of human embryos die in the first two weeks after fertilisation. However, Dr Jarvis’s re-analysis of this data shows that this figure is so imprecise as to be of little value.

“I think it is fair to say that their data show that embryos can and do fail at these early stages, and also that many do just fine, but we could say that even without the data,” he adds. “Hertig’s samples, whilst descriptively informative, are quantitatively unhelpful. It doesn’t take us much further than where we would be without the data.”

Pregnancies are also lost after the first two weeks, and currently published estimates of total embryo loss from fertilisation through to birth range from less than 50% to 90%. Embryo mortality of 90% would mean that only 10% of embryos survive to birth, making human reproduction highly inefficient.

Since 1988, several studies on women trying to get pregnant have provided a more consistent picture. The earliest point at which pregnancy can be detected is one week after fertilisation when the embryo starts to implant into the womb of the mother. At this point the hormone hCG, which is used in regular pregnancy tests, becomes detectable. Among implanting embryos, about one in five fail very soon and the woman will have a period at about the expected time, never suspecting that she conceived. Once a period is missed and pregnancy confirmed, about 10-15% will be lost before live birth, mostly within the first few months. In total, once implantation starts, about two thirds of embryos survive to birth. The number of embryos that survive and die before implantation remains unknown.
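As a back-of-envelope check (this arithmetic is mine, not the study’s), the survival rates quoted above combine as follows:

```python
# ~1 in 5 implanting embryos fail very soon after implantation begins:
survive_early_implantation = 1 - 1 / 5    # 0.80
# ~10-15% of confirmed pregnancies are lost before birth; take the midpoint:
survive_confirmed_to_birth = 1 - 0.125    # 0.875
# Combined survival from the start of implantation to live birth:
overall = survive_early_implantation * survive_confirmed_to_birth
print(round(overall, 2))  # 0.7 - i.e. about two thirds, as stated
```

Any uncertainty before implantation sits outside this calculation, which is exactly the gap the 60-90% first-week estimate tries to bracket.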

Modern reproductive technologies have enabled fertilisation to be observed directly in the laboratory. Poor survival of in vitro embryos may have contributed to the pessimistic view about natural human embryo survival, says Dr Jarvis.

“Fertilising human eggs and culturing human embryos in the laboratory is not easy. A large proportion of eggs fertilised in vitro do not develop properly even for a week. Of those that do and are transferred into women undergoing IVF treatment, most do not become a new-born baby.”

This failure of in vitro embryos may reflect the natural situation. Alternatively, the artificial environment of reproductive treatments may contribute to the high failure rate of IVF embryos. Dr Jarvis’s re-analysis of the data suggests that the latter is the case.

“It’s impossible to give a precise figure for how many embryos survive in the first week but in normal healthy women, it probably lies somewhere between 60-90%. This wide range reflects the lack of relevant data. Although we can’t be precise, we can avoid exaggeration, and from reviewing the studies that do exist, it is clear that many more survive than is often claimed,” concludes Dr Jarvis.

Reference
Gavin E Jarvis. ‘Early embryo mortality in natural human reproduction: What the data say.’ F1000Research; DATE; DOI: 10.12688/f1000research.8937.2

Gavin E Jarvis. ‘Estimating limits for natural human embryo mortality.’ F1000Research; DATE; DOI: 10.12688/f1000research.9479.2


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

The Longing of Belonging: African Photography on Show at MAA

The Longing of Belonging: African photography on show at MAA

A photography exhibition capturing the black South African Zionist community – the most popular religious denomination in the country – opens at the Museum of Archaeology and Anthropology (MAA) today.

What’s important is the sense of intimacy between me and the church.

Sabelo Mlangeni

Kholwa: The Longing of Belonging showcases the work of South African photographer Sabelo Mlangeni who dreamt up the exhibition during conversations with Joel Cabrita, a researcher from Cambridge’s Faculty of Divinity, who is researching the history of Zionism in South Africa. ‘Kholwa’ means ‘belief’ in isiZulu, one of the most widely spoken languages in South Africa.

Approximately 30 per cent of all South Africans are members of a Zionist church. Zionism (unrelated to Jewish Zionism) is the country’s largest popular religious movement, but it began life as a 20th-century Protestant faith-healing movement, originating in the small town of Zion (pop. 24,000), Illinois, in the largely white Midwest of the USA.

Cabrita’s work charts the dramatic 20th-century shift and expansion of Christianity, and seeks to explain how Zionism travelled across the Atlantic Ocean to become one of the most important influences in black communities more than 8,000 miles away. With approximately 15 million members, it is the largest Christian group in the region.

Mlangeni is a member of the Zionist church, and his grounding in the religion can be traced in the intimate and personal portraits of church members going on display in Cambridge. He and Cabrita are interested in examining what is at stake when a photographer turns his camera on a religious community he is part of.

Mlangeni said: “The biggest question for me is being part of the community, part of the church. How can I point out other people as being ‘amakholwa’ (‘the believers’) when that is what I myself am? This is a body of work that doesn’t ‘look’ at the Zionist church. It is very important for me to emphasise this, I am not interested in exotifying the church.

“I want to look at people gathering beyond church, and the strong spiritual relationships, which also include me. A long time before even studying photography, I made a lot of work about the church and church members. So my camera was in the church for a long time, church people knew me with a camera. When I look at this work, what’s important is the sense of intimacy between me and the church.”

“For me the most important part of meeting with Joel Cabrita is that it brought something new to me, an understanding of where the Zionists came from, what their beginnings were, where the church was really born [in the USA].”

Some of Mlangeni’s images portray the umlindelo amakholwa (the night vigil of believers). This all-night service forms the cornerstone of Zionist worship across South Africa.

The service sees the entire community gather through the night in longing expectation for the spirit to descend, whether ancestral spirits or the Christian God. Song, prayer, sermons and dance carry the believers through until morning. Umlindelo amakholwa is the occasion when bonds of solidarity and community are cemented between those who spend the night in expectant waiting. As dawn breaks, the believers make their way home, while some head to a full day of work.

“Zionism was founded in the American Midwest in the 1890s and spread to South Africa in 1904 via missionaries and the circulation of faith-healing literature,” said Cabrita. “From the small town of Wakkerstroom, near the village of Driefontein where Sabelo grew up, Zionism spread across the region with migrant labourers returning from Johannesburg’s gold mines. Today, Zionism has adapted to African understandings of the world, with few traces of its North American roots. Southern African Zionists remain committed to the power of prayer to heal bodily illness, just as their American forebears were.”

Working mainly in black and white, Mlangeni’s photographs focus on capturing the intimate, everyday moments of communities in contemporary South Africa. His work includes ‘Big City’ (2002 to 2015) which focuses on Johannesburg’s history, and ‘Country Girls’ (shot between 2003 and 2009), which focuses on gay communities in rural South Africa, especially in the area of Driefontein, his own village in the province of Mpumalanga.

As a childhood friend of many of his subjects, Mlangeni has been able to create photographs from a perspective of unique understanding and membership of the community he is portraying. Throughout his work, Mlangeni avoids ‘othering’ or ‘exoticising’ his subjects, and instead attempts to show the multi-faceted, intimate reality of these individuals’ daily lives. While many of them face discrimination due to their sexual identities, or are living in precarious socio-economic situations, Mlangeni’s work does not cast his subjects as ‘victims’ but rather portrays their resilience, joyfulness and dignity as ordinary people.

“His photography continually erases and removes the boundaries between observer and subject,” added Cabrita. “Mlangeni is portraying his own belief as much as he is exploring the spiritual commitments of his photographic subjects.

“They chart his own journey towards belonging, and longing for belonging, within the Zionist community, a journey that has been mediated through a photographer’s lens. While some photographs reveal open, friendly gazes, others confront us with turned backs, inscrutable silhouettes and hidden figures buried deep in pictures, hinting at anonymity, inaccessibility and profound longing.”

Kholwa: The Longing of Belonging – which runs from June 13 to September 10 – is free to the public. Visit www.maa.cam.ac.uk for further details and opening times.


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Similarities in Human and Pig Embryos Provide Clues to Early Stages of Development


source: www.cam.ac.uk

Scientists have shown how the precursors of egg and sperm cells – the cells that are key to the preservation of a species – arise in the early embryo by studying pig embryos alongside human stem cells.

The remarkable similarities between human and pig development suggest that we may soon be able to reveal the answers to some of our long-held questions

Toshihiro Kobayashi

In research published today in Nature, researchers at the University of Cambridge and the University of Nottingham demonstrate how pig embryos and human embryonic cells show remarkable similarities in the early stages of their development. By combining these two models, they hope to improve our understanding of the origins of diseases such as paediatric germ cell tumours and fetal abnormalities.

Primordial germ cells, the precursors of eggs and sperm, are among the earliest cells to emerge in human embryos after implantation, appearing around day 17, while the surrounding cells go on to form the rest of the human body. However, little is understood about how they originate. Currently, the law prohibits culture of human embryos beyond 14 days, which prevents investigations on this and subsequent events such as gastrulation, when the overall body plan is established.

Now, researchers have used a combination of human and pig models of development to shed light on these events. They have shown for the first time that the interplay between two key genes is critical for the formation of the germline precursors and that this ‘genetic cocktail’ is not the same in all species.

First, by using human pluripotent embryonic stem cells in vitro, scientists led by Professor Azim Surani at the Wellcome Trust/Cancer Research UK Gurdon Institute established a model that simulates genetic and cellular changes occurring up to gastrulation. Human pluripotent embryonic stem cells are ‘master cells’ found in embryos, which have the potential to become almost any type of cell in the body.

As these stem cells can be multiplied and precisely genetically manipulated, the model system provides a powerful tool for detailed molecular analysis of how human cells transform into distinct cell types during early development, and which changes might underlie human diseases.

The work shows that when an embryo progresses towards gastrulation, cells temporarily acquire the potential to form primordial germ cells, but shortly afterwards lose this potential and instead acquire the potential to form precursors of blood and muscle (mesoderm) or precursors of the gut, lung and the pancreas (endoderm). The model also tells us that while the genes SOX17 and BLIMP1 are critical for germ cell fate, SOX17 subsequently has another role in the specification of endodermal tissues.

For an accurate picture of how the embryo develops, however, it is necessary to understand how cells behave in the three-dimensional context of a normal embryo. This cannot be achieved by studies on the most commonly used mouse embryos, which develop as egg ‘cylinders’, unlike the ‘flat-disc’ human embryos. Pig embryos, on the other hand, develop as flat discs (similar to human embryos), can be easily obtained, and are ethically more acceptable than working with non-human primate (monkey) embryos.

Researchers from the University of Nottingham dissected whole flat discs from pig embryos at different developmental stages and found that the development of these embryos matches observations from the in vitro human model, as well as from in vitro non-human primate embryonic stem cells. For example, pig germ cells emerge in the course of gastrulation just as predicted from the human model, and with the expression of the same key genes as in human germ cells. Human and pig germ cells also exhibit key characteristics of this lineage, including initiation of reprogramming and re-setting of the epigenome – modifications to our DNA that regulate its operations and have the potential to be passed down to our offspring – which continues as germ cells progress towards development into sperm and eggs.

The combined human-pig models for early development and cell fate decisions likely reflect critical events in early human embryos in the womb. Altogether, knowledge gained from this approach can be applied to regenerative medicine for the derivation of relevant human cell types that might be used to help understand and treat human diseases, and to understand how mutations that perturb early development can result in human diseases.

Dr Ramiro Alberio, from the School of Biosciences at the University of Nottingham, says: “We’ve shown how precursors to egg and sperm cells arise in pigs and humans, which have similar patterns of embryo development. This suggests that the pig can be an excellent model system for the study of early human development, as well as improving our understanding of the origins of genetic diseases.”

Dr Toshihiro Kobayashi in the Surani lab at the Gurdon Institute, adds: “We are currently prevented from studying human embryo development beyond day 14, which means that certain key stages in our development remain a mystery. The remarkable similarities between human and pig development suggest that we may soon be able to reveal the answers to some of our long-held questions.”

The research was supported by Wellcome.

Reference
Kobayashi, T et al. Principles of early human development and germ cell program from conserved model systems. Nature; 7 June 2017; DOI: 10.1038/nature22812

 


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Opinion: Remainer or Re-Leaver? The Philosophical Conundrum Posed By Brexit


source: www.cam.ac.uk

A recent YouGov survey suggests there is increasing agreement that ‘Brexit means Brexit’. However, Alfred Moore from the Conspiracy and Democracy Project suspects support is “broad but shallow”, and forcing people to change their minds about Brexit poses a danger to democracy.

It is vital to keep alive the arguments that lost the day because in a democracy you always get to fight another one.

Alfred Moore

If you only glanced at a recent YouGov survey, you might think that a large majority of the UK is in agreement about Brexit. The electorate may have divided pretty evenly in the referendum, but now the 45% of “hard leavers” are joined by 23% who “voted to remain but still think the government has a duty to bring the UK out of the EU”.

One reading of this poll is that the country is now uniting behind Brexit. As YouGov headlined its report: “Forget 52%. The rise of the ‘re-leavers’ mean the pro-Brexit electorate is 68%.”

But to conclude that the country is uniting would be shallow, and for prime minister Theresa May, at least, dangerous.

Most people now accept Brexit, but that doesn’t mean they believe in it. Re-leavers are addressing a genuine philosophical problem: should you change your beliefs when you find yourself in the minority?

Jean-Jacques Rousseau once wrote:

When a law is proposed in the people’s assembly, what is asked of them is not precisely whether they approve or reject, but whether or not it conforms to the general will that is theirs. Each man [sic], in giving his vote, states his opinion on this matter, and the declaration of the general will is drawn from the counting of votes. When, therefore, the opinion contrary to mine prevails, this proves merely that I was in error, and that what I took to be the general will was not so.

Put to one side the fact that Rousseau thought citizens should reflect in solitude on what was best for the country and that they should not discuss their views before voting.

Rousseau’s point was that the result, when it came, revealed the true will of the people. If you find yourself in the minority, it means you were wrong. Brexit, one might conclude, was the correct choice. The 48% were simply in error.

A different view is associated with the liberal tradition. Being in the minority says nothing about “right” and “wrong”. It announces simply that you lost. Nothing more, nothing less.

This is an important distinction. If being in the minority means you were wrong, then presumably you wouldn’t be crazy to change your mind. After all, if we assume that everybody is equal in their ability to judge these questions, then the majority is more likely to be right.

But if being in the minority simply means that you lost, then perhaps it’s important that you don’t change your mind, that you don’t stop arguing the issue, and that you don’t stop using all the constitutional means at your disposal to press your case. It is vital to keep alive the arguments that lost the day because in a democracy you always get to fight another one.

Keeping alive those arguments is often difficult. There is always pressure on those who lost to admit they were wrong, to pretend they’ve changed their minds, or at least to shut up. The famous phrase “tyranny of the majority” was never just about protecting minority rights; it was about recognising the force of majority opinion.

To suggest that the UK is uniting around Brexit, then, is a danger to democracy itself. That danger comes from pressure on the losers to actually change their minds. Worryingly, this now seems to be May’s position. As she said in a campaign speech near Middlesbrough:

You can only deliver Brexit if you believe in Brexit.

“I’m not a ‘re-leaver’”, she seemed to be saying. “I’m now a true believer, and you should be too.”

The other danger is to May. If she thinks the country is really uniting around Brexit, then she could do worse than talk to the street musician interviewed by the Financial Times a few weeks ago: “I don’t think the referendum will be overturned. People seem to think of it as ‘the people’s vote’ and to overturn it would in some way be seen to be undemocratic. People who voted Remain are powerless at the moment.”

He’s right. Those who voted to stay in the EU lost and are, at the moment, powerless. However, politics can change pretty quickly. Support for going ahead with Brexit is broad but shallow. If the economy starts getting worse, the true believers may march on undaunted, eyes fixed firmly on the horizon, but the re-leavers may find their doubts coming back to the surface.

The more salient number in the survey might turn out to be the true believers, who say they will stick with Brexit whatever the consequences: and that’s only 45%.

This article was originally published on The Conversation


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Genes Influence Ability to Read a Person’s Mind From Their Eyes


source: www.cam.ac.uk

Our DNA influences our ability to read a person’s thoughts and emotions from looking at their eyes, suggests a new study published in the journal Molecular Psychiatry.

This is the first study to attempt to correlate performance on the Eye Test with variation in the human genome

Varun Warrier

Twenty years ago, a team of scientists at the University of Cambridge developed a test of ‘cognitive empathy’ called the ‘Reading the Mind in the Eyes’ Test (or the Eyes Test, for short). This revealed that people can rapidly interpret what another person is thinking or feeling from looking at their eyes alone. It also showed that some of us are better at this than others, and that women on average score better on this test than men.

Now, the same team, working with the genetics company 23andMe along with scientists from France, Australia and the Netherlands, report results from a new study of performance on this test in 89,000 people across the world. The majority of these were 23andMe customers who consented to participate in research. The results confirmed that women on average do indeed score better on this test.

More importantly, the team confirmed that our genes influence performance on the Eyes Test, and went further to identify genetic variants on chromosome 3 in women that are associated with their ability to “read the mind in the eyes”.

The study was led by Varun Warrier, a Cambridge PhD student, and Professors Simon Baron-Cohen, Director of the Autism Research Centre at the University of Cambridge, and Thomas Bourgeron, of the University Paris Diderot and the Institut Pasteur.

Interestingly, performance on the Eyes Test in males was not associated with genes in this particular region of chromosome 3. The team also found the same pattern of results in an independent cohort of almost 1,500 people who were part of the Brisbane Longitudinal Twin Study, suggesting the genetic association in females is a reliable finding.

The closest genes in this tiny stretch of chromosome 3 include LRRN1 (Leucine Rich Neuronal 1) which is highly active in a part of the human brain called the striatum, and which has been shown using brain scanning to play a role in cognitive empathy. Consistent with this, genetic variants that contribute to higher scores on the Eyes Test also increase the volume of the striatum in humans, a finding that needs to be investigated further.

Previous studies have found that people with autism and anorexia tend to score lower on the Eyes Test. The team found that genetic variants that contribute to higher scores on the Eyes Test also increase the risk for anorexia, but not autism. They speculate that this may be because autism involves both social and non-social traits, and this test only measures a social trait.

Varun Warrier says: “This is the largest ever study of this test of cognitive empathy in the world. This is also the first study to attempt to correlate performance on this test with variation in the human genome. This is an important step forward for the field of social neuroscience and adds one more piece to the puzzle of what may cause variation in cognitive empathy.”

Professor Bourgeron adds: “This new study demonstrates that empathy is partly genetic, but we should not lose sight of other important social factors such as early upbringing and postnatal experience.”

Professor Baron-Cohen says: “We are excited by this new discovery, and are now testing if the results replicate, and exploring precisely what these genetic variants do in the brain, to give rise to individual differences in cognitive empathy. This new study takes us one step closer in understanding such variation in the population.”

Reference
Warrier, V et al. Genome-wide meta-analysis of cognitive empathy: heritability, and correlates with sex, neuropsychiatric conditions and cognition. Molecular Psychiatry; 6 June 2017; DOI: 10.1038/MP.2017.122


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Pilot Programme Encourages Researchers To Share The Code Behind Their Work


source: www.cam.ac.uk

New project, partly designed by a University of Cambridge researcher, aims to improve transparency in science by sharing ‘how the sausage is made’.

Having the code means that others have a better chance of replicating your work.

Stephen Eglen

A new pilot project, designed by a Cambridge researcher and supported by the Nature family of journals, will evaluate the value of sharing the code behind published research.

For years, scientists have discussed whether and how to share data from painstaking research and costly experiments. Some are further along in their efforts toward ‘open science’ than others: fields such as astronomy and oceanography, for example, involve such expensive and large-scale equipment and logistical challenges to data collection that collaboration among institutions has become the norm.

In recent years, academic journals, including several Nature journals, have turned their attention to another aspect of the research process: computer programming code. Code is becoming increasingly important in research because scientists often write their own computer programs to interpret their data, rather than using commercial software packages. Some journals now include scientific data and code as part of the peer-review process.

Now, in a commentary published in the journal Nature Neuroscience, a group of researchers from the UK, Europe and the United States have argued that the sharing of code should be part of the peer-review process. In a separate editorial, the journal has announced a pilot project to ask future authors to make their code available for review.

Code is an important part of the research process, and often the only definitive account of how data were processed. “Methods are now so complex that they are difficult to describe concisely in the limited ‘methods’ section of a paper,” said Dr Stephen Eglen from Cambridge’s Department of Applied Mathematics and Theoretical Physics, and the paper’s lead author. “And having the code means that others have a better chance of replicating your work, and so should add confidence.”

Making the programs behind the research accessible allows other scientists to test the code and reproduce the computations in an experiment — in other words, to reproduce results and solidify findings. It’s the “how the sausage is made” part of research, said co-author Ben Marwick, from the University of Washington. It also allows the code to be used by other researchers in new studies, making it easier for scientists to build on the work of their colleagues.

“What we’re missing is the convention of sharing code or the tools for turning data into useful discoveries or information,” said Marwick. “Researchers say it’s great to have the data available in a paper — increasingly raw data are available in supplementary files or specialised online repositories — but the code for performing the clever analyses in between the raw data and the published figures and tables are still inaccessible.”

Other Nature Research journals, such as Nature Methods and Nature Biotechnology, provide for code review as part of the article evaluation process. Since 2014, the company has encouraged authors to make their code available upon request.

The Nature Neuroscience pilot focuses on three elements: whether the code supporting an author’s main claims is publicly accessible; whether the code functions without mistakes; and whether it produces the results cited. For now the pilot is opt-in for authors, but code review may in future become mandatory, with papers accepted only once their code has passed review.
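In miniature, the three checks the pilot describes might look like the sketch below: a public script that runs without errors and reproduces the number reported in a paper. Everything here – the function name, the data and the reported value – is invented for illustration; it is not code from the pilot or from any published study.

```python
def mean_response(measurements):
    """The analysis step a (hypothetical) paper might report: here, a simple mean."""
    return sum(measurements) / len(measurements)

# Raw data as it might appear in a supplementary file (invented values).
raw_data = [2.1, 1.9, 2.0, 2.2, 1.8]

# The figure cited in the hypothetical paper's results section.
REPORTED_VALUE = 2.0

result = mean_response(raw_data)

# A reviewer re-running the public script can confirm it reproduces
# the published figure, which is exactly what the pilot asks for.
assert abs(result - REPORTED_VALUE) < 1e-9
print(f"Reproduced reported value: {result}")
```

The point is not the analysis itself but the workflow: because the script and data are shared together, anyone can repeat the computation that links the raw data to the published number.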

“This extra step in the peer review process is to encourage ‘replication’ of results, and therefore help reduce the ‘replication crisis’,” said Eglen. “It also means that readers can understand more fully what authors have done.”

An open-science approach to sharing code is not without its critics, including scientists who raise legal and ethical questions about the repercussions. How do researchers get proper credit for the code they share? How should code be cited in the scholarly literature? How will it count toward tenure and promotion applications? How is sharing code compatible with patents and commercialisation of software technology?

“We hope that when people do not share code it might be seen as ‘having something to hide,’ although people may regard the code as ‘theirs’ and their IP, rather than something to be shared,” said Eglen. “Nowadays, we believe the final paper is the ultimate representation of a piece of research, but actually the final paper is just an advert for the scholarship, which here is the computer code to solve a particular task. By sharing the code, we actually get the most useful part of the scholarship, rather than the paper, which is just the author’s ‘gloss’ on the work they have done.”

Adapted from a University of Washington press release


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.