
Clean Energy: Experts Outline How Governments Can Successfully Invest Before It’s Too Late


source: www.cam.ac.uk

Researchers distil twenty years of lessons from clean energy funding into six ‘guiding principles’. They argue that governments must eschew constant reinventions and grant scientists greater influence before our “window of opportunity” to avert climate change closes.

We urgently need to take stock of policy initiatives around the world that aim to accelerate new energy technologies

Laura Diaz Anadon

Governments need to give technical experts more autonomy and hold their nerve to provide more long-term stability when investing in clean energy, argue researchers in climate change and innovation policy in a new paper published today.

Writing in the journal Nature, the authors from UK and US institutions have set out guidelines for investment in world-changing energy innovation based on an analysis of the last twenty years of “what works” in clean energy research programmes.

Their six simple “guiding principles” also include the need to channel innovation into the private sector through formal tech transfer programmes, and to think in terms of lasting knowledge creation rather than ‘quick win’ potential when funding new projects.

The authors offer a stark warning to governments and policymakers: learn from and build on experience before time runs out, rather than constantly reinventing aims and processes for the sake of political vanity.

“As the window of opportunity to avert dangerous climate change narrows, we urgently need to take stock of policy initiatives around the world that aim to accelerate new energy technologies and stem greenhouse gas emissions,” said Laura Diaz Anadon, Professor of Climate Change Policy at the University of Cambridge.

“If we don’t build on the lessons from previous policy successes and failures to understand what works and why, we risk wasting time and money in a way that we simply can’t afford,” said Anadon, who authored the new paper with colleagues from the Harvard Kennedy School, as well as the University of Minnesota’s Prof Gabriel Chan.

Public investments in energy research have risen since the lows of the mid-1990s and early 2000s. OECD members spent US$16.6 billion on new energy research and development (R&D) in 2016, compared with US$10 billion in 2010. The EU and other nations pledged to double clean energy investment as part of the 2015 Paris climate agreement.

Recently, the UK government set out its own Clean Growth Strategy, committing £2.5 billion between 2015 and 2021, with hundreds of millions to be invested in new generations of small nuclear power stations and offshore wind turbines.

However, Anadon and colleagues point out that government funding for energy innovation has, in many cases, been highly volatile in the recent past: with political shifts resulting in huge budget fluctuations and process reinventions in the UK and US.

For example, the research team found that every single year between 1990 and 2017, one in five technology areas funded by the US Department of Energy (DoE) saw a budget shift of more than 30%. The Trump administration’s current plan is to slash 2018’s energy R&D budget by 35% across the board.

In the UK, every Prime Minister since 2000 has created new institutions to manage energy innovation and bridge the public and private sectors: Blair’s UK Carbon Trust; Brown’s Energy Technologies Institute; Cameron’s Catapults; and May’s Faraday Challenge, part of the latest Industrial Strategy.

“Experimentation has benefits, but also costs,” said Anadon. “Researchers are having to relearn new processes, people and programmes with every political transition – wasting time and effort for scientists, companies and policymakers.”

“Rather than repeated overhauls, existing programmes should be continuously evaluated and updated. New programmes should only be set up if they fill needs not currently met.”

More autonomy for project selection should be passed to active scientists, who are “best placed to spot bold but risky opportunities that managers miss,” say the authors of the new paper.

They point to projects instigated by the US National Labs producing more commercially-viable technologies than those dictated by DoE headquarters – despite the Labs holding a mere 4% of the DoE’s overall budget.

The six evidence-based guiding principles for clean energy investment are:

  • Give researchers and technical experts more autonomy and influence over funding decisions.
  • Build technology transfer into research organisations.
  • Focus demonstration projects on learning.
  • Incentivise international collaboration.
  • Adopt an adaptive learning strategy.
  • Keep funding stable and predictable.

From US researchers using the pace of Chinese construction markets to test energy reduction technologies, to the UK government harnessing behavioural psychology to promote energy efficiency, the authors highlight just a few examples of government investment that helped create or improve clean energy initiatives across the world.

“Let’s learn from experience on how to accelerate the transition to a cleaner, safer and more affordable energy system,” they write.



Pop-Up Mints and Coins Made From Prayers


source: www.cam.ac.uk

In the tumultuous upheaval of the English Civil War, Royalist castles under siege used ‘pop-up’ mints to make coins to pay their soldiers. A unique display at the Fitzwilliam Museum tells the centuries-old story of emergency currency made from gold, silver and compressed prayer books.

Emergency coins show how a micro-economy developed during times of siege.

Richard Kelleher

We’re used to the kind of circular coins that jangle in your pocket. But this one is lozenge-shaped and features a crude impression of a castle on its face. Its edges are sharp.

A silver shilling piece, it was made in 1648 during the bloody siege of Pontefract Castle. Today it’s one of 80 examples of currency on display at the Fitzwilliam Museum. The temporary exhibition – Currencies of Conflict – is thought to be the first dedicated exclusively to emergency money.

The focus is on coinage that reflects the turmoil of the English Civil War. But the exhibition also sets these coins within a wider context of 2,500 years of history and features some rarely shown items from the Fitzwilliam’s outstanding collection.

Between 1644 and 1649, the Royalist stronghold of Pontefract Castle was besieged three times by the Parliamentary forces led by Oliver Cromwell. Royalists loyal to King Charles I also held out at Carlisle, Newark and Scarborough Castles. All eventually fell to the Parliamentarians.

Examples of siege coinage from all four castles appear in the display. These coins were made by craftsmen working within the fortress walls, using metal obtained from melting down objects requisitioned from the occupants of the castle and town.

People, and especially soldiers, had to be paid to ensure their continued loyalty. “We don’t know how many emergency coins were made during these sieges but a contemporary journal entry from Carlisle suggests that £323 of shilling pieces were struck from requisitioned plate. They show how a micro-economy developed during times of siege,” said curator Richard Kelleher.

Although the quality and weight of the silver, and (rarely) gold, was generally good, the manufacture was often much less sophisticated. In temporary mints, pieces of metal were stamped with ‘dies’ of varied workmanship, from the crude designs at Carlisle to the accomplished work of the Newark engraver.

“In the emergency conditions of a siege, coins were sometimes diamond-shaped or hexagonal as these shapes were easier to cut to specific weights than conventionally minted coins which required the specialist machinery of the mint,” said Kelleher.

In the medieval period, numerous mints operated across England but by 1558 there was only one royal mint and it was in the Tower of London. During the Civil War, Charles I moved his court to Oxford, establishing a mint in the city. A stunning gold ‘triple unite’ (a coin worth £3 – one of the largest value coins ever minted) is an example of the fine workmanship of the Oxford mint.

On its face it shows a finely executed bust of the king holding a sword and olive branch, while the reverse carries the Oxford Declaration: “The Protestant religion, the laws of England, and the liberty of Parliament.” Another rare coin from Oxford is a silver pound coin weighing more than 120g showing the king riding a horse over the arms of his defeated enemies.

Also displayed is a silver medal, made during the short Protectorate headed by Oliver Cromwell. It commemorates the Battle of Dunbar of 1650 when Cromwell’s forces defeated an army loyal to Charles II. Its face shows the bust of Cromwell with battle scenes in the background, while the reverse shows the interior view of Parliament with the speaker sitting in the centre.

The earliest piece in the exhibition is an electrum coin dating from the 6th century BC. It originates from the kingdom of Lydia (western Turkey) and depicts a lion and a bull in combat. The earliest reference to coinage in the literature records a payment in coin by the Lydian king for a military purpose.

A Hungarian medal, commemorating the recapture of Budapest, provides a snapshot of a famous siege in progress. The walls are surrounded by cavalry and infantry complete with the machinery of siege warfare – artillery pieces – which have breached the walls.

This medal was also used as a vehicle for propaganda. The reverse carries the image of the Imperial eagle (representing the Habsburg Empire) defending its nest from an attacking dragon which represents the threat of the Ottoman Empire.

Much less elaborate are examples of coins made in circumstances when precious metals were in short supply. A 16th-century Dutch token is made from compressed prayer books and a piece from occupied Ghent in the First World War is made of card.

Extremely vulnerable to damp, these coins have survived against the odds; their endurance is little short of miraculous. During the siege of Leiden, the mayor requisitioned all metal, including coins, for the manufacture of weapons and ammunition. In return, citizens were given token coins made from hymnals, prayer books and bibles.

Bringing the narrative of currency and conflict into the 20th century are paper currencies of the Second World War. Britain and its American allies issued currency for liberated areas of Italy and France, and for occupied Germany.

The temporary exhibition Currencies of Conflict: siege and emergency money from antiquity to WWII continues at the Fitzwilliam Museum until 23 February 2018. Admission is free.

Inset images: England, Charles I (1625-49) silver shilling siege piece, 1645, Carlisle; England, Charles I (1625-49) gold triple unite, 1643, struck at Oxford; Commonwealth (1649-60), silver medal of 1650 commemorating the Battle of Dunbar; Lydia, Croesus (561-546 BC), gold stater, foreparts of bull and lion facing each other; Leopold I (1658-1705) silver medal, ‘Budapest defended 1686’ by GF Nurnberger; Netherlands, Leiden, paper siege piece of 5 stuivers, 1574; Germany, Allied Military Currency, 1 mark, 1944.



Sir Isaac Newton’s Cambridge Papers Added To UNESCO’s Memory Of The World Register


The Cambridge papers of Sir Isaac Newton, including early drafts and Newton’s annotated copies of Principia Mathematica – a work that changed the history of science – have been added to UNESCO’s International Memory of the World Register.

Newton’s papers are among the world’s most important collections in the western scientific tradition and are one of the Library’s most treasured collections.

Katrina Dean

Held at Cambridge University Library, Newton’s scientific and mathematical papers represent one of the most important archives of scientific and intellectual work on universal phenomena. They document the development of his thought on gravity, calculus and optics, and reveal ideas worked out through painstaking experiments, calculations, correspondence and revisions.

In combination with alchemical papers at King’s College, Cambridge, and notebooks and correspondence at Trinity College, Cambridge, and the Fitzwilliam Museum, they form the largest and most important collection of Newton’s papers worldwide.

Katrina Dean, Curator of Scientific Collections at Cambridge University Library said: “Newton’s papers are among the world’s most important collections in the western scientific tradition and are one of the Library’s most treasured collections. They were the first items to be digitised and added to the Cambridge Digital Library in 2011 and featured in our 600th anniversary exhibition Lines of Thought last year. In 2017, their addition to the UNESCO International Memory of the World Register recognises their unquestionable international importance.”

The Memory of the World Project is an international initiative to safeguard the documentary heritage of humanity against collective amnesia, neglect, the ravages of time and climatic conditions, and wilful and deliberate destruction. It calls for the preservation of valuable archival, library and private collections all over the world, as well as the reconstitution of dispersed or displaced documentary heritage, and the increased accessibility to and dissemination of these items.

Newton’s Cambridge papers, and those at the Royal Society, now join the archive of Winston Churchill, held at Cambridge University’s Churchill Archives Centre, on the UNESCO Register. They also join Newton’s theological and alchemical papers at the National Library of Israel, which were added in 2015.

The chief attractions in the Cambridge collection are Newton’s own copies of the first edition of the Principia (1687), covered with his corrections, revisions and additions for the second edition.

The Cambridge papers also include significant correspondence with natural philosophers and mathematicians, including Henry Oldenburg, Secretary of the Royal Society; Edmond Halley, the Astronomer Royal who persuaded Newton to publish Principia; Richard Bentley, the Master of Trinity College; and John Collins, mathematician and fellow of the Royal Society, who became an important collector of Newton’s works.

Added Dean: “One striking illustration of Newton’s experimental approach is in his ‘Laboratory Notebook’, which includes details of his investigations into light and optics in order to understand the nature of colour. His essay ‘Of Colours’ includes a diagram that illustrates the experiment in which he inserted a bodkin into his eye socket to put pressure on the eyeball to try to replicate the sensation of colour in normal sight.”

Another important item is Newton’s so-called ‘Waste Book’, a large notebook inherited from his stepfather. From 1664, he used the blank pages for optical and mathematical calculations and gradually mastered the analysis of curved lines, surfaces and solids. By 1665, he had invented the method of calculus. Newton later used the dated, documentary evidence provided by the Waste Book to argue his case in the priority dispute with Gottfried Wilhelm Leibniz over the invention of the calculus.

Cambridge University Librarian Jess Gardner said: “Newton’s work and life continue to attract wonder and new perspectives on our place in the Universe. Cambridge University Library will continue to work with scholars and curators worldwide to make Newton’s papers accessible now and for future generations.”

Isaac Newton entered Trinity College as an undergraduate in 1661 and became a Fellow in 1667. In 1669, he became Lucasian Professor of Mathematics at Cambridge University, a position he held until 1701.

Among the more personal items in the Cambridge collections is an undergraduate notebook recording Newton’s daily concerns, including his expenditure on white wine, wafers, shoe-strings and ‘a paire of stockings’, along with a guide to Latin pronunciation.

A notebook of 1662-1669 records Newton’s sins before and after Whitsunday of 1662, written in a coded shorthand and first deciphered between 1872 and 1888. Among them are ‘Eating an apple at Thy house’ and ‘Robbing my mothers box of plums and sugar’, along with the more serious ‘Wishing death and hoping it to some’, before a list of his expenses. These included chemicals, two furnaces and a recent edition of the Theatrum chemicum, one of the most comprehensive compilations of alchemical writings in the western tradition, edited by the publisher Lazarus Zetzner.

Cambridge University Library is also hosting a series of talks open to the public: Sarah Dry on Newton’s manuscripts (December 7) and Patricia Fara on Newton’s role in Enlightenment culture and polite society (December 14). For details and bookings, see: http://www.lib.cam.ac.uk/using-library/whats



£5.4 Million Centre Will Help Transform The UK’s Construction Sector For The Digital Age


source: www.cam.ac.uk

The Government has announced £5.4 million in funding to launch the Centre for Digital Built Britain at the University of Cambridge, which will help people make better use of cities by championing the digital revolution in the built environment. The Centre is part of a landmark government-led investment in growing the UK’s construction sector.

This is a wonderful opportunity to put the breadth of research and industry engagement expertise from Cambridge at the heart of Digital Built Britain.

Jennifer Schooling

The Centre is a partnership between the Department for Business, Energy & Industrial Strategy and the University to support the transformation of the construction sector using digital technologies to better plan, build, maintain and use infrastructure. It will focus on the ongoing transformation of the built environment through the digital tools, standards and processes that are collectively known as Building Information Modelling (BIM). BIM enables the people building and managing our transport networks, cities and major infrastructure projects to take advantage of advances in the digital world to intelligently deliver better services and end products for UK citizens.

Led by Professor Andy Neely, Pro-Vice-Chancellor: Enterprise and Business Relations, the Centre builds on the expertise and experience of faculty from the Cambridge Centre for Smart Infrastructure and Construction (CSIC), Cambridge Big Data, the Distributed Information and Automation Lab (DIAL), the Cambridge Service Alliance (CSA) and the Institute for Manufacturing. The Cambridge researchers will work with a team of specialists from the Digital Built Britain Programme and partners from industry and academia to develop and demonstrate policy and practical insights. These will enable the exploitation of new and emerging technologies, data and analytics to enhance the natural and built environment, driving up commercial competitiveness and productivity as well as citizens’ quality of life and well-being.

“The Centre for Digital Built Britain will work in partnership with Government and industry to improve the performance, productivity and safety of construction through the better use of digital technologies,” said Professor Neely.

“The achievement of the BIM Task Group in delivering the Level 2 BIM programme has provided both the UK and increasingly a worldwide platform for the digitisation of the construction and services sectors. We welcome the vast experience and capability Cambridge brings to the team and the creation of the Centre for Digital Built Britain,” said Dr Mark Bew MBE, Strategic Advisor to the Centre for Digital Built Britain.

“The construction and infrastructure sectors are poised for a digital revolution, and Britain is well placed to lead it. Over the next decade advances in BIM will combine with the Internet of Things (IoT), data analytics, data-driven manufacturing and the digital economy to enable us to plan new buildings and infrastructure more effectively, build them at lower cost, operate and maintain them more efficiently, and deliver better outcomes to the people who use them,” said Dr Jennifer Schooling, Director of the Centre for Smart Infrastructure and Construction. “This is a wonderful opportunity to put the breadth of research and industry engagement expertise from Cambridge at the heart of Digital Built Britain.”

The UK is leading the world in its support for BIM implementation in the construction sector through its commitment to the Digital Built Britain Programme. By embedding Level 2 BIM in government projects such as Crossrail, the programme contributed significantly to the Government’s £3 billion of efficiency savings between 2011 and 2015. Since 2016, all centrally funded UK projects have required Level 2 BIM, which has delivered considerable savings in construction procurement to date. Tasked with supporting innovation in the construction sector, the Construction Leadership Council has also put BIM at the heart of its sector strategy, Construction 2025, which commits to cutting built asset costs by 33 percent, and delivery time and carbon by 50 percent. The Centre will continue and build on this transformative approach.

The Centre for Digital Built Britain will be based in the Maxwell Centre in West Cambridge and will be formally launched in Spring 2018.



Prehistoric Women’s Manual Work Was Tougher Than Rowing In Today’s Elite Boat Crews


source: www.cam.ac.uk

The first study to compare ancient and living female bones shows that women from early agricultural eras had stronger arms than the rowers of Cambridge University’s famously competitive boat club. Researchers say the findings suggest a “hidden history” of gruelling manual labour performed by women that stretched across millennia.

By interpreting women’s bones in a female-specific context we can start to see how intensive, variable and laborious their behaviours were

Alison Macintosh

A new study comparing the bones of Central European women who lived during the first 6,000 years of farming with those of modern athletes has shown that the average prehistoric agricultural woman had stronger upper arms than living female rowing champions.

Researchers from the University of Cambridge’s Department of Archaeology say this physical prowess was likely obtained through tilling soil and harvesting crops by hand, as well as the grinding of grain for as much as five hours a day to make flour.

Until now, bioarchaeological investigations of past behaviour have interpreted women’s bones solely through direct comparison to those of men. However, male bones respond to strain in a more visibly dramatic way than female bones.

The Cambridge scientists say this has resulted in the systematic underestimation of the nature and scale of the physical demands borne by women in prehistory.

“This is the first study to actually compare prehistoric female bones to those of living women,” said Dr Alison Macintosh, lead author of the study published today in the journal Science Advances.

“By interpreting women’s bones in a female-specific context we can start to see how intensive, variable and laborious their behaviours were, hinting at a hidden history of women’s work over thousands of years.”

The study, part of the European Research Council-funded ADaPt (Adaptation, Dispersals and Phenotype) Project, used a small CT scanner in Cambridge’s PAVE laboratory to analyse the arm (humerus) and leg (tibia) bones of living women who engage in a range of physical activity: from runners, rowers and footballers to those with more sedentary lifestyles.

The bone strengths of modern women were compared to those of women from early Neolithic agricultural eras through to farming communities of the Middle Ages.

“It can be easy to forget that bone is a living tissue, one that responds to the rigours we put our bodies through. Physical impact and muscle activity both put strain on bone, called loading. The bone reacts by changing in shape, curvature, thickness and density over time to accommodate repeated strain,” said Macintosh.

“By analysing the bone characteristics of living people whose regular physical exertion is known, and comparing them to the characteristics of ancient bones, we can start to interpret the kinds of labour our ancestors were performing in prehistory.”

Over three weeks during trial season, Macintosh scanned the limb bones of the Open- and Lightweight squads of the Cambridge University Women’s Boat Club, who ended up winning this year’s Boat Race and breaking the course record. These women, most in their early twenties, were training twice a day and rowing an average of 120km a week at the time.

The Neolithic women analysed in the study (from 7400-7000 years ago) had similar leg bone strength to modern rowers, but their arm bones were 11-16% stronger for their size than the rowers, and almost 30% stronger than typical Cambridge students.

The loading of the upper limbs was even more dominant in the study’s Bronze Age women (from 4300-3500 years ago), who had 9-13% stronger arm bones than the rowers but 12% weaker leg bones.

A possible explanation for this fierce arm strength is the grinding of grain. “We can’t say specifically what behaviours were causing the bone loading we found. However, a major activity in early agriculture was converting grain into flour, and this was likely performed by women,” said Macintosh.

“For millennia, grain would have been ground by hand between two large stones called a saddle quern. In the few remaining societies that still use saddle querns, women grind grain for up to five hours a day.

“The repetitive arm action of grinding these stones together for hours may have loaded women’s arm bones in a similar way to the laborious back-and-forth motion of rowing.”

However, Macintosh suspects that women’s labour was hardly likely to have been limited to this one behaviour.

“Prior to the invention of the plough, subsistence farming involved manually planting, tilling and harvesting all crops,” said Macintosh. “Women were also likely to have been fetching food and water for domestic livestock, processing milk and meat, and converting hides and wool into textiles.

“The variation in bone loading found in prehistoric women suggests that a wide range of behaviours were occurring during early agriculture. In fact, we believe it may be the wide variety of women’s work that in part makes it so difficult to identify signatures of any one specific behaviour from their bones.”

Dr Jay Stock, senior study author and head of the ADaPt Project, added: “Our findings suggest that for thousands of years, the rigorous manual labour of women was a crucial driver of early farming economies. The research demonstrates what we can learn about the human past through better understanding of human variation today.”



Eye Contact With Your Baby Helps Synchronise Your Brainwaves


source: www.cam.ac.uk

Making eye contact with an infant makes adults’ and babies’ brainwaves ‘get in sync’ with each other – which is likely to support communication and learning – according to researchers at the University of Cambridge.

When the adult and infant are looking at each other, they are signalling their availability and intention to communicate with each other

Victoria Leong

When a parent and infant interact, various aspects of their behaviour can synchronise, including their gaze, emotions and heart rate, but little is known about whether their brain activity also synchronises – and what the consequences of this might be.

Brainwaves reflect the group-level activity of millions of neurons and are involved in information transfer between brain regions. Previous studies have shown that when two adults are talking to each other, communication is more successful if their brainwaves are in synchrony.

Researchers at the Baby-LINC Lab at the University of Cambridge carried out a study to explore whether infants can synchronise their brainwaves to adults too – and whether eye contact might influence this. Their results are published today in the Proceedings of the National Academy of Sciences (PNAS).

The team examined the brainwave patterns of 36 infants (17 in the first experiment and 19 in the second) using electroencephalography (EEG), which measures patterns of brain electrical activity via electrodes in a skull cap worn by the participants. They compared the infants’ brain activity to that of the adult who was singing nursery rhymes to the infant.

In the first of two experiments, the infant watched a video of an adult as she sang nursery rhymes. First, the adult – whose brainwave patterns had already been recorded – was looking directly at the infant. Then, she turned her head to avert her gaze, while still singing nursery rhymes. Finally, she turned her head away, but her eyes looked directly back at the infant.

As anticipated, the researchers found that infants’ brainwaves were more synchronised to the adult’s when the adult’s gaze met the infant’s, as compared to when her gaze was averted. Interestingly, the greatest synchronising effect occurred when the adult’s head was turned away but her eyes still looked directly at the infant. The researchers say this may be because such a gaze appears highly deliberate, and so provides a stronger signal to the infant that the adult intends to communicate with her.

In the second experiment, a real adult replaced the video. She either looked directly at the infant or averted her gaze while singing nursery rhymes. This time, however, her brainwaves could be monitored live to see whether her brainwave patterns were being influenced by the infant’s as well as the other way round.

This time, both infants and adults became more synchronised to each other’s brain activity when mutual eye contact was established. This occurred even though the adult could see the infant at all times, and infants were equally interested in looking at the adult even when she looked away. The researchers say that this shows that brainwave synchronisation isn’t just due to seeing a face or finding something interesting, but about sharing an intention to communicate.

To measure infants’ intention to communicate, the researchers measured how many ‘vocalisations’ infants made to the experimenter. As predicted, infants made a greater effort to communicate, making more ‘vocalisations’, when the adult made direct eye contact – and individual infants who made longer vocalisations also had higher brainwave synchrony with the adult.

Dr Victoria Leong, lead author on the study, said: “When the adult and infant are looking at each other, they are signalling their availability and intention to communicate with each other. We found that both adult and infant brains respond to a gaze signal by becoming more in sync with their partner. This mechanism could prepare parents and babies to communicate, by synchronising when to speak and when to listen, which would also make learning more effective.”

Dr Sam Wass, last author on the study, said: “We don’t know what it is, yet, that causes this synchronous brain activity. We’re certainly not claiming to have discovered telepathy! In this study, we were looking at whether infants can synchronise their brains to someone else, just as adults can. And we were also trying to figure out what gives rise to the synchrony.

“Our findings suggested eye gaze and vocalisations may both, somehow, play a role. But the brain synchrony we were observing was at such high time-scales – of three to nine oscillations per second – that we still need to figure out how exactly eye gaze and vocalisations create it.”
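For readers curious what ‘brainwave synchrony’ means operationally, here is a minimal sketch of one common way to quantify phase synchrony between two signals in the three-to-nine-cycles-per-second band. The article does not name the authors’ exact connectivity measure, so the phase-locking value used below is an illustrative assumption, not the study’s method.

    import numpy as np
    from scipy.signal import butter, filtfilt, hilbert

    def phase_locking_value(x, y, fs, band=(3.0, 9.0)):
        """Phase-locking value of two signals within a frequency band.

        Returns a number in [0, 1]: 1 means the instantaneous phase
        difference is constant (perfect locking), 0 means no locking.
        """
        # Band-pass both signals to the band of interest (here 3-9 Hz).
        b, a = butter(4, band, btype="band", fs=fs)
        xf, yf = filtfilt(b, a, x), filtfilt(b, a, y)
        # Instantaneous phases via the analytic (Hilbert) signal.
        dphi = np.angle(hilbert(xf)) - np.angle(hilbert(yf))
        return np.abs(np.mean(np.exp(1j * dphi)))

    # Toy check: two noisy 6 Hz oscillations with a fixed phase lag.
    rng = np.random.default_rng(0)
    fs = 250
    t = np.arange(0, 10, 1 / fs)
    x = np.sin(2 * np.pi * 6 * t) + 0.5 * rng.standard_normal(t.size)
    y = np.sin(2 * np.pi * 6 * t + 0.8) + 0.5 * rng.standard_normal(t.size)
    print(phase_locking_value(x, y, fs))   # close to 1

A constant phase lag, as in the toy example, still counts as perfect locking: what matters is that the phase difference stays steady over time, not that the signals are identical.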

This research was supported by an ESRC Transformative Research Grant to Dr Leong and Dr Wass.

Reference
Leong, V et al. Speaker gaze increases infant-adult connectivity. PNAS; 28 Nov 2017; DOI: 10.1101/108878


Researcher profile: Dr Victoria Leong

Dr Victoria Leong is an Affiliated Lecturer at Cambridge’s Department of Psychology, and also an Assistant Professor of Psychology at Nanyang Technological University, Singapore. Her research aims to understand how parents and infants communicate and learn from each other, and the brain mechanisms that help them to interact effectively as social partners.

“The Baby-LINC lab is designed to look like a home living room so that mothers and babies feel comfortable,” she says. In the lab, the team uses a wireless EEG system to measure infants’ brain activity, which means babies don’t have to be tethered to a computer and recordings can run for longer periods of time. “This is invaluable if the baby needs a nap or a nappy change in-between doing our tasks!”

Dr Leong says she is passionate about “real-world neuroscience”. In other words, “understanding and not ignoring the very real – and often very messy – human social contexts that infiltrate brain processes”. This means that in addition to world-class facilities and methods, the ability to collect robust data also depends on keeping the infants relaxed and happy. “Many a tantrum can be averted by the judicious and timely application of large soapy bubbles and rice cakes. The ability to blow large charming bubbles thereafter became a key criterion for recruiting research assistants!”

The research project came about “over a cup of tea [with Sam Wass] and a notepad to scratch out some frankly outlandish ideas about brain-to-brain synchrony”. They received £3,995 with the help of Cambridge Neuroscience and Cambridge Language Sciences for a pilot project and within a year went on to secure an ESRC Transformative Research Grant, which allowed them to significantly scale up research operations and to build the first mother-infant EEG hyperscanning facility in the UK (the Baby-LINC Lab).

“Cambridge is one of probably only a handful of highly-creative research environments in the world where young, untested post-doctoral researchers can organically come together, develop ambitious ideas, and have the support to try these out,” she says. “I am very proud of our humble beginnings, because they remind me that even a small handful of resources, wisely invested with hard work, can grow into world-class research.”



Going Underground: Cambridge Digs Into The History of Geology With Landmark Exhibition


source: www.cam.ac.uk

A box full of diamonds, volcanic rock from Mount Vesuvius, and the geology guide that Darwin packed for his epic voyage on the Beagle will go on display in Cambridge this week as part of the first major exhibition to celebrate geological map-making.

We show how for the first time people were encouraged to think about the secretive world beneath their feet.

Allison Ksiazkiewicz

Uncovering how the ground beneath our feet was mapped for the first time – and revealing some of the controversies and tragedies geology brought to the surface of intellectual debate – Landscapes Below opens to the public on Friday, November 24, at Cambridge University Library.

The exhibition features the biggest object ever to go on display at the Library (1.9m x 1.6m): George Bellas Greenough’s 1819 A Geological Map of England and Wales, the first map produced by the Geological Society of London, alongside a visually stunning collection of maps from the earliest days of geology. It explores how these new subterranean visions of the British landscape influenced our understanding of the Earth. All the maps belonging to the Library are going on display for the first time.

“I think the maps are beautiful objects, tell fascinating stories and frame geology in a new light,” said exhibition curator Allison Ksiazkiewicz. “This was a new take on nature and a new way of thinking about the landscape for those interested in nature.

“We show how the early pioneers of this new science wrestled with the ideas of a visual vocabulary – and how for the first time people were encouraged to think about the secretive world beneath their feet.”

As well as maps, Landscapes Below also brings together an extraordinary array of fossils and artworks, and a collection of 154 diamonds on loan from the Sedgwick Museum of Earth Sciences. Displayed together for the first time, the diamonds were collected and arranged by Jacques Louis, Comte de Bournon, who later became Keeper of the Royal Mineral Collection for King Louis XVIII.

Another important exhibit on display for the first time is the first edition of Georges Cuvier and Alexandre Brongniart’s Researches on the Fossil Bones of Quadrupeds (1811), on loan from Trinity College. It examined the geology of the Paris Basin and revolutionised what was considered ‘young’ in geological terms.

Artists were also keen to depict the geological landscape accurately. After surviving Captain Cook’s ill-fated third voyage of discovery, the artist John Webber returned to England and travelled around the country painting landscapes and geological formations, as seen in Landscape of Rocks in Derbyshire. Christopher Packe’s A New Philosophico-Chorographical Chart of East-Kent (1743), on loan from the Geological Society of London, is a remarkable, engraved map that draws on early modern medicine in the interpretation of the surrounding landscape.

“The objects we’re putting on display show the many different applications of geological knowledge,” added Ksiazkiewicz. “Whether it’s a map showing the coal fields of Lancashire in the 1830s – or revealing how this new science was used for economic and military reasons.”

In many ways, the landscapes the earliest geologists worked among became battlegrounds as a scientific old guard – loyal to the established pursuits of mineralogy and chemistry – opposed a new generation of scientists intent on using the fossil record in the study of the Earth’s age and formation.

Exhibitions Officer Chris Burgess said: “Maps were central to the development of geology but disagreement between its leading figures was common. Maps of the period did not just show new knowledge but represented visible arguments about how that knowledge should be recorded.”

The exhibition also includes objects from figures with rather tragic histories, including William Smith, whose famous 1815 Geological Map of England has been described as the ‘Magna Carta of geology’. Despite publishing the world’s first geological map (which is still used as the basis of such maps today), Smith was shunned by the scientific community for many years, went bankrupt, and ended up in debtors’ prison.

John MacCulloch, who produced the Geological Map of Scotland, did not live to see his work published: his honeymoon carriage overturned and killed him at the age of 61. He had spent 15 summers surveying Scotland, after convincing the Board of Ordnance to sponsor the project. There was some dispute about how MacCulloch calculated his mileage and spent the funds, and the Ordnance paid for only six summers’ worth of work; five were paid for by the Treasury and four from his own pocket.

Added Ksiazkiewicz: “Not only do these maps and objects represent years of work by individuals looking to develop a new science of the Earth, they stir the imagination. You can imagine yourself walking across the landscape and absorbing all that comes with it – views, antiquities, fossils, and vegetation. And weather, there’s always weather.”

Landscapes Below runs from November 25, 2017 to March 29, 2018 at Cambridge University Library’s Milstein Exhibition Centre. Admission is free. Opening times are Mon-Fri 9am-6pm and Saturday 9am-4.30pm. Closed Sundays.



How To Cut Your Lawn For Grasshoppers


source: www.cam.ac.uk

Picture a grasshopper landing randomly on a lawn of fixed area. If it then jumps a certain distance in a random direction, what shape should the lawn be to maximise the chance that the grasshopper stays on the lawn after jumping?

The grasshopper problem is a rather nice one, as it helps us try out techniques for the physics problems we really want to get to.

Adrian Kent

One could be forgiven for wondering what the point of such a question might be. But the solution, proposed by theoretical physicists in the UK and the US, has some intriguing connections to quantum theory, which describes the behaviour of particles at the atomic and sub-atomic scales. Systems based on the principles of quantum theory could lead to a revolution in computing, financial trading, and many other fields.

The researchers, from the University of Cambridge and the University of Massachusetts Amherst, used computational methods inspired by the way metals are strengthened by heating and cooling to solve the problem and find the ‘optimal’ lawn shape for different grasshopper jump distances. Their results are reported in the journal Proceedings of the Royal Society A.

For the mathematically inclined gardeners out there, the optimal lawn shape changes depending on the distance of the jump. Counter-intuitively, a circular lawn is never optimal, and instead, more complex shapes, from cogwheels to fans to stripes, are best at retaining hypothetical grasshoppers. Interestingly, the shapes bear a resemblance to shapes seen in nature, including the contours of flowers, the patterns in seashells and the stripes on some animals.
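To make the quantity being optimised concrete: the retention probability of any candidate lawn can be estimated by straightforward Monte Carlo sampling. The sketch below is illustrative only (the membership test, sample count and unit-area disc baseline are assumptions for the example, not the authors’ code):

    import numpy as np

    rng = np.random.default_rng(0)

    def retention_probability(inside, d, n=1_000_000, box=1.0):
        """Monte Carlo estimate of the chance the grasshopper stays on the lawn.

        `inside(x, y)` is a vectorised membership test for the lawn and `d`
        is the jump distance. Landing points are drawn uniformly from the
        lawn by rejection sampling inside the square [-box, box]^2.
        """
        x = rng.uniform(-box, box, n)
        y = rng.uniform(-box, box, n)
        keep = inside(x, y)                          # uniform points on the lawn
        x, y = x[keep], y[keep]
        theta = rng.uniform(0.0, 2 * np.pi, x.size)  # random jump direction
        return inside(x + d * np.cos(theta), y + d * np.sin(theta)).mean()

    # Baseline: a disc of unit area (radius 1/sqrt(pi)) - never the optimal shape.
    disc = lambda x, y: x**2 + y**2 <= 1 / np.pi
    print(retention_probability(disc, d=0.3))        # roughly 0.67

Any of the cogwheel, fan or stripe shapes the authors report can be plugged in as a different `inside` function and compared against the disc on equal area.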

“The grasshopper problem is a rather nice one, as it helps us try out techniques for the physics problems we really want to get to,” said paper co-author Professor Adrian Kent, of Cambridge’s Department of Applied Mathematics and Theoretical Physics. Kent’s primary area of research is quantum physics, and his co-author Dr Olga Goulko works in computational physics.

To find the best lawn, Goulko and Kent had to convert the grasshopper problem from a mathematical problem to a physics one, by mapping it to a system of atoms on a grid. They used a technique called simulated annealing, which is inspired by a process of heating and slowly cooling metal to make it less brittle. “The process of annealing essentially forces the metal into a low-energy state, and that’s what makes it less brittle,” said Kent. “The analogue in a theoretical model is you start in a random high-energy state and let the atoms move around until they settle into a low-energy state. We designed a model so that the lower its energy, the greater the chance the grasshopper stays on the lawn. If you get the same answer – in our case, the same shape – consistently, then you’ve probably found the lowest-energy state, which is the optimal lawn shape.”
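The following is a minimal sketch of the general approach Kent describes, assuming a coarse square grid, a fixed number of lawn cells, and a simple geometric cooling schedule (all illustrative choices, not the parameters used in the paper). The lawn is a set of grid cells, the ‘energy’ is minus the estimated retention probability, and single cells are moved one at a time under the Metropolis acceptance rule:

    import numpy as np

    rng = np.random.default_rng(1)

    L, N, d, K = 40, 400, 6, 32   # grid side; lawn cells (fixed area); jump; directions

    # Lattice offsets approximating a jump of distance d in K equally spaced directions.
    angles = 2 * np.pi * np.arange(K) / K
    offsets = np.stack([np.rint(d * np.cos(angles)),
                        np.rint(d * np.sin(angles))], axis=1).astype(int)

    def energy(lawn):
        """Minus the retention probability on the grid: the fraction of
        (lawn cell, jump direction) pairs whose landing cell is also lawn."""
        cells = np.argwhere(lawn)                        # occupied sites, shape (n, 2)
        landed = cells[:, None, :] + offsets[None, :, :]
        ok = ((landed >= 0) & (landed < L)).all(axis=2)  # landing stays on the grid
        hits = np.zeros(ok.shape, dtype=bool)
        idx = landed[ok]
        hits[ok] = lawn[idx[:, 0], idx[:, 1]]
        return -hits.mean()

    # Start from a random lawn of exactly N cells (the area never changes).
    lawn = np.zeros((L, L), dtype=bool)
    lawn[np.unravel_index(rng.choice(L * L, N, replace=False), (L, L))] = True

    E = energy(lawn)
    for T in np.geomspace(0.05, 1e-4, 3000):             # slow geometric cooling
        occ, emp = np.argwhere(lawn), np.argwhere(~lawn)
        src = tuple(occ[rng.integers(occ.shape[0])])     # move one lawn cell...
        dst = tuple(emp[rng.integers(emp.shape[0])])     # ...to a random empty site
        lawn[src], lawn[dst] = False, True
        E_new = energy(lawn)
        # Metropolis rule: always accept downhill moves, accept uphill ones
        # with Boltzmann probability so the system can escape local optima.
        if E_new <= E or rng.random() < np.exp((E - E_new) / T):
            E = E_new
        else:
            lawn[src], lawn[dst] = True, False           # revert the move
    print("retention probability of annealed lawn:", -E)

Rerunning from different random seeds and recovering the same final shape is the practical signal, as Kent notes, that the low-energy state – the optimal lawn – has probably been found.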

For different jump distances, the simulated annealing process turned up a variety of shapes, from cogwheels for short jump distances, through to fan shapes for medium jumps, and stripes for longer jumps. “If you asked a pure mathematician, their first guess might be that the optimal shape for a short jump is a disc, but we’ve shown that’s never the case,” said Kent. “Instead we got some weird and wonderful shapes – our simulations gave us a complicated and rich array of structures.”

Goulko and Kent began studying the grasshopper problem to try to better understand the difference between quantum theory and classical physics. For particular states, quantum theory predicts that measuring the spin – the intrinsic angular momentum – of two particles along two random axes will give opposite answers more often than any classical model allows, but we don’t yet know how big the gap between classical and quantum is in general. “To understand precisely what classical models do allow, and see how much stronger quantum theory is, you need to solve another version of the grasshopper problem, for lawns on a sphere,” said Kent. Having developed and tested their techniques for grasshoppers on a two-dimensional lawn, the authors plan to look at grasshoppers on a sphere in order to better understand the so-called Bell inequalities, which describe the classical-quantum gap.
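As background (the article does not spell this out), the classical-quantum gap is standardly quantified by the CHSH form of Bell’s inequality. Writing E(a, b) for the expected product of the two ±1 spin outcomes along measurement axes a and b, the singlet state gives

    \[
      E(\mathbf{a},\mathbf{b}) = -\,\mathbf{a}\cdot\mathbf{b}, \qquad
      S = E(\mathbf{a},\mathbf{b}) - E(\mathbf{a},\mathbf{b}')
        + E(\mathbf{a}',\mathbf{b}) + E(\mathbf{a}',\mathbf{b}'),
    \]
    \[
      |S| \le 2 \ \text{(any classical, local hidden variable model)}, \qquad
      |S| \le 2\sqrt{2} \ \text{(quantum mechanics)}.
    \]

The spherical version of the grasshopper problem is one route to mapping out what classical models allow when the measurement axes are chosen at random rather than at the four fixed settings above.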

The lawn shapes which Goulko and Kent found also echo some shapes found in nature. The famous mathematician and code-breaker Alan Turing came up with a theory in 1952 on the origin of patterns in nature, such as spots, stripes and spirals, and the researchers say their work may also help explain the origin of some patterns. “Turing’s theory involves the idea that these patterns arise as solutions to reaction-diffusion equations,” said Kent. “Our results suggest that a rich variety of pattern formation can also arise in systems with essentially fixed-range interactions. It may be worth looking for explanations of this type in contexts where highly regular patterns naturally arise and are not otherwise easily explained.”

Reference:
Olga Goulko and Adrian Kent. ‘The grasshopper problem.’ Proceedings of the Royal Society A (2017). DOI: http://dx.doi.org/10.1098/rspa.2017.0494



Report Highlights Opportunities and Risks Associated With Synthetic Biology and Bioengineering


source: www.cam.ac.uk
Human genome editing, 3D-printed replacement organs and artificial photosynthesis – the field of bioengineering offers great promise for tackling the major challenges that face our society. But as a new article out today highlights, these developments provide both opportunities and risks in the short and long term.

Rapid developments in the field of synthetic biology and its associated tools and methods, including more widely available gene editing techniques, have substantially increased our capabilities for bioengineering – the application of principles and techniques from engineering to biological systems, often with the goal of addressing ‘real-world’ problems.

In a feature article published in the open access journal eLife, an international team of experts led by Dr Bonnie Wintle and Dr Christian R. Boehm from the Centre for the Study of Existential Risk at the University of Cambridge captures the perspectives of industry, innovators, scholars, and the security community in the UK and US on what they view as the major emerging issues in the field.

Dr Wintle says: “The growth of the bio-based economy offers the promise of addressing global environmental and societal challenges, but as our paper shows, it can also present new kinds of challenges and risks. The sector needs to proceed with caution to ensure we can reap the benefits safely and securely.”

The report is intended as a summary and launching point for policy makers across a range of sectors to further explore those issues that may be relevant to them.

Among the issues highlighted by the report as being most relevant over the next five years are:

Artificial photosynthesis and carbon capture for producing biofuels

If technical hurdles can be overcome, such developments might contribute to the future adoption of carbon capture systems, and provide sustainable sources of commodity chemicals and fuel.

Enhanced photosynthesis for agricultural productivity

Synthetic biology may hold the key to increasing yields on currently farmed land – and hence helping address food security – by enhancing photosynthesis and reducing pre-harvest losses, as well as reducing post-harvest and post-consumer waste.

Synthetic gene drives

Gene drives promote the inheritance of preferred genetic traits throughout a species, for example to prevent malaria-transmitting mosquitoes from breeding. However, this technology raises questions about whether it may alter ecosystems, potentially even creating niches where a new disease-carrying species or new disease organism may take hold.

Human genome editing

Genome engineering technologies such as CRISPR/Cas9 offer the possibility to improve human lifespans and health. However, their implementation poses major ethical dilemmas. It is feasible that individuals or states with the financial and technological means may elect to provide strategic advantages to future generations.

Defence agency research in biological engineering

The areas of synthetic biology in which some defence agencies invest raise the risk of ‘dual-use’. For example, one programme intends to use insects to disseminate engineered plant viruses that confer traits to the target plants they feed on, with the aim of protecting crops from potential plant pathogens – but such technologies could plausibly also be used by others to harm targets.

In the next five to ten years, the authors identified areas of interest including:

Regenerative medicine: 3D printing body parts and tissue engineering

While this technology will undoubtedly ease suffering caused by traumatic injuries and a myriad of illnesses, reversing the decay associated with age is still fraught with ethical, social and economic concerns. Healthcare systems would rapidly become overburdened by the cost of replenishing body parts of citizens as they age, and this could lead to new socioeconomic classes, as only those who can pay for such care themselves could extend their healthy years.

Microbiome-based therapies

The human microbiome is implicated in a large number of human disorders, from Parkinson’s to colon cancer, as well as metabolic conditions such as obesity and type 2 diabetes. Synthetic biology approaches could greatly accelerate the development of more effective microbiota-based therapeutics. However, there is a risk that DNA from genetically engineered microbes may spread to other microbiota in the human microbiome or into the wider environment.

Intersection of information security and bio-automation

Advancements in automation technology combined with faster and more reliable engineering techniques have resulted in the emergence of robotic ‘cloud labs’ where digital information is transformed into DNA then expressed in some target organisms. This opens the possibility of new kinds of information security threats, which could include tampering with digital DNA sequences leading to the production of harmful organisms, and sabotaging vaccine and drug production through attacks on critical DNA sequence databases or equipment.

Over the longer term, issues identified include:

New makers disrupt pharmaceutical markets

Community bio-labs and entrepreneurial startups are customizing and sharing methods and tools for biological experiments and engineering. Combined with open business models and open source technologies, this could herald opportunities for manufacturing therapies tailored to regional diseases that multinational pharmaceutical companies might not find profitable. But this raises concerns around the potential disruption of existing manufacturing markets and raw material supply chains as well as fears about inadequate regulation, less rigorous product quality control and misuse.

Platform technologies to address emerging disease pandemics

Emerging infectious diseases—such as recent Ebola and Zika virus disease outbreaks—and potential biological weapons attacks require scalable, flexible diagnosis and treatment. New technologies could enable the rapid identification and development of vaccine candidates, and plant-based antibody production systems.

Shifting ownership models in biotechnology

The rise of off-patent, generic tools and the lowering of technical barriers for engineering biology have the potential to help those in low-resource settings benefit from developing a sustainable bioeconomy based on local needs and priorities, particularly where new advances are made open for others to build on.

Dr Jenny Molloy comments: “One theme that emerged repeatedly was that of inequality of access to the technology and its benefits. The rise of open source, off-patent tools could enable widespread sharing of knowledge within the biological engineering field and increase access to benefits for those in developing countries.”

Professor Johnathan Napier from Rothamsted Research adds: “The challenges embodied in the Sustainable Development Goals will require all manner of ideas and innovations to deliver significant outcomes. In agriculture, we are on the cusp of new paradigms for how and what we grow, and where. Demonstrating the fairness and usefulness of such approaches is crucial to ensure public acceptance and also to delivering impact in a meaningful way.”

Dr Christian R. Boehm concludes: “As these technologies emerge and develop, we must ensure public trust and acceptance. People may be willing to accept some of the benefits, such as the shift in ownership away from big business and towards more open science, and the ability to address problems that disproportionately affect the developing world, such as food security and disease. But proceeding without the appropriate safety precautions and societal consensus—whatever the public health benefits—could damage the field for many years to come.”

The research was made possible by the Centre for the Study of Existential Risk, the Synthetic Biology Strategic Research Initiative (both at the University of Cambridge), and the Future of Humanity Institute (University of Oxford). It was based on a workshop co-funded by the Templeton World Charity Foundation and the European Research Council under the European Union’s Horizon 2020 research and innovation programme.

Reference
Wintle, BC, Boehm, CR et al. A transatlantic perspective on 20 emerging issues in biological engineering. eLife; 14 Nov 2017; DOI: 10.7554/eLife.30247



Ancient Fish Scales and Vertebrate Teeth Share An Embryonic Origin


source: www.cam.ac.uk

Latest findings support the theory that teeth in the animal kingdom evolved from the jagged scales of ancient fish, the remnants of which can be seen today embedded in the skin of sharks and skate.

This ancient dermal skeleton has undergone considerable reductions and modifications through time

Andrew Gillis

In biology, one long-running debate has teeth: whether ancient fish scales moved into the mouth with the origin of jaws, or if the tooth had its own evolutionary inception.

Recent studies on species such as zebrafish showed scales and teeth developing from distinctly different clusters of cells in fish embryos, pouring cold water on ‘teeth from scales’ theories.

However, while most fish in the sea have bones, one ancient lineage – sharks, skates and rays – possess skeletons made entirely of cartilage.

These cartilaginous fish retain some primitive characteristics that have been lost in their bony counterparts, including small spiky scales embedded in their skin called ‘dermal denticles’ that bear a striking resemblance to jagged teeth.

Now, researchers at the University of Cambridge have used fluorescent markers to track cell development in the embryo of a cartilaginous fish – a little skate in this case – and found that these thorny scales are in fact created from the same type of cells as teeth: neural crest cells.

The findings, published in the journal PNAS, support the theory that, in the depths of early evolution, these ‘denticle’ scales were carried into the emerging mouths of jawed vertebrates to form teeth. Jawed vertebrates now make up 99% of all living vertebrates, from fish to mammals.

“The scales of most fish that live today are very different from the ancient scales of early vertebrates,” says study author Dr Andrew Gillis from Cambridge’s Department of Zoology and the Marine Biological Laboratory in Woods Hole.

“Primitive scales were much more tooth-like in structure, but have been retained in only a few living lineages, including that of cartilaginous fishes such as skates and sharks.

“Stroke a shark and you’ll find it feels rougher than other fish, as shark skin is covered entirely in dermal denticles. There’s evidence that shark skin was actually used as sandpaper as early as the Bronze Age,” says Gillis.

“By labelling the different types of cells in the embryos of skate, we were able to trace their fates. We show that, unlike most fish, the denticle scales of sharks and skate develop from neural crest cells, just like teeth.

“Neural crest cells are central to the process of tooth development in mammals. Our findings suggest a deep evolutionary relationship between these primitive fish scales and the teeth of vertebrates.

“Early jawless vertebrates were filter feeders – sucking in small prey items from the water. It was the advent of both jaws and teeth that allowed vertebrates to begin processing larger and more complex prey.”

The very name of these scales, dermal denticles, alludes to the fact that they are formed of dentine: a hard calcified tissue that makes up the majority of a tooth, sitting underneath the enamel.

The jagged dermal denticles on sharks and skate – and, quite possibly, vertebrate teeth – are remnants of the earliest mineralised skeleton of vertebrates: superficial armour plating.

This armour probably peaked some 400 million years ago in now-extinct jawless vertebrate species, serving as protection against predation by ferocious sea scorpions, or even their early jawed kin.

The Cambridge scientists hypothesise that these early armour plates were multi-layered: consisting of a foundation of bone and an outer layer of dentine – with the different layers deriving from different types of cells in unborn embryos.

These layers were then variously retained, reduced or lost in different vertebrate lineages over the course of evolution. “This ancient dermal skeleton has undergone considerable reductions and modifications through time,” says Gillis.

“The sharks and skate have lost the bony under-layer, while most fish have lost the tooth-like dentine outer layer. A few species, such as the bichir, a popular fish in home aquariums, have retained aspects of both layers of this ancient external skeleton.”



The Beauty of Engineering

The beauty of engineering

 

source: www.cam.ac.uk

Crystal tigers, metal peacock feathers and a ‘nano man’ are just some of the striking images featured in the Department of Engineering’s annual photo competition, the winners of which have been announced today.

The competition, sponsored by ZEISS (Scanning electron microscopy division), international leaders in the fields of optics and optoelectronics, has been held annually for the last 13 years. See more of this year’s winners here.



‘Mini Liver Tumours’ Created In a Dish For The First Time

‘Mini liver tumours’ created in a dish for the first time

source: www.cam.ac.uk

Scientists have created mini biological models of human primary liver cancers, known as organoids, in the lab for the first time. In a paper published in Nature Medicine, the tiny laboratory models of tumours were used to identify a new drug that could potentially treat certain types of liver cancer.

Primary liver cancer is the second most lethal cancer worldwide. To better understand the biology of the disease and develop potential treatments, researchers need models that can grow in the lab and accurately reflect how the tumours behave in patients. Previously, cultures of cells had been used, but these are hard to maintain and fail to recreate the 3D structure and tissue architecture of human tumours.

The researchers created the mini tumours (up to 0.5mm) – termed ‘tumouroids’ – to mimic the three most common forms of primary liver cancer. The tumour cells were surgically removed from eight patients and grown in a solution containing specific nutrients and substances that prevent healthy cells from out-competing the tumour cells.

The team, from the Wellcome/Cancer Research UK Gurdon Institute in Cambridge, used the tumouroids to test the efficacy of 29 different drugs, including those currently used in treatment and drugs in development. One compound, a type of protein inhibitor, was found to block the activation of a protein called ERK, a crucial step in the development of liver cancer, in two of the three types of tumouroids.

The researchers then tested this compound in vivo, transplanting two types of tumouroids into mice and treating them with the drug. A marked reduction in tumour growth was seen in mice treated with the drug, identifying a potential novel treatment for some types of primary liver cancer.

The tumouroids were able to preserve tissue structure as well as the gene expression patterns of the original human tumours from which they were derived. The individual subtypes of three different types of liver cancer, as well as the different tumour tissues which they came from, were all still distinguishable even after they had been grown in a dish for a long time. As the tumouroids retain the biological features of their parent tumour, they could play an important role in developing personalised medicine for patients.

The creation of biologically accurate models of tumours will also reduce the number of animals needed in certain experiments. Animal studies will still be required to validate findings, but the tumouroids will allow scientists to explore key questions about the biology of liver cancer in cultures rather than mice.

Lead researcher Dr Meritxell Huch, a Wellcome Sir Henry Dale Fellow from the Gurdon Institute, said: “We had previously created organoids from healthy liver tissue, but the creation of liver tumouroids is a big step forward for cancer research. They will allow us to understand much more about the biology of liver cancer and, with further work, could be used to test drugs for individual patients to create personalised treatment plans.”

Dr Andrew Chisholm, Head of Cellular and Developmental Sciences at Wellcome, said: “This work shows the power of organoid cultures to model human cancers. It is impressive to see just how well the organoids are able to mimic the biology of different liver tumour types, giving researchers a new way of investigating this disease. These models are vital for the next generation of cancer research, and should allow scientists to minimise the numbers of animals used in research.”

Dr Vicky Robinson, Chief Executive of the NC3Rs which partially funded the work, said: “We are pleased to see that the funds from our annual 3Rs prize, sponsored by GlaxoSmithKline, have furthered Dr Huch’s research. Each year the prize recognises exceptional science which furthers the 3Rs, and the work being conducted by Meri and her team is continuing to make progress in this area. This new breakthrough involving liver cancer organoids has the potential to reduce the number of animals required in the early stages of liver cancer research, and provide more biologically accurate models of human tumours.”

This work was funded by a National Centre for the Replacement, Refinement and Reduction of Animals in Research (NC3Rs) research prize, Wellcome and Cancer Research UK Cambridge Centre.

Reference
Broutier, L et al. Human primary liver cancer–derived organoid cultures for disease modelling and drug screening. Nature Medicine; 13 Nov 2017; DOI: 10.1038/nm.4438

Press release from Wellcome.



Keyhole Surgery More Effective Than Open Surgery For Ruptured Aneurysm

Keyhole surgery more effective than open surgery for ruptured aneurysm

The use of keyhole surgery to repair ruptured abdominal aortic aneurysm is both clinically and cost effective and should be adopted more widely, concludes a randomised trial published by The BMJ today.

More than 1000 people a year in the UK require emergency surgery to repair a ruptured abdominal aortic aneurysm. Without repair, ruptured aneurysm is nearly always fatal

Michael Sweeting

This is the first randomised trial comparing the use of keyhole (endovascular) aneurysm repair versus traditional open surgery to repair ruptured aneurysm, with full midterm follow-up.

Abdominal aortic aneurysm is a swelling of the aorta – the main blood vessel that leads away from the heart, down through the abdomen to the rest of the body. If the artery wall ruptures, the risk of death is high, and emergency surgery is needed.

Three recent European randomised trials showed that keyhole repair does not reduce the high death rate up to three months after surgery compared with open repair. However, mid-term outcomes (three months to three years) of keyhole repair are still uncertain.

An international research team set out to assess three-year clinical outcomes and cost effectiveness of a strategy of keyhole repair (whenever the shape of the aorta allows this) versus open repair for patients with suspected ruptured abdominal aortic aneurysm who were part of the IMPROVE trial.

Dr Michael Sweeting from the Department of Public Health and Primary Care at the University of Cambridge, who was involved in the trial, says: “More than 1000 people a year in the UK require emergency surgery to repair a ruptured abdominal aortic aneurysm. Without repair, ruptured aneurysm is nearly always fatal. However, surgery is not without its own significant risks, so we are always looking at ways of reducing the risk to the patient. One option is keyhole surgery, but until now not enough was known about how its outcomes compare to regular, open surgery beyond one year after repair.”

The trial involved 613 patients from 30 vascular centres (29 in the UK – including at Addenbrooke’s Hospital, Cambridge – and one in Canada) with a clinical diagnosis of ruptured aneurysm, of whom 316 were randomised to a strategy of keyhole repair and 297 to open repair.

Deaths were monitored for an average of 4.9 years and were similar in both groups three months after surgery. At three years, there were fewer deaths in the keyhole group than in the open repair group, leading to lower mortality (48% vs 56%). However, after seven years there was no clear difference between the groups.

The need for repeat surgery (‘reinterventions’) related to the aneurysm occurred at a similar rate in both groups, with about 28% of each group needing at least one reintervention after three years.

Average quality of life was higher in the keyhole group in the first year, but by three years was similar across the groups. This early higher average quality of life, coupled with the lower mortality at three years, led to a gain in average quality adjusted life years or QALYs (a measure of healthy years lived) at three years in the keyhole versus the open repair group.

The keyhole group also spent fewer days in hospital (14.4 versus 20.5 in the open repair group) and had lower overall costs (£16,900 versus £19,500 in the open repair group).
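In health-economic terms, a strategy that is both cheaper and more effective is said to ‘dominate’ its comparator, so no cost-per-QALY ratio needs to be computed. Below is a minimal sketch of that comparison using the per-patient figures reported above; note that the QALY gain is a hypothetical placeholder, since the trial reports a gain at three years but its size is not quoted in this article.

```python
# Illustrative cost-effectiveness comparison using the per-patient figures
# reported above. The QALY difference is a hypothetical placeholder, since
# the exact gain is not quoted in this article.

keyhole = {"cost_gbp": 16_900, "hospital_days": 14.4}
open_repair = {"cost_gbp": 19_500, "hospital_days": 20.5}

incremental_cost = keyhole["cost_gbp"] - open_repair["cost_gbp"]  # negative = savings
qaly_gain = 0.17  # hypothetical three-year QALY gain for keyhole repair

if incremental_cost < 0 and qaly_gain > 0:
    # Cheaper AND more effective: the keyhole strategy simply dominates
    # open repair, so no cost-per-QALY ratio is needed.
    print(f"Keyhole dominates: saves £{-incremental_cost} per patient "
          f"and gains {qaly_gain} QALYs.")
```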

The researchers point to some study limitations, such as sample size and midterm data focusing on aneurysm-related events, which may have led to some bias. Nevertheless, they say that, compared with open repair, there are clear benefits associated with keyhole surgery.

“These findings show that, in the first three years after repair, keyhole surgery can improve outcomes and quality of life for patients compared to open surgery, and is more cost effective and requires less time in hospital – important factors to consider for our stretched health services,” adds Dr Sweeting.

Reference
Comparative clinical effectiveness and cost effectiveness of endovascular strategy v open repair for ruptured abdominal aortic aneurysm: three year results of the IMPROVE randomised trial. BMJ; 15 Nov 2017; DOI:

Adapted from a press release from The BMJ.



Children With Disabilities Are Being Denied Equal Opportunities For a Quality Education Across The World, Including In The UK

Children with disabilities are being denied equal opportunities for a quality education across the world, including in the UK

source: www.cam.ac.uk

Researchers from the Faculty of Education have produced a new report on the current state of education for children with disabilities in both England and India. Here, Dr Nidhi Singal, one of the report’s authors, outlines some of the key statistics, and argues that teachers need better training and more support “underpinned by principles of inclusion”.

We need to invest in inclusive teaching and learning processes and not just changes to school infrastructure

Nidhi Singal

Countries with both developed and developing economies need to do more to ensure that children with disabilities not only access education, but also benefit from quality education.

In England, while children with special educational needs and disabilities (SEND) access school, multiple concerns have been raised in relation to their learning and quality of life in school. The educational attainments of these children are significantly lower than for those without SEND at every level of the national curriculum.

In 2017 the Department for Education reported that, at Key Stage 2 level, only 14% of children with SEND reached the expected level for reading, writing and maths (in contrast to 62% of children without SEND).

Socially, there has been an increase in incidents of bullying and hate crime in relation to children with SEND, and the National Society for the Prevention of Cruelty to Children highlights that they are significantly more likely to face abuse. Official statistics note that children with social, emotional and mental health needs are nine times more likely to face permanent exclusion from school.

The World Health Organisation, in collaboration with the World Bank, recently emphasised that 15% of the world’s population, approximately one billion people, live with some form of disability. Estimates for the number of children under the age of 14 living with disabilities range between 93m and 150m.

Across the world, people with disabilities have poorer health outcomes, lower educational achievements, less economic participation and higher rates of poverty than people without disabilities. This is partly because people with disabilities experience significant barriers in accessing basic services, including health, education and employment.

Amongst these, education is paramount as it has significant economic, social and individual returns. Education has the potential to lift people out of chronic poverty. Accessing quality education can improve learning outcomes, which in turn leads to economic growth. The Global Monitoring Report calculates that if all students living in low income countries were to leave school with basic reading skills there would be a 12% reduction in world poverty.

Additionally, education has the potential to create more equitable and healthy societies. For example, evidence shows educating mothers reduces early births, lowers infant mortality rates and improves child nutrition.

Furthermore, inclusive education is integral to creating societies that are interconnected, based on values of social justice, equity of opportunities and freedom. The Sustainable Development Goals have given a considerable boost to this vision of “inclusive and equitable quality education” with significant international proclamations and national legislations being drawn up. Nevertheless, children with disabilities continue to remain the most difficult to reach.

Including children with disabilities in education systems, and ensuring quality education, is a moral and ethical commitment with considerable benefits both at the individual and national level. The International Labour Organisation estimates that the exclusion of persons with disabilities from the work force costs nations up to 7% of the national GDP. Other estimates from China suggest that every additional year of schooling in rural areas means a 5-8% wage increase for the person with disabilities.

While there is a long way to go, there is little question that educational access is on an upward trajectory in many low and middle income countries. According to official data from India, the number of children with disabilities enrolled in mainstream primary schools has increased by approximately 16% over the last five years.

Nonetheless, the children most likely to be excluded, even in states with high enrolment rates, are those with disabilities. They are also the most likely to drop out before completing five years of primary schooling, and the least likely to transition to secondary school or higher education.

Across the globe, learning for children with disabilities remains a significant challenge. In order to address this, we need to invest in inclusive teaching and learning processes and not just changes to school infrastructure. Teachers need better training and support underpinned by principles of inclusion. Significantly, children with disabilities must be respected as important partners in creating better schools for all.

The report has been produced for the World Innovation Summit for Education and will be presented this week at the summit in Doha.



Archaeologists Uncover Rare 2,000-Year-Old Sundial During Roman Theatre Excavation

Archaeologists uncover rare 2,000-year-old sundial during Roman theatre excavation

source: www.cam.ac.uk

A 2,000-year-old intact and inscribed sundial – one of only a handful known to have survived – has been recovered during the excavation of a roofed theatre in the Roman town of Interamna Lirenas, near Monte Cassino, in Italy.

“Not only have we been able to identify the individual who commissioned the sundial, we have also been able to determine the specific public office he held in relation to the likely date of the inscription”

Alessandro Launaro

Not only has the sundial survived largely undamaged for more than two millennia, but the presence of two Latin texts means researchers from the University of Cambridge have been able to glean precise information about the man who commissioned it.

The sundial was found lying face down by students of the Faculty of Classics as they were excavating the front of one of the theatre’s entrances along a secondary street. It was probably left behind at a time when the theatre and town were being scavenged for building materials during the Medieval to post-Medieval period. In all likelihood it did not belong to the theatre, but was removed from a prominent spot, possibly on top of a pillar in the nearby forum.

“Less than a hundred examples of this specific type of sundial have survived and of those, only a handful bear any kind of inscription at all – so this really is a special find,” said Dr Alessandro Launaro, a lecturer at the Faculty of Classics at Cambridge and a Fellow of Gonville & Caius College.

“Not only have we been able to identify the individual who commissioned the sundial, we have also been able to determine the specific public office he held in relation to the likely date of the inscription.”

The base prominently features the name of M(arcus) NOVIUS M(arci) F(ilius) TUBULA [Marcus Novius Tubula, son of Marcus], whilst the engraving on the curved rim of the dial surface records that he held the office of TR(ibunus) PL(ebis) [Plebeian Tribune] and paid for the sundial D(e) S(ua) PEC(unia) [with his own money].

The nomen Novius was quite common in Central Italy. On the other hand, the cognomen Tubula (literally ‘small trumpet’) is only attested at Interamna Lirenas.

But even more striking is the specific public office Tubula held in relation to the likely date of the inscription. Various considerations about the name of the individual and the lettering style comfortably place the sundial’s inscription at a time (mid 1st c. BC onwards) by which the inhabitants of Interamna had already been granted full Roman citizenship.

“That being the case, Marcus Novius Tubula, hailing from Interamna Lirenas, would be a hitherto unknown Plebeian Tribune of Rome,” added Launaro. “The sundial would have represented his way of celebrating his election in his own hometown.”

Carved out from a limestone block (54 x 35 x 25 cm), the sundial features a concave face, engraved with 11 hour lines (demarcating the twelve horae of daylight) intersecting three day curves (giving an indication of the season with respect to the time of the winter solstice, equinox and summer solstice). Although the iron gnomon (the needle casting the shadow) is essentially lost, part of it is still preserved under the surviving lead fixing. This type of ‘spherical’ sundial was relatively common in the Roman period and was known as hemicyclium.
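Roman horae were seasonal hours: daylight, however long it lasted, was divided into twelve equal parts, which is why eleven engraved lines suffice to mark twelve hours. A minimal sketch of that division follows; the sunrise and sunset times are illustrative placeholders, not measured values for Interamna Lirenas.

```python
# Divide daylight into twelve equal Roman horae. The eleven boundaries
# between them correspond to the eleven hour lines engraved on the dial.
# Sunrise/sunset values are hypothetical, e.g. a short winter day.

sunrise, sunset = 7.5, 16.5  # clock hours, illustrative only
hora_length = (sunset - sunrise) / 12

boundaries = [sunrise + k * hora_length for k in range(1, 12)]
print(f"Each hora lasts {hora_length:.2f} h; the 11 hour-line boundaries fall at:")
print([f"{b:.2f}" for b in boundaries])
```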

“Even though the recent archaeological fieldwork has profoundly affected our understanding of Interamna Lirenas, dispelling long-held views about its precocious decline and considerable marginality, this was not a town of remarkable prestige or notable influence,” added Launaro. “It remained an average, middle-sized settlement, and this is exactly what makes it a potentially very informative case-study about conditions in the majority of Roman cities in Italy at the time”.

“In this sense, the discovery of the inscribed sundial not only casts new light on the place Interamna Lirenas occupied within a broader network of political relationships across Roman Italy, but it is also a more general indicator of the level of involvement in Rome’s own affairs that individuals hailing from this and other relatively secondary communities could aspire to.”

The ongoing archaeological project at Interamna Lirenas continues to add new evidence about important aspects of the Roman civilization, stressing the high levels of connectivity and integration (political, social, economic and cultural) which it featured.

The 2017 excavation, directed by Dr Launaro (Gonville & Caius College) and Professor Martin Millett (Fitzwilliam College), both from the Faculty of Classics, in partnership with Dr Giovanna Rita Bellini of the Italian Soprintendenza Archeologia, Belle Arti e Paesaggio per le Province di Frosinone, Latina e Rieti, is part of a long-standing collaboration with the British School at Rome and the Comune of Pignataro Interamna and has benefitted from the generous support of the Isaac Newton Trust and Mr Antonio Silvestro Evangelista.

Inset image: The find spot near the former roofed theatre in Interamna Lirenas



Sheep Are Able To Recognise Human Faces From Photographs

Sheep are able to recognise human faces from photographs

source: www.cam.ac.uk

Sheep can be trained to recognise human faces from photographic portraits – and can even identify the picture of their handler without prior training – according to new research from scientists at the University of Cambridge.

We’ve shown that sheep have advanced face-recognition abilities, comparable with those of humans and monkeys

Jenny Morton

The study, published today in the journal Royal Society Open Science, is part of a series of tests given to the sheep to monitor their cognitive abilities. Because of the relatively large size of their brains and their longevity, sheep are a good animal model for studying neurodegenerative disorders such as Huntington’s disease.

The ability to recognise faces is one of the most important human social skills. We recognise familiar faces easily, and can identify unfamiliar faces from repeatedly presented images. As with some other animals such as dogs and monkeys, sheep are social animals that can recognise other sheep as well as familiar humans. Little is known, however, about their overall ability to process faces.

Researchers from Cambridge’s Department of Physiology, Development and Neuroscience trained eight sheep to recognise the faces of four celebrities (Fiona Bruce, Jake Gyllenhaal, Barack Obama and Emma Watson) from photographic portraits displayed on computer screens.

Training involved the sheep making decisions as they moved around a specially-designed pen. At one end of the pen, they would see two photographs displayed on two computer screens and would receive a reward of food for choosing the photograph of the celebrity (by breaking an infrared beam near the screen); if they chose the wrong photograph, a buzzer would sound and they would receive no reward. Over time, the sheep learned to associate a reward with the celebrity’s photograph.

After training, the sheep were shown two photographs – the celebrity’s face and another face. In this test, sheep correctly chose the learned celebrity face eight times out of ten.

In these initial tests, the sheep were shown the faces from the front, but to test how well they recognised the faces, the researchers next showed them the faces at an angle. As expected, the sheep’s performance dropped, but only by about 15% – a figure comparable to that seen when humans perform the task.

Finally, the researchers looked at whether sheep were able to recognise a handler from a photograph without pre-training. The handlers typically spend two hours a day with the sheep and so the sheep are very familiar with them. When a portrait photograph of the handler was interspersed randomly in place of the celebrity, the sheep chose the handler’s photograph over the unfamiliar face seven out of ten times.
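As a rough indication of how unlikely such scores are by chance alone: in a two-choice task, random guessing succeeds half the time. The sketch below treats the reported rates as if they came from just ten independent trials, purely for illustration; the published study used many more trials and its own statistical analysis.

```python
from math import comb

def tail_prob(successes: int, trials: int, p: float = 0.5) -> float:
    """P(X >= successes) for a binomial(trials, p): the chance of doing
    at least this well by guessing alone."""
    return sum(comb(trials, k) * p**k * (1 - p)**(trials - k)
               for k in range(successes, trials + 1))

# Illustrative only: the study reports success rates, not these raw counts.
print(f"8/10 (celebrity faces): p = {tail_prob(8, 10):.3f}")  # ~0.055
print(f"7/10 (handler's face):  p = {tail_prob(7, 10):.3f}")  # ~0.172
```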

During this final task the researchers observed an interesting behaviour. Upon seeing a photographic image of the handler for the first time – in other words, the sheep had never seen an image of this person before – the sheep did a ‘double take’. The sheep checked first the unfamiliar face, then the handler’s image, and then the unfamiliar face again before deciding to choose the familiar face of the handler.

“Anyone who has spent time working with sheep will know that they are intelligent, individual animals who are able to recognise their handlers,” says Professor Jenny Morton, who led the study. “We’ve shown with our study that sheep have advanced face-recognition abilities, comparable with those of humans and monkeys.

“Sheep are long-lived and have brains that are similar in size and complexity to those of some monkeys. That means they can be useful models to help us understand disorders of the brain, such as Huntington’s disease, that develop over a long time and affect cognitive abilities. Our study gives us another way to monitor how these abilities change, particularly in sheep who carry the gene mutation that causes Huntington’s disease.”

Professor Morton’s team recently began studying sheep that have been genetically modified to carry the mutation that causes Huntington’s disease.

Huntington’s disease affects more than 6,700 people in the UK. It is an incurable neurodegenerative disease that typically begins in adulthood. Initially, the disease affects motor coordination, mood, personality and memory, as well as other complex symptoms including impairments in recognising facial emotion. Eventually, patients have difficulty in speech and swallowing, loss of motor function and die at a relatively early age. There is no known cure for the disease, only ways to manage the symptoms.

The research was supported by the CHDI Foundation, Inc., a US-based charitable trust that supports biomedical research related to Huntington’s disease.

Reference
Knolle, F et al. Sheep recognize familiar and unfamiliar human faces from two-dimensional images. Royal Society Open Science; 8 Nov 2017; DOI: 10.1098/rsos.171228



Fully Integrated Circuits Printed Directly Onto Fabric

Fully integrated circuits printed directly onto fabric

source: www.cam.ac.uk

Researchers have successfully incorporated washable, stretchable and breathable electronic circuits into fabric, opening up new possibilities for smart textiles and wearable electronics. The circuits were made with cheap, safe and environmentally friendly inks, and printed using conventional inkjet printing techniques.

Turning textile fibres into functional electronic components can open to an entirely new set of applications from healthcare and wellbeing to the Internet of Things.

Felice Torrisi

The researchers, from the University of Cambridge, working with colleagues in Italy and China, have demonstrated how graphene – a two-dimensional form of carbon – can be directly printed onto fabric to produce integrated electronic circuits which are comfortable to wear and can survive up to 20 cycles in a typical washing machine.

The new textile electronic devices are based on low-cost, sustainable and scalable inkjet printing of inks based on graphene and other two-dimensional materials, and are produced by standard processing techniques. The results are published in the journal Nature Communications.

Based on earlier work on the formulation of graphene inks for printed electronics, the team designed low-boiling point inks, which were directly printed onto polyester fabric. Additionally, they found that modifying the roughness of the fabric improved the performance of the printed devices. The versatility of this process allowed the researchers to design not only single transistors but all-printed integrated electronic circuits combining active and passive components.

Most wearable electronic devices that are currently available rely on rigid electronic components mounted on plastic, rubber or textiles. These offer limited compatibility with the skin in many circumstances, are damaged when washed and are uncomfortable to wear because they are not breathable.

“Other inks for printed electronics normally require toxic solvents and are not suitable to be worn, whereas our inks are both cheap, safe and environmentally-friendly, and can be combined to create electronic circuits by simply printing different two-dimensional materials on the fabric,” said Dr Felice Torrisi of the Cambridge Graphene Centre, the paper’s senior author.

“Digital textile printing has been around for decades to print simple colourants on textiles, but our result demonstrates for the first time that such technology can also be used to print the entire electronic integrated circuits on textiles,” said co-author Professor Roman Sordan of Politecnico di Milano. “Although we demonstrated very simple integrated circuits, our process is scalable and there are no fundamental obstacles to the technological development of wearable electronic devices both in terms of their complexity and performance.”

“The printed components are flexible, washable and require low power, essential requirements for applications in wearable electronics,” said PhD student Tian Carey, the paper’s first author.

The work opens up a number of commercial opportunities for two-dimensional material inks, ranging from personal health and well-being technology, to wearable energy harvesting and storage, military garments, wearable computing and fashion.

“Turning textile fibres into functional electronic components can open to an entirely new set of applications from healthcare and wellbeing to the Internet of Things,” said Torrisi. “Thanks to nanotechnology, in the future our clothes could incorporate these textile-based electronics, such as displays or sensors and become interactive.”

The use of graphene and other related 2D material (GRM) inks to create electronic components and devices integrated into fabrics and innovative textiles is at the centre of new technical advances in the smart textiles industry. The teams at the Cambridge Graphene Centre and Politecnico di Milano are also involved in the Graphene Flagship, an EC-funded, pan-European project dedicated to bringing graphene and GRM technologies to commercial applications.

The research was supported by grants from the Graphene Flagship, the European Research Council’s Synergy Grant, the Engineering and Physical Sciences Research Council, the Newton Trust, the International Research Fellowship of the National Natural Science Foundation of China and the Ministry of Science and Technology of China. The technology is being commercialised by Cambridge Enterprise, the University’s commercialisation arm.

Reference
Tian Carey et al. ‘Fully inkjet-printed two-dimensional material field-effect heterojunctions for wearable and textile electronics.’ Nature Communications (2017). DOI: 10.1038/s41467-017-01210-2



Cambridge BID Renewal Vote: Businesses Back Vision For City

Cambridge BID renewal vote: businesses back vision for city


Cambridge BID’s Ian Sandison
source: http://www.cambridge-news.co.uk/business/business-news/cambridge-bid-renewal-vote-businesses-13856256

Businesses have backed Cambridge BID’s vision for a “world class” city experience by handing the organisation a new five-year mandate.

The business improvement district (BID) organisation, which was first formed in April 2013, will now run until at least March 2023 following a renewal ballot, which saw 80 per cent of voters from the retail, leisure, business and education sectors in the area back the BID.

Cambridge BID is funded by businesses themselves, with some 1,100 firms coming together to pay a proportional levy, thereby creating a £4.5m pot to be invested in Cambridge over the coming years.

In total, 45 per cent of businesses voted in the renewal ballot – a 35 per cent increase in turnout compared with the term-one ballot.

Cambridge BID events include the popular night markets and open air cinema

Ian Sandison, Chairman of Cambridge BID, said: “This is excellent news for Cambridge as it means that £4.5million of investment will be generated for the city over the next five years. We are very pleased with the outcome and look forward to delivering our existing projects as well as the services and initiatives outlined in the exciting new business plan. Our vision is to create a world-class experience for all who visit, live and work in Cambridge, a global city.”

“I would like to take this opportunity to thank the business community for showing faith in the work of Cambridge BID, as well as our partners such as Visit Cambridge & Beyond, Cambridge City Council and others with whom we have worked closely and will continue to do so.”

The vote took place last month, and as part of its renewed mandate Cambridge BID has set out a new business plan, which includes extending the BID area to incorporate the CB1 area and Station Square. It hopes this will help connect what it sees as a “key gateway of the city” to its historic centre, while also enabling projects that will attract new talent and new businesses to Cambridge.

Three new key work streams – welcome, experience and support – are identified in the plan, while existing initiatives, including the annual Mystery Shop, the business cost-saving scheme, the seven-day-a-week rapid response cleaning service, the provision of Christmas lights, the payment of CAMBAC membership (for BID levy payers who join the CAMBAC radio scheme), the monthly Cambridge Performance reports and continued support of Street Aid, the taxi marshals and the Street Pastors are all set to continue.

Cllr Lewis Herbert, leader of Cambridge City Council, added: “This result secures the future of a number of vital projects for city businesses and Cambridge, and will generate major additional investment and community benefits for our great city.

“The future Cambridge BID plans have been overwhelmingly endorsed by local businesses, which is just reward for what the BID partnership has already achieved and will now achieve.

“We look forward to continuing our joint work with the BID on exciting new projects over the next five years, creating a city centre everyone enjoys each time they are here and keeps them coming back for more.”

Périgord Black Truffle Cultivated In The UK For The First Time

Périgord black truffle cultivated in the UK for the first time

source: www.cam.ac.uk

The Mediterranean black truffle, one of the world’s most expensive ingredients, has been successfully cultivated in the UK, as climate change threatens its native habitat.

Even though humans have been eating truffles for centuries, we know remarkably little about how they grow and how they interact with their host trees.

Ulf Büntgen

Researchers from the University of Cambridge and Mycorrhizal Systems Ltd (MSL) have confirmed that a black truffle has been successfully cultivated in the UK for the first time: the farthest north that the species has ever been found. It was grown as part of a programme in Monmouthshire, South Wales, run by MSL in collaboration with local farmers. The results of the programme, reported in the journal Climate Research, suggest that truffle cultivation may be possible in many parts of the UK.

After nine years of waiting, the truffle was harvested in March 2017 by a trained dog named Bella. The aromatic fungus was growing within the root system of a Mediterranean oak tree that had been treated to encourage truffle production. Further microscopic and genetic analysis confirmed that Bella’s find was indeed a Périgord black truffle (Tuber melanosporum).

The black truffle is one of the most expensive delicacies in the world, worth as much as £1,700 per kilogram. Black truffles are prized for their intense flavour and aroma, but they are difficult and time-consuming to grow and harvest, and are normally confined to regions with a Mediterranean climate. In addition, their Mediterranean habitat has been affected by drought due to long-term climate change, and yields are falling while the global demand continues to rise. The truffle industry is projected to be worth £4.5 billion annually in the next 10-20 years.

Black truffles grow below ground in a symbiotic relationship with the root system of trees in soils with high limestone content. They are found mostly in northern Spain, southern France and northern Italy, where they are sniffed out by trained dogs or pigs. While they can form naturally, many truffles are cultivated by inoculating oak or hazelnut seedlings with spores and planting them in chalky soils. Even through cultivation, there is no guarantee that truffles will grow.

“It’s a risky investment for farmers – even though humans have been eating truffles for centuries, we know remarkably little about how they grow and how they interact with their host trees,” said paper co-author Professor Ulf Büntgen of Cambridge’s Department of Geography. “Since the system is underground, we can’t see how truffles are affected by different environmental conditions, or even when the best time to water them is. There’s been no science behind it until now, so progress is slow.”

In partnership with local farmers, Büntgen’s co-author Dr Paul Thomas from MSL and the University of Stirling has been cultivating truffles in the UK for the past decade. In 2015, MSL successfully cultivated a UK native Burgundy truffle, but this is the first time the more valuable black Périgord truffle has been cultivated in such a northern and maritime climate. Its host tree is a Mediterranean oak that was planted in 2008. Before planting, the tree was inoculated with truffle spores, and the surrounding soil was made less acidic by treating it with lime.

“This is one of the best-flavoured truffle species in the world and the potential for industry is huge,” said Thomas. “We planted the trees just to monitor their survival, but we never thought this Mediterranean species could actually grow in the UK – it’s an incredibly exciting development.”

The researchers have attributed the fact that black truffles are able to grow so far outside their native Mediterranean habitat to climate change. “Different species respond to climate change on different scales and at different rates, and you often get an ecological mismatch,” said Büntgen. “For instance, insects can move quickly, while the vegetation they depend on may not. It’s possible that truffles are one of these fast-shifting species.”

“This cultivation has shown that the climatic tolerance of truffles is much broader than previously thought, but it’s likely that it’s only possible because of climate change, and some areas of the UK – including the area around Cambridge – are now suitable for the cultivation of this species,” said Thomas. “While truffles are a very valuable crop, together with their host trees, they are also a beneficial component for conservation and biodiversity.”

The first harvested truffle, which weighed 16 grams, has been preserved for posterity, but in future, the truffles will be distributed to restaurants in the UK.

Reference
Paul Thomas and Ulf Büntgen. ‘New UK truffle find as a result of climate change.’ Climate Research (2017). DOI: 10.3354/cr01494.



Scientists Identify Mechanism That Helps Us Inhibit Unwanted Thoughts

Scientists identify mechanism that helps us inhibit unwanted thoughts

source: www.cam.ac.uk

Scientists have identified a key chemical within the ‘memory’ region of the brain that allows us to suppress unwanted thoughts, helping explain why people who suffer from disorders such as anxiety, post-traumatic stress disorder (PTSD), depression, and schizophrenia often experience persistent intrusive thoughts when these circuits go awry.

Our ability to control our thoughts is fundamental to our wellbeing. When this capacity breaks down, it causes some of the most debilitating symptoms of psychiatric diseases

Michael Anderson

We are sometimes confronted with reminders of unwanted thoughts: thoughts about unpleasant memories, images or worries. When this happens, the thought may be retrieved, making us think about it again even though we would prefer not to. Being reminded in this way may not be a problem when our thoughts are positive; if the topic was unpleasant or traumatic, however, our thoughts may turn very negative, and we may find ourselves worrying or ruminating about what happened, taken back to the event.

“Our ability to control our thoughts is fundamental to our wellbeing,” explains Professor Michael Anderson from the Medical Research Council Cognition and Brain Sciences Unit, which recently transferred to the University of Cambridge. “When this capacity breaks down, it causes some of the most debilitating symptoms of psychiatric diseases: intrusive memories, images, hallucinations, ruminations, and pathological and persistent worries. These are all key symptoms of mental illnesses such as PTSD, schizophrenia, depression, and anxiety.”

Professor Anderson likens our ability to intervene and stop ourselves retrieving particular memories and thoughts to stopping a physical action. “We wouldn’t be able to survive without controlling our actions,” he says. “We have lots of quick reflexes that are often useful, but we sometimes need to control these actions and stop them from happening. There must be a similar mechanism for helping us stop unwanted thoughts from occurring.”

A region at the front of the brain known as the prefrontal cortex is known to play a key role in controlling our actions and has more recently been shown to play a similarly important role in stopping our thoughts. The prefrontal cortex acts as a master regulator, controlling other brain regions – the motor cortex for actions and the hippocampus for memories.

In research published today in the journal Nature Communications, a team of scientists led by Dr Taylor Schmitz and Professor Anderson used a task known as the ‘Think/No-Think’ procedure to identify a significant new brain process that enables the prefrontal cortex to successfully inhibit our thoughts.

In the task, participants learn to associate a series of words with a paired, but otherwise unconnected, word, for example ordeal/roach and moss/north. In the next stage, participants are asked to recall the associated word if the cue is green or to suppress it if the cue is red; in other words, when shown ‘ordeal’ in red, they are asked to stare at the word but to stop themselves thinking about the associated thought ‘roach’.
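The trial logic of the procedure can be sketched in a few lines. The word pairs and colour-to-instruction mapping below follow the description above; everything else (structure, names) is an illustrative reconstruction, not the authors’ experimental code.

```python
# Minimal sketch of Think/No-Think trial logic, reconstructed from the
# description above. Only the pair list and cue-colour mapping are from
# the text; timing, counterbalancing, etc. are omitted.

pairs = {"ordeal": "roach", "moss": "north"}  # cue word -> associated word

def run_trial(cue: str, colour: str) -> str:
    """Green cue: retrieve the associate. Red cue: suppress retrieval."""
    if colour == "green":
        return f"think of '{pairs[cue]}'"   # recall the paired word
    if colour == "red":
        return f"suppress '{pairs[cue]}'"   # look at the cue, block the memory
    raise ValueError("cue colour must be 'green' or 'red'")

print(run_trial("ordeal", "red"))   # suppress 'roach'
print(run_trial("moss", "green"))   # think of 'north'
```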

Using a combination of functional magnetic resonance imaging (fMRI) and magnetic resonance spectroscopy, the researchers were able to observe what was happening within key regions of the brain as the participants tried to inhibit their thoughts. Spectroscopy enabled the researchers to measure brain chemistry, and not just brain activity, as is usually done in imaging studies.

Professor Anderson, Dr Schmitz and colleagues showed that the ability to inhibit unwanted thoughts relies on a neurotransmitter – a chemical within the brain that allows messages to pass between nerve cells – known as GABA. GABA is the main ‘inhibitory’ neurotransmitter in the brain, and its release by one nerve cell can suppress activity in other cells to which it is connected. Anderson and colleagues discovered that GABA concentrations within the hippocampus – a key area of the brain involved in memory – predict people’s ability to block the retrieval process and prevent thoughts and memories from returning.

“What’s exciting about this is that now we’re getting very specific,” he explains. “Before, we could only say ‘this part of the brain acts on that part’, but now we can say which neurotransmitters are likely important – and as a result, infer the role of inhibitory neurons – in enabling us to stop unwanted thoughts.”

“Where previous research has focused on the prefrontal cortex – the command centre – we’ve shown that this is an incomplete picture. Inhibiting unwanted thoughts is as much about the cells within the hippocampus – the ‘boots on the ground’ that receive commands from the prefrontal cortex. If an army’s foot-soldiers are poorly equipped, then its commanders’ orders cannot be implemented well.”

The researchers found that even within their sample of healthy young adults, people with less hippocampal GABA (less effective ‘foot-soldiers’) showed weaker suppression of hippocampal activity by the prefrontal cortex, and as a result were much worse at inhibiting unwanted thoughts.

The discovery may answer one of the long-standing questions about schizophrenia. Research has shown that people affected by schizophrenia have ‘hyperactive’ hippocampi, which correlates with intrusive symptoms such as hallucinations. Post-mortem studies have revealed that the inhibitory neurons (which use GABA) in the hippocampi of these individuals are compromised, possibly making it harder for the prefrontal cortex to regulate activity in this structure. This suggests that the hippocampus is failing to inhibit errant thoughts and memories, which may be manifest as hallucinations.

According to Dr Schmitz: “The environmental and genetic influences that give rise to hyperactivity in the hippocampus might underlie a range of disorders with intrusive thoughts as a common symptom.”

In fact, studies have shown that elevated activity in the hippocampus is seen in a broad range of conditions such as PTSD, anxiety and chronic depression, all of which include a pathological inability to control thoughts – such as excessive worrying or rumination.

While the study does not examine any immediate treatments, Professor Anderson believes it could offer a new approach to tackling intrusive thoughts in these disorders. “Most of the focus has been on improving functioning of the prefrontal cortex,” he says, “but our study suggests that if you could improve GABA activity within the hippocampus, this may help people to stop unwanted and intrusive thoughts.”

The research was funded by the Medical Research Council.

Reference
Schmitz, TW et al. Hippocampal GABA enables inhibitory control over unwanted thoughts. Nature Communications; 3 Nov 2017; DOI: 10.1038/s41467-017-00956-z



Oldest Recorded Solar Eclipse Helps Date The Egyptian Pharaohs

Oldest recorded solar eclipse helps date the Egyptian pharaohs

source: www.cam.ac.uk

Researchers have pinpointed the date of what could be the oldest solar eclipse yet recorded. The event, which occurred on 30 October 1207 BC, is mentioned in the Bible and could have consequences for the chronology of the ancient world.

If these words are describing a real observation, then a major astronomical event was taking place – the question for us to figure out is what the text actually means.

Colin Humphreys

Using a combination of the biblical text and an ancient Egyptian text, the researchers were then able to refine the dates of the Egyptian pharaohs, in particular the dates of the reign of Ramesses the Great. The results are published in the Royal Astronomical Society journal Astronomy & Geophysics.

The biblical text in question comes from the Old Testament book of Joshua and has puzzled biblical scholars for centuries. It records that after Joshua led the people of Israel into Canaan – a region of the ancient Near East that covered modern-day Israel and Palestine – he prayed: “Sun, stand still at Gibeon, and Moon, in the Valley of Aijalon. And the Sun stood still, and the Moon stopped, until the nation took vengeance on their enemies.”

“If these words are describing a real observation, then a major astronomical event was taking place – the question for us to figure out is what the text actually means,” said paper co-author Professor Sir Colin Humphreys from the University of Cambridge’s Department of Materials Science & Metallurgy, who is also interested in relating scientific knowledge to the Bible.

“Modern English translations, which follow the King James translation of 1611, usually interpret this text to mean that the sun and moon stopped moving,” said Humphreys, who is also a Fellow of Selwyn College. “But going back to the original Hebrew text, we determined that an alternative meaning could be that the sun and moon just stopped doing what they normally do: they stopped shining. In this context, the Hebrew words could be referring to a solar eclipse, when the moon passes between the earth and the sun, and the sun appears to stop shining. This interpretation is supported by the fact that the Hebrew word translated ‘stand still’ has the same root as a Babylonian word used in ancient astronomical texts to describe eclipses.”

Humphreys and his co-author, Graeme Waddington, are not the first to suggest that the biblical text may refer to an eclipse. However, earlier historians claimed that it was not possible to investigate this possibility further due to the laborious calculations that would have been required.

Independent evidence that the Israelites were in Canaan between 1500 and 1050 BC can be found in the Merneptah Stele, an Egyptian text dating from the reign of the Pharaoh Merneptah, son of the well-known Ramesses the Great. The large granite block, held in the Egyptian Museum in Cairo, says that it was carved in the fifth year of Merneptah’s reign and mentions a campaign in Canaan in which he defeated the people of Israel.

Earlier historians have used these two texts to try to date the possible eclipse, but were not successful as they were only looking at total eclipses, in which the disc of the sun appears to be completely covered by the moon as the moon passes directly between the earth and the sun. What the earlier historians failed to consider was that it was instead an annular eclipse, in which the moon passes directly in front of the sun, but is too far away to cover the disc completely, leading to the characteristic ‘ring of fire’ appearance. In the ancient world, the same word was used for both total and annular eclipses.
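The geometric distinction is a matter of apparent angular size: the Moon’s apparent radius varies with its distance from Earth, and an eclipse can only be total when that radius exceeds the Sun’s. The back-of-the-envelope sketch below uses round physical values; it is not the authors’ eclipse code, which additionally models changes in the Earth’s rotation.

```python
from math import asin, degrees

def angular_radius_deg(radius_km: float, distance_km: float) -> float:
    """Apparent angular radius of a body as seen from Earth."""
    return degrees(asin(radius_km / distance_km))

# Round reference values: Sun at 1 AU, Moon near apogee and perigee.
sun = angular_radius_deg(696_000, 149_600_000)     # ~0.267 degrees
moon_apogee = angular_radius_deg(1_737, 405_500)   # ~0.245 degrees
moon_perigee = angular_radius_deg(1_737, 363_300)  # ~0.274 degrees

for name, moon in [("apogee", moon_apogee), ("perigee", moon_perigee)]:
    kind = "total possible" if moon >= sun else "annular ('ring of fire')"
    print(f"Moon at {name}: {moon:.3f} deg vs Sun {sun:.3f} deg -> {kind}")
```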

The researchers developed a new eclipse code, which takes into account variations in the Earth’s rotation over time. From their calculations, they determined that the only annular eclipse visible from Canaan between 1500 and 1050 BC was on 30 October 1207 BC, in the afternoon. If their arguments are accepted, it would not only be the oldest solar eclipse yet recorded, it would also enable researchers to date the reigns of Ramesses the Great and his son Merneptah to within a year.

“Solar eclipses are often used as a fixed point to date events in the ancient world,” said Humphreys. Using the new calculations, the researchers determined that the reign of Merneptah began in 1210 or 1209 BC. Since it is known from Egyptian texts how long he and his father reigned, this would mean that Ramesses the Great reigned from 1276 to 1210 BC, with a precision of plus or minus one year, making these the most accurate dates available. The precise dates of the pharaohs have been subject to some uncertainty among Egyptologists, but this new calculation, if accepted, could lead to an adjustment in the dates of several of their reigns and enable us to date them precisely.
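Once the eclipse supplies the anchor, the chronological arithmetic itself is simple. A sketch using the figures given here, with BC years written as negative integers and a 66-year reign for Ramesses the Great (the length implied by the 1276-1210 BC range above and attested in Egyptian sources):

```python
# BC years as negative integers (1210 BC -> -1210). The eclipse of 1207 BC,
# combined with the Merneptah Stele, anchors Merneptah's accession; the
# reign length comes from Egyptian sources. Values follow the article.

merneptah_accession = -1210   # 1210 BC (the article allows 1209 BC too)
ramesses_reign_years = 66     # Ramesses the Great's reign length

ramesses_accession = merneptah_accession - ramesses_reign_years

def bc(year: int) -> str:
    return f"{-year} BC"

print(f"Ramesses the Great: {bc(ramesses_accession)} to {bc(merneptah_accession)}")
# -> Ramesses the Great: 1276 BC to 1210 BC
```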

Reference
Colin Humphreys and Graeme Waddington. ‘Solar eclipse of 1207 BC helps to date pharaohs.’ Astronomy & Geophysics (2017). DOI: 10.1093/astrogeo/atx178.



Skin Found To Play a Role In Controlling Blood Pressure

Skin found to play a role in controlling blood pressure

source: www.cam.ac.uk

Skin plays a surprising role in helping regulate blood pressure and heart rate, according to scientists at the University of Cambridge and the Karolinska Institute, Sweden. While this discovery was made in mice, the researchers believe it is likely to be true also in humans.

Nine of ten cases of high blood pressure appear to occur spontaneously, with no known cause

Randall Johnson

In a study published in the open access journal eLife, the researchers show that skin – our largest organ, typically covering two square metres in humans – helps regulate blood pressure and heart rate in response to changes in the amount of oxygen available in the environment.

High blood pressure is associated with cardiovascular disease, such as heart attack and stroke. For the vast majority of cases of high blood pressure, there is no known cause. The condition is often associated with reduced flow of blood through small blood vessels in the skin and other parts of the body, a symptom which can get progressively worse if the hypertension is not treated.

Previous research has shown that when a tissue is starved of oxygen – as can happen in areas of high altitude, or in response to pollution, smoking or obesity, for example – blood flow to that tissue will increase. In such situations, this increase in blood flow is controlled in part by the ‘HIF’ family of proteins.

To investigate what role the skin plays in the flow of blood through small vessels, a team of researchers from Cambridge and Sweden exposed mice to low-oxygen conditions. These mice had been genetically modified so that they are unable to produce certain HIF proteins in the skin.

“Nine of ten cases of high blood pressure appear to occur spontaneously, with no known cause,” says Professor Randall Johnson from the Department of Physiology, Development and Neuroscience at the University of Cambridge. “Most research in this area tends to look at the role played by organs such as the brain, heart and kidneys, and so we know very little about what role other tissues and organs play.

“Our study was set up to understand the feedback loop between skin and the cardiovascular system. By working with mice, we were able to manipulate key genes involved in this loop.”

The researchers found that in mice lacking one of two proteins in the skin (HIF-1α or HIF-2α), the response to low levels of oxygen differed from that of normal mice, affecting heart rate, blood pressure, skin temperature and general levels of activity. Mice lacking specific proteins controlled by the HIFs also responded in a similar way.

In addition, the researchers showed that even the response of normal, healthy mice to oxygen starvation was more complex than previously thought. In the first ten minutes, blood pressure and heart rate rise; this is followed by a period of up to 36 hours in which both decrease below normal levels. By around 48 hours after exposure to low levels of oxygen, blood pressure and heart rate have returned to normal.

Loss of the HIF proteins, or of other proteins involved in the skin’s response to oxygen starvation, was found to dramatically change when this process starts and how long it takes.

“These findings suggest that our skin’s response to low levels of oxygen may have substantial effects on how the heart pumps blood around the body,” adds first author Dr Andrew Cowburn, also from Cambridge. “Low oxygen levels – whether temporary or sustained – are common and can be related to our natural environment or to factors such as smoking and obesity. We hope that our study will help us better understand how the body’s response to such conditions may increase our risk of – or even cause – hypertension.”

Professor Johnson adds: “Given that skin is the largest organ in our body, it perhaps shouldn’t be too surprising that it plays a role in regulating such a fundamental mechanism as blood pressure. But this suggests to us that we may need to take a look at other organs and tissues in the body and see how they, too, are implicated.”

The study was funded by Wellcome.

Reference
Cowburn, AS et al. Cardiovascular adaptation to hypoxia and the role of peripheral resistance. eLife; 19 Oct 2017; DOI: 10.7554/eLife.28755


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

‘Scars’ Left By Icebergs Record West Antarctic Ice Retreat

‘Scars’ left by icebergs record West Antarctic ice retreat

Thousands of marks on the Antarctic seafloor, caused by icebergs which broke free from glaciers more than ten thousand years ago, show how part of the Antarctic Ice Sheet retreated rapidly at the end of the last ice age as it balanced precariously on sloping ground and became unstable. Today, as the global climate continues to warm, rapid and sustained retreat may be close to happening again and could trigger runaway ice retreat into the interior of the continent, which in turn would cause sea levels to rise even faster than currently projected.

Today, the Pine Island and Thwaites glaciers are grounded in a very precarious position, and major retreat may already be happening.

Matthew Wise

Researchers from the University of Cambridge, the British Antarctic Survey and Stockholm University imaged the seafloor of Pine Island Bay, in West Antarctica. They found that, as seas warmed at the end of the last ice age, Pine Island Glacier retreated to a point where its grounding line – the point where it enters the ocean and starts to float – was perched precariously at the end of a slope.

Breakup of the floating ‘ice shelf’ in front of the glacier left tall ice ‘cliffs’ at its edge. The height of these cliffs made them unstable, triggering the release of thousands of icebergs into Pine Island Bay, and causing the glacier to retreat rapidly until its grounding line reached a restabilising point in shallower water.

Today, as warming waters caused by climate change flow underneath the floating ice shelves in Pine Island Bay, the Antarctic Ice Sheet is once again at risk of losing mass from rapidly retreating glaciers. Significantly, if ice retreat is triggered, there are no relatively shallow points in the ice sheet bed along the course of Pine Island and Thwaites glaciers to prevent possible runaway ice retreat into the interior of West Antarctica. The results are published in the journal Nature.

“Today, the Pine Island and Thwaites glaciers are grounded in a very precarious position, and major retreat may already be happening, caused primarily by warm waters melting from below the ice shelves that jut out from each glacier into the sea,” said Matthew Wise of Cambridge’s Scott Polar Research Institute, and the study’s first author. “If we remove these buttressing ice shelves, unstable ice thicknesses would cause the grounded West Antarctic Ice Sheet to retreat rapidly again in the future. Since there are no potential restabilising points further upstream to stop any retreat from extending deep into the West Antarctic hinterland, this could cause sea-levels to rise faster than previously projected.”

Pine Island Glacier and the neighbouring Thwaites Glacier are responsible for nearly a third of total ice loss from the West Antarctic Ice Sheet, and this contribution has increased greatly over the past 25 years. In addition to basal melt, the two glaciers also lose ice by breaking off, or calving, icebergs into Pine Island Bay.

Today, the icebergs that break off from Pine Island and Thwaites glaciers are mostly large table-like blocks, which cause characteristic ‘comb-like’ ploughmarks as these large multi-keeled icebergs grind along the sea floor. By contrast, during the last ice age, hundreds of comparatively small icebergs broke free of the Antarctic Ice Sheet and drifted into Pine Island Bay. These smaller icebergs had a V-shaped keel, like that of a ship, and left long, deep single scars in the sea floor.

High-resolution imaging techniques, used to investigate the shape and distribution of ploughmarks on the sea floor in Pine Island Bay, allowed the researchers to determine the relative size and drift direction of icebergs in the past. Their analysis showed that these smaller icebergs were released due to a process called marine ice-cliff instability (MICI). More than 12,000 years ago, Pine Island and Thwaites glaciers were grounded on top of a large wedge of sediment and were buttressed by a floating ice shelf, making them relatively stable even though they rested below sea level.

Eventually, the floating ice shelf in front of the glaciers ‘broke up’, which caused them to retreat onto land sloping downward from the grounding lines to the interior of the ice sheet. This exposed tall ice ‘cliffs’ of unstable height at their margin, and resulted in the rapid retreat of the glaciers through marine ice-cliff instability between 12,000 and 11,000 years ago. This occurred under climate conditions that were relatively similar to those of today.
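
The physics behind this instability can be sketched with a back-of-envelope calculation (ours, not the paper’s analysis). A dry ice cliff of height H exerts a shear stress of roughly ρgH/2 at its base; once that exceeds the strength of heavily crevassed ice, the face fails. Both the stress estimate and the 0.5 MPa strength below are illustrative assumptions.

RHO_ICE = 917.0    # kg/m^3, density of glacier ice
G = 9.81           # m/s^2, gravitational acceleration
TAU_YIELD = 0.5e6  # Pa, assumed shear strength of heavily crevassed ice

def max_stable_cliff_height() -> float:
    """Height (m) at which the basal shear stress rho*g*H/2 reaches TAU_YIELD."""
    return 2.0 * TAU_YIELD / (RHO_ICE * G)

print(f"Maximum stable exposed cliff: ~{max_stable_cliff_height():.0f} m")

The answer, around 110 metres, is consistent with the roughly 100-metre threshold often quoted in the ice-cliff-instability literature: an exposed face much taller than this collapses under its own weight, shedding icebergs until the cliff shortens or the glacier retreats to a stabler position.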

“Ice-cliff collapse has been debated as a theoretical process that might cause West Antarctic Ice Sheet retreat to accelerate in the future,” said co-author Dr Robert Larter, from the British Antarctic Survey. “Our observations confirm that this process is real and that it occurred about 12,000 years ago, resulting in rapid retreat of the ice sheet into Pine Island Bay.”

Today, the two glaciers are getting ever closer to the point where they may become unstable, resulting once again in rapid ice retreat.

The research has been funded in part by the UK Natural Environment Research Council (NERC).

Reference: 
Matthew G. Wise et al. ‘Evidence of marine ice-cliff instability in Pine Island Bay from iceberg-keel plough marks.’ Nature (2017). DOI: 10.1038/nature24458


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

Machine Learning Used To Predict Earthquakes In a Lab Setting

Machine learning used to predict earthquakes in a lab setting

source: www.cam.ac.uk

A group of researchers from the UK and the US have used machine learning techniques to successfully predict earthquakes. Although their work was performed in a laboratory setting, the experiment closely mimics real-life conditions, and the results could be used to predict the timing of a real earthquake.

This is the first time that machine learning has been used to analyse acoustic data to predict when an earthquake will occur.

Colin Humphreys

The team, from the University of Cambridge, Los Alamos National Laboratory and Boston University, identified a hidden signal leading up to earthquakes and used this ‘fingerprint’ to train a machine learning algorithm to predict future earthquakes. Their results, which could also be applied to avalanches, landslides and more, are reported in the journal Geophysical Research Letters.

For geoscientists, predicting the timing and magnitude of an earthquake is a fundamental goal. Generally speaking, pinpointing where an earthquake will occur is fairly straightforward: if an earthquake has struck a particular place before, the chances are it will strike there again. The questions that have challenged scientists for decades are how to pinpoint when an earthquake will occur, and how severe it will be. Over the past 15 years, advances in instrument precision have been made, but a reliable earthquake prediction technique has not yet been developed.

As part of a project searching for ways to use machine learning techniques to make gallium nitride (GaN) LEDs more efficient, the study’s first author, Bertrand Rouet-Leduc, who was then a PhD student at Cambridge, moved to Los Alamos National Laboratory in New Mexico to start a collaboration on machine learning in materials science between Cambridge University and Los Alamos. From there the team started helping the Los Alamos Geophysics group on machine learning questions.

The team at Los Alamos, led by Paul Johnson, studies the interactions among earthquakes, precursor quakes (often very small earth movements) and faults, with the hope of developing a method to predict earthquakes. Using a lab-based system that mimics real earthquakes, the researchers used machine learning techniques to analyse the acoustic signals coming from the ‘fault’ as it moved and search for patterns.

The laboratory apparatus uses steel blocks to closely mimic the physical forces at work in a real earthquake, and also records the seismic signals and sounds that are emitted. Machine learning is then used to find the relationship between the acoustic signal coming from the fault and how close it is to failing.

The machine learning algorithm was able to identify a particular pattern in the sound, previously thought to be nothing more than noise, which occurs long before an earthquake. The characteristics of this sound pattern can be used to give a precise estimate (within a few percent) of the stress on the fault (that is, how much force it is under) and to estimate the time remaining before failure, which becomes more and more precise as failure approaches. The team now thinks that this sound pattern is a direct measure of the elastic energy in the system at a given time.
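
In outline, the approach takes a moving window over the continuous acoustic record, computes simple statistics for each window, and regresses the time remaining before failure on those statistics. The sketch below is our reconstruction of that idea, not the authors’ code: the arrays stand in for the lab recordings, the window size is arbitrary, and the choice of a random forest regressor is an assumption on our part.

import numpy as np
from scipy.stats import kurtosis
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

def window_features(signal: np.ndarray, window: int) -> np.ndarray:
    """Per-window statistics of the acoustic signal; variance is the kind of
    feature reported as most predictive in this line of work."""
    n = len(signal) // window
    chunks = signal[: n * window].reshape(n, window)
    return np.column_stack([
        chunks.var(axis=1),
        chunks.std(axis=1),
        kurtosis(chunks, axis=1),
    ])

# Hypothetical stand-ins for the continuous lab data (random noise carries
# no real precursor signal; the point is only the shape of the pipeline):
rng = np.random.default_rng(0)
acoustic = rng.normal(size=1_000_000)
time_to_failure = np.linspace(10.0, 0.0, 1_000_000)

window = 1_000
X = window_features(acoustic, window)
y = time_to_failure[window - 1 :: window][: len(X)]  # label at each window's end

X_train, X_test, y_train, y_test = train_test_split(X, y, shuffle=False)
model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X_train, y_train)
print("R^2 on held-out windows:", model.score(X_test, y_test))

On real recordings, the idea is that the same pipeline yields a continuously updated estimate of time-to-failure from nothing more than the current window of sound.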

“This is the first time that machine learning has been used to analyse acoustic data to predict when an earthquake will occur, long before it does, so that plenty of warning time can be given – it’s incredible what machine learning can do,” said co-author Professor Sir Colin Humphreys of Cambridge’s Department of Materials Science & Metallurgy, whose main area of research is energy-efficient and cost-effective LEDs. Humphreys was Rouet-Leduc’s supervisor when he was a PhD student at Cambridge.

“Machine learning enables the analysis of datasets too large to handle manually and looks at data in an unbiased way that enables discoveries to be made,” said Rouet-Leduc.

Although the researchers caution that there are multiple differences between a lab-based experiment and a real earthquake, they hope to progressively scale up their approach by applying it to real systems which most resemble their lab system. One such site is in California along the San Andreas Fault, where characteristic small repeating earthquakes are similar to those in the lab-based earthquake simulator. Progress is also being made on the Cascadia fault in the Pacific Northwest of the United States and British Columbia, Canada, where repeating slow earthquakes that occur over weeks or months are also very similar to laboratory earthquakes.

“We’re at a point where huge advances in instrumentation, machine learning, faster computers and our ability to handle massive data sets could bring about huge advances in earthquake science,” said Rouet-Leduc.

Reference: 
Bertrand Rouet-Leduc et al. ‘Machine Learning Predicts Laboratory Earthquakes.’ Geophysical Research Letters (2017). DOI: 10.1002/2017GL074677


Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

‘Handful of Changes’ Make Cancer

‘Handful of changes’ make cancer

source: http://www.bbc.co.uk/news/health-41644020


British scientists have worked out how many changes it takes to transform a healthy cell into a cancerous one.

The team, at the Wellcome Trust Sanger Institute, showed the answer was a tiny handful, between one and 10 mutations depending on the type of tumour.

It has been one of the most hotly debated issues in cancer science for decades.

The findings, published in the journal Cell, could improve treatment for patients.

If you played spot the difference between a cancer and healthy tissue, you could find tens of thousands of differences – or mutations – in the DNA.

Some are driving the cancer’s growth, while others are just along for the ride. So which ones are important?

Root cause

The researchers analysed the DNA from 7,664 tumours to find “driver mutations” that allow a cell to be more selfish, aggressive and cancerous.

They showed it could take:

  • just one mutation to drive thyroid and testicular cancers
  • four mutations to make a breast or liver cancer
  • 10 mutations to create a colorectal cancer.

Dr Peter Campbell, one of the researchers, told the BBC News website: “We’ve known about the genetic basis of cancer for many decades now, but how many mutations are responsible has been incredibly hotly debated.

“What we’ve been able to do in this study is really provide the first unbiased numbers.

“And it seems that of the thousands of mutations in a cancer genome, only a small handful are responsible for dictating the way the cell behaves, what makes it cancerous.”

Half the mutations identified were in sets of genetic instructions – or genes – that had never been implicated in cancer before.

Therapy

The long-term goal is to advance precision cancer treatment.

If doctors knew which few mutations, out of the thousands present, were driving a patient’s cancer, they could use drugs that specifically target those mutations.

Drugs such as Herceptin and BRAF inhibitors are already used to attack specific mutations in tumours.

The researchers were able to pick out the mutations that were driving the growth of cancer by turning to Charles Darwin and evolutionary theory.

In essence, driver mutations should appear more often in tumours than “neutral” mutations that do not make the cell cancerous.

This is because the forces of natural selection give an evolutionary advantage to mutations that help a cell grow and divide more readily.
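
In code, a toy version of that test looks like this (our illustration, not the Sanger team’s method, which uses a full dN/dS model correcting for sequence context, gene length and local mutation rate). A gene is a driver candidate when it carries significantly more protein-altering mutations than the neutral background predicts; the gene names and counts below are invented.

from scipy.stats import poisson

def excess_ratio(observed: int, expected: float) -> tuple[float, float]:
    """Observed/expected mutation ratio for a gene, with a one-sided
    Poisson p-value for seeing 'observed' or more under neutrality."""
    ratio = observed / expected
    p_value = poisson.sf(observed - 1, expected)  # P(X >= observed)
    return ratio, p_value

# Hypothetical (gene, observed protein-altering, neutrally expected) counts:
for gene, obs, exp in [("GENE_A", 120, 30.0), ("GENE_B", 33, 29.5)]:
    ratio, p_value = excess_ratio(obs, exp)
    verdict = "candidate driver" if p_value < 1e-3 else "consistent with neutral"
    print(f"{gene}: obs/exp = {ratio:.2f}, p = {p_value:.2e} -> {verdict}")

GENE_A, with four times the expected number of mutations, is flagged; GENE_B, sitting at the neutral expectation, is not.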

Dr Nicholas McGranahan, from Cancer Research UK and the UCL Cancer Institute, said the approach was “elegant”.

He said: “Cancer is a disease that evolves and changes over time, and it makes sense to use ideas like this from species evolution to work out the genetic faults that cause cancer to grow.

“But as this study focuses on one part of cancer evolution, it can only give us insight into part of the puzzle.

“Other components such as how DNA is packaged into chromosomes are also key in how a tumour progresses and will need to be looked at to give us a clearer picture of how cancer evolves.”
