Unseen Is Free


Tuesday, January 31, 2017

Nordic Countries Are Bringing About an Energy Transition Worth Copying

What can we learn from the Nordic low-carbon energy transition given the new US leadership vacuum on climate change? A new study by Benjamin K Sovacool offers some important lessons.

The Trump administration’s “America First Energy Plan” criticises the “burdensome” regulations on the energy industry and aims to eliminate “harmful and unnecessary policies such as the Climate Action Plan” introduced by President Barack Obama. The administration has also deleted all mentions of climate change and global warming from the White House website.

Given the American leadership vacuum on energy and climate change, national and local planners looking to bring about energy transitions will need to look elsewhere. Five Nordic countries – Denmark, Finland, Iceland, Norway and Sweden – could hold answers for how to make the transition to a more energy-efficient society that generates its energy from renewables. About 83% of electricity generation in the Nordic countries is low-carbon, and 63% comes from renewable sources. The Nordic countries are also facilitating low-carbon transitions across other sectors, including heat, buildings, industry, and transport.

Samsø, a Danish island, generates all its electricity from wind power and biomass.
Credit: University of Sussex

A new study outlines broad lessons for how this transition could be replicated elsewhere.

The energy transition pays for itself (if you factor in the costs of air pollution)

The total estimated cost of the Nordic energy transition is roughly $357 billion more than business as usual, which amounts to less than 1 percent of cumulative GDP between now and 2050. Spread over that period, the extra spending works out to roughly $10 billion a year, and almost all of it will be offset by fuel savings. The external costs of the health impacts of air pollution alone in the Nordic countries (about $9 to $14 billion annually) are roughly equal to the additional annual investment needed to achieve a carbon-neutral scenario.

Trade and interconnection with other countries are key for reaching energy targets

Trade and interconnection with Europe are instrumental to the Nordic countries reaching their carbon and energy targets. Nordic electricity trade must expand considerably, underscoring the need for parallel, coordinated grid development and interconnections with Great Britain, the Netherlands, Germany, Poland, Lithuania, Latvia, and Estonia. “It’s as much a regional governance or European challenge as it is a national priority for individual Nordic states,” says Sovacool, a Professor of Energy Policy at the University of Sussex's Science Policy Research Unit and Director of the Centre on Innovation and Energy Demand.

Cities and municipalities take the lead

Cities and municipalities, or ‘subnational actors’, have taken the lead in driving decarbonisation across the electricity and heat, energy efficiency, transport and industry sectors, especially given that urbanisation across the Nordic region is expected to occur at double the rate of previous decades. It is cities that will need to invest in new buildings, sponsor retrofits, erect electric vehicle charging infrastructure, and optimise heat networks.

Energy transitions take generations

Even for the Nordic countries, which are relatively wealthy, small, and committed, the transition will take at least three to four more decades. Its success rests on a number of technological contingencies and breakthroughs, each of which will take time. These include a continued phase-out of nuclear power; a rapid ramping up of onshore and offshore wind energy; a spectacular diffusion of electric vehicles; a massive increase in bioenergy production; and the commercialization of industrial-scale carbon capture and storage. On top of this, households and consumers must adopt better energy management systems, and industrial planners must install newer cement kilns and electric arc furnaces and switch feedstocks for chemicals, petrochemicals, and paper and pulp.

Transitions are contingent on other factors, contested and potentially unjust

For all of its promise, the Nordic transition is contingent upon, and unique to, its own sociotechnical environment. The Nordic countries are endowed with plentiful fossil fuels that they can export to generate revenue, which they funnel back into the domestic decarbonisation process, and they have a history of strong energy and climate planning and high fuel and electricity prices.

While the Nordic low-carbon transition has generally been successful and has benefited most of society, the paper also identifies losers in the transition, including those set to lose their jobs as fossil fuels are displaced. Other potential obstacles include a lack of understanding among some citizens about energy and climate topics, and the outsourcing of embodied carbon emissions overseas.



Contacts and sources:
Suzanne Fisher-Murray
University of Sussex

Gene for the “Sixth Sense” Found

With the help of two young patients with a unique neurological disorder, an initial study by scientists at the National Institutes of Health suggests that a gene called PIEZO2 controls specific aspects of human touch and proprioception, a “sixth sense” describing awareness of one’s body in space. Mutations in the gene caused the two to have movement and balance problems and the loss of some forms of touch. Despite their difficulties, they both appeared to cope with these challenges by relying heavily on vision and other senses.

“Our study highlights the critical importance of PIEZO2 and the senses it controls in our daily lives,” said Carsten G. Bönnemann, M.D., senior investigator at the NIH’s National Institute of Neurological Disorders and Stroke (NINDS) and a co-leader of the study published in The New England Journal of Medicine. “The results establish that PIEZO2 is a touch and proprioception gene in humans. Understanding its role in these senses may provide clues to a variety of neurological disorders.”

An NIH study shows that two young patients with a mutation in the PIEZO2 gene have problems with touch and proprioception, or body awareness.
Photos Courtesy of Bönnemann lab, NIH/NINDS, Bethesda, MD


Dr. Bönnemann’s team uses cutting edge genetic techniques to help diagnose children around the world who have disorders that are difficult to characterize. The two patients in this study are unrelated, one nine and the other 19 years old. They have difficulties walking; hip, finger and foot deformities; and abnormally curved spines diagnosed as progressive scoliosis.

Working with the laboratory of Alexander T. Chesler, Ph.D., investigator at NIH’s National Center for Complementary and Integrative Health (NCCIH), the researchers discovered that the patients have mutations in the PIEZO2 gene that appear to block the normal production or activity of Piezo2 proteins in their cells. Piezo2 is what scientists call a mechanosensitive protein because it generates electrical nerve signals in response to changes in cell shape, such as when skin cells and neurons of the hand are pressed against a table. Studies in mice suggest that Piezo2 is found in the neurons that control touch and proprioception.

“As someone who studies Piezo2 in mice, working with these patients was humbling,” said Dr. Chesler. “Our results suggest they are touch-blind. The patient’s version of Piezo2 may not work, so their neurons cannot detect touch or limb movements.”

Further examinations at the NIH Clinical Center suggested the young patients lack body awareness. Blindfolding them made walking extremely difficult, causing them to stagger and stumble from side to side while assistants prevented them from falling. When the researchers compared the two patients with unaffected volunteers, they found that blindfolding the young patients made it harder for them to reliably reach for an object in front of their faces than it was for the volunteers. Without looking, the patients could not guess the direction their joints were being moved as well as the control subjects could.

The patients were also less sensitive to certain forms of touch. They could not feel vibrations from a buzzing tuning fork as well as the control subjects could. Nor could they tell the difference between one or two small ends of a caliper pressed firmly against their palms. Brain scans of one patient showed no response when the palm of her hand was brushed.


Credit: NIH IRP (Intramural Research Program at the National Institutes of Health)


Nevertheless, the patients could feel other forms of touch. Stroking or brushing hairy skin is normally perceived as pleasant. Although they both felt the brushing of hairy skin, one claimed it felt prickly instead of the pleasant sensation reported by unaffected volunteers. Brain scans showed different activity patterns in response to brushing between unaffected volunteers and the patient who felt prickliness.

Despite these differences, the patients’ nervous systems appeared to be developing normally. They were able to feel pain, itch, and temperature normally; the nerves in their limbs conducted electricity rapidly; and their brains and cognitive abilities were similar to the control subjects of their age.

“What’s remarkable about these patients is how much their nervous systems compensate for their lack of touch and body awareness,” said Dr. Bönnemann. “It suggests the nervous system may have several alternate pathways that we can tap into when designing new therapies.”

Previous studies found that mutations in PIEZO2 may have various effects on the Piezo2 protein that may result in genetic musculoskeletal disorders, including distal arthrogryposis type 5, Gordon Syndrome, and Marden-Walker Syndrome. Drs. Bönnemann and Chesler concluded that the scoliosis and joint problems of the patients in this study suggest that Piezo2 is either directly required for the normal growth and alignment of the skeletal system or that touch and proprioception indirectly guide skeletal development.

“Our study demonstrates that bench and bedside research are connected by a two-way street,” said Dr. Chesler. “Results from basic laboratory research guided our examination of the children. Now we can take that knowledge back to the lab and use it to design future experiments investigating the role of PIEZO2 in nervous system and musculoskeletal development.”

This work was supported by the NCCIH and NINDS intramural research programs.



Contacts and sources:
National Center for Complementary and Integrative Health (NCCIH)


Citation: Chesler AT, et al. The role of PIEZO2 in human mechanosensation. New England Journal of Medicine. September 14, 2016. DOI: 10.1056/NEJMoa1602812

Underwater Gliders to Reach New Depths for Ocean Monitoring

An EU-funded project is developing two new deep-water gliders that would expand the ability of scientists and industry to measure the environmental impact of commercial activities such as drilling for oil and gas at sea. The autonomous gliders, essentially deep-sea drones, would also be able to extract better and more meaningful data from greater depths. The insights gained from improved ocean monitoring would contribute to the management of maritime resources.

Autonomous underwater gliders use internal buoyancy controls to travel to depth and back up again over long periods of time, while carrying sensors to continuously measure properties such as water temperature, salinity and pollution. Marine scientists use them as a cost-effective means of monitoring and understanding the ocean environment.

BRIDGES aims to expand their range and usefulness to both researchers and industry. The project is developing and testing two prototype gliders. One will be able to dive to 2 400 metres, the other to 5 000 metres. The gliders’ design will be based on the experience gained from Europe’s only deep-water glider, the SeaExplorer, which was developed by one of the project’s partners and can reach a depth of 1 000 metres.
 
  Image courtesy of ALSEAMAR

The two gliders will be able to carry a wider range of more sophisticated sensors, some of which are being developed by the project. Sensors designed for use by oil and gas companies and by deep-sea mining companies would be able to measure the environmental impact of their operations. Others could be used to discover new sources of petroleum or minerals.

For example, the project is developing a sensor to sample water for the presence of hydrocarbons or methane. Another will be able to measure nitrate, phosphate, ammonia and silicate levels. An acoustic system on the glider would be able to gather information on the characteristics of the seabed. An automatic water sampler will allow scientists to take direct measurements on 100 ml samples taken at specific depths.



The project envisages the provision of an oil and gas monitoring service through a package of sensors attached to a glider. For example, an underwater glider could be used as an early warning system for detecting deep-sea oil spills and seepages. Another package of sensors would allow scientists to measure biodiversity, eutrophication, and the extent of food networks in a particular area of an ocean or sea.

The project is designing both gliders in a modular format so different packages of sensors and batteries can easily be switched between missions, which can last for up to two months. The gliders will also be outfitted with both a buoyancy system and a propeller engine, providing them with a wider range of control and movement at depth.


Contacts and sources:
EC Research and Innovation

Prediction Of Large Earthquake Probability Improved

As part of the “Research in Collaborative Mathematics” project run by the Obra Social ”la Caixa”, researchers of the Mathematics Research Centre (CRM) and the UAB have developed a mathematical law to explain the size distribution of earthquakes, even in the cases of large-scale earthquakes such as those which occurred in Sumatra (2004) and in Japan (2011).

The probability of an earthquake occurring decreases exponentially as its magnitude increases. Fortunately, mild earthquakes are more probable than devastatingly large ones. This relation between probability and earthquake magnitude follows a mathematical curve called the Gutenberg-Richter law, and helps seismologists predict the probability of an earthquake of a specific magnitude occurring in some part of the planet.
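For reference, the classical form of the relation can be written as a simple log-linear law. The statement below is the textbook version, not an expression taken from the new paper, and the b ≈ 1 figure is a commonly cited empirical value rather than one quoted in the article.

```latex
% Classical Gutenberg-Richter relation: the number N of earthquakes with
% magnitude at least M in a given region and time window falls off
% exponentially (log-linearly) with M.
\log_{10} N(\geq M) = a - b\,M
% a : overall seismicity level of the region
% b : decay rate, empirically close to 1, so each extra unit of magnitude
%     makes an event roughly ten times rarer
```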

A village near the coast of Sumatra lays in ruin after the Tsunami that struck South East Asia.
Credit: Wikipedia /U.S. Navy

The law, however, lacks the tools needed to describe extreme situations. For example, although the probability of a magnitude-12 earthquake is zero, since technically this would imply the Earth breaking in half, the mathematics of the Gutenberg-Richter law do not rule out a magnitude-14 earthquake.

“The limitations of the law are determined by the fact that the Earth is finite, and the law describes ideal systems, on a planet with an infinite surface”, explains Isabel Serra, first author of the article, researcher at CRM and affiliate lecturer of the UAB Department of Mathematics.

To overcome these shortcomings, the researchers studied a small modification of the Gutenberg-Richter law: an extra term that modifies the curve precisely in the region where the probabilities are smallest. “This modification has important practical effects when estimating the risks or evaluating possible economic losses. Preparing for a catastrophe where the losses could be, in the worst of cases, very high in value is not the same as not being able to calculate an estimated maximum value”, clarifies co-author Álvaro Corral, researcher at the Mathematics Research Centre and the UAB Department of Mathematics.
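One widely used way of adding such a corner term, shown here purely for orientation, is the tapered (exponentially damped) form of the Gutenberg-Richter distribution for seismic moment. This is a standard construction in statistical seismology, with conventional symbols (m_min, m_c, beta), and is not necessarily the exact expression adopted by Serra and Corral.

```latex
% Tapered Gutenberg-Richter (tapered Pareto) survival function for the
% seismic moment m >= m_min -- an illustrative, standard form, not the
% paper's own expression.
S(m) = \left(\frac{m}{m_{\min}}\right)^{-\beta}
       \exp\!\left(-\frac{m - m_{\min}}{m_c}\right)
% beta : power-law exponent of the pure Gutenberg-Richter regime
% m_c  : "corner" moment; the exponential factor leaves small and moderate
%        earthquakes essentially unchanged but drives the probability of
%        physically implausible, extremely large events towards zero
```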

Obtaining the mathematical curve that best fits the registered earthquake data is not an easy task when dealing with large tremors. From 1950 to 2003 there were only seven earthquakes measuring higher than 8.5 on the Richter scale, and since 2004 there have only been six. Although we are now in a more active period following the Sumatra earthquake, the number of cases is still small, which makes the statistics poorer. Thus, the mathematical treatment of the problem becomes much more complex than when there is an abundance of data. For Corral, “this is where the role of mathematics is fundamental to complement the research of seismologists and guarantee the accuracy of the studies”. According to the researcher, the approach currently used to analyse seismic risk is not fully correct and, in fact, there are many risk maps that are downright incorrect, “which is what happened with the Tohoku earthquake of 2011, where the risk assigned to the area was underestimated”. “Our approach has corrected some things, but we are still far from being able to give correct results in specific regions”, Corral continues.

The mathematical expression of the law for the seismic moment, proposed by Serra and Corral, meets all the conditions needed to determine the probability of both smaller and larger earthquakes, fitting the most recent and extreme cases of Tohoku, in Japan (2011), and Sumatra, in Indonesia (2004), while assigning negligible probabilities to earthquakes of disproportionate magnitudes.

The modified Gutenberg-Richter law has also been used to begin exploring applications in the financial world. Isabel Serra worked in this field before beginning to study earthquakes mathematically. “The risk assessment of a firm's economic losses is a subject insurance companies take very seriously, and the behaviour is similar: the probability of suffering losses decreases as the volume of losses increases, following a law similar to that of Gutenberg-Richter, but there are limit values which these laws do not take into consideration, since no matter how big the amount, the probability of losses of that amount is never zero”, Serra explains. “That makes the ‘expected value of losses’ enormous. To solve this, changes would have to be made to the law similar to those we introduced for earthquakes”.

The research was published in the Scientific Reports journal, from the same publishers as Nature, and received funding from the MINECO, AGAUR and the “Research in Collaborative Mathematics” project run by the “la Caixa” Foundation.




Contacts and sources: 
Universitat Autònoma de Barcelona

Citation: Isabel Serra & Álvaro Corral. Deviation from power law of the global seismic moment distribution. Scientific Reports 7, Article number: 40045 (2017) DOI:10.1038/srep40045.

Acupuncture Boosts Effectiveness of Standard Medical Care for Chronic Pain and Depression

Health specialists at the University of York have found that acupuncture treatment can boost the effectiveness of standard medical care, lessening the severity of chronic pain and depression.

In a report published in the National Institute for Health Research (NIHR) Journals Library, the researchers showed that there is significant evidence to demonstrate that acupuncture provides more than a placebo effect.

Professor of Acupuncture Research, Hugh MacPherson, working with a team of scientists from the UK and US, brought together the results of 29 high quality clinical trials focused on patients treated with acupuncture and standard medical care.

In the majority of these trials, patients with chronic pain treated with acupuncture and standard medical care were tested against those who were provided with standard medical care alone, such as anti-inflammatory drugs and physiotherapy. The trials involved approximately 18,000 patients diagnosed with chronic pain of the neck, lower back, head, and knee.

Credit: NIH

The report shows that the addition of acupuncture compared to standard medical care alone significantly reduced the number of headaches and migraine attacks and reduced the severity of neck and lower back pain. It also showed that acupuncture reduced the pain and disability of osteoarthritis, which led to patients being less reliant on anti-inflammatory tablets to control pain.

The study also concluded that acupuncture is cost effective, with its cost per quality-adjusted life year falling below the £20,000 threshold used by the National Institute for Health and Care Excellence (NICE) to judge value for money.

Professor MacPherson, from the University of York’s Department of Health Sciences, said: “There has been an increase in practitioners using acupuncture as an intervention. Approximately four million acupuncture treatments are provided a year in the UK, but the evidence to show how clinically effective this form of treatment is has been limited.

“There has been a question mark for many years over whether policy and decision makers should or should not provide wider access to acupuncture. Our aim was to bring together data from high quality clinical trials and provide a robust evidence base that will help reduce this uncertainty and support commissioners and health professionals in making informed decisions backed up with research.”

The team also conducted a new clinical trial for depression, where acupuncture or counselling was provided and compared to the effectiveness of medication, such as antidepressants.

In a study of 755 patients with depression in the North of England, researchers showed that both acupuncture and counselling significantly reduced the severity of depression and that these benefits were largely sustained for up to 12 months after treatment.

Professor MacPherson said: “The front-line treatment for depression in primary care usually involves antidepressants; however, they do not work well for more than half of patients.

“In the largest study of its kind, we have now provided a solid evidence base to show that not only can acupuncture and counselling bring patients out of an episode of depression, but it can keep the condition at bay for up to a year on average.”

The benefits of acupuncture are partially associated with placebo effects, which has contributed to the uncertainty around acupuncture’s clinical effectiveness. Professor MacPherson states, however, that this new research provides definitive evidence that when acupuncture is used to treat chronic pain, the reductions in pain are substantially more than those measured from sham (placebo) acupuncture.

Used only in clinical trials for research purposes, sham acupuncture involves inserting needles at the ‘wrong’ locations, or using non-inserted (fake) needles at the correct locations. The fact that ‘true’ acupuncture reduces pain significantly more than sham acupuncture provides evidence that acupuncture is not simply a placebo effect.

Professor MacPherson added: “Our new data provides a significant step forward in treating chronic pain and managing depression, because patients and health professionals can now make decisions on acupuncture with more confidence. Not only is it more cost effective, but it reduces pain levels and improves mood levels, which could reduce over-reliance on drugs that can sometimes result in unwanted side effects.”



Contacts and sources:
University of York


Citation: Acupuncture for chronic pain and depression in primary care: a programme of research.
Authors: Hugh MacPherson, Andrew Vickers, Martin Bland, David Torgerson, Mark Corbett, Eldon Spackman, Pedro Saramago, Beth Woods, Helen Weatherly, Mark Sculpher, Andrea Manca, Stewart Richmond, Ann Hopton, Janet Eldred and Ian Watt. DOI: 10.3310/pgfar05030 https://www.journalslibrary.nihr.ac.uk/pgfar/pgfar05030#/abstract

DragonflEye Hybrid Drone Is a Dragonfly Controlled by High Tech Backpack

The smallest aerial drones mimic insects in many ways, but none can match the efficiency and maneuverability of the dragonfly. Now, engineers at Draper are creating a new kind of hybrid drone by combining miniaturized navigation, synthetic biology and neurotechnology to guide dragonfly insects. The system looks like a backpack for a dragonfly.

DragonflEye, an internal research and development project at Draper, is already showing promise as a way to guide the flightpath of dragonflies. Potential applications of the technologies underpinning DragonflEye include guided pollination, payload delivery, reconnaissance and even precision medicine and diagnostics.

“DragonflEye is a totally new kind of micro-aerial vehicle that’s smaller, lighter and stealthier than anything else that’s manmade,” said Jesse J. Wheeler, biomedical engineer at Draper and principal investigator on the program. “This system pushes the boundaries of energy harvesting, motion sensing, algorithms, miniaturization and optogenetics, all in a system small enough for an insect to wear.”

A first generation backpack guidance system that includes energy harvesting, navigation & optical stimulation on a to-scale model of a dragonfly

Credit: Draper Laboratory

DragonflEye has been a team effort between Draper and Howard Hughes Medical Institute (HHMI) at Janelia Research Campus to create new optogenetic tools that send guidance commands from the backpack to special “steering” neurons inside the dragonfly nerve cord.

Research at HHMI—led by Anthony Leonardo, Janelia Research Campus group leader—has led to a deeper understanding of “steering” neurons in the nervous system of the dragonfly that control flight. HHMI is applying techniques in synthetic biology to make these “steering” neurons sensitive to light by inserting genes similar to those naturally found in the eye.

Draper is developing tiny optical structures, called optrodes, that can activate the special “steering” neurons with pulses of light piped into the nerve cord from the dragonfly’s backpack. Traditional optical fibers are too stiff to be wrapped around the tiny dragonfly nerve cord, so Draper developed innovative flexible optrodes that can bend light around sub-millimeter turns. These optrodes will enable precise and targeted neural activation without disrupting the thousands of nearby neurons.

“Someday these same tools could advance medical treatments in humans, resulting in more effective therapies with fewer side effects,” said Wheeler. “Our flexible optrode technology provides a new solution to enable miniaturized diagnostics, safely access smaller neural targets and deliver higher precision therapies.”

A close up of the backpack board and components before being folded and fitted to the dragonfly 
Credit: Draper Laboratory

Draper’s work on the DragonflEye program builds on its legacy in autonomous systems, microsystems, biomedical solutions and materials engineering and microfabrication. This deep expertise extended previous Janelia Research Campus work in energy harvesting and miniaturization to create the insect-scale autonomous navigation and neuromodulation system.

DragonflEye provides opportunities to put technology on some of nature’s most agile insects. For instance, honeybees, whose population has collapsed by half in the last 25 years, could be equipped with Draper’s technology to assist with pollination. One of nature’s greatest pollinators, honeybees contribute more than $15 billion to the value of U.S. agriculture every year. Draper’s tiny guidance system could help stem the loss of pollinators by monitoring their flight patterns, migration and overall health.


Contacts and sources:
The Charles Stark Draper Laboratory, Inc.

Fermi Sees Gamma Rays from Far Side Solar Flares


An international science team says NASA's Fermi Gamma-ray Space Telescope has observed high-energy light from solar eruptions located on the far side of the sun, which should block direct light from these events. This apparent paradox is providing solar scientists with a unique tool for exploring how charged particles are accelerated to nearly the speed of light and move across the sun during solar flares.

"Fermi is seeing gamma rays from the side of the sun we're facing, but the emission is produced by streams of particles blasted out of solar flares on the far side of the sun," said Nicola Omodei, a researcher at Stanford University in California. "These particles must travel some 300,000 miles within about five minutes of the eruption to produce this light."



Omodei presented the findings on Monday, Jan. 30, at the American Physical Society meeting in Washington, and a paper describing the results will be published online in The Astrophysical Journal on Jan. 31.

Fermi has doubled the number of these rare events, called behind-the-limb flares, since it began scanning the sky in 2008. Its Large Area Telescope (LAT) has captured gamma rays with energies reaching 3 billion electron volts, some 30 times greater than the most energetic light previously associated with these "hidden" flares.

Thanks to NASA's Solar Terrestrial Relations Observatory (STEREO) spacecraft, which were monitoring the solar far side when the eruptions occurred, the Fermi events mark the first time scientists have had direct imaging of beyond-the-limb solar flares associated with high-energy gamma rays.
Credit: NASA

"Observations by Fermi's LAT continue to have a significant impact on the solar physics community in their own right, but the addition of STEREO observations provides extremely valuable information of how they mesh with the big picture of solar activity," said Melissa Pesce-Rollins, a researcher at the National Institute of Nuclear Physics in Pisa, Italy, and a co-author of the paper.

The hidden flares occurred Oct. 11, 2013, and Jan. 6 and Sept. 1, 2014. All three events were associated with fast coronal mass ejections (CMEs), where billion-ton clouds of solar plasma were launched into space. The CME from the most recent event was moving at nearly 5 million miles an hour as it left the sun. Researchers suspect particles accelerated at the leading edge of the CMEs were responsible for the gamma-ray emission.

These solar flares were imaged in extreme ultraviolet light by NASA's STEREO satellites, which at the time were viewing the side of the sun facing away from Earth. All three events launched fast coronal mass ejections (CMEs). Although NASA's Fermi Gamma-ray Space Telescope couldn't see the eruptions directly, it detected high-energy gamma rays from all of them. Scientists think particles accelerated by the CMEs rained onto the Earth-facing side of the sun and produced the gamma rays. The central image was returned by the STEREO A spacecraft; all others are from STEREO B.

Credit: NASA/STEREO

Large magnetic field structures can connect the acceleration site with distant parts of the solar surface. Because charged particles must remain attached to magnetic field lines, the research team thinks particles accelerated at the CME traveled to the sun's visible side along magnetic field lines connecting both locations. As the particles impacted the surface, they generated gamma-ray emission through a variety of processes. One prominent mechanism is thought to be proton collisions that result in a particle called a pion, which quickly decays into gamma rays.

In its first eight years, Fermi has detected high-energy emission from more than 40 solar flares. More than half of these are ranked as moderate, or M class, events. In 2012, Fermi caught the highest-energy emission ever detected from the sun during a powerful X-class flare, from which the LAT detected high-energy gamma rays for more than 20 record-setting hours.

Combined images from NASA's Solar Dynamics Observatory (center) and the NASA/ESA Solar and Heliospheric Observatory (red and blue) show an impressive coronal mass ejection departing the far side of the sun on Sept. 1, 2014. This massive cloud raced away at about 5 million mph and likely accelerated particles that later produced gamma rays Fermi detected.

Credit: NASA/SDO and NASA/ESA/SOHO



Contacts and sources:
Francis Reddy

Artificial Intelligence Does As Well As Dermatologists In Identifying Skin Cancer

It’s scary enough making a doctor’s appointment to see if a strange mole could be cancerous. Imagine, then, that you were in that situation while also living far away from the nearest doctor, unable to take time off work and unsure you had the money to cover the cost of the visit. In a scenario like this, an option to receive a diagnosis through your smartphone could be lifesaving.

Universal access to health care was on the minds of computer scientists at Stanford when they set out to create an artificially intelligent diagnosis algorithm for skin cancer. They made a database of nearly 130,000 skin disease images and trained their algorithm to visually diagnose potential cancer. From the very first test, it performed with inspiring accuracy.

A dermatologist uses a dermatoscope, a type of handheld microscope, to look at skin. Computer scientists at Stanford have created an artificially intelligent diagnosis algorithm for skin cancer that matched the performance of board-certified dermatologists.  
Image credit: Matt Young

“We realized it was feasible, not just to do something well, but as well as a human dermatologist,” said Sebastian Thrun, an adjunct professor in the Stanford Artificial Intelligence Laboratory. “That’s when our thinking changed. That’s when we said, ‘Look, this is not just a class project for students, this is an opportunity to do something great for humanity.’”

The final product, the subject of a paper in the Jan. 25 issue of Nature, was tested against 21 board-certified dermatologists. In its diagnoses of skin lesions, which represented the most common and deadliest skin cancers, the algorithm matched the performance of dermatologists.

Why skin cancer

Every year there are about 5.4 million new cases of skin cancer in the United States, and while the five-year survival rate for melanoma detected in its earliest stages is around 97 percent, that drops to approximately 14 percent if it’s detected in its latest stages. Early detection could have an enormous impact on skin cancer outcomes.

Diagnosing skin cancer begins with a visual examination. A dermatologist usually looks at the suspicious lesion with the naked eye and with the aid of a dermatoscope, which is a handheld microscope that provides low-level magnification of the skin. If these methods are inconclusive or lead the dermatologist to believe the lesion is cancerous, a biopsy is the next step.

Bringing this algorithm into the examination process follows a trend in computing that combines visual processing with deep learning, a type of artificial intelligence modeled after neural networks in the brain. Deep learning has a decades-long history in computer science but it only recently has been applied to visual processing tasks, with great success. The essence of machine learning, including deep learning, is that a computer is trained to figure out a problem rather than having the answers programmed into it.


Andre Esteva 
Image credit: Matt Young

“We made a very powerful machine learning algorithm that learns from data,” said Andre Esteva, co-lead author of the paper and a graduate student in the Thrun lab. “Instead of writing into computer code exactly what to look for, you let the algorithm figure it out.”

The algorithm was fed each image as raw pixels with an associated disease label. Compared to other methods for training algorithms, this one requires very little processing or sorting of the images prior to classification, allowing the algorithm to work off a wider variety of data.

From cats and dogs to melanomas and carcinomas

Rather than building an algorithm from scratch, the researchers began with an algorithm developed by Google that was already trained to identify 1.28 million images from 1,000 object categories. While it was primed to be able to differentiate cats from dogs, the researchers needed it to know a malignant carcinoma from a benign seborrheic keratosis.
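As a rough illustration of this kind of transfer learning, the sketch below repurposes an ImageNet-pretrained network for lesion classification. It is not the Stanford team's actual code or data pipeline; the choice of Keras, the InceptionV3 backbone, the class count, the image size and the placeholder dataset name are all assumptions made for the example.

```python
# Minimal transfer-learning sketch (assumed setup, not the study's code):
# start from an ImageNet-pretrained network and retrain it on labelled
# skin-lesion images.
from tensorflow.keras import layers, models
from tensorflow.keras.applications import InceptionV3

NUM_CLASSES = 2000          # the paper reports over 2,000 disease labels
IMAGE_SIZE = (299, 299)     # InceptionV3's native input size

# Backbone pretrained on 1.28 million ImageNet photos (1,000 categories).
backbone = InceptionV3(weights="imagenet", include_top=False, pooling="avg",
                       input_shape=IMAGE_SIZE + (3,))

# Replace the 1,000-way ImageNet classifier with a skin-disease classifier.
model = models.Sequential([
    backbone,
    layers.Dense(NUM_CLASSES, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# 'train_ds' is a hypothetical dataset of raw lesion images paired with
# disease labels, standing in for the ~130,000-image collection.
# model.fit(train_ds, epochs=10)
```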

“There’s no huge dataset of skin cancer that we can just train our algorithms on, so we had to make our own,” said Brett Kuprel, co-lead author of the paper and a graduate student in the Thrun lab. “We gathered images from the internet and worked with the medical school to create a nice taxonomy out of data that was very messy – the labels alone were in several languages, including German, Arabic and Latin.”

After going through the necessary translations, the researchers collaborated with dermatologists at Stanford Medicine, as well as Helen M. Blau, professor of microbiology and immunology at Stanford and co-author of the paper. Together, this interdisciplinary team worked to classify the hodgepodge of internet images. Many of these, unlike those taken by medical professionals, were varied in terms of angle, zoom and lighting. In the end, they amassed about 130,000 images of skin lesions representing over 2,000 different diseases.

Brett Kuprel
Image credit: Matt Young

During testing, the researchers used only high-quality, biopsy-confirmed images provided by the University of Edinburgh and the International Skin Imaging Collaboration Project that represented the most common and deadliest skin cancers – malignant carcinomas and malignant melanomas. The 21 dermatologists were asked whether, based on each image, they would proceed with biopsy or treatment, or reassure the patient. The researchers evaluated success by how well the dermatologists were able to correctly diagnose both cancerous and non-cancerous lesions in over 370 images.

The algorithm’s performance was measured through the creation of a sensitivity-specificity curve, where sensitivity represented its ability to correctly identify malignant lesions and specificity represented its ability to correctly identify benign lesions. It was assessed through three key diagnostic tasks: keratinocyte carcinoma classification, melanoma classification, and melanoma classification when viewed using dermoscopy. In all three tasks, the algorithm matched the performance of the dermatologists with the area under the sensitivity-specificity curve amounting to at least 91 percent of the total area of the graph.
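For readers unfamiliar with the metric, a sensitivity-specificity curve is what most machine-learning libraries call an ROC curve. A minimal sketch of how such a curve and its area might be computed is shown below; the labels and scores are invented for illustration and are not the study's data.

```python
# Sketch of a sensitivity-specificity (ROC) curve and its area,
# using made-up example scores rather than the study's data.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

# 1 = malignant (biopsy-confirmed), 0 = benign; scores are a model's
# predicted probability of malignancy -- purely illustrative numbers.
y_true = np.array([1, 0, 1, 1, 0, 0, 1, 0, 1, 0])
y_score = np.array([0.92, 0.30, 0.75, 0.61, 0.45, 0.12, 0.88, 0.52, 0.70, 0.05])

fpr, tpr, thresholds = roc_curve(y_true, y_score)
sensitivity = tpr            # fraction of malignant lesions correctly flagged
specificity = 1.0 - fpr      # fraction of benign lesions correctly cleared

auc = roc_auc_score(y_true, y_score)
print(f"Area under the sensitivity-specificity curve: {auc:.2f}")

# Sliding the decision threshold trades sensitivity against specificity,
# which is what allows the algorithm to be tuned "more or less sensitive".
```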

An added advantage of the algorithm is that, unlike a person, it can be made more or less sensitive, allowing the researchers to tune its response depending on what they want it to assess. This ability to alter the sensitivity hints at the depth and complexity of the algorithm. The underlying architecture, trained on seemingly irrelevant photos – including cats and dogs – helps it better evaluate the skin lesion images.

Health care by smartphone

Although this algorithm currently exists on a computer, the team would like to make it smartphone compatible in the near future, bringing reliable skin cancer diagnoses to our fingertips.

“My main eureka moment was when I realized just how ubiquitous smartphones will be,” said Esteva. “Everyone will have a supercomputer in their pockets with a number of sensors in it, including a camera. What if we could use it to visually screen for skin cancer? Or other ailments?”

The team believes it will be relatively easy to transition the algorithm to mobile devices but there still needs to be further testing in a real-world clinical setting.

“Advances in computer-aided classification of benign versus malignant skin lesions could greatly assist dermatologists in improved diagnosis for challenging lesions and provide better management options for patients,” said Susan Swetter, professor of dermatology and director of the Pigmented Lesion and Melanoma Program at the Stanford Cancer Institute, and co-author of the paper. “However, rigorous prospective validation of the algorithm is necessary before it can be implemented in clinical practice, by practitioners and patients alike.”

Even in light of the challenges ahead, the researchers are hopeful that deep learning could someday contribute to visual diagnosis in many medical fields.




Contacts and sources:
Taylor Kubota
Stanford University

Rat-Grown Mouse Pancreases Help Reverse Diabetes in Mice


Growing organs from one species in the body of another may one day relieve transplant shortages. Now researchers show that islets from rat-grown mouse pancreases can reverse disease when transplanted into diabetic mice.

Mouse pancreases grown in rats generate functional, insulin-producing cells that can reverse diabetes when transplanted into mice with the disease, according to researchers at the Stanford University School of Medicine and the Institute of Medical Science at the University of Tokyo.

The recipient animals required only days of immunosuppressive therapy to prevent rejection of the genetically matched organ rather than lifelong treatment.

A rat in which researchers were able to grow a mouse pancreas. Islets from the pancreases were transplanted into mice with diabetes. The transplants helped control the mice's blood sugar levels.

Courtesy of the Nakauchi lab

The success of the interspecies transplantation suggests that a similar technique could one day be used to generate matched, transplantable human organs in large animals like pigs and sheep.

To conduct the work, the researchers implanted mouse pluripotent stem cells, which can become any cell in the body, into early rat embryos. The rats had been genetically engineered to be unable to develop their own pancreas and were thus forced to rely on the mouse cells for the development of the organ.

Once the rats were born and grown, the researchers transplanted the insulin-producing cells, which cluster together in groups called islets, from the rat-grown pancreases into mice genetically matched to the stem cells that formed the pancreas. These mice had been given a drug to cause them to develop diabetes.

“We found that the diabetic mice were able to normalize their blood glucose levels for over a year after the transplantation of as few as 100 of these islets,” said Hiromitsu Nakauchi, MD, PhD, a professor of genetics at Stanford. “Furthermore, the recipient animals only needed treatment with immunosuppressive drugs for five days after transplantation, rather than the ongoing immunosuppression that would be needed for unmatched organs.”

Nakauchi, who is a member of Stanford’s Institute for Stem Cell Biology and Regenerative Medicine, is the senior author of a paper describing the findings, which was published online Jan. 25 in Nature. Tomoyuki Yamaguchi, PhD, an associate professor of stem cell therapy, and researcher Hideyuki Sato, both from the University of Tokyo, share lead authorship of the paper.

Although much research remains to be done, scientist Hiromitsu Nakauchi and his colleagues believe their work with rodents shows that a similar technique could one day be used to generate matched, transplantable human organs in large animals like pigs and sheep.
Credit: Wing Hon Films


Organs in short supply

About 76,000 people in the United States are currently waiting for an organ transplant, but organs are in short supply. Generating genetically matched human organs in large animals could relieve the shortage and release transplant recipients from the need for lifelong immunosuppression, the researchers say.

People suffering from diabetes could also benefit from this approach. Diabetes is a life-threatening metabolic disease in which a person or animal is unable to either make or respond appropriately to insulin, a hormone that allows the body to regulate its blood sugar levels in response to meals or fasting. The disease affects hundreds of millions of people worldwide and is increasing in prevalence. The transplantation of functional islets from healthy pancreases has been shown to be a potentially viable option to treat diabetes in humans, as long as rejection can be avoided.

The researchers’ current findings come on the heels of a previous study in which they grew rat pancreases in mice. Although the organs appeared functional, they were the size of a normal mouse pancreas rather than a larger rat pancreas. As a result, there were not enough functional islets in the smaller organs to successfully reverse diabetes in rats.

Mouse pancreases grown in rats

In the current study, the researchers swapped the animals’ roles, growing mouse pancreases in rats engineered to lack the organ. The pancreases were able to successfully regulate the rats’ blood sugar levels, indicating they were functioning normally. Rejection of the mouse pancreases by the rats’ immune systems was uncommon because the mouse cells were injected into the rat embryo prior to the development of immune tolerance, which is a period during development when the immune system is trained to recognize its own tissues as “self.” Most of these mouse-derived organs grew to the size expected for a rat pancreas, providing enough individual islets for transplantation.

Next, the researchers transplanted 100 islets from the rat-grown pancreases back into mice with diabetes. Subsequently, these mice were able to successfully control their blood sugar levels for over 370 days, the researchers found.

Because the transplanted islets contained some contaminating rat cells, the researchers treated each recipient mouse with immunosuppressive drugs for five days after transplant. After this time, however, the immunosuppression was stopped.

After about 10 months, the researchers removed the islets from a subset of the mice for inspection.

“We examined them closely for the presence of any rat cells, but we found that the mouse’s immune system had eliminated them,” said Nakauchi. “This is very promising for our hope to transplant human organs grown in animals because it suggests that any contaminating animal cells could be eliminated by the patient’s immune system after transplant.” 

Importantly, the researchers also did not see any signs of tumor formation or other abnormalities caused by the pluripotent mouse stem cells that formed the islets. Tumor formation is often a concern when pluripotent stem cells are used in an animal due to the cells’ remarkable developmental plasticity. The researchers believe the lack of any signs of cancer is likely due to the fact that the mouse pluripotent stem cells were guided to generate a pancreas within the developing rat embryo, rather than coaxed to develop into islet cells in the laboratory. The researchers are working on similar animal-to-animal experiments to generate kidneys, livers and lungs.

Although the findings provide proof-of-principle for future work, much research remains to be done. Ethical considerations are also important when human stem cells are transplanted into animal embryos, the researchers acknowledge.

The research was funded by the Japan Science and Technology Agency, the Japan Agency for Medical Research and Development, the Japan Society for the Promotion of Science, a KAKENHI grant, the Japan Insulin Dependent Diabetes Mellitus Network, the California Institute for Regenerative Medicine, and Stanford’s Department of Genetics.




Contacts and sources:
Krista Conger
Stanford University






Most Extreme Blazars Yet Discovered

NASA's Fermi Gamma-ray Space Telescope has identified the farthest gamma-ray blazars, a type of galaxy whose intense emissions are powered by supersized black holes. Light from the most distant object began its journey to us when the universe was 1.4 billion years old, or nearly 10 percent of its present age.

NASA's Fermi Gamma-ray Space Telescope has discovered the five most distant gamma-ray blazars yet known. The light detected by Fermi left these galaxies by the time the universe was two billion years old. Two of these galaxies harbor billion-solar-mass black holes that challenge current ideas about how quickly such monsters could grow.

Credits: NASA's Goddard Space Flight Center/Scott Wiessinger, producer

"Despite their youth, these far-flung blazars host some of the most massive black holes known," said Roopesh Ojha, an astronomer at NASA's Goddard Space Flight Center in Greenbelt, Maryland. "That they developed so early in cosmic history challenges current ideas of how supermassive black holes form and grow, and we want to find more of these objects to help us better understand the process."

Ojha presented the findings Monday, Jan. 30, at the American Physical Society meeting in Washington, and a paper describing the results has been submitted to The Astrophysical Journal Letters.

Blazars constitute roughly half of the gamma-ray sources detected by Fermi's Large Area Telescope (LAT). Astronomers think their high-energy emissions are powered by matter heated and torn apart as it falls from a storage, or accretion, disk toward a supermassive black hole with a million or more times the sun's mass. A small part of this infalling material becomes redirected into a pair of particle jets, which blast outward in opposite directions at nearly the speed of light. Blazars appear bright in all forms of light, including gamma rays, the highest-energy light, when one of the jets happens to point almost directly toward us.

Black-hole-powered galaxies called blazars are the most common sources detected by NASA's Fermi. As matter falls toward the supermassive black hole at the galaxy's center, some of it is accelerated outward at nearly the speed of light along jets pointed in opposite directions. When one of the jets happens to be aimed in the direction of Earth, as illustrated here, the galaxy appears especially bright and is classified as a blazar. 

Credits: M. Weiss/CfA

Previously, the most distant blazars detected by Fermi emitted their light when the universe was about 2.1 billion years old. Earlier observations showed that the most distant blazars produce most of their light at energies between the range detected by the LAT and that covered by current X-ray satellites, which made finding them extremely difficult.

Then, in 2015, the Fermi team released a full reprocessing of all LAT data, called Pass 8, that ushered in so many improvements astronomers said it was like having a brand new instrument. The LAT's boosted sensitivity at lower energies increased the chances of discovering more far-off blazars.

The research team was led by Vaidehi Paliya and Marco Ajello at Clemson University in South Carolina and included Dario Gasparrini at the Italian Space Agency's Science Data Center in Rome as well as Ojha. They began by searching for the most distant sources in a catalog of 1.4 million quasars, a galaxy class closely related to blazars. Because only the brightest sources can be detected at great cosmic distances, they then eliminated all but the brightest objects at radio wavelengths from the list. With a final sample of about 1,100 objects, the scientists then examined LAT data for all of them, resulting in the detection of five new gamma-ray blazars.

Expressed in terms of redshift, astronomers' preferred measure of the deep cosmos, the new blazars range from redshift 3.3 to 4.31, which means the light we now detect from them started on its way when the universe was between 1.9 and 1.4 billion years old, respectively.
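The conversion from redshift to cosmic age depends on the cosmological parameters one assumes. As a quick sketch of the calculation (using the Planck 2015 parameters bundled with astropy, which is an assumption here rather than the parameter set used by the research team), the quoted ages can be reproduced roughly as follows:

```python
# Convert the blazars' redshifts to the age of the universe at emission,
# using the Planck 2015 cosmology shipped with astropy.
from astropy.cosmology import Planck15

for z in (3.3, 4.31):
    age = Planck15.age(z)          # age of the universe at that redshift
    print(f"z = {z}: universe was about {age.to('Gyr'):.2f} old")

# Expected output is roughly 1.9 Gyr at z = 3.3 and 1.4 Gyr at z = 4.31,
# in line with the figures quoted in the article.
```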

"Once we found these sources, we collected all the available multiwavelength data on them and derived properties like the black hole mass, the accretion disk luminosity, and the jet power," said Paliya.

Two of the blazars boast black holes of a billion solar masses or more. All of the objects possess extremely luminous accretion disks that emit more than two trillion times the energy output of our sun. This means matter is continuously falling inward, corralled into a disk and heated before making the final plunge to the black hole.

"The main question now is how these huge black holes could have formed in such a young universe," said Gasparrini. "We don't know what mechanisms triggered their rapid development."

In the meantime, the team plans to continue a deep search for additional examples.

"We think Fermi has detected just the tip of the iceberg, the first examples of a galaxy population that previously has not been detected in gamma rays," said Ajello.

NASA's Fermi Gamma-ray Space Telescope is an astrophysics and particle physics partnership, developed in collaboration with the U.S. Department of Energy and with important contributions from academic institutions and partners in France, Germany, Italy, Japan, Sweden and the United States.




Contacts and sources:
By Francis Reddy
NASA's Goddard Space Flight Center

DNA Analysis of Seawater Detects 80% of Fish Species in Just One Day

A Japanese research group has used a new technology that identifies multiple fish species populating local areas by analyzing DNA samples from seawater, and proved that this method is accurate and more effective than visual observation.

Until recently, marine surveys of fish species relied on diving or capturing methods that classified fish based on appearance. In addition to requiring a lot of manpower, these methods also depend upon specialist knowledge for fish classification. A new solution to the survey issue has recently drawn attention: environmental DNA metabarcoding, a method which can simultaneously detect multiple species. This method identifies the fish species through collection and analysis of DNA released by fish in seawater (environmental DNA, or eDNA). 

Until now, researchers could only gain limited confirmation of the effectiveness of this method because it had only been proven in areas with limited numbers of fish species. In places such as the Japanese coast, home to many different fish species, data collected using traditional methods is limited. As the results could not be compared with previous data, the effectiveness of eDNA metabarcoding remained unconfirmed.

Collecting water samples
Credit: Kobe University


This research group used eDNA metabarcoding in Maizuru Bay, Kyoto Prefecture. After just one day of field surveys applying this method, they were able to detect 128 species of fish from the seawater samples. Over 60% of the species observed during 140 visual surveys spanning 14 years were included in these 128 species, and excluding fish that only migrated to Maizuru Bay in certain years, this rose to nearly an 80% match. They also detected fish species that could not be confirmed by visual observation. The researchers believe this is the first time it has been possible to detect certain fish larvae that are hard to identify through visual observation.

These findings show that even in areas with many different fish species, eDNA metabarcoding enables researchers to survey fish in multiple areas during a short time period. This method has potential applications for monitoring the invasion of foreign species across large areas, surveys of expanding fish distribution, and use in hard to access areas such as the deep sea, subterranean lakes, dangerous polluted waters, and protected areas where collecting specimens is prohibited. The findings were published on January 12 in Scientific Reports.

This research was carried out as part of the Japan Science and Technology Strategic Basic Research Programs by a group including Academic Researcher YAMAMOTO Satoshi (Kobe University Graduate School of Human Development and Environment), Associate Professor MASUDA Reiji (Kyoto University), Professor ARAKI Hitoshi (Hokkaido University), Professor KONDOH Michio (Ryukoku University), Project Assistant Professor MINAMOTO Toshifumi (Kobe University Graduate School of Human Development and Environment), and Adjunct Associate Professor MIYA Masaki (Head of Department of Ecology and Environmental Sciences, Natural History Museum and Institute, Chiba).



 Contacts and sources:
Kobe University


Citation:  ”Environmental DNA metabarcoding reveals local fish communities in a species rich coastal sea.” Authors: Satoshi Yamamoto, Reiji Masuda, Yukuto Sato, Tetsuya Sado, Hitoshi Araki, Michio Kondoh, Toshifumi Minamoto & Masaki Miya.  Journal: Scientific Reports 7, Article number: 40368 (2017) doi:10.1038/srep40368

Scientists Find Brain Hormone That Triggers Fat Burning in the Gut


Biologists at The Scripps Research Institute (TSRI) have identified a brain hormone that appears to trigger fat burning in the gut. Their findings in animal models could have implications for future pharmaceutical development.

"This was basic science that unlocked an interesting mystery," said TSRI Assistant Professor Supriya Srinivasan, senior author of the new study, published today in the journal Nature Communications.

Previous studies had shown that the neurotransmitter serotonin can drive fat loss. Yet no one was sure exactly how. To answer that question, Srinivasan and her colleagues experimented with roundworms called C. elegans, which are often used as model organisms in biology. These worms have simpler metabolic systems than humans, but their brains produce many of the same signaling molecules, leading many researchers to believe that findings in C. elegans may be relevant for humans.

TSRI Assistant Professor Supriya Srinivasan (left) and TSRI Research Associate Lavinia Palamiuc led the new study.

Credit: The Scripps Research Institute (Photo by Madeline McCurry-Schmidt.)

The researchers deleted genes in C. elegans to see if they could interrupt the path between brain serotonin and fat burning. By testing one gene after another, they hoped to find the gene without which fat burning wouldn't occur. This process of elimination led them to a gene that codes for a neuropeptide hormone they named FLP-7 (pronounced "flip 7").
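The screening strategy described here amounts to looping over candidate knockouts and asking, for each one, whether serotonin-driven fat loss still occurs. A minimal sketch of that logic, with hypothetical gene names and a stand-in for the actual assay:

# Hypothetical sketch of a process-of-elimination screen.
# Gene names and measure_fat_after_serotonin() are placeholders,
# not the study's actual reagents or assay.

def measure_fat_after_serotonin(knockout_gene):
    # Stand-in for a wet-lab fat measurement on a knockout strain.
    no_fat_loss = {"flp-7"}            # assume losing flp-7 blocks fat loss
    return 1.0 if knockout_gene in no_fat_loss else 0.4   # normalized fat remaining

candidate_genes = ["flp-1", "flp-7", "flp-11", "nlp-12"]
baseline_fat = 1.0   # normalized fat stores without serotonin stimulation

for gene in candidate_genes:
    if measure_fat_after_serotonin(gene) >= baseline_fat:
        print(gene, "is required for serotonin-driven fat burning")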

Interestingly, they found that the mammalian version of FLP-7 (called Tachykinin) had been identified 80 years ago as a peptide that triggered muscle contractions when dribbled on pig intestines.

Scientists back then believed this was a hormone that connected the brain to the gut, but no one had linked the neuropeptide to fat metabolism in the time since.

The next step in the new study was to determine whether FLP-7 was directly linked to serotonin levels in the brain. Study first author Lavinia Palamiuc, a TSRI research associate, spearheaded this effort by tagging FLP-7 with a fluorescent red protein so that it could be visualized in living animals, which was possible because the roundworm body is transparent. Her work revealed that FLP-7 was indeed secreted from neurons in the brain in response to elevated serotonin levels. FLP-7 then traveled through the circulatory system to start the fat-burning process in the gut.

"That was a big moment for us," said Srinivasan. For the first time, researchers had found a brain hormone that specifically and selectively stimulates fat metabolism, without any effect on food intake.

Altogether, the newly discovered fat-burning pathway works like this: a neural circuit in the brain produces serotonin in response to sensory cues, such as food availability. This signals another set of neurons to begin producing FLP-7. FLP-7 then activates a receptor in intestinal cells, and the intestines begin turning fat into energy.

Next, the researchers investigated the consequences of manipulating FLP-7 levels. While increasing serotonin itself can have a broad impact on an animal's food intake, movement and reproductive behavior, the researchers found that increasing FLP-7 levels farther downstream didn't come with any obvious side effects. The worms continued to function normally while simply burning more fat.

Srinivasan said this finding could encourage future studies into how FLP-7 levels could be regulated without causing the side effects often experienced when manipulating overall serotonin levels.




Contacts and sources:
Madeline McCurry-Schmidt
The Scripps Research Institute

Citation: "A tachykinin-like neuroendocrine signalling axis couples central serotonin action and nutrient sensing with peripheral lipid metabolism," Nature Communications, http://dx.doi.org/10.1038/ncomms14237

Monday, January 30, 2017

Bag-Like Sea Creature Is Humans' Oldest Known Ancestor


Researchers have identified traces of what they believe is the earliest known prehistoric ancestor of humans -- a microscopic, bag-like sea creature, which lived about 540 million years ago.

Named Saccorhytus, after the sack-like features created by its elliptical body and large mouth, the species is new to science and was identified from microfossils found in China. It is thought to be the most primitive example of a so-called "deuterostome" -- a broad biological category that encompasses a number of sub-groups, including the vertebrates.

If the conclusions of the study, published in the journal Nature, are correct, then Saccorhytus was the common ancestor of a huge range of species, and the earliest step yet discovered on the evolutionary path that eventually led to humans, hundreds of millions of years later.

Artist's reconstruction of Saccorhytus coronarius, based on the original fossil finds. The actual creature was probably no more than a millimetre in size.

Credit: S Conway Morris / Jian Han

Modern humans are, however, unlikely to perceive much by way of a family resemblance. Saccorhytus was about a millimetre in size, and probably lived between grains of sand on the seabed. Its features were spectacularly preserved in the fossil record -- and intriguingly, the researchers were unable to find any evidence that the animal had an anus.

The study was carried out by an international team of academics, including researchers from the University of Cambridge in the UK and Northwest University in Xi'an, China, with support from other colleagues at institutions in China and Germany.

Simon Conway Morris, Professor of Evolutionary Palaeobiology and a Fellow of St John's College, University of Cambridge, said: "We think that as an early deuterostome this may represent the primitive beginnings of a very diverse range of species, including ourselves. To the naked eye, the fossils we studied look like tiny black grains, but under the microscope the level of detail is jaw-dropping. All deuterostomes had a common ancestor, and we think that is what we are looking at here."

Degan Shu, from Northwest University, added: "Our team has notched up some important discoveries in the past, including the earliest fish and a remarkable variety of other early deuterostomes. Saccorhytus now gives us remarkable insights into the very first stages of the evolution of a group that led to the fish, and ultimately, to us."

Most other early deuterostome groups are from about 510 to 520 million years ago, when they had already begun to diversify into not just the vertebrates, but the sea squirts, echinoderms (animals such as starfish and sea urchins) and hemichordates (a group including things like acorn worms). This level of diversity has made it extremely difficult to work out what an earlier, common ancestor might have looked like.

The Saccorhytus microfossils were found in Shaanxi Province, in central China, and pre-date all other known deuterostomes. By isolating the fossils from the surrounding rock, and then studying them both under an electron microscope and using a CT scan, the team were able to build up a picture of how Saccorhytus might have looked and lived. This revealed features and characteristics consistent with current assumptions about primitive deuterostomes.

Dr Jian Han, of Northwest University, said: "We had to process enormous volumes of limestone - about three tonnes - to get to the fossils, but a steady stream of new finds allowed us to tackle some key questions: was this a very early echinoderm, or something even more primitive? The latter now seems to be the correct answer."

In the early Cambrian period, the region would have been a shallow sea. Saccorhytus was so small that it probably lived in between individual grains of sediment on the sea bed.

The study suggests that its body was bilaterally symmetrical -- a characteristic inherited by many of its descendants, including humans -- and was covered with a thin, relatively flexible skin. This in turn suggests that it had some sort of musculature, leading the researchers to conclude that it could have made contractile movements, and got around by wriggling.

Perhaps its most striking feature, however, was its rather primitive means of eating food and then dispensing with the resulting waste. Saccorhytus had a large mouth, relative to the rest of its body, and probably ate by engulfing food particles, or even other creatures.

A crucial observation concerns small conical structures on its body. These may have allowed the water that it swallowed to escape and so were perhaps the evolutionary precursor of the gills we now see in fish. But the researchers were unable to find any evidence that the creature had an anus. "If that was the case, then any waste material would simply have been taken out back through the mouth, which from our perspective sounds rather unappealing," Conway Morris said.

The findings also provide evidence in support of a theory explaining the long-standing mismatch between fossil evidence of prehistoric life, and the record provided by biomolecular data, known as the "molecular clock".

Technically, it is possible to estimate roughly when species diverged by looking at differences in their genetic information. In principle, the longer two groups have evolved separately, the greater the biomolecular difference between them should be, and there are reasons to think this process is more or less clock-like.
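In its simplest form, the clock gives a rough divergence estimate by dividing the genetic distance between two lineages by twice the per-lineage substitution rate, since both lineages accumulate changes independently. A toy calculation with invented numbers, shown only to illustrate the shape of the reasoning:

# Toy molecular-clock estimate; the numbers are invented, not taken from the paper.
genetic_distance = 0.46       # substitutions per site separating two lineages
substitution_rate = 5e-10     # substitutions per site per year, per lineage

# Both lineages accumulate changes independently, so divide by twice the rate.
divergence_time_years = genetic_distance / (2 * substitution_rate)
print(f"Estimated divergence: {divergence_time_years / 1e6:.0f} million years ago")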

Unfortunately, before a point corresponding roughly to the time at which Saccorhytus was wriggling in the mud, there are scarcely any fossils available to match the molecular clock's predictions. Some researchers have theorised that this is because before a certain point, many of the creatures they are searching for were simply too small to leave much of a fossil record. The microscopic scale of Saccorhytus, combined with the fact that it is probably the most primitive deuterostome yet discovered, appears to back this up.




Contacts and sources: 
Tom Kirk
St John's College, University of Cambridge

The findings are published in Nature. Reference: Jian Han, Simon Conway Morris, Qiang Ou, Degan Shu and Hai Huang. Meiofaunal deuterostomes from the basal Cambrian of Shaanxi (China). DOI: 10.1038/nature21072.

Stereotypes About “Brilliance” Affect Girls’ Interests as Early as Age 6, New Study Finds

By the age of 6, girls become less likely than boys to associate brilliance with their own gender and are more likely to avoid activities said to require brilliance, shows a new study conducted by researchers at New York University, the University of Illinois, and Princeton University.

The findings appear in the journal Science.

The research, led by Lin Bian, a doctoral student at the University of Illinois, and NYU psychology professor Andrei Cimpian, demonstrates how early gender stereotypes take hold and points to the potential of their life-long impact. Sarah-Jane Leslie, professor of philosophy at Princeton University, also contributed to the research.

“Even though the stereotype equating brilliance with men doesn’t match reality, it might nonetheless take a toll on girls’ aspirations and on their eventual careers,” observed Cimpian, the paper’s senior author.
Credit: Wikimedia Commons

“Our society tends to associate brilliance with men more than with women, and this notion pushes women away from jobs that are perceived to require brilliance,” said Bian.
“We wanted to know whether young children also endorse these stereotypes.”

With this question in mind, the researchers tested children ranging from 5 to 7 years in a series of studies. In one experiment, the children heard a story about a person who was “really, really smart” and were then asked to guess which of four unfamiliar adults (2 men, 2 women) was the story’s protagonist. They were also asked to guess which adult in a series of paired different-gender adults was “really, really smart.” While the results showed both boys and girls aged 5 viewed their own gender positively, girls aged 6 and 7 were significantly less likely than boys to associate brilliance with their gender. These age differences were largely similar across children of various socioeconomic and racial-ethnic backgrounds.

A subsequent study asked whether these perceptions shape children’s interests. A different group of boys and girls aged 6 and 7 were introduced to two games—one described as for “children who are really, really smart” and the other for “children who try really, really hard.” The content and rules of the two games were otherwise very similar. Children were then asked four questions to measure their interest in these games (e.g., “Do you like this game, or do you not like it?”). Girls were significantly less interested than boys in the game for smart children; however, there was no difference between the boys’ and girls’ interest in the game for hard-working children—a finding that illuminates the targeted nature of gender stereotyping.

A final experiment compared 5- and 6-year-old boys’ and girls’ interest in games for smart children. The results showed no significant differences in interest between 5-year-old boys and girls, consistent with the absence of brilliance stereotypes at this age. However, by age 6, girls’ interest in the activities for smart children was again lower than that of boys.

“In earlier work, we found that adult women were less likely to receive higher degrees in fields thought to require ‘brilliance,’ and these new findings show that these stereotypes begin to impact girls’ choices at a heartbreakingly young age,” said Leslie.

However, the researchers also caution that more work is needed to investigate how broadly these results apply.

This study was supported, in part, by a grant from the National Science Foundation (BCS-1530669).


Contacts and sources:
New York University

Why LSD Trips Last So Long Discovered, Potential Medicinal Use

For the first time, UNC School of Medicine researchers crystallized the structure of LSD attached to a human serotonin receptor of a brain cell, and they may have discovered why an “acid trip” lasts so long.

A tiny tab of acid on the tongue. A daylong trip through hallucinations and assorted other psychedelic experiences. For the first time, researchers at the UNC School of Medicine have discovered precisely what the drug lysergic acid diethylamide (LSD) looks like in its active state when attached to a human serotonin receptor of a brain cell, and their first-ever crystal structure revealed a major clue for why the psychoactive effects of LSD last so long.

Center: a molecule of LSD bound to a larger serotonin receptor. The "lid" that keeps LSD bound so long is the orange bar running through the center.

Credit: Lab of Bryan L. Roth, at the UNC School of Medicine

Bryan L. Roth, MD, PhD, the Michael Hooker Distinguished Professor of Protein Therapeutics and Translational Proteomics in the UNC School of Medicine, led the research, which was published  in Cell.

“There are different levels of understanding for how drugs like LSD work,” Roth said. “The most fundamental level is to find out how the drug binds to a receptor on a cell. The only way to do that is to solve the structure. And to do that, you need x-ray crystallography, the gold standard.”

That is what Roth’s lab accomplished – essentially “freezing” LSD attached to a receptor so his team could capture crystallography images. As it turns out, when LSD latches onto a brain cell’s serotonin receptor, the LSD molecule is locked into place because part of the receptor folds over the drug molecule, like a lid. And then it stays put.

“We think this lid is likely why the effects of LSD can last so long,” said Roth, who holds a joint appointment at the UNC Eshelman School of Pharmacy. “LSD takes a long time to get onto the receptor, and then once it’s on, it doesn’t come off. And the reason is this lid.”

Eventually, though, an acid trip ends. Some LSD molecules pop off their receptors as the lid moves around. Also, brain cells eventually respond to this strange molecule by sucking the receptor into the cell, where it – along with the LSD – is degraded or disassembled for recycling.

Postdoctoral researchers Daniel Wacker, PhD, and Sheng Wang, PhD, led the experiments to crystallize LSD bound to a serotonin receptor and discover why it stays bound so long. “Serotonin, obviously, hits this receptor on brain cells,” Wacker said. “But our experiments show that serotonin does not interact with this lid in the same way LSD does.”

Although other labs have reported that LSD “washes” out of the brain's fluid within four hours, such experiments could not determine what was happening on or inside brain cells. Roth’s lab has now shown for the first time that LSD is not washed out of the serotonin receptors located within the membranes of brain cells within a few hours.

How this popular drug causes such powerful effects has remained a mystery ever since Swiss scientist Albert Hofmann first accidentally synthesized and dosed LSD to report its effects in 1938. Now, because of the work of Roth’s lab, scientists can begin to parse how the drug sparks such a dramatic reaction in the brain, just as the scientific and medical communities renew interest in the drug as a potential treatment for a number of conditions, such as cluster headaches, substance abuse, and anxiety associated with life-threatening conditions.

Solving the structure of LSD could help drug developers design better psychiatric drugs with fewer side effects. Also, although LSD is illegal, it remains a popular recreational drug and not just for its most potent effects. Some people – most notably technology developers in Silicon Valley and elsewhere – report “microdosing” LSD to boost creativity, relieve stress, and help them solve problems, while avoiding its hallucinogenic effects.

One in 10 people in the United States – tens of millions of people – have reported using LSD at least once in their lives. “About 3 percent of all high school students – who are at an age when their brains are still developing – have reported trying it,” Roth said. “And although the drug has been used for a long time, we don’t know that much about it.”

Before becoming a pharmacology professor and researcher, Roth was a psychiatrist specializing in schizophrenia. Patients would occasionally report that their first schizophrenic break occurred while on LSD.

“They were never the same again,” Roth said. “Although this is rare, it has been reported. People also report flashbacks and LSD is an extremely potent drug. So for those reasons, along with its potential as part of therapeutic treatment, LSD is scientifically interesting.”

For two decades, Roth’s lab – first at Case Western Reserve University and then upon his arrival at UNC in 2005 – had been trying to crystallize LSD attached to its receptor through a series of tedious and unsuccessful experiments. Others, too, have been trying. Without crystals, no one would be able to see what LSD bound to a receptor would look like.

“To get crystals of a known compound bound to its receptor is incredibly difficult,” said Roth, who is also director of the National Institute of Mental Health's Psychoactive Drug Screening Program housed at UNC. “In some cases, it’s nearly impossible.”

For the past several years under Roth’s guidance, the task fell to Wacker, who was the first scientist to determine the crystal structure of a serotonin receptor. That was nearly four years ago, when he was a graduate student in the lab of Ray Stevens, PhD, formerly at The Scripps Research Institute.

There are a few reasons why crystallizing LSD bound to a receptor is difficult. The first is lack of material; the receptors need to be produced in the lab using a number of tricks, such as generating a virus that then infects cells and generates the receptor. Second, the receptors are incredibly flexible, even when compounds such as LSD are bound to them; the receptors do not want to sit still. Third, unlike, say, a molecule of water, a serotonin receptor is highly complex and composed of thousands of atoms.

Wacker explained: “We need a lot of receptors to generate an image because of their small size – much smaller than the wavelength of visible light. Instead we use x-rays, but for that to work we need all of these receptors to sit perfectly still, and they all need to sit still in the same exact way, and that’s what happens in crystals. So, even if you create a lot of serotonin receptors and attempt to crystallize them, one receptor might twitch in one direction, another receptor might twitch in another direction, a third receptor might not be bound to the LSD, and a fourth receptor might have a lid that moves a little more than the other receptors. So, we need to dissolve all these receptors in water and then slowly take away the water. The temperature needs to be just right. And then we need to employ all kinds of experimental tricks to continue to draw out the water and convince the molecules to sit still so that they will want to crystallize.”

It’s sort of like letting soup sit out overnight, Wacker said. You’ll notice salt crystals at the bottom. That’s because the salt in the soup is dissolved in water, but then as water slowly evaporated over time, salt molecules latch onto each other to stay stable. The result: crystals.

But serotonin receptors are not soup. Getting crystals of the receptor bound to LSD took Wacker and colleagues two years, but once they had them, the serotonin receptors with LSD were packed tightly together. That allowed the team to shoot x-rays at the receptors and create images at atomic resolution.

Then UNC postdoctoral researcher John McCorvy, PhD, discovered that the lid was the key to LSD remaining bound to its serotonin receptor. McCorvy and colleagues created mutant receptors with floppier lids, and they found that LSD bound more quickly and also detached from the receptor more easily. They also noticed that the shorter binding times led to different signaling patterns inside cells. These different patterns likely mean that the effects of LSD would have been different from the typical effects seen with the lid tightly secured.
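One way to see why a lid that slows dissociation translates into a long-lasting effect is a simple first-order unbinding model, in which the mean residence time of a drug on its receptor is the inverse of its off-rate. A hedged sketch with invented rate constants, not the values measured in the study:

import math

# Illustrative two-state unbinding model; rate constants are invented,
# not values measured in the study.

def residence_time(k_off):
    # Mean time a bound drug stays on the receptor under first-order kinetics.
    return 1.0 / k_off

def fraction_still_bound(k_off, minutes):
    # Fraction of initially bound drug still attached after the given time.
    return math.exp(-k_off * minutes)

k_off_lid_intact = 0.005   # per minute: lid closed, LSD dissociates slowly
k_off_floppy_lid = 0.05    # per minute: mutant lid lets LSD escape faster

for label, k_off in [("lid intact", k_off_lid_intact), ("floppy lid", k_off_floppy_lid)]:
    print(f"{label}: mean residence ~{residence_time(k_off):.0f} min, "
          f"{fraction_still_bound(k_off, 240):.1%} still bound after four hours")

On these made-up numbers, slowing the off-rate tenfold stretches the mean residence time from about 20 minutes to more than three hours, which is the qualitative behaviour the crystal structure is being used to explain.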

Ron Dror, PhD, and his team at Stanford used computer simulations to confirm that this is what might happen when LSD engages its receptor protein in a human brain.

“There is a headache drug that binds to the same receptor as LSD,” Dror said. “The two drugs bind in the same receptor pocket, but the shape of that binding pocket is different when one drug or the other is bound. We used computer simulations to help explain why the two drugs favor different binding pocket shapes.”

Psychedelic LSD trip

Credit: Flickr/Jonathan Zegarra


Another aspect of this computational work focused on the fact that the receptor site is not static—the receptor and the drug are both highly dynamic. “They wiggle around all the time,” Dror said. “It has long been observed that LSD trips are long. The simulations helped explain why the receptor holds onto LSD for so long despite the fact that they have such a dynamic connection.”

Roth said, “We do not advocate using LSD; it is potentially very dangerous. But it could have potential medicinal uses, some of which were reported in the medical literature decades ago. Now that we’ve solved the structure of LSD bound to a receptor, we are learning what makes it so potent.”

Wacker added, “I think it’s important for the pharmaceutical industry to understand that if you modify just one tiny aspect of any compound, you might affect the way it sits in the receptor. And as a result, you might affect how the compound works.”

The National Institute of Mental Health, a Terman Faculty Fellowship, and the Michael Hooker Distinguished Chair of Pharmacology at UNC funded this research.

Other authors include UNC research associates David Nichols, PhD, Sheng Wang, PhD, Tao Che, PhD; UNC graduate students Katherine Lansu and Zachary Schools; Stanford graduate student Robin Betz and Stanford postdoctoral fellow A. J. Venkatakrishnan, PhD; and Brian Shoichet, PhD, professor of pharmaceutical chemistry at the University of California-San Francisco, and UCSF postdoc Anat Levit, PhD.



Contacts and sources:
University of North Carolina Health Care System

Cosmic Dust That Formed Our Planets Traced to Giant Stars



Scientists have identified the origin of key stardust grains present in the dust cloud from which the planets in our Solar System formed, a study suggests.

Researchers have solved a long-standing puzzle concerning the source of the grains, which formed long before our Solar System and can be recovered from meteorites that fall to Earth.

The stars that produced the dust were identified by observing how key reactions shaped the make-up of the grains, scientists say.

During their lifetime, stars around six times larger than the Sun - called Asymptotic Giant Branch or AGB stars - blow off their outer layers, forming an interstellar cloud of gas and dust grains.

Our Solar System is believed to have formed from such a cloud around 4.6 billion years ago, the team says. While most of the grains were destroyed in the process of making new rocks and planets, a small fraction survived and is present in meteorites.

Credit: Arizona State University

The chemical composition of the dust grains reveals important clues about the nuclear processes inside stars that led to their formation, the team says. Until now, however, tracing the origin of the grains to AGB stars had proven difficult.

While AGB stars are known to produce vast amounts of dust, the composition of grains recovered from meteorites did not seem to match those expected from these stars, researchers say.

The study solves this puzzle by identifying in the make-up of some meteoritic dust grains the effect of the nuclear reactions that occur in AGB stars.

A team of nuclear physicists found that fusion reactions between protons and a form of oxygen that is heavier than the type we breathe - called 17O - occur twice as often as was previously thought.

The effect of these nuclear reactions is clearly observed in some stardust grains found in meteorites, resolving the mystery of their origin, the team says.

Credit: NASA/Kepler


The discovery was made by an international team of researchers, including scientists at the University of Edinburgh, at an underground laboratory in Italy.

The Laboratory for Underground Nuclear Astrophysics - or LUNA - is located more than 1km beneath the Earth's surface. The facility is hosted by the Italian Institute for Nuclear Physics Gran Sasso Laboratory.

The study is published in the journal Nature Astronomy. The LUNA Collaboration involves around 40 scientists from 14 institutions in Italy, Germany, Hungary and the UK.

Professor Marialuisa Aliotta, of the University of Edinburgh's School of Physics and Astronomy, who led LUNA's UK team, said: "It is a great satisfaction to know that we have helped to solve a long-standing puzzle on the origin of these key stardust grains. Our study proves once again the importance of precise and accurate measurements of the nuclear reactions that take place inside stars."

Dr Maria Lugaro, of Konkoly Observatory, Hungary, who led the study, said: "The long-standing question of the missing dust was making us very uncomfortable: it undermined what we know about the origin and evolution of dust in the Galaxy. It is a relief to have finally identified this dust thanks to the renewed LUNA investigation of a crucial nuclear reaction."



Contacts and sources:
Corin Campbell
 University of Edinburgh