Unseen Is Free


Saturday, February 25, 2017

700% Surge in Infections Caused by Antibiotic-Resistant Bacteria: A Fast-Growing Problem for Kids Too

The adage that kids are growing up too fast these days now applies in yet another, unwelcome way.

In a new, first-of-its-kind study, researchers from Case Western Reserve University School of Medicine have found a 700-percent surge, among children in the US, in infections caused by bacteria of the Enterobacteriaceae family that are resistant to multiple kinds of antibiotics. These antibiotic-resistant infections are in turn linked to longer hospital stays and a potentially greater risk of death.

The research, published in the March issue of the Journal of the Pediatric Infectious Diseases Society, is the first known effort to comprehensively examine the problem of multi-drug resistant infections among patients under 18 admitted to US children’s hospitals with Enterobacteriaceae infections. Earlier studies focused mainly on adults, while some looked at young people in more limited geographical areas, such as individual hospitals or cities, or used more limited surveillance data.

Credit: Penn State

“There is a clear and alarming upswing throughout this country of antibiotic resistant Enterobacteriaceae infections in kids and teens,” said lead author Sharon B. Meropol, MD, PhD, a pediatrician and epidemiologist at Case Western Reserve University School of Medicine and Rainbow Babies and Children’s Hospital in Cleveland. “This makes it harder to effectively treat our patients’ infections. The problem is compounded because there are fewer antibiotics approved for young people than adults to begin with. Health care providers have to make sure we only prescribe antibiotics when they’re really needed. It’s also essential to stop using antibiotics in healthy agricultural animals.”

In the retrospective study, Meropol and co-authors Allison A. Haupt, MSPH, and Sara M. Debanne, PhD, both from Case Western Reserve University School of Medicine, analyzed medical data from nearly 94,000 patients under the age of 18 years diagnosed with Enterobacteriaceae-associated infections at 48 children’s hospitals throughout the US. The average age was 4.1 years. Enterobacteriaceae are a family of bacteria; some types are harmless, but they also include such pathogens as Salmonella and Escherichia coli; Enterobacteriaceae are responsible for a rising proportion of serious bacterial infections in children.

The researchers found that the share of these infections resistant to multiple antibiotics rose from 0.2 percent in 2007 to 1.5 percent in 2015, a seven-fold-plus increase in a short, eight-year span. Children with other health problems were more likely to have the infections while there were no overall differences based on sex or insurance coverage. The yearly number of discharges with Enterobacteriaceae-associated infections remained relatively stable over the course of the study years.
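The headline figure can be recovered from the two percentages quoted above; a quick arithmetic check (nothing more than the study's reported shares):

```python
# Share of Enterobacteriaceae infections resistant to multiple antibiotics,
# as reported in the study quoted above
share_2007 = 0.2  # percent of infections in 2007
share_2015 = 1.5  # percent of infections in 2015

fold_increase = share_2015 / share_2007  # a more than seven-fold rise
print(f"{fold_increase:.1f}-fold increase over eight years")
```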

In a key finding, more than 75 percent of the antibiotic-resistant infections were already present when the young people were admitted to the hospital, upending previous findings that the infections were mostly picked up in the hospital. “This suggests that the resistant bacteria are now more common in many communities,” said Meropol. For reasons that are unclear, older children and those living in the Western US were more likely to have the infections.

The investigators also found that young people with antibiotic-resistant infections stayed in the hospital 20 percent longer than those whose infections responded to antibiotic treatment. Additionally, there was a greater—but not statistically significant—risk of death among pediatric patients infected with the resistant bacterial strains.

Previous studies have shown that the problem is even worse elsewhere in the world, with an 11.4 percent global rate of antibiotic-resistant Enterobacteriaceae infections among young people, including 27 percent in Asia and the Pacific, 8.8 percent in Latin America, and 2.5 percent in Europe.

“Escalating antibiotic resistance limits our treatment options, worsens clinical results, and is a growing global public health crisis,” said Meropol. “What’s more, the development of new antibacterial drugs, especially ones appropriate for children, remains essentially stagnant. We need to stop over-using antibiotics in animals and humans and develop new ones if we want to stop a bad problem from getting worse.”

This work was supported by the National Institute of Allergy and Infectious Diseases at the National Institutes of Health [K23AI097284-01A1].

Contacts and sources: 

Human Brains Could Evolve to Require Very Little Sleep, Just Like The Cavefish

We all do it; we all need it – humans and animals alike. Sleep is an essential behavior shared by nearly all animals, and disruption of this process is associated with an array of physiological and behavioral deficits. Although many factors contribute to sleep loss, very little is known about the neural basis for interactions between sleep and sensory processing.

Neuroscientists at Florida Atlantic University have been studying Mexican cavefish to gain insight into the evolutionary mechanisms regulating sleep loss and the relationship between sensory processing and sleep. They are investigating how sleep evolves, using this species as a model to understand how human brains could evolve to require very little sleep, just like the cavefish.

The Pachón cavefish live in deep, dark caves in central Mexico, with little food, oxygen or light, and have lost their eyes completely. Their harsh environment has forced them to evolve creative survival strategies, among them suppressing sleep. They find their way around by means of their lateral lines, which are highly sensitive to fluctuating water pressure.

Credit: Pavel Masek

Their latest study, just published in the Journal of Experimental Biology, suggests that an inability to block out the environment is one way to lose sleep. It also provides a model for understanding how the brain’s sensory systems modulate sleep and sheds light on the evolution of the significant differences in sleep duration observed throughout the animal kingdom.

“Animals have dramatic differences in sleep with some sleeping as much as 20 hours and others as little as two hours and no one knows why these dramatic differences in sleep exist,” said Alex C. Keene, Ph.D., corresponding author of the study and an associate professor in the Department of Biological Sciences in FAU’s Charles E. Schmidt College of Science. “Our study suggests that differences in sensory systems may contribute to this sleep variability. It is possible that evolution drives sensory changes and changes in sleep are a secondary consequence, or that evolution selects for changes in sensory processing in order to change sleep.”

Credit: FAU Science Jupiter

Because the cave environment differs dramatically from the rivers inhabited by surface fish, cavefish have evolved robust differences in foraging and feeding behavior, raising the possibility that differences in nutrient availability contribute to the evolution of sleep loss in cave populations. Furthermore, multiple cave populations have evolved substantial reductions in sleep duration and enhanced sensory systems, suggesting that sleep loss is evolutionary and functionally associated with sensory and metabolic changes.

Key findings of the study show that the evolution of enhanced sensory capabilities contributes to sleep loss in cavefish, and that sleep in cavefish is plastic and may be regulated by seasonal changes in food availability.

There are more than 29 different populations of cavefish, many of which evolved independently. This enabled the researchers to determine whether evolution occurs through the same or different mechanisms. The Pachón cavefish, the population they studied, appear to have lost sleep due to increased sensory input, unlike the other populations.

“We were surprised to find that there are multiple independent mechanisms regulating sleep loss in different cave populations and this can be a significant strength moving forward,” said James Jaggard, first author and a graduate student at FAU working with Keene. “This means that there are many different ways to lose sleep or evolve a brain that sleeps less and we are going to search to identify these mechanisms.”

Keene, Jaggard and their colleagues use Mexican cavefish because they are a powerful system for examining trait evolution. In earlier research, they observed the evolutionary convergence on sleep loss in these fish, but the neural mechanisms underlying this dramatic behavioral shift remained elusive. Since they already knew that cavefish had also evolved a highly sensitive lateral line (the groups of sensory neurons that line the body of the fish), they wondered whether an increase in sensory input from these neurons contributes to sleep loss.

For the study, the researchers recorded the cavefish in individual tanks under infrared light. Automated video-tracking software flagged when the fish were inactive, and the team defined sleep as one minute of immobility because that duration correlated with changes in arousal threshold.
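The paper's operational definition — a sleep bout is at least one minute of continuous immobility — can be sketched as a small scoring routine. This is an illustrative reconstruction, not the authors' code; the function name, the 1 Hz sampling, and the data format are assumptions:

```python
def sleep_bouts(immobile, threshold_s=60):
    """Return (start, length) for immobility runs lasting at least threshold_s seconds.

    `immobile` is a per-second sequence of booleans derived from video tracking
    (True = fish not moving). Illustrative sketch only.
    """
    bouts, start = [], None
    for t, still in enumerate(list(immobile) + [False]):  # sentinel closes a final run
        if still and start is None:
            start = t                       # a run of immobility begins
        elif not still and start is not None:
            if t - start >= threshold_s:    # long enough to count as sleep
                bouts.append((start, t - start))
            start = None
    return bouts
```

For example, a 70-second still period counts as one sleep bout, while a 30-second pause is ignored.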

“Humans block out sensory cues when we enter a sleep-like state,” said Keene. “For example, we close our eyes and there are mechanisms in the brain to reduce auditory input. This is one of the reasons why a sensory stimulus like someone entering a room is less likely to get your attention if you are asleep. Our thinking was that cavefish have to some degree lost this ability and this drives sleep loss.”

The researchers recently generated transgenic fish lines and they will be able to image brain activity and genetically map anatomical differences between the Mexican cavefish populations.

This study is supported by a grant from the National Science Foundation (1601004).

Contacts and sources:
Gisele Galoustian
Florida Atlantic University

Cat Ownership Not Linked to Mental Health Problems

New UCL research has found no link between cat ownership and psychotic symptoms, casting doubt on previous suggestions that people who grew up with cats are at higher risk of mental illness.

Recent research has suggested that cat ownership might contribute to some mental disorders, because cats are the primary host of the common parasite Toxoplasma gondii (T. gondii), itself linked to mental health problems such as schizophrenia.

"The message for cat owners is clear: there is no evidence that cats pose a risk to children's mental health," says lead author Dr Francesca Solmi (UCL Psychiatry). "In our study, initial unadjusted analyses suggested a small link between cat ownership and psychotic symptoms at age 13, but this turned out to be due to other factors. Once we controlled for factors such as household over-crowding and socioeconomic status, the data showed that cats were not to blame. Previous studies reporting links between cat ownership and psychosis simply failed to adequately control for other possible explanations."

Credit: UCL

The new study, published in Psychological Medicine, suggests that cat ownership in pregnancy and childhood does not play a role in developing psychotic symptoms during adolescence. The study looked at nearly 5000 people born in 1991 or 1992 who were followed up until the age of 18. The researchers had data on whether the household had cats while the mother was pregnant and when the children were growing up.

The new study was significantly more reliable than previous research in this area since the team looked at families who were followed up regularly for almost 20 years. This is much more reliable than methods used in previous studies, which asked people with and without mental health problems to remember details about their childhood. Such accounts are more vulnerable to errors in recall, which can lead to spurious findings.

Previous studies were also relatively small and had significant gaps in the data, whereas the new study looked at a large population and was able to account for missing data. The new study was not able to measure T. gondii exposure directly, but the results suggest that if the parasite does cause psychiatric problems then cat ownership does not significantly increase exposure.

6 paintings of cats by Louis Wain with an increasing degree of abstractedness, attributed by some to his suffering from schizophrenia
Credit: Louis Wain

"Our study suggests that cat ownership during pregnancy or in early childhood does not pose a direct risk for later psychotic symptoms," explains senior author Dr James Kirkbride (UCL Psychiatry). "However, there is good evidence that T. gondii exposure during pregnancy can lead to serious birth defects and other health problems in children. As such, we recommend that pregnant women should continue to follow advice not to handle soiled cat litter in case it contains T. gondii."

Contacts and sources:
Harry Dayantis
University College London

Simple Rule Predicts an Ice Age’s End

A simple rule can accurately predict when Earth’s climate warms out of an ice age, according to new research led by UCL.

In a new study published in Nature, researchers from UCL, University of Cambridge and University of Louvain have combined existing ideas to solve the problem of which solar energy peaks in the last 2.6 million years led to the melting of the ice sheets and the start of a warm period.

During this interval, Earth’s climate has alternated between cold (glacial) and warm (interglacial) periods. In cold times, ice sheets advanced over large parts of North America and northern Europe; in warm periods like the present, the ice sheets retreated completely.

The Antarctic ice sheet 

Credit: Stephen Hudson via Wikimedia Commons

It has long been realised that these cycles were paced by astronomical changes in the Earth’s orbit around the Sun and in the tilt of its axis, which change the amount of solar energy available to melt ice at high northern latitudes in summer.

However, of the 110 incoming solar energy peaks (one roughly every 21,000 years), only 50 led to complete melting of the ice sheets. Finding a way to translate the astronomical changes into the sequence of interglacials has previously proved elusive.

Professor Chronis Tzedakis (UCL Geography) said: “The basic idea is that there is a threshold for the amount of energy reaching high northern latitudes in summer. Above that threshold, the ice retreats completely and we enter an interglacial.”

From 2.6 to 1 million years ago, the threshold was reached roughly every 41,000 years, and this predicts almost perfectly when interglacials started and the ice sheets disappeared. Professor Eric Wolff (University of Cambridge) said: “Simply put, every second solar energy peak occurs when the Earth’s axis is more inclined, boosting the total energy at high latitudes above the threshold.”

Somewhere around a million years ago, the threshold rose, so that the ice sheets kept growing for longer than 41,000 years. However, as a glacial period lengthens, ice sheets become larger, but also more unstable.

The researchers combined these observations into a simple model, based only on solar energy and the waiting time since the previous interglacial, that was able to predict all the interglacial onsets of the last million years, which occur roughly every 100,000 years.
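A toy version of such a rule can be written down: a deglaciation is triggered when a peak's energy exceeds a threshold that effectively falls as the time since the last interglacial grows. The sketch below only illustrates the logic described above; the function, parameter names, and all numbers are invented for illustration, not the authors' model:

```python
def predict_interglacials(peaks, base_threshold, ramp):
    """Flag which insolation peaks trigger a deglaciation (toy model).

    peaks: list of (time_kyr, energy) insolation maxima, oldest first.
    An interglacial starts when a peak's energy exceeds an effective
    threshold that declines with elapsed time since the last interglacial,
    mimicking ice sheets growing larger but more unstable with age.
    """
    onsets, last = [], None
    for t, energy in peaks:
        elapsed = (t - last) if last is not None else 0.0
        effective = base_threshold - ramp * elapsed  # threshold eases as ice ages
        if energy >= effective:
            onsets.append(t)
            last = t  # reset the waiting clock
    return onsets
```

With peaks every ~21 kyr, a weak peak shortly after an interglacial fails, while a later peak clears the lowered effective threshold, reproducing the "skipped beat" pattern described above.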

Dr Takahito Mitsui (University of Louvain) said: “The next step is to understand why the energy threshold rose around a million years ago – one idea is that this was due to a decline in the concentration of CO2, and this needs to be tested.”

The results explain why we have been in a warm period for the last 11,000 years: despite the weak increase in solar energy, ice sheets retreated completely during our current interglacial because of the very long waiting time since the previous interglacial and the accumulated instability of ice sheets.

Intriguingly, the researchers found that sometimes the amount of energy was very close to the threshold, so that some interglacials were just aborted, while others just made it. “The threshold was only just missed 50,000 years ago. If it hadn’t been missed, then we wouldn’t have had an interglacial in the last 11,000 years,” added Professor Michel Crucifix (University of Louvain).

However, statistical analysis shows that the succession of interglacials is not chaotic: the sequence that has occurred is one among a very small set of possibilities. “Finding order among what can look like unpredictable swings in climate is aesthetically rather pleasing” said Professor Tzedakis.

Contacts and sources:
Ruth Howells
University College London 

FuturaCorp: A.I. Will Make Us More Human by Eliminating Workplace Drudgery, Says New Research

The arrival of Artificial Intelligence (AI) in the workplace could triple productivity by automating more than 80 per cent of repetitive, process-oriented tasks - freeing human minds from tedium and enabling them to focus on creating and innovating, according to research from Goldsmiths, University of London and IPsoft.

The result will be a revolutionary shift in workplace productivity and a fundamental restructuring of work as we know it as humans are redeployed in higher-skill roles. 

The study, FuturaCorp: Artificial Intelligence & The Freedom To Be Human, paints a vision of ‘FuturaCorp’ – an idealised man + machine workplace of tomorrow.

Credit: IPsoft

The research describes job roles as comprised of a series of tasks. Some are repetitive and process-oriented (deterministic). Some require a human working in concert with machines (probabilistic). Some rely on the types of connections that can only be made by the human brain, from ideas generation to complex problem solving (cross-functional reasoning).

The Goldsmiths team predicts that, in the near future:

• More than 80% of deterministic tasks will be done by machines
• Probabilistic tasks will be shared 50:50 by machines and humans
• But humans will still carry out 80% of all cross-functional reasoning tasks
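Combining the three figures above gives a rough sense of the overall machine share of work for any given task mix. The 40/30/30 split below is a hypothetical example role, not a number from the study:

```python
# Predicted machine share per task type (the Goldsmiths figures quoted above)
machine_share = {"deterministic": 0.80, "probabilistic": 0.50, "cross_functional": 0.20}

# Hypothetical task mix for an example role (assumption, for illustration only)
task_mix = {"deterministic": 0.4, "probabilistic": 0.3, "cross_functional": 0.3}

# Weighted average: fraction of this role's work done by machines
overall_machine = sum(machine_share[k] * task_mix[k] for k in task_mix)
# 0.8*0.4 + 0.5*0.3 + 0.2*0.3 = 0.53
```

Under these assumptions just over half of the role's tasks shift to machines, with the human share concentrated in cross-functional reasoning.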

Dr. Chris Brauer, Director of Innovation and a Senior Lecturer at Goldsmiths, University of London, says: "AI will do far more than automate existing processes. It will free our minds from process-oriented repetition, enabling a refocusing of time and capital for our most human of pursuits: innovation and creativity. So the arrival of AI in workplaces will engender entirely new, unknown possibilities for humans and what they can achieve."

The study paints an optimistic picture of the future for individuals, pointing out that previous waves of automation have led to low-skill work being replaced by new, higher-skill jobs. It predicts that the arrival of the robots in the workplace will make us more human, pointing to crucial human skills that we will need to nurture to complement our digital colleagues.

Chetan Dube, CEO and President of IPsoft said: “AI engenders emergent individual qualities which push us to access the more complex parts of our minds. When routine work is automated, we will be able – and required – to flex our most human of skills. To do what the machines can’t, and likely never will be able to do. The future of society relies on individuals accessing higher reasoning, critical thinking and complex problem solving skills.” 

Credit: IPsoft

However, the need for rapid skill transformation could lead to a near-term skills shortage, according to the research.

The Goldsmiths team found little widespread evidence of businesses, universities and training institutions preparing individuals to manage these looming future shifts.

Finally, the research team developed in liaison with IPsoft a first-of-its-kind ‘organisational readiness equation’ for business leaders to assess how equipped their company is to take its first brave steps into an AI future. The equation scores an organisation in relation to the utopian vision of FuturaCorp, and helps leaders to determine what changes need to be made to push the business model towards this ideal.

Chetan Dube concludes: “CEOs must be prepared to redefine their business in order to capitalise on the productivity potential of AI. That journey begins with fundamental change to organization structure, who they hire for which roles, and how they use the new relationship between humans and machines to maximize efficiency and innovation.”

Contacts and sources:
Oliver Fry, Goldsmiths, University of London

Citation: FuturaCorp: Artificial Intelligence & the Freedom to Be Human

New “Tougher-Than-Metal” Fiber-Reinforced Hydrogels

A team of Hokkaido University scientists has succeeded in creating “fiber-reinforced soft composites”: tough hydrogels combined with woven fiber fabric. These composites are highly flexible, tougher than metals, and have a wide range of potential applications.

Efforts are currently underway around the world to create materials that are friendly to both society and the environment. Among them are composite materials that combine different components so as to exhibit the merits of each.

The newly developed fiber-reinforced hydrogel consists of polyampholyte (PA) gels and glass fiber fabric. The team theorizes that toughness is increased by dynamic ionic bonds between the fiber and hydrogels, and within the hydrogels. 
Credit: Hokkaido University

Hokkaido University researchers, led by Professor Jian Ping Gong, have focused on creating a reinforced material using hydrogels. Though such substances have potential as structural biomaterials, until now no material reliable and strong enough for long-term use had been produced. This study was conducted as a part of the Cabinet Office’s Impulsing Paradigm Change through Disruptive Technologies Program (ImPACT).

To address the problem, the team combined hydrogels containing high levels of water with glass fiber fabric to create bendable, yet tough materials, employing the same method used to produce reinforced plastics. The team found that a combination of polyampholyte (PA) gels, a type of hydrogel they developed earlier, and glass fiber fabric with a single fiber measuring around 10μm in diameter produced a strong, tensile material. The procedure to make the material is simply to immerse the fabric in PA precursor solutions for polymerization.

In terms of the energy required to destroy them, the fiber-reinforced hydrogels are 25 times tougher than glass fiber fabric alone and 100 times tougher than the hydrogels alone; combining the materials produces a synergistic toughening. The team theorizes that toughness is increased by dynamic ionic bonds between the fiber and the hydrogels, and within the hydrogels, as the fiber’s toughness increases in relation to that of the hydrogels. Consequently, the newly developed hydrogels are five times tougher than carbon steel.

“The fiber-reinforced hydrogels, with a 40 percent water level, are environmentally friendly,” says Professor Jian Ping Gong. “The material has multiple potential applications because of its reliability, durability and flexibility. For example, in addition to fashion and manufacturing uses, it could be used for artificial ligaments and tendons, which are subject to strong load-bearing tensions.” The principles underlying the toughness achieved in this study can also be applied to other soft components, such as rubber.

Contacts and sources:
Professor Jian Ping Gong
Graduate School of Life Science
Hokkaido University

Measuring the True Size of Gods and Giants

Archeological artefacts, such as the Jupiter Column of Ladenburg, a town with an impressive Roman history, hold many as yet undiscovered secrets. Although the column was discovered in 1973, the history of the more than 1,800-year-old monument remains unclear.

The HEiKA MUSIEKE project is aimed at uncovering some of these secrets and making the cultural heritage of Ladenburg visible and perceptible. For this purpose, modern digitization techniques of Karlsruhe Institute of Technology (KIT) are used.

“Contact-free digitization of objects opens up new approaches to research,” Dr. Thomas Vögtle of KIT’s Institute of Photogrammetry and Remote Sensing says. The Jupiter Column is about four meters high and combines Roman and Germanic symbols and conceptions. The figures on the column represent the battle between the Roman god Jupiter and a giant. The texture of the column and the equestrian figure, however, appear to follow Celtic tradition. “The digital model makes archeologists and laymen experience the artefact in an entirely new way.”

Digitization of the Jupiter Column makes this cultural heritage perceptible by both archaeologists and laymen. 

To model the three-dimensional structure of the column on the computer, the KIT team uses a professional, commercially available digital single-lens reflex camera of 36 megapixels resolution with conventional illumination technology. “Our hardware is robust and mobile so that we can collect our data easily, rapidly, and at low costs at any place,” Vögtle explains. 

On a single working day, the team took about 800 photos of the column from all perspectives. On the computer, characteristic features of the column were identified and linked across the different images. The information in the two-dimensional photos was processed to yield a photorealistic, three-dimensional model, in which structures hardly visible to the naked eye can be examined. “The computer model then is the basis for further work of archeologists.”
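The core geometric step, turning matched 2-D features into 3-D positions, amounts to intersecting viewing rays from different camera positions. A minimal 2-D sketch of that idea follows; real photogrammetry works in 3-D with calibrated cameras, and this is an illustration of the principle, not the KIT team's software:

```python
import math

def intersect_bearings(c1, theta1, c2, theta2):
    """Locate a feature seen from two camera positions (2-D toy example).

    Each camera ci = (x, y) observes the feature along direction theta
    (radians). Solving c1 + t1*d1 = c2 + t2*d2 recovers the feature's
    position: the same triangulation principle photogrammetry applies
    to features matched across many photographs.
    """
    d1 = (math.cos(theta1), math.sin(theta1))
    d2 = (math.cos(theta2), math.sin(theta2))
    # 2x2 linear system [d1 | -d2] (t1, t2)^T = c2 - c1, solved by Cramer's rule
    det = d1[0] * (-d2[1]) - (-d2[0]) * d1[1]
    if abs(det) < 1e-12:
        raise ValueError("rays are parallel; no unique intersection")
    bx, by = c2[0] - c1[0], c2[1] - c1[1]
    t1 = (bx * (-d2[1]) - (-d2[0]) * by) / det
    return (c1[0] + t1 * d1[0], c1[1] + t1 * d1[1])
```

Two cameras at (0, 0) and (10, 0) that both sight a point at (5, 5) recover its position exactly; with ~800 photos, many such constraints per feature are combined in a least-squares sense.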

“Digital objects may also provide laymen with a new experience of cultural heritage,” Dr. Ralf Schneider of the ZAK I Center for Cultural and General Studies of KIT says. He coordinates the HEiKA-MUSIEKE (Multidimensional Perceptibility of Cultural Heritage) project. Large parts of our cultural heritage have long faded from public attention. With the help of digital methods, cultural heritage can be acquired, analyzed, and presented to a broader public in a new way, in a context that is also understandable by laymen.

The Jupiter Column from Ladenburg. 
Photo: KIT/IPF

The MUSIEKE project combines archeology, remote sensing, forensic computer science, geoinformatics, and applied cultural science to make cultural heritage perceptible. Apart from the digitization of artefacts, it also covers the generation of databases with geoinformation or production of digital maps of various historic stages of settlements and cities.

Vögtle normally uses photogrammetry and digitization methods for technical purposes. Based on aerial photos, he determines the orientation of roofs in cities for finding out whether they are suited for the installation of solar facilities. In industrial production, camera photos are used to find out whether the product was produced with the required accuracy and can be used in the next production stage or needs to be adjusted. Or the progress of construction of an underground station can be compared with the planned target. “In production or in the construction sector in particular, objects have to be measured in a contact-free, automatic, and rapid way. Cameras and digitization are very valuable tools for this purpose,” Vögtle says.

Contacts and sources:
Kosta Schinarakis

Darwin’s “Abominable Mystery”: Where Do Flowers Come From? Researchers Find Clues

The mystery that is the origin of flowering plants has been partially solved thanks to a team of French researchers.  

Their discovery, published in the journal New Phytologist on February 24, 2017, sheds light on a question that much intrigued Darwin: the appearance of a structure as complex as the flower over the course of evolution.  The team was made up of researchers from the Laboratoire de Physiologie Cellulaire et Végétale (CNRS/Inra/CEA/Université Grenoble Alpes), in collaboration with the Reproduction et Développement des Plantes laboratory (CNRS/ENS Lyon/Inra/Université Claude Bernard Lyon 1) and Kew Gardens (UK).

Terrestrial flora is today dominated by flowering plants. They provide our food and contribute color to the plant world. But they have not always existed. While plants colonized the land over 400 million years ago, flowering plants appeared only 150 million years ago. They were directly preceded by a group known as the gymnosperms, whose mode of reproduction is more rudimentary and whose modern-day representatives include conifers.

Detail of a Welwitschia mirabilis plant showing its two leaves and male cones. 
Credit: Michael W. Frohlich 

Darwin long pondered the origin and rapid diversification of flowering plants, describing them as an “abominable mystery”. In comparison with gymnosperms, which possess rather rudimentary male and female cones (like the pine cone), flowering plants present several innovations: the flower contains the male organs (stamens) and the female organs (pistil), surrounded by petals and sepals, while the ovules, instead of being naked, are protected within the pistil.

A female Welwitschia mirabilis plant in its natural environment in the desert of Namibia. 
Credit: Stephen G. Weller & Ann K. Sakai

How was nature able to invent the flower, a structure so different from that of cones? The team led by François Parcy, a CNRS senior researcher at the Cell and Plant Physiology Laboratory (CNRS/Inra/CEA/Université Grenoble Alpes), has just provided part of the answer. To do so, the researchers studied a rather original gymnosperm called Welwitschia mirabilis. This plant, which can live for more than a millennium, grows in the extreme conditions of the deserts of Namibia and Angola, and, like other gymnosperms, possesses separate male and female cones.

Close-up on male cones, on which pollen can be seen. 

Credit: Michael W. Frohlich

 What is exceptional is that the male cones possess a few sterile ovules and nectar, which indicates a failed attempt to invent the bisexual flower. Yet, in this plant (as well as in certain conifers), the researchers found genes similar to those responsible for the formation of flowers, and which are organized according to the same hierarchy (with the activation of one gene activating the next gene, and so on)!

The fact that a similar gene cascade has been found in flowering plants and their gymnosperm cousins indicates that this is inherited from their common ancestor. This mechanism did not have to be invented at the time of the origins of the flower: it was simply inherited and reused by the plant, a process that is often at work in evolution.

The study of the current biodiversity of plants thus enables us to go back in time and gradually sketch the genetic portrait of the common ancestor of a large proportion of modern-day flowers. The team is continuing to study other traits to better understand how the first flower emerged.

Contacts and sources:
CNRS (Délégation Paris Michel-Ange)

Citation: A link between LEAFY and B-gene homologs in Welwitschia mirabilis sheds light on ancestral mechanisms prefiguring floral development, Edwige Moyroud, Marie Monniaux, Emmanuel Thévenon, Renaud Dumas, Charles P. Scutt, Michael W. Frohlich, François Parcy. New Phytologist, 24 February 2017. DOI: 10.1111/nph.14483

Violently Exploding Ice

A droplet of water that freezes from the outside in shows an exciting series of rapid physical changes before it violently explodes. This is shown by researchers of the University of Twente in The Netherlands in their paper in Physical Review Letters of February 24.

A perfectly spherical droplet of water that freezes from the outside forms a shell first, with the diameter of the droplet. As the shell thickens inward, the remaining water ‘gets a problem’: it wants to expand, but is confined by the shell. After some time, this causes the droplet to explode. But what happens in between?

Credit: University of Twente

Shell formation

To create spherical droplets, the UT scientists placed them on a hydrophobic surface inside a vacuum chamber. The temperature is lowered by evaporative cooling until the droplet is ‘supercooled’: well below zero degrees Celsius, yet still liquid. The first ice crystal is triggered by touching the droplet with a silver iodide tip. A skin of ice then forms rapidly, thickening from the outside in. The video images show cracks forming in the shell, a peak rising on the surface, and vapour cavities appearing underneath it. After ejecting some ice flakes, the whole droplet finally explodes, with the ice fragments flying off at about 1.5 m/s.
Credit: Sander Wildeman


The mathematical model Sander Wildeman and his colleagues developed to explain this series of events shows that there is a minimum droplet size: below a diameter of 50 microns, no explosion takes place. Meteorologists already know the phenomenon of droplet explosions in the cold tops of clouds, where they play a role in the onset of precipitation and can lead to the rapid transformation of a cloud of liquid droplets into a cloud of ice particles.

‘Dutch tears’

The experiments bear some resemblance to a well-known way of producing hard glass, already known to Dutch glass-blowers in the 17th century. Droplets of molten glass dropped into cold water also solidify from the outside in, forming a shell of glass around the melt; the results are known as ‘Dutch tears’. The difference with water is that glass shrinks when it solidifies, which makes the glass sphere very strong.

Max Planck Center

The research has been done in the Physics of Fluids group of Prof. Detlef Lohse. This group will be part of the new Max Planck - University of Twente Center for Complex Fluid Dynamics that will be opened March 3, 2017.

Contacts and sources:
Wiebe Van Der Veen
University of Twente

Citation: ‘Fast Dynamics of Water Droplets Freezing from the Outside In’, by Sander Wildeman, Sebastian Sterl, Chao Sun and Detlef Lohse, Physical Review Letters, February 24, 2017.

Vast Cosmic Mystery Posed by Giant Luminous Nebula

Astronomers have found an enormous, glowing blob of gas in the distant universe, with no obvious source of power for the light it is emitting. Called an "enormous Lyman-alpha nebula" (ELAN), it is the brightest and among the largest of these rare objects, only a handful of which have been observed.

ELANs are huge blobs of gas surrounding and extending between galaxies in the intergalactic medium. They are thought to be parts of the network of filaments connecting galaxies in a vast cosmic web. Previously discovered ELANs are likely illuminated by the intense radiation from quasars, but it's not clear what is causing the hydrogen gas in the newly discovered nebula to emit Lyman-alpha radiation (a characteristic wavelength of light absorbed and emitted by hydrogen atoms).

The newly discovered nebula was found at a distance of 10 billion light years in the middle of a region with an extraordinary concentration of galaxies. Researchers found this massive overdensity of early galaxies, called a "protocluster," through a novel survey project led by Zheng Cai, a Hubble Postdoctoral Fellow at UC Santa Cruz.

MAMMOTH-1 is an extended blob of gas in the intergalactic medium called an enormous Lyman-alpha nebula (ELAN). The color map and contours denote the surface brightness of the nebula, and the red arrows show its estimated spatial extent.

Credit: Cai et al., Astrophysical Journal (Figure 2)

"Our survey was not trying to find nebulae. We're looking for the most overdense environments in the early universe, the big cities where there are lots of galaxies," said Cai. "We found this enormous nebula in the middle of the protocluster, near the peak density."

Cai is first author of a paper on the discovery accepted for publication in the Astrophysical Journal and available online at arxiv.org/abs/1609.04021. His survey project is called Mapping the Most Massive Overdensities Through Hydrogen (MAMMOTH), and the newly discovered ELAN is known as MAMMOTH-1.

Coauthor J. Xavier Prochaska, professor of astronomy and astrophysics at UC Santa Cruz, said previously discovered ELANs have been detected in quasar surveys. In those cases, the intense radiation from a quasar illuminated hydrogen gas in the nebula, causing it to emit Lyman-alpha radiation. Prochaska's team discovered the first ELAN, dubbed the "Slug Nebula," in 2014. MAMMOTH-1 is the first one not associated with a visible quasar, he said.

"It's extremely bright, and it's probably larger than the Slug Nebula, but there's nothing else visible except the faint smudge of a galaxy. So it's a terrifically energetic phenomenon without an obvious power source," Prochaska said.

Equally impressive is the enormous protocluster in which it resides, he said. Protoclusters are the precursors to galaxy clusters, which consist of hundreds to thousands of galaxies bound together by gravity. Because protoclusters are spread out over a much larger area of the sky, they are much harder to find than galaxy clusters.

The protocluster hosting the MAMMOTH-1 nebula is massive, with an unusually high concentration of galaxies in an area about 50 million light years across. Because it is so far away (10 billion light years), astronomers are in effect looking back in time to see the protocluster as it was 10 billion years ago, or about 3 billion years after the big bang, during the peak epoch of galaxy formation. After evolving for 10 billion more years, this protocluster would today be a mature galaxy cluster perhaps only one million light years across, having collapsed down to a much smaller area, Prochaska said.
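The lookback arithmetic in the paragraph above can be checked in a few lines. This is only an illustrative sketch: it takes the quoted "10 billion light years" as a light-travel distance, and assumes a present age of the universe of about 13.8 billion years (a standard round figure, not stated in the article).

```python
# Rough lookback-time check for the MAMMOTH-1 protocluster.
# Assumption (not from the article): the universe is ~13.8 billion years old.
AGE_OF_UNIVERSE_GYR = 13.8
LOOKBACK_GYR = 10.0  # light-travel time quoted for the protocluster

# Cosmic time at which the light we see today was emitted:
time_after_big_bang_gyr = AGE_OF_UNIVERSE_GYR - LOOKBACK_GYR
print(f"Light emitted ~{time_after_big_bang_gyr:.1f} Gyr after the big bang")
```

With these round numbers the light was emitted roughly 3–4 billion years after the big bang, in line with the article's "about 3 billion years".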

The standard cosmological model of structure formation in the universe predicts that galaxies are embedded in a cosmic web of matter, most of which is invisible dark matter. The gas that collapses to form galaxies and stars traces the distribution of dark matter and extends beyond the galaxies along the filaments of the cosmic web. The MAMMOTH-1 nebula appears to have a filamentary structure that aligns with the galaxy distribution in the large-scale structure of the protocluster, supporting the idea that ELANs are illuminated segments of the cosmic web, Cai said.

"From the distribution of galaxies we can infer where the filaments of the cosmic web are, and the nebula is perfectly aligned with that structure," he said.

Cai and his coauthors considered several possible mechanisms that could be powering the Lyman-alpha emission from the nebula. The most likely explanations involve radiation or outflows from an active galactic nucleus (AGN) that is strongly obscured by dust so that only a faint source can be seen associated with the nebula. An AGN is powered by a supermassive black hole actively feeding on gas in the center of a galaxy, and it is usually an extremely bright source of light (quasars being the most luminous AGNs in visible light).

The intense radiation from an AGN can ionize the gas around it (called photoionization), and this may be one mechanism at work in MAMMOTH-1. When ionized hydrogen in the nebula recombines it would emit Lyman-alpha radiation. Another possible mechanism powering the Lyman-alpha emissions is shock heating by a powerful outflow of gas from the AGN.

The researchers described several lines of evidence supporting the existence of a hidden AGN energizing the nebula, including the dynamics of the gas and emissions from other elements besides hydrogen, notably helium and carbon.

"It has all the hallmarks of an AGN, but we don't see anything in our optical images. I expect there's a quasar that is so obscured by dust that most of its light is hidden," Prochaska said.

Contacts and sources:
Tim Stephens
University of California Santa Cruz

38,000-Year-Old Engravings Confirm Ancient Origins of Pointillist Technique Used By Seurat, Van Gogh

A newly discovered trove of 16 engraved and otherwise modified limestone blocks, created 38,000 years ago, confirms the ancient origins of the pointillist techniques later adopted by 19th and 20th century artists such as Georges Seurat, Vincent Van Gogh, Camille Pissarro, and Roy Lichtenstein.

"We're quite familiar with the techniques of these modern artists," observes New York University anthropologist Randall White, who led the excavation in France's Vézère Valley. "But now we can confirm this form of image-making was already being practiced by Europe's earliest human culture, the Aurignacian."

Pointillism, a painting technique in which small dots are used to create the illusion of a larger image, was developed in the 1880s. However, archaeologists have now found evidence of this technique thousands of years earlier -- dating back more than 35,000 years.

Newly discovered limestone slab from Abri Cellier with a pointillist mammoth in profile view, formed by dozens of individual punctuations and by re-shaping the natural edge of the block to conform to the animal's head and back line.

Photo and drawing by R. Bourrillon.

The findings appear in the journal Quaternary International.

Major discoveries by White and his colleagues--which include images of mammoths and horses--confirm that a form of pointillism was used by the Aurignacian, the earliest modern human culture in Europe. These add weight to previous isolated discoveries, such as a rhinoceros, from the Grotte Chauvet in France, formed by the application of dozens of dots, first painted on the palm of the hand, and then transferred to the cave wall.

Earlier this year, White's team reported the uncovering of a 38,000-year-old pointillist image of an aurochs or wild cow--a finding that marks some of the earliest known graphic imagery found in Western Eurasia and offers insights into the nature of modern humans during this period. Now, in short order they have found another pointillist image--this time of a woolly mammoth--in a rock shelter of the same period known as Abri Cellier located near the previous find-site of Abri Blanchard.

Abri Cellier has long been on archeologists' short-list of major art-bearing sites attributed to the European Aurignacian. Excavations in 1927 yielded 15 engraved and/or pierced limestone blocks that have served as a key point of reference for the study of Aurignacian art in the region.

This is a graphic rendering of the recently published Blanchard aurochs illustrating the arrangement of punctuations in relation to the animal.

Photo and drawing by R. Bourrillon

In 2014, White and his colleagues returned to Cellier, seeking intact deposits that would allow a better understanding of the archaeological sequence at the site and its relationship to other Aurignacian sites. They had their fingers crossed that the new excavation might yield new engraved images in context, but nothing prepared them for the discovery of the 16 stone blocks detailed in the Quaternary International article. One of these, broken in half prehistorically, was found in place with a radiocarbon date of 38,000 years ago.

Remarkably, the remaining 15 blocks, including the pointillist mammoth, one of three mammoth figures recognized during the new work at Cellier, had been left on-site by the 1927 excavators. As many of the engraved traces are rudimentary and thus difficult to interpret, the original excavators set them aside just in case they might have something inscribed on them. The new article presents evidence that the 38,000 year date for the newly excavated engraving also applies to the new trove and to the other blocks found in 1927 and now housed in the French National Prehistory Museum.

Over the past decade, with these and other discoveries, White and his team have increased our known sample of the earliest graphic arts in southwestern France by 40 percent. The team includes researchers from the University of Arizona, the University of Toronto, the University of Toulouse, Paris' Museum of Natural History, and the University of Oxford.

The research appearing in Quaternary International was supported by the Partner University Fund and the Andrew Mellon Foundation, the Direction régional des affaires culturelles d'Aquitaine (DRAC-Aquitaine), the Institut des Sciences Humaines et Sociales (INSHS) of the CNRS, the Faculty of Arts and Science at NYU, and the Fyssen Foundation.

Contacts and sources:
James Devitt
New York University

Friday, February 24, 2017

Stunning Images from 30 Years of Observing Supernova 1987a

Three decades ago, astronomers spotted one of the brightest exploding stars in more than 400 years. The titanic supernova, called Supernova 1987A (SN 1987A), blazed with the power of 100 million suns for several months following its discovery on Feb. 23, 1987.

Since that first sighting, SN 1987A has continued to fascinate astronomers with its spectacular light show. Located in the nearby Large Magellanic Cloud, it is the nearest supernova explosion observed in hundreds of years and the best opportunity yet for astronomers to study the phases before, during, and after the death of a star.

The video begins with a nighttime view of the Small and Large Magellanic clouds, satellite galaxies of our Milky Way. It then zooms into a rich star-birth region in the Large Magellanic Cloud. Nestled between mountains of red-colored gas is the odd-looking structure of Supernova 1987A, the remnant of an exploded star that was first observed in February 1987. The site of the supernova is surrounded by a ring of material that is illuminated by a wave of energy from the outburst. Two faint outer rings are also visible. All three rings existed before the explosion as fossil relics of the doomed star’s activity in its final days.
Credits: NASA, ESA, and G. Bacon (STScI)

To commemorate the 30th anniversary of SN 1987A, new images, time-lapse movies, a data-based animation based on work led by Salvatore Orlando at INAF-Osservatorio Astronomico di Palermo, Italy, and a three-dimensional model are being released. By combining data from NASA's Hubble Space Telescope and Chandra X-ray Observatory, as well as the international Atacama Large Millimeter/submillimeter Array (ALMA), astronomers — and the public — can explore SN 1987A like never before.

This Hubble Space Telescope image shows Supernova 1987A within the Large Magellanic Cloud, a neighboring galaxy to our Milky Way.

Credits: NASA, ESA, R. Kirshner (Harvard-Smithsonian Center for Astrophysics and Gordon and Betty Moore Foundation), and M. Mutchler and R. Avila (STScI)

This time-lapse video sequence of Hubble Space Telescope images reveals dramatic changes in a ring of material around the exploded star Supernova 1987A. The images, taken from 1994 to 2016, show the effects of a shock wave from the supernova blast smashing into the ring. The ring begins to brighten as the shock wave hits it. The ring is about one light-year across.

Credits: NASA, ESA, and R. Kirshner (Harvard-Smithsonian Center for Astrophysics and Gordon and Betty Moore Foundation), and P. Challis (Harvard-Smithsonian Center for Astrophysics)

Hubble has repeatedly observed SN 1987A since 1990, accumulating hundreds of images, and Chandra began observing SN 1987A shortly after its deployment in 1999. ALMA, a powerful array of 66 antennas, has been gathering high-resolution millimeter and submillimeter data on SN 1987A since its inception.

"The 30 years' worth of observations of SN 1987A are important because they provide insight into the last stages of stellar evolution," said Robert Kirshner of the Harvard-Smithsonian Center for Astrophysics in Cambridge, Massachusetts, and the Gordon and Betty Moore Foundation in Palo Alto, California.

The latest data from these powerful telescopes indicate that SN 1987A has passed an important threshold. The supernova shock wave is moving beyond the dense ring of gas produced late in the life of the pre-supernova star when a fast outflow or wind from the star collided with a slower wind generated in an earlier red giant phase of the star's evolution. What lies beyond the ring is poorly known at present, and depends on the details of the evolution of the star when it was a red giant.

This scientific visualization, using data from a computer simulation, shows Supernova 1987A as the luminous ring of material we see today.
Credits: NASA, ESA, and F. Summers and G. Bacon (STScI); Simulation Credit: S. Orlando (INAF-Osservatorio Astronomico di Palermo)

"The details of this transition will give astronomers a better understanding of the life of the doomed star, and how it ended," said Kari Frank of Penn State University who led the latest Chandra study of SN 1987A.

Supernovas such as SN 1987A can stir up the surrounding gas and trigger the formation of new stars and planets. The gas from which these stars and planets form will be enriched with elements such as carbon, nitrogen, oxygen and iron, which are the basic components of all known life. These elements are forged inside the pre-supernova star and during the supernova explosion itself, and then dispersed into their host galaxy by expanding supernova remnants. Continued studies of SN 1987A should give unique insight into the early stages of this dispersal.

Some highlights from studies involving these telescopes include:

Hubble studies have revealed that the dense ring of gas around the supernova is glowing in optical light, and has a diameter of about a light-year. The ring was there at least 20,000 years before the star exploded. A flash of ultraviolet light from the explosion energized the gas in the ring, making it glow for decades.

These images, taken between 1994 and 2016 by NASA's Hubble Space Telescope, chronicle the brightening of a ring of gas around an exploded star.

Credits: NASA, ESA, and R. Kirshner (Harvard-Smithsonian Center for Astrophysics and Gordon and Betty Moore Foundation), and P. Challis (Harvard-Smithsonian Center for Astrophysics)

The central structure visible inside the ring in the Hubble image has now grown to roughly half a light-year across. Most noticeable are two blobs of debris in the center of the supernova remnant racing away from each other at roughly 20 million miles an hour.
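The quoted debris speed is easier to grasp in metric units and as a fraction of the speed of light. A quick conversion, purely illustrative, using standard unit factors:

```python
# Convert the quoted debris speed of ~20 million miles per hour.
MPH = 20_000_000          # quoted speed, miles per hour
KM_PER_MILE = 1.609344    # international mile, exact definition
C_KM_S = 299_792.458      # speed of light in km/s

speed_km_s = MPH * KM_PER_MILE / 3600  # hours -> seconds
fraction_of_c = speed_km_s / C_KM_S

# ~8,941 km/s, about 3% of the speed of light
print(f"{speed_km_s:,.0f} km/s (~{fraction_of_c:.1%} of the speed of light)")
```

So the two blobs of debris are separating at roughly 9,000 km/s, a few percent of light speed.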

From 1999 until 2013, Chandra data showed an expanding ring of X-ray emission that had been steadily getting brighter. The blast wave from the original explosion has been bursting through and heating the ring of gas surrounding the supernova, producing X-ray emission.

Astronomers combined observations from three different observatories to produce this colorful, multiwavelength image of the intricate remains of Supernova 1987A.
Credits: NASA, ESA, and A. Angelich (NRAO/AUI/NSF); Hubble credit: NASA, ESA, and R. Kirshner (Harvard-Smithsonian Center for Astrophysics and Gordon and Betty Moore Foundation) Chandra credit: NASA/CXC/Penn State/K. Frank et al.; ALMA credit: ALMA (ESO/NAOJ/NRAO) and R. Indebetouw (NRAO/AUI/NSF)

In the past few years, the ring has stopped getting brighter in X-rays. From about February 2013 until the last Chandra observation analyzed in September 2015 the total amount of low-energy X-rays has remained constant. Also, the bottom left part of the ring has started to fade. These changes provide evidence that the explosion's blast wave has moved beyond the ring into a region with less dense gas. This represents the end of an era for SN 1987A.

Beginning in 2012, astronomers used ALMA to observe the glowing remains of the supernova, studying how the remnant is actually forging vast amounts of new dust from the new elements created in the progenitor star. A portion of this dust will make its way into interstellar space and may become the building blocks of future stars and planets in another system.

These observations also suggest that dust in the early universe likely formed from similar supernova explosions.

Astronomers also are still looking for evidence of a black hole or a neutron star left behind by the blast. They observed a flash of neutrinos from the star just as it erupted. This detection makes astronomers quite certain a compact object formed as the center of the star collapsed — either a neutron star or a black hole — but no telescope has uncovered any evidence for one yet.

These latest visuals were made possible by combining several sources of information, including simulations by Salvatore Orlando and collaborators that appear in this paper: https://arxiv.org/abs/1508.02275. The Chandra study by Frank et al. can be found online at http://lanl.arxiv.org/abs/1608.02160. Recent ALMA results on SN 1987A are available at https://arxiv.org/abs/1312.4086.

The Chandra program is managed by NASA's Marshall Space Flight Center in Huntsville, Alabama, for NASA's Science Mission Directorate in Washington. The Smithsonian Astrophysical Observatory in Cambridge, Massachusetts, controls Chandra's science and flight operations.

The Hubble Space Telescope is a project of international cooperation between NASA and ESA (European Space Agency). NASA's Goddard Space Flight Center in Greenbelt, Maryland, manages the telescope. The Space Telescope Science Institute (STScI) in Baltimore conducts Hubble science operations. STScI is operated for NASA by the Association of Universities for Research in Astronomy, Inc., in Washington.

ALMA is a partnership of ESO (representing its member states), NSF (USA) and NINS (Japan), together with NRC (Canada), NSC and ASIAA (Taiwan), and KASI (Republic of Korea), in cooperation with the Republic of Chile. The Joint ALMA Observatory is operated by ESO, AUI/NRAO and NAOJ.

Contacts and sources:
Donna Weaver / Ray Villard
Space Telescope Science Institute, Baltimore, Md

Megan Watzke
Chandra X-ray Center, Cambridge, Mass.

Rob Gutro
NASA’s Goddard Space Flight Center,

Nuclear Neurology Could Launch Revolution in Diagnosing and Treating Brain Diseases from Alzheimer’s to Autism

“Neuromolecular imaging with these techniques, as well as magnetic resonance imaging, can have as profound an effect on the management of brain disorders as they did on cardiology,” says Robert Miletich, MD, PhD, interim chair and professor, Department of Nuclear Medicine.

 When applied to the brain, nuclear medicine techniques reveal critical information about the progression of the most devastating diseases, from Alzheimer’s to traumatic brain injury.

Just last week, Nature published research showing that brain imaging might help diagnose autism in infants as young as six months old, an advance that would represent extraordinary progress in treating the condition more effectively.

Such advances don’t surprise Robert S. Miletich, MD, PhD, interim chair and professor of the Department of Nuclear Medicine at the Jacobs School of Medicine and Biomedical Sciences at the University at Buffalo, who is studying brain scans obtained from 16,000 patients. He says the wealth of information nuclear imaging techniques provide could pave the way toward a dramatic improvement in the clinical detection and treatment of many brain disorders.

Image of a glucose metabolism PET scan overlaid on CT in an Alzheimer's patient with mild dementia. The green arrow points to regional hypometabolism in the medial parietal lobe (which processes biographical information and memory), a proposed biomarker for Alzheimer's.

Credit: UB Center for Positron Emission Tomography

UB researcher with database of 16,000 brain scans proposes that powerful imaging can transform neurology as it did cardiology

“I predict that nuclear neurology is going to become as important to neuroscience as nuclear cardiology has been to cardiology,” he said.

Unraveling complex disorders

“The field of nuclear neurology, also called neuromolecular imaging, provides physiologic imaging of the brain and spinal cord,” he explained. “By doing measurements on physiologic processes in all parts of the brain, we increase our diagnostic accuracy of brain disorders. Understanding these processes can also help us unravel the mystery of complex disorders from autism to Alzheimer’s disease.”

Miletich, who is board-certified in nuclear medicine, uses these techniques to image disorders ranging from brain tumors, traumatic brain injury and epilepsy to transient ischemic attacks, various forms of dementia and movement disorders like Parkinson’s as well as mental illnesses, including depression, anxiety disorders and schizophrenia.

He is teaching his students and medical residents at the Jacobs School of Medicine and Biomedical Sciences to become familiar with these techniques to more accurately detect brain disorders.

Miletich compared potential advances in the study of brain diseases that could result from nuclear neurology to the advances of the 1980s and 1990s that dramatically affected cardiovascular diagnoses and treatment.

“The application of nuclear imaging to the diagnosis and treatment of heart disease led to a new standard of care for cardiology,” said Miletich, noting that cardiologists now routinely use techniques like single photon emission computed tomography (SPECT), positron emission tomography (PET) and computed tomography (CT) for the early detection of structural and electrical defects in the heart.

“Neuromolecular imaging with these techniques, as well as magnetic resonance imaging, can have as profound an effect on the management of brain disorders as they did on cardiology,” he said.

In presentations at the Society for Neuroscience and in a review article published last fall in the American Academy of Neurology journal, Continuum, he described his application of these techniques to specific brain disorders.

Miletich said that nuclear medicine techniques exist to reveal both general regional physiology, such as glucose metabolism or blood perfusion, and specific neurochemical physiology, such as the status of the brain dopamine system. Such information is useful in the care of patients with disorders ranging from strokes to dementia.

Like a stroke in slow motion

“A stroke is a very severe case of hypoperfusion,” explained Miletich, “that’s when brain cells die. What happens in Alzheimer’s is chronic hypoperfusion, almost like a stroke in slow motion, where brain cells start to dwindle. It’s a process mediated by abnormalities of very small blood vessels in the brain.”

The brain scans he studies were obtained over more than two decades from 16,000 individual patients, mostly from Western New York. The patients range from juveniles to the elderly.

Miletich relies on SPECT and PET/CT because these techniques reveal different patterns of physiology. These patterns are both sensitive and specific for many brain disorders.

“There are cases, such as in traumatic brain injury, where commonly used techniques, like CT and MRI show no abnormalities, but where a patient continues to be profoundly compromised in his or her daily life,” he said. “In those cases, cerebral perfusion SPECT and glucose metabolism PET/CT can reveal exactly where in the brain the injury is located and the extent of damage to the brain.

“When I image the brain with SPECT or PET/CT, I can see disorders long before they have fully developed. I can see Alzheimer’s before the patient has dementia. This will facilitate the development of abortive therapy for Alzheimer’s,” said Miletich.

Miletich next plans to present evidence in support of a new model of Alzheimer’s that demonstrates it is caused by multiple factors. He is conducting research that relates vascular risk factors to small vessel disease, and small vessel disease to Alzheimer’s disease. Said Miletich: “These are exciting times in the neurosciences.”

Contacts and sources:
Ellen Goldbaum
University at Buffalo

There Were at Least 2, If Not More, Waves of People Entering South America Contrary to Prevailing Thought

Analysis of ancient human skulls found in southeastern Brazil is providing new insights into the complex narrative of human migration from our origins in sub-Saharan Africa to the peopling of the Americas tens of thousands of years later.

The many differences in cranial morphology, or skull shape, seen in Paleoamerican remains found in the Lagoa Santa region of Brazil suggest a model of human history that included multiple waves of population dispersals from Asia, across the Bering Strait, down the North American coast and into South America.

The findings published Feb. 22, 2017 in the journal Science Advances suggest that Paleoamericans share a last common ancestor with modern native South Americans outside, rather than inside, the Americas and underscore the importance of looking at both genetic and morphological evidence, each revealing different aspects of the human story, to help unravel our species' history.

Geographic and historical relationship between populations.

"When you look at contemporary genomic data, the suggestion, particularly for South America, was for one wave of migration and that indigenous South American people are all descendants of that wave," says Noreen von Cramon-Taubadel, an associate professor of anthropology at the University at Buffalo and the paper's lead author. "But our data is suggesting that there were at least two, if not more, waves of people entering South America."

How people settled the Americas is a debate that has continued for years in the scientific community. It's now clear that the first human entry into the Americas began at least 15,000 years ago and dispersed quickly into South America following a coastal Pacific route.

(A) Map showing the geographic position of each contemporary population and the Paleoamerican sample from Lagoa Santa (Brazil). Stars denote the waypoints used to calculate more realistic geographic distances between populations. (B) Tree topology representing the hierarchical model of historical divergence among populations. The null history model places the Paleoamericans as a sister group to their nearest geographic neighbor (Chubut, Patagonia).
Credit:  Science Advances

The conundrum of conflicting data between morphology and genetics is among the issues fueling the debate of how people first entered the New World, but von Cramon-Taubadel's conclusions are similar to previous morphological research while also relying on a pioneering method to reach those conclusions.

"We've adopted and modified the method from ecology, but to my knowledge this method has never been used in an anthropological setting before," she says.

In the past, researchers have looked mainly at the overall similarities between the morphology of prehistoric skeletons from the Americas compared with the morphology of living people. Models of dispersal, each with a different number of waves that attempt to match existing data, have also been used.

But von Cramon-Taubadel's current research with Mark Hubbe, an associate professor in the Department of Anthropology at Ohio State University, and University of Tübingen researcher André Strauss makes no prior assumptions about dispersals. It treats an existing population as the descendant of many possible branches of a theoretical tree of relatedness and then uses statistics to determine where in the tree the sample best fits.

This method has the advantage of not needing pre-determined models of dispersal but rather considers all possible patterns of relatedness.

All living people, von Cramon-Taubadel explains, have a common ancestor, but not all fossils necessarily contribute to the ancestry of living people. Some populations of modern humans did not survive or made only a marginal contribution to living people. So fossils of these extinct humans provide few clues about the ancestry of living people.

"There are other fossils, particularly in the Americas and Eurasia where at the moment we are not 100 percent sure how they fit into the human picture," von Cramon-Taubadel says. "We could use this method to elucidate where they sit and to what extent those populations actually play a role in the modern ancestry of people in those areas."

Contacts and sources:
Bert Gambini 
University at Buffalo

Citation: Noreen von Cramon-Taubadel, André Strauss and Mark Hubbe. Evolutionary population history of early Paleoamerican cranial morphology. Science Advances, 22 Feb 2017: Vol. 3, no. 2, e1602289. DOI: 10.1126/sciadv.1602289

Uncertainty Perception Drives Public's Trust, Mistrust of Science Affecting Funding for Projects

Many policies — from medicine to terrorism — depend on how the general public accepts and understands scientific evidence. People view different branches of sciences as having different amounts of uncertainty, which may not reflect the actual uncertainty of the field. Yet public perceptions determine action, allocation of funding resources and the direction of public policies. It is therefore necessary to understand perceptions of uncertainty and the influences that political affiliations have on scientific beliefs.

Carnegie Mellon University researchers took a first step toward understanding the whole picture by measuring perceived scientific uncertainty broadly, across many areas of science rather than only the topics that are typically polarized. In research published in the Journal of Experimental Psychology, they found that how accurate people believe a scientific field to be drives their perception of it and how they gauge its uncertainty.

"Uncertainty is a natural part of scientific research, but, in the public domain, it can be used selectively to discredit undesirable results or postpone important policies. Understanding how the public perceives uncertainty is an important first step for understanding how to communicate uncertainty," said Stephen B. Broomell, assistant professor of social and decision sciences in the Dietrich College of Humanities and Social Sciences.

This map plots scientific disciplines from least to most certain. 'The map shows that perceptions held by the public may not reflect the reality of scientific study,' said Stephen B. Broomell. 'For example, psychology is perceived as the least precise while forensics is perceived as the most precise. However, forensics is plagued by many of the same uncertainties as psychology that involve predicting human behavior with limited evidence.'

Credit: Carnegie Mellon University

To examine perceptions of scientific uncertainty, Broomell and Ph.D. student Patrick Bodilly Kane developed a scale to measure how people judge different sciences. They were then able to create a map that plots scientific disciplines from least to most certain.
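The mapping step can be pictured with a small sketch. Everything below is invented for illustration: the disciplines, the ratings, and the reduction to a single averaged score are hypothetical stand-ins, and the study's actual scale construction is more involved than a simple mean-and-sort.

```python
# Hypothetical sketch: build a "least to most certain" ordering of
# disciplines from survey ratings by averaging and sorting.
from statistics import mean

# Invented ratings (1 = very uncertain, 10 = very certain) from a few raters.
ratings = {
    "psychology": [3, 4, 2, 3],
    "forensics":  [9, 8, 9, 8],
    "geology":    [6, 7, 6, 7],
}

# Sort disciplines by their mean perceived certainty, lowest first.
ordered = sorted(ratings, key=lambda d: mean(ratings[d]))
print(ordered)  # ['psychology', 'geology', 'forensics']
```

The point of such a map, as the article notes, is that this ordering reflects perception, not the actual precision of each field.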

"The map shows that perceptions held by the public may not reflect the reality of scientific study," Broomell said. "For example, psychology is perceived as the least precise while forensics is perceived as the most precise. However, forensics is plagued by many of the same uncertainties as psychology that involve predicting human behavior with limited evidence."

Broomell and Kane also found that perceptions of scientific uncertainty were highly correlated with judgments about a particular science's value. And this impacts how people think a scientific field should be funded.
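A correlation of this kind is straightforward to compute. The sketch below uses invented per-field ratings (not the study's data) to show the mechanics: a standard Pearson coefficient between perceived uncertainty and judged value, which comes out strongly negative for these made-up numbers.

```python
# Illustrative only: correlate (hypothetical) perceived uncertainty
# of scientific fields with their (hypothetical) judged value.
def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Invented per-field ratings: more perceived uncertainty, less judged value.
perceived_uncertainty = [7.5, 6.8, 4.2, 3.1, 2.0]
judged_value          = [3.0, 3.5, 6.0, 7.2, 8.1]

r = pearson(perceived_uncertainty, judged_value)
print(round(r, 2))  # strongly negative for these made-up numbers
```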

"This tells us that people are not connected to the practice of scientific exploration. When perceived accuracy isn't the same as actual accuracy, this can lead to dangerous choices, as some essential fields like psychology, economics and genetic engineering provide vital social services but may be cut off because of this disconnect," Broomell said.

While political affiliations are not the only factor motivating how science is perceived, the researchers did find that sciences that potentially conflict with a person's ideology are judged as being more uncertain.

"Our political atmosphere is changing. Alternative facts and contradicting narratives affect and heighten uncertainty. Nevertheless, we must continue scientific research. This means we must find a way to engage uncertainty in a way that speaks to the public's concerns," Broomell said.

Interestingly, the study showed that the uncertainty for scientific fields does not carry over and inform perceptions about individual study results. This provides scientists with an opportunity for better communication. Focusing on individual results can help allay misperceptions and concerns. Communicators should therefore focus on the specific details of a study's result rather than engaging in the defense of scientific practice more broadly.


Contacts and sources:
Shilo Rea
Carnegie Mellon University

Citation: Public perception and communication of scientific uncertainty. Broomell, Stephen B.; Kane, Patrick Bodilly Journal of Experimental Psychology: General, Vol 146(2), Feb 2017, 286-304. http://dx.doi.org/10.1037/xge0000260

High Levels of Dangerous Chemicals Found in Indoor Cats

A study from Stockholm University has now established what was previously suspected: that the high levels of brominated flame retardants measured in cats come from the dust in our homes. The study has been published in the journal Environmental Science & Technology.

The study shows that cats are exposed to chemicals found in electronics and furniture, chemicals that end up in household dust and can adversely affect health. It is the first time that this connection has been verified. In a previous study, the researchers demonstrated that brominated flame retardants occurred in higher concentrations in the blood of cats that had developed feline hyperthyroidism (an overactive thyroid) than in the blood of healthy cats. Now, new measurements in healthy cats establish dust as the route of exposure: paired samples, i.e. both dust samples and blood samples, were taken from the same household at the same time.

This image shows a cat. A study from Stockholm University has now established what was previously suspected, that the high levels of brominated flame retardants measured in cats are from the dust in our homes.

Photo: Jana Weiss

Exposure to chemicals

"By taking paired samples, we have greater insight into the environment that the cats live in. Moreover the cats in the study spent the majority of their time indoors and therefore air and dust in the home is expected to contribute more than the outdoor environment", says Jana Weiss at the Department of Environmental Science and Analytical Chemistry, Stockholm University.

The results are notable because small children, notorious for putting everything in their mouths, have exposures to these chemicals similar to those of cats.

"The brominated flame retardants that have been measured in cats are known endocrine disruptors. It's particularly serious when small children ingest these substances because exposure during the development can have consequences later in life, such as thyroid disease", says Jana Weiss.

About brominated flame retardants

Brominated flame retardants are added to textiles, furniture and electronic equipment to prevent the material from igniting. Many of the brominated flame retardants have been found to be health hazards, and some are suspected endocrine disruptors. A number of them have been prohibited for these reasons in products like electronic goods. However, they are extremely persistent and can leach from the products for many years after they have been produced, ultimately becoming part of dust.

About the study

The researchers took blood samples from cats and gathered dust in the children's room, the adults' bedroom and the living room. The samples were then analysed for brominated and chlorinated contaminants. The researchers found not only those that are currently in use, but also chemicals that have been banned for decades.

The article Cats' Internal Exposure to Selected Brominated Flame Retardants and Organochlorines Correlated to House Dust and Cat Food can be read here: http://pubs.acs.org/doi/pdf/10.1021/acs.est.6b05025

The MISSE project

The study is part of an ongoing project called MiSSE (Mixture Assessment of Endocrine Disrupting Compounds). The project aims to identify and evaluate the mix of endocrine disruptors we have in our indoor environment. The project will have a final conference in Stockholm on November 29, 2017 where all results will be presented and discussed.

Contacts and sources:
Jana Weiss, scientific coordinator of the project MiSSE,
Department of Environmental Science and Analytical Chemistry at Stockholm University

Historic Cultural Records Inform Scientific Perspectives on Woodland Uses

Scientists at the University of York and University College Cork have investigated how cultural records dating back 300 years could help improve understanding of the ways in which science interprets the many uses of woodland areas.

The researchers hope that the work will not only give a cultural narrative to environmental data collected over time, but also give new insight into the ways in which woodland management systems can be adapted to increase a sense of ownership amongst communities that live near woodland areas.

Historical evidence, gathered and analysed by the Shrawley Lime Group, a group of experts investigating the cultural and ecological history of Shrawley Woods, provided the team with new thinking on how documented woodland uses can be coloured by cultural perceptions of woods as homes for various tree species, as working woods, or as spaces of leisure.

These are the Shrawley Woods.

Credit: Dr Suzi Richer

From researching pollen grains preserved in a waterlogged area of Shrawley Woods, the researchers were able to provide environmental data dating back to the 11th century. This was then compared with oral history records from the 18th century, which revealed the differences that occur in how the same type of tree is referenced between environmental and cultural records over time.

The researchers showed that a tree's name related more closely to how woodland dwellers used it than to its species, with species-based naming becoming more common only from the Industrial Revolution onwards.

The team found that the scientific data referenced both oak and lime species in the woods, but the historical information refers primarily to the products of the woods, such as 'poles' used for hop growing, and does not reference the species name at all. It is only when local oral history evidence is included that historical and scientific data can be linked together and the evolution of wooded areas fully understood.

Dr Suzi Richer, from the University of York's Archaeology and Environment Departments, said: "We find that many books, television programmes, films, and art work, position woodlands as 'dangerous' or 'alien' places, where cultural norms can be broken, but archaeological and historical evidence shows that these were often working and living spaces with evidence of charcoal burning, brick kilns, and water-powered mills, which bring people and wooded areas much closer together in a working, living harmony.

"Scientific data by itself, particularly if it spans over many years, can miss out the cultural and social context of the period it represents and therefore the relationship between the environment and the people who lived there in the past. This can be crucial to help us interpret environmental records more fully."

Records show that from around the 1800s, woodlands became far less 'personal' in the way they were documented, but the oral history accounts demonstrate that this 'other way' of seeing trees persisted, and still persists in areas of the West Midlands today.

The need to standardise resources was also consistent with the Enlightenment way of seeing the world at that time, one which sought to see the natural world 'civilised'. It is from this point onwards that species names, like 'oak', are used more commonly.

Dr Ben Gearey, from the Department of Archaeology, University College Cork, said: "We often think of environmental data as giving us information on the adverse effects that human activity can have on the environment, but our research shows that it can also demonstrate how cultural perceptions of a landscape or species can shape conservation efforts.

"We hope that this work demonstrates the importance of combining information from scientific and cultural approaches, and also accounts from the local communities in which these types of studies are undertaken.

"The next stage is to look more closely at the archaeological record and how we can present combined records so that they are meaningful for policy makers and woodland managers."

Contacts and sources:
Samantha Martin
University of York

The research is published in the Environmental Archaeology: Journal of Human Palaeoecology. http://dx.doi.org/10.1080/14614103.2017.1283765

The Hole in the Universe

The events surrounding the Big Bang were so cataclysmic that they left an indelible imprint on the fabric of the cosmos. We can detect these scars today by observing the oldest light in the Universe. Emitted nearly 14 billion years ago and since stretched into weak microwave radiation (hence its name, the cosmic microwave background, or CMB), this light permeates the entire cosmos, filling it with detectable photons.

The CMB can be used to probe the cosmos via something known as the Sunyaev-Zel’dovich (SZ) effect, which was first observed over 30 years ago. We detect the CMB here on Earth when its constituent microwave photons travel to us through space. On their journey to us, they can pass through galaxy clusters that contain high-energy electrons. These electrons give the photons a tiny boost of energy. Detecting these boosted photons through our telescopes is challenging but important — they can help astronomers to understand some of the fundamental properties of the Universe, such as the location and distribution of dense galaxy clusters.

Credit: ALMA (ESO/NAOJ/NRAO)/T. Kitayama (Toho University, Japan)/ESA/Hubble & NASA

This image shows the first measurements of the thermal Sunyaev-Zel’dovich effect from the Atacama Large Millimeter/submillimeter Array (ALMA) in Chile (in blue). Astronomers combined data from ALMA’s 7- and 12-metre antennas to produce the sharpest possible image. The target was one of the most massive known galaxy clusters, RX J1347.5–1145, the centre of which shows up here in the dark “hole” in the ALMA observations. The energy distribution of the CMB photons shifts and appears as a temperature decrease at the wavelength observed by ALMA, hence a dark patch is observed in this image at the location of the cluster.
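Why the cluster shows up as a decrement at ALMA's wavelengths can be sketched with the standard non-relativistic thermal SZ spectral factor, f(x) = x·coth(x/2) − 4, where x = hν/(k·T_CMB). Where f is negative the CMB appears slightly colder toward the cluster (the dark "hole"); above the null near 217 GHz it would instead appear as an excess. The particular frequencies checked below are chosen for illustration, not taken from the article.

```python
# Back-of-envelope sketch of the thermal Sunyaev-Zel'dovich spectral
# factor f(x) = x*coth(x/2) - 4 (non-relativistic limit).
# Negative f(x) means a temperature decrement -- the dark patch.
import math

H = 6.626e-34   # Planck constant, J s
K = 1.381e-23   # Boltzmann constant, J/K
T_CMB = 2.725   # CMB temperature, K

def sz_spectral_factor(freq_hz):
    """Thermal SZ spectral distortion factor at a given frequency."""
    x = H * freq_hz / (K * T_CMB)
    return x / math.tanh(x / 2) - 4.0   # x*coth(x/2) - 4

# Below the null near 217 GHz the cluster appears as a decrement;
# above it, as an increment.
print(sz_spectral_factor(100e9) < 0)   # True: decrement at 100 GHz
print(sz_spectral_factor(300e9) > 0)   # True: increment at 300 GHz
```

This is why the energy distribution of the CMB photons "shifts and appears as a temperature decrease at the wavelength observed by ALMA", as the caption above describes.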

Contacts and sources:
Richard Hook