Friday, March 31, 2017

From Beethoven to Bieber, Chimps Don't Care: Music Is Falling on Deaf Ears

Playing music to captive chimpanzees has no positive effect on their welfare, researchers have concluded.

Previous research conducted with chimpanzees living in laboratories has suggested that playing music has positive effects on the animals’ welfare; however, other research with zoo-housed primates has yielded mixed results.

Many zoos continue to broadcast music to their primates either as a form of enrichment or for the enjoyment of caregivers.

Research conducted by Dr Emma K Wallace, from the University of York’s Department of Psychology, investigated how classical and pop/rock music affected the behaviour of the chimpanzees at RZSS Edinburgh Zoo, to establish whether it impacted their welfare positively or negatively.

A chimp "on the jukebox"  

Credit: Emma Wallace, University of York

Music selection

Further research involved a ‘chimpanzee jukebox’, which gave the chimpanzees at RZSS Edinburgh Zoo and the National Centre for Chimpanzee Care, Texas, the option to choose between classical music, pop/rock music and silence.

Some of the music that the chimpanzees were able to select included work by Mozart, Beethoven, Adele and Justin Bieber.

The combined results of these studies show that neither classical nor pop/rock music has a positive effect on the welfare of these chimpanzees. The animals also did not show any consistent or persistent preference for either type of music or for silence.

No effects

Dr Wallace said: “These results suggest that music is not something that is relevant to captive chimpanzees and are supported by recent work with zoo-housed orangutans that were unable to distinguish music from digitally scrambled noise.

“However, whilst music does not appear to have a positive effect on captive chimpanzee welfare, it equally did not have any negative effects.

“As such it should not be considered a successful form of enrichment for these animals but, providing that the animals have the option to avoid it, music can still be played for animal caregivers.

“These results also highlight the possibility that music appreciation is something that is a uniquely human trait.”

The welfare of captive animals, especially those living in zoos, is of the utmost importance to those who care for them.

Providing animals with enrichment, such as toys, puzzle-feeders or unfamiliar smells, is a commonly used method of presenting the animals with mental challenges or novel forms of stimulation.

The research is published in PLOS ONE.

Contacts and sources:
Alistair Keely
University of York

Fanged Fish with Heroin-Like Venom May Lead to New Pain Killers

A fearless fanged coral reef fish that disables its opponents with heroin-like venom could offer hope for the development of new painkillers.

University of Queensland researcher Associate Professor Bryan Fry said the venomous fang blenny was found in the Pacific region, including on the Great Barrier Reef.

"The fish injects other fish with opioid peptides that act like heroin or morphine, inhibiting pain rather than causing it," he said.

"Its venom is chemically unique.

"The venom causes the bitten fish to become slower in movement and dizzy by acting on their opioid receptors.

The fanged fish's heroin-like venom could lead to pain treatments.

Image courtesy Associate Professor Bryan Fry

"To put that into human terms, opioid peptides would be the last thing an elite Olympic swimmer would use as performance-enhancing substances. They would be more likely to drown than win gold."

Fang blennies, also known as poison-fang blennies or sabre-tooth blennies, of the genus Meiacanthus, are popular as ornamental tropical aquarium fish.

"Fang blennies are the most interesting fish I've ever studied and have one of the most intriguing venoms of them all," Associate Professor Fry said.

"These fish are fascinating in their behaviour. They fearlessly take on potential predators while also intensively fighting for space with similar sized fish.

This is Meiacanthus grammistes.
Image courtesy Associate Professor Bryan Fry

"Their secret weapons are two large grooved teeth on the lower jaw that are linked to venom glands."

Associate Professor Fry said the unique venom meant the fang blenny could more easily escape a predator or defeat a competitor.

"This study is an excellent example of why we need to protect nature," he said.

"If we lose the Great Barrier Reef, we will lose animals like the fang blenny and its unique venom that could be the source of the next blockbuster pain-killing drug."

Contacts and sources:
Bryan Fry
University of Queensland

The research, published in Current Biology, was led by Associate Professor Fry, who works with the UQ School of Biological Sciences Venom Evolution Laboratory, and Dr Nicholas Casewell of the Liverpool School of Tropical Medicine in the UK.

Massive, Computer-Analyzed Geological Database Reveals Chemistry of Ancient Ocean

A study that used a new digital library and machine reading system to suck the factual marrow from millions of geologic publications dating back decades has unraveled a longstanding mystery of ancient life: Why did easy-to-see and once-common structures called stromatolites essentially cease forming over the long arc of Earth history?

Stromatolites are contorted layers of sediment formed by microbes, and they are often found in limestone and other ancient sedimentary rocks deposited beneath oceans.

Julia Wilcots, a Madison native who was then at Princeton University, at a rock quarry in Shorewood Hills, Wisconsin. The layers at her head and chest level are both composed of different types of stromatolites.
Credit: Shanan Peters

“Geologists have known for a long time that stromatolites were abundant in shallow marine environments during the Precambrian, before the emergence of multi-cellular life” more than 560 million years ago, says Jon Husson, a post-doctoral researcher and co-author of a study now online in the journal Geology. “But, stromatolites are rare in the ocean today.”

The new study measures the slide in stromatolite prevalence based on descriptions of rocks sifted from more than 3 million scientific publications.

“Paleontologists have largely attributed the decline in stromatolites to the evolution of animals, starting some 560 million years ago,” says Shanan Peters, a professor of geoscience at University of Wisconsin–Madison and study first author. “Many multi-cellular animals, like snails, eat microbes. The evolution of these big microbe-grazing animals hit ‘reset’ on the stromatolite’s world. Or so the story has gone.”

Jon Husson, a post-doctoral researcher working with Shanan Peters, holds a stromatolite, a biologically based structure that was much more common early in Earth history.
Credit: David Tenebaum 

The new study found a weak correlation between stromatolite occurrence and the diversity of animals, but a stronger link to seawater chemistry.

“The best predictor of stromatolite prevalence, both before and after the evolution of animals, is the abundance of dolomite in shallow marine sediments,” says Husson. Dolomite is a high-magnesium variety of carbonate, the type of sediment that forms limestone. Dolomite is harder to make than low-magnesium carbonate and it forms today in only a narrow range of marine environments.

When the ocean water is super-saturated with carbonate, “that can make it easier for things like stromatolites to form,” says Husson. “In Lake Tanganyika [Africa], there are stromatolites forming today, even though there are animals everywhere, snails and fish. The lake is super-saturated with carbonate, and it’s begging to be precipitated. The microbes come along and help it to precipitate, and the result is an abundance of stromatolites.” Elevated carbonate saturation can also help the formation of dolomite, thereby driving the correlation with stromatolites found in this study.

A stromatolite from Northern Wisconsin in the courtyard of Weeks Hall on the UW–Madison campus.

Credit: David Tenebaum

Measuring the prevalence of stromatolites through all Earth history is difficult because counting the number of stromatolites alone is not sufficient. You must also know how many rocks could potentially have stromatolites, but do not.

The big innovation of this study is the interplay of a new type of digital library and machine reading system called GeoDeepDive with a geological database called Macrostrat. Both were spearheaded by Peters at UW–Madison.

GeoDeepDive is a digital library built on high throughput computing technology that can “read” millions of papers and siphon off specific information. To date, the GeoDeepDive library contains more than 3 million scientific publications from all scientific disciplines; some 10,000 new published papers are added daily.

Macrostrat is a database describing the known geological properties of North America’s upper crust, at different times and depths.

The massive computing capacity at UW–Madison’s Center for High Throughput Computing and HTCondor system, the brainchild of UW–Madison computer scientist Miron Livny, powers GeoDeepDive. Combining the digital library with the geological database allowed the researchers to estimate, at different time periods, the percentage of shallow marine rocks that actually have stromatolites.

The study began in the summer of 2015, when the third author, Julia Wilcots, a Madison-native who was then an undergraduate at Princeton, asked Peters for a summer project. “In my typical fashion I gave Julia a few options,” Peters says. “She picked stromatolites, so I said, ‘Okay, go do it!’ With minimal help from us, she developed a working application to discover and extract every mention of stromatolites from our library.”

Among 10,200 papers that mentioned stromatolites, “our program was able to extract 1,013 with a name of a rock unit, which enabled us to link stromatolite occurrences to Macrostrat,” says Husson.
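The extraction-and-linking step described above can be sketched in miniature. Everything below is hypothetical: the snippets, the unit list, and the matching logic are illustrative stand-ins, since the article does not describe GeoDeepDive's actual pipeline (which, among other things, must also handle negated mentions such as "no stromatolites were observed"):

```python
import re

# Hypothetical mini-corpus of sentence snippets, standing in for text
# extracted from the literature by a GeoDeepDive-like system.
snippets = [
    "Stromatolites are abundant in the Allen Bay Formation carbonates.",
    "No stromatolites were observed in these shales.",
    "Domal stromatolites occur throughout the Copper Harbor Formation.",
]

# Hypothetical list of rock-unit names known to a Macrostrat-like database.
known_units = ["Allen Bay Formation", "Copper Harbor Formation"]

def link_mentions(snippets, known_units):
    """Keep snippets that mention stromatolites AND name a known rock unit,
    so each occurrence can be tied to a dated stratigraphic unit."""
    linked = []
    for text in snippets:
        if re.search(r"\bstromatolit\w*", text, re.IGNORECASE):
            for unit in known_units:
                if unit in text:
                    linked.append((unit, text))
    return linked

matches = link_mentions(snippets, known_units)
```

The second snippet is discarded here only because it names no known unit; a real pipeline would also need negation handling before counting it as an occurrence.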

Wilcots did not have to travel to see stromatolites, Peters says. “In Madison, we are sitting on top of rocks recording one of the biggest rises in stromatolite abundance – at least during the age of animals.”

Scientists long ago observed that stromatolites began a long decline just before the start of the Cambrian, but that decline remained a “fundamental question of paleobiology,” Husson says. “Stromatolites are the oldest fossils that are visible to the naked eye. If you look at rock that is a billion years old, your chance of seeing evidence of life is essentially your chance of seeing stromatolites.”

Beyond answering a fundamental question of Earth’s history, the new study “allows us to do the kind of analyses that scientists used to only dream about,” Peters says: “If we could just compile all the published information on… anything!”

“Doing this study without GeoDeepDive would be all but impossible,” Peters adds. “Reading thousands of papers to pick out references to stromatolites, and then linking them to a certain rock unit and geologic period, would take an entire career, even with Google Scholar. Here we got started with a talented undergrad working on a summer project. GeoDeepDive has greatly lowered the barrier to compiling literature data in order to answer many questions.”

Another beauty of the big data, machine-reading approach is the baked-in capability for replication and improvement. “Now that this study has been done, we can run the stromatolite application again and again. We can refine the searches, and they will evaluate the new data that is being published all the time,” Peters says. “So a rerun could make a better study, with minimal effort.”

For centuries, “geologists have transferred hard-to-get information from the field to hard-to-get information in the literature,” Peters says. “To achieve a broad-scale synthesis, you have to survey all of the published knowledge. There are new discoveries waiting in the scientific literature, if you can see the big picture and get all the data into one place.”

Contacts and sources:
David Tenebaum

How Mars Lost Its Atmosphere

Solar wind and radiation are responsible for stripping the Martian atmosphere, transforming Mars from a planet that could have supported life billions of years ago into a frigid desert world, according to new results from NASA's MAVEN spacecraft.

"We've determined that most of the gas ever present in the Mars atmosphere has been lost to space," said Bruce Jakosky, principal investigator for the Mars Atmosphere and Volatile Evolution Mission (MAVEN), University of Colorado in Boulder. The team made this determination from the latest results, which reveal that about 65 percent of the argon that was ever in the atmosphere has been lost to space. Jakosky is lead author of a paper on this research to be published in Science on Friday, March 31.

This artist’s concept depicts the early Martian environment (right) – believed to contain liquid water and a thicker atmosphere – versus the cold, dry environment seen at Mars today (left). NASA's Mars Atmosphere and Volatile Evolution is in orbit of the Red Planet to study its upper atmosphere, ionosphere and interactions with the sun and solar wind.

Credits: NASA’s Goddard Space Flight Center

In 2015, MAVEN team members announced results showing that atmospheric gas is being lost to space today, and described how the atmosphere is stripped away. The present analysis uses measurements of today’s atmosphere to make the first estimate of how much gas was lost through time.

Liquid water, essential for life, is not stable on Mars' surface today because the atmosphere is too cold and thin to support it. However, evidence such as features resembling dry riverbeds and minerals that only form in the presence of liquid water indicates the ancient Martian climate was much different – warm enough for water to flow on the surface for extended periods.

This infographic shows how Mars lost argon and other gases over time due to ‘sputtering.’

Credits: NASA’s Goddard Space Flight Center

“This discovery is a significant step toward unraveling the mystery of Mars' past environments,” said Elsayed Talaat, MAVEN Program Scientist, at NASA Headquarters in Washington. “In a broader context, this information teaches us about the processes that can change a planet’s habitability over time.”

There are many ways a planet can lose some of its atmosphere. For example, chemical reactions can lock gas away in surface rocks, or an atmosphere can be eroded by radiation and a stellar wind from a planet's parent star. The new result reveals that solar wind and radiation were responsible for most of the atmospheric loss on Mars, and the depletion was enough to transform the Martian climate. The solar wind is a thin stream of electrically conducting gas constantly blowing out from the surface of the sun.

The early Sun had far more intense ultraviolet radiation and solar wind, so atmospheric loss by these processes was likely much greater in Mars' history. According to the team, these processes may have been the dominant ones controlling the planet's climate and habitability. It's possible microbial life could have existed at the surface early in Mars’ history. As the planet cooled off and dried up, any life could have been driven underground or forced into rare surface oases.

Jakosky and his team got the new result by measuring the atmospheric abundance of two different isotopes of argon gas. Isotopes are atoms of the same element with different masses. Since the lighter of the two isotopes escapes to space more readily, it will leave the gas remaining behind enriched in the heavier isotope. The team used the relative abundance of the two isotopes measured in the upper atmosphere and at the surface to estimate the fraction of the atmospheric gas that has been lost to space.
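The estimate rests on a Rayleigh fractionation argument: because the lighter isotope escapes preferentially, the isotope ratio left behind records how much gas is gone. A minimal sketch in Python, with illustrative input values (not MAVEN's published numbers):

```python
import math

def fraction_lost(r_now, r_initial, alpha):
    """Rayleigh fractionation: if the light isotope escapes alpha times more
    readily per atom than the heavy one, the light/heavy ratio r obeys
        r_now / r_initial = f ** ((alpha - 1) / alpha),
    where f is the fraction of the light isotope still retained.
    Returns the fraction lost to space, 1 - f."""
    f_retained = (r_now / r_initial) ** (alpha / (alpha - 1))
    return 1.0 - f_retained

# Illustrative numbers only: a measured 36Ar/38Ar ratio of 4.2 against an
# assumed initial ratio of 5.3, with an assumed effective fractionation
# factor of 1.26. With these inputs roughly two-thirds of the argon is lost,
# in the same ballpark as the article's ~65 percent figure.
lost = fraction_lost(4.2, 5.3, 1.26)
```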

As a "noble gas," argon cannot react chemically and so cannot be sequestered in rocks; the only process that can remove it to space is a physical one called "sputtering" by the solar wind, in which ions picked up by the solar wind impact Mars at high speeds and physically knock atmospheric gas into space. Because argon can be removed only by sputtering, once the team determined the amount of argon lost, they could use this information to determine the sputtering loss of other atoms and molecules, including carbon dioxide (CO2).

CO2 is of interest because it is the major constituent of Mars' atmosphere and because it's an efficient greenhouse gas that can retain heat and warm the planet. "We determined that the majority of the planet's CO2 was also lost to space by sputtering," said Jakosky. "There are other processes that can remove CO2, so this gives the minimum amount of CO2 that's been lost to space."

The team made its estimate using data from the Martian upper atmosphere, which was collected by MAVEN's Neutral Gas and Ion Mass Spectrometer (NGIMS). This analysis included measurements from the Martian surface made by NASA's Sample Analysis at Mars (SAM) instrument on board the Curiosity rover.

"The combined measurements enable a better determination of how much Martian argon has been lost to space over billions of years," said Paul Mahaffy of NASA's Goddard Space Flight Center in Greenbelt, Maryland. "Using measurements from both platforms points to the value of having multiple missions that make complementary measurements." Mahaffy, a co-author of the paper, is principal investigator on the SAM instrument and lead on the NGIMS instrument, both of which were developed at NASA Goddard.

The research was funded by the MAVEN mission. MAVEN's principal investigator is based at the University of Colorado's Laboratory for Atmospheric and Space Physics, Boulder, and NASA Goddard manages the MAVEN project. MSL/Curiosity is managed by NASA's Jet Propulsion Laboratory, Pasadena, California.

This 2013 video explains how the process called "sputtering" may have caused Mars to lose its atmosphere.
Credits: NASA Goddard

For more information on MAVEN, visit:

Contacts and sources:
Bill Steigerwald / Nancy Jones
NASA Goddard Space Flight Center

Space-Farming: A Long Legacy Leading Us to Mars

Following a new NASA bill, passed in March by the US Congress, which authorizes $19.5 billion in spending for space exploration in 2017, manned missions to Mars are closer to reality than ever before.

As both public and private enterprises gear up for a return to the Moon and the first human footsteps on the Red Planet, there is a renewed focus on keeping people alive and productive in these extreme environments. Plants, and specifically crop plants, will be a major component of proposed regenerative life-support systems, as they provide food and oxygen, scrub carbon dioxide, and aid in water recycling - all in a self-regenerating or 'bioregenerative' fashion.

Lunar Farm – The ability to grow food locally will be vital to economically sustaining a human presence on the Moon, on other planets, and in orbital habitats.
Credit: Space Studies Institute

Without a doubt, plants are a requirement for any sufficiently long-duration (in time and distance) human space exploration mission. There has been a great deal of research in this area - research that has not only advanced agriculture in space, but has also resulted in a great many Earth-based advances (e.g., LED lighting for greenhouse and vertical farm applications, new seed potato propagation techniques, etc.).

Research into space farming has resulted in numerous Earth-based advances (e.g., LED lighting for greenhouse and vertical farm applications; new seed potato propagation techniques, etc.) There are still many technical challenges, but plants and associated biological systems can and will be a major component of the systems that keep humans alive when we establish ourselves on the Moon, Mars and beyond.
Credit: NASA

A recent article by Dr. Raymond M. Wheeler from the NASA Kennedy Space Center, now available in open access in the journal Open Agriculture, provides an informative and comprehensive account of the various international historical and current contributions to bioregenerative life-support and the use of controlled environment agriculture for human space exploration. Covering most of the major developments of international teams, it relates some of this work to technology transfer which proves valuable here on Earth.

The idea of using plants to keep people alive and productive in space is not new, both in concept and in scientific inquiry. The article covers a large portion of the historical international research effort that will be the foundation for many of the trade studies and mission design plans for use of artificial ecosystems in space.

Research in the area started in the 1950s and ’60s through the work of Jack Myers and others, who studied algae for oxygen production and carbon dioxide removal for the US Air Force and the National Aeronautics and Space Administration (NASA). Studies on algal production and controlled environment agriculture were also carried out by Russian researchers in Krasnoyarsk, Siberia, beginning in the 1960s, including tests with human crews whose air, water, and much of their food were provided by wheat and other crops. NASA initiated its Controlled Ecological Life Support System (CELSS) Program in the early 1980s with testing focused on controlled environment production of wheat, soybean, potato, lettuce, and sweet potato. Findings from these studies paved the way for tests in a 20 m2, atmospherically closed chamber located at Kennedy Space Center.

At about the same time, Japanese researchers developed a Closed Ecology Experiment Facilities (CEEF) in Aomori Prefecture to conduct closed system studies with plants, humans, animals, and waste recycling systems. CEEF had 150 m2 of plant growth area, which provided a near-complete diet along with air and water regeneration for two humans and two goats.

The European Space Agency MELiSSA Project began in the late 1980s and pursued ecological approaches for providing gas, water and materials recycling for space life support, and later expanded to include plant testing.

A Canadian research team at the University of Guelph started a research facility for space crop research in 1994. Only a few years later, they went on to develop sophisticated canopy-scale hypobaric plant production chambers for testing crops for space, and have since expanded their testing for a wide range of controlled environment agriculture topics.

Most recently, a group at Beihang University in Beijing designed, built and tested a closed life support facility (Lunar Palace 1), which included a 69 m2 agricultural module for air, water, and food production for three humans.

As a result of these international studies in space agriculture, novel technologies and findings have been produced; this includes the first use of light emitting diodes for growing crops, one of the first demonstrations of vertical agriculture, use of hydroponic approaches for subterranean crops like potato and sweet potato, crop yields that surpassed reported record field yields, the ability to quantify volatile organic compound production (e.g., ethylene) from whole crop stands, innovative approaches for controlling water delivery, approaches for processing and recycling wastes back to crop production systems, and more. 

The theme of agriculture for space has contributed to, and benefited from terrestrial, controlled environment agriculture and will continue to do so into the future. There are still numerous technical challenges, but plants and associated biological systems can and will be a major component of the systems that keep humans alive when we establish ourselves on the Moon, Mars and beyond.

According to Dr. Gary W. Stutte, NASA's principal investigator for several spaceflight experiments designed to grow plants in microgravity:

"Dr. Ray Wheeler has written a compelling and complete history of the people who have committed their careers to enabling the colonization of space. Drawing upon his deep understanding of the programs developed, people involved, and progress achieved to highlight the accomplishments and contributions of scientists and engineers around the world in bringing the vision of space exploration to fruition, he details the problems, challenges, results and contributions from the programs, and reveals how they benefited Earth as well as space.

The review underscores that the answers will be achieved not through proclamation, but through collaboration between nations, cooperation between people, and sustained commitment by institutions. His article should be required reading for anyone with even a passing interest in space agriculture."

Contacts and sources:
Maria Hrynkiewicz
De Gruyter Open

Citation: Agriculture for Space: People and Places Paving the Way
Raymond M. Wheeler
Published Online: 2017-02-10 | DOI:

Search for Stellar Survivor of a Supernova Explosion

Astronomers have used the NASA/ESA Hubble Space Telescope to observe the remnant of a supernova explosion in the Large Magellanic Cloud. Beyond just delivering a beautiful image, Hubble may well have traced the surviving remains of the exploded star’s companion.

A group of astronomers used Hubble to study the remnant of the Type Ia supernova explosion SNR 0509-68.7 — also known as N103B (seen at the top). The supernova remnant is located in the Large Magellanic Cloud, just over 160 000 light-years from Earth. In contrast to many other supernova remnants, N103B does not appear to have a spherical shape but is strongly elliptical. Astronomers assume that part of the material ejected by the explosion hit a denser cloud of interstellar material, which slowed its expansion. The fact that the shell of expanding material is open on one side supports this idea.

This image, taken with the NASA/ESA Hubble Space Telescope, shows the supernova remnant SNR 0509-68.7, also known as N103B (top of the image). N103B was a Type Ia supernova, located in the Large Magellanic Cloud — a neighbouring galaxy of the Milky Way. Owing to its relative proximity to Earth, astronomers observe the remnant to search for a potential stellar survivor of the explosion.

The orange-red filaments visible in the image show the shock fronts of the supernova explosion. These filaments allow astronomers to calculate the original centre of the explosion. The filaments also show that the remnant is no longer expanding as a sphere, but is elliptical in shape. Astronomers assume that part of the material ejected by the explosion hit a denser cloud of interstellar material, which slowed its expansion. The fact that the shell of expanding material is open on one side supports this idea.

The gas in the lower half of the image and the dense concentration of stars in the lower left are the outskirts of the star cluster NGC 1850, which has been observed by Hubble in the past [heic0108].

Credit:ESA/Hubble, NASA

The relative proximity of N103B allows astronomers to study the life cycles of stars in another galaxy in great detail, and perhaps even to lift the veil on questions surrounding this type of supernova. The predictable luminosity of Type Ia supernovae means that astronomers can use them as cosmic standard candles to measure distances, making them useful tools in studying the cosmos. Their exact nature, however, is still a matter of debate. Astronomers suspect Type Ia supernovae occur in binary systems in which at least one of the stars in the pair is a white dwarf.
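The standard-candle idea rests on the distance modulus: because a Type Ia supernova's peak absolute magnitude is roughly fixed, its observed brightness gives its distance. A minimal sketch (the calibration value of about -19.3 mag is a commonly quoted figure, assumed here for illustration and not taken from the article):

```python
import math

# Assumed peak absolute magnitude for a typical Type Ia supernova.
M_TYPE_IA = -19.3

def distance_modulus(d_pc):
    """mu = m - M = 5 * log10(d / 10 pc) for a distance d in parsecs."""
    return 5.0 * math.log10(d_pc / 10.0)

def apparent_magnitude(d_pc, M=M_TYPE_IA):
    """Predicted apparent peak magnitude of a standard candle at distance d."""
    return M + distance_modulus(d_pc)

# At the LMC's distance of roughly 50 kpc (about 160,000 light-years):
mu_lmc = distance_modulus(50_000)   # distance modulus of the LMC
m_peak = apparent_magnitude(50_000)  # a Type Ia there would be very bright
```

Inverting the same relation, a measured apparent magnitude yields the distance, which is how these supernovae serve as cosmic yardsticks.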

A white dwarf is the small, dense core of a medium-mass star that is left behind after it has reached the end of its main-sequence lifetime and blown off its outer layers. Our own Sun is expected to become a white dwarf in around five billion years.

There are currently two main theories describing how these binary systems become supernovae. Studies like the one that has provided the new image of N103B — that involve searching for remnants of past explosions — can help astronomers to finally confirm one of the two theories.

This video starts with a wide-field view of the night sky, as seen from the ground, displaying the Large and the Small Magellanic Clouds. It zooms in on the Large Magellanic Cloud, a satellite galaxy of the Milky Way, and onto the star cluster NGC 1850. Just next to the bright cluster Hubble observed the supernova remnant N103B. In the remnant of this supernova astronomers hope to find the surviving star of a supernova explosion.

Credit: ESA/Hubble, Nick Risinger, R. Gendler & ESO. Music: Johan Monell

One theory assumes that both stars in the binary are white dwarfs. If the stars merge with one another it would ultimately lead to a supernova explosion of type Ia.

The second theory proposes that only one star in the system is a white dwarf, while its companion is a normal star. In this theory material from the companion star is accreted onto the white dwarf until its mass reaches a limit, leading to a dramatic explosion. In that scenario, the theory indicates that the normal star should survive the blast in at least some form. However, to date no residual companion around any type Ia supernova has been found.

Astronomers observed the N103B supernova remnant in search of such a companion. They looked at the region in H-alpha — which highlights regions of gas ionised by the radiation from nearby stars — to locate the supernova shock fronts. They hoped to find a star near the centre of the explosion, which is indicated by the curved shock fronts. The discovery of a surviving companion would put an end to the ongoing discussion about the origin of Type Ia supernovae.

The supernova remnant N103B can be found in the Large Magellanic Cloud, a satellite galaxy of the Milky Way. The elliptical-shaped gas cloud is the leftover of the past explosion and astronomers investigate it in the hope of finding the remains of the exploded star’s companion.

This pan shows N103B as well as the outskirts of the star cluster NGC 1850.

Credit: ESA/Hubble. Music: Johan Monell

And indeed they found one candidate star that meets the criteria — for star type, temperature, luminosity and distance from the centre of the original supernova explosion. This star has approximately the same mass as the Sun, but it is surrounded by an envelope of hot material that was likely ejected from the pre-supernova system.

This ground-based image shows both the Small and the Large Magellanic Clouds — two satellite galaxies of the Milky Way. The Small Magellanic Cloud can be seen on the left, the Large Magellanic Cloud on the right.

This photo was taken by the Japanese astrophotographer Akira Fujii.

Credit: A. Fujii

Although this star is a reasonable contender for N103B’s surviving companion, its status cannot be confirmed yet without further investigation and a spectroscopic confirmation. The search is still ongoing.

The Hubble Space Telescope is a project of international cooperation between ESA and NASA.

Contacts and sources:
You-Hua Chu, Institute of Astronomy and Astrophysics, Academia Sinica
Mathias Jäger, ESA/Hubble

Reusable Carbon Nanotubes Can Be the Water Filter of the Future

A new class of carbon nanotubes could be the next-generation clean-up crew for toxic sludge and contaminated water, say researchers at Rochester Institute of Technology.

Enhanced single-walled carbon nanotubes offer a more effective and sustainable approach to water treatment and remediation than the standard industry materials—silica gels and activated carbon—according to a paper published in the March issue of Environmental Science: Water Research & Technology.

RIT researchers John-David Rocha and Reginald Rogers, authors of the study, demonstrate the potential of this emerging technology to clean polluted water. Their work applies carbon nanotubes to environmental problems in a specific new way that builds on nearly two decades of nanomaterial research. Nanotubes are more commonly associated with fuel-cell research.

Single-walled carbon nanotubes filter dirty water in experiments at RIT.

Credit: John-David Rocha and Reginald Rogers

“This aspect is new—taking knowledge of carbon nanotubes and their properties and realizing, with new processing and characterization techniques, the advantages nanotubes can provide for removing contaminants from water,” said Rocha, assistant professor in the School of Chemistry and Materials Science in RIT’s College of Science.

Rocha and Rogers are advancing nanotube technology for environmental remediation and water filtration for home use.

“We have shown that we can regenerate these materials,” said Rogers, assistant professor of chemical engineering in RIT’s Kate Gleason College of Engineering. “In the future, when your water filter finally gets saturated, put it in the microwave for about five minutes and the impurities will get evaporated off.”

Carbon nanotubes are tiny tube-shaped structures with diameters about 50,000 times smaller than the width of a human hair. Carbon reduced to the nanoscale defies the rules of classical physics and operates in a world of quantum mechanics in which small materials become mighty.
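As a rough sanity check on that comparison (not a figure from the study), dividing an assumed typical hair width of about 75 micrometers by 50,000 lands in the single-nanometer range characteristic of single-walled nanotubes:

```python
# Back-of-the-envelope check of the scale comparison above.
# The hair width is an assumed typical value, not data from the paper.

hair_width_m = 75e-6     # typical human hair diameter, in meters (assumed)
scale_factor = 50_000    # "about 50,000 times smaller", per the researchers

nanotube_diameter_m = hair_width_m / scale_factor
print(nanotube_diameter_m * 1e9, "nm")  # ~1.5 nm, plausible for a single-walled nanotube
```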

“We know carbon as graphite for our pencils, as diamonds, as soot,” Rocha said. “We can transform that soot or graphite into a nanometer-type material known as graphene.”

A single-walled carbon nanotube is created when a sheet of graphene is rolled up. The physical change alters the material’s chemical structure and determines how it behaves. The result is “one of the most heat conductive and electrically conductive materials in the world,” Rocha said. “These are properties that only come into play because they are at the nanometer scale.”

The RIT researchers created new techniques for manipulating the tiny materials. Rocha developed a method for isolating high-quality, single-walled carbon nanotubes and for sorting them according to their semiconductive or metallic properties. Rogers redistributed the pure carbon nanotubes into thin papers akin to carbon-copy paper.

“Once the papers are formed, now we have the adsorbent—what we use to pull the contaminants out of water,” Rogers said.

The filtration process works because “carbon nanotubes dislike water,” he added. Only the organic contaminants in the water stick to the nanotube, not the water molecules.

“This type of application has not been done before,” Rogers said. “Nanotubes used in this respect is new.”

Co-authors on the paper are Ryan Capasse, RIT chemistry alumnus, and Anthony Dichiara, a former RIT post-doctoral researcher in chemical engineering now at the University of Washington.

Contacts and sources:
Susan Gawlowicz
Rochester Institute of Technology

You Are Living on a Giant Musical Instrument and This Is What It Sounds Like

The ancients believed that the Earth was surrounded by celestial spheres, which produced divine music when they moved. We lived, so to speak, in a huge musical instrument.

This may sound silly but modern science has proved them right to a certain extent. Satellites recording sound waves resonating with the Earth’s magnetosphere – the magnetic bubble that protects us from space radiation – show that we are indeed living inside a massive, magnetic musical instrument.

Two key things control how the notes of musical instruments sound: the size and shape of the instrument and the speed of sound throughout it.

These determine the pitch of the notes and the timbre, the character or quality of the sound, via the standing waves or resonances that are excited within the instrument as sound waves bounce around it. It’s elegantly simple, yet explains the rich variety of musical sounds that are possible.
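The resonance idea is simple enough to compute. For an idealized pipe open at both ends (an illustration, not part of the research; the length and 343 m/s speed of sound are assumed values), the resonant frequencies follow directly from size and sound speed:

```python
# Resonances of an idealized open-open pipe: f_n = n * v / (2 * L).
# Length and speed of sound are illustrative, assumed values.

def pipe_harmonics(length_m, n_modes=3, speed_of_sound=343.0):
    """First n_modes resonant frequencies (Hz) of an open-open pipe."""
    return [n * speed_of_sound / (2.0 * length_m) for n in range(1, n_modes + 1)]

# A 0.5 m tube resonates at 343 Hz and at integer multiples of it —
# the pattern of these standing waves is what sets pitch and timbre.
print(pipe_harmonics(0.5))  # [343.0, 686.0, 1029.0]
```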

The same is true within Earth’s protective magnetosphere, which is carved out by the solar wind. There are always a few sound waves – oscillations in pressure which travel through the medium that they’re in – travelling around in space.

Well, they aren’t exactly the same type of sound waves that we get on Earth. Space is filled with plasma rather than normal gas: a different state of matter made of charged particles which can generate and be affected by electric and magnetic fields. These kinds of interactions can give rise to the plasma-equivalent of sound waves: magnetosonic waves. These too are pressure waves, but with some added magnetism.

Such “magnetosonic” waves can bounce around within the magnetosphere and often set up “resonances”, where the frequency is just right so that these waves grow and grow in energy rather than fizzling out quickly.

Most musical instruments support just one type of resonance – be that the vibrations of a string such as in a guitar, surface waves on a membrane like on a drum, or sound within a cavity like in a flute. However, the magnetosphere has analogues of all three of these types of resonance going on at once.

The magnetosphere is almost always changing – it grows and shrinks in direct response to the ever fluctuating solar wind. One would imagine this should change the notes of the magnetosphere, given how a musical instrument works.

This is a topic I’ve been working on recently. The problem is that you can’t just listen to how the notes change: it’s often not possible to be sure what triggered the detected waves, or what sort of resonance built up, simply because we don’t have satellites placed at all points throughout this “instrument” listening to these sounds.

We can’t actually hear these magnetosonic waves in space – their frequencies lie far below the threshold of human hearing. But satellites can pick them up, and we can then amplify them and squash them in time to make them audible.
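The "squash them in time" step is just playback at a much higher speed, which multiplies every frequency by the same factor. A minimal sketch, with an assumed (illustrative) speed-up factor:

```python
# Speeding up playback scales every frequency by the playback factor.

def audible_frequency(original_hz, speedup):
    """Frequency heard when a recording is replayed `speedup` times faster."""
    return original_hz * speedup

def is_audible(hz):
    """Rough human hearing range, ~20 Hz to ~20 kHz."""
    return 20.0 <= hz <= 20_000.0

# A 5 mHz magnetospheric oscillation is far below hearing; replayed
# 100,000x faster (an illustrative factor) it lands around 500 Hz.
f = audible_frequency(0.005, 100_000)
print(f, is_audible(0.005), is_audible(f))
```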


These notes are hidden among the full set of space sounds I’ve posted online and now you can download the whole lot to do what you like with them. In fact, I’m inviting short films that incorporate these sounds in some creative way as part of a competition. This is your chance to play the strange magnetic musical instrument that you’ve unwittingly been living inside your whole life – whether you manage to produce divine melodies or not.

Contacts and sources:
University of Sydney
Martin Archer, Space Plasma Physicist, 
Queen Mary University of London

Is Insect Farming the Future of Food?

The global human population is expected to reach 9 billion by the year 2050. With a growing middle class and the westernisation of diet in developing countries, agricultural output is already strained. Across East and South East Asia the amount of meat consumed per capita has more than doubled since 1980, while China has seen per capita meat consumption increase by around 400% across the same period. It is thought that global agriculture will need to increase output by as much as 70% over the next 30 years to meet this growing demand for meat.

Insects are likely to be a valuable source of protein for people and livestock in the future. There is rapidly growing interest in farming insects in the UK and around the world. Over 90 researchers, farmers, investors and entrepreneurs will be meeting in London on Tuesday 4 April 2017 to discuss the development of insect farming as part of supply chains in the UK.

Experts will have wide-ranging discussions about the nutritional quality of insects, their environmental impact as farmed animals and the psychology of marketing insect products. The meeting will also be an opportunity for debate about the practical and legal aspects of insect farming.

Credit: Wikimedia Commons

“We face a growing world population with 9 billion people expected by 2050 and 11 billion by the end of this century. With global agriculture stretched to its limits, new protein sources are being explored including the consumption of insects. The only way to provide a large and sustainable source of insects is to farm them and we will be asking if this can be achieved in the UK," says Peter Smithers, Honorary Fellow of the Royal Entomological Society and meeting co-organiser.

“This meeting will look to answer important questions about insect farming in this country. Are there waste streams available that could be used as food for insects? Can production be scaled up to meet future demand and offer an affordable product? What new legislation is required to offer proper safeguards to consumers and how will Brexit affect this emerging industry? We will evaluate the prospect of insect farming in the UK and generate a plan for the future of this exciting industry.”

The meeting has been jointly organised by Woven Network, The Royal Entomological Society, British Ecological Society and ADAS to bring together expertise and organisations that see a bright future for six-legged livestock.

According to Nick Rosseau, Managing Director at Woven: "Since Woven was set up two years ago we have seen massive growth in interest in the role of insect protein in the food chain. The challenges include setting clear standards to reassure consumers and farming innovation to drive down the cost of insect material. This event will continue to develop an insect farming community and an exciting new element within UK agriculture."

Contacts and sources: 
British Ecological Society (BES)

Explaining the Accelerating Expansion of The Universe Without Dark Energy

Enigmatic 'dark energy', thought to make up 68% of the universe, may not exist at all, according to a Hungarian-American team. The researchers believe that standard models of the universe fail to take account of its changing structure, but that once this is done the need for dark energy disappears. The team publish their results in a paper in Monthly Notices of the Royal Astronomical Society.

Still from an animation that shows the expansion of the universe in the standard 'Lambda Cold Dark Matter' cosmology, which includes dark energy (top left panel, red), the new Avera model, that considers the structure of the universe and eliminates the need for dark energy (top middle panel, blue), and the Einstein-de Sitter cosmology, the original model without dark energy (top right panel, green).

Credit: István Csabai et al.

The panel at the bottom shows the increase of the 'scale factor' (an indication of the size) as a function of time, where 1 Gya is 1 billion years ago. The growth of structure can also be seen in the top panels. One dot roughly represents an entire galaxy cluster. Units of scale are in megaparsecs (Mpc), where 1 Mpc is around 31 million million million km.

Our universe was formed in the Big Bang, 13.8 billion years ago, and has been expanding ever since. The key piece of evidence for this expansion is Hubble’s law, based on observations of galaxies, which states that, on average, the speed with which a galaxy moves away from us is proportional to its distance.

Astronomers measure this velocity of recession by looking at lines in the spectrum of a galaxy, which shift more towards red the faster the galaxy is moving away. From the 1920s, mapping the velocities of galaxies led scientists to conclude that the whole universe is expanding, and that it began life as a vanishingly small point.
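Hubble's law can be sketched in a few lines. The Hubble constant below is an assumed round value (roughly 70 km/s per megaparsec), not a figure quoted in the article, and the redshift-to-distance estimate is only valid at low redshift:

```python
# Hubble's law: v = H0 * d. At low redshift, v ~ c * z, so a galaxy's
# distance can be estimated from the redshift of its spectral lines.

C_KM_S = 299_792.458   # speed of light, km/s
H0 = 70.0              # Hubble constant, km/s per Mpc (assumed round value)

def recession_velocity(distance_mpc, h0=H0):
    """Recession velocity (km/s) of a galaxy distance_mpc megaparsecs away."""
    return h0 * distance_mpc

def distance_from_redshift(z, h0=H0):
    """Low-redshift distance estimate (Mpc) from spectral redshift z."""
    return C_KM_S * z / h0

# A galaxy 100 Mpc away recedes at about 7,000 km/s;
# a redshift of z = 0.01 corresponds to roughly 43 Mpc.
print(recession_velocity(100.0), round(distance_from_redshift(0.01)))
```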

In the second half of the twentieth century, astronomers found evidence for unseen 'dark matter' by observing that something extra was needed to explain the motion of stars within galaxies. Dark matter is now thought to make up 27% of the content of universe (in contrast 'ordinary' matter amounts to only 5%).

Observations of the explosions of white dwarf stars in binary systems, so-called Type Ia supernovae, in the 1990s then led scientists to the conclusion that a third component, dark energy, made up 68% of the cosmos, and is responsible for driving an acceleration in the expansion of the universe.


In the new work, the researchers, led by PhD student Gábor Rácz of Eötvös Loránd University in Hungary, question the existence of dark energy and suggest an alternative explanation. They argue that conventional models of cosmology (the study of the origin and evolution of the universe) rely on approximations that ignore its structure, assuming that matter has a uniform density.

"Einstein’s equations of general relativity that describe the expansion of the universe are so complex mathematically, that for a hundred years no solutions accounting for the effect of cosmic structures have been found. We know from very precise supernova observations that the universe is accelerating, but at the same time we rely on coarse approximations to Einstein’s equations which may introduce serious side-effects, such as the need for dark energy, in the models designed to fit the observational data." explains Dr László Dobos, co-author of the paper, also at Eötvös Loránd University.

In practice, normal and dark matter appear to fill the universe with a foam-like structure, where galaxies are located on the thin walls between bubbles, and are grouped into superclusters. The insides of the bubbles are in contrast almost empty of both kinds of matter.

Using a computer simulation to model the effect of gravity on the distribution of millions of particles of dark matter, the scientists reconstructed the evolution of the universe, including the early clumping of matter, and the formation of large scale structure.

Unlike conventional simulations with a smoothly expanding universe, taking the structure into account led to a model where different regions of the cosmos expand at different rates. The average expansion rate, though, is consistent with present observations, which suggest an overall acceleration.

Dr Dobos adds: "The theory of general relativity is fundamental in understanding the way the universe evolves. We do not question its validity; we question the validity of the approximate solutions. Our findings rely on a mathematical conjecture which permits the differential expansion of space, consistent with general relativity, and they show how the formation of complex structures of matter affects the expansion. These issues were previously swept under the rug but taking them into account can explain the acceleration without the need for dark energy."

If this finding is upheld, it could have a significant impact on models of the universe and the direction of research in physics. For the past 20 years, astronomers and theoretical physicists have speculated on the nature of dark energy, but it remains an unsolved mystery. With the new model, Csabai and his collaborators expect at the very least to start a lively debate.

Gábor Rácz wrote the code for the new model, and István Szapudi (University of Hawaii) and István Csabai developed the theoretical interpretation.

Contacts and sources:
Dr Robert Massey, Royal Astronomical Society
Dr Morgan Hollis, Royal Astronomical Society
Dr László Dobos, Eötvös Loránd University, Budapest, Hungary
Prof Istvan Szapudi, Institute for Astronomy, University of Hawaii

Their work is described in "Concordance cosmology without dark energy", Gábor Rácz, László Dobos, Róbert Beck, István Szapudi and István Csabai, Monthly Notices of the Royal Astronomical Society, Oxford University Press, in press.

Legends of the Lost Reservoirs

Tucked away in a laboratory in University of Cincinnati’s Braunstein Hall are tubes of rock and dirt that quietly tell a story –– a story that looks back on ancient society’s early water conservation. UC researchers hope the story will aid in the future preservation of our planet’s most precious resource.

In an effort to help manage the world’s water supply more efficiently, an interdisciplinary team of University of Cincinnati researchers from the departments of anthropology, geography and geology have climbed through rainforests, dug deep under arid deserts and collaborated with scientists around the world to look at how ancient humans manipulated their environment to manage water.

“We begin by asking, ‘What is water to humans, how do we engage with it and how does the environment engage us?’” asks Vernon Scarborough, professor and department head in UC’s Department of Anthropology. “When we look at the trajectory of our changing climate, we realize that the issue is not just climate change but also water change. Climate and water work synergistically and can affect one another in critical ways.

LIDAR images of the elevation levels in Chaco Canyon, New Mexico, reveal ancient dunes, canals, building structures and rincon watershed areas, as part of a poster presentation by the UC research team at the 2016 annual Society for American Archaeology meeting.

Credit: Slide photos/Christopher Carr

“Given the current climate patterns, in this and the next century, we will likely face further rising sea levels, less potable water and a compromised availability of freshwater as a result of drought in many areas and unusually heavy rains and runoff in others.

“So we are looking at how the past can inform the present,” adds Scarborough.

High-tech collaboration

To face future sustainability and water management issues, UC’s interdisciplinary team of real-world “Indiana Jones” employ modern technology to peek inside ancient irrigation communities in obscure places around the globe like the arid American Southwest and humid rainforests in Central America and Southeast Asia.

“The point of these projects is to help, in part, create effective modern water policy,” says Scarborough, who also works closely with the United Nations Educational, Scientific and Cultural Organization (UNESCO). “Exploring all these unique points on the globe is the only way we’re going to get at it, and it’s our teamwork, communication and cooperation that will make this project so successful.”

UC grad student Jon-Paul McCool excavates the berm wall of an ancient Chaco Canyon canal while UC Professor Christopher Carr takes notes.

Credit: UC

As a result of their collaboration, several members of UC’s research team will be presenting the outcome of their field work at one or both of two upcoming prestigious scientific annual meetings: the 77th annual Society for Applied Anthropology meeting in Santa Fe, New Mexico, and the 82nd annual Society for American Archaeology meeting in Vancouver, British Columbia, Canada. Both are meeting this week.

For more than two decades, the researchers have worked closely together in remote areas known for their seasonal water and environmental challenges. One core investigation lies deep in Chaco Canyon, New Mexico, home of an ancestral Puebloan community –– ancestors of modern Puebloans –– that thrived for more than 300 years in a dry desert in the middle of the American Southwest.

UC Professor Nick Dunning (on ladder) records alluvial stratigraphy in a Chaco Canyon arroyo while UC Professor Vern Scarborough looks on.

Credit: UC

Scientists have long debated whether this area was truly a sustainable thriving community based on local resource access or an occasional gathering spot for ceremonial rituals dependent on importing food and related supplies.

To create a comprehensive snapshot for how ancestral Native American Puebloans managed water and survived in the ancient desert, UC’s research team used aerial surface imaging technology, mass spectrometry and geochemical soil sampling, as well as anthropological behavioral and DNA studies and soil excavations around ancient structures to help shed significant light on that mystery.


In the field

Nicholas Dunning and Christopher Carr, both UC professors of geography, looked broadly at the geographic area, documenting and sampling the stratified layers of rock and sediment, while Lewis Owen, a UC professor of geology, used optically stimulated luminescence, a technique for accurately determining the age of core sand and soil samples.

“We found geochemical evidence for corn grown in the area during this time, which is a very water-intensive crop, as well as sophisticated irrigation and water-management techniques,” says Kenneth Tankersley, UC associate professor of anthropology and geology.

To get a 3-D look at the surface of the canyon, Carr used sophisticated LIDAR, or light detection and ranging, technology to measure the surface elevation of the ground from an airplane.

“This technology uses a laser beam to measure the morphology of the surface and is totally revolutionizing archaeology,” says Carr. “The key thing LIDAR gives us is elevation so we know how the water flows off the mesa tops into the drainage ditches and into the valley floors.

“LIDAR ultimately tells the archaeologists where to excavate and look for evidence of agriculture, canals and water control gates beneath the surface.”
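The flow-routing idea behind that quote can be sketched as a toy "steepest descent" pass over an elevation grid: each cell drains to its lowest neighbor. The grid values below are invented for illustration, not Chaco LIDAR data:

```python
# Toy steepest-descent flow routing on a small elevation grid,
# the same basic idea used on LIDAR surface models to trace runoff.

def steepest_neighbor(grid, r, c):
    """Return (row, col) of the lowest lower neighbor, or None if (r, c) is a sink."""
    rows, cols = len(grid), len(grid[0])
    best, best_cell = grid[r][c], None
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            rr, cc = r + dr, c + dc
            if (dr, dc) != (0, 0) and 0 <= rr < rows and 0 <= cc < cols:
                if grid[rr][cc] < best:
                    best, best_cell = grid[rr][cc], (rr, cc)
    return best_cell

# Elevations fall from the top-left "mesa" toward the bottom-right "valley".
elevation = [
    [9.0, 8.0, 7.0],
    [8.0, 6.0, 5.0],
    [7.0, 5.0, 3.0],
]
print(steepest_neighbor(elevation, 0, 0))  # water on the mesa top flows to (1, 1)
```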

Salty survival

To uncover the thousand-year-old secrets for survival held in the geochemical deep-core soil samples, Tankersley, along with Owen and Warren Huff, UC professor of geology, employed laboratory sampling techniques to reveal that the high level of salt in the soil –– once thought by scientists to be harmful –– was in fact a form of calcium sulfate mineralization that may have enhanced the soil for the maize (corn) grown in the area.

“The surrounding mesas provided water in their springs after the snow melted,” says Tankersley. “During the rainy season when floodwaters hit, the Puebloans would capture runoff water from small canyons known as the rincons and local periodic streams such as Chaco Wash and Escavada Wash.”

UC professors Lewis Owen (L) and Ken Tankersley (R) collect ancient soil samples at Operation 100 in Chaco Canyon, New Mexico
Credit:  photo/Nick Dunning

The researchers consider this strategy a reflection of risk aversion. “When it rained in one spot over here the Ancestral Puebloans took advantage of it, and when it rained over there they took advantage of that,” Scarborough says.

Alongside this expedient use of the landscape, two key members of the Chaco water management project –– Stephen Plog, professor of archaeology at the University of Virginia, and Adam Watson of the American Museum of Natural History –– were part of the collaborative team that used DNA sampling techniques on human remains to reveal a remarkable matrilineal family line connected through the female lineage.

High-flowing Chaco Wash following a heavy rainstorm in present day in Chaco Canyon, New Mexico.
Credit: photo/Samantha Fladd

“To effectively manage water requires flexibility and creativity, as rainfall is unpredictable in the Southwest,” says Samantha Fladd, an advanced doctoral student from the University of Arizona who is also working on the Chaco project at UC. “The presence of a hierarchical matriline helps to explain how Chaco residents coordinated these activities in order to practice successful water management and agriculture."

No forests, no rain

In contrast to Chaco Canyon’s desert aridity, many of the researchers also spent a significant amount of time in the Guatemalan rainforest around Tikal –– a Central American site that existed at about the same time as Chaco Canyon, more than a thousand years ago.

While the two environments couldn’t be more different in climate, the researchers found Tikal’s water issues just as challenging. David Lentz, UC professor of biology, with the assistance of Scarborough, Huff, Tankersley, Carr, Owen and Dunning, discovered how the Maya civilization survived in Tikal after suffering several droughts.

“Similar to Chaco Canyon, we found geochemical evidence for corn fields situated in specific environmental niches at Tikal,” says Dunning.

Aerial view of present-day Tikal's ancient building structures in Guatemala, Central America.

Photo/David Lentz

Scarborough speculates that the Maya channeled runoff during the rainy season and created elaborate water storage systems, allowing their civilization to thrive for more than three centuries. Eventually, the Maya not only suffered from a changing climate but also added to their own demise, say the researchers.

“Essentially, they may have effected a change in their own climate,” says Scarborough. “After several years of deforestation –– clearing out trees and forests to make room for crops –– the Maya unintentionally, but perhaps dramatically, upset their annual rainfall, which precipitated degrees of drought that ultimately forced them to abandon the once fertile environment. Sound familiar?”

Illustrated slide portraying ancient water management canals and reservoirs in Tikal, Guatemala. 
Illustration/Vern Scarborough

With recent funding from the National Science Foundation, Dunning will spend a fifth season this summer as a co-principal investigator on the Yaxnohcah project, along with Scarborough, Carr, other researchers and four UC students. The study looks at the development of ancient urbanism in relation to water, land and forest management in the Maya lowlands, and will be a presentation topic for Dunning and for Carr at the upcoming annual Society for American Archaeology meeting in Vancouver.

UC anthropology professor Vern Scarborough points out the LIDAR images of Chaco Canyon, New Mexico and aerial-view photographs of Tikal, Guatemala in Central America as part of a poster presentation at an earlier anthropology seminar.

Photo/Joseph Fuqua II/UC Creative Services

It takes a village

“Our collaborative research as a team is critical –– each one of us is an important cog in this investigation,” says Scarborough. “It takes each one of us and our individual expertise to effectively measure how well these early urban and rural communities adapted to climate change and managed their water resources.”

“We still have to deal with those same issues in our environment today. From an archaeological perspective, our changing climate is immediate, but it may be several years before the damage is fully apparent at a truly global scale,” Scarborough adds.

“We will begin to see sea levels rise by a good meter. Because over two-thirds of the largest cities on the planet occupy coastal margins, with estimates suggesting that an anticipated 80 percent of human population will gravitate toward urban settings in the near term, we really are approaching a truly ‘perfect storm’.”

While the researchers look to future water management as the direction of this research, they also focus on the constant changes to the landscape and the creatures that occupy these environments. Scarborough adds that if we are not careful, we will instigate even further change to a wide array of plant and animal species all over the world.

“If you don’t design for that appropriately, you can be building management networks and ways to capture and control water that will wind up getting buried like the build-up behind modern dams, or plans can get abandoned altogether as a river changes,” say Scarborough and Jon-Paul McCool, UC doctoral student under Dunning’s mentorship.

“How past populations dealt with variable precipitation like that identified at Tikal, Chaco Wash or drainage patterns overall has been very dynamic. Such investments in building massive dam projects today is a costly expenditure of money and time that might well benefit from views of the past.

“We don’t want to waste that money on high-priced water infrastructure if we can engage in smaller scale, lower investment strategies like our ancestors did.”

Contacts and sources:
Melanie Schefft
University of Cincinnati

Brain's 'GPS' Does a Lot More Than Just Navigate

The part of the brain that creates mental maps of one's environment plays a much broader role in memory and learning than was previously thought, according to new research published this week in the journal Nature by researchers at Princeton University.

"Almost 40 years of research suggested that a certain region of the brain was devoted to spatial navigation," said David Tank, Princeton's Henry L. Hillman Professor in Molecular Biology and co-director of the Princeton Neuroscience Institute. "We found that this same region is also involved when navigating not only spatial environments but also cognitive ones."

The study looked at a region of the brain called the hippocampus that has been known since the 1970s to become active when rats travel around their environments. That research, and related work showing that cells in the nearby entorhinal cortex fire when animals reach specific locations, led to the finding that the brain creates an internal representation of the outside world -- a sort of mental positioning system -- that tells an animal where it is in its environment. These findings earned three scientists the 2014 Nobel Prize in Physiology or Medicine.

 Rats were trained to depress a lever and then release it when the sound reached a certain frequency.
 Illustration by Julia Kuhl

Now researchers at Princeton have found that those same brain regions are active when the brain is exploring a very different kind of environment, one involving listening to sounds. The researchers monitored neural activity as the rats listened and responded to certain sounds, and found similar firing patterns to those seen when rats are exploring their environments.

The research addresses a longstanding mystery in neuroscience: how the hippocampus could be associated both with making maps of the external environment and with making new memories. People with damage to the hippocampus, such as the amnesia patient known by the initials H.M., who participated in five decades of studies until his death in 2008, lack the ability to form new memories.

In previous studies where scientists monitored the electrical activity of cells in the hippocampus, they found that the cells fired in sequences that represented where the animal was, which direction its head was facing, which direction it was traveling and where it was relative to a boundary, according to Dmitriy Aronov, first author on the paper who conducted the work while a postdoctoral researcher at the Princeton Neuroscience Institute and who is now an assistant professor of neuroscience at Columbia University. "The mystery was, what do these firing patterns have to do with memory?"

The researchers theorized that perhaps the hippocampus and the nearby entorhinal cortex, which work together to make these mental maps, were in fact not specific to mapping per se but were involved in more general cognitive tasks, and that mapping was just one aspect of larger cognitive tasks involving learning and memory. Perhaps the reason previous studies only turned up the location-finding tasks is because rats spend most of their time exploring their environments as they forage for food.

By giving the rats a different task, such as exploring sounds, the researchers might see evidence of cognitive activities in the hippocampal-entorhinal circuit. The researchers chose sound as an analogy to space because both can vary along a continuum: the rats can explore ever-increasing frequencies the way they would move forward along a lengthy corridor.

Researchers at Princeton found that brain cells known to be involved in making maps of the external environment, collectively known as the brain's "GPS," are also active in representing other tasks involved in memory and cognition. The researchers found that the neurons fired in sequence in accordance with the rats' activities as they listened to sounds and pressed levers to get rewards when the tone achieved a predetermined frequency. Below, the left diagram shows the neural activity (orange) in a group of neurons over a period of several seconds as the rats pressed and released the levers in response to the sounds they heard. The diagram on the right shows that individual cells (labeled 1 through 9) fire in sequence to represent the press and release of the lever.

Image courtesy of David Tank, Princeton University

To test the theory, the researchers monitored the electrical activity of neurons in the hippocampal and entorhinal regions while the rats manipulated sounds and learned to associate certain sound frequencies with rewards. Tank and Aronov teamed with undergraduate Rhino Nevers, Class of 2018, to conduct the work. The researchers first taught the rats to depress a lever to increase the pitch, or frequency, of a tone being played over a speaker. The rats learned that if they released the lever when the tone reached a predetermined frequency range, they would receive a reward.
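The reward contingency the rats learned can be sketched in a few lines of code. This is an illustrative simulation only; the start frequency, sweep rate and target band below are hypothetical values, not the parameters used in the study.

```python
# Hypothetical task parameters; the study's actual values differ.
START_HZ = 2_000        # tone frequency when the lever is first pressed
RISE_HZ_PER_S = 1_000   # how fast the pitch climbs while the lever is held
TARGET_HZ = (6_000, 8_000)  # releasing inside this band earns a reward

def run_trial(hold_seconds: float) -> bool:
    """One trial: the tone sweeps upward while the lever is held, and a
    reward is delivered only if the tone is inside the target band at
    the moment of release."""
    release_freq = START_HZ + RISE_HZ_PER_S * hold_seconds
    return TARGET_HZ[0] <= release_freq <= TARGET_HZ[1]

# A well-timed release lands in the band; an early release does not.
assert run_trial(hold_seconds=5.0)       # released at 7,000 Hz: rewarded
assert not run_trial(hold_seconds=2.0)   # released at 4,000 Hz: too low
```

The point of the design is that frequency, like position along a corridor, varies continuously, so the rat must track "where" it is along the sound axis to time its release.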

The team observed that the patterns of neuronal firing corresponded to the rats' behaviors during the task. Sequences of neural activity were produced as the rats advanced through the progression of frequencies, analogous to the sequences produced during traversing a progression of places in space. There were even patterns that corresponded to particular sound frequencies. The neurons involved in these firing patterns were identical to those involved in mapping and navigation. These cells included hippocampal place cells, so named because they fire when the rat is in a particular place, and entorhinal grid cells, which fire when the rats pass through certain locations.
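The sequence phenomenon described above can be illustrated with a toy model. The preferred frequencies and Gaussian tuning width below are hypothetical; the sketch only shows how cells tuned to different points on a continuum fire in order as the stimulus sweeps through it.

```python
import math

# Toy model of "place-cell-like" firing along the frequency axis: each
# model cell fires most strongly near its own preferred frequency, so a
# rising tone activates the cells one after another, in sequence.
PREFERRED_HZ = [3_000, 5_000, 7_000]  # hypothetical preferred frequencies

def firing_rate(cell: int, freq: float, width: float = 800.0) -> float:
    """Gaussian tuning curve with peak rate 1.0 at the preferred frequency."""
    return math.exp(-((freq - PREFERRED_HZ[cell]) ** 2) / (2 * width ** 2))

def most_active_cell(freq: float) -> int:
    """Index of the cell with the highest rate at this frequency."""
    return max(range(len(PREFERRED_HZ)), key=lambda c: firing_rate(c, freq))

# As the tone sweeps upward, the most active cell advances through the
# population, reproducing a firing sequence.
print([most_active_cell(f) for f in (2_500, 4_500, 6_500)])  # [0, 1, 2]
```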

The findings suggest that there are common mechanisms in the hippocampal-entorhinal system that can represent diverse sorts of tasks, said Tank, who is also director of the Simons Collaboration on the Global Brain. "The implication from our work is that these brain areas don't represent location specifically, but rather they can represent other relevant features of the animal's experience. When those features vary in a continuous way, sequences of neural activation are produced," Tank said.

The discovery fits with how we think about mapping our environment in the context of learning about new places and forming memories of experiences, said Aronov. "When you visit a new location, you don't only make a mental map, but you also form memories of your location. We feel that this study solves the mystery of the hippocampus in representing both memory and location, in that these neurons are general purpose neurons capable of representing any relevant information."

Contacts and sources:
Catherine Zandonella
Princeton University

Citation: Dmitriy Aronov, Rhino Nevers & David W. Tank, "Mapping of a non-spatial dimension by the hippocampal–entorhinal circuit," Nature 543, 719–722 (30 March 2017). doi:10.1038/nature21692

Thursday, March 30, 2017

Products as Pals: Are People Getting Too Friendly with Objects?

"Feeling left out has been shown to trigger primal, automatic responses in an attempt to compensate for threats to social belongingness."

The Liberty Mutual commercial mentions naming your car Brad and considering him part of your family.

It's easy to try to carry on a conversation with Siri on your iPhone or Amazon's Alexa device from your living room.

As technology has become an even bigger part of our lives, do electronic products and machines now help fill a void when we feel lonely?

According to a new study that includes a University of Kansas marketing professor, it appears these humanlike products do keep people from seeking out normal human interaction, which is typically how people try to recover from loneliness. However, there are limits to this phenomenon, and the long-term consequences are unclear, the researchers said.

Credit: Journal of Consumer Research

"Generally, when people feel socially excluded, they seek out other ways of compensating, like exaggerating their number of Facebook friends or engaging in prosocial behaviors to seek out interaction with other people," said Jenny Olson, assistant professor of marketing in the KU School of Business. "When you introduce a human-like product, those compensatory behaviors stop."

The findings by Olson, lead author James Mourey of DePaul University and Carolyn Yoon of the University of Michigan were recently published online in the Journal of Consumer Research, a leading academic marketing journal.

In four experiments, the researchers found evidence that people who felt socially excluded would exhibit those compensating behaviors unless they were given the opportunity to interact with an anthropomorphic product.

To establish feelings of loneliness, participants either wrote about an important time in their lives when they felt excluded — such as being stood up for the prom — or they played an online game of "catch" in which other participants stopped throwing them the ball and chose others after a few initial tosses. As part of the game, participants believed they were playing with real people online, but the other players were computerized.
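The scripted ball-tossing game is a standard ostracism manipulation, and its logic can be sketched in a few lines. The player names and the inclusion cutoff below are hypothetical, not the study's materials.

```python
import random

# Sketch of the scripted "catch" game: the participant is included for
# the first few tosses, then the computerized co-players throw only to
# each other, producing the feeling of exclusion.
PLAYERS = ["participant", "player_a", "player_b"]
INCLUSION_TOSSES = 3  # hypothetical number of tosses before exclusion begins

def choose_receiver(toss_number: int, thrower: str) -> str:
    """Pick who gets the ball next under the exclusion script."""
    candidates = [p for p in PLAYERS if p != thrower]
    if toss_number >= INCLUSION_TOSSES:
        # Exclusion phase: scripted players never pick the participant.
        candidates = [p for p in candidates if p != "participant"]
    return random.choice(candidates)

# After the cutoff, the ball never comes back to the participant.
assert all(choose_receiver(t, "player_a") != "participant"
           for t in range(INCLUSION_TOSSES, 20))
```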

After engaging with a Roomba vacuum whose design made it seem to be smiling, or after being asked to think about their cellphone in humanlike terms, such as "how much it helps you," participants no longer felt the need to plan time talking with family or friends or to seek out volunteer opportunities.

However, Olson and Mourey said the ability of these products to replace human contact has its limits, because certain statements seemed to snap participants back to reality.

"As soon as we tell people we know that it looks like the Roomba is smiling, they seemed to realize it was a machine and not a person," Olson said. "The effect goes away. This seems to be happening on a very subconscious level."

"Alexa isn't a perfect replacement for your friend Alexis," Mourey said. "But the virtual assistant can affect your social needs."

Olson said the research could help consumers realize how these types of products might thwart their motivation to interact with real people, especially as so many new products feature interactivity.

"If someone notices they are talking more to Siri lately, maybe that has something to do with feeling lonely," she said. "From that standpoint, it's important to be aware of it."

The study's findings might also help marketers recognize that it is possible to design products that increase the well-being of lonely individuals without negatively influencing genuine interpersonal interaction.

"Maybe it is more about bolstering our current relationships," Olson said, "such as taking a break from screen time and focusing on developing your authentic personal connections."

Contacts and sources: 
University of Kansas

Citation: James A. Mourey, Jenny G. Olson, Carolyn Yoon, "Products as Pals: Engaging with Anthropomorphic Products Mitigates the Effects of Social Exclusion," Journal of Consumer Research, ucx038.

Decorated Raven Bone May Provide New Insight into Neanderthal Cognition

The cognitive abilities of Neanderthals are hotly debated, but a raven bone fragment found at the Zaskalnaya VI (ZSK) site in Crimea features two notches that may have been made by Neanderthals intentionally to display a visually consistent pattern, according to a study by Ana Majkić of the Université de Bordeaux and colleagues, published in the open-access journal PLOS ONE on March 29, 2017.

Majkić and colleagues conducted a mixed-methods study to assess whether the two extra notches on the ZSK raven bone were made by Neanderthals with the intention of making the final series of notches appear evenly spaced. First, the researchers conducted a multi-phase experiment in which volunteers were asked to create evenly spaced notches in domestic turkey bones, which are similar in size to the ZSK raven bone.

Image legend: left, notched raven bone from the Zaskalnaya VI Neanderthal site, Crimea; center, experimental notching of a bird bone; right, sequences of experimentally made notches compared with those from Zaskalnaya VI.

Credit: Francesco d'Errico

Morphometric analyses revealed that the spacing of the experimental notches was comparable to the spacing of the notches on the ZSK raven bone, even when adjusted for errors in human perception. Archaeological specimens featuring aligned notches from other sites were also analyzed and compared with the ZSK specimen.
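One simple way to quantify how evenly a series of notches is spaced, in the spirit of such morphometric analyses, is the coefficient of variation of the inter-notch intervals. This is an illustrative measure, not necessarily the paper's exact metric, and the positions below are made up.

```python
# Coefficient of variation (CV) of inter-notch intervals as a regularity
# measure: a CV of 0 means perfectly even spacing. Positions (in mm)
# are hypothetical examples, not measurements from the ZSK bone.
def spacing_cv(notch_positions_mm: list[float]) -> float:
    """CV (std / mean) of the gaps between successive notch positions."""
    gaps = [b - a for a, b in zip(notch_positions_mm, notch_positions_mm[1:])]
    mean = sum(gaps) / len(gaps)
    variance = sum((g - mean) ** 2 for g in gaps) / len(gaps)
    return (variance ** 0.5) / mean

even = [0.0, 2.0, 4.0, 6.0, 8.0]    # evenly spaced notches
uneven = [0.0, 1.0, 4.0, 5.0, 8.0]  # same extent, irregular spacing
print(spacing_cv(even), spacing_cv(uneven))  # 0.0 0.5
```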

Researchers concluded that the two extra notches on the ZSK raven bone may have been made by Neanderthals intentionally to create a visually consistent, and perhaps symbolic, pattern.

A series of recent discoveries of altered bird bones at Neanderthal sites has led many researchers to argue that the objects were used as personal ornaments rather than being byproducts of butchery. This study, however, is the first to provide direct evidence supporting a symbolic interpretation of intentional modifications on a bird bone.

Contacts and sources: 
Sara Kassabian

Citation: Majkić A, Evans S, Stepanchuk V, Tsvelykh A, d'Errico F (2017) A decorated raven bone from the Zaskalnaya VI (Kolosovskaya) Neanderthal site, Crimea. PLoS ONE 12(3): e0173435. doi:10.1371/journal.pone.0173435 (Free)

Researchers Can Track Hazardous Chemicals from Fast-Food Wrappers in Our Bodies

Research teams from the University of Alabama at Birmingham’s School of Medicine and the University of Notre Dame have developed a new method that enables researchers to radiolabel three forms of perfluorinated and polyfluorinated alkyl substances and track the fate of these chemicals when they enter the body.

This is a significant and timely advancement in identifying and tracking these PFASs, which are known to be harmful to the human body, and just last month were found to be used extensively in fast-food wrapping paper at many popular chain restaurants.

Credit: UAB

The novelty of the newly designed method is that one of the fluorine atoms on the PFAS molecule was replaced with a radioactive form of fluorine, the same radioisotope fluorine-18 that is used for medical positron emission tomography scans in hospitals around the world.

“For the first time, we have a PFAS tracer or chemical that we have tagged to see where it goes in mice,” said Suzanne Lapi, Ph.D., senior author of the study published today in the journal Environmental Science & Technology. Lapi is an associate professor in UAB’s Departments of Radiology and Chemistry, and director of UAB’s Cyclotron Facility. “Each of the tracers exhibited some degree of uptake in all of the organs and tissues of interest that were tested, including the brain. The highest uptake was observed in the liver and stomach, and similar amounts were observed in the femur and lungs.”

In February, a study of more than 400 packaging materials found that 46 percent of food contact papers were contaminated with PFASs. PFASs are often used in stain-resistant products, firefighting materials and nonstick cookware, and are not meant for ingestion. Previous studies have shown that PFASs can migrate, contaminating food and, when consumed, accumulating in the body.
Credit: UAB

It now appears likely that any PFAS that can be synthesized and isolated could be radiolabeled and used to directly measure uptake and biodistribution kinetics in biological systems, opening the possibility of directly measuring uptake in human volunteers.

“This is possible since trace amounts of the compounds are easily measurable and the radioactivity short-lived,” said Graham Peaslee, Ph.D., the study’s co-author and professor of experimental nuclear physics in the College of Science at the University of Notre Dame. “It’s an important discovery because PFASs are really persistent chemicals that, once in the bloodstream, stay there and accumulate, which is not good.”
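The "short-lived" point follows from fluorine-18's physical half-life of roughly 110 minutes, so essentially all of a tracer's radioactivity is gone within a day. A quick back-of-the-envelope check:

```python
# Fluorine-18 decays with a half-life of about 110 minutes. The fraction
# of radioactivity remaining after t minutes is 0.5 ** (t / half_life).
F18_HALF_LIFE_MIN = 109.8  # approximate physical half-life of fluorine-18

def fraction_remaining(minutes: float) -> float:
    return 0.5 ** (minutes / F18_HALF_LIFE_MIN)

print(round(fraction_remaining(F18_HALF_LIFE_MIN), 3))  # 0.5 after one half-life
print(fraction_remaining(24 * 60) < 0.001)              # True: nearly gone in a day
```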

Health problems including kidney and testicular cancers, thyroid disease, low birth weight, immunotoxicity in children and other issues have been linked to PFASs in previous studies.

Now that researchers have for the first time identified which PFASs initially accumulate, and in which specific organs, with some surprising differences among compounds, the authors say the health implications extend far beyond this initial study.

“We are very excited about this technique, which borrows from our current work developing nuclear medicine imaging agents,” said Jennifer Burkemper, Ph.D., scientist in UAB’s Cyclotron Facility and the first author on the study. “This work can enable rapid screening of PFAS compounds to gain key insights into their biological fate.”

PFASs in the news

Fluorinated chemicals have been in the news a lot recently, especially PFASs. There have been industrial accidents like those uncovered near the Hoosic River in New York this past fall, and the DuPont settlement of $670 million last month related to the dumping of the toxic chemical C8, also known as perfluorooctanoic acid, into the Ohio River.

Another source of exposure to these chemicals was reported in February, when a survey found that one-third of fast-food wrappers had been treated with these fluorinated chemicals.

“There was concern that these chemicals might directly enter the food that was wrapped with treated packaging,” said Peaslee, who used particle-induced gamma-ray emission to make the findings reported in February. “A larger concern is that, because these chemicals persist for a long time in the environment, when the treated consumer products enter the landfill, these chemicals will re-emerge into our drinking water. These overall results already call into question the safety of these alternative shorter-chain PFAS compounds.”

Lapi says the novel tool developed by the research teams can also be used to study PFAS behavior in environmental remediation, measuring the fate of radiolabeled compounds in environmental treatment systems.

“This is a tremendous first step,” Lapi said, “and it underscores the need for further studies to aggressively investigate different PFAS compounds in different biological and environmental systems to assess the full impact of this novel radiosynthetic method.”

A look inside UAB's Cyclotron, which makes radioactive molecular imaging agents for nuclear medicine using a type of particle accelerator that moves protons, one kind of charged particle, along a spiral path to strike a material and produce radioisotopes.

Credit: UAB News

In addition to kidney and testicular cancer, scientists have previously found high cholesterol, thyroid disease, pregnancy-induced hypertension and ulcerative colitis to be correlated to the amount of perfluorooctanoic acid, or PFOA, found in the blood of people who were exposed to the tainted water.

As a result, the Environmental Protection Agency and U.S. manufacturers reached a compromise to voluntarily remove two specific “long-chain” PFASs, including PFOA, from the U.S. market by 2015. However, industry has switched from these “long-chain” forms to shorter-chain versions of the same chemicals, Peaslee says. There are no toxicology data available for most of the alternative short-chain PFAS compounds used commercially.

Additional importance, future steps

Lapi’s team makes radioactive molecular imaging agents for nuclear medicine using the UAB Imaging Facility’s cyclotron, a type of particle accelerator that moves protons, one kind of charged particle, along a spiral path to strike a material and produce radioisotopes. These radioisotopes can be chemically attached to molecules designed to home in on biological targets of interest, typically certain receptors on cancer cells or markers of lung and heart function. The team also develops tracers for neurology.
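The spiral path comes from basic cyclotron physics: in a fixed magnetic field B, a proton circles at the cyclotron frequency f = qB / (2πm), and since each pass through the accelerating gap adds energy, the orbit radius grows and the path becomes a spiral. A sketch with a hypothetical field strength (UAB's actual machine parameters are not given here):

```python
import math

# Cyclotron frequency of a proton: f = qB / (2 * pi * m).
# The 1.5 T field below is a hypothetical example value.
PROTON_CHARGE = 1.602e-19  # coulombs
PROTON_MASS = 1.673e-27    # kilograms

def cyclotron_frequency_hz(field_tesla: float) -> float:
    return PROTON_CHARGE * field_tesla / (2 * math.pi * PROTON_MASS)

# Roughly 23 million orbits per second in a 1.5 T field.
print(f"{cyclotron_frequency_hz(1.5) / 1e6:.1f} MHz")  # 22.9 MHz
```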

When the researchers examined the structure of the PFAS chemicals, Lapi says, her group thought the compounds' chemistry was amenable to radiolabeling with their techniques.

“Conventionally, tracing these PFAS compounds is very difficult,” Lapi said. “These compounds are not UV active, and they’re very difficult to detect. There are some techniques where you can detect total fluorine concentration, but that does not give you an idea of which compound the fluorine is attached to. With our method, we can actually tag the intact compound with a fluorine-18 radiotracer, and it gives us a handle so we can see where that compound is going and make very sensitive measurements. These sensitive measurements are probably the most important thing, because it’s so difficult to detect in other methods, where you would have to take the liver out, homogenize it, extract the chemical out and do mass spectrometry to see how much of the chemical is in there. And you’d have to do it with every single organ. For us, we can take the whole mouse, image it, and we’re done. Or we can take the tissues and we can count it, and we’re done. It’s a much quicker and less time-consuming method to look at where these go.”

So far, Lapi says, the group has looked at three compounds, far short of the hundreds of PFASs that have been identified.

“While I don’t think we will look at all of these PFASs, we would like to look at different families of these compounds and see how they are distributed in the body,” Lapi said. “Because even with very small changes in these compounds, we were able to see differences in brain uptake, which is important because these may have neurological impacts. We saw different clearance patterns, blood binding and other things. We want to look at different classes of compounds, how they’re excreted from the body, how they accumulate, and see if we can really say something about how you would get rid of these compounds.”

The next step after that would be to identify how the newly developed technique could be used to clean up these compounds in environmental situations where there is a contamination issue.

“We want to know if, say, we have a huge contaminated water supply full of PFASs, how do we make techniques to get PFAS out of the water supply,” she said. “Perhaps we can take a bucket of water, spike it with our radioactive substance, put it through filters and different types of cleanup technologies, and see how we can effectively extract that compound from the water supplies.”

Lapi and her team are excited that they have been able to show how to take techniques from nuclear medicine and previous UAB imaging studies and apply them to environmental compounds — a significant achievement moving forward.

“When people think of radiochemistry, they typically think of tritium or carbon-14 or these very long-lived compounds when doing these pharmacokinetic studies,” Lapi said. “Now we have a whole host of different radioisotopes with different chemical properties, and we really have these nice tools that we can use for different applications outside of nuclear medicine, like environmental cleanup applications.”

Contacts and sources: