Unseen Is Free

Sunday, April 19, 2015

Beavers The Size of Bears: Extinction, Survival and Evolution in Kentucky


Researchers at an old geological site talk 'dirt' about how Ice Age climate change led to the extinction of mammoths and mastodons, but to the evolution and survival of bison, deer and other present-day species.

This is an 18,000-year-old mastodon molar (Mammut americanum).
Credit: Tom Robinette, UC

"The answers to extinction, survival and evolution are right here in the dirt," says University of Cincinnati Quaternary science researcher Ken Tankersley, associate professor of anthropology and geology. "And we are continually surprised by what we find."

While many scientists focus on species' extinction wherever there has been rapid and profound climate change, Tankersley looks closely at why certain species survived.

For many years he has invited students and faculty from archeology and geology, and representatives from the Cincinnati Museum Center and Kentucky State Parks to participate in an in-the-field investigation at a rich paleontological and archeological site not too far from UC's campus.

Through a wealth of scientific data extracted from fossilized vegetation and the bones and teeth of animals and humans, Tankersley has been able to trace periods of dramatic climate change, determine which animals roamed the Earth during those epochs and establish how they survived. His most recent evidence also reveals when humans came on the scene and how they helped change the environment at Big Bone Lick, Kentucky.

"What we found is that deforestation efforts over 5,000 years ago by humans significantly modified the environment to the degree that the erosion began filling in the Ohio River Valley, killing off much of the essential plant life," says Tankersley. "At that point animals had to either move, evolve or they simply died off."

Tankersley will present the culmination of his years of Surviving Climate Change research - including countless hours in the field and in the lab as well as multiple published works - at the Society for American Archaeology annual meeting, April 15-19 in San Francisco, in a presentation titled "Quaternary Chronostratigraphy of Big Bone Lick, Kentucky, USA." He also has a paper, "Quaternary chronostratigraphy and stable isotope paleoecology of Big Bone Lick, Kentucky, USA," published online in the March issue of the prestigious journal Quaternary Research.

STUDENTS DIG DEEP FOR ANSWERS

Big Bone Lick (BBL) State Park in north-central Kentucky has over 25,000 years of well-preserved bones, rocks and other archeological treasures that have been easily accessible since the 1700s. But until recently, the evidence for why some of this region's former inhabitants evolved into present-day animals, while others simply died off, was buried deep in the sediment.

The site is only 20 minutes from UC's main campus by interstate, and Tankersley and his students have been taking advantage of BBL's rich and accessible history for the past three years. By digging through layers of soil, scavenging around in creek bottoms and scraping specimens from bone fragments, they have unearthed a treasure trove of ancient specimens - some more than 25,000 years old.

"One of my students, Stephanie Miller, discovered a 10-foot mastodon tusk beneath the water table at the bottom of a creek when she reached below the mud and felt a hard object pointed at the end," says Tankersley. "That tusk is now on display at the Cincinnati Museum Center."

Tankersley, who is proud of his part Native American Cherokee ancestry, feels that the drive for all this discovery is in his bones, too.

Tankersley originally thought that when the ice reached its maximum advance 25,000 years ago - covering the area now known as Sharonville - the mammoths were grazing on C4 tundra vegetation of herbaceous plants and sedges. To his surprise, he found that he couldn't have been more wrong.

While mammoths and mastodons are two distinct species of the proboscidean family, they were originally thought to have lived in different epochs and in separate areas of the world:


Mastodons existed earlier, about 27-30 million years ago, primarily in North and Central America.
Mammoths came on the scene 5.1 million years ago, arising out of Africa.

University of Cincinnati students hold a newly discovered 10-foot mastodon tusk.
Credit: Tom Robinette, UC

The evidence at BBL now shows that mammoths and mastodons both roamed together - possibly through intercontinental migration - and were both eating the same vegetation, even with the difference in the shape of their molars.

The original model of how the landscape and the animals' diets changed was completely wrong, which came as a big shock to Tankersley.

Tankersley's evidence also revealed significant periods of radical shifts in environmental temperature since the last glacial maximum more than 25,000 years ago, which increased sediment deposition beyond what the system could support. Those radical shifts from cold and dry to warm and moist significantly altered the landscape and its vegetation.

"To determine what animals roamed the area and how they survived, we looked at the stable carbon and nitrogen isotope chemistry of both the animals and plants that were in the sediment for the past 25,000 years," says Tankersley. "Since we are what we eat, we discovered that the mammoths, mastodons and bison were not eating the plants we originally thought. As it turns out, they were eating more C3 vegetation, which is tree leaves and weedy vegetation more like we see outside today."

After cataclysmic cosmic events caused temperatures to drop and darkened the air with clouds of poisonous gas, the resulting climate change made continued survival a challenge for most plant and animal species. According to Tankersley, life at that time became a true test of survival skills for all living things, so those that could move or adapt to their new surroundings survived - many by evolving into smaller, lighter and faster species.

Larger animal species that could not move fast or for long distances starved or were imprisoned in the muddy landscape and became easy prey for hungry predators.

"My students discovered all of this," says Tankersley. "My job in this 'Surviving Climate Change' project was to give them the resources and tools and teach them the scientific techniques we use, but then let them be the discoverers, which is exactly what happened."

SURVIVAL OF THE MOST FLEXIBLE

At BBL, Tankersley focused on which species survived and which ones went extinct. His team determined that during times when food sources were declining, animals had to move to more fruitful environments or learn to do with less food, which ultimately led to the evolution of today's surviving species.

Looking closer at those survival patterns, Tankersley found that species like caribou could no longer make a living in this area, but they could up north. And although bison are still around, they are a lot smaller than they were thousands of years ago.

The moral of this story, explains Tankersley, is that many species evolved into smaller animals over time as their food sources started to decline. While some larger species simply died off from a lack of necessary resources, bison and deer were two mammals that were able to survive by evolving a smaller body mass and shorter stature.

"If you look at a species and you have an environmental downturn or major change in the amount of solar radiation, the amount of water moisture and the amount of frost-free days, can all plants respond to that equally? Of course not," says Tankersley. "As individuals, we all have different tolerance levels for change. So in the case of the caribou, when the climate changed rapidly and profoundly it could no longer make a living at BBL. But it could continue to make a living up north where it had the environment for survival.

"Species get bigger when there is a lot of food available and smaller when there is not. So the bison downsized, but the mammoth and mastodon did not. They could neither move nor downsize quickly enough so they simply died off."

BEAVERS THE SIZE OF BLACK BEARS - OH MY! 

Tankersley's team also discovered different species within a species. For example, while there were small beavers then just like there are now, from 25,000 until 10,000 years ago there were also large beavers the size of black bears.

"The larger extinct beaver lost its battle to survive because it was dependent on a certain environment that was dying off, but the modern beaver could make its own environment and consequently survived," says Tankersley. "So there is a lesson there. Animals had to adapt, downsize or go extinct."

Last year Tankersley and his students excavated over 17,000 specimens that are now housed at the Cincinnati Museum Center. While digging up animal bones they found evidence of humans who had butchered these animals.

ENTER THE HUMANS

To effectively date the plant and animal specimens, Tankersley's students measured radiocarbon and optically stimulated luminescence (OSL) ages. Dating much of the material to 5,000 years ago using OSL procedures, Tankersley was shocked to find evidence for human activity and a new anthropological time period now called the Anthropocene - when humans became the most powerful natural force.
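
As a rough illustration of the radiocarbon half of that dating toolkit, a conventional age follows directly from the measured fraction of carbon-14 remaining. This is a minimal sketch using the standard Libby mean-life; the sample value is hypothetical, chosen only to land near the 5,000-year mark mentioned above. (OSL dating works differently, from the stored luminescence of quartz grains, and is not sketched here.)

```python
import math

# Conventional radiocarbon age from the fraction of 14C remaining.
# The constant 8033 years is the standard Libby mean-life of 14C;
# the fraction-modern value below is a hypothetical example.

LIBBY_MEAN_LIFE = 8033.0  # years

def radiocarbon_age_bp(fraction_modern: float) -> float:
    """Conventional 14C age (years BP) from the measured fraction modern."""
    return -LIBBY_MEAN_LIFE * math.log(fraction_modern)

print(round(radiocarbon_age_bp(0.54)))  # ~4950 years BP, near the 5,000-year mark
```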



"So much of science is serendipitous," claims Tankersley. "What the students discovered serendipitously, by dating these deposits, was that humans came in and broke the sod.

Deeper into the sediment, Tankersley found that humans had dug pits into the ground to process animal skins to wear as clothing. According to French ethnographic literature, the Native Americans put piles of rocks inside the pits along with hickory nuts, then used hot rocks to boil the water. The oily, greasy meat of the hickory nut would float to the top, and the non-edible remains like the shell and hull would sink to the bottom.

"They would skim it off and drink the water, as it was very nutritious," says Tankersley. "When they were finished, they would grab the softened deerskin and leave the rocks and nutshells behind, which is what we found."

Tankersley also found evidence for human deforestation: to protect their hickory-nut trees from squirrels and other animals, people cleared large areas of trees to create separate hickory-tree orchards, shielded from animal invasion. That deforestation and degradation resulted in substantial erosion of the uplands, which caused the overbank and backwater flooding of the Ohio Valley area.

The changing vegetation that resulted from this deforestation also contributed to the demise, adaptation or evolution of several species.

Furthermore, Tankersley and his students uncovered evidence for animals being hunted by humans during this same period. Looking closely at the hash marks on animal bones, they found strong clues that humans had greatly contributed to the extinction of some of the species at BBL, like the larger bison.

Consequently, through deforestation and arboriculture practices, and the hunting to extinction of many animal species, Tankersley found clear evidence that humans indeed contributed to the changing landscape as far back as 5,000 years ago.

"It's hard to believe, but there is no volcano, no earthquake or tsunami that is moving more sediment than we are," says Tankersley. "Humans are the most powerful force on the planet right now."

To guard against assuming landscape change or stability where none existed, Tankersley's team has shown that both natural and anthropogenic erosional processes were taking place 5,000 years ago. This activity is directly responsible for the primary and secondary deposits of animal, plant and human artifact remains at Big Bone Lick, Kentucky.



Contacts and sources:
Tom Robinette

The Forces That Move Stars in Galaxies

Cosmic accidents are frequent occurrences in space: two or more disk galaxies collide and form elliptical systems. These contain regions in which the stars orbit the centre in precisely the opposite direction to what happens in the rest of the galaxy. Previous attempts to explain this assumed the colliding galaxies had a special relative orientation (“retrograde”). Athanasia Tsatsi, a doctoral student at the Max Planck Institute for Astronomy in Heidelberg, has now found a further possibility: the mass loss of the galaxies involved acts as a kind of huge rocket engine.

Galaxies about to collide: Snapshots from the simulation in which Athanasia Tsatsi was able to prove the effect of the galactic rocket engine. Left: galaxies before the merger; right: the result afterwards.  
© B. Moster / MPIA

Elliptical galaxies form when at least two disk galaxies (our Milky Way is one such galaxy) collide with each other and coalesce. Unusual things may happen in such systems: while the stars in the outer regions all rotate in one direction, the orbital direction shared by the stars in the core region may be a completely different one.

Why is this? One can imagine that the central region of one of the predecessor galaxies is held together particularly well by the gravitational force of the mass assembled in it. Now, the orbital orientation of the stars in this predecessor galaxy is in precisely the opposite direction to the orbital direction in which the two predecessor galaxies orbited each other before the merger (“retrograde merger”).

Under these conditions, it is plausible that the stable central region becomes the heart of the new elliptical galaxy after the merger, and that the stars in it continue to orbit in precisely the same direction as before. The surrounding stars will move in the opposite direction, however – continuing in the orbital direction in which the predecessor galaxies orbited each other before the merger.

This model appears to work well, but predicts a lower number of counter-rotating cores than are actually observed.

This was the point of departure when Athanasia Tsatsi began her doctoral research at the Max Planck Institute for Astronomy in Heidelberg and evaluated computer simulations of galactic collisions. Tsatsi’s aim was actually to find out what the evolving elliptical galaxies would look like through different types of astronomical observation instruments.

Instead, the young researcher made an unexpected discovery when looking through such a “virtual telescope”: although the galaxy that formed in the simulated merger had a counter-rotating core, the predecessor galaxies by no means had the special orientation that, according to the conventional explanation, ought to be the precondition for the formation of the retrograde motion.

The result of the simulated merger did match what was already known from observations, however. At 130 billion solar masses, the resulting elliptical galaxy was one of the more massive representatives of its class; it is precisely in such high-mass elliptical galaxies that counter-rotating cores are particularly common and long-lived: they could still be detected in the simulations even two billion years or so after the merger.

Athanasia Tsatsi saw something in the simulation which all her predecessors had missed: as the cores of the two galaxies orbit each other, there comes a moment in time when the direction reverses. This reversal takes place just as their reciprocal gravitational effect is causing the two systems to lose significant amounts of mass – and especially stars from their outer regions.

What happens in such a galaxy is closely related to the special case of a problem which the Russian mathematician Ivan Vsevolodovich Meshchersky (1859 to 1935) investigated: point particles whose masses change over time and move under the reciprocal effect of their gravitational force. The change in mass means additional forces, also known as Meshchersky forces, come into play here.

The best-known example of such forces occurs with rocket propulsion: the rocket ejects hot gases from its nozzle; the force thereby exerted on the rocket is in the opposite direction and the rocket accelerates.
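
The same bookkeeping can be written down in a few lines. The sketch below integrates the one-dimensional variable-mass (Meshchersky) equation for a body shedding mass and recovers the classic rocket result that ejecting half the mass at unit exhaust speed yields a velocity change of about ln 2 ≈ 0.69. It is a toy model in arbitrary units, not the MPIA merger simulation.

```python
# Toy model of a Meshchersky force: Euler integration of the 1-D
# variable-mass equation, where mass ejected backwards at relative
# speed u_e produces a forward thrust u_e * (mass-loss rate).
# Arbitrary units; this is NOT the galaxy-merger simulation.

def variable_mass_velocity(m0=1.0, mdot=0.05, u_e=1.0, t_end=10.0, dt=1e-4):
    """Velocity gained by a body ejecting mass at rate mdot, speed u_e."""
    m, v, t = m0, 0.0, 0.0
    while t < t_end:
        v += (u_e * mdot / m) * dt  # Meshchersky (rocket) thrust term
        m -= mdot * dt              # the body loses the ejected mass
        t += dt
    return v, m

v, m = variable_mass_velocity()
print(f"final mass {m:.2f}, velocity {v:.3f}")  # ~0.50 and ~0.693 = ln 2
```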

This explains why counter-rotating cores can form even with galactic mergers with the same direction of rotation (“prograde merger”): The mass loss of the two galaxies has the same effect as a gigantic rocket engine and can therefore be powerful enough to reverse the orbital direction of the stars, which ultimately end up in the central region of the newly formed galaxy. Tsatsi therefore calls this way of forming counter-rotating cores the Meshchersky mechanism.

Although Athanasia Tsatsi’s discovery initially relates only to one individual case, it is sufficient to prove that central regions rotating in opposite directions really can form in this way. Next, the astronomers must find out how widespread formation processes of this type are – by investigating galactic mergers with a wide range of initial conditions.

If these systematic tests show that the Meshchersky mechanism for the formation of counter-rotating cores occurs frequently enough, this could explain the observed frequency of the phenomenon – theory and practice would then be in harmony.


Contacts and sources:
Dr. Markus Pössel
Max Planck Institute for Astronomy, Heidelberg

Proto-suns Teeming with Prebiotic Molecules

Complex organic molecules such as formamide, from which sugars, amino acids and even nucleic acids essential for life can be made, already appear in the regions where stars similar to our Sun are born. Astrophysicists from Spain and other countries have detected this biomolecule in five protostellar clouds and propose that it forms on tiny dust grains.

The nebula NGC 1333, one of the star-forming regions where formamide has been detected.

Credit:  NASA-Spitzer

One of science's greatest challenges is learning about the origin of life and its precursor molecules. Formamide (NH2CHO) is an excellent candidate for helping to search for answers, as it contains four essential elements (nitrogen, hydrogen, carbon and oxygen), and amino acids, carbohydrates, nucleic acids and other key compounds for living organisms can be synthesised from it.

As it turns out, this molecule is also abundant in space, mainly in molecular clouds, the concentrations of gas and dust where stars are born. This has been confirmed by an international team of researchers, including Spanish investigators, after searching for formamide in ten star-forming regions.

"We have detected formamide in five protosuns, which proves that this molecule (in all probability also true for our Solar System) is relatively abundant in molecular clouds and is formed in the very early stages of evolution towards a star and its planets," explains Ana López Sepulcre, lead author of the study and researcher at the University of Tokyo (Japan), to SINC.

The other five objects where formamide has not been detected are less evolved and colder, "which indicates that a minimum temperature is needed for it to be detected in the gas," adds the scientist.

The study, which has just been published in the 'Monthly Notices of the Royal Astronomical Society', also offers clues on how formamide could be created in interstellar conditions. "We propose that it is formed on the surface of the dust grains."


Contacts and sources:
Plataforma SINC

Citation: A. López-Sepulcre, Ali A. Jaber, E. Mendoza, B. Lefloch, C. Ceccarelli, C. Vastel, R. Bachiller, J. Cernicharo, C. Codella, C. Kahane, M. Kama, M. Tafalla. "Shedding light on the formation of the pre-biotic molecule formamide with ASAI". Monthly Notices of the Royal Astronomical Society, April 2015.

Inconspicuous, Tiny Particles Deform the Large-Scale Structure of the Universe

A systematic study of all massive galaxy clusters in the local universe provides information on the lightest elementary particles: scientists at the Max Planck Institute for Extraterrestrial Physics analysed an X-ray catalogue to show that there is less structure in the universe today than is expected from cosmic microwave background observations of the very early universe. This discrepancy can be explained if the three neutrino families have an overall mass of about half an electron-volt.

We are surrounded by them everywhere and they fly right through us, but we don’t feel them at all - neutrinos, the strangest of the known elementary particles. They hardly interact with other matter; every second billions fly right through the Earth, but only a tiny fraction gets stuck. They are left over in large numbers from the Big Bang, about 340 million per cubic metre on average. Together with photons, the particles of light, they are the most numerous elementary particles in the universe.

Projection of the three-dimensional distribution of galaxy clusters detected in X-rays by the ROSAT satellite. The data are shown in galactic coordinates with the galactic plane in the centre. The gap in the data is due to the “zone-of-avoidance”, an area around the galactic plane where the extinction by the galactic interstellar medium makes observations very difficult. Blue dots are in the northern sky, red dots in the southern sky.

Because of massive neutrinos, the amount of galaxy clusters with a given mass is smaller than predicted by the cosmological standard model based on the results from the Planck satellite.
Credit: © MPE

For a long time, neutrinos were thought to be massless. But now we know from observations of solar neutrinos and from terrestrial experiments that they do carry mass. We still don’t know how heavy they are, but due to their large number density they can contribute significantly to the mass density of the Universe even if they are relatively lightweight.

In space, another property of cosmic neutrinos becomes important: they are the fastest massive elementary particles left over from the Big Bang. While most other matter agglomerates under gravity over cosmic time into the large-scale structure we see today, neutrinos to some extent resist this concentration and clumping, and actually hinder the growth of structure. Their effectiveness depends on their mass: the more massive they are, the more they can impede the clumping of matter.

Astrophysics can take advantage of this damping effect by measuring it in the formation of large-scale structure. A comparison of two observations unveils the effect. On one side, we see the density fluctuations in the early Universe at a time about 380,000 years after the Big Bang, as observed in the cosmic microwave background by the Planck satellite. With this input, accepted cosmological models can be used to calculate quite precisely what the structure in the present-day Universe should look like. This allows us to predict, for example, how many clusters of galaxies with a certain mass should be found per unit volume.

Scientists at the Max Planck Institute for Extraterrestrial Physics in Garching, Hans Böhringer and Gayoung Chon, have searched for all massive galaxy clusters in the nearby Universe (out to a distance of more than 3 billion light years). They used X-ray observations with ROSAT to compile a complete catalogue of these objects, which allows a comparison of the observations with predictions from the cosmological standard model.

“Observations and theoretical prediction fit surprisingly well together,” asserts Hans Böhringer. “But a closer look reveals that the present-day structures are less pronounced than predicted - when neglecting the mass of neutrinos.”

Even though the discrepancy is only 10%, the precision of measurements has increased dramatically over the past years, so that the scientists take the 10% discrepancy seriously.

“We can reconcile observation and theory if we allow for the neutrinos to have mass,” explains Gayoung Chon. “Our analysis indicates that all three neutrino families together have a mass in the range 0.17 to 0.73 eV.”

There are three neutrino families, the electron, muon and tau neutrinos, which can “oscillate”, i.e. change into each other. Many experiments – and also the estimate based on large-scale structure – can only determine the mass differences or the mass of all three families combined. And this is indeed tiny: about 0.8 x 10^-36 kg, one million times lighter than the electron, the lightest elementary particle in the ordinary matter that makes up our bodies.
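
The arithmetic behind these figures is straightforward to check. The sketch below converts the quoted mass range to kilograms (using 1 eV/c² = 1.783 × 10⁻³⁶ kg) and applies the standard cosmological relation Ω_ν h² = Σm_ν / 93.14 eV; the Hubble parameter h = 0.67 is an assumed round value, not a number taken from the paper.

```python
# Checking the quoted numbers. Assumed constants: 1 eV/c^2 in kg and
# the standard relation Omega_nu * h^2 = sum(m_nu) / 93.14 eV, with
# an assumed h = 0.67 (not taken from the MPE paper).

EV_TO_KG = 1.783e-36          # 1 eV/c^2 in kilograms
ELECTRON_MASS_KG = 9.109e-31

sum_m_nu_ev = 0.45            # midpoint of the 0.17-0.73 eV range
sum_m_nu_kg = sum_m_nu_ev * EV_TO_KG
print(f"summed mass ~ {sum_m_nu_kg:.1e} kg")                      # ~0.8e-36 kg
print(f"electron / neutrino ~ {ELECTRON_MASS_KG / sum_m_nu_kg:.1e}")  # ~1e6

h = 0.67
omega_nu = sum_m_nu_ev / (93.14 * h * h)
print(f"Omega_nu ~ {omega_nu:.3f}")  # ~0.011, a few percent of the dark matter
```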

Neutrinos therefore contribute only about 1-5% to dark matter. But even this tiny contribution causes an effect measurable with the current, precise methods. Some other cosmological measurements, such as the study of the gravitational lensing effect of large-scale structure and the peculiar motions of galaxies, suggest a damping in the growth of the large-scale structure amplitude as well.

Other, more exotic effects could be the cause for such a damping effect, for example the interaction of dark matter and dark energy has been suggested. “Massive neutrinos seem, however, to be the most plausible interpretation of the data at the moment,” says Hans Böhringer. “This is very encouraging and we are currently improving our measurements to provide more precise results in the future.”

This is a fascinating example of how the world is interwoven on the smallest and largest scales. The largest clearly defined objects in the Universe, galaxy clusters, provide information on the lightest known elementary particles with mass. There are 48 orders of magnitude between the mass scales of the two systems! Astrophysics is providing an important contribution to elementary particle physics.


Contacts and sources:
Dr. Hans Böhringer

Galaxies Die from the Inside Out


A major astrophysical mystery has centred on how massive, quiescent elliptical galaxies, common in the modern Universe, quenched their once furious rates of star formation. Such colossal galaxies, often also called spheroids because of their shape, typically pack in stars ten times as densely in the central regions as in our home galaxy, the Milky Way, and have about ten times its mass.

Star formation in what are now "dead" galaxies sputtered out billions of years ago. ESO's Very Large Telescope and the NASA/ESA Hubble Space Telescope have revealed that three billion years after the Big Bang, these galaxies still made stars on their outskirts, but no longer in their interiors. The quenching of star formation seems to have started in the cores of the galaxies and then spread to the outer parts.

This diagram illustrates this process. Galaxies in the early Universe appear at the left. The blue regions are where star formation is in progress and the red regions are the "dead" regions where only older redder stars remain and there are no more young blue stars being formed. The resulting giant spheroidal galaxies in the modern Universe appear on the right.
Credit:  ESO

Astronomers refer to these big galaxies as red and dead as they exhibit an ample abundance of ancient red stars, but lack young blue stars and show no evidence of new star formation. The estimated ages of the red stars suggest that their host galaxies ceased to make new stars about ten billion years ago. This shutdown began right at the peak of star formation in the Universe, when many galaxies were still giving birth to stars at a pace about twenty times faster than nowadays.

"Massive dead spheroids contain about half of all the stars that the Universe has produced during its entire life," said Sandro Tacchella of ETH Zurich in Switzerland, lead author of the article. "We cannot claim to understand how the Universe evolved and became as we see it today unless we understand how these galaxies come to be."

Tacchella and colleagues observed a total of 22 galaxies, spanning a range of masses, from an era about three billion years after the Big Bang. The SINFONI instrument on ESO's Very Large Telescope (VLT) collected light from this sample of galaxies, showing precisely where they were churning out new stars. SINFONI could make these detailed measurements of distant galaxies thanks to its adaptive optics system, which largely cancels out the blurring effects of Earth's atmosphere.

The researchers also trained the NASA/ESA Hubble Space Telescope on the same set of galaxies, taking advantage of the telescope's location in space above our planet's distorting atmosphere. Hubble's WFC3 camera snapped images in the near-infrared, revealing the spatial distribution of older stars within the actively star-forming galaxies.

"What is amazing is that SINFONI's adaptive optics system can largely beat down atmospheric effects and gather information on where the new stars are being born, and do so with precisely the same accuracy as Hubble allows for the stellar mass distributions," commented Marcella Carollo, also of ETH Zurich and co-author of the study.

According to the new data, the most massive galaxies in the sample kept up a steady production of new stars in their peripheries. In their bulging, densely packed centres, however, star formation had already stopped.

"The newly demonstrated inside-out nature of star formation shutdown in massive galaxies should shed light on the underlying mechanisms involved, which astronomers have long debated," says Alvio Renzini, Padova Observatory, of the Italian National Institute of Astrophysics.

A leading theory is that star-making materials are scattered by torrents of energy released by a galaxy's central supermassive black hole as it sloppily devours matter. Another idea is that fresh gas stops flowing into a galaxy, starving it of fuel for new stars and transforming it into a red and dead spheroid.

"There are many different theoretical suggestions for the physical mechanisms that led to the death of the massive spheroids," said co-author Natascha Förster Schreiber, at the Max-Planck-Institut für extraterrestrische Physik in Garching, Germany. "Discovering that the quenching of star formation started from the centres and marched its way outwards is a very important step towards understanding how the Universe came to look like it does now."


Contacts and sources:
Sandro Tacchella
ETH Zurich 

Richard Hook
ESO

Paleolithic Remains Show Cannibalistic Habits of Human Ancestors

Analysis of ancient cadavers recovered at a famous archaeological site confirms the existence of a sophisticated culture of butchering and carving human remains, according to a team of scientists from the Natural History Museum, University College London, and a number of Spanish universities.

Gough’s Cave in Somerset was thought to have given up all its secrets when excavations ended in 1992, yet research on human bones from the site has continued in the decades since. After its discovery in the 1880s, the site was developed as a show cave and largely emptied of sediment, at times with minimal archaeological supervision. The excavations uncovered intensively-processed human bones intermingled with abundant butchered large mammal remains and a diverse range of flint, bone, antler, and ivory artefacts.

Credit: The Natural History Museum

New radiocarbon techniques have revealed that the remains were deposited over a very short period of time, possibly during a series of seasonal occupations, about 14,700 years ago.

Dr Silvia Bello, from the Natural History Museum’s Department of Earth Sciences, lead researcher of the work, said, “The human remains have been the subject of several studies. In a previous analysis, we could determine that the cranial remains had been carefully modified to make skull-cups. During this research, however, we’ve identified a far greater degree of human modification than recorded in earlier studies. We’ve found indisputable evidence for defleshing, disarticulation, human chewing, crushing of spongy bone, and the cracking of bones to extract marrow.”

The presence of human tooth marks on many of the bones provides incontrovertible evidence for cannibalism, the team found. In a wider context, the treatment of the human corpses and the manufacture and use of skull-cups at Gough’s Cave has parallels with other ancient sites in central and western Europe. But the new evidence from Gough’s Cave suggests that cannibalism during the ‘Magdalenian period’ was part of a customary mortuary practice that combined intensive processing and consumption of the bodies with the ritual use of skull-cups.


Credit: The Natural History Museum

Simon Parfitt, of University College London, said, “A recurring theme of this period is the remarkable rarity of burials and how commonly we find human remains mixed with occupation waste at many sites. Further analysis along the lines used to study Gough's Cave will help to establish whether the type of ritualistic cannibalism practiced there is a regional (‘Creswellian’) phenomenon, or a more widespread practice found throughout the Magdalenian world.”


Contacts and sources:
The Natural History Museum

Citation: Silvia M. Bello, Palmira Saladié, Isabel Cáceres, Antonio Rodríguez-Hidalgo, Simon A. Parfitt. "Upper Palaeolithic ritualistic cannibalism at Gough's Cave (Somerset, UK): The human remains from head to toe." Journal of Human Evolution, available online 15 April 2015.

Engineers Purify Sea and Wastewater in 2.5 Minutes

A group of Mexican engineers from the Jhostoblak Corporate has created technology to recover and purify either seawater or wastewater from households, hotels, hospitals, and commercial and industrial facilities, regardless of its content of pollutants and microorganisms, in an incredible 2.5 minutes.

Credit: Investigación y Desarrollo

The PQUA system works with a mixture of dissociating elements capable of separating and removing all contaminants, organic and inorganic alike. "The methodology is founded on molecularly dissociating water pollutants to recover the minerals necessary and sufficient for the human body to function properly nourished," technical staff explained.

Notably, the engineers developed eight dissociating elements, and after extensive testing on different types of contaminated water, implemented a unique methodology that indicates what and how much of each element should be combined.

"During the purification process no gases, odors nor toxic elements that may damage or alter the environment, human health or quality of life are generated" said the Mexican firm.

The corporation has a pilot plant in its offices that was used to demonstrate the purification process, which uses gravity to save energy. We observed that the residual water in the container was pumped to a reactor tank, where it received a dose of the dissociating elements in predetermined amounts.

In this phase, solid organic and inorganic matter as well as heavy metals are removed by precipitation and gravity, and a sludge settles at the bottom of the reactor. The sludge is removed and examined to determine whether it is suitable for use as fertilizer or for manufacturing construction materials.

Subsequently, the water is conducted to a clarifier tank, where the excess load of dissolved elements settles out; the liquid then reaches a filter that removes turbidity and finally passes through a polishing tank that eliminates odors, colors and flavors. The treated water is transported to a container where ozone is added to ensure its purity, and it is then ready to drink. Indeed, the resulting liquid is fresh, odorless and has a neutral taste.

"We have done over 50 tests on different types of wastewater and all have been certified and authorized by the laboratories of the Mexican Accreditation Agency (EMA). Also, the Monterrey Institute of Technology and Higher Education (ITESM), the College of Mexico and the National Polytechnic Institute (IPN) have given their validation that the water treated with our technology meets the SSA NOM 127 standard, which indicates the parameters and quality characteristics for vital liquid to be used for human consumption, " says the Corporate Jhostoblak.

Moreover, they report that this development is protected as a trade secret in America and will soon receive the same protection in Switzerland. Its implementation in the market will depend on the needs of users and on new laws regarding water use, consumption and discharge.

For more information, visit Corporate Jhostoblak’s website at www.sistemaspqua.weebly.com or write to vrcorporativo@gmail.com.

Contacts and sources:
Investigación y Desarrollo

Disney Researchers’ 3-D Printer Shows Soft Sides With Layered Fabric and Wiring

A team from Disney Research and Carnegie Mellon University has devised a 3-D printer that layers together laser-cut sheets of fabric to form soft, squeezable objects such as bunnies, doll clothing and phone cases. These objects can have complex geometries and incorporate circuitry that makes them interactive.

"Today's 3-D printers can easily create custom metal, plastic, and rubber objects," said Jim McCann, associate research scientist at Disney Research Pittsburgh. "But soft fabric objects, like plush toys, are still fabricated by hand. Layered fabric printing is one possible method to automate the production of this class of objects."

3D printed objects from our layered fabric 3D printer: (a) printed fabric Stanford bunny, (b) printed Japanese sunny doll with two materials, (c) printed touch sensor, (d) printed cellphone case with an embedded conductive fabric coil for wireless power reception.
Credit: Disney Research Pittsburgh

The fabric printer is similar in principle to laminated object manufacturing, which takes sheets of paper or metal that have each been cut into a 2-D shape and then bonds them together to form a 3-D object. Fabric presents particular cutting and handling challenges, however, which the Disney team has addressed in the design of its printer.

The layered-fabric printer will be described at the Association for Computing Machinery's annual Conference on Human Factors in Computing Systems, CHI 2015, April 18-23 in Seoul, South Korea, where the report has received an honorable mention for a Best Paper award. In addition to McCann, the team included Huaishu Peng, a Ph.D. student in information science at Cornell University, and Scott Hudson and Jen Mankoff, both faculty members in Carnegie Mellon's Human-Computer Interaction Institute.

Last year at CHI, Hudson presented a soft 3-D object printer he developed at Disney Research that deposits layers of needle-felted yarn. The layered-fabric printing method, by contrast, can produce thicker, more squeezable objects.

Disney presents a new type of 3D printer that can form precise, but soft and deformable 3D objects from layers of off-the-shelf fabric. Their printer employs an approach where a sheet of fabric forms each layer of a 3D object. The printer cuts this sheet along the 2D contour of the layer using a laser cutter and then bonds it to previously printed layers using a heat sensitive adhesive. Surrounding fabric in each layer is temporarily retained to provide a removable support structure for layers printed above it. This process is repeated to build up a 3D object layer by layer. 

The printer is capable of automatically feeding two separate fabric types into a single print. This allows specially cut layers of conductive fabric to be embedded in the soft prints. Using this capability, Disney demonstrates 3D models with touch sensing built into a soft print in one complete printing process, and a simple LED display making use of a conductive fabric coil for wireless power reception.
Credit: Disney Research

The latest soft printing apparatus includes two fabrication surfaces - an upper cutting platform and a lower bonding platform. Fabric is fed from a roll into the device, where a vacuum holds the fabric up against the upper cutting platform while a laser cutting head moves below. The laser cuts a rectangular piece out of the fabric roll, then cuts the layer's desired 2-D shape or shapes within that rectangle. This second set of cuts is left purposefully incomplete so that the shapes receive support from the surrounding fabric during the fabrication process.

Once the cutting is complete, the bonding platform is raised up to the fabric and the vacuum is shut off to release the fabric. The platform is lowered and a heated bonding head is deployed, heating and pressing the fabric against previous layers. The fabric is coated with a heat-sensitive adhesive, so the bonding process is similar to a person using a hand iron to apply non-stitched fabric ornamentation onto a costume or banner.

Once the process is complete, the surrounding support fabric is torn away by hand to reveal the 3-D object.
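
The whole cycle can be condensed into a short sketch. Every function and class below is hypothetical shorthand for a hardware action of the Disney/CMU printer as described above, not a real API.

```python
from dataclasses import dataclass

@dataclass
class Layer:
    material: str  # "felt" or "conductive fabric"
    contour: str   # placeholder for the layer's 2-D cut geometry

def step(action: str) -> None:
    # Stand-in for a hardware action in the real printer.
    print(action)

def print_soft_object(layers: list) -> None:
    """One pass of the layered-fabric process described above."""
    for layer in layers:
        step(f"feed {layer.material} from its roll")
        step("vacuum sheet against the upper cutting platform")
        step("laser-cut bounding rectangle from the roll")
        step(f"laser-cut contour '{layer.contour}', leaving support tabs")
        step("raise bonding platform and release the vacuum")
        step("press with heated head; adhesive bonds layer to the stack")
    step("tear away surrounding support fabric by hand")

print_soft_object([Layer("felt", "bunny slice 01"),
                   Layer("conductive fabric", "touch-sensor trace")])
```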

The researchers demonstrated this technique by using 32 layers of 2-millimeter-thick felt to create a 2 ½-inch bunny. The process took about 2 ½ hours.

"The layers in the bunny print are evident because the bunny is relatively small compared to the felt we used to print it," McCann said. "It's a trade-off -- with thinner fabric, or a larger bunny, the layers would be less noticeable, but the printing time would increase."

Two types of material can be used to create objects by feeding one roll of fabric into the machine from left to right, while a second roll of a different material is fed front to back. If one of the materials is conductive, the equivalent of wiring can be incorporated into the device. The researchers demonstrated the possibilities by building a fabric starfish that serves as a touch sensor, as well as a fabric smartphone case with an antenna that can harvest enough energy from the phone to light an LED.

The feel of a fabricated object can be manipulated in the fabrication process by adding small interior cuts that make it easy to bend the object in one direction, while maintaining stiffness in the perpendicular direction.


Contacts and sources: 
Jennifer Liu
Disney Research

Laser System Proposed to Clear Space Debris

An international team of scientists has put forward a blueprint for a purely space-based system to solve the growing problem of space debris. The proposal, published in Acta Astronautica, combines a super-wide field-of-view telescope developed by RIKEN's EUSO team, which will be used to detect objects, with a recently developed high-efficiency laser system, the CAN laser presented in Nature Photonics in 2013, which will be used to track space debris and remove it from orbit.

Space debris seen from outside geosynchronous orbit (GEO). The two main debris fields are the ring of objects in GEO and the cloud of objects in low Earth orbit (LEO). Debris plot by NASA.
Credit: NASA 


ROBEAR: Strong Caregiving Robot With The Gentle Touch


Scientists from RIKEN and Sumitomo Riko Company Limited have developed a new experimental nursing care robot, ROBEAR, which is capable of performing tasks such as lifting a patient from a bed into a wheelchair or providing assistance to a patient who is able to stand up but requires help to do so. ROBEAR will provide impetus for research on the creation of robots that can supplement Japan’s need for new approaches to caregiving.

Credit: RIKEN

The new robot developed by the RIKEN-SRK Collaboration Center for Human-Interactive Robot Research in Nagoya is a successor to RIBA, which was announced in 2009, and RIBA-II, which was developed in 2011. The new ROBEAR robot is lighter than its predecessors, weighing just 140 kilograms compared to RIBA-II’s 230 kilograms, and it incorporates a number of features that enable it to exert force in a gentle way.

Specifically, it includes actuator units with a very low gear ratio, allowing the joints to move very quickly and precisely, and allowing backdrivability, meaning that the force encountered by the actuators as they perform their tasks can be quickly fed back into the system, allowing softer movement. It also incorporates three types of sensors, including torque sensors and Smart Rubber capacitance-type tactile sensors made entirely of rubber, which allow for gentle movements, ensuring that the robot can perform power-intensive tasks such as lifting patients without endangering them.

ROBEAR helps a person rise from a sofa and sit in a wheelchair.
Credit: RIKEN

The robot also improves on its predecessors by having a small base, making the total system more lightweight. It avoids falling over through the use of legs that can be extended when necessary for lifting a patient but retracted to allow the robot to maneuver through tight spaces such as doorways.

Credit: RIKEN

With its rapidly increasing elderly population, Japan faces an urgent need for new approaches to assist care-giving personnel. One of the most strenuous tasks for such personnel, carried out an average of 40 times every day, is that of lifting a patient from a bed into a wheelchair, and this is a major cause of lower back pain. Robots are well-suited to this task, yet none has so far been deployed in care-giving facilities.

According to Toshiharu Mukai, leader of the Robot Sensor Systems Research Team, "We really hope that this robot will lead to advances in nursing care, relieving the burden on care-givers today. We intend to continue with research toward more practical robots capable of providing powerful yet gentle care to elderly people."


Contacts and sources:
Toshiharu Mukai, Team Leader 
Robot Sensor Systems Research Team
RIKEN―SRK Collaboration Center for Human-Interactive Robot Research
RIKEN Innovation Center

Intense Magnetic Field Discovered Close To Supermassive Black Hole


Supermassive black holes, often with masses billions of times that of the Sun, are located at the heart of almost all galaxies in the Universe. These black holes can accrete huge amounts of matter in the form of a surrounding disc.

This artist's impression shows the surroundings of a supermassive black hole, typical of that found at the heart of many galaxies. The black hole itself is surrounded by a brilliant accretion disc of very hot, infalling material and, further out, a dusty torus. There are also often high-speed jets of material ejected at the black hole's poles that can extend huge distances into space.
Credit: ESO/L. Calçada


While most of this matter is fed into the black hole, some can escape moments before capture and be flung out into space at close to the speed of light as part of a jet of plasma. How this happens is not well understood, although it is thought that strong magnetic fields, acting very close to the event horizon, play a crucial part in this process, helping the matter to escape from the gaping jaws of darkness.

Astronomers from Chalmers University of Technology have used the giant telescope Alma to reveal an extremely powerful magnetic field very close to a supermassive black hole in a distant galaxy. The results appear in the 17 April 2015 issue of the journal Science.

A team of five astronomers from Chalmers University of Technology have revealed an extremely powerful magnetic field, beyond anything previously detected in the core of a galaxy, very close to the event horizon of a supermassive black hole. This new observation helps astronomers to understand the structure and formation of these massive inhabitants of the centres of galaxies, and the twin high-speed jets of plasma they frequently eject from their poles.

Up to now only weak magnetic fields far from black holes -- several light-years away -- had been probed. In this study, however, astronomers from Chalmers University of Technology and Onsala Space Observatory in Sweden have now used Alma to detect signals directly related to a strong magnetic field very close to the event horizon of the supermassive black hole in a distant galaxy named PKS 1830-211. This magnetic field is located precisely at the place where matter is suddenly boosted away from the black hole in the form of a jet.

The giant telescope Alma, made up of 66 individual antennas and located at 5,000 metres altitude in northern Chile, has revealed the intense magnetic field close to a supermassive black hole. In this image, taken during the ESO Ultra HD (UHD) Expedition, the central parts of our galaxy, the Milky Way, can be seen above the telescope.
Credit: ESO/B. Tafreshi

The team measured the strength of the magnetic field by studying the way in which light was polarised as it moved away from the black hole.

"Polarisation is an important property of light and is much used in daily life, for example in sun glasses or 3D glasses at the cinema," says Ivan Marti-Vidal, lead author of this work.

"When produced naturally, polarisation can be used to measure magnetic fields, since light changes its polarisation when it travels through a magnetised medium. In this case, the light that we detected with Alma had been travelling through material very close to the black hole, a place full of highly magnetised plasma."

The astronomers applied a new analysis technique that they had developed to the Alma data and found that the direction of polarisation of the radiation coming from the centre of PKS 1830-211 had rotated.

Magnetic fields introduce Faraday rotation, which makes the polarisation rotate in different ways at different wavelengths. The way in which this rotation depends on the wavelength tells us about the magnetic field in the region.

The Alma observations were at an effective wavelength of about 0.3 millimetres, the shortest wavelengths ever used in this kind of study. This allows the regions very close to the central black hole to be probed. Earlier investigations were at much longer radio wavelengths. Only light of millimetre wavelengths can escape from the region very close to the black hole; longer wavelength radiation is absorbed.
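
A small numerical sketch shows why the observing wavelength matters so much. Faraday rotation turns the polarisation angle by Δχ = RM·λ², so a rotation measure large enough to be detectable at 0.3 millimetres would wrap the polarisation around many times at centimetre wavelengths. The RM value below is an illustrative order of magnitude for an extreme source, not the paper's measured figure.

```python
# Faraday rotation: delta_chi = RM * lambda^2, with RM in rad/m^2.
# The RM below is an illustrative order of magnitude, NOT the
# measured value from the PKS 1830-211 study.

RM = 1.0e8  # rad/m^2

for wavelength_mm in (0.3, 3.0, 30.0):
    wavelength_m = wavelength_mm / 1000.0
    rotation_rad = RM * wavelength_m ** 2
    print(f"{wavelength_mm:5.1f} mm -> rotation {rotation_rad:10.1f} rad")
# 0.3 mm gives ~9 rad, measurable; 30 mm gives ~90,000 rad, hopelessly wrapped.
```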

"We have found clear signals of polarisation rotation that are hundreds of times higher than the highest ever found in the Universe," says Sebastien Muller, co-author of the paper. "Our discovery is a giant leap in terms of observing frequency, thanks to the use of Alma, and in terms of distance to the black hole where the magnetic field has been probed -- of the order of only a few light-days from the event horizon. These results, and future studies, will help us understand what is really going on in the immediate vicinity of supermassive black holes."


Contacts and sources:
Robert Cumming
Chalmers University of Technology

Most Tiny Quantity of Reality Ever Imagined Sought at South Pole IceCube Experiment

Neutrinos are a type of particle that pass through just about everything in their path from even the most distant regions of the universe. The Earth is constantly bombarded by billions of neutrinos, which zip right through the entire globe, houses, animals, people - everything. 

Neutrinos are elementary particles, produced for example by the decay of radioactive elements, that lack an electric charge - or, as Nobel Prize-winning physicist Frederick Reines would say, "...the most tiny quantity of reality ever imagined by a human being".

IceCube comprises a cubic kilometer of ice, densely packed with optical modules. The detector is located deep below the surface -- it starts 1½ km below the surface and ends at the bottom at a depth of 2½ km. The instruments in the detector comprise 86 cables, each with 60 Digital Optical Modules (extremely sensitive light sensors).

Credit:  IceCube Collaboration

Only very rarely do they react with matter, but the giant IceCube experiment at the South Pole can detect collisions between neutrinos and atoms in the ice using a network of detectors. New research involving, among others, the Niels Bohr Institute has measured neutrinos at the South Pole and calculated some of the physical properties of these otherwise exotic and poorly understood particles. The results are published in the scientific journal Physical Review D.

Neutrinos are among nature's most abundant particles. Their number far exceeds the number of atoms in the entire universe - yet we know little about them. Neutrinos are a type of particle created in the Big Bang and are also produced in the Sun's interior and in violent events like supernovae, which are exploding stars. Neutrinos are also called 'ghost particles' because they basically do not interact with matter, but pass undisturbed through everything in their path.

Instruments at the South Pole

Researchers from 44 institutions in 12 countries are part of an international project, IceCube, at the South Pole to study these mysterious particles with strange properties.

Jason Koskinen is shown at the South Pole. In the background is the IceCube Lab. All data from the instruments deep down in the ice comes up through the two "towers" and into the computer center, where the first analyses are done.
Credit: T. Waldemaier
 
IceCube is an enormous particle detector located deep in the ice at the South Pole. The instruments in the detector comprise 86 cables, each with 60 Digital Optical Modules (extremely sensitive light sensors). Each cable is lowered down a hole melted through the 2½ km ice sheet using a hot-water drill. The detector is located deep below the surface - it starts 1½ km below the surface and ends at the bottom at a depth of 2½ km.

The detector's enormous size of a cubic kilometer is necessary because neutrinos interact extremely weakly with matter, so only rarely do they collide with the atoms in the ice. When they finally collide, charged particles are created, which emit radiation that can be detected by the extremely sensitive Digital Optical Modules.

"In the IceCube project we have registered about 35 neutrinos, which are very likely to have come from distant regions in space. They have a very high energy and, because they have not interacted during their long journey, they can carry information from the most distant parts of the universe. In addition to the rare cosmic neutrinos, we are also studying the neutrinos created in the Earth's atmosphere in order to unravel the physical properties of neutrinos," says Jason Koskinen, Assistant Professor and head of the IceCube Group at the Niels Bohr Institute, University of Copenhagen.

From the North Pole to the South Pole

When high-energy particles (protons) from violent events in the cosmos, like supernovae and quasars, hit the Earth's atmosphere, a burst of neutrinos is formed, which passes through the Earth. The neutrinos formed over the North Pole pass straight through the Earth, and a very small proportion of them hit the ice at the South Pole, where the IceCube detector registers the collisions.

Neutrinos are very light particles, and for many years it was believed that they were completely massless. It is now believed that there are three types of neutrinos (electron, muon and tau neutrinos), each with its own specific mass, which is incredibly small - less than a millionth of the mass of an electron.

"The neutrinos created in the atmosphere over the North Pole are mostly muon neutrinos. On their way through the Earth's 13,000 km, the muon neutrinos undergo quantum fluctuations that can change them into another type of neutrino, tau neutrinos, before they are finally detected by IceCube on the other side of the globe. We can now study these effects in much greater detail than before and in this way we can gain new insights into their physical characteristics," explains Jason Koskinen.

Atmospheric neutrinos

The research group has now studied atmospheric neutrinos in the IceCube detector at the South Pole for three years and has analysed 5,200 interactions between atmospheric neutrinos and atoms in the ice.

"We have confirmed that neutrinos undergo fluctuations - even at high energy levels and we have calculated how much they exhibit these oscillations. In this study, we have only measured muon neutrinos and in comparison to how many muon neutrinos form in in the atmosphere and pass through the Earth, we only see a fraction at the South Pole. The explanation is that the muon neutrinos undergo quantum fluctuations that change them into tau neutrinos and we do not see those. If they had not changed, we would see them all. Our calculations show that 20 percent have undergone quantum fluctuations and changed from muon neutrinos to another type of neutrino as they pass through the Earth," explains Jason Koskinen.

Messengers from the universe

Why study them, you might ask? "Because we basically want to learn more about these strange particles that are everywhere in the universe and whose properties we still do not fully understand. Because neutrinos come from the cosmos, we could use them for astronomical observations and gain new insights into the structure of the universe," says Jason Koskinen.




Contacts and sources:
Jason Koskinen, Assistant Professor and head of the IceCube Group at the Niels Bohr Institute, University of Copenhagen

Morten Medici, PhD student, IceCube Group at the Niels Bohr Institute, University of Copenhagen
Citation: http://journals.aps.org/prd/abstract/10.1103/PhysRevD.91.072004

Complex Cognition Shaped the Stone Age Hand Axe, Study Shows

The ability to make a Lower Paleolithic hand axe depends on complex cognitive control by the prefrontal cortex, including the "central executive" function of working memory, a new study finds.

An Acheulean hand axe from Haute-Garonne, France; different views of the same specimen.

PLOS ONE published the results, which knock another chip off theories that Stone Age hand axes are simple tools that don't involve higher-order executive function of the brain.

"For the first time, we've showed a relationship between the degree of prefrontal brain activity, the ability to make technological judgments, and success in actually making stone tools," says Dietrich Stout, an experimental archeologist at Emory University and the leader of the study. "The findings are relevant to ongoing debates about the origins of modern human cognition, and the role of technological and social complexity in brain evolution across species."

The skill of making a prehistoric hand axe is "more complicated and nuanced than many people realize," Stout says. "It's not just a bunch of ape-men banging rocks together. We should have respect for Stone Age tool makers."

The study's co-authors include Bruce Bradley of the University of Exeter in England; Thierry Chaminade of Aix-Marseille University in France; and Erin Hecht and Nada Khreisheh of Emory University.

Stone tools - shaped by striking a stone "core" with a piece of bone, antler, or another stone - provide some of the most abundant evidence of human behavioral change over time. Simple Oldowan stone flakes are the earliest known tools, dating back 2.6 million years. The Late Acheulean hand axe goes back 500,000 years. While it's relatively easy to learn to make an Oldowan flake, the Acheulean hand axe is harder to master, due to its lens-shaped core tapering down to symmetrical edges.

Acheulean Biface from Saint Acheul
Credit: Didier Descouens, 13 April 2011. Licensed under CC BY-SA 4.0 via Wikimedia Commons

"We wanted to tease apart and compare what parts of the brain were most actively involved in these stone tool technologies, particularly the role of motor control versus strategic thinking," Stout says.

The researchers recruited six subjects, all archeology students at Exeter University, to train in making stone tools, a skill known as "knapping." The subjects' skills were evaluated before and after they trained and practiced. For Oldowan evaluations, subjects detached five flakes from a flint core. For Acheulean evaluations, they produced a tool from a standardized porcelain core.

At the beginning, middle and end of the 18-month experiment, subjects underwent functional magnetic resonance imaging (fMRI) and diffusion tensor imaging (DTI) scans of their brains while they watched videos. The videos showed rotating stone cores marked with colored cues: A red dot indicated an intended point of impact, and a white area showed the flake predicted to result from the impact. The subjects were asked the following questions:

"If the core were struck in the place indicated, is what you see a correct prediction of the flake that would result?"

"Is the indicated place to hit the core a correct one given the objective of the technology?"

The subjects responded by pushing a "yes" or "no" button.

Answering the first question, how a rock will break if you hit it in a certain place, relies more on reflexive, perceptual and motor-control processes, associated with posterior portions of the brain. Stout compares it to the modern-day rote reflex of a practiced golf swing or driving a car.

The second question - is it a good idea to hit the core in a certain spot if you want to make a hand axe - involves strategic thinking, such as planning the route for a road trip. "You have to think about information that you have stored in your brain, bring it online, and then make a decision about each step of the trip," Stout says.

This so-called executive control function of the brain, associated with activity in the prefrontal cortex, allows you to project what's going to happen in the future and use that projection to guide your action. "It's kind of like mental time travel, or using a computer simulation," Stout explains. "It's considered a high level, human cognitive capacity."

The researchers mapped the skill level of the subjects onto the data from their brain scans and their responses to the questions.

Greater skill at making tools correlated with greater accuracy on the video quiz for predicting the correct strategy for making a hand axe, which was itself correlated with greater activity in the prefrontal cortex. "These data suggest that making an Acheulean hand axe is not simply a rote, autopilot activity of the brain," Stout says. "It requires you to engage in some complicated thinking."

Most of the hand axes produced by the modern hands and minds of the study subjects would not have cut it in the Stone Age. "They weren't up to the high standards of 500,000 years ago," Stout says.

A previous study by the researchers showed that learning to make stone tools creates structural changes in fiber tracts of the brain connecting the parietal and frontal lobes, and that these brain changes correlated with increases in performance. "Something is happening to strengthen this connection," Stout says. "This adds to evidence of the importance of these brain systems for stone tool making, and also shows how tool making may have shaped the brain evolutionarily."

Stout recently launched a major, three-year archeology experiment that will build on these studies and others. Known as the Language of Technology project, the experiment involves 20 subjects who will each devote 100 hours to learning the art of making a Stone Age hand axe, and also undergo a series of MRI scans. The project aims to home in on whether the brain systems involved in putting together a sequence of words to make a meaningful sentence in spoken language overlap with those involved in putting together a series of physical actions to reach a meaningful goal.

Contacts and sources:
Megan McRainey
Emory Health Sciences

Mysteries of Synaesthesia: Why Some Taste Sound and Hear Color

Researchers at The Australian National University (ANU) have shed new light on synaesthesia - the effect of hearing colours, seeing sounds and other cross-sensory phenomena.

Lead researcher Dr Stephanie Goodhew, of the ANU Research School of Psychology, said the research found synaesthetes had much stronger mental associations between related concepts.

"For them words like 'doctor' and 'nurse' are very closely associated, where 'doctor' and 'table' are very unrelated. Much more so than for people without the condition," she said.

ANU researchers have shed new light on synaesthesia.

Image: The Health Blog, Flickr.

The findings could help researchers better understand the mysteries of synaesthesia, which Dr Goodhew said affects an estimated one in every 100 people.

Dr Goodhew said synaesthetes have stronger connections between different brain areas, particularly between what we think of as the language part of the brain and the colour part of the brain. Those connections lead to a triggering effect, where a stimulus in one part of the brain causes activity in another.

"Things like hearing shapes, so a triangle will trigger an experience of a sound or a colour, or they might have a specific taste sensation when they hear a particular sound," she said.

"One person reported that smells have certain shapes. For example the smell of fresh air is rectangular, coffee is a bubbly cloud shape and people could smell round or square."

The research centred on measuring the extent to which people with synaesthesia draw meaning between words.

"Going in we were actually predicting that synesthetes might have a more concrete style of thinking that does not emphasise conceptual-level relations between stimuli, given that they have very rigid parings between sensory experiences.

"We found exactly the opposite," Dr Goodhew said.

The research paper, Enhanced semantic priming in synesthetes independent of sensory binding, was published in the journal Consciousness and Cognition.

Contacts and sources:
Aaron Walker
The Australian National University (ANU)

Zombie Worms Feasting On Giants' Bones Since Prehistoric Times

A species of bone-eating worm that was believed to have evolved in conjunction with whales has been dated back to prehistoric times when it fed on the carcasses of giant marine reptiles.

Scientists at Plymouth University found that Osedax - popularised as the 'zombie worm' - originated at least 100 million years ago, and subsisted on the bones of prehistoric reptiles such as plesiosaurs and sea turtles.

Osedax worms 
Credit: Plymouth University 

Reporting in the Royal Society journal Biology Letters this month, the research team at Plymouth reveal how they found tell-tale traces of Osedax on plesiosaur fossils held in the Sedgwick Museum at the University of Cambridge.

Dr Nicholas Higgs, a Research Fellow in the Marine Institute, said the discovery was important both for understanding the genesis of the species and for its implications for the fossil record. "The exploration of the deep sea in the past decades has led to the discovery of hundreds of new species with unique adaptations to survive in extreme environments, giving rise to important questions on their origin and evolution through geological time," said Nicholas. "The unusual adaptations and striking beauty of Osedax worms encapsulate the alien nature of deep-sea life in the public imagination.

"And our discovery shows that these bone-eating worms did not co-evolve with whales, but that they also devoured the skeletons of large marine reptiles that dominated oceans in the age of the dinosaurs. Osedax, therefore, prevented many skeletons from becoming fossilised, which might hamper our knowledge of these extinct leviathans."

The finger-length Osedax is found in oceans across the globe at depths of up to 4,000m, and it belongs to the Siboglinidae family of worms, which, as adults, lack a mouth and digestive system. Instead, they penetrate bone using root-like tendrils through which they absorb bone collagen and lipids that are then converted into energy by bacteria inside the worm.

Typically they consume whale bones, prompting many scientists to believe that the worms co-evolved with whales 45 million years ago, branching out from their cousins that used chemosynthesis to obtain food.

But Nicholas and research lead Dr Silvia Danise, of Plymouth's School of Geography, Earth and Environmental Sciences, studied fossil fragments taken from a plesiosaur unearthed in Cambridge, and a sea turtle found in Burham, Kent.

Using a computed tomography scanner at the Natural History Museum - essentially a three-dimensional X-ray - they were able to create a computer model of the bones, and found tell-tale bore holes and cavities consistent with the burrowing technique of Osedax.

Dr Danise said: "The increasing evidence for Osedax throughout the oceans past and present, combined with their propensity to rapidly consume a wide range of vertebrate skeletons, suggests that Osedax may have had a significant negative effect on the preservation of marine vertebrate skeletons in the fossil record.

"By destroying vertebrate skeletons before they could be buried, Osedax may be responsible for the loss of data on marine vertebrate anatomy and carcass-fall communities on a global scale. The true extent of this 'Osedax effect', previously hypothesized only for the Cenozoic, now needs to be assessed for Cretaceous marine vertebrates."



Contacts and sources:
Alan Williams
Plymouth University

The paper, Mesozoic origin for the bone-eating Osedax worms, is available in the Royal Society journal Biology Letters.

Meteorites Confirm Earth's First Crust Formed 4.5 Billion Years Ago

A new analysis of the chemical make-up of meteorites has helped scientists work out when the Earth formed its layers.

The research by an international team of scientists confirmed the Earth's first crust had formed around 4.5 billion years ago.  In geology, the crust is the outermost solid shell of a rocky planet or natural satellite, which is chemically distinct from the underlying mantle.

World geologic provinces
Credit:  U.S. Geological Survey

The crusts of Earth, the Moon, Mercury, Venus, Mars, Io, and other planetary bodies have been generated largely by igneous processes, and these crusts are richer in incompatible elements than their respective mantles.

The team measured the amount of the rare elements hafnium and lutetium in the mineral zircon in a meteorite that originated early in the solar system.

"Meteorites that contain zircons are rare. We had been looking for an old meteorite with large zircons, about 50 microns long, that contained enough hafnium for precise analysis," said Dr Yuri Amelin, from The Australian National University (ANU) Research School of Earth Sciences.

Dr Yuri Amelin. 

Image: Stuart Hay

"By chance we found one for sale from a dealer. It was just what we wanted. We believe it originated from the asteroid Vesta, following a large impact that sent rock fragments on a course to Earth."

The heat and pressure in the Earth's interior mixes the chemical composition of its layers over billions of years, as denser rocks sink and less dense minerals rise towards the surface, a process known as differentiation.

Determining how and when the layers formed relies on knowing the composition of the original material that formed into the Earth, before differentiation, said Dr Amelin.

"Meteorites are remnants of the original pool of material that formed all the planets," he said.

"But they have not had planetary-scale forces changing their composition throughout their five billion years orbiting the sun."

The team accurately measured the ratio of the isotopes hafnium-176 and hafnium-177 in the meteorite, to give a starting point for the Earth's composition.

The team were then able to compare the results with the oldest rocks on Earth, and found that the chemical composition had already been altered, proving that a crust had already formed on the surface of the Earth around 4.5 billion years ago.
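
The bookkeeping behind the measurement is the standard lutetium-hafnium decay scheme: ¹⁷⁶Lu decays to ¹⁷⁶Hf, so a rock's present-day ¹⁷⁶Hf/¹⁷⁷Hf ratio equals its initial ratio plus an ingrowth term set by its ¹⁷⁶Lu/¹⁷⁷Hf ratio and its age. The sketch below illustrates the equation with commonly cited chondritic reference values, assumed here purely for illustration rather than taken from the study:

    import math

    LAMBDA_LU176 = 1.867e-11  # 176Lu decay constant in 1/yr (commonly cited value)

    def hf_ratio_today(initial_hf, lu_hf_today, age_yr):
        # (176Hf/177Hf)_now = (176Hf/177Hf)_initial + (176Lu/177Hf)_now * (e^(lambda*t) - 1)
        return initial_hf + lu_hf_today * math.expm1(LAMBDA_LU176 * age_yr)

    # Illustrative chondritic numbers (assumed for this sketch only):
    initial = 0.27978   # approximate solar-system initial 176Hf/177Hf
    lu_hf = 0.0336      # approximate present-day chondritic 176Lu/177Hf
    print(f"Predicted undifferentiated ratio today: {hf_ratio_today(initial, lu_hf, 4.567e9):.5f}")

A rock whose measured ratio departs from this undifferentiated prediction must have had its Lu/Hf ratio changed by an earlier event such as crust formation, which is the kind of comparison that lets ancient rocks be tested against the meteorite baseline.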


Contacts and sources: 
Dr. Yuri Amelin

Thursday, April 16, 2015

9/11 Leaves Legacy of Chronic Ill Health among Emergency Medical Services Workers

The 9/11 attacks on the World Trade Center in New York City in 2001 have left a legacy of chronic ill health among emergency medical services workers who came to the rescue of the victims, reveals research published online in Occupational & Environmental Medicine.


Those who arrived in the immediate aftermath of the attacks are most at risk of physical and mental ailments, the findings show.

The health of 2281 New York City Fire Department emergency medical services workers deployed to the scene of the World Trade Center attacks was tracked over a period of 12 years, from the date of the incident on September 11, 2001, to the end of December 2013.

The researchers looked at the mental and physical health conditions that have been certified as being linked to the aftermath of the incident under the James Zadroga 9/11 Health and Compensation Act of 2010.

Between 2001 and 2013, the cumulative incidence of acid reflux disease (GERD) was just over 12% while obstructive airways disease (OAD), which includes bronchitis and emphysema, was just under 12%. The cumulative incidences of rhinosinusitis and cancer were 10.6% and 3.1%, respectively.

Validated screening tests were used to gauge the prevalence of mental health conditions: this was 16.7% for probable depression; 7% for probable post-traumatic stress disorder (PTSD); and 3% for probable harmful alcohol use.

Compared with the workers who did not attend the aftermath of the World Trade Center attacks, those who arrived earliest on the scene were at greatest risk for nearly all the health conditions analysed.

They were almost four times as likely to have acid reflux and rhinosinusitis, seven times as likely to have probable PTSD, and twice as likely to have probable depression.

And the more intense the experience was at the time, the greater was the risk of a diagnosis of acid reflux, obstructive airways disease, or rhinosinusitis, and of testing positive for PTSD, depression, and harmful drinking.

The degree of ill health among workers attending the scene was generally lower than that of a demographically similar group of New York City firefighters, probably because of the differences in tasks performed at the World Trade Center site, suggest the authors.

The findings of a substantial amount of ill health underscore the need for continued monitoring and treatment of emergency medical services workers who helped the victims of the World Trade Center attacks, they conclude.


Contacts and sources: 
Caroline White
BMJ Company

Death of Giant Galaxies Spreads From the Core


Astronomers have shown for the first time how star formation in "dead" galaxies sputtered out billions of years ago. The NASA/ESA Hubble Space Telescope and ESO's Very Large Telescope (VLT) have revealed that three billion years after the Big Bang, these galaxies still made stars on their outskirts, but no longer in their interiors. The quenching of star formation seems to have started in the cores of the galaxies and then spread to the outer parts. The results will be published in the 17 April 2015 issue of the journal Science.

This NASA/ESA Hubble Space Telescope image shows an elliptical galaxy known as IC 2006. Massive elliptical galaxies like these are common in the modern Universe, but how they quenched their once furious rates of star formation is an astrophysical mystery.

Credit: ESA/Hubble & NASA. Image acknowledgement: Judy Schmidt and J. Blakeslee (Dominion Astrophysical Observatory). Note that the image is not related to the science release content. Science acknowledgement: M. Carollo (ETH, Switzerland)

A major astrophysical mystery has centred on how the massive, quiescent elliptical galaxies, common in the modern Universe, quenched their once furious rates of star formation. Such colossal galaxies, often also called spheroids because of their shape, typically pack in stars ten times as densely in the central regions as in our home galaxy, the Milky Way, and have about ten times its mass.

Astronomers refer to these big galaxies as red and dead as they exhibit an ample abundance of ancient red stars, but lack young blue stars and show no evidence of new star formation. The estimated ages of the red stars suggest that their host galaxies ceased to make new stars about ten billion years ago. This shutdown began right at the peak of star formation in the Universe, when many galaxies were still giving birth to stars at a pace about twenty times faster than nowadays.

"Massive dead spheroids contain about half of all the stars that the Universe has produced during its entire life," said Sandro Tacchella of ETH Zurich in Switzerland, lead author of the article. "We cannot claim to understand how the Universe evolved and became as we see it today unless we understand how these galaxies come to be."

Tacchella and colleagues observed a total of 22 galaxies, spanning a range of masses, from an era about three billion years after the Big Bang.  The Universe's age is about 13.8 billion years, so the galaxies studied by Tacchella and colleagues are generally seen as they were more than 10 billion years ago. They used the NASA/ESA Hubble Space Telescope's Wide Field Camera 3 (WFC3) to peer at the galaxies from above our planet's distorting atmosphere -- WFC3 snapped detailed images in the near-infrared, revealing the spatial distribution of older stars within the actively star-forming galaxies.
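
The arithmetic behind "more than 10 billion years ago" is straightforward: in a 13.8-billion-year-old Universe, light emitted when the cosmos was about three billion years old has been travelling for roughly 10.8 billion years. A quick cross-check using the astropy library, assuming Planck 2015 parameters (which may differ slightly from the cosmology adopted in the paper):

    import astropy.units as u
    from astropy.cosmology import Planck15, z_at_value

    # At what redshift was the Universe about 3 billion years old?
    z = float(z_at_value(Planck15.age, 3.0 * u.Gyr))
    print(f"redshift z ~ {z:.2f}")                                       # roughly 2.3
    print(f"lookback time ~ {Planck15.lookback_time(z).to(u.Gyr):.1f}")  # roughly 10.8 Gyr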

The researchers also used the SINFONI instrument on ESO's Very Large Telescope to collect light from the galaxies, showing precisely where they were churning out new stars. SINFONI could make these detailed measurements of distant galaxies thanks to its adaptive optics system, which largely cancels out the blurring effects of Earth's atmosphere.

"Hubble was able to show us how the stars are distributed within these galaxies in amazing detail," commented Marcella Carollo, also of ETH Zurich and co-author of the study. "We were able to match this accuracy with SINFONI to find patches of star formation. Using the two telescope together, we were able to explore this population of galaxies in more detail than ever before."

According to the new data, the most massive galaxies in the sample kept up a steady production of new stars in their peripheries. In their bulging, densely packed centres, however, star formation had already stopped.

"The newly demonstrated inside-out nature of star formation shutdown in massive galaxies should shed light on the underlying mechanisms involved, which astronomers have long debated," says Alvio Renzini, Padova Observatory, of the Italian National Institute of Astrophysics.

A leading theory is that star-making materials are scattered by torrents of energy released by a galaxy's central supermassive black hole as it sloppily devours matter. Another idea is that fresh gas stops flowing into a galaxy, starving it of fuel for new stars and transforming it into a red and dead spheroid.

"There are many different theoretical suggestions for the physical mechanisms that led to the death of the massive spheroids," said co-author Natascha Förster Schreiber of the Max-Planck-Institut für extraterrestrische Physik in Garching, Germany. "Discovering that the quenching of star formation started from the centres and marched its way outwards is a very important step towards understanding how the Universe came to look like it does now."


Contacts and sources:
Georgia Bladon
ESA/Hubble Information Center

Meteorites Date Moon Forming Impact

Through a combination of data analysis and numerical modeling work, researchers have found a record of the ancient Moon-forming giant impact observable in stony meteorites. Their work will appear in the April 2015 issue of the journal Science.

Artist's depiction of a collision between two planetary bodies. Such an impact between the Earth and a Mars-sized object likely formed the Moon.
Credit: NASA/JPL-Caltech

The work was done by NASA Solar System Exploration Research Virtual Institute (SSERVI) researchers led by Principal Investigator Bill Bottke of the Institute for the Science of Exploration Targets (ISET) team at the Southwest Research Institute and included Tim Swindle, director of the University of Arizona's Lunar and Planetary Laboratory.

One possible realization of the Moon-forming impact event is shown in this animation. Here it is assumed that a Mars-sized protoplanet, defined as having 13 percent of an Earth mass, struck the proto-Earth at a 45-degree angle near the mutual escape velocity of both worlds. The "red" particles, comprising 0.3 percent of an Earth mass, were found to escape the Earth-Moon system. Some of this debris may eventually go on to strike other solar system bodies like large main belt asteroids. "Yellow-green" particles go into the disk that makes the Moon. "Blue" particles were accreted by the proto-Earth. The details of this simulation can be found in Canup, R. (2004), Simulations of a late lunar-forming impact, Icarus 168, 433-456.
Credit: Robin Canup, Southwest Research Institute

The inner Solar System's biggest known collision was the Moon-forming giant impact between a large protoplanet and the proto-Earth. The timing of this giant impact, however, is uncertain, with the ages of the most ancient lunar samples returned by the Apollo astronauts still being debated. Numerical simulations of the giant impact indicate this event not only created a disk of debris near Earth that formed the Moon, but it also ejected huge amounts of debris completely out of the Earth-Moon system. The fate of this material, comprising as much as several percent of an Earth mass, has not been closely examined until recently. However, it is likely some of it blasted main belt asteroids, with a record plausibly left behind in their near-surface rocks. Collisions on these asteroids in more recent times delivered these shocked remnants to Earth, which scientists have now used to date the age of the Moon.
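
The "mutual escape velocity" mentioned in the animation caption above sets the natural speed scale for such a collision: it is the speed at which two bodies meet after falling together from rest at large separation. A back-of-envelope sketch, using present-day Earth and Mars-like values as stand-ins (the proto-Earth in the simulation was somewhat smaller):

    import math

    G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

    def mutual_escape_velocity(m1, m2, r1, r2):
        # v_esc = sqrt(2 * G * (m1 + m2) / (r1 + r2)): contact speed for two
        # bodies falling together from rest at infinity.
        return math.sqrt(2 * G * (m1 + m2) / (r1 + r2))

    # Illustrative stand-ins: present-day Earth plus a Mars-sized impactor
    # carrying ~13 percent of an Earth mass, as in the simulation described above.
    m_earth, r_earth = 5.97e24, 6.37e6              # kg, m
    m_impactor, r_impactor = 0.13 * m_earth, 3.39e6
    v = mutual_escape_velocity(m_earth, m_impactor, r_earth, r_impactor)
    print(f"mutual escape velocity ~ {v / 1000:.1f} km/s")  # roughly 9.6 km/s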

The research indicates numerous kilometer-sized fragments from the giant impact struck main belt asteroids at much higher velocities than typical main belt collisions, heating the surface and leaving behind a permanent record of the impact event. Evidence that the giant impact produced a large number of kilometer-sized fragments can be inferred from laboratory and numerical impact experiments, the ancient lunar impact record itself, and the numbers and sizes of fragments produced by major main belt asteroid collisions.

Once the team concluded that pieces of the Moon-forming impact hit main belt asteroids and left a record of shock heating events in some meteorites, they set out to deduce both the timing and the relative magnitude of the bombardment. By modeling the evolution of giant impact debris over time and fitting the results to ancient impact heat signatures in stony meteorites, the team was able to infer the Moon formed about 4.47 billion years ago, in agreement with many previous estimates. The most ancient Solar System materials found in meteorites are about one hundred million years older than this age.

Insights into the last stages of planet formation in the inner solar system can be gleaned from these impact signatures. For example, the team is exploring how they can be used to place new constraints on how many asteroid-like bodies still existed in the inner Solar System in the aftermath of planet formation. They can also help researchers deduce the earliest bombardment history of ancient bodies like Vesta, one of the targets of NASA's Dawn mission and a main belt asteroid whose fragments were delivered to Earth in the form of meteorites. It is even possible that tiny remnants of the Moon-forming impactor or proto-Earth might still be found within meteorites that show signs of shock heating by giant impact debris. This would allow scientists to explore for the first time the unknown primordial nature of our homeworld.

Co-author Swindle, who specializes in finding the times when meteorites or lunar samples were involved in large collisions, said: "Bill Bottke had the idea of looking at the asteroid belt to see what effect a Moon-forming giant impact would have, and realized that you would expect a lot of collisions in the period shortly after that.

"Here at LPL, we had been determining ages of impact events that affected meteorites, and when we got together, we found that our data matched his predictions," he added. "It's a great example of taking advantage of groups that work in two different specialties - orbital dynamics and chronology - and combining their expertise."

Intriguingly, some debris may have also returned to hit the Earth and Moon after remaining in solar orbit over timescales ranging from tens of thousands of years to 400 million years.

"The importance of giant impact ejecta returning to strike the Moon could also play an intriguing role in the earliest phase of lunar bombardment," said Bottke, who is an alumnus of the University of Arizona's Lunar and Planetary Laboratory. "This research is helping to refine our time scales for 'what happened when' on other worlds in the Solar System."

Yvonne Pendleton, Director of the NASA SSERVI Institute, notes: "This is an excellent example of the power of multidisciplinary science. By linking studies of the Moon, of main belt asteroids, and of meteorites that fall to Earth, we gain a better understanding of the earliest history of our Solar System."


Contacts and sources:
Daniel Stolte
University of Arizona