Unseen Is Free


Thursday, March 31, 2016

'Planet X' Linked to Mass Extinctions on Earth, Triggers Periodic Comet Showers, Says University Researcher



Periodic mass extinctions on Earth, as indicated in the global fossil record, could be linked to a suspected ninth planet, according to research published by a faculty member of the University of Arkansas Department of Mathematical Sciences.

Daniel Whitmire, a retired professor of astrophysics now working as a math instructor, published findings in the January issue of Monthly Notices of the Royal Astronomical Society that the as yet undiscovered “Planet X” triggers comet showers linked to mass extinctions on Earth at intervals of approximately 27 million years.

Though scientists have been looking for Planet X for 100 years, the possibility that it’s real got a big boost recently when researchers from Caltech inferred its existence based on orbital anomalies seen in objects in the Kuiper Belt, a disc-shaped region of comets and other larger bodies beyond Neptune. If the Caltech researchers are correct, Planet X is about 10 times the mass of Earth and could currently be up to 1,000 times more distant from the sun.

Kuiper Belt

Credit: Johns Hopkins


Whitmire and his colleague, John Matese, first published research on the connection between Planet X and mass extinctions in the journal Nature in 1985 while working as astrophysicists at the University of Louisiana at Lafayette. Their work was featured in a 1985 Time magazine cover story titled, “Did Comets Kill the Dinosaurs? A Bold New Theory About Mass Extinctions.”

At the time, three explanations had been proposed for the regular comet showers: Planet X, the existence of a sister star to the sun, and vertical oscillations of the sun as it orbits the galaxy. The last two ideas have subsequently been ruled out as inconsistent with the paleontological record. Only Planet X remained viable, and it is now gaining renewed attention.

New Horizons’ Camera Captures a Wandering Kuiper Belt Object
Credit: NASA

Whitmire and Matese’s theory is that as Planet X orbits the sun, its tilted orbit slowly rotates and Planet X passes through the Kuiper Belt of comets every 27 million years, knocking comets into the inner solar system. The dislodged comets not only smash into the Earth, they also disintegrate in the inner solar system as they get nearer to the sun, reducing the amount of sunlight that reaches the Earth.

In 1985, a look at the paleontological record supported the idea of regular comet showers dating back 250 million years. Newer research shows evidence of such events dating as far back as 500 million years.

Whitmire and Matese published their own estimate on the size and orbit of Planet X in their original study. They believed it would be between one and five times the mass of Earth, and about 100 times more distant from the sun, much smaller numbers than Caltech’s estimates.

Matese has since retired and no longer publishes. Whitmire retired from the University of Louisiana at Lafayette in 2012 and began teaching at the University of Arkansas in 2013.

Whitmire says what’s really exciting is the possibility that a distant planet may have had a significant influence on the evolution of life on Earth.

“I’ve been part of this story for 30 years,” he said. “If there is ever a final answer I’d love to write a book about it.”

Daniel Whitmire 

Photo by Matt Reynolds






Contacts and sources:
Daniel Whitmire, Department of Mathematical Sciences
University of Arkansas
Bob Whitby, feature writer

Trigger for Milky Way’s Youngest Supernova Identified



Scientists have used data from NASA’s Chandra X-ray Observatory and the NSF’s Jansky Very Large Array to determine the likely trigger for the most recent supernova in the Milky Way. They applied a new technique that could have implications for understanding other Type Ia supernovas, a class of stellar explosions that scientists use to determine the expansion rate of the Universe.

Astronomers had previously identified G1.9+0.3 as the remnant of the most recent supernova in our Galaxy. It is estimated to have occurred about 110 years ago in a dusty region of the Galaxy that blocked visible light from reaching Earth.

G1.9+0.3 belongs to the Type Ia category, an important class of supernovas exhibiting reliable patterns in their brightness that make them valuable tools for measuring the rate at which the universe is expanding.

Scientists have used data from NASA’s Chandra X-ray Observatory and the NSF’s Jansky Very Large Array to determine the likely trigger for the most recent supernova in the Milky Way.

Image credit: NASA/CXC/CfA/S. Chakraborti et al.

“Astronomers use Type Ia supernovas as distance markers across the Universe, which helped us discover that its expansion was accelerating,” said Sayan Chakraborti, who led the study at Harvard University. “If there are any differences in how these supernovas explode and the amount of light they produce, that could have an impact on our understanding of this expansion.”

Most scientists agree that Type Ia supernovas occur when white dwarfs, the dense remnants of Sun-like stars that have run out of fuel, explode. However, there has been a debate over what triggers these white dwarf explosions. Two primary ideas are the accumulation of material onto a white dwarf from a companion star or the violent merger of two white dwarfs.

The new research with archival Chandra and VLA data examines how the expanding supernova remnant G1.9+0.3 interacts with the gas and dust surrounding the explosion. The resulting radio and X-ray emission provide clues as to the cause of the explosion. In particular, an increase in the X-ray and radio brightness of the supernova remnant with time, according to theoretical work by Chakraborti’s team, is expected only if a white dwarf merger took place.

“We observed that the X-ray and radio brightness increased with time, so the data point strongly to a collision between two white dwarfs as being the trigger for the supernova explosion in G1.9+0.3,” said co-author Francesca Childs, also of Harvard.

The result implies that Type Ia supernovas are either all caused by white dwarf collisions, or are caused by a mixture of white dwarf collisions and the mechanism where the white dwarf pulls material from a companion star.

“It is important to identify the trigger mechanism for Type Ia supernovas because if there is more than one cause, then the contribution from each may change over time,” said Harvard’s Alicia Soderberg, another co-author on the study. “This means astronomers might have to recalibrate some of the ways we use them as ‘standard candles’ in cosmology.”

The team also derived a new estimate for the age of the supernova remnant of about 110 years, younger than previous estimates of about 150 years.

More progress on understanding the trigger mechanism should come from studying Type Ia supernovas in nearby galaxies, using the increased sensitivity provided by a recent upgrade to the VLA.

A paper describing these results appeared in the March 1st, 2016 issue of The Astrophysical Journal and is available online. NASA's Marshall Space Flight Center in Huntsville, Alabama, manages the Chandra program for NASA's Science Mission Directorate in Washington. The Smithsonian Astrophysical Observatory in Cambridge, Massachusetts, controls Chandra's science and flight operations.


Contacts and sources:
Molly Porter, Marshall Space Flight Center, Huntsville, Ala.
Megan Watzke, Chandra X-ray Center, Cambridge, Mass.

NASA Maps Climate Patterns on a Super-Earth



Observations from NASA's Spitzer Space Telescope have led to the first temperature map of a super-Earth planet -- a rocky planet nearly two times as big as ours. The map reveals extreme temperature swings from one side of the planet to the other, and hints that a possible reason for this is the presence of lava flows.

This animated illustration shows one possible scenario for the rocky exoplanet 55 Cancri e, nearly two times the size of Earth. New Spitzer data show that one side of the planet is much hotter than the other – which could be explained by a possible presence of lava pools.
Credits: NASA/JPL-Caltech

"Our view of this planet keeps evolving," said Brice Olivier Demory of the University of Cambridge, England, lead author of a new report appearing in the March 30 issue of the journal Nature. "The latest findings tell us the planet has hot nights and significantly hotter days. This indicates the planet inefficiently transports heat around the planet. We propose this could be explained by an atmosphere that would exist only on the day side of the planet, or by lava flows at the planet surface."

The toasty super-Earth 55 Cancri e is relatively close to Earth at 40 light-years away. It orbits very close to its star, whipping around it every 18 hours. Because of the planet's proximity to the star, it is tidally locked by gravity just as our moon is to Earth. That means one side of 55 Cancri e, referred to as the day side, is always cooking under the intense heat of its star, while the night side remains in the dark and is much cooler.

"Spitzer observed the phases of 55 Cancri e, similar to the phases of the moon as seen from the Earth. We were able to observe the first, last quarters, new and full phases of this small exoplanet," said Demory. "In return, these observations helped us build a map of the planet. This map informs us which regions are hot on the planet."

The varying brightness of an exoplanet called 55 Cancri e is shown in this plot of infrared data captured by NASA's Spitzer Space Telescope.
Credits: NASA/JPL-Caltech/University of Cambridge

Spitzer stared at the planet with its infrared vision for a total of 80 hours, watching it orbit all the way around its star multiple times. These data allowed scientists to map temperature changes across the entire planet. To their surprise, they found a dramatic temperature difference of 2,340 degrees Fahrenheit (1,300 Kelvin) from one side of the planet to the other. The hottest side is nearly 4,400 degrees Fahrenheit (2,700 Kelvin), and the coolest is 2,060 degrees Fahrenheit (1,400 Kelvin).
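For readers checking the unit conversions above: a temperature difference converts between kelvins and Fahrenheit degrees by the 9/5 scale factor alone, while an absolute temperature also needs the offsets. A minimal Python sketch using only the figures quoted above:

```python
def kelvin_to_fahrenheit(kelvin):
    """Convert an absolute temperature from kelvins to degrees Fahrenheit."""
    return (kelvin - 273.15) * 9 / 5 + 32

def delta_kelvin_to_fahrenheit(delta_k):
    """Convert a temperature *difference*: only the scale factor applies."""
    return delta_k * 9 / 5

print(round(kelvin_to_fahrenheit(2700)))        # hottest side: ~4400 F
print(round(kelvin_to_fahrenheit(1400)))        # coolest side: ~2060 F
print(round(delta_kelvin_to_fahrenheit(1300)))  # day-night contrast: ~2340 F
```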

The fact Spitzer found the night side to be significantly colder than the day side means heat is not being distributed around the planet very well. The data argues against the notion that a thick atmosphere and winds are moving heat around the planet as previously thought. Instead, the findings suggest a planet devoid of a massive atmosphere, and possibly hint at a lava world where the lava would become hardened on the night side and unable to transport heat.

"The day side could possibly have rivers of lava and big pools of extremely hot magma, but we think the night side would have solidified lava flows like those found in Hawaii," said Michael Gillon, University of Liège, Belgium.

The Spitzer data also revealed the hottest spot on the planet has shifted over a bit from where it was expected to be: directly under the blazing star. This shift either indicates some degree of heat recirculation confined to the day side, or points to surface features with extremely high temperatures, such as lava flows.

Additional observations, including from NASA's upcoming James Webb Space Telescope, will help to confirm the true nature of 55 Cancri e.

The new Spitzer observations of 55 Cancri e are more detailed thanks to the telescope’s increased sensitivity to exoplanets. Over the past several years, scientists and engineers have figured out new ways to enhance Spitzer’s ability to measure changes in the brightness of exoplanet systems. One method involves precisely characterizing Spitzer’s detectors, specifically measuring “the sweet spot” -- a single pixel on the detector -- which was determined to be optimal for exoplanet studies.

“By understanding the characteristics of the instrument -- and using novel calibration techniques of a small region of a single pixel -- we are attempting to eke out every bit of science possible from a detector that was not designed for this type of high-precision observation,” said Jessica Krick of the Spitzer Science Center at the California Institute of Technology in Pasadena.

NASA's Jet Propulsion Laboratory (JPL) in Pasadena, California, manages the Spitzer Space Telescope mission for NASA's Science Mission Directorate, Washington. Science operations are conducted at the Spitzer Science Center. Spacecraft operations are based at Lockheed Martin Space Systems Company, Littleton, Colorado. Data are archived at the Infrared Science Archive housed at the Infrared Processing and Analysis Center at Caltech. Caltech manages JPL for NASA.



Contacts and sources: 
Whitney Clavin
Jet Propulsion Laboratory

11,000-Year-Old Quarry Found in Israel Indicates Large-Scale Activity


Archaeologists from the Hebrew University of Jerusalem have uncovered in central Israel the earliest known Neolithic quarry in the southern Levant, dating back 11,000 years. Finds from the site indicate large-scale quarrying to extract flint and limestone for manufacturing working tools.

Kaizer Hill quarry, discovered in central Israel, demonstrates the changing attitudes to landscape in the transition from hunting-gathering to farming

In a research paper published in the journal PLOS ONE, a team of archeologists, led by Dr. Leore Grosman and Prof. Naama Goren-Inbar from the Institute of Archeology at the Hebrew University of Jerusalem, showed how inhabitants of the Neolithic communities changed their landscape forever.
"Humans became more dominant and influential in their terrestrial landscape and Kaizer Hill quarry provides dramatic evidence to the alteration of the landscape," said Dr. Grosman.

Step-like morphology of the quarrying front on the rocks.
Credit: Gabi Laron

Kaizer Hill quarry is the first of its age, size and scope to be revealed in the southern Levant, where the Neolithic culture is believed to have begun and the first farming communities developed. The introduction of farming is widely regarded as one of the biggest changes in human history, and "domestication" of the landscape was a significant process in the changing approach to nature.

The quarry is assigned to the Pre-Pottery Neolithic A (PPNA) culture, one of the incipient cultural stages in the shift from a hunter-gatherer to a farming way of life.

The gradual transition to agricultural subsistence, when people learned how to produce their food rather than acquiring it, was accompanied by a changing attitude to 'landscape' and the practices of using the surrounding nature for the benefit of humans.

"The economic shift, from hunter-gatherers to agriculture, was accompanied by numerous changes in the social and technological spheres. Various quarrying marks including cup marks showed that the cutting of stones was done in various strategies, including identifying potential flint pockets; creating quarrying fronts on the rocks; removing blocks to allow extraction of flint; creating areas for quarrying dump; and using drilling and chiseling as a primary technique for extracting flint," said Prof. Goren-Inbar.

Researchers suggested a new interpretation of bedrock damage markings at the site of Kaizer Hill quarry, located on a 300-meter-high hill on the outskirts of the sprawling city of Modi'in, some 35 km west of Jerusalem.

"At the peak of the hill we found damaged rock surfaces, providing evidence to quarrying activity aimed at extracting flint nodules and exploiting the thick layer of caliche (a sedimentary rock locally known by the Arabic term Nari)," said Dr. Leore Grosman.

"The ancient people at the time carved the stone with flint working tools (for example axes). This suggestion differs from the commonly held view, which considers all features defined as cup marks to be devices that were primarily involved in a variety of grinding, food preparation, social or even symbolic activities," researchers wrote in their paper.



Contacts and sources:
Avivit Delgoshen
Hebrew University of Jerusalem

It's Hotter Than Ten Trillion Degrees and Scientists Thought 100 Billion Degrees Was the Limit

Astronomers using an orbiting radio telescope in conjunction with four ground-based radio telescopes have achieved the highest resolution, or ability to discern fine detail, of any astronomical observation ever made. Their achievement produced a pair of scientific surprises that promise to advance the understanding of quasars, supermassive black holes at the cores of galaxies.

The scientists combined the Russian RadioAstron satellite with the ground-based telescopes to produce a virtual radio telescope more than 100,000 miles across. They pointed this system at a quasar called 3C 273, more than 2 billion light-years from Earth. Quasars like 3C 273 propel huge jets of material outward at speeds nearly that of light. These powerful jets emit radio waves.
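The payoff of such a long baseline is angular resolution, which for an interferometer scales roughly as the observing wavelength divided by the baseline length, θ ≈ λ/B. A back-of-envelope sketch; the baseline comes from the article, but the 18 cm observing wavelength is an assumption for illustration only:

```python
import math

ARCSEC_PER_RADIAN = 180 / math.pi * 3600  # ~206,265

wavelength_m = 0.18             # assumed wavelength (18 cm); not stated in the article
baseline_m = 100_000 * 1609.34  # "more than 100,000 miles", converted to meters

theta_rad = wavelength_m / baseline_m            # diffraction-limited resolution
theta_uas = theta_rad * ARCSEC_PER_RADIAN * 1e6  # in microarcseconds
print(f"resolution ~ {theta_uas:.0f} microarcseconds")  # ~230 with these inputs
```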

Just how bright such emission could be, however, was thought to be limited by physical processes. That limit, scientists thought, was about 100 billion degrees. The researchers were surprised when their Earth-space system revealed a temperature hotter than 10 trillion degrees.

Artistic view of the 10-meter space radio telescope on the Russian satellite Spektr-R comprising the space-borne component of the RadioAstron mission.
Credit: © Astro Space Center of Lebedev Physical Institute.
"Only this space-Earth system could reveal this temperature, and now we have to figure out how that environment can reach such temperatures," said Yuri Kovalev, the RadioAstron project scientist. "This result is a significant challenge to our current understanding of quasar jets," he added.

The observations also showed, for the first time, substructure caused by scattering of the radio waves by the tenuous interstellar material in our own Milky Way Galaxy. 

"This is like looking through the hot, turbulent air above a candle flame," said Michael Johnson, of the Harvard-Smithsonian Center for Astrophysics. "We had never been able to see such distortion of an extragalactic object before," he added.

"The amazing resolution we get from RadioAstron working with the ground-based telescopes gives us a powerful new tool to explore not only the extreme physics near the distant supermassive black holes, but also the diffuse material in our home Galaxy," Johnson said.

The RadioAstron satellite was combined with the Green Bank Telescope in West Virginia, the Very Large Array in New Mexico, the Effelsberg Telescope in Germany, and the Arecibo Observatory in Puerto Rico. Signals received by the orbiting radio telescope were transmitted to an antenna in Green Bank, where they were recorded and then sent over the internet to Russia, where they were combined with the data received by the ground-based radio telescopes to form the high-resolution image of 3C 273.

The astronomers reported their results in the Astrophysical Journal Letters.

In 1963, astronomer Maarten Schmidt of Caltech recognized that a visible-light spectrum of 3C 273 indicated its great distance, resolving what had been a mystery about quasars. His discovery showed that the objects are emitting tremendous amounts of energy and led to the current model of powerful emission driven by the tremendous gravitational energy of a supermassive black hole.

The RadioAstron project is led by the Astro Space Center of the Lebedev Physical Institute of the Russian Academy of Sciences and the Lavochkin Scientific and Production Association under a contract with the Russian Federal Space Agency, in collaboration with partner organizations in Russia and other countries. The National Radio Astronomy Observatory is a facility of the National Science Foundation, operated under cooperative agreement by Associated Universities, Inc.



 
Contacts and sources:
Dave Finley
The National Radio Astronomy Observatory

For the First Time, Scientists Can See the Nanostructure of Food in 3D



Scientists from the University of Copenhagen and the Paul Scherrer Institute in Switzerland have, for the first time, created a 3D image of food on the nanometer scale. The method, called ptychographic X-ray computed tomography, has promising prospects: more detailed knowledge of the structure of complex food systems could save the food industry large sums of money and reduce the food waste that results from faulty production.



Computer animation/video: Liborius ApS (0:33)

The researchers found that 98% of the fat globules in the cream cheese-like food system are cemented together in a continuous 3D network. They visualized the network using the same techniques that are used in computer animation. The coherent network of fat globules (approximately 25% of the volume) is seen in the video as the cohesive yellow structure, while the coherent structure of water fills the area between the fat and is left uncolored for clarity. The small areas of fat or water that are not connected to the remaining fat or water structures are shown in red and blue, respectively. The grey areas are the food ingredient microcrystalline cellulose. The network shown in the video is about 20 microns in diameter and made of fat globules about 1 micron in size.
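A statement like "98% of the fat globules are cemented together in a continuous 3D network" can be checked computationally with connected-component labeling on the segmented tomogram. Below is a minimal sketch of that idea, assuming the reconstruction has already been reduced to a 3D boolean array marking fat voxels; this illustrates the general technique, not the authors' actual pipeline:

```python
import numpy as np
from scipy import ndimage

def largest_cluster_fraction(fat_mask):
    """Fraction of fat voxels that belong to the single largest connected
    component of a 3D boolean fat mask (26-connectivity)."""
    labels, n_clusters = ndimage.label(fat_mask, structure=np.ones((3, 3, 3)))
    if n_clusters == 0:
        return 0.0
    cluster_sizes = np.bincount(labels.ravel())[1:]  # drop background label 0
    return cluster_sizes.max() / fat_mask.sum()

# Toy stand-in for a segmented tomogram: a random, slightly dilated blob field.
rng = np.random.default_rng(0)
mask = ndimage.binary_dilation(rng.random((64, 64, 64)) > 0.8)
print(f"largest connected fat cluster: {largest_cluster_fraction(mask):.1%}")
```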

"There is still a lot we don’t know about the structure of food, but this is a good step on the way to understanding and finding solutions to a number of problems dealing with food consistency, and which cost the food industry a lot of money," says Associate Professor Jens Risbo from the Department of Food Science at the University of Copenhagen. He is one of the authors of a recently-published scientific paper in Food Structure, which deals with the new groundbreaking insight into the 3D structure of food.

The cream cheese-like food system is a good model, since it represents a broader group of food systems.


Picture: Merete Bøgelund Munk

The researchers used a cream based on vegetable fat for the research. The cream system is a good test material, since it can represent the structures of a large group of food systems -- for example cheese, yogurt, ice cream and spreads, but also the more solid chocolate. All the aforementioned products contain liquid water or fat as well as small particles of solid materials, which stick together and form three-dimensional structures -- i.e. a network that provides the consistency that we like about cheese, yogurt or chocolate. In cheese and yogurt the casein particles form the network; in chocolate it is the fat crystals, and in ice cream and whipped cream it is the fat globules.

"It's about understanding the food structure and texture. If you understand the structure, you can change it and obtain exactly the texture you want," says Jens Risbo.
Electrons at close to the speed of light generate intense X-rays

To create a three-dimensional model of the food and convert it into images and video, the scientists went to Switzerland, where they used the Swiss Light Source (SLS) synchrotron at the Paul Scherrer Institute.

Photo taken inside the Swiss Light Source building. Part of the ring that guides electrons in circular motion at nearly the speed of light is visible. The electron storage ring is shielded by concrete.


Picture: Paul Scherrer Institute

In the synchrotron, electrons are accelerated to nearly the speed of light. The synchrotron is used for research in materials science and in areas such as biology and chemistry. The method the researchers used, ptychographic X-ray computed tomography, is a groundbreaking new way of creating images on the nanometer scale that also provides high contrast in biological systems. The synchrotron in Switzerland is one of the leading facilities in the world in this area, and this was the first time ever it was used within food science.

"We have been using the tomography principle, also known from a X-ray CT (computed tomography) scanner. The sample of the food system is rotated and moved sideways back and forth with nanometer precision, while we send a very strong and focused X-ray beam through it. The X-rays are deflected by colliding with electrons in the food, and we shoot a lot of pictures of the patterns that the defleted X-rays form. The patterns are combined in a powerful computer, which reconstructs a 3D image of the sample. The Swiss scientists of the team have created a device that can move and rotate the sample with ultra-high precision, allowing us to see the small details," says Research Assistant Mikkel Schou Nielsen, who has recently completed his Ph.D. in tomographic methods applied to food at the Niels Bohr Institute in Copenhagen.


X-ray beams are focused to a cross section of 6 microns for the passage through the cream sample. The sample is placed in a thin capillary tube 20-30 microns in diameter, and the sample’s position is controlled with nanometer precision by a combination of piezo-motors and laser interferometry. The X-rays are deflected by collisions with electrons in the cream cheese-like food system and the resulting pattern is recorded. By moving the sample through the X-ray beams and rotating it to different angles, a 3D image of the sample is calculated from the measured patterns.



 Illustration: Mikkel Schou Nielsen

The number of electrons reveals the various food components

The reconstructed 3D image can be described as a three-dimensional table of numbers describing the electron density (the number of electrons per volume) through the entire sample. The various food components, such as water and fat, have different densities and hence different electron densities. Water is denser than fat -- familiar from the way oil settles on top of water when you try to mix them -- and it is this contrast in electron density that causes X-rays to deflect to different degrees and eventually form 3D images of the sample.

Figure 1 shows a 2D slice of the three dimensional structure. Areas with higher electron density appear lighter on the figure. Water thus appears light grey, while fat appears dark grey, and the glass around the sample with a high density is seen as a white ring. One may now use the electron density (greyscale) to identify the various food components and study their location and structure.

Figure 1. Greyscale, showing the electron density in the various food components, which also discloses their location and structure.


Illustration: Mikkel Schou Nielsen
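Since the reconstruction is just a 3D array of electron-density values, the component identification described above amounts to thresholding that greyscale. A hedged sketch of the idea; the threshold values below are invented for illustration and would in practice be read off the histogram of the reconstruction:

```python
import numpy as np

def segment_by_density(volume, fat_max=0.35, water_max=0.60):
    """Split a normalized electron-density volume (values scaled to [0, 1])
    into fat / water / denser-solids masks by greyscale thresholds.
    The thresholds here are illustrative, not taken from the paper."""
    fat = volume <= fat_max                             # lowest density: fat
    water = (volume > fat_max) & (volume <= water_max)  # intermediate: water
    solids = volume > water_max                         # e.g. cellulose, glass
    return fat, water, solids

volume = np.random.default_rng(1).random((32, 32, 32))  # stand-in data
fat, water, solids = segment_by_density(volume)
print(fat.mean(), water.mean(), solids.mean())          # volume fractions
```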

A complicated food system

The vegetable-based cream on which the method was used consists of several ingredients. In addition to water and vegetable fat, it contains milk protein, stabilizers and emulsifiers. By adjusting the addition of emulsifiers, it is possible to achieve a state in which the cream remains fluid until you whip it to foam, whereby all the fat globules are reorganized and stick together on the outside of the air bubbles in a three-dimensional system (see Figure 2).


Figure 2. Structures: a kinetically stable liquid emulsion, and an air bubble stabilized by a partially coalesced fat network in foam.

Illustration: Merete Bøgelund Munk

"It is a difficult balance, because you only want the fat globules to stick together when the cream is whipped - not if it is simply being exposed to vibration or high temperatures. When the fat globules nevertheless begin to stick together prematurely - for example due to too many shocks during transport - the cream will get a consistency reminiscent of cream cheese. It becomes a relatively hard lump that can be cut," says Postdoctoral Researcher Merete Bøgelund Munk, Department of Food Science,University of Copenhagen.

Merete Bøgelund Munk's Ph.D. project, “The physical stability of whippable oil-in-water emulsions. Effects of monoglyceride-based emulsifiers and other ingredients”, was fundamental to the research. The Ph.D. project was carried out as a collaboration between the Department of Food Science at the University of Copenhagen and the food ingredient company Palsgaard A/S.

This undesirable cream cheese-like state of the vegetable cream system is nevertheless extremely interesting for researchers.

"The organization of the fat globules and the network structure after the cream has been converted into a ‘cream cheese-like’ product is exciting because the mass is now sliceable, even though the system consists of 65% water and only 25% fat and some other ingredients and sugars. That means we have a network structure that captures a lot of water. There are many foods with similar network systems of something solid in something liquid, where the liquid is typically, but not always, water. This applies to all semi-solid and solid products such as chocolate, butter, cheese and spreads. The network of the cream cheese-like system is thus a model for something general in our food," says Associate Professor Jens Risbo.

Associate Professor Jens Risbo, Department of Food Science, University of Copenhagen, standing next to the installation at the Swiss Light Source that focuses the X-rays and holds the sample.

Picture: Mikkel Schou Nielsen

It is the structure of these networks that forms a texture that makes you want to bite into a piece of chocolate or cut yourself a piece of cheese. But the structure and the networks have been something of a mystery, because until now it was only possible to see the surface, or slightly underneath the surface, of the food material on the micron scale, and the images were only two-dimensional.

Until now, the structure of the cream cheese-like food system could only be seen as a two-dimensional microscopic image of the surface and the area just below it. The researchers’ progress means that we can now begin to understand how the various ingredients are linked in a three-dimensional network.

 Picture: Merete Bøgelund Munk

"If we eventually come to understand the structure of chocolate, we can change it and obtain exactly the consistency that we want. A lot of money is wasted because the consistency of chocolate is really hard to control, so the end product is not good enough and must be discarded. A possible future understanding of the crystal network in chocolate might mean that we will be able to develop components that prevent the chocolate from becoming grey and crumbly, and thus unsaleable. It is certainly a possibility that tomographic methods could be developed so we would be able to understand the mysteries of chocolate," says Associate Professor, Jens Risbo, Department of Food Science, University of Copenhagen.
How the tomography works

"Ptychographic X-ray computed tomography can be compared with a CT scanner in a hospital. Instead of getting an image of a patient's organs, we are looking into food. But, unlike a CT scanner, we can go down to the nanometer scale," says Jens Risbo.

The sample with the cream cheese-like system that the scientists X-rayed was about 20 microns thick.

"It would take too much time and too many calculations to develop a nanometer resolution of the cream system for a whole package of cream cheese from the fridge. The amount of information and calculations would simply be too great. Although X-rays can almost go through everything, you lose the intensity of the beams, the more they have to shoot through," says Jens Risbo.


Contacts and sources: 
Jens Risbo

Easter Can Be Calculated with Greater Accuracy Using New ‘Old’ Astronomical Method




Easter can fall early or late; in fact, the timing can fluctuate by an entire month, from late March to late April. Using a new astronomical method of calculation, the date of Easter would be better defined as the first Sunday after the spring full moon, says an astronomer.

Photo taken from the entrance of a passage grave on Samsø looking towards the next passage grave. The direction is the same as the “crossover” of the rising point of the full moon in the spring and autumn.
Credit: Claus Clausen

“According to Christianity, Easter must fall on the first Sunday after the first full moon after the equinox. It is a general rule of thumb and it works in 9 out of 10 cases. But at certain year intervals the rule does not function properly, as there is a muddle between the equinoxes and when the first full moon after the equinox occurs. This may happen in 2019, 2030, 2038, 2049 and so on, in cycles of 19 years, staggered between 11 and 8 years. In these years, under the current definition, Easter falls a month later than it should,” explains Claus Clausen, astronomer and PhD student at the Niels Bohr Institute, University of Copenhagen.

Claus Clausen believes that Easter should be calculated very precisely as the first Sunday after the spring full moon (as determined astronomically by a spring crossover, that is, when the point where the full moon rises on the horizon changes places with the point where the Sun rises on the horizon). So there will never be any doubt about when Easter should be.
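For comparison, the church rule Clausen refers to is not astronomical at all: the Gregorian calendar fixes Easter with a tabular lunar approximation known as the computus. The standard Anonymous Gregorian (Butcher) algorithm below reproduces the official dates, including the years where they drift away from the astronomical full moon:

```python
import datetime

def gregorian_easter(year):
    """Official (tabular) Gregorian Easter date, Anonymous Gregorian algorithm.
    This is the computus that the proposed astronomical rule is compared against."""
    a = year % 19
    b, c = divmod(year, 100)
    d, e = divmod(b, 4)
    f = (b + 8) // 25
    g = (b - f + 1) // 3
    h = (19 * a + b - d - g + 15) % 30
    i, k = divmod(c, 4)
    l = (32 + 2 * e + 2 * i - h - k) % 7
    m = (a + 11 * h + 22 * l) // 451
    month, day = divmod(h + l - 7 * m + 114, 31)
    return datetime.date(year, month, day + 1)

# 2016 plus the years the article flags as problematic under the tabular rule:
for year in (2016, 2019, 2030, 2038, 2049):
    print(year, gregorian_easter(year))
```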
Stone Age equinox

Claus Clausen is an astronomer, but he has also always been interested in archaeology and has studied the mysterious barrows and passage graves from the Stone Age (Neolithic times). Archaeologists estimate that around 40,000 large stone graves were built in the years from around 3500 to 3000 BC. Only about 700 of the large passage graves are preserved today, but one of the great mysteries is their orientation in the landscape, with a significant concentration of orientations towards east/southeast as seen from inside the passage grave.


The figure shows how the “crossover” takes place in the spring when the rise of the full moon moves south and the sunrise moves north. In the autumn the sunrise and rise of the full moon move in the opposite way. The spring full moon and the autumn full moon always rise in almost the same direction about 100 degrees from North. This direction is roughly the same throughout Europe. 
Credit: Claus Clausen

The Earth’s rotation has changed over time, but with the help of a computer programme, Claus Clausen could go 5500 years back in time and calculate the rise of the Sun, the full moon and the timing of lunar eclipses. He measured the entrance tunnels of approximately 170 passage graves and discovered that they probably were built according to the direction of the rise of certain full moons, for example, the first full moon after the spring equinox. The point at which the full moon rises on the horizon changed places with the point at which the sun rises on the horizon. This ‘crossover’ defines the Stone Age equinox and the spring full moon.

Passage graves in Western Europe were constructed in the same way as the Danish ones. This is true in Spain, Portugal, France and Sweden, and perhaps also in Germany and the Netherlands. At that time, in the younger Stone Age, people certainly held rituals at the burial mounds, perhaps fertility rituals in the spring and a ritual marking the end of the harvest in the autumn. Now the method can be used to calculate the date of Easter.

The spring full moon heralds Easter

“So they used this simple method in the Stone Age, which was later forgotten and has now been rediscovered,” points out Claus Clausen. He adds that if you use this astronomical definition of the spring full moon (the crossover), you will never be in doubt about the right date. It is the spring moon that rises on that date and heralds Easter.

Photo taken from the entrance of a passage grave at Knudhoved Odde looking towards the next large stone grave (here a dolmen). The direction here is also the same as the “crossover” of the rising point of the full moon in the spring and autumn.

Credit: Claus Clausen

From 23 March to 24 March 2016, the point on the horizon where the full moon rises changes places with the point where the Sun rises, so the Stone Age spring full moon is the same full moon that heralds Easter this year, rising on 24 March, explains Claus Clausen.



Contacts and sources:
Claus Clausen 
Niels Bohr Institute, University of Copenhagen

Cleaning Air with Light Proven Uniquely Versatile



A novel invention using light to remove air pollution has proven to be more versatile than any competing systems. It eliminates fumes as chemically diverse as odorous sulfur compounds and health hazardous hydrocarbons while consuming a minimum of energy.


Credit: University of Copenhagen

The name of the air cleaner is GPAO and its inventor, Professor of environmental chemistry Matthew Johnson, University of Copenhagen, Denmark, published the results of testing the system in the article “Gas Phase Advanced Oxidation for effective, efficient In Situ Control of Pollution” in the scientific periodical “Environmental Science and Technology”.

Air pollution hard to remove

Pollution is notoriously difficult to remove from air. Previous systems trying to control air pollution consume large amounts of energy, for example because they burn or freeze the pollution. Other systems require frequent maintenance because the charcoal filters they use need replacement. GPAO needs no filters, little energy and less maintenance, explains atmosphere chemist Matthew Johnson.

“As a chemist, I have studied the natural ability of the atmosphere to clean itself. Nature cleans air in a process involving ozone, sunlight and rain. Except for the rain, GPAO does the very same thing, but speeded up by a factor of a hundred thousand”, explains Johnson.

Gas is difficult to remove. Dust is easy

In the GPAO system, the polluted gas is mixed with ozone in the presence of fluorescent lamps. This causes free radicals to form that attack the pollution, forming sticky products that clump together. The products form fine particles which grow into a type of airborne dust. And whereas gas-phase pollution is hard to remove, dust is easy. Just give it an electrostatically charged surface to stick to, and it goes no further.

“Anyone who has ever tried dusting a computer screen knows how well dust sticks to a charged surface. This effect means that we don’t need traditional filters, giving our system an advantage in working with large dilute airstreams”, says Johnson.



Removes foul smells as well as noxious fumes

Patented in 2009, the system has been commercialized since 2013 and is already in use at an industrial site processing wastewater. Here it eliminates foul smells from the process, which saved the plant from being closed. A second industrial installation removes 96% of the smell generated by a factory making food for livestock. Further testing by University of Copenhagen atmospheric chemists has shown that the GPAO system efficiently removes toxic fumes from fiberglass production and from an iron foundry, which emitted benzene, toluene, ethyl benzene and xylene.

Viruses, fungal spores and bacteria removed in bonus effect

Another series of tests revealed that the rotten egg smells of pig farming and wastewater treatment are easily removed. Odors such as the smells from breweries, bakeries, food production, slaughterhouses and other process industries can be eliminated. And that is not all, says Professor Johnson.

“Because the system eats dust, even hazardous particles such as pollen, spores and viruses are removed” states Johnson, who hopes to see his system in use in all manner of industries because air pollution is such a huge health risk.

Air pollution more deadly than traffic, smoking and diabetes

According to a recent report by the World Health Organization, indoor and outdoor air pollution causes 7 million premature deaths annually which is more than the combined effects of road deaths, smoking and diabetes. Pollution in air is linked to heart disease, cancer, asthma, allergy, lost productivity and irritation.

“I have always wanted to use chemistry to make the world a better place. I genuinely feel that GPAO will improve life for millions of people, especially those living in cities or near industrial producers” concludes Matthew Johnson.


Contacts and sources:
Jes Andersen
University of Copenhagen 

How the ‘Blob’ Stacks Up: Tracking ‘Marine Heatwaves’ Since 1950



Unusually warm oceans can have widespread effects on marine ecosystems. Warm patches off the Pacific Northwest from 2013 to 2015, and a couple of years earlier in the Atlantic Ocean, affected everything from sea lions to fish migration routes to coastal weather.

A University of Washington oceanographer is lead author of a study looking at the history of such features across the Northern Hemisphere. The study was published in March in the journal Geophysical Research Letters.

"We can think of marine heatwaves as the analog to atmospheric heatwaves, except they happen at the sea surface and affect marine ecosystems," said lead author Hillary Scannell, a UW doctoral student in oceanography. "There are a lot of similarities."

Land-based heatwaves are becoming more frequent and more intense due to climate change. Scannell and her collaborators' work suggests this may also be happening in the north Atlantic and Pacific oceans. Their study finds that marine heatwaves have recurred regularly in the past but have become more common since the 1970s, as global warming has become more pronounced.

The “warm blob” off the Pacific Northwest coast in April 2014, as shown in the July 2014 newsletter where it got its evocative name. The new study shows this feature was most prominent through the end of 2014, though it persisted into 2015.
Credit: NOAA

The new paper looks at the frequency of marine heatwaves in the North Atlantic and the North Pacific since 1950. Scannell did the work as a master's student at the University of Maine, where she was inspired by the 2012 record-breaking warm waters off New England.

"After that big warming event of 2012 we keyed into it and wanted to know how unusual it was," Scannell said. The study also analyzes another recent event, the so-called "warm blob" that emerged in 2013 and 2014 off the Pacific Northwest.

The authors analyzed 65 years of ocean surface temperature observations, from 1950 to 2014, and also looked at how these two recent events stack up.

In general, the results show that the larger, more intense and longer-lasting a marine heatwave is, the less frequently it will occur. The study also shows that the two recent events were similar to others seen in the historical record, but got pushed into new territory by the overall warming of the surface oceans.

An event like the northwest Atlantic Ocean marine heatwave, in which an area about the size of the U.S. stayed 2.0 degrees Fahrenheit (1.1 C) above normal for three months, is likely to naturally occur about every five years in the North Atlantic and northwestern Pacific oceans, and more frequently in the northeast Pacific.

The "blob" in the northeast Pacific covered an even larger area, with surface temperatures 2.7 degrees Fahrenheit (1.5 C) above normal for 17 months, and is expected from the record to naturally happen about once every five years off the West Coast.

In the northeast Pacific, the record shows that marine heatwaves are more likely during an El Niño year and when the Pacific Decadal Oscillation brings warmer temperatures off the west coast of North America. So the 2013-15 "blob" likely got an extra kick from a possible transition to the favorable phase of the Pacific Decadal Oscillation, as well as from the overall warming of the ocean.

"The blob was an unfortunate but excellent example of these events," Scannell said. "As we go into the uncharted waters of a warming climate, we may expect a greater frequency of these marine heatwaves."

Scannell is also co-author of an earlier study published in February in which the authors define the term "marine heatwave" and specify the duration, temperature change and spatial extent that would meet their criteria. That study was led by researchers in Australia, who were curious about a warm event from 2010 to 2011 in the Indian Ocean.

"We're working towards a more streamlined definition so we can more easily compare these events when they occur in the future," Scannell said.

Better understanding of marine heatwaves could help prepare ocean ecosystems and maritime industries. At the UW, Scannell currently works with Michael McPhaden, a UW affiliate professor of oceanography and scientist at the National Oceanic and Atmospheric Administration, looking at air-sea interactions along the equator and other factors that might create marine heatwaves.



Contacts and sources:
Hannah Hickey
University of Washington

Co-authors on the new paper are Andrew Pershing and Katherine Mills at the Gulf of Maine Research Institute, Michael Alexander at the National Oceanic and Atmospheric Administration and Andrew Thomas at the University of Maine. The study was funded by the National Science Foundation.

Fracking Linked to Most Induced Earthquakes in Western Canada

A survey of a major oil and natural gas-producing region in Western Canada suggests a link between hydraulic fracturing or "fracking" and induced earthquakes in the region, according to a new report published online in the journal Seismological Research Letters.

The study's findings differ from those reported from oil and gas fields in the central United States, where fracking is not considered to be the main cause of a sharp rise in induced seismicity in the region. Instead, the proliferation of hundreds of small earthquakes in that part of the U.S. is thought to be caused primarily by massive amounts of wastewater injected back into the ground after oil and gas recovery.

The SRL study does not examine why induced seismicity would be linked to different processes in the central U.S. and western Canada. However, some oil and gas fields in the U.S., especially Oklahoma, use "very large amounts of water" in their operations, leading to much more wastewater disposal than in Canadian operations, said Gail M. Atkinson of Western University.

Western Canada Sedimentary Basin (outlined in black) on a geological map of Canada.
Credit: Wikimedia Commons

It is possible that massive wastewater disposal in the U.S. is "masking another signal" of induced seismicity caused by fracking, Atkinson said. "So we're not entirely sure that there isn't more seismicity in the central U.S. from hydraulic fracturing than is widely recognized."

The fracking process uses high-pressure injections of fluid to break apart rock and release trapped oil and natural gas. Both fracking and wastewater injections can increase the fluid pressure in the natural pores and fractures in rock, or change the state of stress on existing faults, to produce earthquakes.

The Western Canada Sedimentary Basin (WCSB) contains one of the world's largest oil and gas reserves, and is dotted with thousands of fracking wells drilled in multi-stage horizontal operations. Atkinson and her colleagues compared the relationship of 12,289 fracking wells and 1,236 wastewater disposal wells to magnitude 3 or larger earthquakes in an area of 454,000 square kilometers near the border between Alberta and British Columbia, between 1985 and 2015.

The researchers performed statistical analyses to determine which earthquakes were most likely to be related to hydraulic fracturing, given their location and timing. The analyses identified earthquakes as being related to fracking if they took place close to a well and within a time window spanning the start of fracking to three months after its completion, and if other causes, such as wastewater disposal, were not involved.
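Stripped of the statistics, the core of such an association test is a space-time window query: an earthquake is a candidate match for a well if it lies close enough to the well and falls between the start of fracking and three months after completion. A schematic sketch; the 5 km radius and the flat dictionary records are placeholders, not the paper's actual matching criteria:

```python
from datetime import timedelta
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometers."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

def candidate_match(quake, well, max_km=5.0):
    """True if the quake sits inside the well's space-time window: within
    `max_km` (placeholder radius) of the well, and between the start of
    fracking and three months after its completion."""
    close_enough = haversine_km(quake["lat"], quake["lon"],
                                well["lat"], well["lon"]) <= max_km
    in_window = (well["frac_start"] <= quake["time"]
                 <= well["frac_end"] + timedelta(days=90))
    return close_enough and in_window
```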

Atkinson and colleagues found 39 hydraulic fracturing wells (0.3% of the total of fracking wells studied), and 17 wastewater disposal wells (1% of the disposal wells studied) that could be linked to earthquakes of magnitude 3 or larger.
Scale diagram showing how wells can drain gas from a large area from a single pad.
Credit: UK Government - Dept of Energy and Climate Change
While these percentages sound small, Atkinson pointed out that thousands of hydraulic fracturing wells are being drilled every year in the WCSB, increasing the likelihood of earthquake activity. "We haven't had a large earthquake near vulnerable infrastructure yet," she said, "but I think it's really just a matter of time before we start seeing damage coming out of this."

The study also confirmed that in the last few years nearly all the region's seismicity of magnitude 3 or larger has been induced by human activity. More than 60% of these quakes are linked to hydraulic fracturing, about 30-35% come from disposal wells, and only 5-10% of the earthquakes have a natural tectonic origin, Atkinson said.

Atkinson said the new numbers could be used to recalculate the seismic hazard for the region, which could impact everything from building codes to safety assessments of critical infrastructure such as dams and bridges. "Everything has been designed and assessed in terms of earthquake hazard in the past, considering the natural hazard," she said. "And now we've fundamentally changed that, and so our seismic hazard picture has changed."
Scale diagram of a shale gas well showing large separation between aquifer and shale source rock.
Credit: UK Government - Dept of Energy and Climate Change

The researchers were also surprised to find that their data showed no relationship between the volume of fluid injected at a hydraulic fracturing well site and the maximum magnitude of its induced earthquake.

"It had previously been believed that hydraulic fracturing couldn't trigger larger earthquakes because the fluid volumes were so small compared to that of a disposal well," Atkinson explained. "But if there isn't any relationship between the maximum magnitude and the fluid disposal, then potentially one could trigger larger events if the fluid pressures find their way to a suitably stressed fault."

Atkinson and her colleagues hope to refine their analyses to include other variables, such as information about extraction processes and the geology at individual well sites, "to help us understand why some areas seem much more prone to induced seismicity than others."

The scientists say the seismic risks associated with hydraulic fracturing could increase as oil and gas companies expand fracking's use in developing countries, which often contain dense populations and earthquake-vulnerable infrastructure.



Contacts and sources:
Becky Ham
Seismological Society of America

ALMA Spots a Baby Star and a Missing Piece of Stellar Evolution



Researchers using the Atacama Large Millimeter/submillimeter Array (ALMA) have made the first direct observations delineating the gas disk around a baby star from the infalling gas envelope. This finding fills an important missing piece in our understanding of the early phases of stellar evolution.

A research team, led by Yusuke Aso (a graduate student at the University of Tokyo) and Nagayoshi Ohashi (a professor at the Subaru Telescope, National Astronomical Observatory of Japan) observed the baby star named TMC-1A located 450 light years away from us, in the constellation Taurus (the Bull). TMC-1A is a protostar, a star still in the process of forming. Large amounts of gas still surround TMC-1A.

Artist's impression of the baby star TMC-1A. The star is located in the center and surrounded by a rotating gas disk. Gas is infalling to the disk from the envelope further out.
 Credit: NAOJ

Stars form in dense gas clouds. Baby stars grow by taking in the surrounding gas, like a fetus receiving nutrition from the mother's placenta. In this process, gas cannot flow directly into the star. Instead it first accumulates and forms a disk around the star, and then the disk feeds into the star. However, it is still unknown when in the process of star formation this disk appears and how it evolves. Lack of sensitivity and resolution in radio observations has made it difficult to observe these phenomena.

"The disks around young stars are the places where planets will be formed," said Aso, the lead author of the paper that appeared in the Astrophysical Journal. "To understand the formation mechanism of a disk, we need to differentiate the disk from the outer envelope precisely and pinpoint the location of its boundary."

  Artist's impression video of the baby star TMC-1A.

Credit: NAOJ

Using ALMA, the team directly observed the boundary between the inner rotating disk and the outer infalling envelope with high accuracy for the first time. Since gas from the outer envelope is continuously falling into the disk, it had been difficult to identify the transition region in previous studies. In particular, the tenuous but high speed gas in rotating disks is not easy to see. But ALMA has enough sensitivity to highlight such a component and illustrate the speed and distribution of gas in the disk very precisely. This enabled the team to distinguish the disk from the infalling envelope.

Composite image of TMC-1A observations. Dense gas seen around the star with ALMA is shown in red. ALMA also spotted outflowing gas from the star, a feature often seen around baby stars; this outflowing gas is shown in white. The position of the star is indicated with a cross.
Credit: ALMA (ESO/NAOJ/NRAO), Aso et al.

The team found that the boundary between the disk and envelope is located 90 astronomical units from the central baby star. This distance is three times the size of the orbit of Neptune, the outermost planet in the Solar System. The observed disk obeys Keplerian rotation: material orbiting closer to the central star revolves faster than material further out.

Gas motion around TMC-1A. Red indicates gas moving away from us, while blue indicates gas coming closer to us.
Credit: ALMA (ESO/NAOJ/NRAO), Aso et al.

The high-sensitivity observations provided other important information about the object. From detailed measurement of the rotation speed, the research team could calculate that the mass of the baby star is 0.68 times the mass of the Sun. The team also determined the gas infall rate to be a millionth of the mass of the Sun per year, with a speed of 1 km per second. Gravity causes gas to fall towards the central baby star, but the measured speed is much less than the free-fall speed. Something must be slowing the gas down. The researchers suspect that a magnetic field around the baby star might be what is slowing the gas.
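The mass estimate follows directly from Keplerian rotation: for circular orbits, M = v²r/G. The quick check below uses the 90 AU boundary from the article; the ~2.6 km/s orbital speed is back-computed to match the quoted mass, so treat it as illustrative rather than a number from the paper:

```python
G = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30  # solar mass, kg
AU = 1.496e11     # astronomical unit, m

def keplerian_mass(speed_m_s, radius_m):
    """Central mass implied by a circular Keplerian orbit: M = v^2 r / G."""
    return speed_m_s ** 2 * radius_m / G

# ~2.6 km/s at the 90 AU disk edge reproduces the quoted 0.68 solar masses.
mass = keplerian_mass(2.59e3, 90 * AU)
print(f"{mass / M_SUN:.2f} solar masses")  # ~0.68
```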

"We expect that as the baby star grows, the boundary between the disk and the infall region moves outward," said Aso. "We are sure that future ALMA observations will reveal such evolution."

These observational results were published as Aso et al. "ALMA Observations of the Transition from Infall Motion to Keplerian Rotation around the Late-phase Protostar TMC-1A " in the Astrophysical Journal, issued in October 2015.


Contacts and sources:
National Radio Astronomy Observatory

How Did Sauropod Dinosaurs Become the Largest Land Animals Ever?



Scientists from the University of Liverpool have developed computer models of the bodies of sauropod dinosaurs to examine the evolution of their body shape.

Sauropod dinosaurs include the largest land animals ever to have lived. Some of the better-known sauropods include Diplodocus, Apatosaurus and Brontosaurus. They are renowned for their extremely long necks and tails, four thick, pillar-like legs, and heads that are small in relation to their bodies.

A Giraffatitan model of a sauropod. The grid is in 1 m squares.
Credit: Dr Peter L Falkingham (Liverpool John Moores University)

To date, however, there have been only limited attempts to examine how this unique body-plan evolved and how it might be related to their gigantic body size. Dr Karl Bates from the University's Department of Musculoskeletal Biology and his colleagues used three-dimensional computer models reconstructing the bodies of sauropod dinosaurs to analyse how their size, shape and weight-distribution evolved over time.

Animation showing how the center of mass moves when the soft tissues of the three-dimensional computer model are reconstructed differently, using the convex-hulling approach of co-author Dr Peter L Falkingham.
Credit: Peter Falkingham, Liverpool John Moores University

In the animation we see the laser scan of the skeleton of Giraffatitan in a neutral posture, followed by the reconstructed lung/air-sac volume. A convex hull is then generated around the skeleton, which provides a 'shrink wrapped' volume.

Previous work has shown that in mammals this volume should be increased by 21% to get a more accurate result. The hull is then expanded until it begins to severely self-intersect, and this is considered the 'maximal' volume.

Variations are then tested with maximal caudal and cranial reconstructions (tail maximal and neck minimal, and vice versa), as well as with changes in the position of the neck.
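In practice the 'shrink-wrapped' volume is a standard computational-geometry operation. The sketch below is an illustrative reconstruction of the idea, not the authors' code: it wraps a point cloud (standing in for a laser-scanned skeleton) in a convex hull, applies the 21% mammal-derived correction, and uses the centroid of the hull vertices as a crude stand-in for the center of mass.

```python
import numpy as np
from scipy.spatial import ConvexHull

def body_estimate(points, expansion=1.21):
    """Convex-hull ('shrink wrap') volume of a 3-D point cloud,
    expanded by the ~21% correction reported for mammals."""
    hull = ConvexHull(points)
    volume = hull.volume * expansion
    # Centroid of the hull vertices: a crude proxy for the center of
    # mass of a uniform-density body (the study varied tissues too).
    center = points[hull.vertices].mean(axis=0)
    return volume, center

# Toy usage: random points standing in for scanned skeleton coordinates.
pts = np.random.rand(500, 3)
volume, center = body_estimate(pts)
print(f"volume ~{volume:.3f}, center of mass at {center}")
```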
Evolutionary history

Dr Bates found evidence that changes in body shape coincided with major events in sauropod evolutionary history such as the rise of the titanosaurs. The early dinosaurs that sauropods evolved from were small and walked on two legs, with long tails, small chests and small forelimbs. The team estimate that this body shape concentrated their weight close to the hip joint, which would have helped them balance while walking bipedally on their hind legs.

Argentinosaurus, a titanosaur, lived on the then-island continent of South America between 94 and 97 million years ago, during the Late Cretaceous Epoch. It is among the largest known dinosaurs.

Credit: James Emery

As sauropods evolved they gradually altered both their size and shape from this ancestral template, becoming not only significantly larger and heavier, but also gaining a proportionally larger chest, forelimbs and in particular a dramatically larger neck.

The team's findings show that these changes altered sauropods' weight distribution as they grew in size, gradually shifting from being tail-heavy, two-legged animals to being front-heavy, four-legged animals, such as the large, fully quadrupedal Jurassic sauropods Diplodocus and Apatosaurus.

Apatosaurus lived in North America during the Late Jurassic period.
Credit: Carnegie Museum

The team found that these linked trends in size, body shape and weight distribution did not end with the evolution of fully quadrupedal sauropods. In the Cretaceous period - the last of the three ages of the dinosaurs - many earlier sauropod groups dwindled. In their place, a new and extremely large type of sauropod known as titanosaurs evolved, including the truly massive Argentinosaurus and Dreadnoughtus, among the largest known animals ever to have lived.

Front heavy

The team's computer models suggest that in addition to their size, the titanosaurs evolved the most extreme 'front heavy' body shape of all sauropods, as a result of their extremely long necks.

Dr Bates said: "As a result of devising these models we were able to ascertain that the relative size of sauropods' necks increased gradually over time, leading to animals that were increasingly more front-heavy relative to their ancestors."

Dr Philip Mannion from Imperial College London, a collaborator in the research, added: "These innovations in body shape might have been key to the success of titanosaurs, which were the only sauropod dinosaurs to survive until the end-Cretaceous mass extinction, 66 million years ago."

Dr Vivian Allen from the Royal Veterinary College London, who also collaborated in the research, added: "What's important to remember about studies like this is that there is a very high degree of uncertainty about exactly how these animals were put together. While we have good skeletons for many of them, it's difficult to be sure how much meat there was around each of the bones. We have built this uncertainty into our models, ranging each body part from emaciated to borderline obese, and even using these extremes we still find these solid, trending changes in body proportions over sauropod evolution."


Contacts and sources:
University of Liverpool



Citation: 'Temporal and phylogenetic evolution of the sauropod dinosaur body plan', published in Royal Society Open Science.

Indonesian 'Hobbits' 50,000 Years Old, Homo floresiensis Died Out Earlier Than First Thought

An ancient species of pint-sized humans discovered in the tropics of Indonesia may have met their demise earlier than once believed, according to an international team of scientists who re-investigated the original finding.

In findings published in the journal Nature this week, the group challenges reports that these inhabitants of the remote island of Flores co-existed with modern humans for tens of thousands of years.

They found that the youngest age for Homo floresiensis, dubbed the 'Hobbit', is around 50,000 years ago, not between 13,000 and 11,000 years ago as initially claimed. The most complete 'hobbit' skeleton belonged to a young adult female who was only about a meter tall and whose brain was roughly one-third the size of a modern adult female's brain.

Recent discoveries suggest that the ancestors of ‘hobbits’ first arrived on Flores at least one million years ago. However, the identity of this founding population is a complete mystery.

Cast of  Homo floresiensis skull


Credit: American Museum of Natural History

Led by Indonesian scientists and involving researchers from Griffith University's Research Centre of Human Evolution (RCHE), the team found problems with prior dating efforts at the cave site, Liang Bua.

"In fact, Homo floresiensis seems to have disappeared soon after our species reached Flores, suggesting it was us who drove them to extinction," says Associate Professor Maxime Aubert, a geochronologist and archaeologist at RCHE, who with RCHE's Director Professor Rainer measured the amount of uranium and thorium inside Homo floresiensis fossils to test their age.

"The science is unequivocal,'' Aubert said.

"The youngest Hobbit skeletal remains occur at 60,000 years ago but evidence for their simple stone tools continues until 50,000 years ago. After this there are no more traces of these humans."


Credit: Griffith University

While excavating at the limestone cave of Liang Bua in 2003, archaeologists found bones from diminutive humans unlike any people alive today. The researchers concluded the tiny cave dwellers evolved from an older branch of the human family that had been marooned on Flores for at least a million years. It was thought that this previously unknown population lived on Flores until about 12,000 years ago.

But the site is large and complex, and the original excavators dug only a tiny portion of it. Years of further excavation have led to a much clearer understanding of the order of the archaeological layers. It is now evident that when the original team collected samples for dating the main layer containing Hobbit bones, they mistakenly took them from an overlying layer that is similar in composition but far younger.

"This problem has now been resolved and the newly published dates provide a more reliable estimate of the antiquity of this species,'' Aubert said.

But the mystery of what happened to these creatures remains.

Cave where the remains of Homo floresiensis were discovered in 2003, Liang Bua, Flores, Indonesia

Credit: Rosino

RCHE archaeologist Dr Adam Brumm, who also participated in the study, said Hobbits are likely to have inhabited other Flores caves, which may yield more recent signs of their existence. He believes Homo floresiensis probably suffered the same fate that befell Europe's Neanderthals: our species simply out-competed and replaced them within a few thousand years.

"They might have retreated to more remote parts of Flores, but it's a small place and they couldn't have avoided our species for long. I think their days were numbered the moment we set foot on the island."

In the past decade, Asia has produced the highly enigmatic Indonesian Homo floresiensis ("the Hobbit"), as well as evidence for modern human interbreeding with the ancient Denisovans from Siberia.

Associate Professor Maxime Aubert with the skull of the Homo floresiensis holotype skeleton (LB1). Aubert conducted uranium-series dating of one of the bones from this skeleton, and of bones from other 'hobbit' individuals from Liang Bua, to determine their age.

Credit: Griffith University

Contacts and sources:
Deborah Marshall
Griffith University


RCHE is an initiative of Griffith University and is the world's first academic centre specifically focused on the subject of human evolution in our region. Based within the Environmental Futures Research Institute, its mission is to foster research excellence through multi-disciplinary projects that bring together leading Australian and international scholars and institutions in the field of human evolution in Australia, Southeast Asia and elsewhere in the region.

Jekyll and Hyde Planet Found: Hellish Lava World Is Molten on the Hot Side, Solid on the Cool Side

The most detailed map of a small, rocky ‘super Earth’ to date reveals a two-faced planet almost completely covered by lava, with a molten ‘hot’ side and solid ‘cool’ side.


Credit: Cambridge University

An international team of astronomers, led by the University of Cambridge, has obtained the most detailed ‘fingerprint’ of a rocky planet outside our solar system to date, and found a planet of two halves: one that is almost completely molten, and the other which is almost completely solid.

According to the researchers, conditions on the hot side of the planet are so extreme that it may have caused the atmosphere to evaporate, with the result that conditions on the two sides of the planet vary widely: temperatures on the hot side can reach 2500 degrees Celsius, while temperatures on the cool side are around 1100 degrees. The results are reported in the journal Nature.


Illustration of the hot lava world 55 Cancri e
Credit: NASA/JPL-Caltech/R. Hurt 

Using data from NASA's Spitzer Space Telescope, the researchers examined a planet known as 55 Cancri e, which orbits a sun-like star located 40 light years away in the constellation of Cancer, and mapped how conditions on the planet change throughout a complete orbit, the first time this has been accomplished for such a small planet.

55 Cancri e is a 'super Earth': a rocky exoplanet about twice the size and eight times the mass of Earth, which orbits its parent star so closely that a year there lasts just 18 hours. The planet is also tidally locked, meaning that it always shows the same face to its parent star, just as the Moon does to Earth, so there is a permanent 'day' side and a 'night' side. Since it is among the nearest super Earths whose composition can be studied, 55 Cancri e is among the best candidates for detailed observations of surface and atmospheric conditions on rocky exoplanets.
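Those quoted bulk figures already hint at why the planet is thought to be rocky. A quick scaling check (illustrative round numbers, not values from the paper) shows that twice Earth's radius and eight times its mass give roughly Earth-like density and about twice Earth's surface gravity:

```python
# Bulk properties of 55 Cancri e relative to Earth, from the
# quoted ~2x radius and ~8x mass (approximate values).
mass_ratio = 8.0     # M / M_Earth
radius_ratio = 2.0   # R / R_Earth

density_ratio = mass_ratio / radius_ratio**3  # rho ~ M / R^3
gravity_ratio = mass_ratio / radius_ratio**2  # g ~ G M / R^2

print(f"density ~{density_ratio:.1f}x Earth's")          # ~1.0x: consistent with rock
print(f"surface gravity ~{gravity_ratio:.1f}x Earth's")  # ~2.0x
```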

Uncovering the characteristics of super Earths is difficult: they are small compared to their parent stars, and their contrast against the star is far lower than that of larger, hotter gas giant planets, the so-called 'hot Jupiters'.

“We haven’t yet found any other planet that is this small and orbits so close to its parent star, and is relatively close to us, so 55 Cancri e offers lots of possibilities,” said Dr Brice-Olivier Demory of the University’s Cavendish Laboratory, the paper’s lead author. “We still don’t know exactly what this planet is made of – it’s still a riddle. These results are like adding another brick to the wall, but the exact nature of this planet is still not completely understood.”

55 Cancri e has been extensively studied since it was discovered in 2011. Based on readings taken at different points in time, it was thought to be a water world, or even made of diamond, but researchers now believe that it is almost completely covered by lava.

“We have entered a new era of atmospheric remote sensing of rocky exoplanets,” said study co-author Dr Nikku Madhusudhan, from the Institute of Astronomy at Cambridge. “It is incredible that we are now able to measure the large scale temperature distribution on the surface of a rocky exoplanet.”

Based on these new infrared measurements, the ‘day’ side of the planet appears to be almost completely molten, while the ‘night’ side is almost completely solid. The heat from the day side is not efficiently circulated to the night side, however. On Earth, the atmosphere aids in the recirculation of heat, keeping the temperature across the whole planet within a relatively narrow range. But on 55 Cancri e, the hot side stays hot, and the cold side stays cold.

According to Demory, one explanation for this variation is that the planet either lacks an atmosphere entirely or has had its atmosphere partially destroyed by the strong irradiation from the nearby host star. "On the day side, the temperature is around 2500 degrees Celsius, while on the night side it's about 1100 degrees - that's a huge difference," he said. "We think that there could still be an atmosphere on the night side, but temperatures on the day side are so extreme that the atmosphere may have evaporated completely, meaning that heat is not being efficiently transferred, or transferred at all, from the day side to the night side."

Another explanation for the huge discrepancy between the day side and the night side may be that molten lava on the day side moves heat along the surface, while the mostly solid lava on the night side cannot move heat around as efficiently.


Credit: Cambridge University

What remains unclear, however, is where the 'extra' heat on 55 Cancri e comes from in the first place: the observations reveal an unknown source of heat that makes the planet hotter than irradiation from its star alone can explain. The researchers may have to wait until the next generation of space telescopes is launched to find out.

For Demory, these new readings also show just how difficult it will be to detect a planet that is similar to Earth. The smaller a planet is, the more difficult it is to detect. And once a rocky planet has been found, there is the question of whether it lies in the so-called habitable zone, where life can be supported. “The problem is, people don’t agree on what the habitable zone is,” said Demory. “For example, some studies consider Mars and Venus to be in the habitable zone, but life as we know it is not possible on either of those planets. Understanding the surface and climate properties of these other worlds will eventually allow us to put the Earth’s climate and habitability into context.”

One possibility might be to look at stars that are much cooler and smaller than our sun, such as the M dwarfs. Planets could then be much closer to their star and still sit in the habitable zone, and the sizes of such planets relative to their star would be larger, which makes them easier to detect from Earth.
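The detectability argument follows directly from transit geometry: the fraction of starlight a transiting planet blocks scales with the square of the planet-to-star radius ratio. A minimal sketch with approximate radii (the 0.2-solar-radius M dwarf is an illustrative choice):

```python
R_EARTH_OVER_R_SUN = 0.00916  # Earth's radius as a fraction of the Sun's

def transit_depth(r_planet_earths, r_star_suns):
    """Fraction of starlight blocked: (R_planet / R_star)^2."""
    return (r_planet_earths * R_EARTH_OVER_R_SUN / r_star_suns) ** 2

print(transit_depth(1.0, 1.0))  # Earth transiting the Sun: ~8.4e-5
print(transit_depth(1.0, 0.2))  # Earth transiting a small M dwarf: ~25x deeper
```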

But for the time being, Demory and his colleagues plan to keep studying 55 Cancri e, in order to see what other secrets it might hold, including the possibility that it might be surrounded by a torus of gas and dust, which could account for some of the variations in the data. And in 2018, the successor to Hubble and Spitzer, the James Webb Space Telescope, will launch, allowing astronomers to look at planets outside our solar system with entirely new levels of precision.



Contacts and sources:
University of Cambridge


Citation: Brice-Olivier Demory et al. 'A map of the extreme day-night temperature gradient of a super-Earth exoplanet.' Nature (2016). DOI: 10.1038/nature17169

Simulations Explain How Galaxies Formed in the Early Universe

Cutting-edge simulations explain how supermassive black holes and galaxies formed from collapsing gas clouds in the early Universe. 

Near the edge of the visible Universe are some of the brightest objects ever observed, known as quasars, which are believed to contain supermassive black holes of more than a billion times the mass of our Sun. Simulations by Kentaro Nagamine at Osaka University's Department of Earth and Space Science, Isaac Shlosman at the University of Kentucky and co-workers have revealed for the first time exactly how these black holes formed 700 million years after the Big Bang.

"The early Universe was a dense, hot and uniform plasma," explains Nagamine. "As it cooled, fluctuations in the mass distribution formed seeds around which matter could gather due to gravity." These are the origins of the first stars. Similar processes might have later seeded the growth of bigger structures such as supermassive black holes.

Simulation of a network of dark matter filaments in a high-density region of the early universe. Each dense bright spot is a dark matter halo into which gas collapses to form large galaxies and supermassive black holes.


Credit:  2015 Kentaro Nagamine, Osaka University

Until recently, many researchers thought supermassive black holes were seeded by the collapse of some of the first stars. But modeling work by several groups has suggested that this process would only lead to small black holes. Nagamine and co-workers simulated a different situation, in which supermassive black holes are seeded by clouds of gas falling into potential wells created by dark matter -- the invisible matter that astronomers believe makes up 85% of the mass of the Universe.

Simulating the dynamics of huge gas clouds is extremely complex, so the team had to use some numerical tricks called 'sink particles' to simplify the problem.

"Although we have access to extremely powerful supercomputers at Osaka University's Cybermedia Center and the National Astronomical Observatory of Japan, we can't simulate every single gas particle," explains Nagamine. "Instead, we model small spatial scales using sink particles, which grow as the surrounding gas evolves. This allows us to simulate much longer timescales than was previously possible."

The researchers found that most seed particles in their simulations did not grow very much, except for one central seed, which grew rapidly to more than 2 million solar masses in just 2 million years - an average growth rate of roughly one solar mass per year - representing a feasible path toward a supermassive black hole. Moreover, as the gas spun and collapsed around the central seed it formed two misaligned accretion discs, something never seen before.

In other recent work, Nagamine and co-workers described the growth of massive galaxies that formed around the same time as supermassive black holes [1]. "We like to push the frontier of how far back in time we can see," says Nagamine. The researchers hope their simulations will be validated by real data when NASA's James Webb Space Telescope, due to be launched in 2018, observes distant sources where direct gas collapse is happening.



Contacts and sources:
Saori Obayashi
 Osaka University

Citation: 1. Yajima, H., Shlosman, I., Romano-Díaz, E. & Nagamine, K. Observational properties of simulated galaxies in overdense and average regions at redshifts z ≈ 6-12. Monthly Notices of the Royal Astronomical Society 451, 418-432 (2015). DOI: http://dx.doi.org/10.1093/mnras/stv974

2. Shlosman, I., Choi, J.-H., Begelman, M.C. & Nagamine, K. Supermassive black hole seed formation at high redshifts: Long-term evolution of the direct collapse. Monthly Notices of the Royal Astronomical Society 456, 500–511 (2016).
DOI: http://dx.doi.org/10.1093/mnras/stv2700 
 

Where in the Universe Does Gold Come From? One of Science's Most Puzzling Questions



So you think the gold in your ring or watch came from a mine in Africa or Australia? Well, think farther away. Much, much farther.

Michigan State University researchers, working with colleagues from Technical University Darmstadt in Germany, are zeroing in on the answer to one of science's most puzzling questions: Where did heavy elements, such as gold, originate?

This illustration depicts two neutron stars colliding. As they merge, the stars eject material into space at 10 to 50 percent the speed of light. Mergers of these kinds of stars are thought to be the source of gold and other heavy metals found throughout the universe. 
Credit: Stephan Rosswog, Jacobs University Bremen.

Currently there are two candidates, neither of which is located on Earth: a supernova, the catastrophic explosion of a massive star that collapses under its own weight at the end of its life, or a neutron-star merger, in which two of these small yet incredibly dense stars come together and spew out huge amounts of stellar debris.

In a recently published paper in the journal Physical Review Letters, the researchers detail how they are using computer models to come closer to an answer.

"At this time, no one knows the answer," said Witold Nazarewicz, a professor at the MSU-based Facility for Rare Isotope Beams and one of the co-authors of the paper. "But this work will help guide future experiments and theoretical developments."

Mojave Nugget, a gold nugget weighing 156 ounces, from the Stringer district, Kern County, California.
Credit: Reno Chris
By using existing data, often obtained by means of high-performance computing, the researchers were able to simulate production of heavy elements in both supernovae and neutron-star mergers.

"Our work shows regions of elements where the models provide a good prediction," said Nazarewicz, a Hannah Distinguished Professor of Physics who also serves as FRIB's chief scientist. "What we can do is identify the critical areas where future experiments, which will be conducted at FRIB, will work to reduce uncertainties of nuclear models."

Other researchers included Dirk Martin and Almudena Arcones from Technical University Darmstadt and Erik Olsen of MSU.

MSU is establishing FRIB as a new scientific user facility for the Office of Nuclear Physics in the U.S. Department of Energy Office of Science.

Under construction on campus and operated by MSU, FRIB will enable scientists to make discoveries about the properties of rare isotopes in order to better understand the physics of nuclei, nuclear astrophysics, fundamental interactions, and applications for society, including in medicine, homeland security and industry.

Native gold in quartz - Eagles Nest Mine, Placer County, California, USA. To make the gold crystals visible, the quartz was partially etched away. 


Contacts and sources:
Tom Oswald
Michigan State University

Citation: D. Martin, A. Arcones, W. Nazarewicz, and E. Olsen: 'Impact of Nuclear Mass Uncertainties on the r Process.' Physical Review Letters 116, 121101. Published 25 March 2016.