Unseen Is Free

Friday, April 29, 2016

Ten Billion Year Old Blazar Blast Linked to Cosmic Neutrinos

Nearly 10 billion years ago, the black hole at the center of a galaxy known as PKS B1424-418 produced a powerful outburst. Light from this blast began arriving at Earth in 2012. Now astronomers using data from NASA's Fermi Gamma-ray Space Telescope and other space- and ground-based observatories have shown that a record-breaking neutrino seen around the same time likely was born in the same event.

NASA Goddard scientist Roopesh Ojha explains how Fermi and TANAMI uncovered the first plausible link between a blazar eruption and a neutrino from deep space.

Credits: NASA’s Goddard Space Flight Center

"Neutrinos are the fastest, lightest, most unsociable and least understood fundamental particles, and we are just now capable of detecting high-energy ones arriving from beyond our galaxy," said Roopesh Ojha, a Fermi team member at NASA's Goddard Space Flight Center in Greenbelt, Maryland, and a coauthor of the study. "Our work provides the first plausible association between a single extragalactic object and one of these cosmic neutrinos."

Although neutrinos far outnumber all the atoms in the universe, they rarely interact with matter, which makes detecting them quite a challenge. But this same property lets neutrinos make a fast exit from places where light cannot easily escape -- such as the core of a collapsing star -- and zip across the universe almost completely unimpeded. Neutrinos can provide information about processes and environments that simply aren't available through a study of light alone.

The IceCube Neutrino Observatory, built into a cubic kilometer of clear glacial ice at the South Pole, detects neutrinos when they interact with atoms in the ice. This triggers a cascade of fast-moving charged particles that emit a faint glow, called Cherenkov light, as they travel, which is picked up by thousands of optical sensors strung throughout IceCube. Scientists determine the energy of an incoming neutrino by the amount of light its particle cascade emits.

To date, the IceCube science team has detected about a hundred very high-energy neutrinos and nicknamed some of the most extreme events after characters on the children's TV series "Sesame Street." On Dec. 4, 2012, IceCube detected an event known as Big Bird, a neutrino with an energy exceeding 2 quadrillion electron volts (PeV). To put that in perspective, it's more than a million million times greater than the energy of a dental X-ray packed into a single particle thought to possess less than a millionth the mass of an electron. Big Bird was the highest-energy neutrino ever detected at the time and still ranks second.
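For a sense of scale, Big Bird's energy converts into everyday units in one line. This sketch assumes nothing beyond the article's 2 PeV figure and the standard electron-volt definition.

```python
# Convert the Big Bird neutrino's energy (2 PeV, from the article) to joules.
EV_TO_JOULE = 1.602176634e-19   # joules per electron volt (exact SI value)

big_bird_ev = 2e15              # 2 PeV = 2 x 10^15 electron volts
big_bird_joules = big_bird_ev * EV_TO_JOULE

# Roughly a third of a millijoule: a macroscopic energy in a single particle.
print(f"Big Bird energy: {big_bird_joules:.2e} J")
```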

Where did it come from? The best IceCube position only narrowed the source to a patch of the southern sky about 32 degrees across, equivalent to the apparent size of 64 full moons.

Enter Fermi. Starting in the summer of 2012, the satellite's Large Area Telescope (LAT) witnessed a dramatic brightening of PKS B1424-418, an active galaxy classified as a gamma-ray blazar. An active galaxy is an otherwise typical galaxy with a compact and unusually bright core. The excess luminosity of the central region is produced by matter falling toward a supermassive black hole weighing millions of times the mass of our sun. As it approaches the black hole, some of the material becomes channeled into particle jets moving outward in opposite directions at nearly the speed of light. In blazars, one of these jets happens to point almost directly toward Earth.

Fermi LAT images showing the gamma-ray sky around the blazar PKS B1424-418. Brighter colors indicate greater numbers of gamma rays. The dashed arc marks part of the source region established by IceCube for the Big Bird neutrino (50-percent confidence level).

Left: An average of LAT data centered on July 8, 2011, and covering 300 days when the blazar was inactive.


Right: An average of 300 active days centered on Feb. 27, 2013, when PKS B1424-418 was the brightest blazar in this part of the sky.


Credits: NASA/DOE/LAT Collaboration

During the year-long outburst, PKS B1424-418 shone between 15 and 30 times brighter in gamma rays than its average before the eruption. The blazar is located within the Big Bird source region, but then so are many other active galaxies detected by Fermi.

The scientists searching for the neutrino source then turned to data from a long-term observing program named TANAMI. Since 2007, TANAMI has routinely monitored nearly 100 active galaxies in the southern sky, including many flaring sources detected by Fermi. The program includes regular radio observations using the Australian Long Baseline Array (LBA) and associated telescopes in Chile, South Africa, New Zealand and Antarctica. When networked together, they operate as a single radio telescope more than 6,000 miles across and provide a unique high-resolution look into the jets of active galaxies.
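The resolving power of such a network follows from the diffraction limit, theta = wavelength / baseline. A rough sketch, treating the article's 6,000-mile figure as the longest baseline and assuming the 8.4 GHz frequency used for the TANAMI radio images:

```python
import math

C = 299_792_458.0                 # speed of light, m/s

freq_hz = 8.4e9                   # TANAMI radio observing frequency
baseline_m = 6000 * 1609.344      # 6,000 miles expressed in meters

wavelength_m = C / freq_hz                  # about 3.6 cm
theta_rad = wavelength_m / baseline_m       # diffraction-limited resolution
theta_mas = math.degrees(theta_rad) * 3600 * 1000   # to milliarcseconds

# Sub-milliarcsecond resolution, fine enough to resolve blazar jet cores.
print(f"Angular resolution: about {theta_mas:.2f} milliarcseconds")
```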

Radio images from the TANAMI project reveal the 2012-2013 eruption of PKS B1424-418 at a frequency of 8.4 GHz. The core of the blazar’s jet brightened by four times, producing the most dramatic blazar outburst TANAMI has observed to date.

Credits: TANAMI


Three radio observations of PKS B1424-418 between 2011 and 2013 cover the period of the Fermi outburst. They reveal that the core of the galaxy's jet had brightened by about four times. No other galaxy observed by TANAMI over the life of the program has exhibited such a dramatic change.

"We combed through the field where Big Bird must have originated looking for astrophysical objects capable of producing high-energy particles and light," said coauthor Felicia Krauss, a doctoral student at the University of Erlangen-Nuremberg in Germany. "There was a moment of wonder and awe when we realized that the most dramatic outburst we had ever seen in a blazar happened in just the right place at just the right time."

In a paper published Monday, April 18, in Nature Physics, the team suggests the PKS B1424-418 outburst and Big Bird are linked, calculating only a 5-percent probability the two events occurred by chance alone. Using data from Fermi, NASA’s Swift and WISE satellites, the LBA and other facilities, the researchers determined how the energy of the eruption was distributed across the electromagnetic spectrum and showed that it was sufficiently powerful to produce a neutrino at PeV energies.

"Taking into account all of the observations, the blazar seems to have had means, motive and opportunity to fire off the Big Bird neutrino, which makes it our prime suspect," said lead author Matthias Kadler, a professor of astrophysics at the University of Wuerzburg in Germany.

Francis Halzen, the principal investigator of IceCube at the University of Wisconsin–Madison, and not involved in this study, thinks the result is an exciting hint of things to come. "IceCube is about to send out real-time alerts when it records a neutrino that can be localized to an area a little more than half a degree across, or slightly larger than the apparent size of a full moon," he said. "We're slowly opening a neutrino window onto the cosmos."

NASA's Fermi Gamma-ray Space Telescope is an astrophysics and particle physics partnership, developed in collaboration with the U.S. Department of Energy and with important contributions from academic institutions and partners in France, Germany, Italy, Japan, Sweden and the United States.


Contacts and sources:
By Francis Reddy
NASA's Goddard Space Flight Center,

World's First Gut Flora Study Reveals Links Between Lifestyle And Gut Flora

The Flemish Gut Flora Project, one of the largest population-wide studies on gut flora variation among healthy volunteers, has presented its first major results.

Through the analysis of more than 1,000 human stool samples, a team of researchers led by professor Jeroen Raes (VIB/VUB/KU Leuven) has identified 69 factors that are linked to gut flora composition. These results provide important information for future disease research and clinical studies. The project's fundamental insights will be published in the upcoming issue of the leading academic journal Science.

2012 marked the launch of the Flemish Gut Flora Project, initiated by prof. Jeroen Raes (VIB/VUB/KU Leuven). Together with his team, prof. Raes aimed at the ambitious task of mapping the gut flora composition of around 5,000 volunteers in Flanders (Belgium). The purpose of this endeavor was to investigate links between the human gut flora and health, diet, and lifestyle.

Escherichia coli, one of the many species of bacteria present in the human gut


Credit: Wikipedia

Gut flora composition linked to health, diet, and lifestyle

Titled "Population-level analysis of gut microbiome variation", prof. Raes' study has identified 69 factors associated with gut flora composition and diversity. Most of these covariates are related to transit time, health, diet, medication, gender, and age. Integration of the Flemish Gut Flora Project results with other data sets gathered around the world revealed a set of 14 bacterial genera that make up a universal core microbiota present in all individuals.

Jeroen Raes (VIB/VUB/KU Leuven): "Our research has given us a tremendous amount of new insight into the microbiota composition of normal people like you and me. This makes the Flemish Gut Flora Project unique, since the majority of previous studies focused on specific diseases or featured a significantly smaller geographical scope. However, analyzing the 'average' gut flora is essential for developing gut bacteria-based diagnostics and drugs. You need to understand what's normal before you can understand and treat disease".

Beer and buttermilk

Stool transit time showed the strongest association with gut flora composition. Diet was also an important factor, with most associations related to fiber consumption. One of the many surprising findings was the association of a particular bacterial group with a preference for dark chocolate! "The Belgian chocolate effect," Raes laughs. "As many readers might expect, we also found an association between gut flora composition and beer consumption."

Other project results invite deeper investigation, such as the relationship between the gut flora and factors linked to oxygen uptake capacity. Medication also had a strong link to the gut flora profile. The Raes Lab researchers not only identified associations with antibiotics and laxatives, but also with hay fever drugs and hormones used for contraception or alleviation of menopause symptoms. Remarkably, early life events such as birth mode or whether or not volunteers were breast-fed as babies were not reflected in adult microbiota composition.

Jeroen Raes (VIB/VUB/KU Leuven): "These results are essential for disease studies. Parkinson's disease, for example, is typically associated with a longer intestinal transit time, which in turn impacts microbiota composition. So to study the microbiota in Parkinson's disease, you need to take that into account. These and many other observations can help scientists in their research into future therapies."

Jeroen Raes (VIB/VUB/KU Leuven)
Credit: © Eric De Mildt

A key factor in this study was the collaboration with the Dutch LifeLines study, which allowed the researchers to replicate their findings: more than 90% of the identified factors were also detected in the Dutch cohort. International collaborations like these are key to advancing the field and speeding up the path to developing gut flora-based drugs. "Such replication adds a tremendous amount of robustness to the results", Raes emphasizes. "Of course, we also found some differences between both cohorts. Believe it or not, but one of the important dietary covariates identified in the Dutch cohort was the consumption of buttermilk."

Tip of the iceberg

Although the Flemish Gut Flora Project has enormously enriched our knowledge of gut flora composition, it explains only about 7% of gut flora variation. An enormous amount of work still needs to be done to sketch out the entire gut flora ecosystem. The Raes Lab estimates that around 40,000 human samples will be required to capture a complete picture of gut flora biodiversity. In other words: we are only seeing the tip of the iceberg. And although the VIB team revealed a wide range of associations, further research is required to unveil what is cause and what is consequence.

This is why this first publication doesn't mark the end of the Flemish Gut Flora Project. The Raes Lab is already planning follow-up studies, including new large-scale research projects that will explore the evolution of gut flora over time. More volunteers are now being recruited for this long-term study. The more people willing to participate, the faster VIB will be able to unveil new insights into the relationship between the trillions of microbes in the human body and our health. "The thousands of volunteers, pharmacists, and healthcare professionals who participated in the Flemish Gut Flora Project are the heart of this study", Raes says. "Without their enthusiasm, this couldn't have been done."



Contacts and sources: 
Sooike Stoops
VIB (The Flanders Institute For Biotechnology)

Citation: Population-level analysis of gut microbiome variation, Falony et al., Science (2016).

NASA Helps Forecast Zika Risk; See The Map


A risk-assessment map shows Aedes aegypti potential abundance for July and the monthly average number of arrivals to the U.S. by air and land from countries on the Centers for Disease Control and Prevention Zika travel advisory. Red dots represent areas with potentially high abundance, while yellow dots represent potentially low abundance areas. Shaded regions represent the approximate maximum range of Aedes aegypti.
Credits: UCAR

NASA is assisting public health officials, scientists and communities to better understand Zika virus and try to limit the spread of the disease it causes.

Scientists at the agency's Marshall Space Flight Center in Huntsville, Alabama, have partnered with the National Center for Atmospheric Research in Boulder, Colorado, and other institutions to forecast the potential spread of Zika virus in the United States.

The research team looked at key factors -- including temperature, rainfall and socioeconomic factors -- that contribute to the spread of Zika virus to understand where and when a potential outbreak may occur. Their final product, a Zika risk map, can help government agencies and health organizations better prepare for possible disease outbreaks related to the spread of the virus. The researchers described their findings in the peer-reviewed journal PLOS Current Outbreaks.

"This information can help public health officials effectively target resources to fight the disease and control its spread," said Dale Quattrochi, NASA senior research scientist at Marshall.

To determine the potential risk in the mainland United States, Morin, Quattrochi and their colleagues applied methodology from their current vector-borne disease project to identify and predict the potential spread of Zika in 50 cities across the U.S. in or near the known range of the Aedes aegypti mosquito. The team has studied this mosquito species for years, because it also transmits the dengue and chikungunya viruses.

The research team found that the Aedes aegypti mosquito, which is spreading the virus, will likely increase in number across much of the southern and eastern U.S. as the weather warms across those regions in the coming months. Summertime weather conditions are favorable for populations of the mosquito along the East Coast as far north as New York City and across the southern tier of the country as far west as Phoenix and Los Angeles.

Aedes aegypti mosquito
Credit: US Department of Health and Human Services

"The results generally confirmed many of our suspicions about the relative risk of Zika virus transmission in the U.S.," said Cory Morin, a NASA postdoctoral program fellow with Marshall’s Earth Science Office. "However, there were some surprises, such as the northern extent of Aedes aegypti potential survival during the summer months. This suggests that the mosquito can potentially survive in these locations if introduced during certain seasons, even if it hasn’t or can’t become fully established."

While the virus is not new, its presence in the Americas is. Officials in Brazil reported the first case of human infection in the region last spring. Since then, it has spread throughout South and Central America, and the Caribbean. No locally-transmitted Zika cases from mosquitoes have been reported in the continental U.S., but cases have been reported in travelers returning from areas where Zika virus is present and in U.S. territories. As Zika virus continues to spread, the number of cases among travelers visiting or returning to the continental U.S. is likely to increase.

"Knowledge is one of the most effective barriers to disease transmission and can alleviate unnecessary concern," Morin added. "By identifying the key risk factors and producing forecasts of disease transmission, we can enable citizens to take effective actions that will greatly reduce their risk of disease."

Over the past three decades, NASA has partnered with various world health organizations to use remotely sensed observations to help develop predictive models for the spread of vector-borne diseases such as malaria, plague, yellow fever, West Nile virus, Lyme disease, Rift Valley fever and onchocerciasis, or river blindness.

NASA is also part of the Office of Science and Technology Policy’s National Science and Technology Task Force on Science and Technology for Zika Vector Control, which includes partners at the Centers for Disease Control and Prevention and other federal agencies.

This research was supported by the National Institutes of Health and NASA. Funding was also provided by the Marshall Space Flight Center Innovation Fund. The National Science Foundation sponsors the National Center for Atmospheric Research.





Contacts and sources:
Molly Porter
Marshall Space Flight Center 

Winds Gusting To 43,000 Miles Per Second Created by Mysterious Binary Systems




Two black holes in nearby galaxies have been observed devouring their companion stars at a rate exceeding classically understood limits, and in the process, kicking out matter into surrounding space at astonishing speeds of around a quarter the speed of light.

The researchers, from the University of Cambridge, used data from the European Space Agency's (ESA) XMM-Newton space observatory to reveal for the first time strong winds gusting at very high speeds from two mysterious sources of x-ray radiation. The discovery, published in the journal Nature, confirms that these sources conceal a compact object pulling in matter at extraordinarily high rates.

Artist’s impression depicting a compact object – either a black hole or a neutron star – feeding on gas from a companion star in a binary system.


Credit: ESA - C. Carreau

When observing the Universe at x-ray wavelengths, the celestial sky is dominated by two types of astronomical objects: supermassive black holes, sitting at the centres of large galaxies and ferociously devouring the material around them, and binary systems, consisting of a stellar remnant - a white dwarf, neutron star or black hole - feeding on gas from a companion star.

In both cases, the gas forms a swirling disc around the compact and very dense central object. Friction in the disc causes the gas to heat up and emit light at different wavelengths, with a peak in x-rays.

But an intermediate class of objects was discovered in the 1980s and is still not well understood. Ten to a hundred times brighter than ordinary x-ray binaries, these sources are nevertheless too faint to be linked to supermassive black holes, and in any case, are usually found far from the centre of their host galaxy.

"We think these so-called 'ultra-luminous x-ray sources' are special binary systems, sucking up gas at a much higher rate than an ordinary x-ray binary," said Dr Ciro Pinto from Cambridge's Institute of Astronomy, the paper's lead author. "Some of these sources host highly magnetised neutron stars, while others might conceal the long-sought-after intermediate-mass black holes, which have masses around one thousand times the mass of the Sun. But in the majority of cases, the reason for their extreme behaviour is still unclear."

Pinto and his colleagues collected several days' worth of observations of three ultra-luminous x-ray sources, all in nearby galaxies less than 22 million light-years from the Milky Way. The data was obtained over several years with the Reflection Grating Spectrometer on XMM-Newton, which allowed the researchers to identify subtle features in the spectrum of the x-rays from the sources.

In all three sources, the scientists were able to identify x-ray emission from gas in the outer portions of the disc surrounding the central compact object, slowly flowing towards it.

But two of the three sources - known as NGC 1313 X-1 and NGC 5408 X-1 - also show clear signs of x-rays being absorbed by gas that is streaming away from the central source at 70,000 kilometres per second - almost a quarter of the speed of light.
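That speed is easy to check against both the speed of light and the figure in the headline; only the 70,000 km/s value is taken from the text.

```python
C_KM_S = 299_792.458            # speed of light in km/s
KM_PER_MILE = 1.609344

wind_km_s = 70_000              # outflow speed from the X-ray absorption lines
fraction_of_c = wind_km_s / C_KM_S      # about 0.23, i.e. nearly a quarter of c
wind_mi_s = wind_km_s / KM_PER_MILE     # about 43,500 miles per second

print(f"Wind speed: {fraction_of_c:.2f} c, or {wind_mi_s:,.0f} miles per second")
```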


The irregular galaxy NGC 5408 viewed by the NASA/ESA Hubble Space Telescope. The galaxy is located some 16 million light-years away and hosts a very bright source of X-rays, NGC 5408 X-1.
NGC 5408 X-1 is an ultra-luminous X-ray source – a binary system consisting of a stellar remnant that is feeding on gas from a companion star at an especially high rate.

Scientists using ESA's XMM-Newton have discovered gas streaming away at a quarter of the speed of light from NGC 5408 X-1 and another bright X-ray binary, NGC 1313 X-1, confirming that these sources conceal a compact object accreting matter at extraordinarily high rates.

Credit: ESA/Hubble & NASA. Acknowledgement: J. Schmidt (Geckzilla)

"This is the first time we've seen winds streaming away from ultra-luminous x-ray sources," said Pinto. "And the very high speed of these outflows is telling us something about the nature of the compact objects in these sources, which are frantically devouring matter."

While the hot gas is pulled inwards by the central object's gravity, it also shines brightly, and the pressure exerted by the radiation pushes it outwards. This is a balancing act: the greater the mass, the faster it draws in the surrounding gas; but this also causes the gas to heat up faster, emitting more light and increasing the pressure that blows the gas away.

There is a theoretical limit to how much matter can be pulled in by an object of a given mass, known as the Eddington limit. The limit was first calculated for stars by astronomer Arthur Eddington, but it can also be applied to compact objects like black holes and neutron stars.

Eddington's calculation refers to an ideal case in which both the matter being accreted onto the central object and the radiation being emitted by it do so equally in all directions.
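In that ideal spherical case the limit has a closed form, L_Edd = 4 pi G M m_p c / sigma_T, which scales linearly with the accretor's mass. A sketch using standard physical constants; the 10-solar-mass example is illustrative, chosen to match the stellar-remnant masses discussed in the article.

```python
import math

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
M_PROTON = 1.6726e-27  # proton mass, kg
C = 2.9979e8           # speed of light, m/s
SIGMA_T = 6.6524e-29   # Thomson scattering cross-section, m^2
M_SUN = 1.989e30       # solar mass, kg

def eddington_luminosity(mass_kg):
    """Luminosity at which outward radiation pressure on electrons
    balances inward gravity on protons, for spherical accretion."""
    return 4 * math.pi * G * mass_kg * M_PROTON * C / SIGMA_T

# An illustrative 10-solar-mass compact object:
l_edd = eddington_luminosity(10 * M_SUN)
print(f"Eddington luminosity: {l_edd:.1e} W")   # about 1.3e32 W
```

Ultra-luminous x-ray sources are interesting precisely because their apparent luminosities exceed this value for stellar-mass accretors.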

But the sources studied by Pinto and his collaborators are potentially being fed through a disc which has been puffed up due to internal pressures arising from the incredible rates of material passing through it. These thick discs can naturally exceed the Eddington limit and can even trap the radiation in a cone, making these sources appear brighter when we look straight at them. As the thick disc moves material further from the black hole's gravitational grasp it also gives rise to very high-speed winds like the ones observed by the Cambridge researchers.

"By observing x-ray sources that are radiating beyond the Eddington limit, it is possible to study their accretion process in great detail, investigating by how much the limit can be exceeded and what exactly triggers the outflow of such powerful winds," said Norbert Schartel, ESA XMM-Newton Project Scientist.

The nature of the compact objects hosted at the core of the two sources observed in this study is, however, still uncertain.

Based on the x-ray brightness, the scientists suspect that these mighty winds are driven from accretion flows onto either neutron stars or black holes, the latter with masses of several to a few dozen times that of the Sun.

To investigate further, the team is still scrutinising the data archive of XMM-Newton, searching for more sources of this type, and are also planning future observations, in x-rays as well as at optical and radio wavelengths.

"With a broader sample of sources and multi-wavelength observations, we hope to finally uncover the physical nature of these powerful, peculiar objects," said Pinto.




Contacts and sources:
Sarah Collins
University of Cambridge

Citation: C. Pinto et al. ‘Resolved atomic lines reveal outflows in two ultraluminous X-ray sources’ Nature (2016). DOI: 10.1038/nature17417.

New Equation Calculates Life in the Universe and Our Uniqueness Among Ten Billion Trillion Stars

Are humans unique and alone in the vast universe? This question -- summed up in the famous Drake equation -- has for a half-century been one of the most intractable and uncertain in science.
But a new paper shows that the recent discoveries of exoplanets combined with a broader approach to the question makes it possible to assign a new empirically valid probability to whether any other advanced technological civilizations have ever existed.
In 1961, astrophysicist Frank Drake developed an equation to estimate the number of advanced civilizations likely to exist in the Milky Way galaxy. The Drake equation (top row) has proven to be a durable framework for research, and space technology has advanced scientists' knowledge of several variables. But it is impossible to do anything more than guess at variables such as L, the probable longevity of other advanced civilizations.
In new research, Adam Frank and Woodruff Sullivan offer a new equation (bottom row) to address a slightly different question: What is the number of advanced civilizations likely to have developed over the history of the observable universe? Frank and Sullivan's equation draws on Drake's, but eliminates the need for L.
Credit:  University of Rochester

And it shows that unless the odds of advanced life evolving on a habitable planet are astonishingly low, then humankind is not the universe's first technological, or advanced, civilization.
The paper, to be published in Astrobiology, also shows for the first time just what "pessimism" or "optimism" mean when it comes to estimating the likelihood of advanced extraterrestrial life.
"The question of whether advanced civilizations exist elsewhere in the universe has always been vexed with three large uncertainties in the Drake equation," said Adam Frank, professor of physics and astronomy at the University of Rochester and co-author of the paper. "We've known for a long time approximately how many stars exist. We didn't know how many of those stars had planets that could potentially harbor life, how often life might evolve and lead to intelligent beings, and how long any civilizations might last before becoming extinct."
"Thanks to NASA's Kepler satellite and other searches, we now know that roughly one-fifth of stars have planets in 'habitable zones,' where temperatures could support life as we know it. So one of the three big uncertainties has now been constrained."
Frank said that the third big question--how long civilizations might survive--is still completely unknown. "The fact that humans have had rudimentary technology for roughly ten thousand years doesn't really tell us if other societies would last that long or perhaps much longer," he explained.
But Frank and his coauthor, Woodruff Sullivan of the astronomy department and astrobiology program at the University of Washington, found they could eliminate that term altogether by simply expanding the question.
"Rather than asking how many civilizations may exist now, we ask, 'Are we the only technological species that has ever arisen?'" said Sullivan. "This shifted focus eliminates the uncertainty of the civilization lifetime question and allows us to address what we call the 'cosmic archaeological question'--how often in the history of the universe has life evolved to an advanced state?"
That still leaves huge uncertainties in calculating the probability for advanced life to evolve on habitable planets. It's here that Frank and Sullivan flip the question around. Rather than guessing at the odds of advanced life developing, they calculate the odds against it occurring in order for humanity to be the only advanced civilization in the entire history of the observable universe. With that, Frank and Sullivan then calculated the line between a Universe where humanity has been the sole experiment in civilization and one where others have come before us.
"Of course, we have no idea how likely it is that an intelligent technological species will evolve on a given habitable planet," says Frank. "But using our method we can tell exactly how low that probability would have to be for us to be the ONLY civilization the Universe has produced. We call that the pessimism line. If the actual probability is greater than the pessimism line, then a technological species and civilization has likely happened before."
Using this approach, Frank and Sullivan calculate how unlikely advanced life must be if there has never been another example among the universe's ten billion trillion stars, or even among our own Milky Way galaxy's hundred billion.
The result? By applying the new exoplanet data to the universe's 2 x 10 to the 22nd power stars, Frank and Sullivan find that human civilization is likely to be unique in the cosmos only if the odds of a civilization developing on a habitable planet are less than about one in 10 billion trillion, or one part in 10 to the 22nd power.
"One in 10 billion trillion is incredibly small," says Frank. "To me, this implies that other intelligent, technology producing species very likely have evolved before us. Think of it this way. Before our result you'd be considered a pessimist if you imagined the probability of evolving a civilization on a habitable planet were, say, one in a trillion. But even that guess, one chance in a trillion, implies that what has happened here on Earth with humanity has in fact happened about 10 billion other times over cosmic history!"
For smaller volumes the numbers are less extreme. For example, another technological species likely has evolved on a habitable planet in our own Milky Way galaxy if the odds against it are better than one chance in 60 billion.
But if those numbers seem to give ammunition to the "optimists" about the existence of alien civilizations, Sullivan points out that the full Drake equation--which calculates the odds that other civilizations are around today -- may give solace to the pessimists.
"The universe is more than 13 billion years old," said Sullivan. "That means that even if there have been a thousand civilizations in our own galaxy, if they live only as long as we have been around -- roughly ten thousand years -- then all of them are likely already extinct. And others won't evolve until we are long gone. For us to have much chance of success in finding another 'contemporary' active technological civilization, on average they must last much longer than our present lifetime."
"Given the vast distances between stars and the fixed speed of light we might never really be able to have a conversation with another civilization anyway," said Frank. "If they were 20,000 light years away then every exchange would take 40,000 years to go back and forth."
But, as Frank and Sullivan point out, even if there aren't other civilizations in our galaxy to communicate with now, the new result still has a profound scientific and philosophical importance. "From a fundamental perspective the question is 'has it ever happened anywhere before?'" said Frank. "Our result is the first time anyone has been able to set any empirical answer for that question, and it is astonishingly likely that we are not the only time and place that an advanced civilization has evolved."
According to Frank and Sullivan, their result has a practical application as well. As humanity faces its crisis in sustainability and climate change, we can wonder if other civilization-building species on other planets have gone through a similar bottleneck and made it to the other side. As Frank puts it, "We don't even know if it's possible to have a high-tech civilization that lasts more than a few centuries."
With Frank and Sullivan's new result, scientists can begin using everything they know about planets and climate to model the interactions of an energy-intensive species with its home world, knowing that a large sample of such cases has already existed in the cosmos. "Our results imply that our evolution has not been unique and has probably happened many times before. The other cases are likely to include many energy-intensive civilizations dealing with their feedbacks onto their planets as their civilizations grow. That means we can begin exploring the problem using simulations to get a sense of what leads to long-lived civilizations and what doesn't."
Frank and Sullivan's argument hinges on the recent discovery of how many planets exist and how many of those lie in what scientists call the "habitable zone" -- the region around a star where liquid water, and therefore life, could exist on a planet. This allows Frank and Sullivan to define a number they call Nast. Nast is the product of N*, the total number of stars; fp, the fraction of those stars that form planets; and np, the average number of those planets in the habitable zones of their stars.
First Earth sized planet found in a habitable zone. 
Credit: NASA 
They then set out what they call the "archaeological form" of the Drake equation, which defines A as the "number of technological species that have ever formed over the history of the observable Universe."
Their equation, A = Nast*fbt, describes A as the product of Nast, the number of habitable planets in a given volume of the Universe, and fbt, the likelihood of a technological species arising on one of those planets. The volume considered could be, for example, the entire Universe, or just our Galaxy.
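The bookkeeping above can be sketched in a few lines. The star count and planet fractions below are illustrative round numbers chosen for demonstration, not figures taken from Frank and Sullivan's paper.

```python
# Illustrative sketch of the "archaeological" Drake equation, A = Nast * fbt.
# All input values are round-number assumptions, not the paper's figures.

def n_ast(n_stars, f_planets, n_habitable):
    """Nast: number of habitable-zone planets = N* x fp x np."""
    return n_stars * f_planets * n_habitable

# Observable universe: roughly 2e22 stars; Kepler-era surveys suggest most
# stars host planets, with perhaps one in five having a habitable-zone world.
Nast_universe = n_ast(2e22, 1.0, 0.2)

# The "pessimism line": for humanity to be the only technological species
# ever (A < 1), fbt must be smaller than 1 / Nast -- a number of the same
# order as the "one in 10 billion trillion" (1e22) quoted by Frank.
fbt_threshold = 1 / Nast_universe
print(f"fbt must fall below ~{fbt_threshold:.1e} for humanity to be unique")
```

With these inputs the threshold comes out near 2.5e-22, which is why even wildly "pessimistic" guesses like one in a trillion still imply billions of prior civilizations.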



Contacts and sources:
Leonor Sierra
University of Rochester

The Shocking Origins of the Oldest Crystals on Earth



The tiny crystals probably formed in huge impact craters not long after Earth formed, some 4 billion years ago.

New research suggests that the very oldest pieces of rock on Earth -- zircon crystals -- are likely to have formed in the craters left by violent asteroid impacts that peppered our nascent planet, rather than via plate tectonics as was previously believed.

Recently, geologists suggested these grains may have formed in huge impact craters produced as chunks of rock from space, up to several kilometers in diameter, slammed into a young Earth.

Scanning electron microscope picture of a zircon crystal from the Sudbury crater.

Credit: Gavin Kenny, Trinity College Dublin.

Rocks that formed over the course of Earth's history allow geologists to infer things such as when water first appeared on the planet, how our climate has varied, and even where life came from. However, we can only go back in time so far, as the only material we have from the very early Earth comes in the form of tiny, naturally occurring zircon crystals.

Naturally then, the origin of these crystals, which are approximately the width of a human hair and more than four billion years old (the Earth being just over four and a half billion years old), has become a matter of major debate. Fifteen years ago these crystals first made headlines when they revealed the presence of water on the surface of the Earth (thought to be a key ingredient for the origin of life) when they were forming.

Shatter cones (pyramid-like structures) formed from the shock wave of the impact, and can be seen as that wave migrated through the rock from the bottom up.

Credit: Gavin Kenny, Trinity College Dublin

Ten years ago, a team of researchers in the US argued that the ancient zircon crystals probably formed when tectonic plates moving around on the Earth's surface collided with each other, in a similar fashion to the disruption taking place in the Andes Mountains today, where the ocean floor under the Pacific Ocean is plunging under South America. However, current evidence suggests that plate tectonics -- as we know it today -- was not occurring on the early Earth. So the question remained: where did the crystals come from?

To test this idea, researchers from Trinity College Dublin decided to study a much younger impact crater to see whether zircon crystals similar to the very old ones could have formed in such violent settings.

In the summer of 2014, with the support of the Irish Research Council (IRC) and Science Foundation Ireland (SFI), the team collected thousands of zircons from the Sudbury impact crater in Ontario, Canada -- the best preserved large impact crater on Earth and the planet's second oldest confirmed crater, at almost two billion years old.

The Sudbury crater in Canada was created by a big rock from space crashing into the Earth 1.87 billion years ago. 

Credit: Astronauts on Space Shuttle Challenger on flight 41-G

After analyzing these crystals at the Swedish Museum of Natural History in Stockholm, they discovered that the crystal compositions were indistinguishable from the ancient set. Gavin Kenny, a PhD researcher in Trinity's School of Natural Sciences, is first author of the article explaining these findings, which has just been published in the leading international journal Geology. He said: "What we found was quite surprising. Many people thought the very ancient zircon crystals couldn't have formed in impact craters, but we now know they could have.

There's a lot we still don't fully understand about these little guys but it looks like we may now be able to form a more coherent story of Earth's early years -- one which fits with the idea that our planet suffered far more frequent bombardment from asteroids early on than it has in relatively recent times."

Gavin Kenny recently traveled to the annual Lunar and Planetary Science Conference (LPSC) in Houston, Texas, to present these findings to the space science community.

He added: "There was a lot of enthusiasm for our findings. Just two years ago a group had studied the likely timing of impacts on the early Earth and they suggested that these impacts might explain the ages of the ancient zircons. They were understandably very happy to see that the chemistry of the zircons from the Canadian impact crater matched the oldest crystals known to man."



Contacts and sources:
Thomas Deane
Trinity College Dublin

Citation: Kenny GG, Whitehouse MJ, Kamber BS. Differentiated impact melt sheets may be a potential source of Hadean detrital zircon. Geology. 2016; DOI: 10.1130/G37898.1 http://dx.doi.org/10.1130/G37898.1


What If ET Called Earth but No One Picked Up

As scientists step up their search for other life in the universe, two astrophysicists are proposing a way to make sure we don’t miss the signal if extraterrestrial observers try to contact us first.

Researchers René Heller and Ralph Pudritz say the best chance for us finding a signal from beyond is to presume that extraterrestrial observers are using the same methods to search for us that we are using to search for life beyond Earth.


Credit: McMaster University

Here on Earth, space researchers are focusing most of their search efforts on planets and moons that are too far away to see directly. Instead, they study them by tracking their shadows as they pass in front of their own host stars.

Measuring the dimming of starlight as a planet crosses the face of its star during orbit, scientists can collect a wealth of information, even without ever seeing those worlds directly.

Using methods that allow them to estimate the average stellar illumination and temperatures on their surfaces, scientists have already identified dozens of locations where life could potentially exist.

In a paper to be published in the journal Astrobiology, and available now online, Heller and Pudritz turn the telescope around to ask: what if extraterrestrial observers discover the Earth as it transits the sun?

If such observers are using the same search methods that scientists are using on Earth, the researchers propose that humanity should turn its collective ear to Earth’s “transit zone”, the thin slice of space from which our planet’s passage in front of the sun can be detected.

Credit: McMasterUTV

“It’s impossible to predict whether extraterrestrials use the same observational techniques as we do,” says Heller. “But they will have to deal with the same physical principles as we do, and Earth's solar transits are an obvious method to detect us.”
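The geometry behind the transit zone is simple: an outside observer sees Earth cross the Sun's disc only from a thin band around the ecliptic whose half-width is set by the Sun's angular radius at Earth's distance. The following is a simplified back-of-the-envelope estimate, not the more careful treatment in Heller and Pudritz's paper.

```python
import math

# Half-width of Earth's "transit zone": the band of sky directions from
# which Earth can be seen silhouetted against the Sun. In the simplest
# geometry this is the Sun's angular radius as seen from 1 au.

R_SUN = 6.957e8     # solar radius, metres
AU = 1.496e11       # mean Earth-Sun distance, metres

half_angle = math.degrees(math.asin(R_SUN / AU))
print(f"Transit-zone half-width ~ {half_angle:.3f} degrees")
```

The answer is roughly a quarter of a degree on either side of the ecliptic, which is why the zone is described as a thin slice of space despite containing on the order of 100,000 candidate stars.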

The transit zone is rich in host stars for planetary systems, offering approximately 100,000 potential targets, each potentially orbited by habitable planets and moons, the scientists say – and that’s just the number we can see with today’s radio telescope technologies.

“If any of these planets host intelligent observers, they could have identified Earth as a habitable, even as a living world long ago and we could be receiving their broadcasts today,” write Heller and Pudritz.

Heller is a post-doctoral fellow who, while at McMaster, worked with Pudritz, a professor of Physics and Astronomy. Heller is now at the Institute for Astrophysics in Göttingen, Germany.

The question of contact with others beyond Earth is hardly hypothetical, as several projects are under way, both to send signals from Earth and to search for signals that have been sent directly or have “leaked” around obstacles, possibly travelling for thousands of years.

Heller and Pudritz propose that the Breakthrough Listen Initiative, part of the most comprehensive search for extraterrestrial life ever conducted, can maximize its chances of success by concentrating its search on Earth’s transit zone.


Contacts and sources:
Michelle Donovan
McMaster University

Tuesday, April 26, 2016

Mars’ Surface Revealed in Unprecedented Detail

The surface of Mars – including the location of Beagle-2 – has been shown in unprecedented detail by University College London (UCL) scientists using a revolutionary image stacking and matching technique.



Exciting pictures of the Beagle-2 lander, the ancient lakebeds discovered by NASA’s Curiosity rover, NASA’s MER-A rover tracks and Home Plate’s rocks have been released by the UCL researchers who stacked and matched images taken from orbit, to reveal objects at a resolution up to five times greater than previously achieved.

A paper describing the technique, called Super-Resolution Restoration (SRR), was published in Planetary and Space Science in February but has only recently been used to focus on specific objects on Mars. The technique could be used to search for other artefacts from past failed landings as well as to identify safe landing locations for future rover missions. It will also allow scientists to explore vastly more terrain than is possible with a single rover.

Co-author Professor Jan-Peter Muller from the UCL Mullard Space Science Laboratory, said: “We now have the equivalent of drone-eye vision anywhere on the surface of Mars where there are enough clear repeat pictures. It allows us to see objects in much sharper focus from orbit than ever before and the picture quality is comparable to that obtained from landers.

“As more pictures are collected, we will see increasing evidence of the kind we have only seen from the three successful rover missions to date. This will be a game-changer and the start of a new era in planetary exploration.”

Even with the largest telescopes that can be launched into orbit, the level of detail that can be seen on the surface of planets is limited. This is due to constraints on mass (mainly the telescope optics), on the communication bandwidth needed to deliver higher-resolution images to Earth, and on interference from planetary atmospheres. For cameras orbiting Earth and Mars, the resolution limit today is around 25cm (about 10 inches).

By stacking and matching pictures of the same area taken from different angles, Super-Resolution Restoration (SRR) allows objects as small as 5cm (about 2 inches) to be seen from the same 25cm telescope. For Mars, where the surface usually takes decades to millions of years to change, these images can be captured over a period of ten years and still achieve a high resolution. For Earth, the atmosphere is much more turbulent so images for each stack have to be obtained in a matter of seconds.
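The core idea, that several offset low-resolution views of the same scene carry sub-pixel information, can be illustrated with a crude "shift-and-add" co-addition. This is only a toy sketch of the principle; the published SRR method uses far more sophisticated registration and restoration.

```python
import numpy as np

# Toy "shift-and-add" illustration of super-resolution: low-res frames of
# the same scene, each offset by a known sub-pixel amount, are placed onto
# a finer grid and averaged. (Not the published SRR algorithm.)

def shift_and_add(images, shifts, factor):
    """Co-add low-res images onto a grid `factor` times finer.
    `shifts` are known sub-pixel offsets in low-res pixel units."""
    h, w = images[0].shape
    hi = np.zeros((h * factor, w * factor))
    for img, (dy, dx) in zip(images, shifts):
        oy = int(round(dy * factor))                  # offset on fine grid
        ox = int(round(dx * factor))
        up = np.kron(img, np.ones((factor, factor)))  # naive upsampling
        hi += np.roll(np.roll(up, oy, axis=0), ox, axis=1)
    return hi / len(images)

# Four 25cm/pixel frames offset by quarter-pixel steps -> a 4x finer grid,
# analogous to combining 25cm orbital images toward a ~6cm product.
frames = [np.random.rand(8, 8) for _ in range(4)]
shifts = [(0, 0), (0, 0.25), (0.25, 0), (0.25, 0.25)]
sr = shift_and_add(frames, shifts, factor=4)
print(sr.shape)   # (32, 32)
```

The sketch also shows why the approach suits Mars: the scene must stay essentially unchanged between frames, which Mars obliges over years while Earth's turbulent atmosphere forces stacks to be gathered in seconds.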

The UCL team applied SRR to stacks of between four and eight 25cm images of the Martian surface taken using the NASA HiRISE camera to achieve the 5cm target resolution. These included some of the latest HiRISE images of the Beagle-2 landing area that were kindly provided by Professor John Bridges from the University of Leicester.

Zoom-up of two areas from the original 25cm HiRISE image (upper row, reference PSP_001513-1655) of the MER-A Spirit Home Plate region, and the 5cm Super-Resolution Restoration from eight input HiRISE images (lower row), showing a rock field (left column) and the MER-A rover tracks (right column).
Credit: UCL

“Using novel machine vision methods, information from lower resolution images can be extracted to estimate the best possible true scene. This technique has huge potential to improve our knowledge of a planet’s surface from multiple remotely sensed images. In the future, we will be able to recreate rover-scale images anywhere on the surface of Mars and other planets from repeat image stacks” said Mr Yu Tao, Research Associate at UCL and lead author of the paper.

The team’s ‘super-resolution’ zoomed-in image of the Beagle-2 location proposed by Professor Mark Sims and colleagues at the University of Leicester provides strong supporting evidence that this is the site of the lander. The scientists plan on exploring other areas of Mars using the technique to see what else they find.


Contacts and sources:
Rebecca Caygill
University College London

Moon Orbiting the Dwarf Planet Makemake Discovered

Peering to the outskirts of our solar system, NASA’s Hubble Space Telescope has spotted a small, dark moon orbiting Makemake, the second brightest icy dwarf planet — after Pluto — in the Kuiper Belt.

The moon — provisionally designated S/2015 (136472) 1 and nicknamed MK 2 — is more than 1,300 times fainter than Makemake. MK 2 was seen approximately 13,000 miles from the dwarf planet, and its diameter is estimated at 100 miles. Makemake is 870 miles wide. The dwarf planet, discovered in 2005, is named for a creation deity of the Rapa Nui people of Easter Island.

Astronomers using the Hubble Space Telescope discovered a moon orbiting dwarf planet Makemake -- the third largest known object past the orbit of Neptune, about two thirds the size of Pluto. Further observations of this moon may allow astronomers to calculate Makemake's mass, which will give them a better idea of its density and thus its bulk composition. The Hubble Space Telescope has been instrumental in studying our outer solar system; it also discovered four of the five moons orbiting Pluto.

Credits: NASA/Goddard/Katrina Jackson 

The Kuiper Belt is a vast reservoir of leftover frozen material from the construction of our solar system 4.5 billion years ago and home to several dwarf planets. Some of these worlds have known satellites, but this is the first discovery of a companion object to Makemake. Makemake is one of five dwarf planets recognized by the International Astronomical Union.

The observations were made in April 2015 with Hubble’s Wide Field Camera 3. Hubble’s unique ability to see faint objects near bright ones, together with its sharp resolution, allowed astronomers to pluck out the moon from Makemake’s glare. The discovery was announced today in a Minor Planet Electronic Circular.

This Hubble image reveals the first moon ever discovered around the dwarf planet Makemake. The tiny satellite, located just above Makemake in this image, is barely visible because it is almost lost in the glare of the very bright dwarf planet. Hubble’s sharp-eyed WFC3 made the observation in April 2015.

Credits: NASA, ESA, and A. Parker and M. Buie (SwRI)

The observing team used the same Hubble technique to observe the moon as they did for finding the small satellites of Pluto in 2005, 2011, and 2012. Several previous searches around Makemake had turned up empty. “Our preliminary estimates show that the moon’s orbit seems to be edge-on, and that means that often when you look at the system you are going to miss the moon because it gets lost in the bright glare of Makemake,” said Alex Parker of Southwest Research Institute, Boulder, Colorado, who led the image analysis for the observations.

A moon’s discovery can provide valuable information on the dwarf-planet system. By measuring the moon’s orbit, astronomers can calculate a mass for the system and gain insight into its evolution.
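The mass calculation follows directly from Kepler's third law. As a rough illustration using the article's preliminary figures (roughly 13,000 miles separation and a 12-day circular orbit, both still uncertain):

```python
import math

# Back-of-the-envelope system mass from a moon's orbit via Kepler's third
# law, M = 4*pi^2 * a^3 / (G * T^2). Inputs are the article's preliminary
# estimates for MK 2 and are illustrative only.

G = 6.674e-11            # gravitational constant, m^3 kg^-1 s^-2
a = 13_000 * 1609.34     # orbital radius: ~13,000 miles in metres
T = 12 * 86_400          # orbital period: ~12 days in seconds

mass = 4 * math.pi**2 * a**3 / (G * T**2)
print(f"System mass ~ {mass:.1e} kg")
```

These rough inputs give a mass of a few times 10^21 kg, a Pluto-like scale, which is the kind of "transformative measurement" Parker describes: once the orbit is pinned down, the density and bulk composition follow.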

Uncovering the moon also reinforces the idea that most dwarf planets have satellites.

“Makemake is in the class of rare Pluto-like objects, so finding a companion is important,” Parker said. “The discovery of this moon has given us an opportunity to study Makemake in far greater detail than we ever would have been able to without the companion.”

Finding this moon only increases the parallels between Pluto and Makemake. Both objects are already known to be covered in frozen methane. As was done with Pluto, further study of the satellite will easily reveal the density of Makemake, a key result that will indicate if the bulk compositions of Pluto and Makemake are also similar. “This new discovery opens a new chapter in comparative planetology in the outer solar system,” said team leader Marc Buie of the Southwest Research Institute, Boulder, Colorado.

The researchers will need more Hubble observations to make accurate measurements to determine if the moon’s orbit is elliptical or circular. Preliminary estimates indicate that if the moon is in a circular orbit, it completes a circuit around Makemake in 12 days or longer.

Determining the shape of the moon’s orbit will help settle the question of its origin. A tight circular orbit means that MK 2 is probably the product of a collision between Makemake and another Kuiper Belt Object. If the moon is in a wide, elongated orbit, it is more likely to be a captured object from the Kuiper Belt. Either event would have likely occurred several billion years ago, when the solar system was young.

The discovery may have solved one mystery about Makemake. Previous infrared studies of the dwarf planet revealed that while Makemake’s surface is almost entirely bright and very cold, some areas appear warmer than other areas. Astronomers had suggested that this discrepancy may be due to the sun warming discrete dark patches on Makemake’s surface. However, unless Makemake is in a special orientation, these dark patches should make the dwarf planet’s brightness vary substantially as it rotates. But this amount of variability has never been observed.

These previous infrared data did not have sufficient resolution to separate Makemake from MK 2. The team’s reanalysis, based on the new Hubble observations, suggests that much of the warmer surface detected previously in infrared light may, in reality, simply have been the dark surface of the companion MK 2.

This artist's concept shows the distant dwarf planet Makemake and its newly discovered moon. Makemake and its moon, nicknamed MK 2, are more than 50 times farther away than Earth is from the sun.

Credits: NASA, ESA, and A. Parker (Southwest Research Institute)

There are several possibilities that could explain why the moon would have a charcoal-black surface, even though it is orbiting a dwarf planet that is as bright as fresh snow. One idea is that, unlike larger objects such as Makemake, MK 2 is small enough that it cannot gravitationally hold onto a bright, icy crust, which sublimates, changing from solid to gas, under sunlight. This would make the moon similar to comets and other Kuiper Belt Objects, many of which are covered with very dark material.

When Pluto’s moon Charon was discovered in 1978, astronomers quickly calculated the mass of the system. Pluto’s mass was hundreds of times smaller than the mass originally estimated when it was found in 1930. With Charon’s discovery, astronomers suddenly knew something was fundamentally different about Pluto. “That’s the kind of transformative measurement that having a satellite can enable,” Parker said.

The Hubble Space Telescope is a project of international cooperation between NASA and the European Space Agency. NASA's Goddard Space Flight Center in Greenbelt, Maryland, manages the telescope. The Space Telescope Science Institute (STScI) in Baltimore, Maryland, conducts Hubble science operations. STScI is operated for NASA by the Association of Universities for Research in Astronomy in Washington, D.C.



Contacts and sources:
Felicia Chou, NASA Headquarters
Donna Weaver / Ray Villard, Space Telescope Science Institute
Alex Parker, Southwest Research Institute 

Nearby Massive Star Explosion Equaled Detonation of 100 Million Suns 30 Million Years Ago

Analysis of the exploding star's light curve and color spectrum reveals the spectacular demise of one of the closest supernovae to Earth in recent years; its parent star was so big that its radius was 200 times that of our sun.

Watch the explosion of a star in the M74 galaxy as it goes supernova, visible by changes in its brightness over time as observed by a robotic telescope. SN 2013ej, as it's been named, was a massive star in the nearby galaxy M74. It exploded 30 million years ago. While the explosion was near to Earth by cosmic standards, its light wasn't visible in our night sky until July 23, 2013.
Credit: SMU

A giant star that exploded 30 million years ago in a galaxy near Earth had a radius prior to going supernova that was 200 times larger than our sun, according to astrophysicists at Southern Methodist University, Dallas.

The sudden blast hurled material outward from the star at a speed of 10,000 kilometers a second. That's equivalent to 36 million kilometers an hour or 22.4 million miles an hour, said SMU physicist Govinda Dhungana, lead author on the new analysis.

The comprehensive analysis of the exploding star's light curve and color spectrum has revealed new information about the existence and sudden death of supernovae in general, many aspects of which have long baffled scientists.

"There are so many characteristics we can derive from the early data," Dhungana said. "This was a big massive star, burning tremendous fuel. When it finally reached a point where its core couldn't support the gravitational pull inward, it suddenly collapsed and then exploded."

A giant star that exploded 30 million years ago was one of the closest to Earth in recent years to go supernova, say astrophysicists at Southern Methodist University, Dallas. It was visible as a point of light in the night sky. This image of Supernova 2013ej shows the star just prior to explosion.

Credit:  Govinda Dhungana, SMU

The massive explosion was one of the closest to Earth in recent years, visible as a point of light in the night sky starting July 24, 2013, said Robert Kehoe, SMU physics professor, who leads SMU's astrophysics team.

The explosion, termed Supernova 2013ej by astronomers, occurred in a galaxy near our Milky Way and was equal in energy output to the simultaneous detonation of 100 million suns.

The star was one of billions in the spiral galaxy M74 in the constellation Pisces.

Considered close by supernova standards, SN 2013ej was in fact so far away that light from the explosion took 30 million years to reach Earth. At that distance, even such a large explosion was visible only through telescopes.

Dhungana and colleagues were able to explore SN 2013ej via a rare collection of extensive data from seven ground-based telescopes and NASA's Swift satellite. The data span a time period prior to appearance of the supernova in July 2013 until more than 450 days after.

The team measured the supernova's evolving temperature, its mass, its radius, the abundance of a variety of chemical elements in its explosion and debris and its distance from Earth. They also estimated the time of the shock breakout, the bright flash from the shockwave of the explosion.

A giant star that exploded 30 million years ago was one of the closest to Earth in recent years to go supernova, say astrophysicists at Southern Methodist University, Dallas. It was visible as a point of light in the night sky. This image of Supernova 2013ej shows the star at peak explosion

Credit:  Govinda Dhungana, SMU

The star's original mass was about 15 times that of our sun, Dhungana said. Its temperature was a hot 12,000 Kelvin (approximately 22,000 degrees Fahrenheit) on the tenth day after the explosion, steadily cooling until it reached 4,500 Kelvin after 50 days. The sun's surface is 5,800 Kelvin, while the Earth's core is estimated to be about 6,000 Kelvin.

Shedding new light on supernovae, mysterious objects of our universe

Supernovae occur throughout the universe, but they are not fully understood. Scientists don't directly observe the explosions but instead detect changes in emerging light as material is hurled from the exploding star in the seconds and days after the blast.

Telescopes such as SMU's robotic ROTSE-IIIb at McDonald Observatory in Texas watch the sky and pick up the supernova as a brightening point of light. Others, such as the Hobby-Eberly Telescope, also at McDonald, observe a spectrum.

SN 2013ej is M74's third supernova in just 10 years. That is quite frequent compared to our Milky Way, which has had just one observed supernova over the past 400 years. NASA estimates that the M74 galaxy consists of 100 billion stars.

M74 is one of only a few dozen galaxies first cataloged by the astronomer Charles Messier in the late 1700s. It has a spiral structure -- also the Milky Way's apparent shape -- indicating it is still undergoing star formation, as opposed to being an elliptical galaxy in which new stars no longer form.

It's possible that planets were orbiting SN 2013ej's progenitor star prior to it going supernova, in which case those objects would have been obliterated by the blast, Kehoe said.

"If you were nearby, you wouldn't know there was a problem beforehand, because at the surface you can't see the core heating up and collapsing," Kehoe said. "Then suddenly it explodes -- and you're toast."

Distances to nearby galaxies help determine cosmic distance ladder

Scientists remain unsure whether supernovae leave behind a black hole or a neutron star, effectively a giant atomic nucleus the size of a city.

"The core collapse and how it produces the explosion is particularly tricky," Kehoe said. "Part of what makes SN 2013ej so interesting is that astronomers are able to compare a variety of models to better understand what is happening. Using some of this information, we are also able to calculate the distance to this object. This allows us a new type of object with which to study the larger universe, and maybe someday dark energy."

Being 30 million light years away, SN 2013ej was a relatively nearby extragalactic event, according to Jozsef Vinko, astrophysicist at Konkoly Observatory and University of Szeged in Hungary.

"Distances to nearby galaxies play a significant role in establishing the so-called cosmic distance ladder, where each rung is a galaxy at a known distance."

Vinko provided important data from telescopes at Konkoly Observatory and Hungary's Baja Observatory and carried out distance measurement analysis on SN 2013ej.

"Nearby supernovae are especially important," Vinko said. "Paradoxically, we know the distances to the nearest galaxies less certainly than to the more distant ones. In this particular case we were able to combine the extensive datasets of SN 2013ej with those of another supernova, SN 2002ap, both of which occurred in M74, to suppress the uncertainty of their common distance derived from those data."

Supernova spectrum analysis is like taking a core sample

While stars appear to be static objects that exist indefinitely, in reality they are burning balls of gas, fueled by the fusion of lighter elements into heavier ones. As they exhaust lighter elements, their cores contract and heat up to burn heavier ones. Over time, they fuse the various chemical elements of the periodic table, proceeding from lightest to heaviest: first hydrogen into helium, then helium into carbon and oxygen. Those elements then fuel the fusion of progressively heavier elements such as sulfur, argon, chlorine and potassium.

"Studying the spectrum of a supernova over time is like taking a core sample," Kehoe said. "The calcium in our bones, for example, was cooked in a star. A star's nuclear fusion is always forging heavier and heavier elements. At the beginning of the universe there was only hydrogen and helium. The other elements were made in stars and in supernovae. The last product to get created is iron, which is an element that is so heavy it can't be burned as fuel."

Dhungana's spectrum analysis of SN 2013ej revealed many elements, including hydrogen, helium, calcium, titanium, barium, sodium and iron.

"When we have as many spectra as we have for this supernova at different times," Kehoe added, "we are able to look deeper and deeper into the original star, sort of like an X-ray or a CAT scan."

SN 2013ej's short-lived existence was just tens of millions of years

Analysis of SN 2013ej's spectrum from ultraviolet through infrared indicates light from the explosion reached Earth July 23, 2013. It was discovered July 25, 2013 by the Katzman Automatic Imaging Telescope at California's Lick Observatory. A look back at images captured by SMU's ROTSE-IIIb showed that SMU's robotic telescope detected the supernova several hours earlier, Dhungana said.

"These observations were able to show a rapidly brightening supernova that started just 20 hours beforehand," he said. "The start of the supernova, termed 'shock breakout,' corresponds to the moment when the internal explosion crashes through the star's outer layers."

Like many others, SN 2013ej was a Type II supernova -- the explosion of a massive star still undergoing nuclear fusion. Once iron is fused, the fuel runs out, causing the core to collapse. Within a quarter second the star explodes.

Supernovae have death and birth written all over them

Massive stars typically have a shorter life span than smaller ones.

"SN 2013ej probably lived tens of millions of years," Kehoe said. "In universe time, that's the blink of an eye. It's not very long-lived at all compared to our sun, which will live billions of years. Even though these stars are bigger and have a lot more fuel, they burn it really fast, so they just get hotter and hotter until they just gobble up the matter and burn it."

For most of its brief life, SN 2013ej would probably have burned hydrogen, which then fused to helium, burning for a few hundred thousand years, then perhaps carbon and oxygen for a few hundred days, calcium for a few months and silicon for several days.

"Supernovae have death and birth written all over them," Kehoe said. "Not only do they create the elements we are made of, but the shockwave that goes out from the explosion -- that's where our solar system comes from."

Outflowing material slams into clouds of material in interstellar space, causing them to collapse and form solar systems.

"The heavy elements made in the supernova and its parent star are those which comprise the bulk of terrestrial planets, like Earth, and are necessary for life," Kehoe said.


Contacts and sources:
Margaret Allen
Southern Methodist University, Dallas

The new measurements are published online in the May 2016 issue of The Astrophysical Journal, "Extensive spectroscopy and photometry of the Type IIP Supernova 2013ej," at http://iopscience.iop.org/article/10.3847/0004-637X/822/1/6.

Saturday, April 23, 2016

Making Movies of Electrons in Motion Soon Possible

With the aid of terahertz radiation, Munich physicists have developed a method for generating and controlling ultrashort electron pulses. With further improvements, this technique should be capable of capturing even electrons in motion.

Seeing how atoms and electrons in a material respond to external stimuli can give scientists insight into unsolved problems in solid-state physics, such as the basis for high-temperature superconductivity and the many intriguing properties of other exotic materials. Short pulses of electrons can be used to image such responses. Due to their quantum mechanical wave-like properties, when electrons are scattered off a crystal, they interfere with each other to create a diffraction pattern. By recording these patterns, researchers can work out the atomic and electronic structure of the material, resolving details smaller than the size of an atom.

A pulse of electrons (green, coming from the left) encounters a microstructured antenna, which is operated with laser-generated terahertz radiation (red). Thus, the duration of the electron pulse is shortened to a few femtoseconds. 
Photo: Christian Hackenberger

Short electron pulses are, however, difficult to generate, because electrons carry a charge and move more slowly than the speed of light. In particular, electron pulse technology still has a long way to go to achieve the temporal resolution required to capture the motions of electrons inside a material. Now, a team headed by Dr. Peter Baum and Prof. Ferenc Krausz from the Laboratory for Attosecond Physics (LAP), LMU and the Max-Planck Institute of Quantum Optics (MPQ) has succeeded in developing a new technique for controlling ultrafast electron pulses. To date, microwave technology has been used to control electron pulses. 

Now, the LMU and MPQ researchers have - for the first time - used optically generated terahertz radiation. Using this technique, the team was able to reduce the length of the electron pulses significantly. Moreover, the method has the potential to visualize not only atoms, but also electrons in motion.

Observing atoms and their motions requires highly specialized techniques. Electron microscopy and electron diffraction can provide the spatial resolution to image atoms, but filming atomic motions requires ultrashort shutter speeds - the shorter the electron pulses, the sharper the images from the microcosmos. Electron pulses with durations in the femtosecond to attosecond range (10⁻¹⁵ to 10⁻¹⁸ s) would be ideal for monitoring processes inside matter with the required resolution in both space and time, i.e. in four dimensions.

While it is already possible to generate extremely short light pulses with lasers, optical pulses do not have the short wavelengths required to make atoms or charges in molecules and solids directly visible. Electrons are superior to light in this context, because their wavelengths are 100,000 times shorter. However, generating short pulses is much more difficult to do with electrons than with light. This is because electrons, unlike photons, have both rest mass and charge.
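The factor of 100,000 follows directly from the de Broglie relation. A minimal sketch of the arithmetic, assuming a 100 keV beam energy (a typical value for electron diffraction, not a figure from the article) and green light at 500 nm:

```python
# Hedged sketch: relativistic de Broglie wavelength of an electron,
# compared with a 500 nm optical wavelength.  The 100 keV beam energy
# is an assumed, typical value for electron diffraction.
import math

h = 6.62607015e-34      # Planck constant, J*s
m_e = 9.1093837015e-31  # electron rest mass, kg
c = 2.99792458e8        # speed of light, m/s
e = 1.602176634e-19     # elementary charge, C

def electron_wavelength(kinetic_energy_eV):
    """De Broglie wavelength, using the relativistic energy-momentum relation."""
    E_k = kinetic_energy_eV * e                      # kinetic energy, J
    pc = math.sqrt(E_k**2 + 2.0 * E_k * m_e * c**2)  # momentum times c
    return h * c / pc                                # lambda = h / p

lam_e = electron_wavelength(100e3)   # 100 keV electron: a few picometers
lam_light = 500e-9                   # green light

print(f"electron wavelength: {lam_e * 1e12:.2f} pm")
print(f"light is {lam_light / lam_e:,.0f} times longer")
```

At this assumed energy the electron wavelength comes out near 3.7 picometers, roughly five orders of magnitude below optical wavelengths, consistent with the comparison above.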

Exploiting the properties of THz radiation

Like visible light, terahertz radiation is a form of electromagnetic radiation. The wavelength of terahertz radiation is much longer, however, falling in the range between microwaves and infrared light. The researchers directed the pulsed terahertz radiation and the electron beam onto a special antenna, where the electrons and the terahertz photons can interact. They oriented the electric field of the terahertz radiation so that electrons arriving earlier were slowed down, and electrons arriving later were accelerated. Under these conditions, as the electron pulse continues to propagate, it is compressed, reaching a minimum duration at the location where it scatters from the material sample under study.
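The compression described above can be sketched as a one-dimensional toy model in which the terahertz field gives each electron a velocity kick proportional to its arrival time; the beam velocity, pulse duration and drift length below are illustrative assumptions, not experimental parameters.

```python
# Hedged toy model of velocity-chirp compression: the terahertz field
# slows early electrons and speeds up late ones, so the pulse catches
# up with itself downstream.  All numbers are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
v0 = 1.0e8                       # mean electron speed, m/s
sigma_t = 100e-15                # 100 fs arrival-time spread at the antenna
t = rng.normal(0.0, sigma_t, n)  # arrival times at the antenna

L = 0.01                 # drift distance from antenna to sample, m
k = v0**2 / L            # chirp strength chosen so the pulse focuses at L
v = v0 + k * t           # early (t < 0) slowed, late (t > 0) sped up

# Arrival time at the sample, relative to the mean electron.
t_sample = t + L / v - L / v0

print(f"duration at antenna: {t.std() * 1e15:.1f} fs")
print(f"duration at sample:  {t_sample.std() * 1e15:.2f} fs")
```

With the chirp matched to the drift length, only the second-order terms survive and the pulse at the sample is orders of magnitude shorter than at the antenna.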

Furthermore, the researchers can actually determine how long the electron pulses are when they arrive at the sample position. This involves forcing the electron pulses to interact a second time with terahertz radiation, but this time the terahertz electromagnetic fields are oriented such that they impart a sideways deflection to the electrons. Crucially, the extent of the deflection depends on the timing of the electrons' interaction with the terahertz pulse. Hence, the physicists have created a virtual terahertz-stopwatch for the electron pulses.
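This stopwatch works because, near a zero crossing of the terahertz field, the sideways kick is nearly proportional to an electron's arrival time, so the streak width recorded on a detector maps directly to pulse duration. A toy model with illustrative numbers:

```python
# Hedged toy model of the terahertz "stopwatch": near a zero crossing
# of the field, the deflection is proportional to arrival time, so the
# streak width measures pulse duration.  All numbers are illustrative.
import numpy as np

rng = np.random.default_rng(1)
f_thz = 0.3e12                        # streaking field frequency, 0.3 THz (assumed)
sigma_t = 50e-15                      # true pulse duration, 50 fs
t = rng.normal(0.0, sigma_t, 50_000)  # electron arrival times

kick = np.sin(2 * np.pi * f_thz * t)  # deflection ~ field at arrival (arbitrary units)

# Invert the locally linear calibration: d(kick)/dt = 2*pi*f at the crossing.
t_measured = kick / (2 * np.pi * f_thz)

print(f"true duration:     {t.std() * 1e15:.1f} fs")
print(f"measured duration: {t_measured.std() * 1e15:.1f} fs")
```

As long as the pulse is much shorter than the terahertz period, the sine is effectively linear and the recovered duration matches the true one.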

The new technology puts Baum and colleagues in a position to shorten the electron pulses even more. That will enable them to record ever faster atomic and eventually electronic motions. The aim is to track the attosecond motions of charged clouds in and around atoms in order to better understand the fundamentals of the interaction between light and matter. Such insights may eventually lead to new kinds of photonic and electronic materials and devices, driving the technologies of tomorrow.



Contacts and sources:
Luise Dirscherl
Ludwig Maximilian University of Munich

Citation: "All-optical control and metrology of electron pulses," C. Kealhofer, W. Schneider, D. Ehberger, A. Ryabov, F. Krausz, P. Baum. Science, 22 Apr 2016: Vol. 352, Issue 6284, pp. 429-433. DOI: 10.1126/science.aae0003  http://dx.doi.org/10.1126/science.aae0003

Gender Stereotyping May Start as Young as Three Months, Says Baby Study



Gender stereotyping may start as young as three months, according to a study of babies' cries from the University of Sussex.



Adults attribute degrees of femininity and masculinity to babies based on the pitch of their cries, as shown by a new study by researchers from the University of Sussex, the University of Lyon/Saint-Etienne and Hunter College City University of New York. The research is published in the journal BMC Psychology.

The study found:
  • Adults often wrongly assume that babies with higher-pitched cries are female and those with lower-pitched cries are male
  • When told the gender of the baby, adults make assumptions about the degree of masculinity or femininity of the baby, based on the pitch of the cry
  • Adults generally assume that babies with higher-pitched cries are in more intense discomfort 
  • Men who are told that a baby is a boy tend to perceive greater discomfort in the cry of the baby. This is likely to be due to an ingrained stereotype that boy babies should have low-pitched cries. (There was no equivalent finding for women, or for men's perception of baby girls.)

Despite no actual difference in pitch between the voices of girls and boys before puberty, the study found that adults make gender assumptions about babies based on their cries.


Dr David Reby from the School of Psychology at the University of Sussex said:

"It is intriguing that gender stereotyping can start as young as three months, with adults attributing degrees of femininity and masculinity to babies solely based on the pitch of their cries. Adults who are told, or already know, that a baby with a high-pitched cry is a boy said they thought he was less masculine than average. And baby girls with low-pitched voices are perceived as less feminine.

"There is already widespread evidence that gender stereotypes influence parental behaviour but this is the first time we have seen it occur in relation to babies' cries.

"We now plan to investigate if such stereotypical attributions affect the way babies are treated, and whether parents inadvertently choose different clothes, toys and activities based on the pitch of their babies' cries.

"The finding that men assume that boy babies are in more discomfort than girl babies with the same pitched cry may indicate that this sort of gender stereotyping is more ingrained in men.

"It may even have direct implications for babies' immediate welfare: if a baby girl is in intense discomfort and her cry is high-pitched, her needs might be more easily overlooked when compared with a boy crying at the same pitch.

"While such effects are obviously hypothetical, parents and care-givers should be made aware of how these biases can affect how they assess the level of discomfort based on the pitch of the cry alone."

Professor Nicolas Mathevon, from the University of Lyon/Saint-Etienne & Hunter College CUNY, commented:

“This research shows that we tend to wrongly attribute what we know about adults - that men have lower pitched voices than women - to babies, when in fact the pitch of children's voices does not differ between sexes until puberty.

"The potential implications for parent-child interactions and for the development of children's gender identity are fascinating and we intend to look into this further.”

The researchers recorded the spontaneous cries of 15 boys and 13 girls who were on average four months old. The team also synthetically altered the pitch of the cries while leaving all other features of the cries unchanged to ensure they could isolate the impact of the pitch alone. The participating adults were a mixture of parents and non-parents.

'Sex Stereotypes Influence Adults' Perception of Babies' Cries' is published in the BMC Psychology journal. It is authored by David Reby from the University of Sussex, Florence Levrero and Erik Gustafsson at the University of Lyon/Saint-Etienne and Nicolas Mathevon at the University of Lyon/Saint-Etienne & Hunter College CUNY.



Contacts and sources:
By: Anna Ford
University of Sussex 

Smart Hand: Using Your Skin as a Touch Screen



Using your skin as a touchscreen has been brought a step closer after UK scientists successfully created tactile sensations on the palm using ultrasound sent through the hand.

The University of Sussex-led study - funded by the Nokia Research Centre and the European Research Council – is the first to find a way for users to feel what they are doing when interacting with displays projected on their hand.

The SkinHaptics device sends ultrasound through the hand to precise points on the palm, paving the way for next-generation smart technology that uses your own skin as a touchscreen.


This solves one of the biggest challenges for technology companies who see the human body, particularly the hand, as the ideal display extension for the next generation of smartwatches and other smart devices.

Current ideas rely on vibrations or pins, which both need contact with the palm to work, interrupting the display.

However, this new innovation, called SkinHaptics, sends sensations to the palm from the other side of the hand, leaving the palm free to display the screen.



The device uses ‘time-reversal’ processing to send ultrasound waves through the hand. This technique is effectively like ripples in water but in reverse – the waves become more targeted as they travel through the hand, ending at a precise point on the palm.
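The focusing idea can be illustrated with a simple phased-array sketch: each emitter fires with the delay a wave from the focal point would have needed to reach it, so all wavelets arrive at that point in phase and add up, while elsewhere they arrive out of step. The geometry, frequency and wave speed below are illustrative assumptions, not SkinHaptics parameters.

```python
# Hedged sketch of time-reversal focusing with a small transducer array:
# firing each element with the delay a wave from the focus would have
# taken to reach it makes all wavelets arrive at the focus in phase.
# Geometry, frequency and wave speed are illustrative assumptions.
import numpy as np

c = 1500.0                             # wave speed, m/s (tissue-like, assumed)
f = 2.0e6                              # 2 MHz ultrasound carrier (assumed)
xs = np.linspace(-0.05, 0.05, 16)      # transducer x-positions, m
focus = np.array([0.012, 0.02])        # focal point: 12 mm across, 20 mm deep

def dist(point):
    """Distance from every transducer (on the x-axis) to `point`."""
    return np.hypot(xs - point[0], point[1])

delays = dist(focus) / c               # travel times, focus -> transducers
emit = delays.max() - delays           # time-reversed schedule: farthest fires first

t = np.linspace(0.0, 8e-5, 40_000)

def peak_amplitude(point):
    """Peak of the superposed short tone bursts arriving at `point`."""
    dt = t[None, :] - (emit + dist(point) / c)[:, None]
    wave = np.cos(2 * np.pi * f * dt) * np.exp(-(dt / 2e-6) ** 2)
    return np.abs(wave.sum(axis=0)).max()

print("at the focus:   ", round(peak_amplitude(focus), 1))
print("10 mm off-focus:", round(peak_amplitude(focus + [0.01, 0.0]), 1))
```

At the chosen focal point the sixteen bursts arrive simultaneously and sum coherently; a centimeter away their arrival times and phases are scattered and the peak amplitude drops sharply.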

It draws on a rapidly growing field of technology called haptics, which is the science of applying touch sensation and control to interaction with computers and technology.

Professor Sriram Subramanian, who leads the research team at the University of Sussex, says that technologies will inevitably need to engage other senses, such as touch, as we enter what designers are calling an ‘eye-free’ age of technology.

He says: “Wearables are already big business and will only get bigger. But as we wear technology more, it gets smaller and we look at it less, and therefore multisensory capabilities become much more important.

“If you imagine you are on your bike and want to change the volume control on your smartwatch, the interaction space on the watch is very small. So companies are looking at how to extend this space to the hand of the user.

“What we offer people is the ability to feel their actions when they are interacting with the hand.”

Professor Sriram Subramanian is a Professor of Informatics at the University of Sussex, where he leads the Interact Lab and is a member of the Creative Technology Group.

The findings were presented at the IEEE Haptics Symposium 2016 in Philadelphia, USA, by the study’s co-author Dr Daniel Spelmezan, a research assistant in the Interact Lab. The symposium concluded on Monday 11 April 2016.




Contacts and sources:
By: James Hakner
University of Sussex