Friday, December 29, 2017

Ruins of Ancient Turkic Monument Surrounded by 14 Pillars with Inscriptions Discovered

During their three-year (2015–2017) joint excavation, a team from Osaka University and the Institute of History and Archaeology of the Mongolian Academy of Sciences discovered the ruins of a unique monument surrounded by 14 large stone pillars bearing Turkic Runic inscriptions, arranged in a square on the steppe called Dongoin shiree in eastern Mongolia (Figure 1).

Figure 1. Drone aerial shot of the ancient Turkic ruins on Dongoin shiree (north at the top). Segments of the inscriptions and the sarcophagus excavated from the hole at the center of the ruins can be seen. (September 2016) 
Credit: Osaka University and Institute of History and Archaeology, Mongolian Academy of Sciences


Before the investigation of the ruins began in May 2015, scholars had thought that inscriptions and ruins of Turkic royalty existed only on the steppes west of Ulan Bator, the capital of Mongolia (Figure 2). However, the excavation team, led by Professor Takashi OSAWA of Osaka University, discovered 12 new inscriptions at the site, obtaining clues from their contents and from the configuration of the stones for clarifying power relationships in eastern Mongolia in the Middle Ages.

The monument's major structural feature is that the stone sarcophagus at the center of the mound, where the deceased person may have been placed, is surrounded by 14 stone pillars with inscriptions (Figure 3). More than 100 tamgas (signs) of ancient Turkic tribes are carved on the stone inscriptions, which are among the largest Turkic inscriptions discovered in Mongolia. Radiocarbon dating of calcined coal, sheepskin, and horse bone excavated from the sarcophagus indicates that this unique monument was built in the 8th century, during the late period of the Second Turkic Qaghanate.

Figure 2. Conventional map showing places of ancient Turkic inscriptions and ruins on the Mongolian Plateau 
Credit: Takashi OSAWA


Professor Takashi OSAWA deciphered these inscriptions and found that the person buried and commemorated in them held the position of Yabgu (viceroy), the highest rank just below the Qaghan*, during the reign of Bilge Qaghan (716-734 AD) of the Second Turkic Qaghanate. It was also found that the Yabgu later became Tölis-Shad (Royalty of the East), the commander in chief and highest administrative officer in eastern Mongolia, during the reign of Tengri Qaghan (734-741 AD).

* Qaghan (or Khagan) is a title of imperial rank in the Turkic and Mongolian languages equal to the status of emperor, and is someone who rules a Qaghanate or Khaganate (empire).

These findings show that the Dongoin shiree steppe, where the unique monument ruins remain, was the center of the eastern area of the ancient Turkic Qaghanate, a location that could not be determined from surviving Chinese and Turkic texts.

Figure 3. Illustration of a ritual conducted around the monument (drawn by former director of the National Museum of Mongolian History) (September 2016) 
Credit: Osaka University and Institute of History and Archaeology, Mongolian Academy of Sciences

This monument will help reveal the power relationships of rulers in the eastern area of the Turkic Qaghanate and the extent of their territories, as well as their political and military relationships with Mongolian tribes such as the Khitan, Tatabi, and Tatar. In addition, the arrangement of the stone pillars on the plateau will also provide important information for discussing the religious ideas and worldview of the ancient nomads.

 
Contacts and sources:
Osaka University








It's Not the Heat but the Humidity That May Be Breaking Point for Humans in Warming World

Climate scientists say that killer heat waves will become increasingly prevalent in many regions as the climate warms. However, most projections leave out a major factor that could make things worse: humidity, which can greatly magnify the effects of heat alone. 

Now, a new global study projects that in coming decades the effects of high humidity in many areas will dramatically increase. At times, they may surpass humans’ ability to work or, in some cases, even survive. Health and economies would suffer, especially in regions where people work outside and have little access to air conditioning. Potentially affected regions include large swaths of the already muggy southeastern United States, the Amazon, western and central Africa, southern areas of the Mideast and Arabian peninsula, northern India and eastern China.

“The conditions we’re talking about basically never occur now—people in most places have never experienced them,” said lead author Ethan Coffel, a graduate student at Columbia University’s Lamont-Doherty Earth Observatory. “But they’re projected to occur close to the end of the century.” The study appears this week in the journal Environmental Research Letters.

A new study projects that drastic combinations of heat and humidity may hit large areas of the world later this century


Credit: Ethan Coffel

A warming climate is projected to make many now-dry areas drier, in part by changing precipitation patterns. But by the same token, as global temperatures rise, the atmosphere can hold more water vapor. That means chronically humid areas located along coasts or otherwise hooked into humid-weather patterns may only get more so. And, as many people know, muggy heat is more oppressive than the “dry” kind. That is because humans and other mammals cool their bodies by sweating; sweat evaporates off the skin into the air, taking the excess heat with it. The process works nicely in the desert. But when the air is already crowded with moisture (think of the muggiest days of summer in the city), evaporation off the skin slows down, and eventually becomes impossible. When this cooling process halts, one’s core body temperature rises beyond the narrow tolerable range. Absent air conditioning, organs strain and then start to fail. The results are lethargy, sickness and, in the worst conditions, death.

Using global climate models, the researchers in the new study mapped current and projected future “wet bulb” temperatures, which reflect the combined effects of heat and humidity. (The measurement is made by draping a water-saturated cloth over the bulb of a conventional thermometer; it does not correspond directly to air temperature alone.) The study found that by the 2070s, high wet-bulb readings that now occur maybe only once a year could prevail 100 to 250 days of the year in some parts of the tropics. In the southeast United States, wet-bulb temperatures now sometimes reach an already oppressive 29 or 30 degrees Celsius; by the 2070s or 2080s, such weather could occur 25 to 40 days each year, say the researchers.
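Wet-bulb temperature is not something most weather apps report, but it can be estimated from ordinary temperature and relative-humidity readings. The study itself relies on climate-model output rather than a simple formula; purely as a rough illustration, the sketch below uses Stull's (2011) empirical approximation, which is reasonably accurate near sea-level pressure for temperatures of about -20 to 50 degrees Celsius and humidities of roughly 5 to 99 percent.

    import math

    def wet_bulb_stull(t_c, rh_pct):
        """Approximate wet-bulb temperature (deg C) from air temperature (deg C)
        and relative humidity (%), using Stull's 2011 empirical fit."""
        return (t_c * math.atan(0.151977 * math.sqrt(rh_pct + 8.313659))
                + math.atan(t_c + rh_pct)
                - math.atan(rh_pct - 1.676331)
                + 0.00391838 * rh_pct ** 1.5 * math.atan(0.023101 * rh_pct)
                - 4.686035)

    # A muggy 35 deg C day at 60% relative humidity already gives a wet-bulb
    # reading of roughly 28.5 deg C, close to the oppressive values cited above.
    print(round(wet_bulb_stull(35.0, 60.0), 1))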

Lab experiments have shown wet-bulb readings of 32 degrees Celsius are the threshold beyond which many people would have trouble carrying out normal activities outside. This level is rarely reached anywhere today. But the study projects that by the 2070s or 2080s the mark could be reached one or two days a year in the U.S. southeast, and three to five days in parts of South America, Africa, India and China. Worldwide, hundreds of millions of people would suffer. The hardest-hit area in terms of human impact, the researchers say, will probably be densely populated northeastern India.

“Lots of people would crumble well before you reach wet-bulb temperatures of 32 C, or anything close,” said coauthor Radley Horton, a climate scientist at Lamont-Doherty. “They’d run into terrible problems.” Horton said the results could be “transformative” for all areas of human endeavor—“economy, agriculture, military, recreation.”

The study projects that some parts of the southern Mideast and northern India may even occasionally hit a wet-bulb temperature of 35 degrees Celsius by late century—roughly equal to human skin temperature, and the theoretical limit at which people will die within hours without artificial cooling. Using a related combined heat/humidity measure, the so-called heat index, this would be the equivalent of nearly 170 degrees Fahrenheit of “dry” heat. But the heat index, invented in the 1970s to measure the “real feel” of moist summer weather, actually tops out at 136; anything above that is literally off the chart. On the bright side, the paper says that if nations can substantially cut greenhouse-gas emissions in the next few decades, the worst effects could be avoided.

Only a few weather events like those projected have ever been recorded. The most recent was in Bandar Mahshahr, Iran, on July 31, 2015. The city of more than 100,000 sits along the Persian Gulf, where seawater can warm into the 90s Fahrenheit, and offshore winds blow moisture onto land. On that day, the “dry” air temperature alone was 115 degrees Fahrenheit; saturated with moisture, the air’s wet-bulb reading neared the 35-degree fatal limit, translating to a heat index of 165 Fahrenheit.

Bandar Mahshahr’s infrastructure is good and electricity cheap, so residents reported adapting by staying in air-conditioned buildings and vehicles, and showering after brief ventures outside. But this may not be an option in other vulnerable places, where many people don’t have middle-class luxuries.

“It’s not just about the heat, or the number of people. It’s about how many people are poor, how many are old, who has to go outside to work, who has air conditioning,” said study coauthor Alex de Sherbinin of Columbia’s Center for International Earth Science Information Network. De Sherbinin said that even if the weather does not kill people outright or stop all activity, the necessity of working on farms or in other outdoor pursuits in such conditions can bring chronic kidney problems and other damaging health effects. “Obviously, the tropics will suffer the greatest,” he said. Questions of how human infrastructure or natural ecosystems might be affected are almost completely unexplored, he said.

Only a handful of previous studies have looked at the humidity issue in relation to climate change. It was in 2010 that a paper in the Proceedings of the National Academy of Sciences proposed the 35-degree survivability limit. In 2015, researchers published a paper in the journal Nature Climate Change that mapped areas in the southern Mideast and Persian Gulf regions as vulnerable to extreme conditions. There was another this year in the journal Science Advances, zeroing in on the densely populated, low-lying Ganges and Indus river basins. The new study builds on this earlier research, extending the projections globally using a variety of climate models and taking into account future population growth.

Elfatih Eltahir, a professor of hydrology and climate at the Massachusetts Institute of Technology who has studied the issue in the Mideast and Asia, said the new study “is an important paper which emphasizes the need to consider both temperature and humidity in defining heat stress.”

Climate scientist Steven Sherwood of the University of New South Wales, who proposed the 35-degree survivability limit, said he was skeptical that this threshold could be reached as soon as the researchers say. Regardless, he said, “the basic point stands.” Unless greenhouse emissions are cut, “we move toward a world where heat stress is a vastly greater problem than it has been in the rest of human history. The effects will fall hardest on hot and humid regions.”


Contacts and sources:
Kevin Krajick
The Earth Institute at Columbia University

Citation: Temperature and humidity based projections of a rapid rise in global heat stress exposure during the 21st century. Ethan D. Coffel, Radley M. Horton and Alex de Sherbinin. Environmental Research Letters, Volume 13, Number 1. Published 22 December 2017 © 2017 IOP Publishing Ltd.

Thursday, December 28, 2017

Earth Wind Transports Oxygen to The Moon



Successful direct observation of oxygen transported to the Moon by Earth wind

When and how was the Earth formed in the vast Universe? Is the Earth unique in the Universe? These questions always arouse our curiosity. The article "Biogenic oxygen from Earth transported to the Moon by a wind of magnetospheric ions" was published in Nature Astronomy in January 2017 and ranked in the top three of the journal’s most read articles in 2017.
Photo by courtesy of Osaka University/NASA

The Earth is protected from the solar wind and cosmic rays by its magnetic field. On the night side of the Earth, the side opposite the Sun, the magnetic field stretches out like a comet's tail, forming a region called the Geotail. At the center of the Geotail lies a sheet-like region of hot plasma, the plasma sheet.

A joint group of researchers from the Institute of Space and Astronautical Science and Nagoya University, led by Professor TERADA Kentaro at Osaka University, used the Magnetic field and Plasma experiment/Plasma energy Angle and Composition experiment (MAP-PACE) aboard SELENE (Selenological and Engineering Explorer), known in Japan as the spacecraft "Kaguya," to observe oxygen from Earth's gravisphere being transported to the Moon, 380,000 km away, by solar activity.

Geometrical setting of Sun, Earth, Moon and Geotail (plasma sheet). 

Credit: Osaka University

This group examined MAP-PACE plasma data collected 100 km above the Moon's surface and discovered that high-energy oxygen ions (O+) appeared only when the Moon and Kaguya crossed the plasma sheet.

It had already been known that oxygen ions flow out from the polar regions into space; however, this group succeeded, for the first time anywhere, in observing them being transported as an "Earth wind" to the surface of the Moon.

The O+ ions detected by this group had high energies of 1-10 keV, enough to implant them tens of nanometers deep into a metal grain. This is a very important finding for understanding the complicated oxygen isotopic composition of the lunar regolith, which has long been a mystery. Through their observation, the group demonstrated the possibility that oxygen depleted in 16O (a stable isotope of oxygen), such as that observed in the ozone layer of Earth's stratosphere, was transported to the Moon's surface as "Earth wind" and implanted tens of nanometers deep into the surface of lunar soils.

Energy spectra of O ions observed by the Kaguya spacecraft 
Credit: Osaka University

Oxygen, which is vital to sustaining life and makes up about 20 percent of Earth's atmosphere, has been produced by the photosynthesis of plants over the last 2 to 3 billion years. Discoveries like this one will not only encourage researchers' spirit of seeking truth, but will also have a great impact on how the general public views nature and science.


Contacts and sources:
Osaka University


Citation: Biogenic oxygen from Earth transported to the Moon by a wind of magnetospheric ions
Journal: Nature Astronomy
DOI: 10.1038/s41550-016-0026
Authors: Kentaro Terada, Shoichiro Yokota, Yoshifumi Saito, Naritoshi Kitamura, Kazushi Asamura, Masaki N. Nishino

Funder: Japan Society for the Promotion of Science (JSPS) KAKENHI Grants, the Mitsubishi Foundation and Yamada Science Foundation

New Patch Turns Energy-Storing Fats into Energy-Burning Fats

A new approach to reducing bulging tummy fat has shown promise in laboratory trials.

It combines a new drug-delivery method, a micro-needle patch, with drugs known to turn energy-storing white fat into energy-burning brown fat. This innovative approach, developed by scientists from Nanyang Technological University, Singapore (NTU Singapore), reduced weight gain in mice on a high-fat diet and cut their fat mass by more than 30 percent over four weeks.

The new type of skin patch contains hundreds of micro-needles, each thinner than a human hair, loaded with either a beta-3 adrenergic receptor agonist or the thyroid hormone triiodothyronine (T3).

When the patch is pressed into the skin for about two minutes, these micro-needles become embedded in the skin and detach from the patch, which can then be removed.

Prof Chen Peng (left) holding the new microneedle fat burning patch with Asst Prof Xu Chenjie

Credit: Nanyang Technological University

As the needles degrade, the drug molecules slowly diffuse into the energy-storing white fat underneath the skin, turning it into energy-burning brown fat.

Brown fat is found in babies, where it helps keep them warm by burning energy. As humans grow older, the amount of brown fat lessens and is replaced by visceral white fat.

Published recently in the journal Small Methods by NTU Professor Chen Peng and Assistant Professor Xu Chenjie, this approach could help address the worldwide obesity problem without resorting to surgery or oral medication, which can require large dosages and cause serious side effects.

“With the embedded microneedles in the skin of the mice, the surrounding fats started browning in five days, which helped to increase the energy expenditure of the mice, leading to a reduction in body fat gain,” said Asst Prof Xu, who focuses on research in drug delivery systems.

“The amount of drugs we used in the patch is much less than those used in oral medication or an injected dose. This lowers the drug ingredient costs while our slow-release design minimises its side effects,” said Asst Prof Xu.

Obesity, which results from an excessive accumulation of fat, is a major risk factor for various diseases, including heart disease, stroke and type-2 diabetes. The World Health Organisation estimated that 1.9 billion adults worldwide were overweight in 2016, with 650 million of them obese.

“What we aim to develop is a painless patch that everyone could use easily, is unobtrusive and yet affordable,” said Prof Chen, a biotechnology expert who researches on obesity. “Most importantly, our solution aims to use a person’s own body fats to burn more energy, which is a natural process in babies.”

Under the two scientists’ guidance at NTU’s School of Chemical and Biomedical Engineering, research fellow Dr Aung Than conducted experiments which showed that the patch could suppress weight gain in mice that were fed a high-fat diet and reduce their fat mass by over 30 percent, over a period of four weeks.

The treated mice also had significantly lower blood cholesterol and fatty acids levels compared to the untreated mice.

Being able to deliver the drug directly to the site of action is a major reason why it is less likely to have side effects than orally delivered medication.

The team estimates that their prototype patch, which contains a beta-3 adrenergic receptor agonist combined with hyaluronic acid (a substance naturally found in the human body and commonly used in products like skin moisturisers), had a material cost of about S$5 (US$3.50) to make.

(L-R) Asst Prof Xu Chenjie - Dr Than Aung - and Prof Chen Peng discussing a microscope image of the microneedle patch

Credit: Nanyang Technological University

Beta-3 adrenergic receptor agonist is a drug approved by the Food and Drug Administration of the United States to treat overactive bladder, while triiodothyronine (T3) is a thyroid hormone commonly used to treat an underactive thyroid gland.

Both have been shown in other studies to turn white fat brown, but their use in reducing weight gain has been hampered by potentially serious side effects and drug accumulation in non-targeted tissues when conventional delivery routes, such as oral intake, are used.

NTU’s Lee Kong Chian School of Medicine Associate Professor Melvin Leow, who was not affiliated with this study, said it is exciting to be able to tackle obesity via the browning of white fat, and the results were promising.

“These data should encourage Phase I Clinical studies in humans to translate these basic science findings to the bedside, with the hope that these microneedle patches may be developed into an established cost-effective modality for the prevention or treatment of obesity in the near future,” added Assoc Prof Leow, an endocrinologist.

Since the publication of the paper, the team has received keen interest from biotechnology companies and is looking to partner clinician scientists to further their research.




Contacts and sources:
Nanyang Technological University

Citation: Transdermal Delivery of Anti-Obesity Compounds to Subcutaneous Adipose Tissue with Polymeric Microneedle Patches
Aung Than, Ke Liang, Shaohai Xu, Lei Sun, Hongwei Duan, Fengna Xi, Chenjie Xu and Peng Chen. Version of Record online: 13 October 2017. DOI: 10.1002/smtd.201700269

Cause of Spontaneous Tumors Related to Red Meat Found

A sugar called Neu5Gc, present in red meat, some fish and dairy products, is related to the appearance of spontaneous tumors in humans. 

Researchers at the University of Nevada, Reno, led by the Spaniard David Álvarez-Ponce, have analyzed the evolutionary history of the CMAH gene, which allows the synthesis of this sugar, and shown which groups of animals have lost the gene and are therefore more suitable for human consumption and for organ transplants.

About two million years ago, humans experienced a genetic change that differentiated us from most primates. This change protected us from some diseases, but caused current consumer products, such as red meat, to pose a high risk to health.

Credit: Egoitz Moreno

At that point in evolution, a gene called CMAH, which allows the synthesis of a sugar called Neu5Gc, was deactivated. This carbohydrate is found in red meat, some fish and dairy products. If humans consume products derived from animals that still have the gene, the body mounts an immune reaction to the sugar, which is a foreign substance in the body. This can cause inflammation, arthritis and even cancer.

Scientists from the University of Nevada, Reno (USA), led by the Spaniard David Álvarez-Ponce, analyzed 322 animal genomes to determine whether or not they have active CMAH genes. Next, they placed the results on the evolutionary tree of the animals to determine at which points in their evolution the gene was deactivated. This allowed them to understand why certain species have an active CMAH gene while other, similar ones do not.

"In a first analysis we scanned all the available genomes. We only found the gene in a few bacteria, in a pair of algae, and in the deuterostomes, a group of animals that includes vertebrates and echinoderms, among others. The non-deuterostomes animals did not present the gene. Next, we focused on the 322 deuterostomy genomes that were available," the Spanish researcher explains to Sinc.

The laboratory of Álvarez-Ponce specializes in the study of the evolution of genes and genomes through the use of bioinformatics. That is, it is equipped not with test tubes, microscopes or other laboratory instruments, but with computers used to understand evolution through the analysis of massive amounts of data.



The toxic sugar present in fish

Until now, very few fish species had been studied to determine whether or not they contain the toxic sugar. "Our analyses show that there are fish that have the CMAH gene and others that do not, but for the moment Neu5Gc has been measured in very few of them. In fish that do have the gene, the sugar is found in very small proportions in their meat but in high quantities in caviar. This may be because the gene is expressed specifically in eggs or oviducts," says the scientist.

Sateesh Peri, a master's course student at the Alvarez-Ponce laboratory, adds: "It turns out that caviar, one of the most expensive meals in the world, is also one of the products with the highest concentrations of Neu5Gc." However, the research also reveals a multitude of fish that do not have the CMAH gene, and whose caviar is expected to be free of Neu5Gc.

Chicken, turkey and duck, free of CMAH

Like humans, birds do not have the CMAH gene, so consuming chicken, turkey or duck does not have the negative effects of consuming red meat. Another group of animals without the CMAH gene is the reptiles, except for one species of lizard. "The presence of the gene in this lizard was unexpected, and invalidates the until-now accepted belief that the gene had been lost in an ancestor of all reptiles and birds," the scientists say.

In addition to the above-mentioned food risks, the CMAH gene also plays a key role in the transplantation of organs from animals to humans, a practice known as xenotransplantation: it is one of the factors that determine whether these organs are going to be rejected or not by the human body. When the organ of an animal that has the CMAH gene is transplanted to a person, the human body can react to the Neu5Gc sugar and reject the organ.

"It is possible that the deactivation of the CMAH gene during human evolution has protected humans from certain pathogens. For example, there is a type of malaria that needs Neu5Gc sugar to cause infection. This type of malaria affects some primates, but not humans," declares Álvarez-Ponce.

Products to be consumed moderately

The presence or absence of the CMAH gene in different animals, which this study has characterized, indicates which animals we should not eat (or should eat only in moderation), and which animals may carry pathogenic microbes that affect humans, according to the scientists. If an animal has the gene, its meat can have the same negative effects as red meat. If it does not have the gene, it may carry pathogenic microbes that bind to the Neu5Ac sugar (the precursor of Neu5Gc) and, therefore, may affect humans.

The researchers hope that this study will have a significant impact on later work in the fields of nutrition, genetics and medicine. "Determining in which groups and at what points in evolution the CMAH gene has been deactivated is critical for knowing which species are most likely to contain the toxic Neu5Gc sugar, and which are recommended for consumption, xenotransplantation and certain scientific studies," they add.

The work of the Alvarez-Ponce team will help understand why certain diseases occur, and find ways to prevent them from spreading.


Contacts and sources:
Eva Rodríguez
SINC

Citation: Sateesh Peri, Asmita Kulkarni, Felix Feyertag, Patricia M. Berninsone, David Alvarez-Ponce. “Phylogenetic distribution of CMP-Neu5Ac hydroxylase (CMAH), the enzyme synthetizing the pro-inflammatory human xeno-antigen Neu5Gc”. Genome Biology and Evolution, 30 November 2017. https://doi.org/10.1093/gbe/evx251

Wednesday, December 27, 2017

Halloween Asteroid Shaped Like Human Skull to Return in 2018

There is less than a year to go until asteroid 2015 TB145 approaches Earth once again, just as it did in 2015 around the night of Halloween, an occasion which astronomers did not pass up to study its characteristics. This dark object measures between 625 and 700 metres, its rotation period is around three hours and, in certain lighting conditions, it resembles a human skull.

Artist´s impression of the Halloween asteroid 2015 TB145, which resembles a human skull in certain light conditions.
Credit:  J. A. Peñas/SINC

An asteroid zipped past on 31 October 2015, relatively close to us: just 486,000 km away, 1.3 times the distance separating us from the Moon. The object, called 2015 TB145, had been discovered a few days earlier, on 10 October, from Hawaii using the Pan-STARRS telescope. The fact that it came closest to our planet on Halloween helped it become known as the Halloween asteroid.

Different teams of astronomers pointed their instruments towards 2015 TB145, including NASA, which captured it using the Green Bank (West Virginia, USA) and Arecibo (Puerto Rico) radio telescopes. In some of the images recorded by the latter, the rotating asteroid at times resembled a human skull because of the lighting conditions at particular moments during its rotation.

European scientists, including the researcher Pablo Santos-Sanz from the Institute of Astrophysics of Andalusia (IAA-CSIC), also organised observing campaigns of the Halloween asteroid to discover its characteristics. The results have been published in the journal Astronomy & Astrophysics.

This image of asteroid 2015 TB145 was generated using radar data collected by the National Science Foundation's Arecibo Observatory in Puerto Rico.

Credit:  NAIC-Arecibo/NSF

“It is an Apollo-type near-Earth asteroid (NEA),” Santos-Sanz explains to SINC. “The proximity of this small object meant greater brightness, so we decided to study it using various observation techniques: on the one hand, we used optical telescopes from the Sierra Nevada Observatory in Granada, the Calar Alto Observatory in Almería and the La Hita Observatory in Toledo; and on the other, we analysed it in the mid-infrared using the Very Large Telescope (VLT) VISIR instrument at the European Southern Observatory (ESO) in Cerro Paranal, Chile”.

“From the observations from Spain, we discovered that this object’s most likely rotation period is 2.94 hours, in other words, this is the approximate length of its day, although we cannot rule out another possibility: 4.78 hours, another solution which is consistent with our optical data,” the expert points out.

Thanks to the observations in the mid-infrared made from the VLT, the authors were able to detect the thermal emission of the object. Using this information and a thermophysical model, various properties of 2015 TB145 could be discerned.

Santos-Sanz mentions a few of these: “The object measures between 625 m and 700 m, its shape is a slightly flattened ellipsoid, and its rotation axis was roughly perpendicular to the Earth at the time of its closest proximity. Furthermore, its thermal inertia (the amount of heat which it retains and the speed at which it absorbs or transfers heat) is consistent with that of similar sized asteroids.”

The reflectivity or albedo of the surface of this asteroid is around 5 or 6%, which means that it reflects approximately 5 to 6% of sunlight. “This means that it is very dark, only slightly more reflective than charcoal,” the Spanish astrophysicist explains.

Next approach: November 2018

Researchers are confident of obtaining more data on 2015 TB145 the next time it approaches our planet, in November 2018, although this time it will pass much farther away than the last, at a distance of 105 times the average lunar distance. “Although this approach shall not be so favourable, we will be able to obtain new data which could help improve our knowledge of this mass and other similar masses that come close to our planet,” Santos-Sanz says.

“It is currently 3.7 astronomical units away from Earth, that is 3.7 times the average distance from the Earth to the Sun,” he points out. “It has a magnitude of 26.5, which means it is only visible from Earth using very large telescopes or space telescopes.”

Thomas G. Müller, researcher from the Max-Planck-Institut für extraterrestrische Physik (Germany) and co-author of the study, adds: “The next slightly more exciting encounter will be around Halloween's day in the year 2088, when the object approaches Earth to a distance of about 20 lunar distances. The encounter on Halloween's day 2015 was the closest approach of an object of that size since 2006, and the next known similar event is the passage of 137108 (1999 AN10) on August 7, 2027. Later, 99942 Apophis will follow on April 13, 2029 with an Earth passage at approximately 0.1 lunar distances."

Orbit diagram of 2015 TB145 and current position.

Credit: JPL-NASA

This study has received funding from the 'Small Bodies: Near and Far' (SBNAF) European project. "We formed a team of expert astronomers from Poland, Hungary, Spain, and Germany," Müller explains. "The goal of the project is to characterise small objects at various distances from the Sun, including near-Earth asteroids like the Halloween asteroid. 2015 TB145 was one of our first targets where we combined different observations, modelling techniques, and concepts for the scientific interpretation. It is going to be interesting to compare our results with future findings and to apply our techniques to many more potentially hazardous objects.”

Scientists think that the Halloween asteroid could in fact be an extinct comet which lost its volatile compounds after orbiting the Sun numerous times. In general, asteroids and comets are distinguished by their composition (the former being more rocky and metallic, while the latter have a higher proportion of ice and rock) and by their type of orbit around the Sun, but at times it is not easy to tell them apart, and the boundaries between them are becoming increasingly diffuse. In any case, both formed during, and witnessed, the first stages of our solar system, which was born around 4,600 million years ago.



Contacts and sources:
SINC

Citation: T. G. Müller, A. Marciniak, M. Butkiewicz-Bąk, R. Duffard, D. Oszkiewicz, H. U. Käufl, R. Szakáts, T. Santana-Ros, C. Kiss and P. Santos-Sanz. “Large Halloween Asteroid at Lunar Distance”. Astronomy & Astrophysics 598, A63, 2017.

Iberian Brown Bear Did Not Descend from Ancestors Escaping The Ice Age



According to the glacial refugia theory, after the last glaciations the bears of northern Europe sought shelter in the south. Researchers from A Coruña University reject this idea: they have reconstructed the colonization of the Iberian Peninsula by brown bears and shown that the lineage of the Pleistocene bears was lost.

After the last glaciations in Western Europe, southern Europe could have been a refuge for some bear species arriving from the north, according to a widely accepted scientific hypothesis. However, recent studies have added nuances to this idea. Now, a team of scientists from A Coruña University has reconstructed the dynamics of bear populations in the Iberian Peninsula.

The delay in the re-colonization of the Iberian Peninsula could be due to the orographical characteristics of the Pyrenees and the abundant presence of human beings in the natural entrance to the Peninsula. 

Credit:  Ivan PC


"We have investigated the mitochondrial DNA [the cellular energy 'factories'] of a significant number of samples of current bears, from the Holocene the and Pleistocene, in the European context, and we have seen that the glacier refugee theory, commonly accepted, does not work for this species", explains Aurora Grandal Danglade, researcher of A Coruña University and co-author of the study published by the Historical Biology journal.

In their work, they compiled data on the chronology and mitochondrial genomes of brown bears (Ursus arctos) from Western Europe, adding new sequences from present-day bears of the Cantabrian Mountains.

"Through the study of today's´ bears, it had been seen that the brown bears of southern Scandinavia were of the same lineage as the current Iberian ones. This led to the hypothesis that the Peninsula would have served as a glacier refuge for the brown bears, which at the end of the last glacial maximum, would have colonized again Western Europe from here, "explains Ana García Vázquez, co-author of the research.

In the north of the Iberian Peninsula there were three maternal lineages during the Pleistocene: one numerous, and two others known only from the remains of a single individual each. The coexistence of three different bear lineages is something that happens today, for example, in Russia. However, on the Peninsula there is now only one, which means that the other two lineages arrived from very distant areas and did not persist over time.

This implies that the Pleistocene lineage was lost, and that the Holocene bears, after the last glacial maximum, entered the Peninsula from some unidentified area, probably France, and did so 5,000 years after having colonized the British Isles.

"It would be necessary to obtain more mitochondrial sequences from bears from other regions of Western Europe to clarify if the presence of these maternal lineages is casual or, on the contrary, no other representatives were found due to the scarcity of data," the scientist points out.

The delay in the recolonization of the Iberian Peninsula could be due to the orographic characteristics of the Pyrenees and the abundant presence of human beings at the natural entrance to the Peninsula. However, as none of the Pleistocene lineages continued into the Holocene, Grandal and her team propose the existence of a cryptic refuge on the Atlantic slopes of continental Europe, "from where the bears expanded as the ice receded," she concludes.


Contacts and sources:
Eva Rodríguez
FECYT - Spanish Foundation for Science and Technology


Citation: Ana García-Vázquez, Ana Cristina Pinto Llona and Aurora Grandal Danglade. “Post-glacial colonization of Western Europe brown bears from a cryptic Atlantic refugium out of the Iberian Peninsula”. Historical Biology, 2017.

Callous and Unemotional Traits Show in Brain Structure of Boys Only

"Snips and snails, and puppy dogs tails, That's what little boys are made of. Sugar and spice and all things nice, That's what little girls are made of."  The nursery rhyme reflects some underlying truth about the differences between males and females. 

Callous-unemotional traits have been linked to deficits in the development of the conscience and of empathy. Children and adolescents with these traits react less to negative stimuli; they often prefer risky activities and show less caution or fear. In recent years, researchers and doctors have given these personality traits increased attention, since they have been associated with the development of more serious and persistent antisocial behavior.

Boy sitting on stairs
Credit: University of Basel

However, until now, most research in this area has focused on studying callous-unemotional traits in populations with a psychiatric diagnosis, especially conduct disorder. This meant that it was unclear whether associations between callous-unemotional traits and brain structure were only present in clinical populations with increased aggression, or whether the antisocial behavior and aggression explained the brain differences.

Using magnetic resonance imaging, the researchers were able to take a closer look at the brain development of typically-developing teenagers to find out whether callous-unemotional traits are linked to differences in brain structure. The researchers were particularly interested to find out if the relationship between callous-unemotional traits and brain structure differs between boys and girls.

Only boys show differences in brain structure

The findings show that in typically-developing boys, the volume of the anterior insula - a brain region implicated in recognizing emotions in others and empathy - is larger in those with higher levels of callous-unemotional traits. This variation in brain structure was only seen in boys, but not in girls with the same personality traits.

"Our findings demonstrate that callous-unemotional traits are related to differences in brain structure in typically-developing boys without a clinical diagnosis," explains lead author Nora Maria Raschle from the University and the Psychiatric Hospital of the University of Basel in Switzerland. "In a next step, we want to find out what kind of trigger leads some of these children to develop mental health problems later in life while others never develop problems."

This study is part of the FemNAT-CD project, a large Europe-wide research project aiming at investigating neurobiology and treatment of adolescent female conduct disorder.



Contacts and sources:
Dr. Nora Maria Raschle
University of Basel

Citation: Nora Maria Raschle et al. Callous-unemotional traits and brain structure: Sex-specific effects in anterior insula of typically-developing youths. NeuroImage: Clinical (2018) | doi: 10.1016/j.nicl.2017.12.015

New, More Powerful Depth Sensors Sensitive Enough for Self-Driving Cars

For the past 10 years, the Camera Culture group at MIT’s Media Lab has been developing innovative imaging systems — from a camera that can see around corners to one that can read text in closed books — by using “time of flight,” an approach that gauges distance by measuring the time it takes light projected into a scene to bounce back to a sensor.

In a new paper appearing in IEEE Access, members of the Camera Culture group present a new approach to time-of-flight imaging that increases its depth resolution 1,000-fold. That’s the type of resolution that could make self-driving cars practical.

The new approach could also enable accurate distance measurements through fog, which has proven to be a major obstacle to the development of self-driving cars.

At a range of 2 meters, existing time-of-flight systems have a depth resolution of about a centimeter. That’s good enough for the assisted-parking and collision-detection systems on today’s cars.

Comparison of the cascaded GHz approach with Kinect-style approaches, visually represented on a key. From left to right: the original image, a Kinect-style approach, a GHz approach, and a stronger GHz approach.
Courtesy of the researchers

But as Achuta Kadambi, a joint PhD student in electrical engineering and computer science and media arts and sciences and first author on the paper, explains, “As you increase the range, your resolution goes down exponentially. Let’s say you have a long-range scenario, and you want your car to detect an object further away so it can make a fast update decision. You may have started at 1 centimeter, but now you’re back down to [a resolution of] a foot or even 5 feet. And if you make a mistake, it could lead to loss of life.”

At distances of 2 meters, the MIT researchers’ system, by contrast, has a depth resolution of 3 micrometers. Kadambi also conducted tests in which he sent a light signal through 500 meters of optical fiber with regularly spaced filters along its length, to simulate the power falloff incurred over longer distances, before feeding it to his system. Those tests suggest that at a range of 500 meters, the MIT system should still achieve a depth resolution of only a centimeter.

Kadambi is joined on the paper by his thesis advisor, Ramesh Raskar, an associate professor of media arts and sciences and head of the Camera Culture group.

Slow uptake

With time-of-flight imaging, a short burst of light is fired into a scene, and a camera measures the time it takes to return, which indicates the distance of the object that reflected it. The longer the light burst, the more ambiguous the measurement of how far it’s traveled. So light-burst length is one of the factors that determines system resolution.

The other factor, however, is detection rate. Modulators, which turn a light beam off and on, can switch a billion times a second, but today’s detectors can make only about 100 million measurements a second. Detection rate is what limits existing time-of-flight systems to centimeter-scale resolution.
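The arithmetic behind those limits is straightforward: the measured time covers the round trip, so distance is half the travel time multiplied by the speed of light, and every bit of timing uncertainty maps directly to depth uncertainty. The short sketch below is a generic illustration of that relationship under these simplifying assumptions, not a model of the MIT system itself.

    C = 299_792_458.0  # speed of light in m/s

    def tof_distance(round_trip_s):
        """Distance to a reflector given the measured round-trip time of a light pulse."""
        return C * round_trip_s / 2.0

    def depth_uncertainty(timing_uncertainty_s):
        """Depth uncertainty implied by a given uncertainty in the timing measurement."""
        return C * timing_uncertainty_s / 2.0

    # A pulse returning after about 13.34 nanoseconds has bounced off an object ~2 m away.
    print(f"{tof_distance(13.34e-9):.2f} m")
    # Resolving 1 cm in depth requires timing the return to within roughly 67 picoseconds,
    # which is why detector speed, not the light source, sets the resolution limit.
    print(f"{depth_uncertainty(67e-12) * 100:.2f} cm")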

There is, however, another imaging technique that enables higher resolution, Kadambi says. That technique is interferometry, in which a light beam is split in two, and half of it is kept circulating locally while the other half — the “sample beam” — is fired into a visual scene. The reflected sample beam is recombined with the locally circulated light, and the difference in phase between the two beams — the relative alignment of the troughs and crests of their electromagnetic waves — yields a very precise measure of the distance the sample beam has traveled.

But interferometry requires careful synchronization of the two light beams. “You could never put interferometry on a car because it’s so sensitive to vibrations,” Kadambi says. “We’re using some ideas from interferometry and some of the ideas from LIDAR, and we’re really combining the two here.”

On the beat

They’re also, he explains, using some ideas from acoustics. Anyone who’s performed in a musical ensemble is familiar with the phenomenon of “beating.” If two singers, say, are slightly out of tune — one producing a pitch at 440 hertz and the other at 437 hertz — the interplay of their voices will produce another tone, whose frequency is the difference between those of the notes they’re singing — in this case, 3 hertz.

The same is true with light pulses. If a time-of-flight imaging system is firing light into a scene at the rate of a billion pulses a second, and the returning light is combined with light pulsing 999,999,999 times a second, the result will be a light signal pulsing once a second — a rate easily detectable with a commodity video camera. And that slow “beat” will contain all the phase information necessary to gauge distance.
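In a scheme like this, the distance is encoded in the phase of the high-frequency modulation, and mixing the returning signal with a slightly detuned reference shifts that phase onto a slow, easily detected beat. The sketch below is a simplified, hypothetical illustration of that principle (a single modulation tone, no noise), not the cascaded-gigahertz method described in the paper.

    import math

    C = 299_792_458.0  # speed of light in m/s

    def beat_frequency(f1_hz, f2_hz):
        """Mixing two tones produces a 'beat' at the difference of their frequencies."""
        return abs(f1_hz - f2_hz)

    def phase_from_distance(d_m, f_mod_hz):
        """Phase (radians) accumulated by a modulation of frequency f_mod over a round trip."""
        return (2 * math.pi * f_mod_hz * (2 * d_m / C)) % (2 * math.pi)

    def distance_from_phase(phi_rad, f_mod_hz):
        """Invert the phase to a distance, valid within the unambiguous range c / (2 * f_mod)."""
        return phi_rad * C / (4 * math.pi * f_mod_hz)

    f_illum = 1.0e9            # 1 GHz illumination modulation
    f_ref = f_illum - 1.0      # reference modulation detuned by 1 Hz
    print(beat_frequency(f_illum, f_ref))  # 1.0 Hz -- slow enough for an ordinary camera

    # Hypothetical target 3.71 cm away; at 1 GHz the unambiguous range is c/(2f) = 15 cm.
    phi = phase_from_distance(0.0371, f_illum)
    print(f"{distance_from_phase(phi, f_illum) * 100:.2f} cm")  # distance recovered from phase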

But rather than try to synchronize two high-frequency light signals — as interferometry systems must — Kadambi and Raskar simply modulate the returning signal, using the same technology that produced it in the first place. That is, they pulse the already pulsed light. The result is the same, but the approach is much more practical for automotive systems.

“The fusion of the optical coherence and electronic coherence is very unique,” Raskar says. “We’re modulating the light at a few gigahertz, so it’s like turning a flashlight on and off millions of times per second. But we’re changing that electronically, not optically. The combination of the two is really where you get the power for this system.”

Through the fog

Gigahertz optical systems are naturally better at compensating for fog than lower-frequency systems. Fog is problematic for time-of-flight systems because it scatters light: It deflects the returning light signals so that they arrive late and at odd angles. Trying to isolate a true signal in all that noise is too computationally challenging to do on the fly.

With low-frequency systems, scattering causes a slight shift in phase, one that simply muddies the signal that reaches the detector. But with high-frequency systems, the phase shift is much larger relative to the frequency of the signal. Scattered light signals arriving over different paths will actually cancel each other out: The troughs of one wave will align with the crests of another. Theoretical analyses performed at the University of Wisconsin and Columbia University suggest that this cancellation will be widespread enough to make identifying a true signal much easier.

“I am excited about medical applications of this technique,” says Rajiv Gupta, director of the Advanced X-ray Imaging Sciences Center at Massachusetts General Hospital and an associate professor at Harvard Medical School. “I was so impressed by the potential of this work to transform medical imaging that we took the rare step of recruiting a graduate student directly to the faculty in our department to continue this work.”

“I think it is a significant milestone in development of time-of-flight techniques because it removes the most stringent requirement in mass deployment of cameras and devices that use time-of-flight principles for light, namely, [the need for] a very fast camera,” he adds. “The beauty of Achuta and Ramesh’s work is that by creating beats between lights of two different frequencies, they are able to use ordinary cameras to record time of flight.”



Contacts and sources:
Larry Hardesty  
Massachusetts Institute of Technology (MIT)

A Bench Made of Glass as Resilient as Steel or Concrete

A PhD student in civil engineering at EPFL and a recent architecture graduate joined forces to make a bench out of exceptionally resistant glass. Several companies broke new ground in support of this highly innovative project.

By combining her know-how with that of EPFL architect Alexander Wolhoff, Jagoda Cupać, a doctor in civil engineering, has proven that glass can be transformed into an ultra-resistant material. Their project? Creating a six-meter-long glass bench named ATLAS. Several companies contributed to the project’s success using processes and materials that were new to them. The ATLAS bench is a true technology prototype and will be on display for a semester at the Rolex Learning Center. 

Credit: © Marc Delachaux / EPFL 2017

In contemporary architecture, glass has long been used for so much more than just windows. It's everywhere – in roofs, stairs, walls, floors and beams – and that's not likely to change anytime soon. And so like Atlas, the mythical figure condemned to hold the sky aloft for eternity, the engineers had an epic challenge of their own, that of turning glass into a load-bearing material that is as resilient as steel or concrete without losing its trademark transparency.

Reinforced and post-tensioned glass

Cupać first began working with glass for her Master's degree in civil engineering in Croatia. Then, for her thesis at the Resilient Steel Structures Laboratory (RESSLAB), she tested glass beams that had been reinforced and post-tensioned with pieces of stainless steel. The properties of these beams were unknown and had barely been studied before at EPFL. During the post-tensioning process, the steel is stretched mechanically before being attached to the glass beam to make it more resistant.

Credit: EPFL

The researcher conducted dozens of bending resistance tests to identify the most resilient beam system. “The glass beams are post-tensioned using stainless steel bars that are attached at the beam ends and glued along the glass edges. Due to the applied prestress, the bending resistance of the beams is increased. In case of failure of a glass roof or floor due to exceptional load or vandalism, for instance, this system can ensure that everybody inside and outside the building is safe because the steel bars take over the tensile forces and make the glass more ductile – meaning that it can sustain significant deformations. This prevents the structure from collapsing immediately and without warning,” explains Cupać. The results of her research challenge preconceived ideas about the fragility of glass, which can in fact be made as resilient and ductile as more conventional load-bearing materials.

Three types of prestressed glass

Alongside the theoretical component of her thesis, Cupać suggests three types of post-tensioned glass beams that can be directly applied in construction. That's how she got the idea of joining forces with Alexander Wolhoff, who has just completed a Master's in architecture at EPFL. “Working with an architect meant that I could apply my thesis findings directly to the construction sector. Adding an artistic angle made the whole project more interesting,” says Cupać. Wolhoff also highlights the importance of sharing knowledge: “What made this combination of two different areas of expertise so enriching was that we had to pull each other out of our comfort zones and be inventive. The ATLAS project wouldn't have been possible without this in-depth exchange and our willingness to listen to each other.”

The ATLAS project

So that's how ATLAS came to be. It is a bench composed of three post-tensioned glass beams reinforced with steel, and two support blocks made out of high-performance fiber-reinforced concrete (HPFRC). These blocks are tinted with a shiny surface, and transfer the load from the beams to the ground. Wolhoff developed a special type of concrete for the project, based partly on his own research. EPFL's Structural Maintenance and Safety Laboratory (MCS) helped produce the two supports. Christophe Loraux, a PhD student in the MCS Laboratory, got involved in the project, advising Wolhoff to use HPFRC instead of conventional concrete, which has several drawbacks for this type of application: “It made sense to use HPFRC because it goes well with glass. It meant we combined two highly resistant materials that are stable over time.” Perhaps this is another nod to the Greek mythical figure.

The HPFRC made with synthetic fibers, which was developed by the MCS Laboratory, was thus chosen because it offered numerous advantages: it’s easy to cast, mechanically resistant – both when compressed and in terms of tensile strength – and doesn’t wear over time. The resulting material can cause confusion: it looks more like porcelain than conventional concrete. The ATLAS project plays with this visual aspect: the bench resembles something between glass and porcelain, and onlookers would be forgiven for thinking it is fragile.

Sparking the curiosity of sponsors

To get the other materials, the researchers had to convince potential sponsors, as Cupać explains: “One Zurich-based company specialized in the field of post-tensioning, commonly applied in the construction of concrete bridges, had never heard of this glass-steel combination and agreed to provide us the steel. And the adhesive we chose tends to be used only in electronic components, even though we think it has great potential for other applications. It was all put together by a major constructor of glass facades that hadn’t heard of this type of beam before either.”

The ATLAS bench will be showcased for the first time outdoors on the EPFL campus before being moved to a more permanent, indoor location. Cupać and Wolhoff are still looking for the perfect spot.




Contacts and sources:
 Sandrine Perroud
École polytechnique fédérale de Lausanne (EPFL)

The ATLAS project is the result of a collaboration between EPFL's RESSLAB, ALICE and MCS Laboratories and was supported by BBR Network Group, Huntsman, AGC VIM and Félix Constructions.
ATLAS project: ALICE blog, EPFL
ATLAS project: tumblr

Reference: Jagoda Cupać, “Post-tensioned glass beams,” supervised by Professor Alain Nussbaumer and Professor Christian Louter, September 2017.

Researchers Discover a Drug Combination That Can Regenerate Hair Cells In The Inner Ear to Restore Hearing

Within the inner ear, thousands of hair cells detect sound waves and translate them into nerve signals that allow us to hear speech, music, and other everyday sounds. Damage to these cells is one of the leading causes of hearing loss, which affects 48 million Americans.

Each of us is born with about 15,000 hair cells per ear, and once damaged, these cells cannot regrow. However, researchers at MIT, Brigham and Women’s Hospital, and Massachusetts Eye and Ear have now discovered a combination of drugs that expands the population of progenitor cells (also called supporting cells) in the ear and induces them to become hair cells, offering a potential new way to treat hearing loss.

This image shows large clonal colonies of cochlear progenitor cells formed from single cells and converted into high-purity colonies of hair cells (cyan) with intricate hair bundles (red).
Image: Will McLean

“Hearing loss is a real problem as people get older. It’s very much of an unmet need, and this is an entirely new approach,” says Robert Langer, the David H. Koch Institute Professor at MIT, a member of the Koch Institute for Integrative Cancer Research, and one of the senior authors of the study.

Jeffrey Karp, an associate professor of medicine at Brigham and Women’s Hospital (BWH) and Harvard Medical School in Boston; and Albert Edge, a professor of otolaryngology at Harvard Medical School based at Massachusetts Eye and Ear, are also senior authors of the paper, which appears in the Feb. 21 issue of Cell Reports.

Lead authors are Will McLean, a recent PhD recipient at the Harvard-MIT Division of Health Sciences and Technology, and Xiaolei Yin, an instructor at Brigham and Women’s and a research affiliate at the Koch Institute. Other authors are former MIT visiting student Lin Lu, Mass Eye and Ear postdoc Danielle Lenz, and Mass Eye and Ear research assistant Dalton McLean.

Cell regeneration

Noise exposure, aging, and some antibiotics and chemotherapy drugs can lead to hair cell death. In some animals, those cells naturally regenerate, but not in humans.

The research team began investigating the possibility of regenerating hair cells during an earlier study on cells of the intestinal lining. In that study, published in 2013, Karp, Langer, Yin, and others reported that they could generate large quantities of immature intestinal cells and then stimulate them to differentiate, by exposing them to certain molecules.

During that study, the team became aware that cells that provide structural support in the cochlea express some of the same surface proteins as intestinal stem cells. The researchers decided to explore whether the same approach would work in those supporting cells.

This image shows large clonal colonies of cochlear progenitor cells formed from single cells and converted into high-purity colonies of hair cells (magenta) with intricate hair bundles (cyan).
Image: Will McLean

They exposed cells from a mouse cochlea, grown in a lab dish, to molecules that stimulate the Wnt pathway, which makes the cells multiply rapidly.

“We used small molecules to activate the supporting cells so they become proliferative and can generate hair cells,” Yin says.

At the same time, to prevent the cells from differentiating too soon, the researchers also exposed the cells to molecules that activate another signaling pathway known as Notch.

Once they had a large pool of immature progenitor cells (about 2,000-fold greater than any previously reported), the researchers added another set of molecules that provoked the cells to differentiate into mature hair cells. This procedure generates about 60 times more mature hair cells than the previous best technique, which uses growth factors to induce cochlear supporting cells to become hair cells without first expanding the population.

The researchers found that their new approach also worked in an intact mouse cochlea removed from the body. In that experiment, the researchers did not need to add the second set of drugs because once the progenitor cells were formed, they were naturally exposed to signals that stimulated them to become mature hair cells.

“We only need to promote the proliferation of these supporting cells, and then the natural signaling cascade that exists in the body will drive a portion of those cells to become hair cells,” Karp says.

Easy administration

Because this treatment involves a simple drug exposure, the researchers believe it could be easy to administer to human patients. They envision that the drugs could be injected into the middle ear, from which they would diffuse across a membrane into the inner ear. This type of injection is commonly performed to treat ear infections.

Some of the researchers have started a company called Frequency Therapeutics, which has licensed the MIT/BWH technology and plans to begin testing it in human patients within 18 months.

Jeffrey Holt, a professor of otolaryngology and neurology at Boston Children’s Hospital and Harvard Medical School, says this approach holds potential for treating hearing loss, if its safety and effectiveness can be demonstrated.

“The ability to promote proliferation of inner-ear stem cells and direct their maturation toward an auditory hair cell fate is an important advance that will accelerate the pace of scientific discovery and facilitate translation of regenerative medicine approaches for restoration of auditory function in patients with acquired hearing loss,” says Holt, who was not involved in the research.

The researchers also hope their work will help other scientists who study hearing loss.

“Drug discovery for the inner ear has been limited by the inability to acquire enough progenitor cells or sensory hair cells to explore drug targets and their effects on these cell types,” McLean says. “We hope that our work will serve as a useful tool for other scientists to more effectively pursue studies of supporting cells and hair cells for basic research and potential therapeutic solutions to hearing loss.”

Karp, Langer, and Yin are also working on applying this approach to other types of cells, including types of intestinal cells involved in insulin regulation and control of the gut microbiota.

The research was funded by the National Institutes of Health, the European Commission, the Harvard-MIT IDEA2 Award, the Shulsky Foundation, and Robert Boucai.




Contacts and sources:
Anne Trafton 
Massachusetts Institute of Technology.

Gold Nanoparticles Trick, Attack and Destroy Viruses



École polytechnique fédérale de Lausanne (EPFL) researchers have created nanoparticles that attract viruses and, using the pressure resulting from the binding process, destroy them. This revolutionary approach could lead to the development of broad-spectrum antiviral drugs.

HIV, dengue, papillomavirus, herpes and Ebola – these are just some of the many viruses that kill millions of people every year, mostly children in developing countries. While drugs can be used against some viruses, there is currently no broad-spectrum treatment that is effective against several at the same time, in the same way that broad-spectrum antibiotics fight a range of bacteria. But researchers at EPFL's Supramolecular Nano-Materials and Interfaces Laboratory – Constellium Chair (SUNMIL) have created gold nanoparticles for just this purpose, and their findings could lead to a broad-spectrum treatment.

Once injected into the body, these nanoparticles imitate human cells and “trick” the viruses. When the viruses bind to them – in order to infect them – the nanoparticles use pressure produced locally by this link-up to “break” the viruses, rendering them innocuous. The results of this research have just been published in Nature Materials.

Cartoon depicting an attack by the nanoparticles on a virus, leading to its loss of integrity
Credit: © SUNMIL/EPFL

Pressing need for a broad-spectrum treatment

“Fortunately, we have drugs that are effective against some viruses, like HIV and hepatitis C,” says Francesco Stellacci, who runs SUNMIL in EPFL's School of Engineering. “But these drugs work only on a specific virus.” Hence the need for broad-spectrum antiviral drugs, which would let doctors use a single treatment against the many viruses that remain deadly because no therapy currently exists. Such non-specific treatments are especially needed in countries – particularly in developing regions – where doctors do not have the tools they need to make accurate diagnoses. Broad-spectrum antiviral drugs would also help curb the antimicrobial resistance that results from the over-prescription of antibiotics. “Doctors often prescribe antibiotics in response to viral infections, since there is no other drug available. But antibiotics are only effective against bacteria, and this blanket use fosters bacterial mutations and a build-up of resistance in humans,” says Stellacci.

Tricky nanoparticles

Until now, research into broad-spectrum virus treatments has only produced approaches that are toxic to humans or that work effectively in vitro – i.e., in the lab – but not in vivo. The EPFL researchers found a way around these problems by creating gold nanoparticles that are harmless to humans and that imitate human cell receptors – specifically the ones viruses seek out in order to attach to cells. Viruses infect the body by binding to cells and then replicating inside them. The nanoparticles work by tricking the viruses into thinking they are invading a human cell: when a virus binds to a nanoparticle, the resulting pressure deforms the virus and opens it up, rendering it harmless. Unlike other treatments, the use of pressure is non-toxic. “Viruses replicate within cells, and it is very difficult to find a chemical substance that attacks viruses without harming the host cells,” says Stellacci. “But until now, that has been the only approach attempted to permanently damage viruses.” The method developed at SUNMIL is unique in that it achieves permanent damage to the virus's integrity without harming living cells.

Encouraging results on several viruses

Successful in vitro experiments have been conducted on cell cultures infected with herpes simplex virus, papillomavirus (which can lead to cervical cancer), respiratory syncytial virus (RSV, which can cause pneumonia), dengue virus and HIV (lentivirus). In other tests, mice infected with RSV were cured. For this project, the SUNMIL researchers teamed up with several other universities that contributed their expertise in nanomaterials and virology.

The study, “Broad-spectrum non-toxic antiviral nanoparticles with a virucidal inhibition mechanism,” was published in Nature Materials. This research was funded in part by the Fondation Leenaards and the NCCR Bio-Inspired Materials.




Contacts and sources:
Clara Marc
École polytechnique fédérale de Lausanne (EPFL)

New Way for DNA to Control Synthetic Chemical Systems Precisely

DNA molecules that follow specific instructions could offer more precise molecular control of synthetic chemical systems, a discovery that opens the door for engineers to create molecular machines with new and complex behaviors. Researchers have created chemical amplifiers and a chemical oscillator using a systematic method that has the potential to embed sophisticated circuit computation within molecular systems designed for applications in health care, advanced materials and nanotechnology.

Chemical oscillators have long been studied by engineers and scientists. The researchers who discovered the chemical oscillator that controls the human circadian rhythm, responsible for our bodies’ day and night cycle, earned the 2017 Nobel Prize in physiology or medicine.

Though understanding of chemical oscillators and other biological chemical processes has evolved significantly, scientists do not know enough to control the chemical activities of living cells. This is leading engineers and scientists to turn to synthetic oscillators that work in test tubes rather than in cells.

Credit: University of Texas-Austin

The findings are published in the Dec. 15 issue of the journal Science.



In the new study, David Soloveichik and his research team in the Cockrell School of Engineering at The University of Texas at Austin show how to program synthetic oscillators and other systems by building DNA molecules that follow specific instructions.

Soloveichik, an assistant professor in the Cockrell School’s Department of Electrical and Computer Engineering, along with Niranjan Srinivas, a graduate student at the California Institute of Technology, and the study’s co-authors, have successfully constructed a first-of-its-kind chemical oscillator that uses DNA components — and no proteins, enzymes or other cellular components — demonstrating that DNA alone is capable of complex behavior.

According to the researchers, their discovery suggests that DNA can be much more than simply a passive molecule used solely to carry genetic information. “DNA can be used in a much more active manner,” Soloveichik said. “We can actually make it dance — with a rhythm, if you will. This suggests that nucleic acids (DNA and RNA) might be doing more than we thought, which can even inform our understanding of the origin of life, since it is commonly thought that early life was based entirely on RNA.”

The team’s new synthetic oscillator could one day be used in synthetic biology or in completely artificial cells, ensuring that certain processes happen in order. But oscillation is just one example of sophisticated molecular behavior. Looking beyond oscillators, this work opens the door for engineers to create more sophisticated molecular machines out of DNA. Depending on how the molecular machines are programmed, different behaviors could be generated, such as communication and signal processing, problem-solving and decision-making, control of motion, etc. — the kind of circuit computation generally attributed only to electronic circuits.

“As engineers, we are very good at building sophisticated electronics, but biology uses complex chemical reactions inside cells to do many of the same kinds of things, like making decisions,” Soloveichik said. “Eventually, we want to be able to interact with the chemical circuits of a cell, or fix malfunctioning circuits or even reprogram them for greater control. But in the near term, our DNA circuits could be used to program the behavior of cell-free chemical systems that synthesize complex molecules, diagnose complex chemical signatures and respond to their environments.”

The team developed their new oscillator by building DNA molecules according to a specific programming language, producing a repeatable workflow that can generate other complex temporal patterns and respond to input chemical signals. They compiled their language down to precise molecular interactions — a standard practice in the field of electronics but completely novel in biochemistry.
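To give a feel for what an abstract chemical "program" of this kind looks like before it is compiled into DNA interactions, the sketch below simulates a simple three-species autocatalytic network, a textbook example of the sort of scheme molecular programmers target. The species, reactions and rate constant are illustrative assumptions, not the network or parameters reported in the Science paper.

```python
# Minimal sketch (illustrative reactions and rate constant, not the published
# design): mass-action dynamics of a three-species autocatalytic network,
#   A + B -> 2B,   B + C -> 2C,   C + A -> 2A,
# the kind of abstract chemical program that can be compiled into DNA
# strand-displacement reactions. The deterministic dynamics oscillate.
import numpy as np
from scipy.integrate import solve_ivp

k = 0.5  # assumed rate constant shared by all three reactions (arbitrary units)

def rates(t, y):
    a, b, c = y
    return [k * a * (c - b),   # A is produced by C and consumed by B
            k * b * (a - c),   # B is produced by A and consumed by C
            k * c * (b - a)]   # C is produced by B and consumed by A

sol = solve_ivp(rates, (0.0, 200.0), [1.0, 0.5, 0.2], dense_output=True, rtol=1e-8)
t = np.linspace(0.0, 200.0, 2000)
a, b, c = sol.sol(t)
print(f"[A] cycles between {a.min():.2f} and {a.max():.2f}; "
      f"total concentration is conserved at {a[0] + b[0] + c[0]:.2f}")
```

In the compiled system, each formal reaction of this kind is realized by a set of concrete DNA interactions, which is what makes the behavior programmable.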

The team’s research was conducted as part of the National Science Foundation’s (NSF) Molecular Programming Project, which launched in 2008 as a faculty collaboration to develop molecular programming into a sophisticated, user-friendly and widely used technology for creating nanoscale devices and systems.

Funding for the UT Austin team’s work was provided by the NSF, the Office of Naval Research, the National Institutes of Health and the Gordon and Betty Moore Foundation.



Contacts and sources:
University of Texas-Austin

New Twist in the Dark Matter Tale

An innovative interpretation of X-ray data from a cluster of galaxies could help scientists fulfill a quest they have been on for decades: determining the nature of dark matter.

The finding involves a new explanation for a set of results made with NASA’s Chandra X-ray Observatory, ESA’s XMM-Newton and Hitomi, a Japanese-led X-ray telescope. If confirmed with future observations, this may represent a major step forward in understanding the nature of the mysterious, invisible substance that makes up about 85% of matter in the universe.

“We expect that this result will either be hugely important or a total dud,” said Joseph Conlon of Oxford University who led the new study. “I don't think there is a halfway point when you are looking for answers to one of the biggest questions in science.”

Composite image of the Perseus galaxy cluster using data from NASA’s Chandra X-ray Observatory, ESA’s XMM-Newton and Hitomi, a Japanese-led X-ray telescope.
Credits: X-ray: NASA/CXO/Fabian et al.; Radio: Gendron-Marsolais et al./NRAO/AUI/NSF; Optical: NASA, SDSS

The story of this work started in 2014 when a team of astronomers led by Esra Bulbul (Harvard-Smithsonian Center for Astrophysics in Cambridge, Mass.) found a spike of intensity at a very specific energy in Chandra and XMM-Newton observations of the hot gas in the Perseus galaxy cluster.

This spike, or emission line, is at an energy of 3.5 kiloelectron volts (keV). The intensity of the 3.5 keV emission line is very difficult if not impossible to explain in terms of previously observed or predicted features from astronomical objects, and therefore a dark matter origin was suggested. Bulbul and colleagues also reported the existence of the 3.5 keV line in a study of 73 other galaxy clusters using XMM-Newton.

The plot of this dark matter tale thickened when, only a week after Bulbul’s team submitted their paper, a different group, led by Alexey Boyarsky of Leiden University in the Netherlands, reported evidence for an emission line at 3.5 keV in XMM-Newton observations of the galaxy M31 and the outskirts of the Perseus cluster, confirming the Bulbul et al. result.

However, these two results were controversial: some astronomers later detected the 3.5 keV line when observing other objects, while others failed to detect it.

The debate seemed to be resolved in 2016 when Hitomi, a telescope especially designed to observe detailed features such as line emission in the X-ray spectra of cosmic sources, failed to detect the 3.5 keV line in the Perseus cluster.

“One might think that when Hitomi didn’t see the 3.5 keV line that we would have just thrown in the towel for this line of investigation,” said co-author Francesca Day, also from Oxford. “On the contrary, this is where, like in any good story, an interesting plot twist occurred.”

Conlon and colleagues noted that the Hitomi telescope had much fuzzier images than Chandra, so its data on the Perseus cluster are actually a blend of the X-ray signals from two sources: a diffuse component of hot gas enveloping the large galaxy at the center of the cluster, and X-ray emission from material near the supermassive black hole in that galaxy. The sharper vision of Chandra can separate the contributions from the two regions. Capitalizing on this, Bulbul et al. isolated the X-ray signal from the hot gas by removing point sources from their analysis, including X-rays from material near the supermassive black hole.

In order to test whether this difference mattered, the Oxford team re-analyzed Chandra data from close to the black hole at the center of the Perseus cluster taken in 2009. They found something surprising: evidence for a deficit rather than a surplus of X-rays at 3.5 keV. This suggests that something in Perseus is absorbing X-rays at this exact energy. When the researchers simulated the Hitomi spectrum by adding this absorption line to the hot gas’ emission line seen with Chandra and XMM-Newton, they found no evidence in the summed spectrum for either absorption or emission of X-rays at 3.5 keV, consistent with the Hitomi observations.
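The cancellation argument is easy to reproduce numerically. The toy Python sketch below (assumed Gaussian line shapes and amplitudes, chosen only for illustration) adds an emission line from the hot gas to an equal-strength absorption dip from the sightline toward the black hole and shows that the blended spectrum, which is all a low-resolution instrument would record, carries no net feature at 3.5 keV.

```python
# Minimal sketch (assumed line shapes, widths and amplitudes): an emission line
# from the cluster gas and an equally strong absorption dip toward the central
# black hole cancel when an instrument with coarse angular resolution blends
# the two regions into a single spectrum.
import numpy as np

energy_kev = np.linspace(3.0, 4.0, 1000)
continuum = np.ones_like(energy_kev)                    # flat continuum, arbitrary units
line = np.exp(-0.5 * ((energy_kev - 3.5) / 0.02) ** 2)  # narrow Gaussian at 3.5 keV

gas_spectrum = continuum + 0.1 * line   # emission seen looking at hot gas away from the core
agn_spectrum = continuum - 0.1 * line   # absorption seen on the sightline to the black hole

blended = 0.5 * (gas_spectrum + agn_spectrum)           # what a "fuzzy" telescope records
print("largest deviation of the blended spectrum from the continuum:",
      float(np.abs(blended - continuum).max()))         # ~0: no net 3.5 keV feature
```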

The challenge is to explain this behavior: detecting absorption of X-ray light when observing the black hole and emission of X-ray light at the same energy when looking at the hot gas at larger angles away from the black hole.

In fact, such behavior is well known to astronomers who study stars and clouds of gas with optical telescopes. Light from a star surrounded by a cloud of gas often shows absorption lines produced when starlight of a specific energy is absorbed by atoms in the gas cloud. The absorption kicks the atoms from a low to a high energy state. The atom quickly drops back to the low energy state with the emission of light of a specific energy, but the light is re-emitted in all directions, producing a net loss of light at the specific energy – an absorption line – in the observed spectrum of the star. In contrast, an observation of a cloud in a direction away from the star would detect only the re-emitted, or fluorescent light at a specific energy, which would show up as an emission line.

In 2014, astronomers detected an unusual feature in X-ray data from Perseus and other galaxy clusters. Since then, Chandra and other X-ray telescopes have taken more observations to replicate the finding. The existence and interpretation of this spike in X-ray light has been controversial and difficult to explain. Now, a new team of astronomers has reanalyzed these and other data and come up with a fresh take on this debate. They suggest that dark matter particles in the galaxy clusters are both absorbing and emitting X-rays. If the new model turns out to be correct, it could provide a path for scientists to identify the true nature of dark matter.

Credit: Chandra X-ray Center

The Oxford team suggests in their report that dark matter particles may be like atoms in having two energy states separated by 3.5 keV. If so, it could be possible to observe an absorption line at 3.5 keV when observing at angles close to the direction of the black hole, and an emission line when looking at the cluster hot gas at large angles away from the black hole.

“This is not a simple picture to paint, but it’s possible that we’ve found a way to both explain the unusual X-ray signals coming from Perseus and uncover a hint about what dark matter actually is,” said co-author Nicholas Jennings, also of Oxford.

To write the next chapter of this story, astronomers will need further observations of the Perseus cluster and others like it. For example, more data is needed to confirm the reality of the dip and to exclude a more mundane possibility, namely that we have a combination of an unexpected instrumental effect and a statistically unlikely dip in X-rays at an energy of 3.5 keV. Chandra, XMM-Newton and future X-ray missions will continue to observe clusters to address the dark matter mystery.

A paper describing these results was published in Physical Review D on December 19, 2017 and a preprint is available online. The other co-authors of the paper are Sven Krippendorf and Markus Rummel, both from Oxford. NASA's Marshall Space Flight Center in Huntsville, Alabama, manages the Chandra program for NASA's Science Mission Directorate in Washington. The Smithsonian Astrophysical Observatory in Cambridge, Massachusetts, controls Chandra's science and flight operations.



Contacts and sources:
Molly Porter
NASA Marshall Space Flight Center, Huntsville, Ala.

Megan Watzke
Chandra X-ray Center

Cleaner Air Means Longer Life Says Research Team

The air we breathe contains particulate matter from a range of natural and human-related sources. Particulate matter is responsible for thousands of premature deaths in the United States each year, but legislation from the U.S. Environmental Protection Agency (EPA) is credited with significantly decreasing this number, as well as the amount of particulate matter in the atmosphere. However, the EPA may not be getting the full credit they deserve: New research from MIT’s Department of Civil and Environmental Engineering (CEE) proposes that the EPA’s legislation may have saved even more lives than initially reported.

“In the United States, the number of premature deaths associated with exposure to outdoor particulate matter exceeds the number of car accident fatalities every year. This highlights the vital role that the EPA plays in reducing the exposure of people living in the United States to harmful pollutants,” says Colette Heald, associate professor in CEE and the Department of Earth, Atmospheric and Planetary Sciences.

MIT researchers found a more dramatic decline in organic aerosol across the U.S. than previously reported, which may account for more lives saved than the U.S. Environmental Protection Agency anticipated in a 2011 report on the Clean Air Act and amendments. The study found that the decline is likely due to human behaviors.
Photo: Andrius K / Shutterstock

The EPA’s 1970 Clean Air Act and amendments in 1990 address the health effects of particulate matter, specifically by regulating emissions of air pollutants and promoting research into cleaner alternatives. In 2011 the EPA announced that the legislation was responsible for a considerable decrease in particulate matter in the atmosphere, estimating that over 100,000 lives were saved every year from 2000 to 2010. However, the report did not consider organic aerosol, a major component of atmospheric particulate matter, to be a large contributor to the decline in particulate matter during this period. Organic aerosol is emitted directly from fossil fuel combustion (e.g. vehicles), residential burning, and wildfires but is also chemically produced in the atmosphere from the oxidation of both natural and anthropogenically emitted hydrocarbons.

The CEE research team, including Heald; Jesse Kroll, an associate professor of CEE and of chemical engineering; David Ridley, a research scientist in CEE; and Kelsey Ridley SM ’15, looked at surface measurements of organic aerosol from across the United States from 1990 to 2012, creating a comprehensive picture of organic aerosol in the United States.

“Widespread monitoring of air pollutant concentrations across the United States enables us to verify changes in air quality over time in response to regulations. Previous work has focused on the decline in particulate matter associated with efforts to reduce acid rain in the United States. But to date, no one had really explored the long-term trend in organic aerosol,” Heald says.

The MIT researchers found a more dramatic decline in organic aerosol across the U.S. than previously reported, which may account for more lives saved than the EPA anticipated. Their work showed that these changes are likely due to anthropogenic, or human, behaviors. The paper is published this week in Proceedings of the National Academy of Sciences.

“The EPA report showed a very large impact from the decline in particulate matter, but we were surprised to see very little change in the organic aerosol concentration in their estimates,” explains Ridley. “The observations suggest that the decrease in organic aerosol between 2000 and 2010 was six times larger than estimated in the EPA report.”

Using data from the Interagency Monitoring of Protected Visual Environments (IMPROVE) network, the researchers found that organic aerosol decreased across the entire country in both the winter and summer seasons. This decline is surprising given the increase in wildfires, but the researchers found that organic aerosol continues to decline despite them.

The researchers also used information from NASA’s Modern-Era Retrospective analysis for Research and Applications to analyze the impact of other natural influences on organic aerosol, such as precipitation and temperature, and found that the decline occurred regardless of changes in cloud cover, rain, and temperature.

The absence of a clear natural cause for the decline in organic aerosol suggests the decline was the result of anthropogenic causes. Further, the decline in organic aerosol was similar to the decrease in other measured atmospheric pollutants, such as nitrogen dioxide and carbon monoxide, which are likewise thought to be due to EPA regulations. Also, similarities in trends across both urban and rural areas suggest that the declines may also be the result of behavioral changes stemming from EPA regulations.

By leveraging the emissions data of organic aerosol and its precursors, from both natural and anthropogenic sources, the researchers simulated organic aerosol concentrations from 1990 to 2012 in a model. They found that more than half of the decline in organic aerosol is accounted for by changes in human emissions behaviors, including vehicle emissions and residential and commercial fuel burning.

“We see that the model captures much of the observed trend of organic aerosol across the U.S., and we can explain a lot of that purely through changes in anthropogenic emissions. The changes in organic aerosol emissions are likely to be indirectly driven by controls by the EPA on different species, like black carbon from fuel burning and nitrogen dioxide from vehicles,” says Ridley. “This wasn’t really something that the EPA was anticipating, so it’s an added benefit of the Clean Air Act.”

In considering mortality rates and the impact of organic aerosol over time, the researchers used a previously established method that relates exposure to particulate matter to increased risk of mortality through different diseases such as cardiovascular disease or respiratory disease. The researchers could thus figure out the change in mortality rate based on the change in particulate matter. Since the researchers knew how much organic aerosol is in the particulate matter samples, they were able to determine how much changes in organic aerosol levels decreased mortality.
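For readers unfamiliar with this kind of calculation, the sketch below illustrates one common form of such an exposure-response relationship, a log-linear link between a change in particulate-matter concentration and relative mortality risk. The coefficient, exposure change and baseline figure are placeholders for illustration only, not values from the MIT study.

```python
# Minimal sketch (assumed coefficient, exposure change and baseline; placeholders
# only): a log-linear exposure-response relationship of the kind commonly used to
# translate a decline in particulate matter into avoided premature deaths.
import math

beta_per_ug_m3 = 0.006              # assumed risk coefficient per ug/m^3 of particulate matter
delta_pm_ug_m3 = -2.0               # assumed decline in annual-average exposure
baseline_deaths_per_year = 100_000  # placeholder baseline premature deaths

relative_risk = math.exp(beta_per_ug_m3 * delta_pm_ug_m3)        # risk after the decline
deaths_avoided = baseline_deaths_per_year * (1.0 - relative_risk)
print(f"relative risk {relative_risk:.3f} -> about {deaths_avoided:.0f} deaths avoided per year")
```

Because organic aerosol is a known share of the measured particulate matter, the same relationship lets the researchers attribute part of the avoided deaths specifically to the organic aerosol decline.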

“There are costs and benefits to implementing regulations such as those in the Clean Air Act, but it seems that we are reaping even greater benefits from the reduced mortality associated with particulate matter because of the change in organic aerosol,” Ridley says. “There are health benefits to reducing organic aerosol further, especially in urban locations. As we do, natural sources will contribute a larger fraction, so we need to understand how they will vary into the future too.”

This research was funded, in part, by the National Science Foundation, the National Aeronautics and Space Administration, and the National Oceanic and Atmospheric Administration.



Contacts and sources:
Carolyn Schmitt
Department of Civil and Environmental Engineering
Massachusetts Institute of Technology (MIT)

Hyperlens Crystal Capable of Viewing Living Cells in Unprecedented Detail



Just imagine: An optical lens so powerful that it lets you view features the size of a small virus on the surface of a living cell in its natural environment.

Construction of instruments with this capability is now possible because of a fundamental advance in the quality of an optical material used in hyperlensing, a method of creating lenses that can resolve objects much smaller than the wavelength of light. The achievement was reported by a team of researchers led by Joshua Caldwell, associate professor of mechanical engineering at Vanderbilt University, in a paper published Nov. 11 in the journal Nature Materials.

The optical material involved is hexagonal boron nitride (hBN), a natural crystal with hyperlensing properties. The best previously reported resolution using hBN was an object about 36 times smaller than the infrared wavelength used: about the size of the smallest bacteria. The new paper describes improvements in the quality of the crystal that enhance its potential imaging capability by about a factor of ten.

New hyperlens crystal is capable of resolving details as small as a virus on the surface of living cells. The atomic structure of the hexagonal boron nitride crystal is shown in the cutout. 
Credit: Keith Wood / Vanderbilt

The researchers achieved this enhancement by making hBN crystals using isotopically purified boron. Natural boron contains two isotopes that differ in weight by about 10 percent, a combination that significantly degrades the crystal’s optical properties in the infrared.

“We have demonstrated that the inherent efficiency limitations of hyperlenses can be overcome through isotopic engineering,” said team member Alexander Giles, a research physicist at the U.S. Naval Research Laboratory. “Controlling and manipulating light at nanoscale dimensions is notoriously difficult and inefficient. Our work provides a new path forward for the next generation of materials and devices.”

Researchers from the University of California, San Diego, Kansas State University, Oak Ridge National Laboratory and Columbia University also contributed to the study.

The researchers calculate that a lens made from their purified crystal can in principle capture images of objects as small as 30 nanometers in size. To put this in perspective, there are 25 million nanometers in an inch and human hair ranges from 80,000 to 100,000 nanometers in diameter. A human red blood cell is about 9,000 nanometers and viruses range from 20 to 400 nanometers.

Over the years, scientists have developed many instruments capable of producing images with nanoscale resolution, such as electron-based and atomic-force microscopes. However, they are incompatible with living organisms: either they operate under a high vacuum, expose samples to harmful levels of radiation, require lethal sample preparation techniques like freeze drying or remove samples from their natural, solution-based environment.

The primary reason for developing hyperlenses is the prospect that they can provide such highly detailed images of living cells in their natural environments using low-energy light that does not harm them. In addition, using infrared light to perform the imaging can also provide spectroscopic information about the objects it images, providing a means to ‘fingerprint’ the material. These capabilities could have a significant impact on biological and medical science. The technology also has potential applications in communications and nanoscale optical components.

The physics of hyperlenses is quite complex. The level of detail that optical microscopes can image is limited by the wavelength of light and the index of refraction of the lens material. When combined with the factors of lens aperture, distance from the object to the lens and the refractive index of the object under observation, this translates to a typical optical limit of about one half the wavelength used for imaging. At the infrared wavelengths used in this experiment, this “diffraction limit” is about 3,250 nanometers. This limit can be surpassed by using hBN due to its ability to support surface phonon polaritons, hybrid particles made up of photons of light coupling with vibrating, charged atoms in a crystal that have wavelengths much shorter than the incident light.
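The figures quoted above follow from the usual rule of thumb that conventional optics cannot resolve features much smaller than about half the illumination wavelength. The short sketch below makes that arithmetic explicit; the wavelength is back-calculated from the quoted 3,250-nanometer limit, and the 30-nanometer figure is the one given earlier in the article.

```python
# Minimal sketch: the arithmetic behind the quoted figures. Conventional optics
# resolves roughly half the wavelength; the hyperlens aims to resolve features
# roughly a hundred times smaller than that limit.
diffraction_limit_nm = 3250               # value quoted for the infrared light used here
wavelength_nm = 2 * diffraction_limit_nm  # implied illumination wavelength (~6,500 nm)
target_feature_nm = 30                    # smallest object the purified-hBN lens should resolve

improvement = diffraction_limit_nm / target_feature_nm
print(f"imaging at ~{wavelength_nm} nm, beating the {diffraction_limit_nm} nm "
      f"diffraction limit by a factor of about {improvement:.0f}")
```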

In the past, the problem with using polaritons in this fashion has been the rapidity with which they dissipate. By using hBN crystals made from 99 percent isotopically pure boron, the researchers measured a dramatic reduction in optical losses compared to natural crystals, increasing the polaritons’ lifetime three-fold, which allows them to travel three times as far. This improvement translates into a significant gain in imaging resolution. The researchers’ theoretical analysis suggests that another factor-of-ten improvement is possible.

“Currently, we have been testing very small flakes of purified hBN,” said Caldwell. “We think that we will see even further improvements with larger crystals.”

In 1674 Anton van Leeuwenhoek used one of the first handcrafted microscopes to discover the previously unknown world of microscopic life. This latest advance in hyperlens development is a significant step toward taking van Leeuwenhoek’s discovery to a whole new level, one which will allow biologists to directly observe cellular processes in action, like viruses invading cells or immune cells attacking foreign invaders.

Also contributing to the research were Siyuan Dai and Michael M. Fogler at the University of California, San Diego; Alexander J. Giles, Igor Vurgaftman, Chase T. Ellis, Thomas L Reinecke, Joseph G. Tischler, Nathanael Assefa and Ioannis Chatzakis at the U.S. Naval Research Laboratory; Timothy Hoffman, Song Liu and J.H. Edgar at Kansas State University; Lucas Lindsay at Oak Ridge National Laboratory and D.N. Basov at Columbia University. Various components and participants of this work were funded either in whole or part by the Office of Naval Research, the Army Research Office, the Air Force Office of Scientific Research, the National Science Foundation and the U.S. Department of Energy.




Contacts and sources:
David Salisbury
 Vanderbilt University