Unseen Is Free


Monday, February 20, 2017

Galactic Archaeology: Astronomers Map Our Sun's Family Tree


Astronomers are borrowing principles from biology and archaeology to build a family tree of the stars in the galaxy. By studying the chemical signatures found in stars, they are piecing together evolutionary trees that trace how stars formed and how they are connected to one another. The signatures act as a proxy for DNA sequences. This chemical tagging of stars forms the basis of a discipline astronomers refer to as Galactic archaeology.

It was Charles Darwin who, in 1859, published his revolutionary theory that all life forms are descended from one common ancestor. This theory has informed evolutionary biology ever since, but it was a chance encounter between an astronomer and a biologist over dinner at King’s College in Cambridge that got the astronomer thinking about how it could be applied to stars in the Milky Way.

Image showing family trees of stars in the solar neighbourhood, including the Sun

Credit: Institute of Astronomy


Writing in Monthly Notices of the Royal Astronomical Society, Dr Paula Jofré, of the University of Cambridge’s Institute of Astronomy, describes how she set about creating a phylogenetic “tree of life” that connects a number of stars in the galaxy.

“The use of algorithms to identify families of stars is a science that is constantly under development. Phylogenetic trees add an extra dimension to our endeavours, which is why this approach is so special. The branches of the tree serve to inform us about the stars’ shared history,” she says.

The team picked twenty-two stars, including the Sun, to study. Their chemical elements were carefully measured from high-resolution spectra taken with large ground-based telescopes in northern Chile. Once the families were identified using this chemical DNA, their evolution was studied with the help of ages and kinematic properties obtained from the space mission Hipparcos, the precursor of Gaia, the European Space Agency spacecraft that is almost halfway through a five-year project to map the sky.

Stars are born when gas clouds in the galaxy collapse and fragment, a process that can be triggered by violent events such as supernova explosions. Two stars with the same chemical composition are likely to have been born in the same molecular cloud. Some stars have lifespans longer than the current age of the Universe and so serve as fossil records of the composition of the gas at the time they were formed. The oldest star in the sample analysed by the team is estimated to be almost ten billion years old, more than twice the age of the Sun. The youngest is 700 million years old.

In evolution, organisms are linked together by a pattern of descent with modification as they evolve. Stars are very different from living organisms, but they still have a history of shared descent as they are formed from gas clouds, and carry that history in their chemical structure. By applying the same phylogenetic methods that biologists use to trace descent in plants and animals it is possible to explore the ‘evolution’ of stars in the Galaxy.
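As a loose illustration of the chemical-tagging idea, the sketch below (pure Python, with made-up star names and abundance values -- none of the numbers come from the paper) computes pairwise "chemical distances" between stars and links the closest pairs into candidate families, the kind of distance matrix a tree-building algorithm would consume:

```python
from itertools import combinations
import math

# Hypothetical chemical "DNA": abundance ratios ([Fe/H], [Mg/Fe]) per star.
# Values are illustrative, not taken from the paper.
stars = {
    "Sun":   (0.00, 0.00),
    "StarA": (0.02, 0.01),
    "StarB": (-0.45, 0.30),
    "StarC": (-0.50, 0.28),
}

def chem_distance(a, b):
    """Euclidean distance in abundance space -- the 'genetic distance' proxy."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Pairwise distance matrix over all star pairs.
dist = {
    (s1, s2): chem_distance(stars[s1], stars[s2])
    for s1, s2 in combinations(stars, 2)
}

# Simplest possible 'family' grouping: link stars closer than a threshold.
THRESHOLD = 0.1
families = [pair for pair, d in dist.items() if d < THRESHOLD]
```

A real analysis would feed such distances to phylogenetic software to recover branching order, not just flat groupings.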

“The differences between stars and animals are immense, but they share the property of changing over time, and so both can be analysed by building trees of their history,” says Professor Robert Foley, of the Leverhulme Centre for Human Evolutionary Studies at Cambridge.

With an increasing number of datasets available from Gaia and from more advanced telescopes on the ground, along with ongoing and future large spectroscopic surveys, astronomers are moving closer to being able to assemble a single tree connecting all the stars in the Milky Way.



Contacts and sources:
Paul Seagrove
University of Cambridge

Paula Jofré et al. ‘Cosmic phylogeny: reconstructing the chemical history of the solar neighbourhood with an evolutionary tree’ is published by Monthly Notices of the Royal Astronomical Society. DOI 10.1093/mnras/stx075

$100 Trillion Argument for a World Without Borders

In an ideal world, we would all be able to freely move wherever we wanted. The basic right of people to escape from war, persecution and poverty would be accepted as a given, and no one would have their life determined by their place of birth.

But we don’t live in this world, and national borders continue to block the freedom of people to move. Around the world, protectionism is on the rise, as people are told to blame outsiders for threatening their way of life and, more importantly, stealing their jobs.

There is, however, an overwhelming case for open borders that can be made even in the traditionally self-interested language of economics. In fact, our best estimates are that opening the world’s borders could increase global GDP by US$100 trillion.

Credit: Occupy Wall Street


That’s US$100,000,000,000,000

It sounds like a crazy idea, particularly when the media is dominated by stories about the need to control immigration and the right-wing tabloids trumpet “alternative facts” about how immigration hurts our economies. But every piece of evidence we have says that ending borders would be the single easiest way to improve the living standards of workers around the world – including those in wealthy countries.

The argument is simple enough and has been made by more than one economist. Workers in poorer economies make less than they should. If they were to have all of the benefits of rich countries – advanced education, the latest workplace technologies, and all the necessary infrastructure – these workers would produce and earn as much as their rich country counterparts. What keeps them in poverty is their surroundings. If they were able to pick up and move to more productive areas, they would see their incomes increase many times over.

This means that opening borders is, by a massive amount, the easiest and most effective way to tackle global poverty. Research shows that alternative approaches – for instance, microcredit, higher education standards, and anti-sweatshop activism – all produce lifetime economic gains that would be matched in weeks by open borders. Even small reductions in the barriers posed by borders would bring massive benefits for workers.

Gains for all

Of course, the immediate fear of having open borders is that it will increase competition for jobs and lower wages for those living in rich countries. This misses the fact that globalisation means competition already exists between workers worldwide – under conditions that harm their pay and security. UK workers in manufacturing or IT, for instance, are already competing with low-wage workers in India and Vietnam. Workers in rich countries are already losing, as companies eliminate good jobs and move their factories and offices elsewhere.

Under these circumstances, the function of borders is to keep workers trapped in low-wage areas that companies can freely exploit. Every worker – whether from a rich country or a poor country – suffers as a result. Ending borders would mean an end to this type of competition between workers. It would make us all better off.

The European Union has provided a natural experiment in what happens when borders between rich and poor countries are opened up. And the evidence here is unambiguous: the long-run effects of open borders improve the conditions and wages of all workers. However, in the short-run, some groups (particularly unskilled labourers) can be negatively affected.

The fixes for this are exceedingly simple, though. A shortening of the work week would reduce the amount of work supplied, spread the work out more equally among everyone, and give more power to workers -- not to mention more free time for everyone. And the strengthening and proper enforcement of labour laws would make it impossible for companies to hyper-exploit migrant workers. The overall impacts of more workers are exceedingly small in the short-run, and exceedingly positive in the long-run.

As it stands, borders leave workers stranded and competing against each other. The way the global economy is set up is based entirely on competition. This makes us think that potential allies are irreconcilable enemies. The real culprits, however, are businesses that pick up and leave at the drop of a hat, that fire long-time workers in favour of cheaper newcomers, and that break labour laws outright, in order to boost their profits.

Borders leave us as strangers rather than allies. Yet this need not be the case, and as a principle guiding political action, the abolition of borders would rank among the greatest of human achievements.


Contacts and sources:
Nick Srnicek, Lecturer in International Political Economy, 

This article was originally published on The Conversation. Read the original article.

'Tully Monster' Mystery Is Not Solved, Argues Penn-Led Group

Tullimonstrum gregarium, known as the Tully Monster, is an extinct, soft-bodied animal that lived in the shallow tropical coastal waters of muddy estuaries during the Pennsylvanian geological period, about 300 million years ago. Examples of Tullimonstrum have been found only in the Mazon Creek fossil beds of Illinois, United States. Until 2016, its classification was uncertain, and interpretations of the fossil likened it to a mollusc, an arthropod, a conodont, or to one of the many phyla of worms.

Last year, headlines in The New York Times, The Atlantic, Scientific American and other outlets declared that a decades-old paleontological mystery had been solved. The "Tully monster," an ancient animal that had long defied classification, was in fact a vertebrate, two groups of scientists claimed. Specifically, it seemed to be a type of fish called a lamprey.

The problem with this resolution? According to a group of paleobiologists led by the University of Pennsylvania's Lauren Sallan, it's plain wrong.

"This animal doesn't fit easy classification because it's so weird," said Sallan, an assistant professor in Penn's School of Arts & Sciences' Department of Earth and Environmental Science. "It has these eyes that are on stalks and it has this pincer at the end of a long proboscis and there's even disagreement about which way is up. But the last thing that the Tully monster could be is a fish."


Life reconstruction of Tullimonstrum gregarium based on studies by McCoy et al. 2016


Credit: Wikipedia/Nobu Tamura

In a new report in the journal Palaeontology, Sallan and colleagues argue that the two papers that seemingly settled the Tully monster debate are flawed, failing to definitively classify it as a vertebrate. The mystery of the Tully monster, known to scientists as Tullimonstrum gregarium, remains.

"It's important to incorporate all lines of evidence when considering enigmatic fossils: anatomical, preservational and comparative," said Sam Giles, a junior research fellow at the University of Oxford and coauthor of the study. "Applying that standard to the Tully monster argues strongly against a vertebrate identity."

Sallan and Giles coauthored the work with Robert Sansom of the University of Manchester, Penn postdoctoral researcher John Clarke, Zerina Johanson of the Natural History Museum, London, Ivan Sansom of the University of Birmingham and Philippe Janvier of France's Muséum National d'Histoire Naturelle.

The Tully monster has been known since the 1950s, when the first fossils were found in the Mazon Creek fossil beds in central Illinois. Since then, thousands of specimens have been identified from the area. The species is the state fossil of Illinois and even graces the side of U-Haul trucks. But none of the attempts over the last half century to assign it to an animal group had stuck.

"Initially it was published as a worm," Sallan said. "There is a well-constructed argument that it is some kind of mollusc, like a sea cucumber. And there's another very strong argument that it's some kind of arthropod, similar to a lobster."

That's why it took the scientific community by surprise when in 2016 two studies came out in close succession both claiming they had firm evidence that the Tully monster was in fact a vertebrate.

The first examined more than 1,200 Tully monster fossils. In some, the researchers observed a light band running down the creature's midline, which they determined was a notochord, a kind of primitive backbone. They also claimed it contained other internal organ structures, such as gill sacs, that identified it as a vertebrate, and that the animal's teeth resembled those of lamprey.

An illustration depicts what Mazon Creek may have looked like 300 million years ago, complete with Tully monsters (the two small swimming creatures), a large shark and a salamander relative.

Credit: John Megahan

But Sallan and colleagues noted that these conclusions are based on a misunderstanding of how fossils in Mazon Creek are preserved. The Tully monster samples come from what was once a marine area.

"In the marine rocks you just see soft tissues, you don't see much internal structure preserved," Sallan said.

The Penn-led team further noted that there have been lampreys found in this area of Mazon Creek, and that these animals don't resemble the Tully monster.

The other 2016 study reported that scanning electron microscope images of the Tully monsters' eyes had revealed structures called melanosomes, which produce and store melanin. That paper's authors argued that the complex tissue structure they saw in the animals' eyes indicated it was likely a vertebrate.

Yet animals besides vertebrates, such as arthropods and cephalopods like octopuses, also have complex eyes, the Penn-led team wrote.

"Eyes have evolved dozens of times," Sallan said. "It's not too much of a leap to imagine Tully monsters could have evolved an eye that resembled a vertebrate eye."

Based on Sallan's and her colleagues' examination of Tullimonstrum eyes, these creatures in fact possess what is known as a cup eye, a relatively simpler structure that lacks a lens.

"So the problem is, if it does have cup eyes, then it can't be a vertebrate because all vertebrates either have more complex eyes than that or they secondarily lost them," Sallan said. "But lots of other things have cup eyes, like primitive chordates, molluscs and certain types of worms."

Fossil in the Museo Civico di Storia Naturale di Milano. This specimen clearly shows the eye bar structure.
Credit: Wikimedia Commons/Ghedoghedo
Their Palaeontology report noted that none of the more than 1,000 examined Tully specimens appeared to possess structures that are believed to be universal in aquatic vertebrates, notably otic capsules, components of the ear that allow animals to balance, and a lateral line, a sensory structure that enables fishes to orient themselves in space.

"You would expect at least a handful of the specimens to have preserved these structures," Sallan said. "Not only does this creature have things that should not be preserved in vertebrates, it doesn't have things that absolutely should be preserved."

The researchers said that an improper classification of such an unusual species has ripple effects on the larger field of evolution.

The fossil invertebrate collection at The Field Museum holds several fossil specimens of the Tully Monster (Tullimonstrum gregarium). These rare fossils were first discovered by an amateur collector and it still remains a mystery what type of animal it is.



"Having this kind of misassignment really affects our understanding of vertebrate evolution and vertebrate diversity at this given time," Sallan said. "It makes it harder to get at how things are changing in response to an ecosystem if you have this outlier. And though of course there are outliers in the fossil record -- there are plenty of weird things and that's great -- if you're going to make extraordinary claims, you need extraordinary evidence."

As for the true identity of the Tully monster, the Penn-led team said that's still up in the air.



Contacts and sources:
Katherine Unger Baillie
University of Pennsylvania

Lithium-Sulfur Battery: The Next Big Leap in Portable Power


USC researchers may have just found a solution for one of the biggest stumbling blocks to the next wave of rechargeable batteries -- small enough for cellphones and powerful enough for cars.

In a paper published in the January issue of the Journal of the Electrochemical Society, Sri Narayan and Derek Moy of the USC Loker Hydrocarbon Research Institute outline how they developed an alteration to the lithium-sulfur battery that could make it more than competitive with the industry standard lithium-ion battery.

The lithium-sulfur battery, long thought to offer greater energy storage capacity than its more popular lithium-ion counterpart, has been hampered by its short cycle life. Currently the lithium-sulfur battery can be recharged only 50 to 100 times -- impractical as an alternative energy source compared with the 1,000 cycles of many rechargeable batteries on the market today.


This is a lithium-sulfur battery with a Mixed Conduction Membrane barrier to stop polysulfide shuttling.

Credit: Sri Narayan and Derek Moy


A small piece of material saves so much life

The solution devised by Narayan and lead author and research assistant Moy is something they call the "Mixed Conduction Membrane," or MCM, a small piece of non-porous, fabricated material sandwiched between two layers of porous separators, soaked in electrolytes and placed between the two electrodes.

The membrane works as a barrier in reducing the shuttling of dissolved polysulfides between anode and cathode, a process that increases the kind of cycle strain that has made the use of lithium-sulfur batteries for energy storage a challenge. The MCM still allows for the necessary movement of lithium ions, mimicking the process as it occurs in lithium-ion batteries. This novel membrane solution preserves the high-discharge rate capability and energy density without losing capacity over time.

At various rates of discharge, the researchers found that the lithium-sulfur batteries that made use of MCM led to 100 percent capacity retention and had up to four times longer life compared to batteries without the membrane.
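A quick back-of-envelope, assuming the quoted "up to four times" applies to the 50-to-100-cycle baseline above, shows where that would leave lithium-sulfur relative to today's market standard:

```python
# Back-of-envelope cycle-life comparison using the figures quoted above.
baseline_cycles = (50, 100)    # Li-S without the membrane
improvement = 4                # "up to four times longer life" with MCM
market_standard = 1000         # typical rechargeable batteries today

mcm_cycles = tuple(c * improvement for c in baseline_cycles)  # (200, 400)
gap_remaining = market_standard - mcm_cycles[1]
```

On this rough reading the MCM narrows the cycle-life gap substantially but does not close it, consistent with Narayan's remark below that other parts of the discharge-recharge cycle still need work.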

"This advance removes one of the major technical barriers to the commercialization of the lithium-sulfur battery, allowing us to realize better options for energy efficiency," said Narayan, senior author and professor of chemistry at the USC Dornsife College of Letters, Arts and Sciences. "We can now focus our efforts on improving other parts of lithium-sulfur battery discharge and recharge that hurt the overall life cycle of the battery."

Cheap and abundant building blocks

Lithium-sulfur batteries have a host of advantages over lithium-ion batteries: They are made with abundant and cheap sulfur, and are two to three times denser, which makes them both smaller and better at storing charge.

A lithium-sulfur battery would be ideal for saving space in mobile phones and computers, as well as allowing for weight reduction in future electric vehicles, including cars and even planes, further reducing reliance on fossil fuels, researchers said.

The actual MCM layer that Narayan and Moy devised is a thin film of lithiated cobalt oxide, though future alternative materials could produce even better results. According to Narayan and Moy, any substitute material used as an MCM must satisfy some fundamental criteria: The material must be non-porous, it should have mixed conduction properties and it must be electrochemically inert.



Contacts and sources:
Ian Chaffee
University of Southern California (USC)

Bee Decline Threatens US Crop Production: First US Wild Bee Map Reveals 139 'Trouble Zone' Counties

The first-ever study to map U.S. wild bees suggests they are disappearing in the country's most important farmlands -- from California's Central Valley to the Midwest's corn belt and the Mississippi River valley.

If wild bee declines continue, it could hurt U.S. crop production and farmers' costs, said Taylor Ricketts, a conservation ecologist at the University of Vermont, at the American Association for the Advancement of Science (AAAS) annual meeting panel, Plan Bee: Pollinators, Food Production and U.S. Policy on Feb. 19.

"This study provides the first national picture of wild bees and their impacts on pollination," said Ricketts, Director of UVM's Gund Institute for Ecological Economics, noting that each year $3 billion of the U.S. economy depends on pollination from native pollinators like wild bees.

The first national study to map US wild bees suggests they're disappearing in many of the country's most important farmlands. Relatively low abundances are shown here in yellow; higher abundances in blue.
Credit: PNAS

At AAAS, Ricketts briefed scholars, policy makers, and journalists on how the national bee map, first published in the Proceedings of the National Academy of Sciences in late 2015, can help to protect wild bees and pinpoint habitat restoration efforts.

At the event, Ricketts also introduced a new mobile app that he is co-developing to help farmers upgrade their farms to better support wild bees.

"Wild bees are a precious natural resource we should celebrate and protect," said Ricketts, Gund Professor in UVM's Rubenstein School of Environment and Natural Resources. "If managed with care, they can help us continue to produce billions of dollars in agricultural income and a wonderful diversity of nutritious food."

TROUBLE ZONES

The map identifies 139 counties in key agricultural regions -- California, the Pacific Northwest, the upper Midwest and Great Plains, west Texas, and the Mississippi River valley -- that appear to have the most worrisome mismatch between falling wild bee supply and rising crop pollination demand.

These counties tend to be places that grow specialty crops -- like almonds, blueberries and apples -- that are highly dependent on pollinators. Or they are counties that grow less dependent crops -- like soybeans, canola and cotton -- in very large quantities.

Of particular concern, some of the crops most dependent on pollinators -- including pumpkins, watermelons, pears, peaches, plums, apples and blueberries -- appeared to have the strongest pollination mismatch, growing in areas with falling wild bee supply and rising pollination demand.

Globally, more than two-thirds of the most important crops either benefit from or require pollinators, including coffee, cacao, and many fruits and vegetables.

Pesticides, climate change and diseases threaten wild bees -- but their decline may be caused by the conversion of bee habitat into cropland, the study suggests. In 11 key states where the map shows bees in decline, the amount of land tilled to grow corn spiked by 200 percent in five years -- replacing grasslands and pastures that once supported bee populations.

RISING DEMAND, FALLING SUPPLY

Over the last decade, honeybee keepers facing colony losses have struggled with rising demand for commercial pollination services, pushing up the cost of managed pollinators - and the importance of wild bees.

A new study of wild bees identifies 139 counties in key agricultural regions of California, the Pacific Northwest, the Midwest, west Texas and the Mississippi River valley that face a worrisome mismatch between falling wild bee supply and rising crop pollination demand.
Credit: PNAS

"Most people can think of one or two types of bee, but there are 4,000 species in the U.S. alone," said Insu Koh, a UVM postdoctoral researcher who co-hosted the AAAS panel and led the study.

"When sufficient habitat exists, wild bees are already contributing the majority of pollination for some crops," Koh adds. "And even around managed pollinators, wild bees complement pollination in ways that can increase crop yields."

MAKING THE MAPS

A team of seven researchers -- from UVM, Franklin and Marshall College, University of California at Davis, and Michigan State University -- created the maps by first identifying 45 land-use types from two federal land databases, including croplands and natural habitats. Then they gathered detailed input from national and state bee experts about the suitability of each land-use type for providing wild bees with nesting and food resources.

The scientists built a bee habitat model that predicts the relative abundance of wild bees for every area of the contiguous United States, based on their quality for nesting and feeding from flowers. Finally, the team checked and validated their model against bee collections and field observations in many actual landscapes.
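The modelling pipeline above can be caricatured in a few lines of Python. The land-use types, scores and equal weighting below are invented for illustration -- the real model uses expert-elicited nesting and floral scores for 45 land-use types:

```python
# Minimal sketch of a land-use-based bee habitat index (hypothetical scores).
suitability = {            # (nesting score, floral/food score), each 0..1
    "grassland": (0.9, 0.8),
    "orchard":   (0.5, 0.7),
    "row_crops": (0.2, 0.3),
}

def habitat_index(landscape):
    """Relative bee abundance proxy: area-weighted mean of nesting x floral."""
    total_area = sum(landscape.values())
    score = sum(
        area * suitability[lu][0] * suitability[lu][1]
        for lu, area in landscape.items()
    )
    return score / total_area

# Converting grassland to row crops drives the predicted index down,
# mirroring the corn-expansion effect described above.
before = habitat_index({"grassland": 60.0, "row_crops": 40.0})
after = habitat_index({"grassland": 20.0, "row_crops": 80.0})
```

A county-level mismatch map then comes from comparing such an index against each county's crop pollination demand.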

THE GOOD NEWS

"The good news about bees," said Ricketts, "is now that we know where to focus conservation efforts, paired with all we know about what bees need, habitat-wise, there is hope for preserving wild bees."



Contacts and sources:
Basil Waugh
University of Vermont

Touchless Gestures Could Control Cellphones and Other Devices, While Ambient Light Charges Them

Cellphones and other devices could soon be controlled with touchless gestures and charge themselves using ambient light, thanks to new LED arrays that can both emit and detect light.

Made of tiny nanorods arrayed in a thin film, the LEDs could enable new interactive functions and multitasking devices. Researchers at the University of Illinois at Urbana-Champaign and Dow Electronic Materials in Marlborough, Massachusetts, report the advance in the Feb. 10 issue of the journal Science.

“These LEDs are the beginning of enabling displays to do something completely different, moving well beyond just displaying information to be much more interactive devices,” said Moonsub Shim, a professor of materials science and engineering at the U. of I. and the leader of the study. “That can become the basis for new and interesting designs for a lot of electronics.”

A laser stylus writes on a small array of multifunction pixels made of dual-function LEDs that can both emit and respond to light.

Photo courtesy of Moonsub Shim

The tiny nanorods, each measuring less than 5 nanometers in diameter, are made of three types of semiconductor material. One type emits and absorbs visible light. The other two semiconductors control how charge flows through the first material. The combination is what allows the LEDs to emit, sense and respond to light.

The nanorod LEDs are able to perform both functions by quickly switching back and forth from emitting to detecting. They switch so fast that, to the human eye, the display appears to stay on continuously – in fact, the switching is three orders of magnitude faster than standard display refresh rates. Yet the LEDs are also near-continuously detecting and absorbing light, and a display made of the LEDs can be programmed to respond to light signals in a number of ways.

For example, a display could automatically adjust brightness in response to ambient light conditions – on a pixel-by-pixel basis.
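A minimal sketch of that pixel-by-pixel logic is below. The scaling rule and its parameters are hypothetical -- the article does not describe the actual control scheme -- but it captures the idea of using each pixel's own sensed ambient level to set its drive:

```python
# Sketch of per-pixel brightness compensation (illustrative logic only).
# Each pixel alternates between emitting and sensing; its sensed ambient
# level scales that pixel's output so contrast stays steady.

def compensate(frame, ambient, base=0.5, gain=0.5):
    """Scale each pixel's drive level by its locally sensed ambient light.

    frame, ambient: 2D lists of values in [0, 1].
    Brighter surroundings -> stronger drive; shadowed regions -> dimmer.
    """
    return [
        [min(1.0, pixel * (base + gain * amb))
         for pixel, amb in zip(frow, arow)]
        for frow, arow in zip(frame, ambient)
    ]

frame = [[0.8, 0.8], [0.8, 0.8]]
ambient = [[1.0, 1.0], [0.0, 0.0]]   # top half in sun, bottom in shadow
out = compensate(frame, ambient)      # sunlit pixels driven harder
```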


Professor Moonsub Shim, postdoctoral researcher Seongyong Cho and collaborators developed dual-function nanorod LEDs that could be the basis for multifunctional device displays.
Photo by L. Brian Stauffer

“You can imagine sitting outside with your tablet, reading. Your tablet will detect the brightness and adjust it for individual pixels,” Shim said. “Where there’s a shadow falling across the screen it will be dimmer, and where it’s in the sun it will be brighter, so you can maintain steady contrast.”

The researchers demonstrated pixels that automatically adjust brightness, as well as pixels that respond to an approaching finger, which could be integrated into interactive displays that respond to touchless gestures or recognize objects.

They also demonstrated arrays that respond to a laser stylus, which could be the basis of smart whiteboards, tablets or other surfaces for writing or drawing with light. And the researchers found that the LEDs not only respond to light, but can convert it to electricity as well.

“The way it responds to light is like a solar cell. So not only can we enhance interaction between users and devices or displays, now we can actually use the displays to harvest light,” Shim said. “So imagine your cellphone just sitting there collecting the ambient light and charging. That’s a possibility without having to integrate separate solar cells. We still have a lot of development to do before a display can be completely self-powered, but we think that we can boost the power-harvesting properties without compromising LED performance, so that a significant amount of the display’s power is coming from the array itself.”

In addition to interacting with users and their environment, nanorod LED displays can interact with each other as large parallel communication arrays. It would be slower than device-to-device technologies like Bluetooth, Shim said, but those technologies are serial – they can only send one bit at a time. Two LED arrays facing each other could communicate with as many bits as there are pixels in the screen.
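Some rough arithmetic makes the point. Assuming a 1920x1080 array refreshed 60 times a second, with one bit per pixel per frame, against a roughly 2 Mbit/s serial link (the order of classic Bluetooth throughput -- both figures are assumptions for illustration, not from the study):

```python
# Parallel optical link: one bit per pixel per optical frame (assumed).
pixels = 1920 * 1080           # assumed display resolution
frame_rate = 60                # assumed optical frames per second
parallel_bits_per_s = pixels * frame_rate

# Serial radio link for comparison (~2 Mbit/s, order of classic Bluetooth).
serial_bits_per_s = 2_000_000

speedup = parallel_bits_per_s / serial_bits_per_s
```

Even at a modest per-pixel signalling rate, the aggregate bandwidth of a full screen of simultaneous channels dwarfs a single serial stream.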

“We primarily interface with our electronic devices through their displays, and a display’s appeal resides in the user’s experience of viewing and manipulating information,” said study coauthor Peter Trefonas, a corporate fellow in Electronic Materials at The Dow Chemical Company. “The bidirectional capability of these new LED materials could enable devices to respond intelligently to external stimuli in new ways. The potential for touchless gesture control alone is intriguing, and we’re only scratching the surface of what could be possible.”

The researchers did all their demonstrations with arrays of red LEDs. They are now working on methods to pattern three-color displays with red, blue and green pixels, as well as working on ways to boost the light-harvesting capabilities by adjusting the composition of the nanorods.

This work was supported by a collaborative research effort between the Dow Chemical Company and the University of Illinois, with the aim of advancing technologies important to industry. The National Science Foundation also supported this work.



Contacts and sources:
Liz Ahlberg Touchstone
University of Illinois at Urbana-Champaign

Engine Produces Hydrogen from Methane and Captures CO2



When is an internal combustion engine not an internal combustion engine? When it’s been transformed into a modular reforming reactor that could make hydrogen available to power fuel cells wherever there’s a natural gas supply available.

By adding a catalyst, a hydrogen separating membrane and carbon dioxide sorbent to the century-old four-stroke engine cycle, researchers have demonstrated a laboratory-scale hydrogen reforming system that produces the green fuel at relatively low temperature in a process that can be scaled up or down to meet specific needs.

The process could provide hydrogen at the point of use for residential fuel cells or neighborhood power plants, electricity and power production in natural-gas powered vehicles, fueling of municipal buses or other hydrogen-based vehicles, and supplementing intermittent renewable energy sources such as photovoltaics.

Georgia Tech researchers have demonstrated a CHAMP reactor, which uses the four-stroke engine cycle to create hydrogen while simultaneously capturing carbon dioxide emission. 
Credit: Candler Hobbs, Georgia Tech

Known as the CO2/H2 Active Membrane Piston (CHAMP) reactor, the device operates at temperatures much lower than conventional steam reforming processes, consumes substantially less water and could also operate on other fuels such as methanol or bio-derived feedstock. It also captures and concentrates carbon dioxide emissions, a by-product that now lacks a secondary use – though that could change in the future.

Unlike conventional engines that run at thousands of revolutions per minute, the reactor operates at only a few cycles per minute – or more slowly – depending on the reactor scale and required rate of hydrogen production. And there are no spark plugs because there’s no fuel combusted.

“We already have a nationwide natural gas distribution infrastructure, so it’s much better to produce hydrogen at the point of use rather than trying to distribute it,” said Andrei Fedorov, a Georgia Institute of Technology professor who’s been working on CHAMP since 2008. “Our technology could produce this fuel of choice wherever natural gas is available, which could resolve one of the major challenges with the hydrogen economy.”

A paper published February 9 in the journal Industrial & Engineering Chemistry Research describes the operating model of the CHAMP process, including a critical step of internally adsorbing carbon dioxide, a byproduct of the methane reforming process, so it can be concentrated and expelled from the reactor for capture, storage or utilization.

Other implementations of the system have been reported as thesis work by three Georgia Tech Ph.D. graduates since the project began in 2008. The research was supported by the National Science Foundation, the Department of Defense through NDSEG fellowships, and the U.S. Civilian Research & Development Foundation (CRDF Global).

Key to the reaction process is the variable volume provided by a piston rising and falling in a cylinder. As with a conventional engine, a valve controls the flow of gases into and out of the reactor as the piston moves up and down. The four-stroke system works like this:

1. Natural gas (methane) and steam are drawn into the reaction cylinder through a valve as the piston inside is lowered. The valve closes once the piston reaches the bottom of the cylinder.
2. The piston rises into the cylinder, compressing the steam and methane as the reactor is heated. Once it reaches approximately 400 degrees Celsius, catalytic reactions take place inside the reactor, forming hydrogen and carbon dioxide. The hydrogen exits through a selective membrane, and the pressurized carbon dioxide is adsorbed by the sorbent material, which is mixed with the catalyst.
3. Once the hydrogen has exited the reactor and the carbon dioxide is tied up in the sorbent, the piston is lowered, increasing the volume and reducing the pressure in the cylinder. The carbon dioxide is released from the sorbent into the cylinder.
4. The piston is again moved up into the chamber and the valve opens, expelling the concentrated carbon dioxide and clearing the reactor for the start of a new cycle.
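The four strokes above can be summarized as a simple state sequence. The sketch below is purely illustrative; the names and annotations are ours, not from the Georgia Tech reactor model:

```python
# A minimal sketch of the four-stroke CHAMP sequence described above.
# Stroke names and descriptions are illustrative, not from the paper.

STROKES = [
    ("intake",      "piston down, valve open: draw in methane + steam"),
    ("compression", "piston up, valve closed: heat to ~400 C; catalyst forms "
                    "H2 (exits via membrane) and CO2 (trapped by sorbent)"),
    ("expansion",   "piston down, valve closed: pressure drops, sorbent "
                    "releases the concentrated CO2"),
    ("exhaust",     "piston up, valve open: expel CO2, ready for next cycle"),
]

def run_cycle():
    """Return the ordered stroke names for one full CHAMP cycle."""
    return [name for name, _ in STROKES]

print(run_cycle())  # ['intake', 'compression', 'expansion', 'exhaust']
```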

“All of the pieces of the puzzle have come together,” said Fedorov, a professor in Georgia Tech’s George W. Woodruff School of Mechanical Engineering. “The challenges ahead are primarily economic in nature. Our next step would be to build a pilot-scale CHAMP reactor.”

The project was begun to address some of the challenges to the use of hydrogen in fuel cells. Most hydrogen used today is produced in a high-temperature reforming process in which methane is combined with steam at about 900 degrees Celsius. The industrial-scale process requires as many as three water molecules for every molecule of methane, and the resulting low-density gas must be transported to where it will be used.

Fedorov’s lab first carried out thermodynamic calculations suggesting that the four-stroke process could be modified to produce hydrogen in relatively small amounts where it would be used. The goals of the research were to create a modular reforming process that could operate at between 400 and 500 degrees Celsius, use just two molecules of water for every molecule of methane to produce four hydrogen molecules, be able to scale down to meet the specific needs, and capture the resulting carbon dioxide for potential utilization or sequestration.
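Those target numbers correspond to the ideal overall reaction CH4 + 2H2O → CO2 + 4H2. A quick back-of-envelope comparison of water economy follows; the 3:1 steam-to-methane ratio for conventional reforming is a typical industry figure assumed here for illustration:

```python
# Ideal sorption-enhanced steam reforming: CH4 + 2 H2O -> CO2 + 4 H2
h2o_per_ch4_champ = 2
h2_per_ch4 = 4
water_per_h2_champ = h2o_per_ch4_champ / h2_per_ch4            # 0.5

# Conventional reforming runs with excess steam, roughly 3 H2O per CH4
# (assumed typical steam-to-carbon ratio, not a figure from the paper).
h2o_per_ch4_conventional = 3
water_per_h2_conventional = h2o_per_ch4_conventional / h2_per_ch4  # 0.75

print(water_per_h2_champ, water_per_h2_conventional)  # 0.5 0.75
```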

“We wanted to completely rethink how we designed reactor systems,” said Fedorov. “To gain the kind of efficiency we needed, we realized we’d need to dynamically change the volume of the reactor vessel. We looked at existing mechanical systems that could do this, and realized that this capability could be found in a system that has had more than a century of improvements: the internal combustion engine.”

The CHAMP system could be scaled up or down to produce the hundreds of kilograms of hydrogen per day required for a typical automotive refueling station – or a few kilograms for an individual vehicle or residential fuel cell, Fedorov said. The volume and piston speed in the CHAMP reactor can be adjusted to meet hydrogen demands while matching the requirements for the carbon dioxide sorbent regeneration and separation efficiency of the hydrogen membrane. In practical use, multiple reactors would likely be operated together to produce a continuous stream of hydrogen at a desired production level.
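To get a feel for that scaling, ideal-gas arithmetic ties cylinder volume and cycle rate to hydrogen output. Every number below is a hypothetical illustration, not a published CHAMP specification:

```python
# Back-of-envelope H2 throughput for one piston reactor (illustrative numbers).
R = 8.314          # J/(mol K), gas constant
P = 101_325.0      # Pa, intake pressure (assumed ~1 atm)
T = 298.0          # K, intake temperature (assumed room temperature)
V = 1.0e-3         # m^3, swept cylinder volume (assumed 1 liter)

n_total = P * V / (R * T)        # mol of gas drawn in per intake stroke
n_ch4 = n_total / 3.0            # 2:1 steam:methane feed -> 1/3 is methane
n_h2 = 4.0 * n_ch4               # CH4 + 2 H2O -> CO2 + 4 H2 (ideal yield)

cycles_per_day = 3 * 60 * 24     # a few cycles per minute, as in the article
kg_h2_per_day = n_h2 * 2.016e-3 * cycles_per_day   # molar mass of H2 = 2.016 g/mol

print(f"{kg_h2_per_day:.2f} kg H2/day")  # ~0.5 kg/day for this one module
```

At these assumed numbers a single one-liter module yields roughly half a kilogram of hydrogen per day, which is consistent with the article's picture of ganging multiple modules together for vehicle- or station-scale demand.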

“We took the conventional chemical processing plant and created an analog using the magnificent machinery of the internal combustion engine,” Fedorov said. “The reactor is scalable and modular, so you could have one module or a hundred modules depending on how much hydrogen you needed. The processes for reforming fuel, purifying hydrogen and capturing carbon dioxide emissions are all combined into one compact system.”

This publication is based on work supported by the National Science Foundation (NSF) CBET award 0928716, which was funded under the American Recovery and Reinvestment Act of 2009 (Public Law 111-5), and by award 61220 of the U.S. Civilian Research & Development Foundation (CRDF Global) and by the National Science Foundation under Cooperative Agreement OISE-9531011. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of NSF or CRDF Global. Graduate work of David M. Anderson, the first author on the paper, was conducted with government support under an award by the DoD, Air Force Office of Scientific Research, National Defense Science and Engineering Graduate (NDSEG) Fellowship, 32 CFR 168a.



Contacts and sources:
John Toon
Georgia Institute of Technology

Citation: David M. Anderson, Thomas M. Yun, Peter A. Kottke and Andrei G. Fedorov, “Comprehensive Analysis of Sorption Enhanced Steam Methane Reforming in a Variable Volume Membrane Reactor,” (Industrial & Engineering Chemistry Research, 2017). http://dx.doi.org/10.1021/acs.iecr.6b04392


How To Build a Bio-Bot


Creating tiny muscle-powered robots that can walk or swim by themselves -- or better yet, when prompted -- is more complicated than it looks.

Rashid Bashir, the head of the bioengineering department at the University of Illinois, and Taher Saif, a professor of mechanical science and engineering at Illinois, will speak in Boston on the design and development of walking and swimming bio-bots at the annual meeting of the American Association for the Advancement of Science.

Tiny walking "bio-bots" are powered by muscle cells and controlled by an electric field.

Graphic by Janet Sinn-Hanlon, Design Group@VetMed


The symposium "Integrated Cellular Systems: Building Machines with Cells" was held Feb. 18 at the Hynes Convention Center.

Through the National Science Foundation-funded Emergent Behavior of Integrated Cellular Systems center, Bashir, Saif and colleagues have developed small, soft biological robots, dubbed "bio-bots," that can walk and swim on their own or when triggered by electrical or light signals. The researchers make a soft 3-D printed scaffold measuring a centimeter or two in length, seed it with muscle cells, and the cells self-organize to form functional tissues that make the bio-bots move.

"These machines are now viewed as partially living, with the ability to form, the ability to age and the ability to heal if there's an injury," Saif said. "Now that we've got them working, we are beginning to look back and try to understand how the cells organize themselves and what language they use to communicate. This is the developmental biology of living machines."

Miniature "bio-bots" developed at the University of Illinois are made of hydrogel and heart cells, but can walk on their own.

Photo by Elise A. Corbin

In the talk "How to Engineer a Living System," Bashir will describe the methods that the group has used to build the bio-bots and to direct their behavior.

"As engineers, we usually build with materials like wood, steel or silicon. Our focus here is to forward-engineer biological or cell-based systems," Bashir said. "The design is inspired by the muscle-tendon-bone complex found in nature. There's a skeleton or backbone, but made out of soft polymers similar to the ones used in contact lenses, so it can bend instead of needing joints like the body does."

Credit: University of Illinois

Bashir's group developed multiple designs to make bio-bots walk in certain directions and to control their motion with light or electrical currents.

In the talk "Engineered Living Micro Swimmers," Saif will describe bio-bots that swim and the physical and biological interactions that cause the cells to come into alignment. They form a single muscle unit that contracts to beat a tail, propelling the bio-bot through liquid.

"They align themselves in a direction where the tail of the swimmer can be bent most, which is exactly what we wanted, although we did not pattern or direct them to do it," Saif said. "Why do they behave this way? If each cell beat at its own time, we wouldn't have the swimmer. What made them synchronize into a single entity?"

Bashir and Saif will share insights learned from these questions and more.

"The objective is not to make a walker and a swimmer, but to lay the scientific foundation so we have principles for building biological machines in the future," Saif said.



Contacts and sources:
Liz Ahlberg Touchstone
University of Illinois

Magnet Triggers Drug Release from Implant

University of British Columbia researchers have developed a magnetic drug implant—the first of its kind in Canada—that could offer an alternative for patients struggling with numerous pills or intravenous injections.

The device, a silicone sponge with magnetic carbonyl iron particles wrapped in a round polymer layer, measures just six millimetres in diameter. The drug is injected into the device and then surgically implanted in the area being treated. Passing a magnet over the patient’s skin activates the device by deforming the sponge and triggering the release of the drug into surrounding tissue through a tiny opening.

Size of the magnetic implant compared to the Canadian one-dollar coin. 
Credit: UBC

“Drug implants can be safe and effective for treating many conditions, and magnetically controlled implants are particularly interesting because you can adjust the dose after implantation by using different magnet strengths. Many other implants lack that feature,” said study author Ali Shademani, a PhD student in the biomedical engineering program at UBC.

Actively controlling drug delivery is particularly relevant for conditions like diabetes, where the required dose and timing of insulin varies from patient to patient, said co-author John K. Jackson, a research scientist in UBC’s faculty of pharmaceutical sciences.



“This device lets you release the actual dose that the patient needs when they need it, and it’s sufficiently easy to use that patients could administer their own medication one day without having to go to a hospital,” said Jackson.

The researchers tested their device on animal tissue in the lab using the prostate cancer drug docetaxel. They found that it was able to deliver the drug on demand even after repeated use. The drug also produced an effect on cancer cells comparable to that of freshly administered docetaxel, proving that drugs stored in the device stay effective.

Mu Chiao, Shademani’s supervisor and a professor of mechanical engineering at UBC, said the team is working on refining the device and narrowing down the conditions for its use.

“This could one day be used for administering painkillers, hormones, chemotherapy drugs and other treatments for a wide range of health conditions. In the next few years we hope to be able to test it for long-term use and for viability in living models,” said Chiao.



Contacts and sources:
Lou Corpuz-Bosshart
University of British Columbia

Citation: “Active regulation of on-demand drug delivery by magnetically triggerable microspouters” was recently published online in the journal Advanced Functional Materials.

Sunday, February 19, 2017

Yeast in Babies' Guts Increases Risk of Asthma


University of British Columbia microbiologists have found a yeast in the gut of new babies in Ecuador that appears to be a strong predictor that they will develop asthma in childhood. The new research furthers our understanding of the role microscopic organisms play in our overall health.

"Children with this type of yeast called Pichia were much more at risk of asthma," said Brett Finlay, a microbiologist at UBC. "This is the first time anyone has shown any kind of association between yeast and asthma."

In previous research, Finlay and his colleagues identified four gut bacteria in Canadian children that, if present in the first 100 days of life, seem to prevent asthma. In a follow-up to this study, Finlay and his colleagues repeated the experiment using fecal samples and health information from 100 children in a rural village in Ecuador.

Canada and Ecuador both have high rates of asthma with about 10 per cent of the population suffering from the disease.

Yeast linked to asthma in Ecuador

Credit: ubcpublicaffairs

They found that while gut bacteria play a role in preventing asthma in Ecuador, it was the presence of a microscopic fungus or yeast known as Pichia that was more strongly linked to asthma. Instead of helping to prevent asthma, however, the presence of Pichia in those early days puts children at risk.

Finlay also suggests there could be a link between the risk of asthma and the cleanliness of the environment for Ecuadorian children. As part of the study, the researchers noted whether children had access to clean water.

"Those that had access to good, clean water had much higher asthma rates and we think it is because they were deprived of the beneficial microbes," said Finlay. "That was a surprise because we tend to think that clean is good but we realize that we actually need some dirt in the world to help protect you."

Now Finlay's colleagues will re-examine the Canadian samples and look for the presence of yeast in the gut of infants. This technology was not available to the researchers when they conducted their initial study.



Contacts and sources:
Heather Amos
University of British Columbia

New Levitation Method Uses Heat and Cold to Lift a Variety of Materials

Although scientists have been able to levitate specific types of material, a pair of UChicago undergraduate physics students helped take the science to a new level.

Third-year Frankie Fung and fourth-year Mykhaylo Usatyuk led a team of UChicago researchers who demonstrated how to levitate a variety of objects—ceramic and polyethylene spheres, glass bubbles, ice particles, lint strands and thistle seeds—between a warm plate and a cold plate in a vacuum chamber.

“They made lots of intriguing observations that blew my mind,” said Cheng Chin, professor of physics, whose ultracold lab in the Gordon Center for Integrative Science was home to the experiments.

Researchers achieved levitation of lint among other particles 
Courtesy of Chin Lab


In their work, researchers achieved a number of levitation breakthroughs, in terms of duration, orientation and method: The levitation lasted for more than an hour, as opposed to a few minutes; stability was achieved radially and vertically, as opposed to just vertically; and it used a temperature gradient rather than light or a magnetic field. Their findings appeared Jan. 20 in Applied Physics Letters.

“Magnetic levitation only works on magnetic particles, and optical levitation only works on objects that can be polarized by light, but with our first-of-its-kind method we can levitate generic objects,” said Chin.

In the experiment, the bottom copper plate was kept at room temperature while a stainless steel cylinder filled with liquid nitrogen kept at negative 300 degrees Fahrenheit served as the top plate. The upward flow of heat from the warm to the cold plate kept the particles suspended indefinitely.

UChicago researchers achieved levitation of macroscopic objects between warm and cold plates in a vacuum chamber.

Photo by Jean Lachat

“The large temperature gradient leads to a force that balances gravity and results in stable levitation,” said Fung, the study’s lead author. “We managed to quantify the thermophoretic force and found reasonable agreement with what is predicted by theory. This will allow us to explore the possibilities of levitating different types of objects.” (Thermophoresis refers to the movement of particles by means of a temperature gradient.)
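Stripped to its essentials, the stable-levitation condition Fung describes is a force balance. The expression below is the generic condition only; the detailed form of the thermophoretic force depends on gas pressure and particle size, and is not quoted from the paper:

```latex
% Levitation: the thermophoretic force balances the particle's weight
F_{\mathrm{th}}(p,\, R,\, \nabla T) \;=\; m g \;=\; \tfrac{4}{3}\pi R^{3} \rho\, g
```

Because the left-hand side grows with the temperature gradient, a steeper gradient lifts heavier particles, while misalignment of that gradient destabilizes the balance.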

“Our increased understanding of the thermophoretic force will help us investigate the interactions and binding affinities between the particles we observed,” said Usatyuk, a study co-author. “We are excited about the future research directions we can follow with our system.”

The key to obtaining high levitation stability is the geometrical design of the two plates. A proper ratio of their sizes and vertical spacing allows the warm air to flow around and efficiently capture the levitated objects when they drift away from the center. Another sensitivity factor is that the thermal gradient needs to be pointing upward—even a misalignment of one degree will greatly reduce the levitation stability.

“Only within a narrow range of pressure, temperature gradient and plate geometric factors can we reach stable and long levitation,” Chin said. “Different particles also require fine adjustment of the parameters.”



The apparatus offers a new ground-based platform to investigate the dynamics of astrophysical, chemical and biological systems in a microgravity environment, according to the researchers.

Fourth-year Mykhaylo Usatyuk (left) and third-year Frankie Fung. 
Photo by Jean Lachat

Levitation of macroscopic particles in a vacuum is of particular interest due to its wide applications in space, atmospheric and astro-chemical research. And thermophoresis has been utilized in aerosol thermal precipitators, nuclear reactor safety and the manufacturing of optical fibers through vacuum deposition processes, which apply progressive layers of atoms or molecules during fabrication.

The new method is significant because it offers a new approach to manipulating small objects without contacting or contaminating them, said Thomas Witten, the Homer J. Livingston Professor Emeritus of Physics. “It offers new avenues for mass assembly of tiny parts for micro-electro-mechanical systems, for example, and to measure small forces within such systems.

“Also, it forces us to re-examine how ‘driven gases,’ such as gases driven by heat flow, can differ from ordinary gases,” he added. “Driven gases hold promise to create new forms of interaction between suspended particles.”

Levitation of materials in ground-based experiments provides an ideal platform for the study of particle dynamics and interactions in a pristine isolated environment, the paper concluded. Chin’s lab is now looking at how to levitate macroscopic substances greater than a centimeter in size, as well as how these objects interact or aggregate in a weightless environment. “There are ample research opportunities to which our talented undergraduate students can contribute,” Chin said.



Contacts and sources: 

University of Chicago

Citation: “Stable thermophoretic trapping of generic particles at low pressures,” by Frankie Fung, Mykhaylo Usatyuk, B. J. DeSalvo and Cheng Chin in Applied Physics Letters, Jan. 20, 2017. DOI 10.1063/1.4974489

Funding: National Science Foundation, Grainger Foundation and Enrico Fermi Institute.


Why Do Meteors Make Spooky Sounds?

When a meteor is about to conk your neighborhood and gives fair warning by emitting sizzling, rustling and hissing sounds as it descends, you might think that the universe is being sporting.

Sandia National Laboratories researcher Richard Spalding, recently deceased, examines the sky through which meteors travel.

But these auditory warnings, which do occur, seem contrary to the laws of physics if they are caused by the friction of the fast-moving meteor or asteroid plunging into Earth’s atmosphere. Because sound travels far slower than light, the sounds should arrive several minutes after the meteor hits, rather than accompany or even precede it.
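The mismatch is easy to quantify: the flash arrives essentially instantly, so the expected acoustic lag is just the distance to the fireball divided by the speed of sound. The 50 km range below is an assumed figure for illustration:

```python
# Why meteor sounds "should" lag the flash: sound is slow, light is not.
SPEED_OF_SOUND = 343.0      # m/s near the ground (approximate)

def acoustic_delay_s(distance_m):
    """Seconds between seeing a fireball and hearing its sound."""
    return distance_m / SPEED_OF_SOUND

delay = acoustic_delay_s(50_000.0)   # fireball ~50 km away (assumed)
print(f"{delay / 60:.1f} minutes")   # ~2.4 minutes
```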

So maybe atmospheric shock waves from the meteors are not the cause of the spooky noises.
Photo by Randy Montoya 

Another theory is that the sounds are created by radio frequency emissions. That seems unlikely without designated receivers.

But what if the sounds are caused by the brilliant, pulsating light emitted by the asteroid as it burns up in Earth’s atmosphere?

In an article published Feb. 1 in the journal Scientific Reports, the late Sandia National Laboratories researcher Richard Spalding reasoned that such intense light could suddenly heat the surface of objects many miles away, which in turn heats the surrounding air. This could create sounds near the observer. Colleagues John Tencer, William Sweatt, Ben Conley, Roy Hogan, Mark Boslough and Gigi Gonzales, along with Pavel Spurny from the Astronomical Institute of the Czech Republic, experimentally demonstrated and analyzed that effect.

They found that objects with low conductivity, such as leaves, grass, dark paint and even hair, could rapidly warm and transmit heat into nearby air and generate pressure waves by subtle oscillations that create a variety of sounds. The process is called photoacoustic coupling.

Sounds concurrent with a meteor’s arrival “must be associated with some form of electromagnetic energy generated by the meteor, propagated to the vicinity of the observer and transduced into acoustic waves,” according to the article. “A succession of light-pulse-produced pressure waves can then manifest as sound to a nearby observer.”

This bolide appeared over the Flinders Ranges in the South Australian desert on the evening of 24 April 2011.
Credit: Wikimedia Commons

The experimenters exposed several materials, including dark cloths and a wig, to intense pulsing light akin to that produced by a fireball. The process produced faint sounds similar to rustling leaves or faint whispers. Computer models bear out the results.

A less extreme version of the photoacoustic effect had been observed in 1880 by Alexander Graham Bell when, testing the possibilities of light for long-distance phone transmissions, he intermittently interrupted sunlight shining on a variety of materials and noted the sounds produced.

Sandia National Laboratories is a multimission laboratory operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corp., for the U.S. Department of Energy’s National Nuclear Security Administration. With main facilities in Albuquerque, N.M., and Livermore, Calif., Sandia has major R&D responsibilities in national security, energy and environmental technologies and economic competitiveness.


Contacts and sources:
Neal Singer,
Sandia National Laboratories

How an Ice Age Paradox Could Inform Sea Level Rise Predictions

New findings from the University of Michigan explain an Ice Age paradox and add to the mounting evidence that climate change could bring higher seas than most models predict.

The study, published in Nature, shows how small spikes in the temperature of the ocean, rather than the air, likely drove the rapid disintegration cycles of the expansive ice sheet that once covered much of North America.

 The behavior of this ancient ice sheet—called Laurentide—has puzzled scientists for decades because its periods of melting and splintering into the sea occurred at the coldest times in the last Ice Age. Ice should melt when the weather is warm, but that's not what happened.

Credit: University of Michigan

"We've shown that we don't really need atmospheric warming to trigger large-scale disintegration events if the ocean warms up and starts tickling the edges of the ice sheets," said Jeremy Bassis, U-M associate professor of climate and space sciences and engineering. "It is possible that modern-day glaciers, not just the parts that are floating but the parts that are just touching the ocean, are more sensitive to ocean warming than we previously thought."

This mechanism is likely at work today on the Greenland ice sheet and possibly Antarctica. Scientists know this in part due to Bassis' previous work. Several years ago, he came up with a new, more accurate way to mathematically describe how ice breaks and flows. His model has led to a deeper understanding of how the Earth's store of ice could react to changes in air or ocean temperatures, and how that might translate to sea level rise.

Last year, other researchers used it to predict that melting Antarctic ice could raise sea levels by more than three feet, as opposed to the previous estimate that Antarctica would only contribute centimeters by 2100.

In the new study, Bassis and his colleagues applied a version of this model to the climate of the last Ice Age, which ended about 10,000 years ago. They used ice core and ocean-floor sediment records to estimate water temperature and how it varied. Their aim was to see if what's happening in Greenland today could describe the behavior of the Laurentide Ice Sheet.

Scientists refer to these bygone periods of rapid ice disintegration as Heinrich events: Icebergs broke off the edges of Northern Hemisphere ice sheets and flowed into the ocean, raising sea level by more than 6 feet over the course of hundreds of years. As the icebergs drifted and melted, dirt they carried settled onto the ocean floor, forming thick layers that can be seen in sediment cores across the North Atlantic basin. These unusual sediment layers are what allowed researchers to first identify Heinrich events.


Credit: University of Michigan

"Decades of work looking at ocean sediment records has shown that these ice sheet collapse events happened periodically during the last Ice Age, but it has taken a lot longer to come up with a mechanism that can explain why the Laurentide ice sheet collapsed during the coldest periods only. This study has done that," said geochemist and co-author Sierra Petersen, U-M research fellow in earth and environmental sciences.

Bassis and his colleagues set out to understand the timing and size of the Heinrich events. Through their simulations, they were able to predict both, and also to explain why some ocean warming events triggered Heinrich events and some did not. They even identified an additional Heinrich event that had previously been missed.

Heinrich events were followed by brief periods of rapid warming. The Northern Hemisphere warmed repeatedly by as much as 15 degrees Fahrenheit in just a few decades. The area would stabilize, but then the ice would slowly grow to its breaking point over the next thousand years. Their model was able to simulate these events as well.

Bassis' model takes into account how the Earth's surface reacts to the weight of the ice on top of it. Heavy ice depresses the planet's surface, at times pushing it below sea level. That's when the ice sheets are most vulnerable to warmer seas. But as a glacier retreats, the solid Earth rebounds out of the water again, stabilizing the system. From that point the ice sheet can begin to expand again.

"There is currently large uncertainty about how much sea level will rise and much of this uncertainty is related to whether models incorporate the fact that ice sheets break," Bassis said. "What we are showing is that the models we have of this process seem to work for Greenland, as well as in the past, so we should be able to more confidently predict sea level rise."

He added that portions of Antarctica have similar geography to Laurentide: the Pine Island and Thwaites glaciers, for example.

"We're seeing ocean warming in those regions and we're seeing these regions start to change. In that area, they're seeing ocean temperature changes of about 2.7 degrees Fahrenheit," Bassis said. "That's a magnitude pretty similar to what we believe occurred in the Laurentide events, and what we saw in our simulations is that just a small amount of ocean warming can destabilize a region if it's in the right configuration, and even in the absence of atmospheric warming."

The study is called "Heinrich events triggered by ocean forcing and modulated by isostatic adjustment." The research is supported by the National Science Foundation and the National Oceanic and Atmospheric Administration.



Contacts and sources:
Nicole Casal Moore
University of Michigan 

New Mechanical Metamaterials Block Motion One Way and Push the Other Way

Engineers and scientists at The University of Texas at Austin and the AMOLF institute in the Netherlands have invented the first mechanical metamaterials that transfer motion effortlessly in one direction while blocking it in the other, as described in a paper published Feb. 13 in Nature. The material can be thought of as a mechanical one-way shield that blocks energy from coming in but easily transmits it going out the other side.

The researchers developed the first nonreciprocal mechanical materials using metamaterials, which are synthetic materials with properties that cannot be found in nature.

Breaking the symmetry of motion may enable greater control over mechanical systems and improved efficiency. These nonreciprocal metamaterials can potentially be used to realize new types of mechanical devices: for example, actuators (components of a machine that are responsible for moving or controlling a mechanism) and other devices that could improve energy absorption, conversion and harvesting, soft robotics and prosthetics.

Credit: The University of Texas at Austin

The researchers’ breakthrough lies in the ability to overcome reciprocity, a fundamental principle governing many physical systems, which ensures that we get the same response when we push an arbitrary structure from opposite directions. This principle governs how signals of various forms travel in space and explains why, if we can send a radio or an acoustic signal, we can also receive it. In mechanics, reciprocity implies that motion through an object is transmitted symmetrically: If by pushing on side A we move side B by a certain amount, we can expect the same motion at side A when pushing B.
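Reciprocity has a compact linear-algebra statement: for any linear elastic structure the stiffness matrix K is symmetric, so its inverse (the compliance) is too, and a unit force at A moves B exactly as much as a unit force at B moves A. A tiny two-mass spring sketch illustrates this; the spring constants are arbitrary illustrative values:

```python
# Two masses, each tied to a wall by a spring (k_wall) and to each
# other by a coupling spring (k_mid). Stiffness matrix K is symmetric.
k_wall, k_mid = 2.0, 1.0
K = [[k_wall + k_mid, -k_mid],
     [-k_mid, k_wall + k_mid]]

def solve2(K, f):
    """Solve K @ u = f for a 2x2 system by explicit matrix inversion."""
    (a, b), (c, d) = K
    det = a * d - b * c
    return [( d * f[0] - b * f[1]) / det,
            (-c * f[0] + a * f[1]) / det]

# Push side A (unit force on mass 0), read the motion at side B (mass 1)...
u_from_A = solve2(K, [1.0, 0.0])
# ...then push side B and read the motion at side A.
u_from_B = solve2(K, [0.0, 1.0])

# Reciprocity: the two cross responses are identical,
# because a symmetric K has a symmetric inverse.
print(u_from_A[1], u_from_B[0])  # 0.125 0.125
```

The metamaterials in the article are engineered to violate exactly this symmetry, which requires both structural asymmetry and a nonlinear (non-proportional) response, as the researchers note below.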

“The mechanical metamaterials we created provide new elements in the palette that material scientists can use in order to design mechanical structures,” said Andrea Alù, a professor in the Cockrell School of Engineering and co-author of the paper. “This can be of extreme interest for applications in which it is desirable to break the natural symmetry with which the displacement of molecules travels in the microstructure of a material.”

During the past couple of years, Alù, along with Cockrell School research scientist Dimitrios Sounas and other members of their research team, have made exciting breakthroughs in the area of nonreciprocal devices for electromagnetics and acoustics, including the realization of first-of-their-kind nonreciprocal devices for sound, radio waves and light. While visiting the AMOLF institute in the Netherlands, they started a fruitful collaboration with Corentin Coulais, an AMOLF researcher, who recently has been developing mechanical metamaterials. Their close interaction led to this breakthrough.

The researchers first created a rubber-made, centimeter-scale metamaterial with a specifically tailored fishbone skeleton design. They tailored its design to meet the main conditions to break reciprocity, namely asymmetry and a response that is not linearly proportional to the exerted force.

“This structure provided us inspiration for the design of a second metamaterial, with unusually strong nonreciprocal properties,” Coulais said. “By substituting the simple geometrical elements of the fishbone metamaterial with a more intricate architecture made of connected squares and diamonds, we found that we can break very strongly the conditions for reciprocity, and we can achieve a very large nonreciprocal response.”

The material’s structure is a lattice of squares and diamonds that is completely homogeneous throughout the sample, like an ordinary material. However, each unit of the lattice is slightly tilted in a certain way, and this subtle difference dramatically controls the way the metamaterial responds to external stimuli.

“The metamaterial as a whole reacts asymmetrically, with one very rigid side and one very soft side,” Sounas said. “The relation between the unit asymmetry and the soft side location can be predicted by a very generic mathematical framework called topology. Here, when the architectural units lean left, the right side of the metamaterial will be very soft, and vice-versa.”

When the researchers apply a force on the soft side of the metamaterial, it easily induces rotations of the squares and diamonds within the structure, but only in the near vicinity of the pressure point, and the effect on the other side is small. Conversely, when they apply the same force on the rigid side, the motion propagates and is amplified throughout the material, with a large effect at the other side. As a result, pushing from the left or from the right results in very different responses, yielding a large nonreciprocity even for small applied forces.

The team is looking forward to leveraging these topological mechanical metamaterials for various applications, optimizing them, and carving devices out of them for applications in soft robotics, prosthetics and energy harvesting.

This research received funding from the Air Force Office of Scientific Research, the Office of Naval Research, the National Science Foundation, the Simons Foundation and the Netherlands Organization for Scientific Research.



Contacts and sources:
Andrea Alù, Professor of Electrical & Computer Engineering
The University of Texas at Austin

Examining the DNA of Exploding Stars

Imagine being able to view microscopic aspects of a classical nova, a massive stellar explosion on the surface of a white dwarf star (about as big as Earth), in a laboratory rather than from afar via a telescope.

Cosmic detonations of this scale and larger created many of the atoms in our bodies, says Michigan State University's Christopher Wrede, who presented at the American Association for the Advancement of Science meeting. A safe way to study these events in laboratories on Earth is to investigate the exotic nuclei or "rare isotopes" that influence them.

"Astronomers observe exploding stars and astrophysicists model them on supercomputers," said Wrede, assistant professor of physics at MSU's National Superconducting Cyclotron Laboratory. "At NSCL and, in the future, at the Facility for Rare Isotope Beams, we're able to measure the nuclear properties that drive stellar explosions and synthesize the chemical elements - essential input for the models. Rare isotopes are like the DNA of exploding stars."

Nova of the star V838 Mon in the constellation Monoceros
Credit: NASA/ESA Hubble Space Telescope

Wrede's presentation explained how rare isotopes are produced and studied at MSU's NSCL, and how they shed light on the evolution of visible matter in the universe.

"Rare isotopes will help us to understand how stars processed some of the hydrogen and helium gas from the Big Bang into elements that make up solid planets and life," Wrede said. "Experiments at rare isotope beam facilities are beginning to provide the detailed nuclear physics information needed to understand our origins."

In a recent experiment, Wrede's team investigated stellar production of the radioactive isotope aluminum-26 present in the Milky Way. An injection of aluminum-26 into the nebula that formed the solar system could have influenced the amount of water on Earth.

MSU's Chris Wrede explains what it's like to view microscopic aspects of a classical nova, a massive stellar explosion on the surface of a white dwarf star (about as big as Earth), in a laboratory rather than from afar via a telescope.
Credit: MSU

Using a rare isotope beam created at NSCL, the team determined the last unknown nuclear-reaction rate affecting the production of aluminum-26 in classical novae.

They concluded that up to 30 percent of the galaxy's aluminum-26 could be produced in novae, and that the rest must come from other sources such as supernovae.

Future research can now focus on counting the number of novae in the galaxy per year, modeling the hydrodynamics of novae and investigating the other sources in complete nuclear detail.

To extend their reach to more extreme astrophysical events, nuclear scientists are continuing to improve their technology and techniques. Traditionally, stable ion beams have been used to measure nuclear reactions. For example, bombarding a piece of aluminum foil with a beam of protons can produce silicon atoms. However, exploding stars make radioactive isotopes of aluminum that would decay into other elements too quickly to make a foil target out of them.

"With FRIB, we will reverse the process; we'll create a beam of radioactive aluminum ions and use it to bombard a target of protons," Wrede said. "Once FRIB comes online, we will be able to measure many more of the nuclear reactions that affect exploding stars."



Contacts and sources:
Layne Cameron
Michigan State University

Drug SkQ1 Slows Aging, Works on Mice, May Hit Market in 2 to 3 Years Says Russian Scientist


A group of Russian and Swedish scientists has just published a breakthrough paper reporting the results of a joint study by Lomonosov Moscow State University and Stockholm University. The article appeared in the US journal Aging.

The major goal of the study was to investigate the role of the cell's intracellular power stations -- the mitochondria -- in the ageing of an organism. Importantly, the scientists attempted to slow ageing down using a novel compound: the artificial antioxidant SkQ1, precisely targeted into mitochondria. The compound was developed at Moscow State University by Professor Vladimir Skulachev, the most-cited Russian biologist.

One of the genetically modified mice used in the experiment.

Credit: The A.N. Belozersky Institute Of Physico-Chemical Biology

The experiments involved a special strain of genetically modified mice created and characterized in Sweden. A single mutation introduced into the genome of these mice substantially accelerates mutagenesis in their mitochondria, leading to accelerated ageing and early death: the mutant mice live less than one year, whereas a normal mouse lives more than two. The mutation promotes the development of many age-related defects and diseases, indicating that the major defect of these mice is indeed ageing.

Starting at the age of 100 days, one group of mutant mice received small doses of SkQ1 (approximately 12 micrograms) in their drinking water. According to the scientists' hypothesis, the compound should protect animal cells from the toxic byproducts of mitochondria -- free radicals (reactive oxygen species). A second group of animals served as a control, receiving pure water.

Differences between the two groups became obvious from the age of 200-250 days. Animals in the control group aged rapidly, as expected: they lost weight, their body temperature dropped, they developed severe curvature of the spine (a result of osteoporosis) and alopecia, their skin grew thinner, and, in females, the estrus cycle was impaired. Finally, their mobility and oxygen consumption decreased. In the group treated with SkQ1, the development of all these typical traits of ageing was dramatically decelerated, and some of the traits did not appear at all.

Professor Vladimir Skulachev, designer of the SkQ1 molecule and co-author of the study, says: "This work is quite valuable from both theoretical and practical points of view. First, it clearly demonstrates the key role of mitochondrially produced reactive oxygen species in the ageing of mammals. At the same time, our study opens the way to treating ageing with mitochondrially targeted antioxidants. We are also honored to cooperate in this project with such prominent Swedish scientists as Professor Barbara Cannon, President of the Royal Swedish Academy of Sciences, and Professor Jan Nedergaard, Head of the Wenner-Gren Institute."

Professor Skulachev's project is now developing a set of pharmaceuticals based on the SkQ1 molecule. The first drug -- Visomitin eye drops -- is already approved and marketed in Russia and has passed phase 2 clinical trials in the US. The next product in the pipeline is an oral form of SkQ1 (similar to the one used in the experiments described above), now in clinical trials in Russia. If those trials yield positive results, the "anti-ageing" drug could be approved for systemic indications in 2 to 3 years.



Contacts and sources: 
Vladimir Koryagin
Lomonosov Moscow State University 

Citation: Improved health-span and lifespan in mtDNA mutator mice treated with the mitochondrially targeted antioxidant SkQ1 http://dx.doi.org/10.18632/aging.101174

Researchers See DNA 'Blink' for the First Time

Many of the secrets of cancer and other diseases lie in the cell's nucleus. But getting way down to that level -- to see and investigate the important genetic material housed there -- requires creative thinking and extremely powerful imaging techniques.

Vadim Backman and Hao Zhang, nanoscale imaging experts at Northwestern University, have developed a new imaging technology that is the first to see DNA "blink," or fluoresce. The tool enables the researchers to study individual biomolecules as well as important global patterns of gene expression, which could yield insights into cancer.

A powerful Northwestern University imaging tool is the first to measure the structure of isolated chromosomes without the use of fluorescent labels.
Credit: Northwestern University

Backman discussed the tool and its applications -- including the new concept of macrogenomics, a technology that aims to regulate the global patterns of gene expression without gene editing -- Friday (Feb. 17) at the American Association for the Advancement of Science (AAAS) annual meeting in Boston.

The talk, "Label-Free Super-Resolution Imaging of Chromatin Structure and Dynamics," was part of the symposium "Optical Nanoscale Imaging: Unraveling the Chromatin Structure-Function Relationship."  

The Northwestern tool features six-nanometer resolution and is the first to break the 10-nanometer resolution threshold. It can image DNA, chromatin and proteins in cells in their native states, without the need for labels.

For decades, textbooks have stated that macromolecules within living cells, such as DNA, RNA and proteins, do not have visible fluorescence on their own.

"People have overlooked this natural effect because they didn't question conventional wisdom," said Backman, the Walter Dill Scott Professor of Biomedical Engineering in the McCormick School of Engineering. "With our super-resolution imaging, we found that DNA and other biomolecules do fluoresce, but only for a very short time. Then they rest for a very long time, in a 'dark' state. The natural fluorescence was beautiful to see."

Backman, Zhang and collaborators now are using the label-free technique to study chromatin -- the bundle of genetic material in the cell nucleus -- to see how it is organized. Zhang is an associate professor of biomedical engineering at McCormick.

"Insights into the workings of the chromatin folding code, which regulates patterns of gene expression, will help us better understand cancer and its ability to adapt to changing environments," Backman said. "Cancer is not a single-gene disease."

Current technology for imaging DNA and other genetic material relies on special fluorescent dyes to enhance contrast when macromolecules are imaged. These dyes may perturb cell function, and some eventually kill the cells -- undesirable effects in scientific studies.

In contrast, the Northwestern technique, called spectroscopic intrinsic-contrast photon-localization optical nanoscopy (SICLON), allows researchers to study biomolecules in their natural environment, without the need for these fluorescent labels.

Backman, Zhang and Cheng Sun, an associate professor of mechanical engineering at McCormick, discovered that when illuminated with visible light, the biomolecules get excited and light up well enough to be imaged without fluorescent stains. When excited with the right wavelength, the biomolecules even light up better than they would with the best, most powerful fluorescent labels.

"Our technology will allow us and the broader research community to push the boundaries of nanoscopic imaging and molecular biology even further," Backman said.



Contacts and sources: 
Megan Fellman
Northwestern University


Join the Search for Planet Nine: Backyard Worlds Website Lets Public Search The Heavens

NASA is inviting the public to help search for possible undiscovered worlds in the outer reaches of our solar system and in neighboring interstellar space. 

A new website, called Backyard Worlds: Planet 9, lets everyone participate in the search by viewing brief movies made from images captured by NASA's Wide-field Infrared Survey Explorer (WISE) mission. The movies highlight objects that have gradually moved across the sky.

Caltech researchers have found evidence suggesting there may be a "Planet X" deep in the solar system. This hypothetical Neptune-sized planet orbits our sun in a highly elongated orbit far beyond Pluto. The object, which the researchers have nicknamed "Planet Nine," could have a mass about 10 times that of Earth and orbit about 20 times farther from the sun on average than Neptune. It may take between 10,000 and 20,000 Earth years to make one full orbit around the sun.
Credit: NASA

"There are just over four light-years between Neptune and Proxima Centauri, the nearest star, and much of this vast territory is unexplored," said lead researcher Marc Kuchner, an astrophysicist at NASA's Goddard Space Flight Center in Greenbelt, Maryland. "Because there's so little sunlight, even large objects in that region barely shine in visible light. But by looking in the infrared, WISE may have imaged objects we otherwise would have missed."

Join the search for new worlds in the outer reaches of our solar system and in nearby interstellar space at Backyard Worlds: Planet 9.
Credit: NASA's Goddard Space Flight Center Conceptual Image Lab/Krystofer D.J. Kim

WISE scanned the entire sky between 2010 and 2011, producing the most comprehensive survey at mid-infrared wavelengths currently available. With the completion of its primary mission, WISE was shut down in 2011. It was then reactivated in 2013 and given a new mission assisting NASA's efforts to identify potentially hazardous near-Earth objects (NEOs), which are asteroids and comets on orbits that bring them into the vicinity of Earth’s orbit. The mission was renamed the Near-Earth Object Wide-field Infrared Survey Explorer (NEOWISE).

The new website uses the data to search for unknown objects in and beyond our own solar system. In 2016, astronomers at Caltech in Pasadena, California, showed that several distant solar system objects possessed orbital features indicating they were affected by the gravity of an as-yet-undetected planet, which the researchers nicknamed "Planet Nine." If Planet Nine — also known as Planet X — exists and is as bright as some predictions, it could show up in WISE data.

Planet X has not yet been discovered, and there is debate in the scientific community about whether it exists. The prediction in the Jan. 20 issue of the Astronomical Journal is based on mathematical modeling.
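As a quick consistency check (not part of the article), the quoted figures agree with Kepler's third law, which for orbits around the sun reads P² = a³ with the period P in years and the semi-major axis a in astronomical units. Neptune's semi-major axis of about 30.07 AU is a standard value:

```python
# Kepler's third law sanity check on the quoted Planet Nine numbers.
a_neptune = 30.07              # Neptune's semi-major axis in AU (standard value)
a_planet9 = 20 * a_neptune     # "about 20 times farther from the sun than Neptune"

period_years = a_planet9 ** 1.5  # P = a^(3/2), in years
print(f"semi-major axis ~{a_planet9:.0f} AU, period ~{period_years:,.0f} years")
```

The result, roughly 15,000 years, falls squarely inside the 10,000 to 20,000-year range quoted above.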


The search also may discover more distant objects like brown dwarfs, sometimes called failed stars, in nearby interstellar space.

"Brown dwarfs form like stars but evolve like planets, and the coldest ones are much like Jupiter," said team member Jackie Faherty, an astronomer at the American Museum of Natural History in New York. "By using Backyard Worlds: Planet 9, the public can help us discover more of these strange rogue worlds."

A previously cataloged brown dwarf named WISE 0855−0714 shows up as a moving orange dot (upper left) in this loop of WISE images spanning five years. By viewing movies like this, anyone can help discover more of these objects.
Credit: NASA/WISE

Unlike more distant objects, those in or closer to the solar system appear to move across the sky at different rates. The best way to discover them is through a systematic search of moving objects in WISE images. While parts of this search can be done by computers, machines are often overwhelmed by image artifacts, especially in crowded parts of the sky. These include brightness spikes associated with star images and blurry blobs caused by light scattered inside WISE's instruments.
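The core idea of such a search can be sketched in a few lines. The toy example below (synthetic data, not the actual WISE pipeline) stacks aligned "epochs" of the same sky patch and subtracts the per-pixel median: anything fixed on the sky (a star) cancels out, while a source that shifts between epochs survives as a bright residual. Real images add the artifacts described above, which is where human volunteers come in.

```python
import numpy as np

rng = np.random.default_rng(0)
frames = rng.normal(0.0, 0.1, size=(4, 32, 32))  # 4 epochs of a 32x32 sky patch

frames[:, 16, 10] += 5.0                  # a "star": fixed at (16, 10) in every epoch
for t in range(4):
    frames[t, 8, 5 + 3 * t] += 5.0        # a "mover": drifts 3 pixels per epoch

median = np.median(frames, axis=0)        # static-sky model: stars survive here
residual = frames - median                # stars cancel; the mover stands out

# In each epoch, the brightest residual pixel is the mover, at a shifting x:
positions = [np.unravel_index(np.argmax(residual[t]), residual[t].shape)
             for t in range(4)]
print(positions)  # x-coordinate advances by 3 pixels per epoch
```

Median subtraction is robust here because the mover occupies any given pixel in only one epoch out of four, so it never dominates that pixel's median.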

Backyard Worlds: Planet 9 relies on human eyes because we easily recognize the important moving objects while ignoring the artifacts. It's a 21st-century version of the technique astronomer Clyde Tombaugh used to find Pluto in 1930, a discovery made 87 years ago this week.

On the website, people around the world can work their way through millions of "flipbooks," which are brief animations showing how small patches of the sky changed over several years. Moving objects flagged by participants will be prioritized by the science team for follow-up observations by professional astronomers. Participants will share credit for their discoveries in any scientific publications that result from the project.

"Backyard Worlds: Planet 9 has the potential to unlock once-in-a-century discoveries, and it's exciting to think they could be spotted first by a citizen scientist," said team member Aaron Meisner, a postdoctoral researcher at the University of California, Berkeley, who specializes in analyzing WISE images.

Backyard Worlds: Planet 9 is a collaboration between NASA, UC Berkeley, the American Museum of Natural History in New York, Arizona State University, the Space Telescope Science Institute in Baltimore, and Zooniverse, a collaboration of scientists, software developers and educators who collectively develop and manage citizen science projects on the internet.

NASA's Jet Propulsion Laboratory in Pasadena, California, manages and operates WISE for NASA's Science Mission Directorate. The WISE mission was selected competitively under NASA's Explorers Program managed by the agency's Goddard Space Flight Center. The science instrument was built by the Space Dynamics Laboratory in Logan, Utah. The spacecraft was built by Ball Aerospace & Technologies Corp. in Boulder, Colorado. Science operations and data processing take place at the Infrared Processing and Analysis Center at Caltech, which manages JPL for NASA.

For more information about Backyard Worlds: Planet 9, visit: http://backyardworlds.org
For more information about NASA's WISE mission, visit: http://www.nasa.gov/wise


Contacts and sources:
By Francis Reddy
NASA's Goddard Space Flight Center